GB2288304A - Computer graphics - Google Patents

Computer graphics

Info

Publication number
GB2288304A
GB2288304A (Application GB9506722A)
Authority
GB
United Kingdom
Prior art keywords
texture
levels
data
map
select
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB9506722A
Other versions
GB9506722D0 (en)
Inventor
Raymond Lee Fitzgerald
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Evans and Sutherland Computer Corp
Original Assignee
Evans and Sutherland Computer Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Evans and Sutherland Computer Corp filed Critical Evans and Sutherland Computer Corp
Publication of GB9506722D0
Publication of GB2288304A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/04: Texture mapping

Abstract

Dynamic computer graphics images are enhanced by mapping two-dimensional texture onto objects with texture definition appropriate to the range from the eye point to the object. Texture map levels of varying definition are stored, each carrying a different texture resolution (degree of definition), and select sets of levels are paged for mapping operations based on the range to an object. In an operation with two select sets, one set (the top) includes the five least detailed or least defined levels while the other set, the whole, constitutes all levels. The choice between the two select sets for texture mapping an object is determined based on the range, and a specific ratio of pixel view frustum values provides an effective test criterion.

Description

COMPUTER GRAPHICS

The present invention relates to computer graphics and in particular to processes and systems for use therein.
In recent years, significant advances have occurred in the field of computer graphics. In the simulator area, for example, real-time dynamic pictures can be displayed, revealing a terrain as it would appear from a moving aircraft, complete with buildings and various other features. Typically, such systems utilize a display device, such as a cathode ray tube (CRT), to provide dynamic images for visually simulating actual flight experiences.
Various forms of dynamic displays have been accomplished utilizing graphics data definitive of objects and surface textures. However, a common weakness of such systems has involved the texture capability, both in terms of the general image quality and the time it takes to "tweak" the texture so as to make it "behave" properly. Accordingly, a need exists for systems with greater realism via phototexture, better behaved texture and texture that does not require many hours to tune and adjust.
Effective improvements in computer graphics texturing systems have involved the use of so-called "MIP" maps, carrying different texture resolutions for the same area. Essentially, several textures are computed as levels reflecting the distance from which the texture is to be viewed. As the distance increases, the texture detail becomes fuzzy or less sharply defined.
Although traditional MIP map techniques are effective for texturing objects in a dynamic display, a considerable difficulty arises in storing and manipulating the volume of data required for advanced systems.
To consider a specific example, feature textures might be mapped on the side of a building to indicate a particular surface structure, for example, brick.
Typically, with the presence of the building in the scene, texture is paged from storage for texture mapping the building. By utilizing a MIP map pyramid (levels of filtered texture data) the building can be variously textured with regard to definition as the range changes, such that fuzziness decreases as the eye point approaches the building. Of course, the scene may include many textured features, such as for example buildings, and so the volume of MIP map data is considerable, imposing rather extreme demands on the active or working memory of the system. Accordingly, a need exists for an improved system to simplify and enhance operations utilizing MIP map techniques for texture mapping dynamic displays.
It is an object of the present invention to provide a computer graphics process and a computer graphics system that may be used to meet this need.
According to one aspect of the present invention there is provided a computer graphics process for producing dynamic images with textured features comprising the steps of: storing graphics image data including texture map data defined in a plurality of levels (for example, in relation to definition); paging select sets of levels from said texture map data for processing said graphics image data; processing said graphics image data to provide image display signals representative of said dynamic images processed from said graphics image data and including mapping said texture map data to texture features of said images with said select set of levels; and displaying images in accordance with said image display signals.
According to another aspect of the invention a computer graphics system for producing dynamic images with textured features as viewed from a moving eye point comprises: data storage for graphics image data including texture map data defined in a plurality of levels (for example, in relation to definition); an image generator to provide image display signals representative of said dynamic images processed from said graphics image data and including texture mapping structure with an active memory, said texture mapping structure processing said texture map data to texture features as represented by said display signals; a data paging structure for selectively paging select sets of said plurality of texture map levels into said active memory for texturing features; and a display unit coupled to receive said display signals from said image generator to produce dynamic images therefrom.
The process and system enable selective paging of MIP map levels into active memory to create dynamic images with respect to a current eye point. The invention is in this regard based on the recognition that under certain circumstances, features of an image can be textured effectively using less than all the levels in an entire MIP texture map pyramid. That is, considerable saving of active memory is afforded with little compromise to the displayed image by selectively breaking the MIP texture map pyramid into fragmentary pageable units for selective use.
Essentially, recognizing that the higher resolution levels of a MIP map pyramid are applicable only when the eye point is near the texture map (since these levels will alias at range), selectively paging levels of the MIP pyramid as they are needed has been discovered to be an efficient online data reduction technique. In accordance herewith, portions (levels of texture elements or texels) of the MIP pyramid are selectively paged into active memory based on the distance from an object (to be textured) to the eye point.
In one embodiment of the present system, the entire MIP texture map pyramid may be paged into active memory.
Alternatively, only several of the lowest levels are paged. The several select lower levels, for example, five lower levels of detail, are referred to as the "top". The present development is based on the recognition that in many displays, a very considerable portion of the texture in a scene can be accounted for by using only the top.
In accordance herewith, it has been determined that the portion of the texture in a scene that can be accounted for using only the top may exceed 90%. In a three-dimensional (3D) system, the area over which the top is sufficient may be even greater. In one embodiment, for a given eye point (assuming a uniform distribution) only one-half of one percent of all texture maps applicable required the entire MIP pyramid structure. For the balance, over ninety-nine percent, the tops of the maps would suffice.
As described in detail below, and in accordance with one operating mode, selective paging of selective sets of levels is determined based on the relative size of a specific level texture element (texel) in a map level, and the perspective picture element (pixel) size. In one embodiment, the size of a level four texel (highest level of detail in the top) is compared against the perspective size (using the range to the feature and the field of view) of two pixels of the display. If the level four texel size is too large, then the entire map is paged into active memory, otherwise, the top is sufficient.
Various formats of selection and pyramid dissection will be apparent from the detailed description below.
A computer graphics process and system according to the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
FIGURE 1 is a graphic representation illustrating a view frustum radiating from an eye point with respect to screen space and world space as treated herein;
FIGURE 2 is a plan view of a component pixel frustum illustrating a representative relation to changing depth;
FIGURE 3 is a graphic representation illustrating content for a pixel window with respect to a textured polygon at varying depths;
FIGURE 4 is a graphic representation of a memory organization for storing MIP maps;
FIGURE 5 is a graphic representation of a MIP map pyramid including several levels of filtered texture data;
FIGURE 6 is a graphic representation of an area illustrating the fragments over which entire MIP maps and MIP tops are utilized;
FIGURE 7 is a diagrammatic perspective view of texels and a pair of MIP maps illustrating interpolation operation;
FIGURE 8 is a block diagram of an example of a computer graphics system in accordance with the present invention;
FIGURE 9 is a detailed block diagram of a component of the system of FIGURE 8; and
FIGURE 10 is a graphic representation illustrating the operation of the system of FIGURE 9.
A detailed illustrative embodiment of the present invention is disclosed herein; however, since a wide variety of specific embodiments of the disclosed system are possible, the embodiment described is merely representative.
Nevertheless, the illustrative embodiment is deemed to afford the best embodiment for purposes of disclosure and to provide a basis for the claims herein which define the scope of the present invention.
Initially, consideration of some graphic representations will be helpful in understanding the present development, specifically with regard to accomplishing computer graphics displays with textured surfaces.
The process of applying texture patterns to surfaces is generally referred to as "texture mapping" and is treated at length in the book Principles of Interactive Computer Graphics, 2nd edition, Newman & Sproull, McGraw Hill Book Company, 1979. Non-uniform texture mapping is well known in the art, as treated in an article entitled "Survey of Texture Mapping" by Paul S. Heckbert, published in IEEE Computer Graphics and Applications, November 1986, pp. 56-67. MIP maps and their use in computer graphics for texture mapping are treated in a paper entitled "Pyramidal Parametrics" by Lance Williams, published July 1983 in Computer Graphics, vol. 17, no. 3.
The article has been identified by the Association for Computing Machinery as ACM 0-89791-109-1/83/007/0001.
Texture mapping essentially involves locking textures to defined objects or polygons to accomplish textured surfaces in a display. The mapping of texture or other images onto surfaces is more effective if the texture is rendered progressively more fuzzy as the polygon moves away from the viewer. Such operation is in accordance with the perspective nature of observation by the human eye. For example, the squares of a checkerboard are vividly clear to the normal eye when viewed at a distance of ten feet. However, if the checkerboard is moved away from the eye, boundaries between individual squares of the board progressively become more fuzzy with less definition. At some point, perhaps a few hundred feet, the individual black and white squares of the checkerboard simply fade to a uniform grey, totally void of definition. Effective texture mapping reflects these changes as they would appear to an observer.
As treated below, MIP map texturing involves a MIP texture map pyramid composed of multiple versions of the same source motif, for example, bricks or any other pattern, each version having a progressively coarser resolution. Accordingly, depending on the distance from the eye point to the object or feature being textured, an appropriately coarse map level is selected for use.
Actually, in practice, two map levels are selected from which values are interpolated.
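To illustrate how the progressively coarser versions described above can be produced, the following sketch derives each level of a single-channel pyramid by box filtering, averaging each 2 x 2 block of texels in the level below it. This is a minimal illustrative example rather than the patent's method; the function names and the byte-per-texel format are assumptions.

```c
#include <stdlib.h>
#include <string.h>

/* Derive one coarser MIP level from the finer level below it by
   averaging each 2x2 block of texels (a box filter). 'size' is the
   width of the finer level, assumed to be a power of two. */
static void downsample(const unsigned char *fine, unsigned char *coarse, int size)
{
    int half = size / 2;
    for (int y = 0; y < half; y++)
        for (int x = 0; x < half; x++) {
            int sum = fine[(2 * y) * size + 2 * x]
                    + fine[(2 * y) * size + 2 * x + 1]
                    + fine[(2 * y + 1) * size + 2 * x]
                    + fine[(2 * y + 1) * size + 2 * x + 1];
            coarse[y * half + x] = (unsigned char)(sum / 4);
        }
}

/* Fill levels[0..n-1] using the numbering of FIGURE 5: level 0 is the
   1x1 tip, level n-1 the full-detail base of width base_size, where
   base_size = 2^(n-1). */
static void build_pyramid(unsigned char *levels[], int n,
                          const unsigned char *base, int base_size)
{
    levels[n - 1] = malloc((size_t)base_size * base_size);
    memcpy(levels[n - 1], base, (size_t)base_size * base_size);
    for (int k = n - 1, size = base_size; k > 0; k--, size /= 2) {
        levels[k - 1] = malloc((size_t)(size / 2) * (size / 2));
        downsample(levels[k], levels[k - 1], size);
    }
}
```

With a 512 x 512 base this yields ten levels (n = 10), numbered with level 0 as the 1 x 1 tip.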
Referring now to FIGURE 1, a textured polygon 24 (representing part of an object) is illustrated in world space. Note that the various space designations as used in the field of computer graphics are treated in the referenced text, Principles of Interactive Computer Graphics. In summary, world space or object space (three-dimensional) serves to define objects prior to any geometric transformations. In eye space, objects are transformed so that the eye or view point is the origin for coordinates and view rays are along the Z-axis.
Screen space involves further transformations to account for the perspective foreshortening of the view pyramid and with clipping performed. As a function of computer graphics processing, objects in screen space are mapped to an eye point, typically on a pixel grid. A discussion of world space and the related transforms to accomplish displays appears in Chapter 8 of a book, Fundamentals of Interactive Computer Graphics by Foley and Van Dam, published in 1984 by Addison-Wesley Publishing Company.
To represent the polygon 24 in a display related to an eye point 0, areas of the polygon are defined in screen space at a screen 28. In accordance with convention, the screen 28 comprises the base of a pyramidal view frustum 30 with an apex 32 at the eye point 0. In a sense, the viewing screen 28 (base of the frustum 30) may be treated as analogous to the screen of a television set through which world-space objects (including the polygon 24) are viewed.
In accordance with traditional practice, the space of the screen 28 is dissected into small picture elements (pixels). Specifically, for example, an array of one million pixels may be organized as one thousand rows, each of one thousand pixels. A representative pixel 34 (idealized and grossly enlarged) is illustrated at a generally central location of the screen 28. Note that a ray 36 extending from the eye point 0, passes through the centre of the pixel 34 to a point 38 on the polygon 24.
The ray 36 exemplifies perhaps a million of such rays that dissect the scene or image of primitives (as the polygon 24) into pixels. For a display, each pixel is processed to accomplish representative signals in a storage, for example, a frame buffer, which is scan converted, for example, into a raster pattern for driving a display device, for example, a cathode ray tube (CRT).
As illustrated in FIGURE 1, the polygon 24 is to bear a texture 35 in the form of a checkerboard. In accordance herewith, the texture 35 is mapped onto the polygon 24 utilizing select levels of a MIP map texture pyramid depending essentially on the range from the eye point 0 to the polygon 24 and the orientation of the polygon (perspective size). Considerable economy of memory as well as transfer operations result from the selectivity.
For purposes of explanation, consider that the pixel 34 represents a substantial area with respect to the polygon 24. Actually, the pixel will represent a single colour; however, treating an enlarged area will be helpful to the explanation. FIGURE 2 shows the pixel 34 in a sectioned plan view and illustrates the polygon as it might appear at different ranges in a pixel frustum 39. That is, the polygon is shown at various relative depths, i.e., indicated as polygon areas 24a, 24b, and 24c, each progressively more remote from the pixel 34 in screen space. As the polygon 24 moves away from the pixel 34 (arrow 37) more of the texture 35 (FIGURE 1) is visible and it becomes fuzzy in the picture. The phenomenon is illustrated in FIGURE 3 and will now be considered.
FIGURE 3 shows the texture areas of the polygon 24 (FIGURE 1) contained by the pixel 34 as the polygon moves away from the pixel 34 (and the eye point 0) in the Z dimension as indicated by the arrow 37 (FIGURE 2). As the polygon area 24a (FIGURES 2 and 3) is positioned near the screen 28 (contiguous to the pixel 34) the pixel 34 is occupied primarily by a light area La and only slightly occupied by a dark area Da. As represented, an area ratio of approximately four to one is illustrated.
With progressive depth displacement of the polygon 24, as illustrated by the polygon area 24b, the pixel 34 embraces greater detail of the texture 35. Of course, the change simply results from the fact that the polygon 24 is deeper in the pixel frustum 39 (FIGURE 2).
Accordingly, the frustum has a larger base at the polygon 24 to encompass a greater area of the texture 35.
Further displacement of the polygon 24 from the screen 28 is illustrated by the polygon area 24c and results in a still greater area of the texture 35 being located within the pixel 34.
For each of the depicted situations, a different level of definition or detail is appropriate for displaying a quality image. That is, the level of detail of an object in the picture should become fuzzy as the object is moved further away from the eye point.
Relating the phenomenon to FIGURE 3, the border between the area La and area Da of the polygon area 24a would be substantially sharper than the borders between the areas Lc and Dc within the polygon area 24c. Accordingly, different computed levels of a MIP map pyramid are used for texturing a feature or object. That is, rather than repeatedly calculating the averages for each pixel, a MIP map is addressed by texture coordinates U and V to provide weighted averages that have been computed for the contribution from a textured polygon to a pixel. To further illustrate, consider a form of memory organization for accomplishing such operation, as shown in FIGURE 4.
Areas of progressively reduced size are defined in FIGURE 4 to indicate the resolution levels of a MIP map pyramid that are selectively paged in accordance herewith. Typically, an image is provided in its colour components red, green and blue. For a pixel in a textured polygon, pre-computed texel averages are addressed for each colour component by the coordinates u and v (FIGURE 4). Each of the colour components is provided in look-up tables of varying degrees of specificity to be identified and interpolated. Levels of detail are related to distance from the eye point, as indicated by a line 41, and perspective size. At the most detailed level, the computed texel averages of blue (B), for example, are stored in a section 40 while the values for the green component are stored in a section 42 and the values for the red component are stored in a section 44.
In accordance with the memory organization, the fourth quadrant or section 46 is arranged to progressively include sets of three smaller sections, defining the colour level. The reducing pattern continues in a similar quadrant-by-quadrant division until ultimately sections 51 are provided.
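One plausible way to index such a quadrant organization is sketched below, assuming a 2S x 2S memory in which the most detailed level occupies three S x S colour sections and the fourth quadrant recurses. The closed-form origin used here is an inference from FIGURE 4, not a layout the patent specifies, and the level index k counts from the most detailed level, the reverse of FIGURE 5's numbering.

```c
enum channel { CH_B, CH_G, CH_R };

/* Hypothetical addressing for the quadrant layout of FIGURE 4: in a
   2S x 2S memory, level k (k = 0 most detailed) keeps its three
   (S >> k)-sized colour sections in three quadrants of a sub-square
   with origin 2S - (2S >> k) on both axes; the fourth quadrant holds
   level k + 1. Returns the memory cell of texel (u, v) for channel ch. */
static void texel_address(int S, int k, enum channel ch,
                          int u, int v, int *x, int *y)
{
    int origin  = 2 * S - ((2 * S) >> k); /* sub-square origin for level k */
    int section = S >> k;                 /* side of one colour section    */
    *x = origin + u + (ch == CH_G ? section : 0);
    *y = origin + v + (ch == CH_R ? section : 0);
}
```

For example, with S = 512 the blue section of the most detailed level begins at (0, 0), and the sections of each successive level begin at (512, 512), then (768, 768), and so on.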
While the representation of FIGURE 4 illustrates a memory organization for the different levels, FIGURE 5 is a side elevation of a MIP texture map pyramid illustrating the diminishing MIP map levels in stacked relationship.
For simplicity, consider the pyramid for a single colour. As illustrated, the MIP pyramid 52 comprises n levels of texture data extending from a tip (least defined detail, level 0) downward to a base at level n-1.
The highest levels (lowest level of detail) are designated as a top 54 (five levels) while the whole is designated 55. At the base, the level n-1 is the highest level of detail, followed by the level n-2.
To consider an exemplary format, based on a 512 x 512 texel array as the maximum size for the base, the following table indicates sizes.
TABLE

Level   Texel Array
0       1 x 1
1       2 x 2
2       4 x 4
3       8 x 8
4       16 x 16
5       32 x 32
6       64 x 64
7       128 x 128

As illustrated in FIGURE 5, select sets of map levels 0 through n-1 are paged as alternatives: either the top 54 or the entire map 52. In various arrangements, the MIP pyramid may be broken into any number of select sets of levels; however, in accordance with one operating embodiment, a break into two pieces or sets has been found to be effective.
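The economy at stake can be read directly from the table. Counting texels per level (4^k at level k), the five-level top holds 1 + 4 + 16 + 64 + 256 = 341 texels per colour component, while a whole pyramid with the stated 512 x 512 base (levels 0 through 9) holds 349,525: the top is roughly 0.1% of the whole. A brief sketch of the arithmetic, with illustrative names:

```c
#include <stdio.h>

/* Texels in MIP levels first..last inclusive, where level k measures
   2^k x 2^k (level 0 is the 1x1 tip, as in the table above). */
static long texel_count(int first, int last)
{
    long total = 0;
    for (int k = first; k <= last; k++)
        total += (1L << k) * (1L << k); /* 4^k texels at level k */
    return total;
}

int main(void)
{
    long top   = texel_count(0, 4); /* five least detailed levels: 341    */
    long whole = texel_count(0, 9); /* full pyramid, 512x512 base: 349525 */
    printf("top = %ld texels, whole = %ld texels (%.2f%%)\n",
           top, whole, 100.0 * top / whole);
    return 0;
}
```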
With the MIP map texture pyramid stored for selectively addressing or paging, in accordance herewith, selection of a select set of levels, either the whole 55 or top 54, is based on range and field of view.
For one embodiment based on two sets of levels (top 54 and whole 55), FIGURE 6 illustrates the areas of selectivity. That is, a series of concentric rings R0, R1, R2, R3 and R4 define annular areas with respect to texturing operations. In that regard, only the shaded area within the ring R4 requires the entire or whole 55 of the map pyramid to be used in texturing operations.
Conversely, excluding the shaded area within the ring R4, in the areas within each of the larger concentric rings (R3, R2, R1 and R0), shading was successfully accomplished using only the top 54 as illustrated in FIGURE 5. A profound economy is thus illustrated.
In the course of texture mapping, a point 38 (FIGURE 1) of interest may indicate a texture map level that lies intermediate two adjacent map levels. To illustrate, referring to FIGURE 7, a point 60 of interest lies between two different levels, for example, levels L3 and L4 as represented in FIGURE 7 by a pair of single texels 62 and 64. Accordingly, neither of the texels 62 or 64 is appropriate with respect to the point 60. In such an event, an interpolation is performed involving the four surrounding coordinate corners of the texels 62 and 64 at the two map levels L3 and L4. Specifically, the points 66, 67, 68 and 69 of the texel 62 are interpolated in combination with the points 76, 77, 78 and 79 of the texel 64. Interpolation (usually but not necessarily linear), as well known in the art, is a calculation of a texture value from the eight values of the surrounding points, and accordingly a value (intensity and colour) is determined for texturing the pixel identified by the impact point 60.
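A conventional realization of this eight-point calculation is trilinear filtering: a bilinear interpolation among the four corners within each of the two bracketing levels, followed by a linear blend between the two results. The sketch below, for a single channel, is one customary form consistent with the description (which notes the interpolation is usually but not necessarily linear) rather than the specific hardware method; the names are illustrative.

```c
/* Linear blend between a and b by fraction t in [0, 1]. */
static float lerp(float a, float b, float t) { return a + t * (b - a); }

/* Bilinear sample of one MIP level: 'map' is size x size texels and
   (u, v) are continuous texel coordinates within that level. */
static float sample_bilinear(const float *map, int size, float u, float v)
{
    int u0 = (int)u, v0 = (int)v;
    int u1 = (u0 + 1 < size) ? u0 + 1 : u0; /* clamp at the map edge */
    int v1 = (v0 + 1 < size) ? v0 + 1 : v0;
    float fu = u - (float)u0, fv = v - (float)v0;
    float lower = lerp(map[v0 * size + u0], map[v0 * size + u1], fu);
    float upper = lerp(map[v1 * size + u0], map[v1 * size + u1], fu);
    return lerp(lower, upper, fv);
}

/* Value for a point of interest lying between a coarse level (as L3)
   and a fine level (as L4): four texel corners contribute from each
   level, eight in all. (u, v) are given in the fine level's coordinates;
   level_frac = 0 selects the fine level, 1 the coarse level. */
static float sample_trilinear(const float *coarse, int coarse_size,
                              const float *fine, int fine_size,
                              float u, float v, float level_frac)
{
    float c = sample_bilinear(coarse, coarse_size, u * 0.5f, v * 0.5f);
    float f = sample_bilinear(fine, fine_size, u, v);
    return lerp(f, c, level_frac);
}
```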
Recapitulating to some extent with respect to the graphics representations as explained above, the texturing operation basically involves mapping a texture pattern or image onto the surface of a primitive, polygon or object, utilizing traditional techniques. For example, the operation may involve applying a brick texture to the exterior wall of a building as a part of a viewed terrain dynamically displayed with respect to a moving eye point. In accordance herewith, sets of levels from MIP maps are used to texture objects with various levels of detail depending upon the range. For an object or feature near the eye point, the detail must be clear and sharp, that is, very high definition. For a remote object, the level of detail reduces and the texture becomes somewhat fuzzy.
In the disclosed embodiment, the MIP data may be considered in the form of a pyramid comprising n levels of filtered texture data as illustrated in FIGURE 5. As explained above, the system of the present development is based on the recognition that select sets can be used with little compromise to image quality. In that regard, higher resolutions of a MIP map pyramid are needed in a select set only when the eye point is near the object.
Furthermore, for typical dynamic image creation, much of the display is remote. Accordingly, selectively paging select sets of levels of the MIP map pyramid, as they become necessary, is an efficient online data reduction mechanism. Note that one paged select set may be the entire map.
In accordance herewith, depending on the distance from the eye point to an object, a select group or select set of MIP levels are paged into active memory to accomplish the texturing operations as reflected in pixel calculations. As a result, individual pixel data signals are stored in a display system, typically including a frame buffer and display unit.
In the operation of a contemporary image generator, graphic image data is utilized including texture map data defined in levels of detail. Accordingly, the image generator processes the graphic image data to provide pixel display signals. In accordance herewith, a data paging structure selectively pages select sets (including the full set) of levels from the texture map pyramid into the image generator. Accordingly, features are textured efficiently and economically.
As indicated above, the selective data transfer is determined by the texel size in relation to the perspective pixel size, using the range to the feature and the field of view. Essentially, the consideration involves the texel size, determined by the distance from the eye point 0 (FIGURE 1) to the polygon 24 in world space in relation to the size of the pixel 34 in screen space. As indicated, in one operating embodiment, the size of a level 4 texel is compared against the size of two pixels in the display. If the level 4 texel size is too large, the entire map is paged into the image generator, otherwise, only the top is paged. Clearly, the levels of MIP texture maps can be variously fragmented in other embodiments and other criteria relating to range can be employed for selective paging.
In view of the above explanations of operating steps within the system, reference now will be made to FIGURE 8 showing an operating embodiment implementing the development. A real-time system computer 142 (FIGURE 8, left) functions as a system controller, as in a conventional system. For example, the computer 142 may take the form of a Motorola Model MVME 147S-1, available from that company, which is located in Phoenix, Arizona.
The real-time system computer 142 is served by a control input unit 144 which may take various forms including a manual input terminal, another computer, or virtually any source of control input information.
Essentially, in accordance with contemporary techniques, the input unit 144 interfaces the real-time computer 142 for driving an object management processor 146. An environmental memory 152 is embodied in the object management processor 146 along with an object pager control 157. Note that the environmental memory 152 stores three-dimensional data defining objects in world space, sometimes referred to as "geometric data".
Functionally, the object management processor 146 is intimately associated with a display processor 148 that is connected to a texture memory 150 (active, for two-dimensional data). Note that basically, the combination of the real-time computer system 142 and the object management processor 146 along with the display processor 148 may take the form of a Model ESIG-3000 Image Generator available from Evans & Sutherland Computer Corporation located in Salt Lake City, Utah. Modifications involve texture data management.
The texture memory 150 within the processor 148 and the environmental memory 152 within the processor 146 each receive data from a mass storage 154 controlled by the computer 142 as indicated by a control path 156. As suggested by the drawing, the mass storage 154 may take the form of a disk storage designed for the transfer of address data to both the texture memory 150 and the environmental memory 152 as indicated by the lines 158 and 159. Specifically, the texture memory 150 stores two-dimensional MIP data to be mapped selectively onto surfaces of objects. Note that from the select set of MIP map levels paged into the texture memory 150, typically two are designated to provide the texels (for example, texels 62 and 64, FIGURE 7) from which a value is computed. The operation is executed for each pixel affected by the object (polygon 24, FIGURE 1).
Consequently, fast access is a necessary characteristic and space in the texture memory 150 is cherished. In accordance herewith, by selective paging of MIP map data, substantial savings occur in memory space and data transfer operations.
The texture memory 150 receives select levels (all being a possibility) of MIP maps from the mass storage system 154 under the control of the computer 142 and the object pager control 157 in the processor 146.
Essentially, as the object management processor 146 initiates activity on a particular object, the object pager control 157 determines the select set of levels (whole 55 or top 54) in the MIP map pyramid 52 (FIGURE 5) needed for texturing the object. In accordance with the selection, the object pager control 157 operates through the computer 142 to address and control (page) the desired select set of MIP levels for transfer from the mass storage system 154 to the texture memory 150.
As indicated above, in many instances, in view of the distance from the eye point to the object, only the top 54 (FIGURE 5) of the MIP pyramid need be paged into the active texture memory 150. From that location, the display processor 148 texture maps the object for storage pixel-by-pixel in a frame buffer 164 from which the display data is scanned for display by a display unit 166.
Considering the operation of the system of FIGURE 8 in somewhat greater detail, the real-time system computer 142 along with the object management processor 146 and the display processor 148 function as a pipeline to provide display signals to the frame buffer 164. The computer 142 controls the mass storage system 154 to selectively load and maintain the texture memory 150 as explained above. Additionally, the environmental memory 152 also is loaded and maintained to accommodate the development of a dynamic image with a moving eye point. The object management processor 146 receives control data, with the consequence that object or polygon data is supplied from the object management processor 146 to the display processor 148.
The accumulation and preliminary processing of object data to accomplish basic image data for the display processor 148 is well-known in the art.
Accordingly, the display processor 148 receives basic data for processing object pixels to be stored in the frame buffer 164. As explained in detail above, the display processor 148 utilizes selective texture map data stored in the active texture memory 150 to process individual pixels for the frame buffer 164. It is to be understood that the texture maps may be stored in a variety of configurations or memory organizations for fast access; however, in accordance herewith, select numbers of levels (select sets of all or less than all) are paged from the mass storage system 154 into the texture memory 150. The selectivity is based on the result of a texel/pixel comparison as will now be considered with respect to the block diagram of FIGURE 9.
Within the object management processor 146 (FIGURE 8), certain operations are performed, specifically, the object pager control 157 incorporates a level 4 texel store 202 (FIGURE 9). Generally, the store 202 receives signals from an object data store 208 that are representative of the size of a level 4 texel. That value is supplied from the store 202 to a comparator 204, also connected to receive an indication of pixel size from a store 206. Specifically, the store 206 provides signal indications representative of the perspective size of two pixels. Accordingly, in the disclosed embodiment, a level 4 texel (from store 202) is compared with the size of two perspective pixels (from store 206) to determine the select set of levels that will be paged into the texture memory 150 (FIGURE 8).
Recapitulating to some extent, the object management processor 146 (FIGURE 8) will always request the tops 54 (FIGURE 5) of the MIP pyramids 52 that are encountered during a pager traversal. The request for the whole map pyramid 55 depends on the proximity of the eye point 0 (FIGURE 1) to the polygon 24. The object management processor 146 fetches the whole texture map when the texel size at level 4 (FIGURE 5) is larger than the size of two pixels on the screen 28 (FIGURE 1). FIGURE 10 illustrates a case in which two pixels are the same perspective size as a level 4 texel. A view triangle 220 extends from the eye point 0 through the screen 28 to an arrow D representing the size of a level 4 texel. The dimension of a shorter arrow A indicates the size of two pixels located a distance B from the eye point 0. The distance R reflects a measure from the eye point 0 to the level 4 texel.
FIGURE 10 illustrates a ratio test that is true if the perspective size of a texel is equal to or greater than the perspective size of two pixels, i.e., D/R is equal to or greater than A/B. The equation can be modified slightly to simplify the operation of the management processor 146 (FIGURE 8). Specifically, by squaring all terms, the object management processor can do the perspective size comparison without the complications of calculating square roots. Accordingly, if R²A²/B² ≤ D² is true, then the object management processor 146 will command the full texture map pyramid.
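A compact sketch of this squared test follows, using the quantities of FIGURE 10 (A: the screen-space size of two pixels; B: the eye-to-screen distance; D: the world-space size of a level 4 texel; R: the range to the texel). Squaring spares the square root otherwise needed to obtain R from squared distance components, and the predicate below also multiplies through by B² to avoid the division; the function name is an assumption, not the patent's.

```c
#include <stdbool.h>

/* True when the whole MIP pyramid should be paged: the perspective size
   of a level 4 texel (D/R) is at least that of two pixels (A/B).
   Equivalent to R^2 * A^2 / B^2 <= D^2, cross-multiplied by B^2 so the
   squared range r_sq can be used directly, with no root or division. */
static bool page_whole_pyramid(double r_sq,          /* R^2 */
                               double two_pixels,    /* A   */
                               double eye_to_screen, /* B   */
                               double texel_size)    /* D   */
{
    return r_sq * two_pixels * two_pixels
         <= texel_size * texel_size * eye_to_screen * eye_to_screen;
}
```

In operation, the pager would always request the top 54 and, whenever this predicate holds for an object, additionally request the whole 55, matching the behaviour described for the object pager control 157.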
The test or comparison operations as set forth above may be executed by a structure as represented in FIGURE 9, either in the basic form or in the squared configuration. Accordingly, a relatively simple comparison test is performed for the texture processing of each object by the comparator 204 utilizing the values as developed in the stores 202 and 206. However, a multitude of other options and variations are available without departing from the spirit of the present development. In that regard, the top may define various numbers of levels, the comparison may be variously implemented and a variety of interpolation techniques might be employed.
Accordingly, although certain detailed structures and processes have been disclosed, the appropriate scope hereof is deemed to be in accordance with the claims as set forth below.

Claims (16)

Claims:
1. A computer graphics process for producing dynamic images with textured features comprising the steps of: storing graphics image data including texture map data defined in a plurality of levels (for example, in relation to definition); paging select sets of levels from said texture map data for processing said graphics image data; processing said graphics image data to provide image display signals representative of said dynamic images processed from said graphics image data and including mapping said texture map data to texture features of said images with said select set of levels; and displaying images in accordance with said image display signals.
2. A process according to Claim 1 wherein said select sets of levels are paged based on the distance to a feature as depicted.
3. A process according to Claim 2 wherein graphics image data is processed by pixels and wherein said select sets are paged in accordance with a comparison of a ratio of pixel size and eye point to screen distance with a ratio of texel size to the range to the texel.
4. A computer graphics system for producing dynamic images with textured features as viewed from a moving eye point comprising: data storage for graphics image data including texture map data defined in a plurality of levels (for example, in relation to definition); an image generator to provide image display signals representative of said dynamic images processed from said graphics image data and including texture mapping structure with an active memory, said texture mapping structure processing said texture map data to texture features as represented by said display signals; a data paging structure for selectively paging select sets of said plurality of texture map levels into said active memory for texturing features; and a display unit coupled to receive said display signals from said image generator to produce dynamic images therefrom.
5. A system according to Claim 4 wherein said data paging structure includes means for indicating the distance from said eye point to a surface for texture mapping and means for selecting a select set of said texture map levels in accordance with said distance.
6. A system according to Claim 4 or Claim 5 wherein said image generator processes said graphics image data, mapping said texture map data as texture elements.
7. A system according to Claim 6 wherein said means for indicating distance includes means for determining size relationships between texture elements of a predetermined map level and screen pixels to be processed by said image generator.
8. A system according to any one of Claims 4 to 7 wherein said data paging structure selectively pages one of two select sets of texture map levels into said active memory determined by the distance from said eye point to a picture element in process.
9. A system according to Claim 8 wherein said two select sets of texture map levels consist of the entire map and select top levels.
10. A system according to Claim 9 wherein said top levels consist of the five least detailed levels.
11. A system according to any one of Claims 4 to 10 for texture mapping image features and wherein said data paging structure selectively pages a single select set of texture map levels into said active memory for processing each feature.
12. A system according to any one of Claims 4 to 11 wherein said image generator interpolates between map levels to process said texture map data.
13. A system according to any one of Claims 4 to 12 wherein said image generator processes data and said data paging structure compares a ratio of pixel size and eye point to screen distance with a ratio of texel size to the range to the texel.
14. A system according to Claim 13 wherein values of said ratios are squared.
15. A computer graphics process substantially as hereinbefore described with reference to the accompanying drawings.
16. A computer graphics system substantially as hereinbefore described with reference to the accompanying drawings.
GB9506722A 1994-04-01 1995-03-31 Computer graphics Withdrawn GB2288304A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US22183294A 1994-04-01 1994-04-01

Publications (2)

Publication Number Publication Date
GB9506722D0 (en) 1995-05-24
GB2288304A 1995-10-11

Family

ID=22829589

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9506722A Withdrawn GB2288304A (en) 1994-04-01 1995-03-31 Computer graphics

Country Status (2)

Country Link
CA (1) CA2144914A1 (en)
GB (1) GB2288304A (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2104759A (en) * 1981-05-22 1983-03-09 Marconi Co Ltd Apparatus for storing video data
GB2171579A (en) * 1985-02-20 1986-08-28 Singer Link Miles Ltd Apparatus for generating a visual display
WO1988002156A2 (en) * 1986-09-11 1988-03-24 Hughes Aircraft Company Digital simulation system for generating realistic scenes
GB2240015A (en) * 1990-01-15 1991-07-17 Philips Electronic Associated Texture memory addressing
GB2240016A (en) * 1990-01-15 1991-07-17 Philips Electronic Associated Texture memories store data at alternating levels of resolution

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Lance Williams, "Pyramidal Parametrics", Computer Graphics, Vol. 17, No. 3 (Proc. SIGGRAPH 1983), pp. 1-11 *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6330000B1 (en) 1995-01-31 2001-12-11 Imagination Technologies Limited Method and apparatus for performing arithmetic division with a machine
US6313846B1 (en) 1995-01-31 2001-11-06 Imagination Technologies Limited Texturing and shading of 3-D images
EP0810553A2 (en) * 1996-05-17 1997-12-03 Seiko Epson Corporation Texture mapping apparatus
EP0810553A3 (en) * 1996-05-17 1999-09-01 Seiko Epson Corporation Texture mapping apparatus
US5781197A (en) * 1996-07-26 1998-07-14 Hewlett-Packard Company Method for maintaining contiguous texture memory for cache coherency
GB2315968B (en) * 1996-07-26 2000-11-08 Hewlett Packard Co Method for maintaining contiguous texture memory for cache coherency
US5917497A (en) * 1996-07-26 1999-06-29 Hewlett-Packard Company Method for maintaining contiguous texture memory for cache coherency
GB2315968A (en) * 1996-07-26 1998-02-11 Hewlett Packard Co Method for maintaining contiguous texture memory for cache coherency
US5973701A (en) * 1996-09-30 1999-10-26 Cirrus Logic, Inc. Dynamic switching of texture mip-maps based on pixel depth value
WO1998014905A3 (en) * 1996-09-30 1998-07-16 Cirrus Logic Inc Dynamic switching of texture mip-maps based on depth
WO1998014905A2 (en) * 1996-09-30 1998-04-09 Cirrus Logic, Inc. Dynamic switching of texture mip-maps based on depth
WO1998022911A1 (en) * 1996-11-21 1998-05-28 Philips Electronics N.V. Method and apparatus for generating a computer graphics image
US6097397A (en) * 1997-11-20 2000-08-01 Real 3D, Inc. Anisotropic texture mapping using silhouette/footprint analysis in a computer image generation system
GB2343599A (en) * 1998-11-06 2000-05-10 Videologic Ltd Texturing systems for use in three-dimensional imaging systems
GB2343599B (en) * 1998-11-06 2003-05-14 Videologic Ltd Texturing systems for use in three dimensional imaging systems
US7116335B2 (en) 1998-11-06 2006-10-03 Imagination Technologies Limited Texturing systems for use in three-dimensional imaging systems
GB2383248A (en) * 2001-12-14 2003-06-18 Imagination Tech Ltd Texturing using blend buffer
GB2383248B (en) * 2001-12-14 2005-12-07 Imagination Tech Ltd 3-dimensional computer graphics system

Also Published As

Publication number Publication date
CA2144914A1 (en) 1995-10-02
GB9506722D0 (en) 1995-05-24


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)