GB2171579A - Apparatus for generating a visual display - Google Patents


Info

Publication number
GB2171579A
Authority
GB
United Kingdom
Prior art keywords
data
memory
processing means
pattern
plane
Prior art date
Legal status
Granted
Application number
GB08504406A
Other versions
GB2171579B (en)
GB8504406D0 (en)
Inventor
Dennis Alan Cowdrey
Roger Graham Loveless
Current Assignee
Link Miles Ltd
Original Assignee
Link Miles Ltd
Priority date
Filing date
Publication date
Application filed by Link Miles Ltd filed Critical Link Miles Ltd
Priority to GB08504406A
Publication of GB8504406D0
Publication of GB2171579A
Application granted
Publication of GB2171579B
Status: Expired


Classifications

    • G — PHYSICS
    • G09 — EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B — EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 — Simulators for teaching or training purposes
    • G09B9/02 — Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/08 — Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
    • G09B9/30 — Simulation of view from aircraft
    • G09B9/301 — Simulation of view from aircraft by computer-processed or -generated image
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 — 3D [Three Dimensional] image rendering
    • G06T15/50 — Lighting effects
    • G06T15/80 — Shading

Abstract

A flight simulator has a flight simulation computer 1 to which a computer generated image apparatus is connected to provide a visual display on a CRT display 13 for a trainee pilot. Flight position and attitude data generated by the simulation computer 1 is supplied to the CGI general purpose computer 2. Digital data for surface vertices are stored in a database memory 5 together with codes for overall intensity, texture pattern and surface plane. A surface transformation processor 6 generates a perspective transformation of viewable surface vertices and supplies raster line surface start and end points to a surface frame buffer memory 7. A line processor 8 writes corresponding pixel data into a line buffer memory 9. The intensity data is modulated in an intensity modulator unit 10 before D to A conversion for the CRT display 13. Position and attitude data is supplied by the general purpose computer 2 to a texture generator having an inverse perspective transformation matrix memory 16 for such data, and a pipelined inverse perspective transformation processor 17 which utilizes such data and screen pixel coordinates Xs and Ys to generate texture memory plane coordinates Xe and Ye and a level of detail signal LOD, applied to address a near texture memory 24 and a far texture memory 28, or a far texture memory 25 alone. Intensity modulation values are thus obtained and supplied to the intensity modulator unit 10 to modulate the pixel intensity data for pixels in textured surfaces.

Description

SPECIFICATION

Apparatus for Generating a Visual Display

This invention relates to apparatus for generating a visual display, and especially to computer generated image apparatus.
Various techniques and apparatus for computer generated images are discussed in "Computer Image Generation", edited by Bruce J. Schachter and published by John Wiley & Sons of New York in 1983, ISBN 0-471-87287-3.
One field in which computer image generation is being used is visual display systems for flight simulation systems. A typical flight simulation system has a replica of a pilot's seat and aircraft controls and instruments mounted on an enclosed platform which is moved to a limited extent to simulate changes in attitude of an aircraft in response to use of the controls by a trainee pilot in the seat and accelerations associated with changes of motion. A visual display is also mounted on the platform and presents the trainee pilot with a simulation of the view to be expected for the simulated flight, such flights being based on real possible journeys or flying conditions. The visual display may be presented by two or three display devices arranged to simulate aircraft windows through which the pilot would see the outside world.
United Kingdom patent specification GB 2,019,336 describes a visual display apparatus for a ground-based craft flight simulator comprising a raster scan type display device for viewing by a trainee pilot, and a synthetic image generator which supplies the display device with a signal representing an image of a sky, a horizon and a simulated cloud surface being overflown. This known apparatus has a digital store holding a single pattern cycle of a repetitive pattern which is used to create the cloud pattern effect in one dimension. In another United Kingdom patent specification, GB 2061074, an apparatus is described which is intended to be combined with a polygon-based raster scan image generator and produces a signal for creating a so-called "textured" appearance in a displayed ground plane.

A polygon-based raster scan image generator is a known type of computer image generator that, by utilising the simulated attitude and position data produced by a computer-controlled flight simulator, calculates the raster display coordinates of the boundaries of surfaces in the simulated perspective view of territory or cloud being flown over, and controls the intensity of the display so as to produce different assigned intensities, and usually colours, for the different surfaces.

The intensity of any particular surface simulated by a polygon-based raster scan image generator acting alone is uniform over that surface, and consequently lacks the visual clues provided in real situations by surface detail. A well known example of the visual clues given by surface detail is the information given by the pattern formed by bricks in a brick wall.
GB 2061074 describes a surface detail generator which is intended to be used to modulate the intensity signal output by a polygon-based raster scan image generator so as to create perspective transformed views of a single pattern lying in the displayed ground plane only.

Also, although GB 2061074 discusses a memory hierarchy in which the pattern is stored at a plurality of levels of detail, with an appropriate level of detail selected by using as a measure the sample spacing, computed in simulated ground distance, from the effective sample spacing along a raster line, GB 2061074 does not provide any technique for avoiding the banding effects which can result from the changeovers between successive levels of detail; and since a large plurality of levels of detail are stored, the amount of memory required to store the pattern at all these levels is relatively large.
Furthermore, GB 2061074 does not disclose any means of associating particular surface detail effects with particular surfaces defined by the output of the polygon-based raster scan image generator.
According to one aspect of the present invention there is provided apparatus for generating a visual display, comprising first memory means storing geometrical surface data and associated surface appearance data including surface intensity data, the geometrical surface data including coordinates for points characteristic of defined surfaces and plane identifying data associating each of one or more defined surfaces with a plane definable within the coordinate system of the said points, first processing means arranged to select from the data stored in the first memory means geometrical surface data and the associated surface appearance data for creating a display of a restricted two-dimensional perspective view of a region defined by observation point and attitude data supplied to the first processing means, the first processing means including input means for receiving variable observation point and attitude data, display means coupled to the first processing means for displaying the said restricted perspective view, second memory means storing surface detail data representing at least one planar pattern, the surface detail data comprising a plurality of intensity modulation values each stored at an address defined by a pair of pattern plane coordinates, second processing means coupled to the input means to receive observation point and attitude data and to means for generating and supplying thereto pairs of display coordinates corresponding to points in the visual display, the second processing means being such as to generate for each pair of display coordinates supplied thereto a corresponding pair of pattern plane coordinates substantially determined by an inverse perspective projection from the observation point to a plane defined in the coordinate system of the said points characteristic of defined surfaces, the first processing means including means for supplying the said plane identifying data to the second processing means and the second processing means 
being adapted to utilize a perspective transformation associated with the plane identifying data, addressing means coupling the second processing means to the second memory means so as to address the second memory means with the pairs of pattern plane coordinates, and intensity modulation means so coupling the second memory means to the display means as to cause the intensity data selected by the first processing means for a displayed point to be modulated by an intensity modulation value obtained from the second memory means for the same displayed point.
This first aspect of the invention enables surface detail to be applied to selected surfaces visible in the display.
According to another aspect of the invention there is provided apparatus for generating a visual display, comprising first memory means storing geometrical surface data and associated surface appearance data including surface intensity data and surface detail pattern identifying data, the geometrical surface data including coordinates for points characteristic of defined surfaces, first processing means arranged to select from the data stored in the first memory means geometrical data and the associated surface appearance data for creating a display of a restricted two-dimensional perspective view of a region defined by observation point and attitude data supplied to the first processing means, the first processing means including input means for receiving variable observation point and attitude data, display means coupled to the first processing means for displaying the said restricted perspective view, second memory means storing surface detail data representing a plurality of identifiable planar patterns, the surface detail data for each pattern comprising a plurality of intensity modulation values each stored at an address defined by a pair of pattern plane coordinates, second processing means coupled to the input means to receive observation point and attitude data and to means for generating and supplying thereto pairs of display coordinates corresponding to points in the visual display, the second processing means being such as to generate for each pair of display coordinates supplied thereto a corresponding pair of pattern plane coordinates substantially determined by an inverse perspective projection from the observation point to a plane defined in the coordinate system of the said points characteristic of defined surfaces, the first processing means including means for supplying the said surface detail pattern identifying data to the second memory means, addressing means coupling the second processing means to the second memory means so as to address the 
second memory means with the pairs of pattern plane coordinates, the second memory means being adapted to output the intensity modulation values associated with the said pairs of pattern plane coordinates in the pattern identified by the said surface detail pattern identifying data supplied thereto by the first processing means, and intensity modulation means so coupling the second memory means to the display means as to cause the intensity data selected by the first processing means for a displayed point to be modulated by an intensity modulation value obtained from the second memory means for the same displayed point.
According to a further aspect of the invention there is provided apparatus for generating a visual display, comprising first memory means storing geometrical surface data and associated surface appearance data including surface intensity data, the geometrical surface data including coordinates for points characteristic of defined surfaces, first processing means arranged to select from the data stored in the first memory means geometrical surface data and the associated surface appearance data for creating a display of a restricted two-dimensional perspective view of a region defined by observation point and attitude data supplied to the first processing means, the first processing means including input means for receiving variable observation point and attitude data, display means coupled to the first processing means for displaying the said restricted perspective view, second memory means storing surface detail data representing at least one planar pattern at a plurality of levels of detail, the surface detail data comprising a plurality of intensity modulation values each stored at an address defined by a pair of pattern plane coordinates, second processing means coupled to the input means to receive observation point and attitude data and to means for generating and supplying thereto pairs of display coordinates corresponding to points in the visual display, the second processing means being such as to generate for each pair of display coordinates supplied thereto a corresponding pair of pattern plane coordinates substantially determined by an inverse perspective projection from the observation point to a plane defined in the coordinate system of the said points characteristic of defined surfaces, and to generate a level of detail signal representative of a measure of minimum spacing in the pattern plane at which a difference in appearance can be represented by the display means, the second processing means supplying the level of detail signal to the second 
memory means, addressing means coupling the second processing means to the second memory means so as to address the second memory means with the pairs of pattern plane coordinates at one or more levels of detail determined by the level of detail signal, and intensity modulation means so coupling the second memory means to the display means as to cause the intensity data selected by the first processing means for a displayed point to be modulated by an intensity modulation value obtained from the second memory means for the same displayed point, the intensity modulation means including means for combining two or more intensity modulation values read out from two or more levels of detail in the second memory means in response to a respective pair of pattern plane coordinates so as to produce the said intensity modulation value which modulates the selected intensity data.

The addressing means may be arranged to address the second memory means at pairs of adjacent levels of detail and the intensity modulation means be such as to blend the values from the said adjacent levels so as to prevent banding effects at the display means. The addressing means may be such as to transform the pairs of pattern plane coordinates in response to predetermined values of the level of detail signal so as to use the said pair of pattern plane coordinates to address a predetermined restricted area of the pattern plane relative to a plurality of different addressing schemes, whereby for the predetermined values of the level of detail signal, the second memory means may store the respective planar pattern in the form of one or more sub-units of the pattern.
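The blending between adjacent levels of detail described above can be sketched as follows; the fractional representation of the level of detail signal and the `read_level` lookup function are illustrative assumptions, not details from the patent.

```python
def blended_modulation(lod, read_level):
    """Blend the intensity modulation values read from the two levels
    of detail bracketing a fractional level-of-detail signal, so that
    changeovers between levels do not produce visible banding.
    read_level(n) is a hypothetical lookup returning the modulation
    value stored at integer level n for the current coordinates."""
    low = int(lod)
    frac = lod - low
    a = read_level(low)       # value from the finer of the two levels
    b = read_level(low + 1)   # value from the adjacent coarser level
    return a * (1.0 - frac) + b * frac
```

At an integer value of the signal the result equals the value from that single level, so the blend degrades gracefully to plain single-level lookup.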
The second processing means may include means for generating successive pairs of pattern plane coordinates by interpolation. Similarly, successive values of the level of detail signal may be generated by interpolation.
The said input means of the first processing means may include a general purpose computer adapted to supply perspective transformation data for use by the first processing means and inverse perspective transformation data for use by the second processing means. The second processing means may include transformation data memory means for storing inverse transformation data supplied thereto by the general purpose computer and be adapted to select such stored data in dependence upon plane identifying data supplied thereto by the first processing means.
The invention will now be described in more detail, by way of example, with reference to the accompanying drawings, in which:

Figure 1 is a block schematic diagram of an apparatus embodying the invention, and

Figure 2 is a block diagram of circuitry for implementing part of the embodiment of Figure 1.
In Figure 1 of the drawings there is shown, in schematic block diagram form, a visual display processing system for a flight simulator having a flight simulation computer 1 which serves as the host computer in relation to the visual display processing system.
The flight simulation computer 1 controls an enclosed platform (not shown) on which a trainee pilot is seated. The trainee pilot operates a replica of the controls of a particular aircraft and the computer 1, in response to input signals developed by the replica controls, moves the platform in a manner which to some extent produces the sensation of changes in attitude and movement of the simulated aircraft. The simulation computer 1 also generates data signals representative of the longitude, latitude, altitude and attitude of the aircraft corresponding to the position and attitude of the aircraft which would have resulted from the trainee pilot's operation of the replica controls. These position and attitude signals are supplied to a general purpose computer 2 which is part of the visual display processing system.
The trainee pilot is presented with a visual display of the view associated with the current position and attitude on the screen of a cathode ray tube 13. The display is generated by a raster scan, and for each frame of the display, the general purpose computer 2 generates geometrical transformation matrix data signals 3 which it supplies to a surface transformation processor 6, and geometrical inverse transformation matrix data signals 4 which it supplies to a transformation matrix memory 16.
A typical simulated flight is a flight from one airport to another. To provide a visual display corresponding to such a flight, the screen of the cathode ray tube 13 must show the pilot's view from take off at the first airport to landing at the second airport. The visual display processing system must be able to show correctly the view determined by any reasonably possible flight path. To limit the requirements, it is typically arranged that the pilot can be presented with a view of the ground features and sky within a 100 mile radius of a reference point, known as the runway threshold, of each airport, the view between the airports outside these limits being restricted to flight above a cloud level, so that only sky and cloud features are then displayed.

The geometric data for creating displays of the ground features, sky and cloud are held in a database memory 5 in the form of lists of coordinates for surface vertices, these coordinates being referred to as earth coordinates and being given in three dimensions Xe, Ye and Ze to allow for the display of buildings, other noticeably three dimensional features, and a cloud base. The geometric data for overflown cloud features is also held in the database memory in a similar form. The database memory 5 also holds data assigning a value for the intensity of each surface defined by a set of vertices, and texture data in the form of two codes for each defined surface: the pattern number, which indicates the texture effect to be displayed for the defined surface, and the texture plane number, which identifies the type of processing required in texture generation.
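The database organisation just described can be sketched as a simple record structure; the field names and the example values below are illustrative, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SurfaceRecord:
    """One entry in the database memory 5 (illustrative field names).
    Vertices are earth coordinates (Xe, Ye, Ze); intensity is the
    uniform value assigned to the surface; pattern_number selects the
    texture effect and plane_number the texture-generation processing."""
    vertices: List[Tuple[float, float, float]]
    intensity: int
    pattern_number: int
    plane_number: int

# A hypothetical ground-plane surface near the runway threshold origin:
runway = SurfaceRecord(
    vertices=[(0.0, 0.0, 0.0), (150.0, 0.0, 0.0),
              (150.0, 8000.0, 0.0), (0.0, 8000.0, 0.0)],
    intensity=180,
    pattern_number=3,
    plane_number=0,   # assumed code for the ground plane, Ze = 0
)
```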
The earth coordinates for each airport have as their origin the respective runway threshold. The general purpose computer 2 is therefore arranged to supply to the surface transformation processor 6 a signal indicating the currently applicable earth coordinates origin, together with the transformation matrix data signals which the processor 6 uses to transform the earth coordinates of each surface vertex read out from the memory 5 within the earth coordinate system addressed into pilot coordinates Xp, Yp and Zp, the pilot coordinate system having the currently computed position of the pilot's eye as its origin and an angular relation to the earth coordinate system determined by the currently computed attitude, i.e. roll, pitch and yaw, of the aircraft.

The complete list of sets of stored earth coordinates having as origin the runway threshold indicated by the general purpose computer 2 is transformed by the processor 6 into pilot coordinates. To select from the transformed coordinates those which will be used to determine the view displayed by the cathode ray tube 13, constants determined by the position of the pilot in the aircraft, the area of the window defined by the cathode ray tube 13 and the distance of the pilot's eye from the screen of the tube 13, are used by the processor 6 to reject surface vertices represented by pilot coordinates not falling within the square section pyramid defined by the pilot's eye and the window. The pilot coordinates of the remaining surface vertices are then transformed by the processor 6 into intermediate screen coordinates which have as their origin the centre point of the screen of the cathode ray tube 13.
The pilot coordinates, termed Xp, Yp and Zp, have the Zp axis defined as the line from the pilot's eye through the centre of the window and perpendicular to the plane of the window, the Xp coordinate being parallel to the width of the window, and the Yp coordinate being parallel to the height of the window. The screen coordinates, termed Xs and Ys, are respectively parallel to the width of the window and parallel to the height of the window, both lying in the plane of the window. The window lies in the plane of the cathode ray tube screen and has the centre of the screen as its centre point. However, to compensate for a delay in the part of the processing system that produces texture signals, the height of the window is chosen to be greater than the height of the screen, the raster lines being scanned vertically.
The matrix data signals 3 generated by the computer 2 include representations of the aircraft position Xo, Yo and Zo in the applicable earth coordinate system, and a rotation matrix [H] which defines the attitude of the aircraft. The processor 6 calculates pilot coordinates Xp, Yp, Zp for each set of earth coordinates Xe, Ye, Ze as follows:
h11 = cos φ cos ψ + sin φ sin θ sin ψ
h12 = -cos φ sin ψ + sin φ sin θ cos ψ
h13 = -sin φ cos θ
h21 = cos θ sin ψ
h22 = cos θ cos ψ
h23 = sin θ
h31 = sin φ cos ψ - cos φ sin θ sin ψ
h32 = -sin φ sin ψ - cos φ sin θ cos ψ
h33 = cos φ cos θ

where φ is the angle of roll, positive for right wing down, θ is the angle of pitch, positive for nose up, and ψ is the angle of yaw, positive for rightwards.
For φ = θ = ψ = 0, the Xp, Zp and Yp axes are aligned respectively with the Xe, Ye and Ze axes, so that Yp then represents the vertical direction.
To obtain final screen coordinates Xs and Ys, the processor 6 calculates

Xs = (D Xp / Zp)(NSL / W) + NSL/2

and

Ys = (D Yp / Zp)(NPE / H) + NPE/2

where D is the distance from the origin of the pilot coordinates to the screen along the Zp axis, NSL is the number of scan lines across the width W of the window, and NPE is the number of pixels along a raster line equal in length to the height H of the window.
The offset quantities NSL/2 and NPE/2 are included so that the screen coordinates Xs, Ys are generated having the bottom left hand corner of the window, as viewed by the pilot, as origin.
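The earth-to-screen stage of the pipeline can be sketched as below. The application of [H] to the position offset, and the reading out of the rotated vector in (Xp, Zp, Yp) order so that zero angles align Xp, Zp, Yp with Xe, Ye, Ze, are assumptions inferred from the text; all function names are illustrative.

```python
import numpy as np

def rotation_matrix(phi, theta, psi):
    """Rotation matrix [H] built from the h11..h33 terms in the text.
    phi = roll (positive right wing down), theta = pitch (positive
    nose up), psi = yaw (positive rightwards), all in radians."""
    cf, sf = np.cos(phi), np.sin(phi)
    ct, st = np.cos(theta), np.sin(theta)
    cp, sp = np.cos(psi), np.sin(psi)
    return np.array([
        [cf * cp + sf * st * sp, -cf * sp + sf * st * cp, -sf * ct],
        [ct * sp,                 ct * cp,                  st],
        [sf * cp - cf * st * sp, -sf * sp - cf * st * cp,  cf * ct],
    ])

def earth_to_screen(pe, po, H, D, NSL, NPE, W, Hw):
    """Project one earth-coordinate vertex pe = (Xe, Ye, Ze) to screen
    coordinates, given aircraft position po = (Xo, Yo, Zo), window
    width W and height Hw.  The rotated offset is unpacked as
    (Xp, Zp, Yp), matching the stated zero-angle axis alignment."""
    xp, zp, yp = H @ (np.asarray(pe, float) - np.asarray(po, float))
    xs = (D * xp / zp) * (NSL / W) + NSL / 2   # Xs formula from the text
    ys = (D * yp / zp) * (NPE / Hw) + NPE / 2  # Ys formula from the text
    return xs, ys
```

With all angles zero the matrix reduces to the identity, and a vertex straight ahead of the eye maps to the centre of the window, (NSL/2, NPE/2).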
Since the simulated flight over three dimensional objects requires the display to show the varying appearance of, for example, buildings standing one behind the other as the aircraft flies past, the database memory 5 holds its lists of geometrical data in an order and format which enables the processor 6 to reject those surfaces which are invisible from the current position of the pilot. This facility, referred to as attributing priorities to surfaces, and its method of implementation are known to those skilled in the art.
From the sets of screen coordinates for surface vertices, the processor 6 generates a list of surface start and surface end points for each raster line. This list, together with the associated surface intensity and texture data, is written into one half of a frame buffer memory 7 for surface data. The other half of the frame buffer memory 7 contains the surface data for the preceding frame, which is read by a line processor 8. The processor 8 reads the surface data one raster line at a time and writes intensity and texture information for each pixel into part of a line buffer memory 9.

The writing operation by the line processor 8 is synchronised by pulses from a raster sync pulse generator 8a which also controls the timing of the raster line scans of the cathode ray tube 13, the synchronising pulses supplied to the processor 8 being generated at the pixel rate during each line. The line buffer memory 9 is capable of storing more than one line so that an intensity modulator unit 10 can read out a complete line of pixel intensity data while the processor 8 is writing another line into the buffer memory 9.
The intensity modulator unit 10 includes logic circuitry for multiplying pixel intensity values input from the line buffer memory 9 by modulating values input by the texture data generating part of the system. The modulated intensity values are supplied by the modulator unit 10 to a digital to analog converter unit 12 which supplies corresponding analog intensity signals to the cathode ray tube 13.
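The per-pixel multiply performed by the intensity modulator unit 10 can be sketched as below; the 8-bit fixed-point scaling is an assumption, since the patent does not state word lengths.

```python
def modulate_line(intensities, modulations):
    """Intensity modulator unit 10: multiply each pixel intensity from
    the line buffer memory 9 by the texture modulation value for the
    same pixel.  Modulation values are assumed to be unsigned 8-bit
    fixed-point fractions, so 256 represents unity gain."""
    return [(i * m) >> 8 for i, m in zip(intensities, modulations)]
```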
The database memory 5 may include colour data for each surface, and the cathode ray tube 13 may be capable of providing a colour display. Such colour data is stored in the memory 5 as a code for each surface, there being a finite number of distinguishable colours used, each being assigned a unique code. The colour data is passed through to the line buffer memory 9 with the surface intensity data but is not affected by the operation of the intensity modulator unit 10, through which the colour data passes to the digital to analog converter unit 12, where red, blue and green analog signals are generated with relative intensities determined by the colour data and a combined intensity determined by the modulated intensity data supplied by the intensity modulator unit 10.
The geometrical inverse transformation matrix data signals 4 supplied by the general purpose computer 2 of the visual display processing system to the transformation matrix memory 16 represent, in the present embodiment, two matrices, one for transforming screen coordinates into earth coordinates in the ground plane, where Ze=0, and the other for transforming screen coordinates into earth coordinates in the sky plane, where Ze=Zc, i.e., the height of the cloud base. Which of these two matrices is supplied to a pipelined inverse perspective transformation processor 17 is determined by a texture plane number signal 14 supplied to the matrix memory 16 by the line buffer memory 9. The database memory 5 includes sky plane geometric feature data associated with one texture plane number, the ground plane geometric feature data being associated with the other texture plane number. The transformation matrices required to convert the screen coordinates into earth coordinates are inverse perspective matrices.
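This screen-to-plane stage can be sketched as below. The 3×3 homogeneous form of the inverse mapping is an assumption; the patent states only that inverse perspective matrices are stored, one per texture plane.

```python
import numpy as np

def screen_to_plane(xs, ys, m_inv):
    """Inverse perspective mapping of one screen pixel (Xs, Ys) to
    plane coordinates (Xe, Ye), written here as a 3x3 homogeneous
    transform followed by a perspective divide."""
    v = m_inv @ np.array([xs, ys, 1.0])
    return v[0] / v[2], v[1] / v[2]

def texture_matrix(plane_number, matrices):
    """Selection performed via the texture plane number signal 14:
    the plane number read out of the line buffer memory picks which
    stored inverse matrix the processor 17 uses (e.g. ground plane
    Ze = 0 versus sky plane Ze = Zc)."""
    return matrices[plane_number]
```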
The screen coordinates are supplied to the processor 17 by a pixel counter 18 and a line counter 19.
These two counters are driven by synchronising pulses supplied by the raster sync pulse generator 8a. Each raster line displayed on the screen of the tube 13 contains, for example, 768 pixels. However, the height of the window is 1024 pixels in this case, and it is arranged that only the middle 768 pixels of data from the line buffer memory 9 are displayed on the screen. This allows sufficient time for the processing carried out by the inverse perspective transformation processor 17 and subsequent stages in the generation of texture data, up to the intensity modulator unit 10, to be carried out in relation to the respective pixel intensity value input from the line buffer memory 9 to the intensity modulator unit 10.

The first pixel to be displayed in any raster line is pixel number 128. Consequently, the value Ys=128 is supplied by the pixel counter 18 to the processor 17 at a time exactly sufficiently in advance of the output of the intensity data for pixel 128 by the line buffer memory 9 to ensure that the texture modulation applied to that intensity is the correct modulation for pixel number 128. This timing relationship is ensured by the signals supplied by the raster sync pulse generator 8a to the line processor 8 and the pixel counter 18. It is also arranged that the texture plane number signal 14 and the texture pattern number signal 15 read out from the line buffer memory 9 are those corresponding to pixels sufficiently in advance of the pixel whose intensity is currently being read by the intensity modulator unit 10. The texture plane number signal 14 and the texture pattern number signal 15 are representative of the texture plane numbers and texture pattern numbers read from the database memory 5 and passed on by the processor 6 via the memory 7 and processor 8 to the line buffer memory 9 without alteration.
In addition to storing inverse transformation matrices, the transformation matrix memory 16 also stores a set of values of a constant, R, there being one value for each texture plane number. The values of R are supplied to the memory 16 by the general purpose computer 2 and are calculated from

R = (Ps / D) Zo

where Ps is the screen dimension of a pixel, D is the perpendicular distance from the pilot's eye to the screen, and Zo is the perpendicular distance from the pilot's eye to the texture plane concerned, i.e., in the present example, to either the ground plane or the sky plane.
The use of the values of R will be explained hereinafter.
The pipelined inverse perspective transformation processor 17 calculates the earth coordinates Xe and Ye corresponding to the pair of screen coordinates Xs and Ys supplied thereto by the line counter 19 and the pixel counter 18 at every eighth pixel. These values of Xe and Ye are supplied to respective linear interpolators 21 and 22 which generate pairs of Xe and Ye coordinates at the pixel rate by interpolating values between those supplied by the processor 17.
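The interpolation scheme can be sketched as follows: exact coordinates arrive every eighth pixel, and the intervening pixels are filled in linearly, one such interpolator per coordinate axis.

```python
def interpolate_track(exact, stride=8):
    """Linear interpolators 21 and 22: the processor 17 supplies an
    exact coordinate at every `stride` pixels, and the intermediate
    pixel positions are filled in by linear interpolation at the
    pixel rate."""
    out = []
    for k in range(len(exact) - 1):
        a, b = exact[k], exact[k + 1]
        for i in range(stride):
            out.append(a + (b - a) * i / stride)
    out.append(exact[-1])
    return out
```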
In order to create a texture effect in the ground and sky planes, the intensity modulator unit 10 must modulate the pixel intensity data supplied from the line buffer memory 9 by varying the intensity values for pixels within each surface to be textured which lies within either the ground plane or the sky plane. To give a realistic appearance to the display, the present system takes into account the distance of the pilot from the part of the texture plane viewed, and the angle made between that part and the line of sight. Also, unrealistic regularities due to the need to use simple repetitive texturing patterns are diminished within the system.
The system includes therefore a texture memory, represented by the three blocks 24, 25 and 28 in Figure 1, which stores intensity modulation values arranged in two dimensional arrays which are referred to hereinafter as texture tiles. Each value in each such texture tile is addressable by a pair of Xe, Ye coordinate values. However, the notional area covered in the simulated ground or sky plane by a single texture tile is relatively small and must be used repeatedly to cover the larger areas of ground or sky plane displayed.
Accordingly, only a predetermined number of the least significant bits of the digital values of Xe and Ye are used in addressing any point in a texture tile.
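The effect of truncating the address in this way can be sketched as follows (an illustrative fragment only; the function name and the 512-element tile size are assumptions, not taken from the patent):

```python
def tile_element_address(xe: int, ye: int, tile_elements: int) -> tuple[int, int]:
    """Map earth coordinates (in tile-element units) onto a repeating
    square texture tile by keeping only the least significant address
    bits. tile_elements must be a power of two."""
    mask = tile_elements - 1          # e.g. 512 elements -> mask 0x1FF
    return (xe & mask, ye & mask)

# The same tile repeats across the whole ground plane:
assert tile_element_address(5, 7, 512) == tile_element_address(5 + 512, 7 + 3 * 512, 512)
```

Because only the low-order bits are kept, the tile wraps around seamlessly in both Xe and Ye, which is what allows a small tile to cover an arbitrarily large area of ground or sky.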
Each texture tile is in the form of a square area of square elements, the value of the intensity modulation held within any given element being constant. The side of each element represents a length Te in feet in the earth coordinate system. The texture memory holds tiles having only integer values of Te, those chosen being the powers of 2 from 2^0 to 2^10, so that the smallest value of Te is 1 foot, and the largest is 1024 feet.
In order to limit the storage hardware required for patterns with a small value of Te, i.e. 16 ft or less, only four basic sub-units of the overall tile are stored, each consisting of a square of tile elements with a side length equivalent in the earth coordinate system to 16 feet. The overall tile is chosen to have a side length of 512 feet so that 1024 sub-units are required. However, a satisfactory effect is achieved by filling the 1024 sub-unit positions by selecting, in a fixed pseudorandom sequence, from the 32 possible choices of sub-unit. The 32 choices arise from the four different orientations, 90° of rotation apart, of the four basic sub-units and their mirror images.
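The pseudorandom filling of the 1024 sub-unit positions from the 32 possible choices may be sketched as follows (illustrative only; the patent specifies a fixed sequence but not how it is generated, so a seeded generator stands in for it here):

```python
import random

N_SUBTILES = 4        # four basic sub-units
N_DISPOSITIONS = 8    # four 90-degree rotations and their mirror images
GRID = 32             # 512 ft tile / 16 ft sub-units -> 32 x 32 = 1024 positions

def build_subtile_map(seed: int = 0) -> list[list[int]]:
    """Fill the 1024 sub-unit positions with a fixed pseudorandom
    sequence of 5-bit codes: two bits select one of the 4 basic
    sub-units, three bits select one of the 8 dispositions."""
    rng = random.Random(seed)     # fixed seed -> the same pattern every time
    return [[rng.randrange(N_SUBTILES * N_DISPOSITIONS) for _ in range(GRID)]
            for _ in range(GRID)]

codes = build_subtile_map()
assert len(codes) == GRID and all(0 <= c < 32 for row in codes for c in row)
```

The fixed seed is what makes the sequence repeatable from frame to frame, as a PROM-held sequence would be in hardware.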
Texture tiles with values of Te of 32 feet and 64 feet are also 512 feet square. Texture tiles with Te of 128, 256, 512 and 1024 feet are all 2048 feet square.
In this embodiment, the texture memory stores eight different texture patterns, each pattern being held in eleven different levels of resolution, referred to hereinafter as levels of detail. The eleven different levels of detail are achieved by storing eleven texture tiles for each pattern, the tiles being those having element side lengths Te of 1, 2, 4, 8, 16, 32, 64, 128, 256, 512 and 1024 feet. Each level of detail is indicated by an LOD value calculated from LOD = log2 Te.
The texture tiles with values of LOD of 0, 1, 2, 3, 4, 5 and 6 are stored in a near texture memory block 24, and the texture tiles with values of LOD of 7, 8, 9 and 10 are stored in a far texture memory block 25. Texture tiles with values of LOD of 7 are also stored for each of the eight texture patterns in a far texture memory block 28. The LOD 7 tiles of memory block 28 are used to add large scale texture features to the smaller scale texture features generated from the LOD 0 to LOD 6 tiles of memory block 24, as will be explained hereinafter.
The eight different texture patterns may simulate, for example, runway surface, grass, rough water, smooth water, sand, concrete and two types of cloud surface.
The linear interpolators 21 and 22 generate a respective pair of earth coordinates Xe and Ye for each pair of screen coordinates Xs, Ys. The generated pair of earth coordinates thus corresponds to a pixel in the display screen. The three memory blocks 24, 25 and 28 receive the earth coordinates as address signals which specify a particular element in texture tiles selected by the application of the texture pattern number signal 15 and a level of detail signal indicating the value of LOD. The texture pattern number signal 15 selects the texture pattern, and the level of detail signal selects the two texture tiles for that pattern in the memory block 24 or 25 having the integer values of LOD adjacent to the value indicated by the level of detail signal.
To explain the use of the level of detail signal, a simple example will be described in which the view presented to the pilot's eye is of a horizontal ground plane uniformly textured, at ground level, up to the horizon. The texture pattern may be grass, for example. From the pilot's position, the appearance of the ground plane will be of texture in which the detail diminishes towards the horizon. This effect can be simulated by producing the texture modulation from tiles of increasing LOD value as the earth coordinates of the pixel concerned approach the horizon. To avoid a horizontal banding effect in the displayed texture, modulation values from a pair of adjacent LOD value tiles are blended in accordance with IB = IN(1 − F) + IN+1·F, where N and N+1 are the adjacent LOD values, IN and IN+1 are the respective intensity values from the two tiles, IB is the resultant blended intensity value, and F is a fraction determined by F = (2Pe/TeN) − 1,
where TeN and TeN+1 are the tile element side lengths in the tiles with LOD values N and N+1 respectively, and Pe is the length of one pixel as projected onto the ground plane. The distance from the pilot's eye to the centre of this projection is Zp/cos θ, where θ is the angle between the Zp axis and the line of sight. If the actual length of a pixel at the screen is Ps, the projection Pp perpendicular to the line of sight at the distance Zp/cos θ is given by Pp = Ps·Zp/D. If α is the angle between the line of sight and the ground plane, Pe = Pp/sin α. The relationship between θ and α is given by sin α = Zo·cos θ/Zp, where Zo is the vertical earth coordinate of the pilot's eye. Hence Pe = Ps·Zp²/(D·Zo·cos θ).
Cos θ ranges from 1 at the centre of the screen to 0.878 at the corner, so that its variation is small enough to allow cos θ to be treated as constant at 1. The equation used therefore to calculate Pe is Pe = Ps·Zp²/(D·|Zo|).
The level of detail signal supplied to the memory blocks 24 and 25 and to a blender 27 represents the value of log2 2Pe. The outputs, two signals representing IN and IN+1 from the block 24 or the block 25, are coupled through a switch 26 to the blender 27, which calculates F and produces a signal representing IB which it supplies to a mixer 31.
The value of log2 2Pe is calculated by a level of detail processor 20 for one pixel in every eight, to correspond to the values of Xe and Ye calculated by the processor 17, which supplies a value of Zp corresponding to each pair of earth coordinates Xe and Ye calculated.
Since log2 2Pe = 2 log2 Zp + log2 (Ps/D) − log2 |Zo| + log2 2, the matrix memory 16 supplies a signal of value R = log2 (Ps/D) − log2 |Zo| + log2 2 to the level of detail processor 20 for each frame to be displayed.
Hence the level of detail signal generated by the processor 20 is represented by log2 2Pe = R + 2 log2 Zp. Since the level of detail values of the texture tiles stored in the memory blocks 24, 25 and 28 are given by LOD = log2 Te, it will be seen that 2Pe represents a notional texture tile element side length.
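The level of detail calculation may be sketched as follows (a minimal illustration of log2 2Pe = R + 2 log2 Zp; the function and variable names are illustrative only):

```python
import math

def lod_signal(ps: float, d: float, zo: float, zp: float) -> float:
    """Level of detail signal log2(2*Pe), where Pe = Ps*Zp**2/(D*|Zo|)
    is the length of one pixel projected onto the texture plane.
    R = log2(2*Ps/(D*|Zo|)) is constant for one displayed frame."""
    r = math.log2(2.0 * ps / (d * abs(zo)))   # supplied per frame by the matrix memory
    return r + 2.0 * math.log2(zp)            # varies per pixel group with depth Zp

# Doubling the depth Zp quadruples the projected pixel length Pe,
# i.e. raises the level of detail signal by exactly 2:
assert abs(lod_signal(0.001, 2.0, 1000.0, 2000.0)
           - lod_signal(0.001, 2.0, 1000.0, 1000.0) - 2.0) < 1e-9
```

Working in the log2 domain turns the per-pixel squaring and division into additions, which is why R is precomputed once per frame.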
A linear interpolator 23 generates a value of log2 2Pe for each pair of Xe, Ye values applied to the memory blocks 24, 25 and 28.
Since the texture tiles selected for reading in the texture memory are addressed at the pixel rate, it will be seen that each addressed tile is sampled in the ground plane, in this example, at intervals of Pe. By selecting from memory block 24 or memory block 25 the texture tiles having LOD values equal to the integer part of log2 2Pe, say N, and equal to (N + 1), it is ensured that successive tile elements are each sampled at least once but not more than twice in succession during reading out of the tile with LOD value N, and successive tile elements are each sampled at least twice but not more than four times in succession during reading out of the tile with LOD value N + 1. These rates of sampling ensure a sufficiently rapid spatial modulation of the displayed ground surface to avoid chequer board effects and scintillation.
When the value of log2 2Pe is less than 7, the two texture tiles are selected from the memory block 24, and the switch 26 connects only the block 24 outputs to the blender 27. When the value of log2 2Pe is equal to or greater than 7, the two texture tiles are selected from the memory block 25, and the switch 26 connects only the block 25 outputs to the blender 27.
The blender 27 also receives the LOD signal indicating the value of log2 2Pe, and it calculates the value of F from the relationship F = 2^f − 1, where f is the fractional part of log2 2Pe, i.e. f = log2 2Pe − N, since 2Pe/TeN = F + 1, so that log2 (F + 1) = log2 2Pe − log2 TeN.
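The blending performed by the blender 27 may be sketched as follows (an illustration of IB = IN(1 − F) + IN+1·F with F = 2^f − 1; the names are illustrative only):

```python
import math

def blend(lod: float, i_n: float, i_n1: float) -> float:
    """Blend the modulation values from the tiles with the adjacent
    integer LOD values N and N+1; lod is the signal log2(2*Pe)."""
    f = lod - math.floor(lod)     # fractional part of log2(2*Pe)
    big_f = 2.0 ** f - 1.0        # F rises smoothly from 0 to 1 over one level
    return i_n * (1.0 - big_f) + i_n1 * big_f

assert blend(5.0, 0.8, 0.2) == 0.8                   # exactly at level N: tile N only
assert abs(blend(5.999999, 0.8, 0.2) - 0.2) < 1e-4   # approaching level N+1
```

Because F varies continuously with log2 2Pe, the transition between adjacent levels of detail produces no visible horizontal bands.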
If the value of log2 2Pe is between 6 and 7, then the intensity modulation values read from the tile with LOD value 6 are blended with zero intensity modulation signals. Similarly, if the value of log2 2Pe is above 10, the modulation values are read from the tile with LOD value 10 and are blended with zero intensity modulation signals. This ensures that the texture effects fade out towards LOD values 7 and 11.
When the value of log2 2Pe is below 7, the LOD value 7 tile for the chosen pattern is read from the memory block 28. The LOD value 7 tile is therefore oversampled, and a smoothing interpolation process is carried out by a two dimensional interpolator 29 operating on three adjacent tile element values read out from the memory block 28.
For each pair of values of Xe and Ye applied to the block 28, three values are read out as follows.
The tile elements for LOD value 7 have Te = 128 feet.
Let Xe/128 = X + ΔX + ½ and Ye/128 = Y + ΔY + ½. Then (X + ½, Y + ½) are the coordinates, in whole tile elements, of the centre of the bottom left hand one of four tile elements forming a square containing the point (Xe, Ye), and (ΔX, ΔY) are the coordinates of that point relative to that centre. The coordinates, in tile elements, of the centres of the other three elements are (X + ½, Y + 1 + ½), (X + 1 + ½, Y + ½), and (X + 1 + ½, Y + 1 + ½). If (ΔX + ΔY) < 1, then the point (Xe, Ye) lies in the lower triangle defined by the centres (X + ½, Y + ½), (X + ½, Y + 1 + ½) and (X + 1 + ½, Y + ½). If (ΔX + ΔY) ≥ 1, then (Xe, Ye) lies in the upper triangle defined by the centres (X + ½, Y + 1 + ½), (X + 1 + ½, Y + ½) and (X + 1 + ½, Y + 1 + ½).
Let the intensity modulation values at (X + ½, Y + ½), (X + ½, Y + 1 + ½), (X + 1 + ½, Y + ½) and (X + 1 + ½, Y + 1 + ½) be respectively A, B, C and D. Then, if (ΔX + ΔY) < 1, the three values read out are A, B and C, and if (ΔX + ΔY) ≥ 1, the three values read out are B, C and D. For the lower triangle the interpolated modulation value is calculated as IL = A + ΔX(C − A) + ΔY(B − A), and for the upper triangle the interpolated modulation value is calculated as IU = D + (1 − ΔX)(B − D) + (1 − ΔY)(C − D). The interpolator 29 implements the equations IL = A(1 − ΔX − ΔY) + B·ΔY + C·ΔX and IU = D[1 − (1 − ΔX) − (1 − ΔY)] + B(1 − ΔX) + C(1 − ΔY). X, Y, ΔX and ΔY are calculated from X + ΔX = (Xe − 64)/128 and Y + ΔY = (Ye − 64)/128, where X and Y are integers, and ΔX and ΔY are each less than 1.
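The triangular interpolation implemented by the interpolator 29 may be sketched as follows (a software illustration of the equations above; `sample(i, j)` stands in for a read of the stored LOD 7 tile element (i, j)):

```python
import math

def far_texture_interpolate(xe: float, ye: float, sample) -> float:
    """Interpolate between three of the four LOD 7 tile element
    centres (128 ft elements) surrounding the point (Xe, Ye)."""
    x_frac = (xe - 64.0) / 128.0          # X + dX
    y_frac = (ye - 64.0) / 128.0          # Y + dY
    x, y = math.floor(x_frac), math.floor(y_frac)
    dx, dy = x_frac - x, y_frac - y
    a = sample(x, y)                      # bottom left centre (A)
    b = sample(x, y + 1)                  # centre above (B)
    c = sample(x + 1, y)                  # centre to the right (C)
    if dx + dy < 1.0:                     # lower triangle: A, B, C
        return a * (1.0 - dx - dy) + b * dy + c * dx
    d = sample(x + 1, y + 1)              # upper triangle: B, C, D
    return d * (dx + dy - 1.0) + b * (1.0 - dx) + c * (1.0 - dy)

# A field that is constant over the tile is reproduced exactly:
assert far_texture_interpolate(100.0, 300.0, lambda i, j: 5.0) == 5.0
```

Each triangle uses barycentric weights that sum to one, so the interpolation reproduces any value field that varies linearly between element centres.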
A switch 30 connects the output of the interpolator 29 to a mixer 31 only when the memory block 24 is operative. The mixer 31 combines the blended output of the near texture memory block 24 with the interpolated output of the far texture memory block 28 in equal proportions.
When only the far texture memory block 25 is operative, the mixer 31 passes the blended output from the blender 27 to the switch 32 without change.
The switch 32 is closed for all values of the pattern number signal 15 except for one which indicates no texture. When closed, the switch 32 couples the output of the mixer 31 to the intensity modulation unit 10.
The pattern number signal indicating no texture is applied when a surface which does not lie in a texture plane, i.e. in the present example does not lie in the ground plane or the sky plane, is being produced in the display.
The operations of the processor 17, and the interpolators 21, 22 and 23, will now be described in more detail.
The processor 17 receives Xs, Ys values from the line counter 19 and the pixel counter 18, and, for every eighth pixel, produces a pair of Xe, Ye values and a Zp value.
The values of Xp and Yp corresponding to Xs, Ys are given by Xp = (W·Zp/(D·NSL))·(Xs − NSL/2) and Yp = (H·Zp/(D·NPE))·(Ys − NPE/2). The values of Xe, Ye and Ze are related to the values of Xp, Yp and Zp by equation (1), a rotation through the attitude matrix [H] about the observation point (Xo, Yo, Zo),
where [H][H]-1 = [I], the identity matrix.
However, [H] is orthogonal, so that [ H ] -1 is the transpose of [H].
Hence,
Zo is the altitude of the aircraft, and Ze is the height above ground level of the point (Xe, Ye, Ze). Since the texture tiles in the texture memory blocks 24, 25 and 28 are addressed by the values of Xe and Ye only, they are effectively in the ground plane. Consequently, the relationships used for applying texture to the displayed ground plane are obtained by putting Ze = 0 in equation (1). For the sky plane it is necessary to treat the actual ground plane level as the sky plane, and therefore Zo is replaced by (−|Zc| + |Zo|), where Zc is the cloud base height, and Ze is again put to zero in equation (1).
Planes parallel to the ground plane but situated between the aircraft and the ground plane can also be textured if Zo is replaced in equation (1) by Zo − ZB, where ZB is the height of the plane to be textured above the ground plane, Ze again being put to zero. However, since the present example uses interpolation to generate values of Xe, Ye and log2 2Pe, only the texturing of the ground and sky planes is carried out.
From equation (1), for the ground plane, it can be shown that Xe and Ye are given in terms of Xs and Ys by a matrix [U] of constants, where Xt = Xe − Xo and Yt = Ye − Yo.
The elements of [ U ] are constant for each frame to be displayed, and are supplied to the processor 17 from the matrix memory 16, except Xs and Ys which are supplied by the line and pixel counters 19 and 18.
The processor 17 first calculates the value of (1/Zp) from 1/Zp = U31·Xs + U32·Ys + U33, then the values of Xt and Yt, and finally the values of Xe and Ye, the values of Xo and Yo being supplied by the matrix memory 16.
The interpolated values of Xe generated by the interpolator 21 are calculated in accordance with Xn+i = Xn + i·(Xn+8 − Xn)/8 for i = 0 to 7, the interpolator 21 receiving Xn and Xn+8 as successive Xe values calculated by the processor 17.
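The interpolation rule may be sketched as follows (illustrative only; it simply generates the eight intermediate pixel values between two successive processor outputs):

```python
def interpolate_span(x_n: float, x_n8: float) -> list[float]:
    """Generate X(n+i) = X(n) + i*(X(n+8) - X(n))/8 for i = 0..7,
    the values for the eight pixels between successive outputs of
    the inverse perspective transformation processor."""
    step = (x_n8 - x_n) / 8.0
    return [x_n + i * step for i in range(8)]

assert interpolate_span(0.0, 8.0) == [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
```

The same routine stands for the interpolator 22 acting on Ye and the interpolator 23 acting on the level of detail signal.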
The interpolated values of Ye are generated in the same manner by the interpolator 22.
The use of the interpolators 21 and 22 reduces the computational load of the processor 17 by about 80% of the amount which would be required for calculating all values of Xe and Ye using the processor 17 alone.
For the sky plane, Zo is replaced by (−|Zc| + |Zo|) in the equation for Zp, and hence in U31, U32 and U33. Consequently the use of interpolation causes an error where the values of Xs and Ys change from defining a point in the ground plane to a point in the sky plane, or vice versa, at a point which does not coincide with the end of an interpolation interval. However, as such changes always take place at the horizon, the texture modulation is zero there, and the error cannot be seen. It is arranged that texture, in the present example, fades out within 8 pixels of the horizon. This type of error is the reason why only the ground and sky planes are textured when the interpolators 21 and 22 are used.
The processor 17 computes Xe, Ye for every eighth pixel along a raster line starting at Ys=T where T is the delay between the input of the processor 17 and the output of the mixer 31.
The transformation matrix memory 16 comprises an arrangement of RAMs. The elements of [U] are loaded into these RAMs by the general purpose computer 2 during a frame update sequence. There is one U matrix for each texture plane. The correct U matrix is selected by the texture plane number signal 14. A fresh value of R is also supplied to the memory 16 by the computer 2 during each frame update sequence.
The level of detail processor 20 generates the value of log2 2Pe as an output only for the range 0 ≤ log2 2Pe < 11. If log2 2Pe is calculated to be less than zero, then the output value generated is zero. If log2 2Pe is equal to or greater than 11, the output from the texture memory blocks 24, 25 and 28 is set to zero so that no modulation of the intensity values from the line buffer memory 9 results.
The linear interpolator 23 operates in the same way as the interpolators 21 and 22.
The three linear interpolators 21, 22 and 23 can be implemented as two circuit cards each containing two 16 bit interpolator circuits which are clocked at half the pixel rate, e.g. 20 Megahertz. Each 16 bit interpolator circuit generates simultaneous values for the odd and even pixels. The even pixel output is passed through an additional register clocked by the opposite phase pixel rate to provide one pixel of delay relative to the odd pixel output.
The memory blocks 24 and 25 are then also implemented as two circuit cards, one for even pixels and the other for odd pixels, with each card including memory for texture tiles of the eight patterns at all values of LOD from 0 to 10. Tile elements are stored as 5 bit logarithmic modulation values representing a modulation range of ½ to <2. Zero modulation has a value of 1 which, in logarithmic form, is zero.
The memory block 28 is, in this example, implemented as two circuit cards, one for even pixel output and the other for odd pixel output.
The outputs from the blender 27 and the interpolator 29 are representative of, respectively, log2 IB and log2 II, where IB is the blended intensity modulation value, and II is the interpolated intensity modulation value. The mixer 31 produces an output representative of log2 IT = ½(log2 IB + log2 II), where IT is the final intensity modulation value.
Fig. 2 shows an implementation of the memory blocks 24 and 25 with the switch 26 and blender 27. In the circuit of Fig. 2 the values of Xe and Ye are supplied to a latch circuit 33 which passes the five bits of Xe corresponding to values 2^4 to 2^8 feet and the five bits of Ye also corresponding to values 2^4 to 2^8 feet to a subtile decode circuit 34, which decodes the ten bits to one of 1024 addresses of subtile codes, the subtile codes at these addresses being a random sequence formed from thirty two codes each representing one of the thirty two possible subtile arrangements created by eight dispositions of four subtiles. Each of the thirty two subtile codes is represented by a five bit number in which the two most significant bits define the basic one of four subtiles and the three least significant bits define a respective one of the eight possible dispositions.
The three least significant bits from the subtile decode circuit 34 are supplied to an orientating circuit 35 which is also supplied with the four bits of Xe and the four bits of Ye corresponding to the values 2^3 feet down to 2^0 feet. In dependence upon the value of the three bit signal from the subtile decode circuit 34, the orientating circuit 35 transforms the low order bits of Xe and Ye into values for subtile coordinates TX and TY for addressing points within a subtile. The operation is summarised in the following table, in which X represents the low order Xe bits and Y represents the low order Ye bits.
Orientation code    TX    TY
0                   X     Y
1                   Y     X
2                   X     Y
3                   Y     X
4                   X     Y
5                   Y     X
6                   X     Y
7                   Y     X

The subtile decode circuit 34 may be a PROM, and the orientating circuit 35 may be a suitable logic gate and inverter circuit.
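The eight dispositions (four rotations and their mirror images) may be sketched as follows. The complementing shown here, which corresponds to the inverter part of circuit 35, is one consistent assignment chosen for illustration; the exact pairing of swaps and complements in the patent's table is not recoverable from the text as printed:

```python
def orient(code: int, x: int, y: int, bits: int = 4) -> tuple[int, int]:
    """Map the low order Xe, Ye bits (x, y) to subtile coordinates
    (TX, TY) for one of eight dispositions of a 2**bits square subtile."""
    inv = (1 << bits) - 1                    # bitwise complement mask
    table = [
        lambda: (x, y),                      # 0: identity
        lambda: (y, x ^ inv),                # 1: rotate 90 degrees
        lambda: (x ^ inv, y ^ inv),          # 2: rotate 180 degrees
        lambda: (y ^ inv, x),                # 3: rotate 270 degrees
        lambda: (y, x),                      # 4-7: the four mirror images
        lambda: (x, y ^ inv),
        lambda: (y ^ inv, x ^ inv),
        lambda: (x ^ inv, y),
    ]
    return table[code & 7]()

# Applying the 90 degree rotation four times returns to the start:
tx, ty = 3, 5
for _ in range(4):
    tx, ty = orient(1, tx, ty)
assert (tx, ty) == (3, 5)
```

Swapping coordinates and complementing them with an inverter is all that is needed in hardware, which is why circuit 35 can be a simple gate and inverter arrangement.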
The subtile code and subtile coordinates thus generated are used only for tiles at levels 0, 1, 2, 3 and 4.
The values of Xe and Ye are passed through by the latch 33 for use at levels 5 to 11.
The outputs of the subtile decode circuit 34 and the orientating circuit 35 are supplied to an address generator 36 together with the values of Xe and Ye, the LOD signal and the pattern number signal. The address generator 36 generates two thirteen bit addresses which it supplies respectively to a left RAM 37 and a right RAM 38. The three high order bits are obtained by decoding the pattern number signal and identify which one of the eight possible patterns is selected. All levels 0 to 10 are stored for each pattern in the RAMs 37 and 38, each 1K word block in each RAM containing at least part of the tile for each level of one pattern. One bit, TS0, from the subtile decode circuit 34, or one bit from Xe, is used to determine which RAM is selected for a particular level of detail, a pair of adjacent levels being selected for each value of the LOD signal with one of the levels being selected from the left RAM 37 and the other level from the right RAM 38.
The value of the integer part of the LOD signal is decoded to form part of the addresses supplied to the RAMs 37 and 38. For LOD values 0, 1, 2, 3 and 4, each of the RAMs 37 and 38 contains two subtiles. If the four basic subtiles are identified as 00, 01, 10 and 11 by the output bits TS1 and TS0 from the subtile decode circuit 34, then if for level 0 the subtiles 00 and 10 are held in the left RAM 37 and the subtiles 01 and 11 are held in the right RAM 38, the subtiles 01 and 11 for level 1 are held in RAM 37 and the subtiles 00 and 10 in RAM 38, the subtiles 00 and 10 are held in RAM 37 for level 2 and the subtiles 01 and 11 are held in RAM 38 for level 2, and so on up to level 4. This arrangement ensures that corresponding subtiles are selected from opposite RAMs for adjacent levels up to level 4. A table of address bits for the RAMs 37 and 38 for one pattern is shown below.
Blended Tile Array Address Map
Address       Contents                      RAM Address Bits                                     Left/Right
                                            9    8    7    6    5    4    3    2    1    0       RAM Select
0 to 511      4 near texture subtiles       0    TY0  TX0  TY1  TX1  TY2  TX2  TY3  TX3  TS1     TS0
              at level 0
512 to 639    4 near texture subtiles       1    0    0    TY1  TX1  TY2  TX2  TY3  TX3  TS1     TS0
              at level 1
640 to 671    4 near texture subtiles       1    0    1    0    0    TY2  TX2  TY3  TX3  TS1     TS0
              at level 2
672 to 679    4 near texture subtiles       1    0    1    0    1    0    0    TY3  TX3  TS1     TS0
              at level 3
680 to 683    4 near texture subtiles       1    0    1    0    1    0    1    0    TS0  TS1     TS0/X8*
              at level 4
684 to 685    far texture tile at level 10  1    0    1    0    1    0    1    1    0    Y10     X10
686 to 687    Zero                          1    0    1    0    1    0    1    1    1    -       -
688 to 695    far texture tile at level 9   1    0    1    0    1    1    0    Y9   X9   Y10     X10
696 to 703    Zero                          1    0    1    0    1    1    1    -    -    -       -
704 to 735    far texture tile at level 8   1    0    1    1    0    Y8   X8   Y9   X9   Y10     X10
736 to 767    near texture tile at level 6  1    0    1    1    1    Y6   X6   Y7   X7   Y8      X8
768 to 895    near texture tile at level 5  1    1    0    Y5   X5   Y6   X6   Y7   X7   Y8      X8
896 to 1023   far texture tile at level 7   1    1    1    Y7   X7   Y8   X8   Y9   X9   Y10     X10

* For selecting left/right at level 4 use X8 if level N and TS0 if level N+1. X0 to X10 have values of 2^0 ft to 2^10 ft respectively. Similarly for Y, TX and TY.
As an example, if the value of the LOD signal is 2.5, level 2 is selected in one RAM and level 3 is selected in the other RAM, i.e. bits 9 down to 5 are 1, 0, 1, 0, 0 in the address generated for one RAM, and bits 9 down to 3 are 1, 0, 1, 0, 1, 0, 0 in the address generated for the other RAM. Which RAM supplies the address for level 2 and which the address for level 3 is determined by the value of TS0. The value of TS1 determines which level 2 or level 3 subtile of the two in each RAM is to be addressed. The element within the subtile is addressed by the values of TY2, TX2, TY3 and TX3 in level 2, and by the values of TY3 and TX3 in level 3. TY0 to TY3 are the bits 0 to 3 of TY, and TX0 to TX3 are the bits 0 to 3 of TX. For levels 5 and above, one complete tile is stored in each RAM. Thus if a level 5 and a level 6 tile are to be selected, the level 5 tile will be addressed by 1, 1, 0, Y5, X5, Y6, X6, Y7, X7, Y8, and the level 6 tile will be addressed by 1, 0, 1, 1, 1, Y6, X6, Y7, X7, Y8. The value of X8 determines which RAM provides the level 5 tile and which the level 6 tile. For levels 7 to 10, the value of X10 or its complement determines which RAM is addressed for a particular level, the other RAM automatically being addressed for the adjacent higher level. For LOD values between 6 and 7 and between 10 and 11, one RAM is addressed with 1, 1, 1, Y7, X7, Y8, X8, Y9, X9, Y10 or 1, 0, 1, 0, 1, 0, 1, 1, 0, Y10, and the other RAM is addressed with 1, 0, 1, 0, 1, 1, 1, -, -, - or 1, 0, 1, 0, 1, 0, 1, 1, 1, -, both of which are addresses for zero modulation values.
The RAM selecting bit TS0, X8 or X10 is also supplied to two multiplexer circuits 39 and 40 which, in dependence upon the value of the RAM selecting bit, direct the output of the RAM providing the N level to a N level input of the blender 27, and the output of the RAM providing the N+1 level to an N+1 level input of the blender 27.
The bits of Xe and Ye used to address a subtile of level n are the bits with the values 2^3 feet down to 2^n feet, where n = 0, 1, 2 or 3. When a subtile of level 4 is to be addressed, there is only one element per subtile, so that only TS0 and TS1 are needed to specify the element completely.
When 7 ≤ LOD < 11, the RAMs 37 and 38 are addressed directly from Xe and Ye with the bits having the values 2^10 feet down to 2^7 feet.
It will be seen that the ten least significant bits of the RAM address word define the point selected within the tiles for one pattern.
The database memory 5 stores, for each surface defined therein, a control word of four bits which encodes the plane number and the pattern number. This control word is decoded at the line buffer memory 9 into the three bit pattern number code which forms the three higher bits of the RAM address, and a one bit plane number, the two planes in this example being the ground plane and the sky plane. One of the control word values is not used; one specifies a default for non-textured surfaces, one specifies a non-textured ground plane, and another specifies a non-textured sky. The remaining twelve specify textured ground or sky planes.
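The decode of the four bit control word may be sketched as follows (the bit layout shown, with the pattern number in the high three bits and the plane number in the low bit, is an assumption for illustration; the patent does not state which bits carry which field):

```python
def decode_control_word(word: int) -> tuple[int, int]:
    """Split a 4-bit surface control word into a 3-bit texture pattern
    number and a 1-bit texture plane number (0 = ground, 1 = sky).
    The bit layout here is assumed, not taken from the patent."""
    pattern = (word >> 1) & 0b111   # three bits -> one of eight patterns
    plane = word & 0b1              # one bit -> ground or sky plane
    return pattern, plane

assert decode_control_word(0b1011) == (0b101, 1)
```

A few of the sixteen possible word values would be reserved for the non-textured cases described above rather than decoded as pattern and plane numbers.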
To avoid interruption of the operation of the linear interpolators 21, 22 and 23 at boundaries between the ground or sky plane and non-textured surfaces lying in other planes, it is arranged that the control word for default for non-textured surfaces is associated with non-textured surfaces not lying in the ground or sky plane, and that the occurrence of this control word at the line buffer memory 9 disables the plane number of the non-ground plane, non-sky plane surface and preserves the plane number of the background, which is either the ground plane or the sky plane. The effect of this operation is not seen, since the switch 32 prevents the output of the mixer 31 from reaching the intensity modulation unit 10 except when a ground plane or sky plane surface is to be textured.
The memory block 28 includes three RAMs, each holding a copy of the tile array for level 7 of each pattern only. Three tile elements, one from each RAM, are addressed simultaneously. Two of these elements supply the values B and C, and the other supplies A or D. The address word is formed from the Xe and Ye bits with values 2^10 feet down to 2^7 feet. The fractional parts ΔX and ΔY are formed from the bits with values 2^6 feet down to 2^−1 feet. The values of (1 − ΔX) and (1 − ΔY) are generated approximately by a logic inversion. The interpolation function is carried out by using five PROMs.
The following table gives the sizes of the tile elements and numbers of elements for LOD levels 0 to 10.
Texture Tile Parameters
       Tile Element     Near Texture Tiles (512' x 512')     Far Texture Tiles (2048' x 2048')
LOD    Dimension FT     Size of Stored Array                 Size of Stored Array
                        Elements         FT                  Elements        FT

0      1                4 x 16 x 16      4 x 16 x 16         No far texture                   Blend with
1      2                4 x 8 x 8        "                                                    level N+1 and
2      4                4 x 4 x 4        "                                                    interpolate
3      8                4 x 2 x 2        "                                                    level 7
4      16               4 x 1 x 1        "
5      32               16 x 16          512 x 512
6      64               8 x 8 *         "
7      128              No near texture                      16 x 16         2048 x 2048      Blend with
8      256                                                   8 x 8           "                level N+1
9      512                                                   4 x 4           "
10     1024                                                  2 x 2 *         "

* At levels 6 and 10 blend with zero modulation value.
Although the example of visual display generating apparatus described hereinbefore with reference to the drawings is intended to be used in a flight simulation system, it will be apparent that similar apparatus within the scope of the invention may be adapted for use in a simulation system for a ground based vehicle such as a tank, or for use in a simulation system for a helicopter or an amphibious craft.

Claims (10)

1. Apparatus for generating a visual display, comprising first memory means storing geometrical surface data and associated surface appearance data including surface intensity data, the geometrical surface data including coordinates for points characteristic of defined surfaces, first processing means arranged to select from the data stored in the first memory means geometrical surface data and the associated surface appearance data for creating a display of a restricted two-dimensional perspective view of a region defined by observation point and attitude data supplied to the first processing means, the first processing means including input means for receiving variable observation point and attitude data, display means coupled to the first processing means for displaying the said restricted perspective view, second memory means storing surface detail data representing at least one planar pattern, the surface detail data comprising a plurality of intensity modulation values each stored at an address defined by a pair of pattern plane coordinates, second processing means coupled to the input means to receive observation point and attitude data and to means for generating and supplying thereto pairs of display coordinates corresponding to points in the visual display, the second processing means being such as to generate for each pair of display coordinates supplied thereto a corresponding pair of pattern plane coordinates substantially determined by an inverse perspective projection from the observation point to a plane defined in the coordinate system of the said points characteristic of defined surfaces, addressing means coupling the second processing means to the second memory means so as to address the second memory means with the pairs of pattern plane coordinates, and intensity modulation means so coupling the second memory means to the display means as to cause the intensity data selected by the first processing means for a displayed point to be modulated by 
an intensity modulation value obtained from the second memory means for the same displayed point.
2. Apparatus according to claim 1, wherein the geometrical surface data stored in the first memory means includes plane identifying data associating each of one or more defined surfaces with a plane definable within the coordinate system of the said points characteristic of defined surfaces, the first processing means including means for supplying the said plane identifying data to the second processing means and the second processing means being adapted to utilize a perspective transformation associated with the plane identifying data.
3. Apparatus according to claim 1 or 2, wherein the surface detail data stored in the second memory means represents a plurality of identifiable planar patterns, the surface appearance data stored in the first memory means includes surface detail pattern identifying data, the first processing means includes means for supplying the said surface detail pattern identifying data to the second memory means, and the second memory means is adapted to output the intensity modulation values associated with the said pairs of pattern plane coordinates in the pattern identified by the said surface detail pattern identifying data supplied thereto by the first processing means.
4. Apparatus according to any preceding claim, wherein the surface detail data stored in the second memory means represents at least one planar pattern at a plurality of levels of detail, the second processing means is such as to generate a level of detail signal representative of a measure of minimum spacing in the pattern plane at which a difference in appearance can be represented by the display means, and to supply the level of detail signal to the second memory means, the addressing means is such as to address the second memory means with the pairs of pattern plane coordinates at one or more levels of detail determined by the level of detail signal, and the intensity modulation means includes means for combining two or more intensity modulation values read out from two or more levels of detail in the second memory means in response to a respective pair of pattern plane coordinates so as to produce the said intensity modulation value which modulates the selected intensity data.
5. Apparatus according to claim 4, wherein the addressing means is arranged to address the second memory means at pairs of adjacent levels of detail and the intensity modulation means is such as to blend the values from the said adjacent levels so as to prevent banding effects at the display means.
6. Apparatus according to claim 4 or 5, wherein the addressing means is such as to transform the pairs of pattern plane coordinates in response to predetermined values of the level of detail signal so as to use the said pair of pattern plane coordinates to address a predetermined restricted area of the pattern plane relative to a plurality of different addressing schemes, and for the predetermined values of the level of detail signal the second memory means stores the respective planar pattern in the form of one or more sub-units of the pattern.
7. Apparatus according to any preceding claim, wherein the second processing means includes means for generating successive pairs of pattern plane coordinates by interpolation.
8. Apparatus according to any one of claims 4, 5 and 6, wherein the second processing means includes means for generating successive values of the level of detail signal by interpolation.
9. Apparatus according to any preceding claim, wherein the first processing means includes a general purpose computer adapted to supply perspective transformation data for use by the first processing means and inverse perspective transformation data for use by the second processing means.
10. Apparatus according to claims 2 and 9, wherein the second processing means includes transformation data memory means for storing inverse transformation data supplied thereto by the general purpose computer for a plurality of planes and is adapted to select such stored data in dependence upon plane identifying data supplied thereto by the first processing means.
11. Apparatus for generating a visual display, substantially as described hereinbefore with reference to the accompanying drawings.
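Claims 4 and 5 describe storing a planar pattern at several levels of detail and blending the intensity modulation values read from two adjacent levels so that transitions between levels do not produce banding — essentially what later practice calls mipmapping with inter-level blending. A minimal sketch in Python (the function names, the 2x2 box-filter pyramid construction, and the coordinate-wrapping convention are illustrative assumptions, not details taken from the patent):

```python
import math

def build_mipmaps(pattern):
    """Build a pyramid of successively coarser levels of detail by
    2x2 box-filtering a square, power-of-two base pattern."""
    levels = [pattern]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        n = len(prev) // 2
        levels.append([[(prev[2 * r][2 * c] + prev[2 * r][2 * c + 1]
                         + prev[2 * r + 1][2 * c] + prev[2 * r + 1][2 * c + 1]) / 4.0
                        for c in range(n)] for r in range(n)])
    return levels

def sample(levels, u, v, lod):
    """Read intensity modulation values from the two pyramid levels
    adjacent to the level-of-detail signal `lod` and blend them
    linearly, so stepping through levels of detail cannot band."""
    lo = max(0, min(int(math.floor(lod)), len(levels) - 1))
    hi = min(lo + 1, len(levels) - 1)
    frac = lod - math.floor(lod) if hi != lo else 0.0

    def fetch(level):
        n = len(levels[level])
        # wrap the pattern-plane coordinates so the pattern tiles
        r = int(v * n) % n
        c = int(u * n) % n
        return levels[level][r][c]

    return (1.0 - frac) * fetch(lo) + frac * fetch(hi)
```

At `lod = 0` this returns a base-level texel; at the coarsest level it returns the pattern's overall mean; at fractional levels it blends the two neighbouring levels, which is the anti-banding behaviour claim 5 calls for.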
GB08504406A 1985-02-20 1985-02-20 Apparatus for generating a visual display Expired GB2171579B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB08504406A GB2171579B (en) 1985-02-20 1985-02-20 Apparatus for generating a visual display

Publications (3)

Publication Number Publication Date
GB8504406D0 GB8504406D0 (en) 1985-03-20
GB2171579A true GB2171579A (en) 1986-08-28
GB2171579B GB2171579B (en) 1988-03-02

Family

ID=10574820

Family Applications (1)

Application Number Title Priority Date Filing Date
GB08504406A Expired GB2171579B (en) 1985-02-20 1985-02-20 Apparatus for generating a visual display

Country Status (1)

Country Link
GB (1) GB2171579B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2019336A (en) * 1978-04-22 1979-10-31 Redifon Simulation Ltd Visual display apparatus for flight simulators
GB2051525A (en) * 1979-06-15 1981-01-14 Redifon Simulation Ltd C.G.I.-Surface textures
GB2061074A (en) * 1979-06-15 1981-05-07 Redifon Simulation Ltd Improved visual display systems for computer generated images
GB2138252A (en) * 1983-04-12 1984-10-17 Marconi Co Ltd Image generator

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2181929B (en) * 1985-10-21 1989-09-20 Sony Corp Methods of and apparatus for video signal processing
US4953107A (en) * 1985-10-21 1990-08-28 Sony Corporation Video signal processing
EP0321291A2 (en) * 1987-12-18 1989-06-21 General Electric Company Microtexture for close-in detail
EP0321291A3 (en) * 1987-12-18 1991-07-03 General Electric Company Microtexture for close-in detail
EP0438193A2 (en) * 1990-01-15 1991-07-24 Philips Electronics Uk Limited Display apparatus and method of operating such an apparatus
EP0438193A3 (en) * 1990-01-15 1991-11-13 Philips Electronics Uk Ltd Display apparatus and method of operating such an apparatus
EP0450978A2 (en) * 1990-04-06 1991-10-09 Link-Miles Limited Apparatus for generating a visual display
GB2242810A (en) * 1990-04-06 1991-10-09 Link Miles Ltd Addressing a texture memory
GB2242810B (en) * 1990-04-06 1994-08-10 Link Miles Ltd Apparatus for generating a visual display
EP0450978A3 (en) * 1990-04-06 1993-03-31 Link-Miles Limited Apparatus for generating a visual display
EP0462788A3 (en) * 1990-06-18 1993-03-31 Link-Miles Limited Apparatus for generating a visual display
GB2245460A (en) * 1990-06-18 1992-01-02 Link Miles Ltd Interpolating texture values
GB2245460B (en) * 1990-06-18 1994-04-06 Link Miles Ltd Apparatus for generating a visual display
EP0462788A2 (en) * 1990-06-18 1991-12-27 Link-Miles Limited Apparatus for generating a visual display
GB2288304A (en) * 1994-04-01 1995-10-11 Evans & Sutherland Computer Co Computer graphics
WO1996013808A1 (en) * 1994-10-26 1996-05-09 The Boeing Company Method for controlling the level of detail displayed in a computer generated screen display of a complex structure
GB2343599A (en) * 1998-11-06 2000-05-10 Videologic Ltd Texturing systems for use in three-dimensional imaging systems
GB2343599B (en) * 1998-11-06 2003-05-14 Videologic Ltd Texturing systems for use in three dimensional imaging systems
US7116335B2 (en) 1998-11-06 2006-10-03 Imagination Technologies Limited Texturing systems for use in three-dimensional imaging systems
WO2000070559A1 (en) * 1999-05-14 2000-11-23 Graphic Gems Method and apparatus for displaying distant objects in a displayed scene
US6952207B1 (en) * 2002-03-11 2005-10-04 Microsoft Corporation Efficient scenery object rendering
US7158135B2 (en) 2002-03-11 2007-01-02 Microsoft Corporation Efficient scenery object rendering

Also Published As

Publication number Publication date
GB2171579B (en) 1988-03-02
GB8504406D0 (en) 1985-03-20

Similar Documents

Publication Publication Date Title
EP0160660B1 (en) A method and apparatus for texture generation
US4905164A (en) Method for modulating color for effecting color cell texture
CA2038426C (en) Method and apparatus for generating a texture mapped perspective view
EP0627103B1 (en) Image texturing system having theme cells
CA2071539C (en) Image generator
KR910009101B1 (en) Image synthesizing apparatus
US4570233A (en) Modular digital image generator
US4985854A (en) Method for rapid generation of photo-realistic imagery
US5630718A (en) Weather simulation system
US4667190A (en) Two axis fast access memory
EP0137109A1 (en) Image generating apparatus for producing from the co-ordinates of the end points of a line, a two-dimensional image depicting the line as viewed by the observer
EP0008324A1 (en) Computer generated display of images of simulated objects on a video display device
US5135397A (en) 3-d weather for digital radar landmass simulation
EP0284619A4 (en) Method and apparatus for sampling images to simulate movement within a multidimensional space
CA2205497A1 (en) Weather effects generator for simulation systems
GB2171579A (en) Apparatus for generating a visual display
US3892051A (en) Simulated collimation of computer generated images
JPH05507797A (en) image generator
EP0462788A2 (en) Apparatus for generating a visual display
EP0164457A1 (en) An image generator
EP0450978A2 (en) Apparatus for generating a visual display
Myers Interactive Computer Graphics: Flying High-Part I
GB2242810A (en) Addressing a texture memory
Veron et al. 3D displays for battle management
Weber Real Time Terrain Image Generation

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
PCNP Patent ceased through non-payment of renewal fee

Effective date: 19960220