GB2256568A - Image generation system for 3-d simulations - Google Patents
- Publication number
- GB2256568A, GB9112073A
- Authority
- GB
- United Kingdom
- Prior art keywords
- image
- video
- digital video
- computer graphics
- backdrop
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/04—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
- G09B9/05—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles the view from a vehicle being simulated
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Processing Or Creating Images (AREA)
Abstract
An image generation system includes a computer graphics modeller for producing a computer graphics image of a foreground scene, eg the interior of a motor vehicle, including a transparent portion or portions, eg the vehicle windows; digital video effects apparatus for manipulating backdrop video data to form a controlled sequence of motion video background images; and a compositor for keying the computer graphics foreground image into the controlled sequence of motion video background images to form an output image sequence with the background visible through the transparent portion(s) of the foreground. Motion video background images are generated by applying different manipulations at successive image timings to the backdrop video data. Also disclosed is a simulator system for vehicles comprising a vehicle cockpit 12 including user operable vehicle controls and at least one window aperture 16, 17, 18, a viewscreen 16S, 17S, 18S viewable through the or each aperture, and a digital video effects apparatus for manipulating at least one set of backdrop video data to form at least one controlled sequence of motion video images for display on a respective viewscreen.
Description
IMAGE GENERATION SYSTEM FOR 3-D SIMULATIONS
The invention relates to an image generation system for 3-D simulations.
Sophisticated computer graphics modellers (typically a general or special purpose computer workstation with appropriate computer graphics modelling software) are today widely used for the design and/or marketing of various 3-D structures, cars and so on before these are constructed. Computer graphics modellers are being used more and more as they provide a very effective medium for enabling customers, sales people and officials to understand what the final product will look like. They tend to provide much more flexibility than is possible with conventional blueprints and/or scale models of the final products.
Computer graphics modellers are also used, for example, in flight and/or driving simulators. Thus, a need is developing for effective modelling of 3-D objects or scenes in many fields, of which architectural modelling, flight simulation and the simulation of car and/or aircraft control panel layouts are merely examples.
It is also known, for example from UK patent application GB-A-2 181 929 (Sony Corporation), to map video data defining the surface texture of an object onto a surface having the shape of the object in order to generate an image. GB-A-2 181 929 is concerned with simulating a moving object for use, for example, in a flight simulator.
In applications such as the simulation of control panels for cars and/or aircraft, the viewpoint of the viewer (or of the conceptual camera) would be constrained to be internal to the car or cockpit.
The modelling of such control panels and the interiors of cars and/or planes etc. can readily be accomplished with computer graphics wire-frame models. However, usually a rendered model (i.e. with the surface as well as the edge detail) will be used, as much more realistic results are obtained. Unfortunately, rendering the model of the internal view of the parts of the car dashboard or flight deck requires significantly more computer calculation, and therefore any movement of indicators, or any change of the viewpoint within the model, would be limited by the processing power of the computer graphics modeller. With conventional systems, update rates of only around two frames per second are possible.
Computer graphics modellers which enable "photo realistic" representations of the model by the inclusion of detailed wall or panel textures, lighting and shading, furniture, etc. are known. However, the level of detail required loads the computer graphics modeller even further, with the result that real time generation of images is not possible. Also, conventional computer graphics modellers only produce a model of the item in question (e.g. the interior of a car). Consider the case of a control panel in a car. Immediately above the control panel is the windscreen. In order to provide a realistic representation of how the control panel would look in the car, it would be useful to be able to model not only the control panel itself, but also a scene which can be viewed through the windscreen. However, in conventional computer graphics modellers, the view through the windscreen would be blank or, at best, a still background image.
In evaluating control panel layouts for cars, for example, it would be helpful to be able to see a realistic scene which includes the view out of the car window. Ideally, the realistic scene of the view out of the car window would be a moving scene. Providing a moving scene, which could simulate the distractions of monitoring other traffic and general road situations, would allow the effectiveness of the control panel layout to be properly evaluated.
However, to simulate an external scene in combination with photo realistic modelling of the control panel would involve an immense cost in computer graphics modeller hardware and/or modelling time.
An object of the invention is to provide for the modelling of an external scene in addition to modelling an internal scene in a practical and cost effective manner.
In accordance with the invention, there is provided an image generation system comprising a computer graphics modeller for producing a computer graphics image of a foreground scene including at least one transparent portion, digital video effects apparatus for manipulating at least one set of backdrop video data to form a controlled sequence of motion video background images and means for keying the computer graphics foreground image into the controlled sequence of motion video background images to form an output image sequence with the background visible through the transparent portion or portions of the foreground.
By combining the output of a computer graphics modeller with video data processed by digital video effects apparatus, the invention provides for a cost effective mechanism for achieving photo-realistic modelling of objects or structures such as, for example, control layouts for vehicles and the like, where the viewing position is constrained with respect to the foreground but not with respect to the background.
The use of a digital video effects apparatus gives more flexibility than would be possible using captured video images directly. If captured images were used directly, it would be necessary to have an immense number of captured images and a mechanism for very rapid access to those images in order to be able to accommodate changes in speed and/or direction of movement during the simulation. The use of a digital video effects apparatus, which need only be a linear digital video effects apparatus, enables changes in speed and direction of movement to be accommodated using scaling and/or translation operations on a much smaller set of captured images. The actual number of images used for any particular simulation will depend on the details of that simulation.
Preferably, the computer graphics modeller produces a view from a predetermined viewpoint and in a predetermined direction of a 3-D computer graphics model for forming the foreground image. Data defining the viewpoint and view direction can then be used to control the digital video effects apparatus.
The or each set of backdrop video data is preferably representative of at least one still video picture mapped onto a respective surface defined so as to be in the line of sight in 3-D space from the viewpoint through a respective transparent portion of the foreground. The or each surface preferably forms at least a portion of the inside of a hemisphere surrounding the computer graphics model in 3-D space. This enables realistic modelling of the background with linear scaling and/or translation operations by the digital video effects generator. Thus, the digital video effects apparatus preferably generates motion video background images by applying different linear scaling and/or translation effects at successive image timings to the or each set of backdrop video data.
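As a rough illustration of how such per-frame manipulations might be derived, the following Python sketch (not part of the original disclosure) computes a linear scale and horizontal offset for one backdrop from the forward and lateral displacement of the viewpoint at each image timing. The function name `backdrop_manipulation`, the assumed backdrop distance `D`, the focal length in pixels and the per-frame speed are all illustrative assumptions.

```python
# A minimal sketch (not taken from the patent) of deriving the per-frame
# manipulation for one backdrop. The backdrop is treated as a still mapped
# onto a surface an assumed distance D ahead of the initial viewpoint.

def backdrop_manipulation(forward, lateral, D=50.0, focal_px=1000.0):
    """Return (scale, x_offset_px) for a viewpoint that has moved `forward`
    units towards and `lateral` units across the backdrop surface."""
    remaining = max(D - forward, 1e-3)              # never pass through the surface
    scale = D / remaining                           # apparent magnification
    x_offset_px = -focal_px * lateral / remaining   # background slides the other way
    return scale, x_offset_px

# Successive image timings each receive a slightly different manipulation,
# which is what turns a single still into a motion video background.
if __name__ == "__main__":
    speed = 1.0                                     # distance units per frame (assumed)
    for frame in range(5):
        print(backdrop_manipulation(forward=frame * speed, lateral=0.0))
```

Because only a scale and an offset are produced per frame, the manipulation stays within the capability of a linear digital video effects unit.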
The digital video effects apparatus preferably comprises a memory for the temporary storage of backdrop video data and addressing means for providing different addressing of the memory during writing and reading operations to effect the linear scaling and/or translation effects. A control means is preferably provided, the control means being responsive to positional data from the computer graphics modeller for issuing control data defining the manipulations to be performed at successive image timings to the digital video effects apparatus.
In a preferred embodiment of the invention, where the computer graphics modeller generates a foreground image having a plurality of transparent portions separated by opaque portions, the digital video effects apparatus comprises a plurality of digital video effects units, each for manipulating a respective set of backdrop video data to form a respective set of constituent background images, and means for combining the sets of constituent background images to form the controlled sequence of background images.
The invention also provides a simulator system for vehicles comprising a vehicle cockpit including user operable vehicle controls and at least one aperture, a viewscreen viewable through the or each aperture, a digital video effects apparatus for manipulating at least one set of backdrop video data for forming at least one controlled sequence of motion video images for display on a respective viewscreen.
A particular example of the invention is described hereinafter with reference to the accompanying drawings in which:
Figure 1 represents a view within a model from a constrained viewpoint;
Figure 2 represents a conceptual view of a scene simulation;
Figure 3 is a conceptual view of a second scene simulation;
Figure 4 is a schematic illustration of how three pre-distorted video sequences can be combined with the view of Figure 1;
Figure 5 is a schematic block diagram of apparatus for preprocessing video data;
Figure 6 is a schematic block diagram of a system in accordance with the invention;
Figure 7 is a schematic block diagram of digital video effects apparatus;
Figure 8 is a schematic illustration of a simulator for a vehicle in accordance with the invention; and
Figure 9 is a schematic illustration of a matrix of images.
Figure 1 represents a view 10 in high definition television 16:9 format from within the model of a car as could be produced by a conventional computer graphics modeller (eg. a general or special purpose computer workstation with 3-D computer graphics modelling software). The computer graphics modeller is provided with data defining a model of the car. The model includes three dimensional positional information, surface information such as texture etc. and possibly other factors such as lighting and so on. From this data, the computer graphics modeller produces a 2-D image of the car. As represented in Figure 1, the 2-D image comprises a representation of the control panel 12, or dashboard of the car, the surrounding window frames 14 for the windscreen 16 and side windows 17, 18, the interior of the roof 15 and other internal details as appropriate. However, the computer graphics modeller does not provide details of the views through the windscreen 16 and the side windows 17 and 18. In accordance with the invention, there is provided a system which enables representation of a scene (possibly a moving scene) as viewed through these windows 16, 17 and 18.
Figure 2 represents the concept behind the present invention.
More specifically, in addition to a computer graphics model 20, represented by the inner circle in Figure 2 and produced by a computer graphics modeller, video images (possibly moving video images) are mapped, or projected, onto an imaginary hemisphere 22 (illustrated in plan view), or part thereof, by a digital video effects apparatus to form a background scene surrounding the model 20.
In practice, for a view such as illustrated in Figure 1, where the viewpoint is constrained, it is not necessary to form a complete scene surrounding the model 20, it being merely necessary to create sections of that scene as visible through the windows or apertures in the model. This is illustrated conceptually in Figure 3 where the backdrop scene 16S would be visible through the windscreen 16 and the backdrop scenes 17S and 18S would be visible through the side windows 17 and 18 respectively. Figure 4 illustrates how these three views could be arranged with respect to the windscreen and side windows of the model shown in Figure 1.
Preferably these views would be separately captured as a sequence of still video images of moving scenes from three different camera directions.
In order to adapt the originally captured video images for viewing through the appropriate view ports, apparatus such as that illustrated in Figure 5 could be used. In other words, the original images can be corrected for distortions introduced by the camera lens and also adapted to the particular viewing screen or surface through the use of a non-linear digital video effects generator 23. The originally captured images can be stored in source video storage 22 (e.g. a first video tape recorder) and then processed by the non-linear digital video effects apparatus 23 before being stored in a backdrop image storage 24 (e.g. a second video tape recorder). The manipulations to be performed by the non-linear digital video effects apparatus are determined by a controller 22. The controller is programmed by a user to determine specific mapping instructions for mapping the originally captured images onto the inside of a sphere representing the backdrop.
The mapping which is applied can be used to enable seamless tessellation of the views as seen through the side windows with that viewed through the windscreen. In addition, colour correction can be employed to remove any level or colour variations caused by changes in aperture setting or the angle of the sun etc. when the original images are captured. Where the separations between the views (i.e. the window pillars 14 in Figure 1) are narrow, it is desirable that seamless tessellation is employed in order to create the impression of a continuous external scene. However, where the structural features of the model allow (i.e. there is a significant separation between the individual views), seamless tessellation of the images is not required, as a less perfect match would still create the impression that the external scene was continuous.
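A minimal sketch of the kind of non-linear pre-mapping described above is given below. It assumes a simple pinhole camera model and nearest-neighbour sampling; the function name `map_to_sphere`, the field-of-view parameter and the output raster size are illustrative assumptions rather than details taken from the patent, and lens distortion and colour correction are omitted for brevity.

```python
import numpy as np

def map_to_sphere(src, h_fov_deg=60.0, out_w=1024, out_h=512):
    """Resample a flat camera frame onto a longitude/latitude patch of an
    imaginary sphere (nearest-neighbour sampling, for brevity)."""
    src_h, src_w = src.shape[:2]
    f = (src_w / 2.0) / np.tan(np.radians(h_fov_deg) / 2.0)   # pinhole focal length
    v_fov_deg = h_fov_deg * src_h / src_w                     # matching vertical FOV
    lon = np.linspace(-np.radians(h_fov_deg) / 2, np.radians(h_fov_deg) / 2, out_w)
    lat = np.linspace(-np.radians(v_fov_deg) / 2, np.radians(v_fov_deg) / 2, out_h)
    lon, lat = np.meshgrid(lon, lat)
    # Gnomonic (pinhole) projection back into source pixel coordinates.
    x = f * np.tan(lon) + src_w / 2.0
    y = f * np.tan(lat) / np.cos(lon) + src_h / 2.0
    xi = np.clip(np.round(x).astype(int), 0, src_w - 1)
    yi = np.clip(np.round(y).astype(int), 0, src_h - 1)
    return src[yi, xi]

# Example: pre-process a synthetic frame into backdrop form.
if __name__ == "__main__":
    frame = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
    print(map_to_sphere(frame).shape)   # (512, 1024, 3)
```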
Figure 6 is a schematic block diagram of an image generation system in accordance with the invention. The system illustrated in Figure 6 enables a sequence of pre-processed backdrop video images to be combined with the computer graphics model itself in a manner which enables a moving background to be simulated.
A conventional computer graphics modeller (e.g. a computer workstation 30 including 3-D modelling software) generates an output image representative of a view from within an object (eg. the interior of the car of Figure 1). The computer workstation of the computer graphics modeller will typically include conventional user input means including one or more of the following, namely a keyboard, mouse, graphics tablet, light pen and so on. The computer graphics modeller will typically also include a monitor for displaying the foreground without the background information. The output from the computer graphics modeller 30 includes positional information on connection 31 identifying the position and direction of view in 3-D space used to create the image of the model, the image pixel data relating to the image of that model on connection 32, and keying data on connection 33 specifying which areas of that model image relate to the foreground and which relate to the background.
In principle, a conventional computer aided design software package or other conventional 3-D modelling software can be used for generating the model image. Where the conventional computer graphics modelling software does not provide an output of the positional information representative of the viewpoint position and direction in 3-D space, or a key signal indicating which parts of the object are opaque and which parts are non-opaque, the computer graphics modelling software will need to be modified in order to provide these features.
Any computer graphics modelling software which provides an image of a model will have available the data representative of the viewpoint position and direction in 3-D space. Also, a computer graphics modelling system which provides for rendering of surfaces will have data available about which of those surfaces are opaque and which are non-opaque. Accordingly, it will be apparent to one skilled in the art how to modify the computer graphics modelling software in order to provide an output of these data.
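The three per-frame outputs described above (positional data, image pixel data and keying data on connections 31, 32 and 33) could be represented by a structure along the following lines. This is only a sketch of an assumed interface: the class name `ModellerOutput`, the field names and the derivation of the key from an RGBA render are illustrative assumptions, not details given in the patent.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ModellerOutput:
    """Per-frame data assumed to be available on connections 31, 32 and 33."""
    eye: tuple              # viewpoint position in 3-D model space (connection 31)
    direction: tuple        # unit view direction (connection 31)
    foreground: np.ndarray  # rendered foreground pixels, H x W x 3 (connection 32)
    key: np.ndarray         # 1.0 where the foreground is opaque, 0.0 where the
                            # background should show through (connection 33)

def key_from_alpha(rgba):
    """One possible way of deriving the key signal: any pixel the renderer
    left fully transparent (alpha == 0) is treated as a window region."""
    return (rgba[..., 3] > 0).astype(np.float32)
```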
The positional information is supplied to a control unit 34 for controlling the operation of a plurality of digital video effects units 16A, 17A and 18A and corresponding background image stores 16V, 17V and 18V, respectively. Each of the background video stores 16V, 17V and 18V contains video data relating to one or more pre-processed stills for forming the basis of the views to be seen through the windows or apertures in the car. Note that one background video store V and one digital video effects unit A are provided for each transparent portion of the model in the present example. It will be appreciated that the number of stores and digital video effects units can be adapted for different numbers of transparent portions (e.g. four of each could be provided for a model having four windows). The video data stored in the background video stores are produced by pre-processing the stills as described with reference to Figure 5. Each of the background video stores can be in the form, for example, of a digital video tape recorder. The output from each of the background video stores is supplied to the video input of the respective digital video effects unit.
Each digital video effects unit provides for the appropriate linear motion of the image seen through the corresponding window. More specifically, the digital video effects units 16A, 17A and 18A perform scaling and/or translation operations on the images from the respective image storage units 16V, 17V and 18V, in response to control information from the control unit 34, in order to simulate moving images. The output of each of the digital video effects units 16A, 17A and 18A is supplied to a combiner 35 where the separate output information is combined to form a sequence of background images.
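A toy software stand-in for one digital video effects unit and for the combiner 35 is sketched below. The nearest-neighbour resampling, the per-window paste positions and the function names `affine_resample` and `combine_background` are assumptions made purely for illustration; real units would of course operate in dedicated hardware at video rate.

```python
import numpy as np

def affine_resample(img, scale, dx, dy):
    """Toy stand-in for one digital video effects unit: scale about the image
    centre and translate by (dx, dy), using nearest-neighbour sampling."""
    h, w = img.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Inverse mapping: for every output pixel, locate the source pixel.
    sx = (xs - w / 2.0 - dx) / scale + w / 2.0
    sy = (ys - h / 2.0 - dy) / scale + h / 2.0
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out = np.zeros_like(img)
    out[valid] = img[sy[valid].astype(int), sx[valid].astype(int)]
    return out

def combine_background(canvas_shape, backdrops, manipulations, positions):
    """Toy stand-in for the combiner 35: paste each manipulated backdrop view
    into its place (top, left) within the full background frame; the views
    are assumed to fit within the canvas."""
    canvas = np.zeros(canvas_shape, dtype=backdrops[0].dtype)
    for img, (scale, dx, dy), (top, left) in zip(backdrops, manipulations, positions):
        view = affine_resample(img, scale, dx, dy)
        vh, vw = view.shape[:2]
        canvas[top:top + vh, left:left + vw] = view
    return canvas
```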
Each background image comprises the mapped backdrop images 16S, 17S and 18S arranged in the relationship indicated by the dashed lines in Figure 4. Each background image in the sequence is supplied in turn via a connection to a compositor operating as a keyer 36. The compositor 36 responds to the keying information supplied by the computer graphics modeller on the connection 33 to key the foreground image pixels on the connection 32 into the background image from the combiner 35. The final output image is then output to a video monitor 38 for display.
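The keying step itself reduces to a per-pixel mix controlled by the key signal, as in the following sketch. A linear mix is assumed here for simplicity; a real keyer may use a hard switch or a soft-edged key.

```python
import numpy as np

def key_composite(foreground, background, key):
    """Toy compositor: keep foreground pixels where the key marks them opaque
    and let the background show through the transparent window regions."""
    k = key[..., None] if foreground.ndim == 3 else key   # broadcast over RGB
    mixed = k * foreground.astype(np.float32) + (1.0 - k) * background.astype(np.float32)
    return mixed.astype(foreground.dtype)
```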
Each of the digital video effects units 16A, 17A and 18A is used to expand the image in size and to perform horizontal and vertical translation of the image in order to simulate motion. In this way it is possible to shift the relative positions of the viewpoint and the background by synchronous translation of all the video backgrounds in order to simulate motion of the car, or other structure being modelled by the computer graphics modeller.
Figure 7 is a schematic block diagram of a typical digital video effects generator 40 which can be used as one of the digital video effects units 16A, 17A or 18A in the system of Figure 6, and also as the digital video effects generator 23 in Figure 5. As the digital video effects generator can be conventional, it will not be described in detail herein.
In summary, the digital video effects generator operates as follows. A video signal representing each source image to be manipulated is input into the digital video effects generator 40.
Manipulation of the source image(s) is performed in the digital video effects generator by controlling either the read or the write addresses to a memory 50 where temporary storage of the image data is effected.
As illustrated in Figure 7, read side addressing is employed, this being controlled by the address generator 52.
A pixel interpolator 58 enables output pixel values to be computed where the address generator 52 does not generate a one-to-one mapping between a storage location in the memory 50 and an output pixel. The addresses produced by the address generator 52 include an integer portion for addressing the memory 50 and a fractional portion for controlling the pixel interpolator 58.
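The split into an integer memory address and a fractional interpolator control can be illustrated as follows. The patent only specifies a pixel interpolator, so the bilinear weighting, the function name `read_with_interpolation` and the (u, v) address convention below are assumptions for the sake of the example.

```python
import numpy as np

def read_with_interpolation(memory, u, v):
    """Split the read address into an integer portion (the memory address)
    and a fractional portion (the interpolator control), then blend the
    four neighbouring stored pixels bilinearly."""
    h, w = memory.shape[:2]
    u0, v0 = int(np.floor(u)), int(np.floor(v))     # integer portion
    fu, fv = u - u0, v - v0                         # fractional portion
    u0 = int(np.clip(u0, 0, w - 2))
    v0 = int(np.clip(v0, 0, h - 2))
    p00 = memory[v0, u0].astype(np.float32)
    p01 = memory[v0, u0 + 1].astype(np.float32)
    p10 = memory[v0 + 1, u0].astype(np.float32)
    p11 = memory[v0 + 1, u0 + 1].astype(np.float32)
    top = p00 * (1 - fu) + p01 * fu
    bottom = p10 * (1 - fu) + p11 * fu
    return top * (1 - fv) + bottom * fv
```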
In the processing of images, the manipulation may involve compression of the image and, in the absence of corrective measures, compression of an image can give rise to aliasing which will degrade the quality of the output image. A filter 54 is therefore provided to compensate for the effects of compression. The filter control 56 determines local scaling factors for controlling the filter 54.
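As an illustration of how the local scaling factor could drive the filter, the sketch below applies a separable box prefilter whose width grows as the local compression increases. The box kernel, the half-pixel footprint rule and the function name are assumptions, not details from the patent.

```python
import numpy as np

def prefilter_for_compression(image, local_scale):
    """Average over roughly one output-pixel footprint before resampling when
    the manipulation compresses the image (local_scale < 1)."""
    if local_scale >= 1.0:
        return image.astype(np.float32)          # enlargement: no extra filtering
    radius = max(1, int(round(0.5 / local_scale)))
    size = 2 * radius + 1
    kernel = np.ones(size, dtype=np.float32) / size
    smooth = lambda v: np.convolve(v, kernel, mode="same")
    # Separable box filter: rows first, then columns (works per channel too).
    out = np.apply_along_axis(smooth, 1, image.astype(np.float32))
    return np.apply_along_axis(smooth, 0, out)
```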
The address generator 52 determines addresses for mapping the pixel data out of the memory 50 in accordance with a particular manipulation to be performed in response to control data from the control unit 34. The control data supplied by the control unit 34 to the digital video effect generator includes a definition of the mapping which needs to be performed such that the address generator may generate the appropriate addresses.
The control unit 34 can be in the form of a personal computer or computer workstation with appropriate software, can be provided as an integral part of the digital video effects generator or can be provided as a special purpose piece of hardware.
The function of the control unit 34 could alternatively be integrated in a computer workstation of the computer graphics modeller, so that a separate control unit would not be necessary, the computer graphics modeller providing the necessary control signals to the digital video effects apparatus and the other elements of the modelling system directly.
By programming the computer graphics modeller to define and output data defining the view position and direction in 3-D space, and the position and orientation of the model with respect to the background, the control unit 34 can be arranged to generate appropriate control data for controlling the digital video effects apparatus to produce the views of the background for display in the transparent portions of the model image. If the view position and direction or the model position and orientation are then changed, the control unit will cause the digital video effects apparatus to change the manipulation of the background to reflect the changes (eg. for simulating the movement of the car along a road). Details of the computer graphics model (eg. the position of dials on the dashboard of a car being modelled) can be changed by redefining the model data on the computer graphics modeller in a conventional manner.
Thus the image generation system of Figure 6 is able to combine a 2-D image of a scene comprising a view from within the model, produced by the computer graphics modeller, with a moving background visible through windows or the like, derived from video images manipulated by the digital video effects apparatus, in order to provide a realistic representation of the model in its environment. Using current computer graphics generators producing images, typically with a resolution of 1280 pixels x 1024 lines or 2048 pixels x 1560 lines, and high definition digital video effects apparatus, typically operating with a resolution of 1920 pixels x 1035 lines, it is possible to produce in real time photorealistic representations of computer models of control panels and the like in a real world environment. It is possible to simulate moving the model around in real time by changing the selected viewpoint and/or direction, the modelling system responding to these changes to produce the appropriate output images.
Figure 8 illustrates a simulator in accordance with the invention for simulating a vehicle, here an automobile. The simulator comprises a mock-up 110 of the interior of the vehicle. Included in the mock-up is a set of user controls such as the steering wheel, gear stick, control switches etc. In the simulator these are connected to sensors for producing output signals which are connected via a connection line 112 to a control unit 134. The control unit 134 uses the control signals on the line 112 to control first, second and third background image storage 116V, 117V and 118V and digital video effects apparatus 116A, 117A and 118A to generate video images for display on video viewscreens 116S, 117S and 118S. In the same way as in the modelling system described above, the control unit generates signals for selecting video data from the background image storage 116V, 117V and 118V and for controlling the digital video effects apparatus 116A, 117A and 118A to perform appropriate processing of those video data from the storage for display on the screens 116S, 117S and 118S. The video viewscreens 116S, 117S and 118S are arranged behind respective windows 116, 117 and 118 in the cockpit mock-up 110. Accordingly, it will be appreciated that the specific manipulations to be performed on the video data will depend on the orientation of the video display screens with respect to the windows (i.e. whether they are parallel or at an angle thereto). Note that in the previous example of the invention the viewscreens were in the same plane as the image of the foreground, but transformed to create the appropriate perspective effects.
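The way the control signals on line 112 might be turned into the viewpoint state that drives the background image storage and digital video effects apparatus can be sketched with a deliberately crude vehicle model. The state layout, the gains and the update rate below are invented for illustration only and do not come from the patent.

```python
import math

def update_vehicle_state(state, steering, throttle, dt=1.0 / 30.0):
    """One step of a toy vehicle model: turn the sensor readings into the
    viewpoint position and heading needed by the control unit."""
    x, y, heading, speed = state
    speed = max(0.0, speed + throttle * 5.0 * dt)       # crude acceleration
    heading += steering * speed * 0.5 * dt              # crude steering response
    x += speed * dt * math.cos(heading)
    y += speed * dt * math.sin(heading)
    return (x, y, heading, speed)

# Each video field, the updated state would be handed to the control unit,
# which selects backdrop data and issues manipulations to the DVE units.
if __name__ == "__main__":
    state = (0.0, 0.0, 0.0, 0.0)
    for _ in range(90):                                 # three seconds at 30 Hz
        state = update_vehicle_state(state, steering=0.1, throttle=1.0)
    print(state)
```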
In the case of a vehicle simulator, each of the first, second and third background image storage units preferably contains the same video data, namely video data representative of a matrix of video images mapped onto spherical surfaces, to provide a complete array of images which may be selected for processing by the digital video effects apparatuses. Separate background image storage is provided for each digital video effects apparatus for data bandwidth reasons. If high bandwidth background image storage is available which enables video data to be extracted at a sufficient rate to service all of the digital video effects apparatus, then only one background image storage would be needed.
Figures 9A and 9B form a schematic representation of the matrix of images.
Figure 9A illustrates a grid or matrix of images arranged about nodes having horizontal coordinates i-1, i, i+1, i+2 etc. and vertical coordinates j-1, j, j+1, j+2 etc. Associated with each node, four images I are stored. For example, for the node i,j, the four images Ii,j,w, Ii,j,e, Ii,j,n and Ii,j,s are stored. The image Ii,j,w represents a captured image viewed from a westerly direction adjacent the node i,j. The image Ii,j,e represents a captured image viewed from an easterly direction adjacent the node i,j. The image Ii,j,n represents a captured image viewed from a northerly direction adjacent the node i,j. The image Ii,j,s represents a captured image viewed from a southerly direction adjacent the node i,j. The points of the compass are used herein for the purposes of explanation only. It will be appreciated that the images do not have to be captured as viewed from the specific compass positions. However, the object of capturing the four images is to enable a background image to be generated when approaching a position adjacent the node i,j from any direction. This is achieved by capturing four images for each of the nodes in the matrix and by arranging the images to overlap at the nodes, as illustrated by the small lines crossing the parallel lines representing each pair of images adjacent the nodes.
In practice, the captured images are not stored in the background image storage of Figure 8 as such. Instead, the image data which is stored is representative of the captured images mapped onto part of a sphere, or a cylinder. This is illustrated at a slightly enlarged scale in Figure 9B for the four images associated with the node i,j. Thus, the captured image Ii,j,w is stored as that image mapped onto a surface represented as the surface I'i,j,w. The captured image Ii,j,s is stored as that image mapped onto the surface represented as I'i,j,s. Likewise, the image Ii,j,e is stored as that image mapped onto a surface represented as the surface I'i,j,e and the captured image Ii,j,n is stored as that image mapped onto the surface represented as I'i,j,n. It will be noted that the surface I'i,j,e and the surface I'i,j,n overlap. This applies also to other adjacent images. In this way a continuous representation of space is generated such that a background image can be generated from any position within the matrix by an appropriate combination of the stored background images. The effect of movement is generated by appropriate scaling and/or translation operations on the stored background images. Thus, as the viewpoint approaches a stored image, the appropriate pixel information is enlarged so that it appears as if the viewer is approaching the objects represented in that image. Before the viewpoint actually contacts the surface onto which the image is projected, the next image or images in that direction are selected in order to generate a continuous appearance of movement throughout the matrix.
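A sketch of the image-selection logic implied by the matrix of Figures 9A and 9B is given below. The node spacing, the keying of stored images by (i, j, compass side), the switch-over margin and the function names are all assumptions made to keep the example concrete; the patent itself does not prescribe this data layout.

```python
import math

# Bearing (radians, east = 0) from a node to the position each of its four
# stored images is assumed to have been captured from: I[i,j,w] to the west, etc.
CAPTURE_BEARING = {'e': 0.0, 'n': math.pi / 2, 'w': math.pi, 's': -math.pi / 2}

def angle_diff(a, b):
    """Smallest absolute difference between two angles."""
    return abs(math.atan2(math.sin(a - b), math.cos(a - b)))

def select_backdrop(x, y, node_spacing=10.0):
    """Pick (i, j, side): the nearest node, and whichever of its four images
    was captured on the same side of the node as the viewer now is."""
    i = round(x / node_spacing)
    j = round(y / node_spacing)
    bearing = math.atan2(y - j * node_spacing, x - i * node_spacing)
    side = min(CAPTURE_BEARING, key=lambda d: angle_diff(bearing, CAPTURE_BEARING[d]))
    return i, j, side

def should_switch(x, y, heading, key, node_spacing=10.0, margin=1.0):
    """Hand over to the next image in the direction of travel before the
    viewpoint reaches the surface the current image is mapped onto."""
    i, j, _ = key
    # Distance still to travel, measured along the heading, to reach the node.
    dist_along = ((i * node_spacing - x) * math.cos(heading)
                  + (j * node_spacing - y) * math.sin(heading))
    return dist_along < margin
```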
Although Figure 9A illustrates a small degree of overlap between adjacent images, in many embodiments it will be desirable to have a greater degree of overlap so that a smooth representation of movement is produced in the output background images when moving from one stored background image to the next.
It will be appreciated that the concept of the matrix of images can be extended to three dimensions to include views upwards and downwards as well as those illustrated in Figures 9A and 9B.
The technique of using a matrix of images as explained with reference to Figures 9A and 9B can also be applied to the image generation system described with reference to Figure 6.
Through the use of the matrix of images, it is possible to enable the simulator to be driven throughout the simulated 3-D world in which the images have been captured by performing scaling and horizontal and/or vertical translation of the images.
The simulator of Figure 8 is able to provide a high quality moving background which is visible through the window apertures (eg. the windscreen, side windows or the like) of the cockpit of a vehicle such as an automobile, an aircraft, a spacecraft or the like being simulated, from video images manipulated by digital video effects apparatus. Using current high definition digital video effects apparatus, typically operating with a resolution of 1920 pixels by 1035 lines, it is possible to produce high quality images from a real world environment. It is thus possible to simulate moving around the real world environment in real time by responding to the operation of user controls in the vehicle to produce appropriate images for display on viewscreens visible through windows of the vehicle simulator.
Claims (21)
1. Image generation system comprising a computer graphics modeller for producing a computer graphics image of a foreground scene including at least one transparent portion, digital video effects apparatus for manipulating at least one set of backdrop video data to form a controlled sequence of motion video background images and means for keying the computer graphics foreground image into the controlled sequence of motion video background images to form an output image sequence with the background visible through the transparent portion or portions of the foreground.
2. An image generation system as claimed in Claim 1 wherein the computer graphics modeller produces a view from a predetermined viewpoint and in a predetermined direction of a 3-D computer graphics model for forming the foreground image.
3. An image generation system as claimed in Claim 2 wherein the or each set of backdrop video data are representative of at least one still video picture mapped onto a respective surface defined so as to be in the line of sight in 3-D space from the viewpoint through a respective transparent portion of the foreground.
4. An image generation system as claimed in any preceding claim wherein the backdrop video data are representative of a matrix of video pictures.
5. An image generation system as claimed in Claim 3 or Claim 4 wherein the or each surface forms at least a portion of the inside of a hemisphere surrounding the computer graphics model in 3-D space.
6. An image generation system as claimed in any preceding Claim wherein the digital video effects apparatus generates motion video background images by applying different linear scaling and/or translation effects at successive image timings to the or each set of backdrop video data.
7. An image generation system as claimed in Claim 6 wherein the digital video effects apparatus comprises a memory for the temporary storage of backdrop video data and addressing means for providing different addressing of the memory during writing and reading operations to effect the linear scaling and/or translation effects.
8. An image generation system as claimed in Claim 7 comprising control means, responsive to positional data from the computer graphics modeller, for issuing control data defining the manipulations to be performed at successive image timings to the digital video effects apparatus.
9. An image generation system as claimed in any one of Claims 1 to 5 wherein the computer graphics modeller generates a foreground image having a plurality of transparent portions separated by opaque portions and wherein the digital video effects apparatus comprises a plurality of digital video effects units, each for manipulating a respective set of backdrop video data to form a respective set of constituent background images, and means for combining the sets of constituent background images to form the controlled sequence of background images.
10. An image generation system as claimed in Claim 9 wherein each digital video effects unit generates motion video background images by applying different linear scaling and/or translation effects at successive image timings to a respective set of backdrop video data.
11. An image generation system as claimed in Claim 10 wherein each digital video effects unit comprises a memory for the temporary storage of backdrop video data and addressing means for providing different addressing of the memory during writing and reading operations to effect the linear scaling and/or translation effects.
12. An image generation system as claimed in Claim 11 comprising control means, responsive to positional data from the computer graphics modeller, for issuing control data defining the manipulations to be performed at successive image timings to each digital video effects unit.
13. A simulator system for vehicles comprising a vehicle cockpit including user operable vehicle controls and at least one aperture, a viewscreen viewable through the or each aperture, and a digital video effects apparatus for manipulating at least one set of backdrop video data for forming at least one controlled sequence of motion video images for display on a respective viewscreen.
14. A simulator system as claimed in Claim 13 wherein the or each set of backdrop video data are representative of at least one still video picture mapped onto a respective surface defined so as to be in the line of sight in 3-D space from a given viewpoint through a respective aperture.
15. A simulator system as claimed in Claim 13 or Claim 14 wherein the backdrop video data are representative of a matrix of video pictures.
16. A simulator system as claimed in any one of Claims 13 to 15 wherein the or each surface forms at least a portion of the inside of a hemisphere surrounding the vehicle cockpit.
17. A simulator system as claimed in any one of Claims 13 to 16 wherein the digital video effects apparatus generates motion video background images by applying different linear scaling and/or translation effects at successive image timings to the or each set of backdrop video data.
18. A simulator system as claimed in Claim 17 wherein the digital video effects apparatus comprises a memory for the temporary storage of backdrop video data and addressing means for providing different addressing of the memory during writing and reading operations to effect the linear scaling and/or translation effects.
19. A simulator system for vehicles as claimed in Claim 18 comprising control means, responsive to operation of the vehicle controls, for issuing control data defining the manipulations to be performed at successive image timings to the digital video effects apparatus.
20. An image generation system substantially as hereinbefore described with reference to the accompanying drawings.
21. A simulator system for vehicles substantially as hereinbefore described with reference to the accompanying drawings.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9112073A GB2256568B (en) | 1991-06-05 | 1991-06-05 | Image generation system for 3-D simulations |
JP14593492A JP3200163B2 (en) | 1991-06-05 | 1992-06-05 | 3D simulation image forming system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9112073A GB2256568B (en) | 1991-06-05 | 1991-06-05 | Image generation system for 3-D simulations |
Publications (3)
Publication Number | Publication Date |
---|---|
GB9112073D0 (en) | 1991-07-24 |
GB2256568A (en) | 1992-12-09 |
GB2256568B (en) | 1995-06-07 |
Family
ID=10696124
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9112073A Expired - Lifetime GB2256568B (en) | 1991-06-05 | 1991-06-05 | Image generation system for 3-D simulations |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP3200163B2 (en) |
GB (1) | GB2256568B (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1994019784A1 (en) * | 1993-02-17 | 1994-09-01 | Atari Games Corporation | Scenario development system for vehicle simulators |
US5474453A (en) * | 1993-02-17 | 1995-12-12 | Atari Games Corporation | Scenario development system for vehicle simulators |
FR2728995A1 (en) * | 1994-12-29 | 1996-07-05 | Renault | Image projection system for automobile driving simulator |
FR2730842A1 (en) * | 1995-02-17 | 1996-08-23 | Renault | Process for visualising images of adjustable size in field of view for motor vehicle driving simulator |
US5835101A (en) * | 1996-04-10 | 1998-11-10 | Fujitsu Limited | Image information processing apparatus having means for uniting virtual space and real space |
EP0886254A1 (en) * | 1997-06-20 | 1998-12-23 | Thomson-Csf | Compact display device for vehicle simulator with large driving cabin |
EP0886245A2 (en) * | 1997-06-20 | 1998-12-23 | Nippon Telegraph And Telephone Corporation | Display of moving object on background image |
GB2351199A (en) * | 1996-09-13 | 2000-12-20 | Pandora Int Ltd | Automatic insertion of computer generated image in video image. |
EP1138159A1 (en) * | 1998-12-07 | 2001-10-04 | Universal City Studios, Inc. | Image correction method to compensate for point of view image distortion |
US6525765B1 (en) | 1997-04-07 | 2003-02-25 | Pandora International, Inc. | Image processing |
WO2003096302A2 (en) * | 2002-05-14 | 2003-11-20 | Cae Inc. | Graphical user interface for a flight simulator based on a client-server architecture |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10320590A (en) | 1997-05-19 | 1998-12-04 | Honda Motor Co Ltd | Composite image production device and method therefor |
JP3449254B2 (en) | 1997-11-14 | 2003-09-22 | ヤマハ株式会社 | D / A converter |
US6356297B1 (en) | 1998-01-15 | 2002-03-12 | International Business Machines Corporation | Method and apparatus for displaying panoramas with streaming video |
DE10021824C2 (en) | 1999-05-07 | 2002-01-31 | Yamaha Corp | D / A converter device and D / A converter method |
US9425747B2 (en) | 2008-03-03 | 2016-08-23 | Qualcomm Incorporated | System and method of reducing power consumption for audio playback |
CN108922307A (en) * | 2018-07-26 | 2018-11-30 | 杭州拓叭吧科技有限公司 | Drive simulating training method, device and driving simulation system |
KR102256610B1 (en) * | 2021-01-12 | 2021-05-26 | 서울특별시 | Visual effect and sudden stop experience system of subway |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3862358A (en) * | 1971-11-04 | 1975-01-21 | Us Navy | Visual simulation system |
EP0185905A1 (en) * | 1984-12-26 | 1986-07-02 | International Business Machines Corporation | Method of creating a document |
EP0230141A2 (en) * | 1986-01-02 | 1987-07-29 | Texas Instruments Incorporated | Porthole window system for computer displays |
EP0284764A2 (en) * | 1987-03-02 | 1988-10-05 | International Business Machines Corporation | Improved user transaction guidance |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5474453A (en) * | 1993-02-17 | 1995-12-12 | Atari Games Corporation | Scenario development system for vehicle simulators |
US5660547A (en) * | 1993-02-17 | 1997-08-26 | Atari Games Corporation | Scenario development system for vehicle simulators |
WO1994019784A1 (en) * | 1993-02-17 | 1994-09-01 | Atari Games Corporation | Scenario development system for vehicle simulators |
FR2728995A1 (en) * | 1994-12-29 | 1996-07-05 | Renault | Image projection system for automobile driving simulator |
FR2730842A1 (en) * | 1995-02-17 | 1996-08-23 | Renault | Process for visualising images of adjustable size in field of view for motor vehicle driving simulator |
US5835101A (en) * | 1996-04-10 | 1998-11-10 | Fujitsu Limited | Image information processing apparatus having means for uniting virtual space and real space |
GB2351199A (en) * | 1996-09-13 | 2000-12-20 | Pandora Int Ltd | Automatic insertion of computer generated image in video image. |
GB2351199B (en) * | 1996-09-13 | 2001-04-04 | Pandora Int Ltd | Image processing |
US6525765B1 (en) | 1997-04-07 | 2003-02-25 | Pandora International, Inc. | Image processing |
FR2765015A1 (en) * | 1997-06-20 | 1998-12-24 | Thomson Csf | COMPACT VIEWING DEVICE FOR VEHICLE SIMULATOR WITH WIDE DRIVING CAB |
EP0886245A2 (en) * | 1997-06-20 | 1998-12-23 | Nippon Telegraph And Telephone Corporation | Display of moving object on background image |
EP0886254A1 (en) * | 1997-06-20 | 1998-12-23 | Thomson-Csf | Compact display device for vehicle simulator with large driving cabin |
EP0886245A3 (en) * | 1997-06-20 | 2004-07-21 | Nippon Telegraph And Telephone Corporation | Display of moving object on background image |
EP1138159A1 (en) * | 1998-12-07 | 2001-10-04 | Universal City Studios, Inc. | Image correction method to compensate for point of view image distortion |
EP1138159A4 (en) * | 1998-12-07 | 2007-05-02 | Universal City Studios Inc | Image correction method to compensate for point of view image distortion |
WO2003096302A2 (en) * | 2002-05-14 | 2003-11-20 | Cae Inc. | Graphical user interface for a flight simulator based on a client-server architecture |
WO2003096302A3 (en) * | 2002-05-14 | 2004-04-29 | Cae Inc | Graphical user interface for a flight simulator based on a client-server architecture |
US7117135B2 (en) | 2002-05-14 | 2006-10-03 | Cae Inc. | System for providing a high-fidelity visual display coordinated with a full-scope simulation of a complex system and method of using same for training and practice |
CN100476881C (en) * | 2002-05-14 | 2009-04-08 | Cae公司 | Graphical user interface for a flight simulator based on a client-server architecture |
Also Published As
Publication number | Publication date |
---|---|
JP3200163B2 (en) | 2001-08-20 |
JPH0793579A (en) | 1995-04-07 |
GB2256568B (en) | 1995-06-07 |
GB9112073D0 (en) | 1991-07-24 |
Similar Documents
Publication | Title |
---|---|
US5694533A (en) | 3-Dimensional model composed against textured midground image and perspective enhancing hemispherically mapped backdrop image for visual realism | |
GB2256568A (en) | Image generation system for 3-d simulations | |
US5751927A (en) | Method and apparatus for producing three dimensional displays on a two dimensional surface | |
US4343037A (en) | Visual display systems of the computer generated image type | |
US5905499A (en) | Method and system for high performance computer-generated virtual environments | |
US5459529A (en) | Video processing for composite images | |
EP0583060A2 (en) | Method and system for creating an illusion of three-dimensionality | |
JP3759971B2 (en) | How to shade a 3D image | |
JPS63502464A (en) | Comprehensive distortion correction in real-time image generation systems | |
JPH0757117A (en) | Forming method of index to texture map and computer control display system | |
JPH0375682A (en) | High frequency signal detecting apparatus | |
JPH05506520A (en) | computer video processing system | |
JPH05507166A (en) | image generator | |
Yang et al. | Nonlinear perspective projections and magic lenses: 3D view deformation | |
EP0100097B1 (en) | Computer controlled imaging system | |
GB2051525A (en) | C.G.I.-Surface textures | |
JPH07200870A (en) | Stereoscopic three-dimensional image generator | |
Ono et al. | A photo-realistic driving simulation system for mixed-reality traffic experiment space | |
Mueller | Architectures of image generators for flight simulators | |
JPH0241785B2 (en) | ||
JPH0850469A (en) | Magnified display device for large screen with high resolution | |
JPH04213780A (en) | Image processing method | |
JPH09190547A (en) | Image compositing and display device and its method | |
Blake | The natural flow of perspective: Reformulating perspective projection for computer animation | |
JP2511771B2 (en) | Image memory type image generator |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
732E | Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977) | ||
PE20 | Patent expired after termination of 20 years |
Expiry date: 20110604 |