GB2533625A - Visual Display Systems - Google Patents

Visual Display Systems

Info

Publication number
GB2533625A
GB2533625A (application GB1423145.0A / GB201423145A)
Authority
GB
United Kingdom
Prior art keywords
display devices
image
visual
frame image
visual display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1423145.0A
Other versions
GB2533625B (en)
Inventor
Christopher Lewis Mark
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MICRO NAV Ltd
Original Assignee
MICRO NAV Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by MICRO NAV Ltd filed Critical MICRO NAV Ltd
Priority to GB1423145.0A priority Critical patent/GB2533625B/en
Publication of GB2533625A publication Critical patent/GB2533625A/en
Application granted granted Critical
Publication of GB2533625B publication Critical patent/GB2533625B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/80Geometric correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1446Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/147Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/18Image warping, e.g. rearranging pixels individually
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00Simulators for teaching or training purposes
    • G09B9/02Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/08Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
    • G09B9/30Simulation of view from aircraft
    • G09B9/301Simulation of view from aircraft by computer-processed or -generated image
    • G09B9/302Simulation of view from aircraft by computer-processed or -generated image the image being transformed by computer processing, e.g. updating the image to correspond to the changing point of view
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/14Solving problems related to the presentation of information to be displayed
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/18Use of a frame buffer in a display terminal, inclusive of the display panel
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method of generating a non-linear geometrical adjustment of a multi-frame visual presentation which is arranged to be displayed over a number of visual display devices (1a; 1b, in figure 2), such that each display device displays an image portion of the overall frame image, and the method comprising generating 103 a two-dimensional mesh geometry and elongating 102 a vertical dimension of a viewing frustum. The geometrical correction may generate a smooth continuous transition between image portions on adjacent screens. The visual presentation may comprise computer generated images of a three dimensional scene which may be rendered 104 using a perspective projection. The computer generated images may be copied 105 to a texture which is mapped to the 2 dimensional mesh.

Description

VISUAL DISPLAY SYSTEMS
Technical Field
The present invention relates generally to visual displays.
Background
In Computer Generated Imagery (CGI) a graphics technique is employed such that three-dimensional objects are rendered in the scene according to their distance from the observer; objects in the distance appear smaller than objects in the foreground. Frequently, this perspective effect is achieved by applying a Perspective Projection when converting from three-dimensional world-space positions to two-dimensional screen-space co-ordinates. However, when used within multi-screen Segmented Displays, Perspective Projection on each segment has the unfortunate effect of producing discontinuities at the joins between adjacent segments. This effect is best illustrated using a spherical grid, in which points of equal latitude are joined to form horizontal lines that encircle a sphere, the central one being referred to as the equator, and points of equal longitude are joined to form vertical lines that extend between the north and south poles. If an observer is placed at the centre of such a spherical grid looking outwards, Perspective Projection will produce an image on a flat screen similar to that shown in Figure 1. In a segmented display consisting of two or more adjacent visual display devices, such as flat screen devices, the combined image will appear similar to that shown in Figure 2, on screens 1a and 1b. It can be seen from this combined image that, except for the equator, all other lines of constant latitude are discontinuous across the transition from one segment to the other. This is a general problem with the presentation of three-dimensional linear features that span multiple screens. In the computer simulation of the real world, such linear features would include roads, runways, railway lines and the Earth's horizon. If our spherical grid model is replaced with a moving three-dimensional world scene comprising ground, sky, roads, buildings, etc.
it can be seen that the discontinuities manifest here also. Figure 3 shows such a scene displayed on two adjacent flat screen monitors. The observer is looking down on an airfield scene. The reader is invited to imagine a vehicle travelling along the straight service road in the foreground of this scene. As the vehicle moves along the road from left to right, it transitions from one monitor to the other and as it does so it appears to change direction slightly. Clearly this is an unrealistic and unhelpful representation of the scene.
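The apparent change of direction at the join follows directly from flat-panel perspective projection, in which a ray at elevation φ and local azimuth θ lands at screen height tan φ / cos θ. The sketch below is an illustration of that standard projection geometry (not code from the patent): it reproduces the kink for two adjacent 60-degree panels whose forward axes point at -30 and +30 degrees.

```python
import math

def panel_y(phi_deg, theta_local_deg):
    """Perspective screen height, on a unit-distance flat panel, of a ray with
    elevation phi at azimuth theta measured from the panel's forward axis:
    y = tan(phi) / cos(theta)."""
    phi = math.radians(phi_deg)
    theta = math.radians(theta_local_deg)
    return math.tan(phi) / math.cos(theta)

PHI = 10.0  # a line of constant latitude, 10 degrees above the horizon

# Height of the latitude line at the shared edge, as drawn by each panel
# (local azimuth +30 on the left panel, -30 on the right panel):
y_left_edge = panel_y(PHI, +30.0)
y_right_edge = panel_y(PHI, -30.0)

# Height one degree inside each panel:
y_left_inner = panel_y(PHI, +29.0)
y_right_inner = panel_y(PHI, -29.0)

# The line meets at the same height on both panels...
assert abs(y_left_edge - y_right_edge) < 1e-9
# ...but it rises toward the join on one panel and falls away from it on the
# other, so its apparent direction kinks at the transition.
assert y_left_edge > y_left_inner
assert y_right_inner < y_right_edge
```

This is exactly the effect seen on the straight service road: the same point is drawn at the same height on both monitors, but the slope of the line flips sign across the join.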
We have recognised that an improved display would be desirable and have devised a way to provide a visually more realistic presentation of an image displayed on a plurality of display segments.
Summary
According to a first aspect of the invention there is provided a method of generating a non-linear geometrical adjustment of a multi-frame visual presentation which is arranged to be displayed over a number of visual display devices, such that each display device displays an image portion of the overall frame image, and the method comprising generating a two-dimensional mesh geometry by elongating a vertical dimension of a viewing frustum.
According to a second aspect of the invention there is provided a visual display apparatus comprising a data processor configured to perform the method of the first aspect of the invention.
According to a third aspect of the invention there are provided machine-readable instructions for execution by a data processor which is arranged to implement the method of the first aspect of the invention. The instructions may be embodied as a software product.
The method, apparatus or machine readable instructions, may include one or more features described in the description and/or shown in the drawings.
Brief Description of the Drawings
Various embodiments of the invention will now be described, by way of example only, with reference to the following drawings, in which: Figure 4 is a block diagram of a computer image generation display system, Figure 5 is a flow diagram, Figure 6 is a representation of an elongated wire mesh, Figure 7 is a view of two display screens displaying a correct grid pattern, and Figure 8 is a view of two display screens displaying geometrically corrected representations of a scene.
Detailed Description
There will now be described a visual display apparatus 10 which performs an adjustment to an image collectively displayed over a number of visual display devices, such as flat screen displays. The apparatus 10 comprises two flat screen panels 1a and 1b, and a data processor 2.
The displays may be termed ganged. The data processor 2 comprises a graphics processor configured to process image data to be displayed over the displays 1a and 1b. The processor 2 is provided with image data, stored in an associated memory, and is operative to process that data to generate signals which drive what is displayed on the displays 1a and 1b. In the description which follows, reference is made to a segmented display, and this should be understood to include a plurality of visual display devices which collectively display an image or a series of images, with each device displaying a portion of the overall image.
Broadly, the data processor 2 comprises instructions to effect a non-linear geometric adjustment by first vertically elongating the viewing frustum and then applying an additional rendering pass for each frame on a real-time basis. The elongated, or extended, vertical field of view, VFOVext, is calculated from HFOV and VFOV, the original horizontal and vertical fields of view respectively.
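The patent's formula has not survived reproduction here. A plausible reconstruction from the projection geometry, offered as an assumption rather than the patent's exact expression, chooses the smallest extended field of view whose warped image still fills the screen at the outermost columns, where perspective magnifies vertical extent by a factor of 1/cos(HFOV/2):

```latex
\mathrm{VFOV}_{\mathrm{ext}} \;=\; 2\arctan\!\left(\frac{\tan\left(\mathrm{VFOV}/2\right)}{\cos\left(\mathrm{HFOV}/2\right)}\right)
```

With this choice the warp leaves the outermost columns of the image unchanged and stretches the central columns vertically, which is consistent with the curved mesh of Figure 6.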
Each frame, after the scene has been rendered using standard perspective projection, the contents of the frame buffer are copied into a texture. This texture is mapped onto a two-dimensional polygon mesh, and this mesh is then rendered using orthographic (parallel) projection into the frame buffer. Thus the original image is stretched and warped according to the geometry of the polygon mesh. With the use of high performance Graphics Processing Units (GPUs), these additional operations can be executed in less than ten milliseconds, so advantageously the frame update rate of the graphics sub-system is either unaffected or only marginally reduced.
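As a software stand-in for this render-to-texture pass, the NumPy sketch below applies the equivalent per-column vertical resampling directly to a frame array. Both the remap formula and the `warp_frame` helper are reconstructions from the geometry, assumed for illustration, and are not taken from the patent.

```python
import numpy as np

def warp_frame(frame, hfov_deg, vfov_deg):
    """Vertically remap a rendered frame (H x W): each output pixel samples the
    source at y_src = y_out * tan(VFOV/2) / (cos(theta) * tan(VFOVext/2)),
    where theta is the viewing azimuth of the pixel's column. At the outermost
    columns the mapping is the identity; central columns are stretched."""
    h, w = frame.shape[:2]
    half_h = np.tan(np.radians(hfov_deg) / 2.0)
    half_v = np.tan(np.radians(vfov_deg) / 2.0)
    half_v_ext = half_v / np.cos(np.radians(hfov_deg) / 2.0)  # extended VFOV

    # normalised device coordinates in [-1, 1]
    x = (np.arange(w) + 0.5) / w * 2.0 - 1.0
    y = (np.arange(h) + 0.5) / h * 2.0 - 1.0
    theta = np.arctan(x * half_h)  # azimuth of each column

    # source row (normalised) for every output pixel, nearest-neighbour sample
    y_src = y[:, None] * half_v / (np.cos(theta)[None, :] * half_v_ext)
    rows = np.clip(((y_src + 1.0) / 2.0 * h).astype(int), 0, h - 1)
    cols = np.broadcast_to(np.arange(w), (h, w))
    return frame[rows, cols]

# a synthetic 64 x 128 "rendered frame" whose pixel value is its row index
frame = np.broadcast_to(np.arange(64)[:, None], (64, 128)).copy()
out = warp_frame(frame, hfov_deg=60.0, vfov_deg=40.0)
assert out.shape == frame.shape
# the central column is stretched: its top pixel comes from further down
assert out[0, 64] > out[0, 0]
```

A real implementation would of course perform this sampling on the GPU via the textured mesh, with filtered rather than nearest-neighbour lookups.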
The geometric adjustment needed is only in the vertical axis, hence the X ordinates of the vertices within the polygon mesh are set according to a constant linear progression, while the Y ordinates are calculated from the original and extended fields of view. From these, the data processor 2 calculates an adjusted mesh geometry, which is a modified reference co-ordinate system relating to the two-dimensional space of the displays. A wire mesh representation of such a geometry is shown in Figure 6, in which it can be seen that there is a degree of curvature of the grid lines in the vertical dimension. When this geometry is used in rendering a frame image (including texture and perspective scaling), as will be described below, a more visually realistic transition between adjacent displays results.
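The patent's formulae for the Y ordinates are not reproduced here. The sketch below generates a mesh with the stated properties, linear X ordinates and per-column vertical scaling, using a reconstructed Y formula that is an assumption rather than the patent's own, and exhibits the Figure 6 behaviour of grid lines bowing outward at the central columns.

```python
import numpy as np

def build_mesh(nx, ny, hfov_deg, vfov_deg):
    """Generate the warped 2-D mesh. X ordinates follow a constant linear
    progression; Y ordinates are scaled per column by (an assumed formula)
        y = v * cos(theta) * tan(VFOVext/2) / tan(VFOV/2),
    where theta = arctan(u * tan(HFOV/2)) is the column's azimuth, so that
    after the orthographic pass lines of constant latitude come out straight."""
    half_h = np.tan(np.radians(hfov_deg) / 2.0)
    half_v = np.tan(np.radians(vfov_deg) / 2.0)
    half_v_ext = half_v / np.cos(np.radians(hfov_deg) / 2.0)

    u = np.linspace(-1.0, 1.0, nx)   # linear X progression
    v = np.linspace(-1.0, 1.0, ny)
    theta = np.arctan(u * half_h)
    scale = np.cos(theta) * half_v_ext / half_v  # per-column vertical scale

    xs = np.broadcast_to(u, (ny, nx))
    ys = v[:, None] * scale[None, :]
    return xs, ys

xs, ys = build_mesh(nx=21, ny=11, hfov_deg=60.0, vfov_deg=40.0)
# at the outermost columns the scale is exactly 1 (the mesh meets the screen
# edge), while the central column bows beyond it, as in Figure 6:
assert abs(ys[-1, 0] - 1.0) < 1e-9 and abs(ys[-1, -1] - 1.0) < 1e-9
assert ys[-1, 10] > 1.0
```

The vertex X values are identical in every row, confirming that the adjustment acts only on the vertical axis.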
The process which is executed by the data processor 2 is illustrated in the flow diagram in Figure 5. Once the processor has commenced operating, at step 102 the extended vertical field of view is calculated as described above. At step 103, on the basis of those calculations, a mesh geometry applicable to the area of the display screens is calculated. At step 104, scene data is used to generate a perspective projection of the frame image of the scene using the extended field of view. In a perspective projection, as when the human eye views a scene, objects in the distance appear smaller than objects close by; representing distant objects as smaller provides additional realism.
At step 105, the data processor 2 is configured to copy the perspective projection image data stored in the buffer to a texture. In 3D graphics, texture may be understood as referring to the digital representation of the surface of an object, such as colour and brightness, how transparent and reflective the object is, etc. Once a texture has been defined, it can be wrapped around a 3-dimensional object. In the current context, texture relates principally, although not exclusively, to first storing a copy of an RGB image contained in the frame buffer after the normal rendering of the scene, then mapping this image over the warped polygon mesh.
At step 106, a parallel, or orthographic, projection is applied to render the textured mesh generated at step 105. Orthographic projection ignores the effect of perspective, whereby objects which are further away appear smaller. Types of view created by orthographic projection include plan, cross-section, bird's-eye, and elevation. The texture is applied onto a two-dimensional polygon mesh, and thus the original image is stretched and warped according to the geometry of the polygon mesh.
The processor is arranged to then output signalling including this data to the displays. The image rendering steps are then repeated for subsequent frames. The whole image field of view is sub-divided by the number of segments used, and each segment is configured to display its sub-area within the whole, having field of view and directional offset applied accordingly. For example, if the whole field of view is 120 degrees horizontally by 40 degrees vertically and two segments are used, they would both be configured with a field of view of 60 by 40 degrees; the left segment would be programmed with a -30 degree horizontal offset and the right segment would be programmed with a +30 degree horizontal offset. Thus the two segments collectively display the whole 120 by 40 degree image. Reference is made to Figure 7, which shows a spherical grid displayed on two adjacent flat panel displays. As can be seen, the lines of constant latitude are now straight and continuous in their transition across the displays; without the processing steps above, they would be discontinuous at the transition between the displays. Reference is also made to Figure 8, which shows an airport scene. A straight service road is shown in the foreground, and spans the two displays. As can be seen, there is a smooth and continuous transition of this linear feature across the displays, thus enhancing the visual realism of the CGI scene.
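The subdivision described above is simple to express in code. The `segment_layout` helper below is illustrative (not from the patent) and reproduces the 120-by-40-degree, two-segment example with its -30 and +30 degree offsets.

```python
def segment_layout(total_hfov, total_vfov, n_segments):
    """Split the whole field of view evenly across n adjacent segments,
    returning (horizontal_offset_deg, hfov_deg, vfov_deg) for each segment."""
    seg_hfov = total_hfov / n_segments
    layout = []
    for i in range(n_segments):
        # centre of segment i, measured from the overall view direction
        offset = -total_hfov / 2.0 + seg_hfov / 2.0 + i * seg_hfov
        layout.append((offset, seg_hfov, total_vfov))
    return layout

# The example from the text: 120 x 40 degrees over two segments gives two
# 60 x 40 degree views, offset by -30 and +30 degrees respectively.
assert segment_layout(120.0, 40.0, 2) == [(-30.0, 60.0, 40.0), (30.0, 60.0, 40.0)]
```

The same rule extends to any number of segments; three segments over the same view would each cover 40 by 40 degrees at offsets of -40, 0 and +40.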
Applications of the above process include, but are not limited to, visual simulation for vehicle simulators whose purpose is to train prospective drivers/pilots and Air Traffic Control (ATC) simulators to train prospective air traffic controllers. The process achieves a higher degree of visual realism, and therefore provides an enhanced training experience.
Although flat screen displays are referred to above, the displays may include TV/monitor displays and flat projection screens illuminated from the front or rear by image projectors. It will also be appreciated that although two displays are shown, the display may comprise more screens.

Claims (10)

  1. A method of generating a non-linear geometrical adjustment of a multi-frame visual presentation which is arranged to be displayed over a number of visual display devices, such that each display device displays an image portion of the overall frame image, and the method comprising generating a two-dimensional mesh geometry by elongating a vertical dimension of a viewing frustum.
  2. A method as claimed in claim 1 in which the two-dimensional mesh geometry is such as to provide a geometric adjustment to stretch and warp the frame image.
  3. A method as claimed in claim 1 or claim 2 in which the mesh geometry is a two-dimensional reference co-ordinate system domain of the visual display devices.
  4. A method as claimed in claim 1 or claim 2 in which the geometrical adjustment is such as to generate a substantially smooth continuous transition between image portions of adjacent visual display devices.
  5. A method as claimed in any preceding claim which comprises rendering frame image data using a perspective projection.
  6. A method as claimed in claim 4 which comprises applying a texture to the rendered frame image data.
  7. A method as claimed in claim 5 which comprises mapping the rendered frame image data onto the two-dimensional mesh geometry.
  8. A method as claimed in claim 6 which comprises rendering the mapped frame image data using an orthographic projection.
  9. Computer executable instructions which, when executed by a data processor, are arranged to output signals for generating a multi-frame visual presentation over a number of visual display devices, wherein each display is caused to show a portion of an overall frame image, and the instructions are configured such that a two-dimensional mesh geometry is generated by elongating a vertical dimension of a viewing frustum.
  10. A data processor for a visual display system, the processor arranged to generate a non-linear geometrical adjustment of a multi-frame visual presentation which is arranged to be displayed over a number of visual display devices, such that each display device displays an image portion of the overall frame image, by generating a two-dimensional mesh geometry by elongating a vertical dimension of a viewing frustum.

Amendments to the claims have been made as follows:

CLAIMS
1. A method of correcting computer generated three-dimensional images in real-time that collectively form a multi-frame visual presentation which is arranged to be displayed over a plurality of visual display devices, wherein each display device displays a respective image portion of an overall frame image, and the method comprising, for each image portion, elongating a vertical dimension of a viewing frustum and applying non-linear geometrical adjustment by means of a two-dimensional mesh geometry and further applying a rendering pass, before being output to a respective display device, and wherein the geometrical adjustment is such as to generate a substantially smooth continuous transition between image portions of adjacent visual display devices.
2. A method as claimed in claim 1 in which the two-dimensional mesh geometry is such as to provide a geometric adjustment to stretch and warp the frame image.
3. A method as claimed in claim 1 or claim 2 in which the mesh geometry is a two-dimensional reference co-ordinate system domain of the visual display devices.
4. A method as claimed in any preceding claim which comprises rendering frame image data using a perspective projection.
5. A method as claimed in claim 4 which comprises applying a texture to the rendered frame image data.
6. A method as claimed in claim 4 which comprises mapping the rendered frame image data onto the two-dimensional mesh geometry.
7. A method as claimed in claim 5 which comprises rendering the mapped frame image data using an orthographic projection.
8. Computer executable instructions which, when executed by a data processor, are arranged to output signals for generating a multi-frame visual presentation over a number of visual display devices, wherein each display is caused to show an image portion of an overall frame image, the instructions configured such that, for each image portion, a vertical dimension of a viewing frustum is elongated and a non-linear geometrical adjustment is applied by means of a two-dimensional mesh geometry and a rendering pass is further applied, before being output to a respective display device, and wherein the geometrical adjustment is such as to generate a substantially smooth continuous transition between image portions of adjacent visual display devices.
9. A data processor for a visual display system, the processor arranged to generate a multi-frame visual presentation which is arranged to be displayed over a number of visual display devices, such that each display device displays an image portion of the overall frame image, and the processor configured such that, for each image portion, a vertical dimension of a viewing frustum is elongated and a non-linear geometrical adjustment is applied by means of a two-dimensional mesh geometry and a rendering pass is further applied, before being output to a respective display device, and wherein the geometrical adjustment is such as to generate a substantially smooth continuous transition between image portions of adjacent visual display devices.
10. A method, computer executable instructions, or data processor substantially as described, with reference to the accompanying drawings.
GB1423145.0A 2014-12-23 2014-12-23 Visual Display Systems Active GB2533625B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1423145.0A GB2533625B (en) 2014-12-23 2014-12-23 Visual Display Systems


Publications (2)

Publication Number Publication Date
GB2533625A true GB2533625A (en) 2016-06-29
GB2533625B GB2533625B (en) 2021-07-28

Family

ID=56100080

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1423145.0A Active GB2533625B (en) 2014-12-23 2014-12-23 Visual Display Systems

Country Status (1)

Country Link
GB (1) GB2533625B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107979766A (en) * 2016-10-21 2018-05-01 Fluxplanet Co., Ltd. Content streaming system and method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2184926A1 (en) * 2008-11-07 2010-05-12 Barco NV Non-linear image mapping using a plurality of non-linear image mappers of lesser resolution
WO2012170559A2 (en) * 2011-06-06 2012-12-13 Array Telepresence, Inc. Dual-axis image equalization in video conferencing
US20140160233A1 (en) * 2012-12-11 2014-06-12 Sony Corporation Information processing apparatus, information processing method, and program


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107979766A (en) * 2016-10-21 2018-05-01 Fluxplanet Co., Ltd. Content streaming system and method
CN107979766B (en) * 2016-10-21 2020-07-24 弗勒克斯普拉内特有限公司 Content streaming system and method

Also Published As

Publication number Publication date
GB2533625B (en) 2021-07-28

Similar Documents

Publication Publication Date Title
CN103337095B (en) The tridimensional virtual display methods of the three-dimensional geographical entity of a kind of real space
US10096157B2 (en) Generation of three-dimensional imagery from a two-dimensional image using a depth map
CN102243768B (en) Method for drawing stereo picture of three-dimensional virtual scene
US20110084983A1 (en) Systems and Methods for Interaction With a Virtual Environment
CN106897976B (en) Single video card triple channel solid what comes into a driver's projection software based on GPU corrects fusion method
CN102289845B (en) Three-dimensional model drawing method and device
CN101968890B (en) 360-degree full-view simulation system based on spherical display
GB2244412A (en) Simulating non-homogeneous fog with a plurality of translucent layers
DE202014010937U1 (en) Superposition of two-dimensional map data on a three-dimensional scene
KR101359011B1 (en) 3-dimensional visualization system for displaying earth environment image
US10609353B2 (en) Systems and methods for generating and displaying stereoscopic image pairs of geographical areas
US20170124748A1 (en) Method of and apparatus for graphics processing
KR20150124112A (en) Method for Adaptive LOD Rendering in 3-D Terrain Visualization System
CN104282014A (en) Multichannel geometric correction and edge blending method based on NURBS curved surfaces
EP3057316B1 (en) Generation of three-dimensional imagery to supplement existing content
CN106204703A (en) Three-dimensional scene models rendering intent and device
GB2256568A (en) Image generation system for 3-d simulations
GB2533625A (en) Visual Display Systems
CN102087465B (en) Method for directly assigning and displaying true three-dimensional simulation regions
CN111599011B (en) Power system scene rapid construction method and system based on WebGL technology
US9875573B2 (en) Method and apparatus for rendering a 3-dimensional scene
Ono et al. A photo-realistic driving simulation system for mixed-reality traffic experiment space
Xing et al. Multi-projector three-dimensional display for 3D Geographic Information System
CN106170085A (en) A kind of without mirror solid engine exchange method
CN105447812A (en) 3D moving image displaying and information hiding method based on linear array