GB2514008A - Display system - Google Patents

Display system

Info

Publication number
GB2514008A
GB2514008A GB1407819.0A GB201407819A
Authority
GB
United Kingdom
Prior art keywords
display apparatus
display
processor
projector
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1407819.0A
Other versions
GB201407819D0 (en)
GB2514008B (en)
Inventor
Mark Lewis
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
MICRO NAV Ltd
Original Assignee
MICRO NAV Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by MICRO NAV Ltd filed Critical MICRO NAV Ltd
Publication of GB201407819D0 publication Critical patent/GB201407819D0/en
Publication of GB2514008A publication Critical patent/GB2514008A/en
Application granted granted Critical
Publication of GB2514008B publication Critical patent/GB2514008B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3147Multi-projection systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191Testing thereof
    • H04N9/3194Testing thereof including sensor feedback

Abstract

A display apparatus 1 comprises multiple fixed position image projectors 2, a display screen 10 which comprises an at least partly cylindrical display surface, a data processor 15, multiple fixed position image sensors 3, and a light emitter arrangement, which may comprise an array of LEDs, and which is arranged to emit light at a plurality of reference point locations (11, fig.2) on the screen. The data processor is arranged to perform an alignment routine in which the image sensors are arranged to record images of the activated emitters, and also to record reference images projected by the projectors into the screen, the data processor being arranged to process the recorded images so as to generate a transformation between the display surface and the image sensor, and a transformation between the projector and the display surface, and to use the transformations to generate a non-linear geometric correction to allow projected images from the projectors to be aligned with the display screen.

Description

DISPLAY SYSTEM
Technical Field
The present invention relates generally to projector display systems.
Background
The set-up and on-going maintenance of multi-projector blended displays, such as simulation displays, has long been the preserve of a small number of specialist display engineers. These experts built up their skills and knowledge over years of experience, and the work involved has often taken days of painstaking manual adjustment and tuning with the aid of expensive specialist equipment. Sadly, even though huge amounts of time and money have been spent on this work, the end results of manual alignment have never been perfect. Moreover, as time progresses these displays have become misaligned due to small mechanical shifts, and colour differences creep in due to projector lamp aging or lamp changes. It is often necessary to call back the experts to re-align these displays, and at significant cost to the end-user.
We have appreciated that there is a need for the automated calibration of a projector based display system.
Summary
According to a first aspect of the invention there is provided a display apparatus as claimed in claim 1.
It will be appreciated that although reference is made to a processor, this covers both a single physical entity as well as multiple data processing entities.
The light emitter arrangement may comprise a plurality of light emitter components. The light emitter components are preferably selectively activatable.
The light emitter arrangement may be incorporated into the screen.
The processor may be arranged to generate a three dimensional model of the screen using the acquired image data.
The processor may combine data indicative of the shape of the display surface with image data from the image sensor.
The data processor may be arranged to use the transformation for at least one other aspect of the calibration routine.
The processor may be arranged to cause the projector to display reference patterns/images on the display screen, which patterns/images are sensed by the image sensor, and the data therefrom used to determine calibration data.
The processor may be arranged to allow transformation from a display screen frame of reference to an image sensor frame of reference, and vice versa.
The processor may be arranged to control activation of the emitters and to synchronise the activation with activation of the image sensor.
The projector may be arranged to project moving images.
The apparatus may be a simulator. The apparatus may comprise an aircraft control tower simulator.
The processor may be arranged to activate the emitters in a predetermined sequence.
The apparatus may comprise a plurality of projectors and a plurality of image sensors, each of said projectors being associated with a respective image sensor.
Each projector and each image sensor may be directed towards a respective portion of the display surface, with one portion neighbouring an adjacent portion.
The emitters may provide fixed reference points (relative to the display surface) for the image sensor.
The processor may be arranged to apply data acquired from the calibration routine to rendered image data to serve as input data to the projector.
The processor may be arranged to cause one or more images to be displayed on the display surface, which at least one image is sensed by the image sensor, and the sensed image data used to generate calibration data.
The apparatus may be a front-illuminated projector apparatus.
The reference point locations may be spaced at regular intervals over the display surface.
The reference point locations may be arranged in a grid pattern.
The light emitter arrangement may be controllable by way of control signals issued from a data processor.
The light emitter arrangement may be arranged to be activated in a predetermined sequence.
The reference point locations may be controllable to be in an ON state or OFF state.
The reference point locations may be embedded within the display surface.
According to a second aspect of the invention there is provided a method of aligning a projector display system, comprising use of the display screen of the first aspect of the invention, the method comprising sensing the light from the light emitter arrangement by an image sensor, and using the data so obtained to generate a transformation between the display surface and the image sensor.
The method may be a fully automated process.
Any of the above aspects, or further aspects, may comprise one or more features disclosed in the description and/or the drawings.
Brief Description of the drawings
Various embodiments of the invention will now be described, by way of example only, with reference to the following drawings, in which:
Figure 1 is a schematic view of a projector display system,
Figure 2 is a perspective view of a display screen of the system of Figure 1,
Figure 3 is a block diagram of an image renderer, and
Figure 4 is a block diagram of a display calibration renderer.
Detailed Description
With reference initially to Figure 1, there is shown a visual display apparatus or system 1. The apparatus 1 comprises a plurality of projectors 2, a plurality of image sensors comprising cameras 3 and a plurality of data processors. Each trio of a projector 2, a camera 3 and a data processor 4 forms, collectively, a respective image generator node, which in turn can be considered as a respective image channel. The apparatus 1 further comprises a curved display screen 10 onto which images from the projectors are displayed. As will be described in more detail below, each image generator node causes a projected image to be displayed on a respective portion of the display screen 10. The apparatus further comprises a display system manager 15 and a screen controller 16. Broadly, the apparatus has two principal modes of operation: a calibration mode and an image generation mode. In the alignment/calibration mode, data is acquired so that correction/calibration data can be applied to rendered image data during the image generation mode.
The system as shown in Figure 1 also comprises hubs 2a, 4a and 15a.
The data processing performed by the data processors of the apparatus may be considered as forming three processing modules: a Display Calibration Processor (DCP), an Image Generator (IG) and a Display Calibration Renderer (DCR). In broad terms, the DCP performs calculations during the calibration mode, and is controlled by a Display Calibration Manager (DCM), and the processing is distributed across the data processors of each of the IG nodes. The IG performs the necessary processing, in conjunction with the DCR, to generate image data which can then be transformed into projected images by the projectors. Common to both of the modes is the use of the Display Calibration Renderer (DCR) library. This plugs into either the Display Calibration Processor (DCP) when the system is operating in calibration mode, or the Image Generator (IG) when in image generation mode.
The IG is a real-time three-dimensional image renderer. It loads a model of the world (terrain, ocean, sky, buildings, etc.) plus movable vehicle models (aircraft, ground vehicles, ships, characters). It is instructed by a host simulation computer via a network interface, and instructions include the location (latitude, longitude, altitude) of the simulated eye-point, locations of all moveable objects (vehicles, etc.), and environmental conditions (time of day, day of year, weather conditions, clouds, etc.). From this information the IG generates images in real-time (typically 60Hz, but could be another value such as 30Hz).
A simulator visual system comprises an array of IG nodes, each generating a portion of the entire field of view of the display as a whole such that collectively the projected images cover the whole screen and provide a continuous unbroken image to the observer. Figure 3 shows a block diagram of the principal functional components of the renderer.
The DCR is a software library module that performs display calibration functions (geometric correction, blending, etc.) in post-processing after the normal 3D image has been rendered. The library "plugs in" to both the IG and the Display Calibration Processor (DCP). Data files are used to store (calibration) information that describes the exact form of geometric correction, blending, etc. to be applied. When used within the IG, each DCR loads the data file and this is then applied each frame to correct the generated image. When used within the DCP, the DCR constructs the calibration data and then this is saved into a data file.
The DCP runs on each IG node, and performs the necessary computations for the calibration process. It is controlled via a network interface by the Display Calibration Manager (DCM).
With reference to Figure 2, the display screen 10 is of curved shape (when viewed in plan) and in particular is of part arcuate/part-circular shape. It will be appreciated that in alternative embodiments the screen may be of overall complete circular shape, but the screen's display surface is at least in part cylindrical. An inwardly-facing surface 12 of the screen is arranged to receive and display images projected thereonto by the projectors. The display screen further comprises a plurality of light emitters 11, which are regularly spaced over the screen. In particular, the light emitters 11 are arranged in grid formation. The emitters are embedded within the screen, each one located at a specific polar location (azimuth and elevation) such that collectively they form a fixed reference for the cameras 3. The light emitters may comprise LEDs. The light emitters can be controllably activated (individually) by way of signals received from the controller 16 during the calibration mode.
Advantageously, the emitters are dimensioned such that they are not readily visible during the image generation mode.
The cameras 3 and the projectors 2 are held at fixed positions supported above the space bounded by the display screen 10, for example by way of a gantry structure or similar, with the camera of each node directed to the same portion of the display surface 12 as its associated projector 2. The projectors and the image sensors are thus provided as fixed-position components, in the sense that their positions are fixed relative to the display surface.
As will now be described in detail, the following alignment/calibration sub-routines are performed during the calibration mode:
Camera/screen transformation
Camera/projector transformation
Channel View Frustum
Geometric Image Warping
Soft Edge Blending
A first stage of the calibration mode is the calibration of the camera/screen transformation. The aim of this procedure is to create a means to transform between a camera space (pixel X, Y) and screen space (polar azimuth, elevation), and which is a bidirectional transformation. The steps of this procedure may be summarised as follows: 1. Calibrate camera lens distortion to create a transformation from captured X, Y pixel location to undistorted pixel location.
2. Install software-controlled light emitters at known polar locations on the screen surface.
3. Capture images, detect camera X, Y of each light emitter and correct these locations for lens distortion.
4. Find the optimum translation and rotation in three-dimensional space from which to render points that represent all the light emitter reference positions.
5. Render the three-dimensional screen model, comprising vertices at polar co-ordinates, using the translation and rotation found in the previous step.
6. Switch to orthographic projection and render the model again using the already projected two-dimensional co-ordinates, but correct the positions of these such that the ones that correspond with light emitter positions coincide exactly with the measured positions in the camera image.
7. Use an off-screen frame buffer to encode the azimuth and elevation.
8. The resulting image can then be used to find the azimuth and elevation of any pixel location.
9. To transform from screen space to camera space, use bilinear interpolation on the polar grid.
10. To transform from camera space to screen space, index into the image created above and extract azimuth and elevation.
It will be appreciated that it is not necessary to know where the camera is in three-dimensional space. Finally, the absolute accuracy of the light emitter reference locations can have a non-zero tolerance.
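By way of a non-limiting sketch, the bidirectional screen/camera transform of steps 9 and 10 can be illustrated in Python: the detected camera pixel position of each emitter is stored per polar grid node, and intermediate polar positions are mapped to pixels by bilinear interpolation. The toy 2x2 grid, the spacing parameters and the function names are illustrative assumptions, not taken from the specification.

```python
# Sketch of the screen-space to camera-space transform (step 9).
# Emitters sit on a regular polar grid; their detected camera pixel
# positions are stored per (azimuth, elevation) grid node.

def bilinear(p00, p10, p01, p11, u, v):
    """Bilinear blend of four corner pixel positions for fractions u, v in [0, 1]."""
    x = p00[0]*(1-u)*(1-v) + p10[0]*u*(1-v) + p01[0]*(1-u)*v + p11[0]*u*v
    y = p00[1]*(1-u)*(1-v) + p10[1]*u*(1-v) + p01[1]*(1-u)*v + p11[1]*u*v
    return (x, y)

def screen_to_camera(az, el, grid, az0, el0, daz, del_):
    """Map screen polar (azimuth, elevation) to camera pixel (x, y)
    by bilinear interpolation on the emitter reference grid."""
    gu = (az - az0) / daz          # fractional grid column
    gv = (el - el0) / del_         # fractional grid row
    i, j = int(gu), int(gv)
    u, v = gu - i, gv - j
    return bilinear(grid[(i, j)], grid[(i+1, j)],
                    grid[(i, j+1)], grid[(i+1, j+1)], u, v)

# Toy reference grid: emitters every 10 degrees, detected at these pixels.
grid = {(0, 0): (100.0, 400.0), (1, 0): (300.0, 410.0),
        (0, 1): (110.0, 200.0), (1, 1): (310.0, 190.0)}
print(screen_to_camera(5.0, 5.0, grid, 0.0, 0.0, 10.0, 10.0))
```

The reverse direction (step 10) would, as the text describes, index into the rendered off-screen image that encodes azimuth and elevation per camera pixel, rather than interpolating.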
A second stage of the calibration procedure is the camera/projector transformation to create a means to transform between a camera space (pixel X, Y) and a projector space (pixel X, Y). Again, this needs to be a bidirectional transformation. During this stage the following steps are performed: 1. Display a field of a black colour on the projector and capture the camera image. This will be subtracted from all subsequent camera images.
2. Display a field of a second colour on the projector and capture the camera image. This will be used to delimit the footprint of the projector.
3. Display a pattern on the projector, detect and store the camera X, Y locations of features of the pattern.
4. Display and capture a number of reference patterns that collectively enable the identification of each of the features displayed in the previous step. This determines the projector-space location of each detected feature.
5. To transform from camera-space to projector-space, index into the image generated from the above render and extract the projector-space location.
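One common way of realising step 4, identifying which projector pixel produced each detected feature, is a sequence of binary-coded patterns. The specification does not name the pattern scheme, so plain binary coding of the column index is used here purely as an illustrative assumption.

```python
# Sketch of recovering projector-space locations from a sequence of
# binary reference patterns (steps 3-4 of the camera/projector stage).

NUM_BITS = 4  # enough to distinguish a 16-column toy projector

def pattern(bit, col):
    """True if projector column `col` is lit in the pattern for `bit`."""
    return (col >> bit) & 1 == 1

def decode(samples):
    """Recover the projector column from the on/off values a single
    camera feature observed across the NUM_BITS captured patterns."""
    col = 0
    for bit, lit in enumerate(samples):
        if lit:
            col |= 1 << bit
    return col

# Simulate a camera feature that actually lies in projector column 11.
samples = [pattern(bit, 11) for bit in range(NUM_BITS)]
print(decode(samples))  # recovers 11
```

The black-field capture of step 1 would be subtracted from each of these captures before thresholding, so only projector light contributes to the on/off decision.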
A third stage of the calibration procedure is the Channel View Frustum. The aim of this procedure is to determine the field-of-view extents for a projector, to be used in the IG when rendering a perspective image. A summary of the steps performed is as follows: 1. For each projector space border pixel, compute the screen space polar co-ordinate by first converting from projector-space to camera-space, and then from camera-space to screen-space.
2. Determine the frustum extent for each of the left edge, the right edge, the bottom edge and the top edge.
3. Calculate the channel direction Heading and Pitch (H & P).
4. Adjust frustum to take into account the channel direction.
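The frustum steps above can be sketched as follows: the border pixels' screen-space polar co-ordinates bound the extents, the channel heading and pitch are taken as the centre of those bounds, and the extents are then re-expressed relative to the channel direction. The symmetric re-centring is an illustrative assumption; the specification does not detail the adjustment.

```python
# Sketch of deriving a channel view frustum (steps 1-4) from the
# screen-space polar co-ordinates of the projector's border pixels.

def channel_frustum(border_polars):
    """border_polars: list of (azimuth, elevation) pairs in degrees,
    already mapped projector -> camera -> screen space."""
    azs = [a for a, _ in border_polars]
    els = [e for _, e in border_polars]
    left, right = min(azs), max(azs)
    bottom, top = min(els), max(els)
    heading = (left + right) / 2.0   # channel direction H
    pitch = (bottom + top) / 2.0     # channel direction P
    # Express the extents relative to the channel direction.
    return {"heading": heading, "pitch": pitch,
            "left": left - heading, "right": right - heading,
            "bottom": bottom - pitch, "top": top - pitch}

f = channel_frustum([(-30.0, -5.0), (30.0, -5.0), (-28.0, 25.0), (28.0, 25.0)])
print(f)
```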
A fourth stage of the calibration procedure is Geometric Image Warping. The purpose of this stage is to compute the non-linear warping necessary to correctly align a projected image to the screen geometry. A summary of the steps is as follows: 1. Create a mesh at a configurable density.
2. For each mesh vertex: a. Compute the linear projected polar co-ordinate.
b. Transform from this polar co-ordinate (screen-space) to camera-space.
c. Transform from camera-space to projector-space.
d. Position the vertex at this pixel location.
3. After the IG has rendered each frame, read the contents of the frame buffer into a texture.
4. Apply this texture and render the mesh.
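The mesh-building loop of steps 1 and 2 can be sketched as below: each vertex's projector-space position is found by chaining the screen-to-camera and camera-to-projector transforms established in the earlier stages. The two mapping functions here are toy stand-ins for the calibrated transforms, included only so the sketch runs.

```python
# Sketch of building the geometric warp mesh (steps 1-2). The rendered
# frame is later applied as a texture to this mesh (steps 3-4).

def make_warp_mesh(cols, rows, linear_polar, screen_to_camera, camera_to_projector):
    """Return a (rows+1) x (cols+1) grid of projector pixel positions."""
    mesh = []
    for j in range(rows + 1):
        row = []
        for i in range(cols + 1):
            az, el = linear_polar(i / cols, j / rows)   # step 2a: linear polar
            cam = screen_to_camera(az, el)              # step 2b: screen -> camera
            row.append(camera_to_projector(cam))        # steps 2c-2d: camera -> projector
        mesh.append(row)
    return mesh

# Toy stand-in transforms, for illustration only.
mesh = make_warp_mesh(
    2, 2,
    lambda u, v: (u * 60.0 - 30.0, v * 40.0 - 20.0),   # 60x40 degree channel
    lambda az, el: (az * 10 + 640, el * 10 + 360),     # fake calibrated mapping
    lambda c: (round(c[0]), round(c[1])),              # fake calibrated mapping
)
print(mesh[0][0], mesh[2][2])
```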
A fifth stage of the calibration procedure is to apply a soft edge blend to all regions where projected images overlap.
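A soft edge blend can be sketched as complementary intensity ramps across the overlap region, so that the two projectors' contributions always sum to full intensity. The linear ramp and its parameterisation below are illustrative assumptions; practical systems often shape the ramp to account for projector gamma, which the specification does not detail.

```python
# Sketch of a soft-edge blend weight for one projector across an overlap.

def blend_weight(x, overlap_start, overlap_end, fading_out=True):
    """Weight applied to one projector's pixel at horizontal position x.
    `fading_out` selects which side of the overlap this projector owns."""
    if x <= overlap_start:
        return 1.0 if fading_out else 0.0
    if x >= overlap_end:
        return 0.0 if fading_out else 1.0
    t = (x - overlap_start) / (overlap_end - overlap_start)
    return 1.0 - t if fading_out else t

# Complementary weights sum to 1 everywhere across the overlap.
for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    w = blend_weight(x, 0.0, 1.0, True) + blend_weight(x, 0.0, 1.0, False)
    print(round(w, 6))  # 1.0 each time
```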
The above embodiment has various significant advantages. Advantageously, displays incorporating very wide fields of view can be calibrated precisely and rapidly; the process is fully automated and can be operated by semi-skilled users. Display systems incorporating multiple projectors are aligned and blended in a fraction of the time normally taken by traditional, manual means.
The above automated alignment/calibration system is installed at the end-user site as part of the display system, and remains available to the user over the lifetime of the system. Typically, re-alignment (i.e. requiring a re-run of the alignment/calibration routine) would be necessary in the event that a projector or component is moved or replaced; nevertheless, the quality of the displayed image can also advantageously benefit from a periodic re-alignment, in particular by reason of colour and light changes.
The array of (preferably high resolution digital) cameras provides the "eyes" of the system, providing data which is fed to the data processors and ensuring that the quality of the resulting smooth continuous image is much higher than has ever been achieved by traditional manual means. What is more, the process can be repeated by the user as frequently as is needed, at no extra cost (and in particular, at minimal time cost). Taking just minutes to align an entire three hundred and sixty degree display, the process can be performed quickly and easily by the user whenever it is required.
Although the above embodiment finds particular application in the field of simulation displays, other applications of projector display systems are also relevant.
It will be appreciated that although mention is made of one camera/image sensor for each projector, in alternative embodiments there may be more or fewer than one camera for each projector. Accordingly, in some embodiments there need not be a 1:1 relationship between cameras and projectors. It will further be appreciated that more than one projector may be provided per camera/image sensor.

Claims (27)

  1. A display apparatus comprising multiple fixed position image projectors, a display screen, a data processor, multiple fixed position image sensors, and the apparatus comprising a light emitter arrangement, the light emitter arrangement arranged to emit light from the display surface at a plurality of reference point locations on the screen, and the display screen comprising an at least in part cylindrical display surface, wherein the data processor is arranged to perform an alignment routine in which the image sensors are arranged to record images of the activated emitters, and to record reference images projected by the projectors into the screen, the data processor arranged to process the recorded images so as to generate a transformation between the display surface and the image sensor, and a transformation between the projector and the display surface, and to use the transformations to generate a non-linear geometric correction to allow projected images from the projectors to be aligned with the display screen.
  2. A display apparatus as claimed in claim 1 in which the light emitter arrangement comprises a plurality of light emitter components.
  3. A display apparatus as claimed in claim 1 or claim 2 in which the light emitter components are selectively activatable.
  4. A display apparatus as claimed in any preceding claim in which the light emitter arrangement is incorporated into the screen.
  5. A display apparatus as claimed in any preceding claim in which the processor is arranged to generate a three dimensional model of the screen using the acquired image data.
  6. A display apparatus as claimed in any preceding claim in which the processor combines data indicative of the shape of the display surface with image data from the image sensors.
  7. A display apparatus as claimed in any preceding claim in which the data processor is arranged to use the transformations for at least one other aspect of a calibration routine.
  8. A display apparatus as claimed in any preceding claim in which the processor is arranged to cause the projector to display reference patterns/images on the display screen, which patterns/images are sensed by the image sensors, and the data therefrom used to determine a transformation between the projector's frame of reference and that of the image sensors.
  9. A display apparatus as claimed in any preceding claim in which the processor is arranged to allow transformation from a display screen frame of reference to an image sensor frame of reference, and vice versa.
  10. A display apparatus as claimed in any preceding claim in which the processor is arranged to control activation of the emitters and to synchronise the activation with activation of the image sensor.
  11. A display apparatus as claimed in any preceding claim in which the projector is arranged to project moving images.
  12. A display apparatus as claimed in any preceding claim in which the apparatus is a simulator.
  13. A display apparatus as claimed in any preceding claim in which the apparatus comprises an air traffic control simulator.
  14. A display apparatus as claimed in any preceding claim in which the processor is arranged to activate the emitters in a predetermined sequence.
  15. A display apparatus as claimed in any preceding claim in which each projector and each image sensor is directed towards a respective portion of the display surface, with one portion neighbouring an adjacent portion.
  16. A display apparatus as claimed in any preceding claim in which the light emitter arrangement provides fixed reference points (relative to the display surface) for the image sensor.
  17. A display apparatus as claimed in any preceding claim in which the processor is arranged to apply data acquired from the calibration routine to rendered image data to serve as input data to the projector.
  18. A display apparatus as claimed in any preceding claim in which the processor is arranged to cause one or more images to be displayed on the display surface, in which at least one image is sensed by the image sensor, and the sensed image data used to generate calibration data.
  19. A display apparatus as claimed in any preceding claim in which the apparatus is a front-illuminated projector apparatus.
  20. A display apparatus as claimed in any preceding claim in which the reference point locations are spaced at regular intervals over the display surface.
  21. A display apparatus as claimed in any preceding claim in which the reference point locations are arranged in a grid pattern.
  22. A display apparatus as claimed in any preceding claim in which the light emitter arrangement is controllable by way of control signals issued from the data processor.
  23. A display apparatus as claimed in any preceding claim in which light emitters of the light emitter arrangement are controllable to be in an ON state or OFF state.
  24. A method of aligning a projector display system, comprising use of the display screen, the data processor and the image sensors of any preceding claim.
  25. The method of claim 24 which is a fully-automated process.
  26. A display apparatus substantially as herein described with reference to the drawings.
  27. A method of aligning a projector display system substantially as herein described.

Amendments to the claims have been filed as follows:

CLAIMS

1. An automated calibration display apparatus comprising multiple fixed position image projectors, a projection display screen, a data processor, multiple fixed position image sensors, and the apparatus comprising a light emitter arrangement, the display screen comprising an at least in part cylindrical projection display surface and the light emitter arrangement arranged to emit light from the display screen at a plurality of reference point locations on said surface, wherein the data processor is arranged to perform an alignment routine in which the image sensors are arranged to record images of the activated emitters, and to record reference images projected by the projectors into the screen, the data processor arranged to process the recorded images so as to generate a transformation between the geometry of the projection screen display surface and the images from the image sensors, and a transformation between the projector and the display surface, and to use the transformations to generate a non-linear geometric correction to allow projected images from the projectors to be aligned with the projection display screen.
2. A display apparatus as claimed in claim 1 in which the light emitter arrangement comprises a plurality of light emitter components.
3. A display apparatus as claimed in claim 1 or claim 2 in which the light emitter components are selectively activatable.
4. A display apparatus as claimed in any preceding claim in which the light emitter arrangement is incorporated into the screen.
5. A display apparatus as claimed in any preceding claim in which the processor is arranged to generate a three dimensional model of the screen using the acquired image data.
6. A display apparatus as claimed in any preceding claim in which the processor combines data indicative of the shape of the projection display surface with image data from the image sensors.
7. A display apparatus as claimed in any preceding claim in which the data processor is arranged to use the transformations for at least one other aspect of a calibration routine.
8. A display apparatus as claimed in any preceding claim in which the processor is arranged to cause the projector to display reference patterns/images on the display screen, which patterns/images are sensed by the image sensors, and the data therefrom used to determine a transformation between the projector's frame of reference and that of the image sensors.
9. A display apparatus as claimed in any preceding claim in which the processor is arranged to allow transformation from a projection display screen frame of reference to an image sensor frame of reference, and vice versa.
10. A display apparatus as claimed in any preceding claim in which the processor is arranged to control activation of the emitters and to synchronise the activation with activation of the image sensor.
11. A display apparatus as claimed in any preceding claim in which the projector is arranged to project moving images.
12. A display apparatus as claimed in any preceding claim in which the apparatus is a simulator.
13. A display apparatus as claimed in any preceding claim in which the apparatus comprises an air traffic control simulator.
14. A display apparatus as claimed in any preceding claim in which the processor is arranged to activate the emitters in a predetermined sequence.
15. A display apparatus as claimed in any preceding claim in which each projector and each image sensor is directed towards a respective portion of the projection display surface, with one portion neighbouring an adjacent portion.
16. A display apparatus as claimed in any preceding claim in which the light emitter arrangement provides fixed reference points (relative to the projection display surface) for the image sensor.
17. A display apparatus as claimed in any preceding claim in which the processor is arranged to apply data acquired from the calibration routine to rendered image data to serve as input data to the projector.
18. A display apparatus as claimed in any preceding claim in which the processor is arranged to cause one or more images to be displayed on the projection display surface, in which at least one image is sensed by the image sensor, and the sensed image data used to generate calibration data.
19. A display apparatus as claimed in any preceding claim in which the apparatus is a front-illuminated projector apparatus.
20. A display apparatus as claimed in any preceding claim in which the reference point locations are spaced at regular intervals over the projection display surface.
21. A display apparatus as claimed in any preceding claim in which the reference point locations are arranged in a grid pattern.
22. A display apparatus as claimed in any preceding claim in which the light emitter arrangement is controllable by way of control signals issued from the data processor.
23. A display apparatus as claimed in any preceding claim in which light emitters of the light emitter arrangement are controllable to be in an ON state or OFF state.
24. A method of aligning a projector display system, comprising use of the display screen, the data processor and the image sensors of any preceding claim.
25. The method of claim 24 which is a fully-automated process.
26. A display apparatus substantially as herein described with reference to the drawings.
27. A method of aligning a projector display system substantially as herein described.
GB1407819.0A 2013-05-03 2014-05-02 Display system Active GB2514008B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1308086.6A GB201308086D0 (en) 2013-05-03 2013-05-03 Display system

Publications (3)

Publication Number Publication Date
GB201407819D0 GB201407819D0 (en) 2014-06-18
GB2514008A true GB2514008A (en) 2014-11-12
GB2514008B GB2514008B (en) 2015-09-30

Family

ID=48627315

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB1308086.6A Ceased GB201308086D0 (en) 2013-05-03 2013-05-03 Display system
GB1407819.0A Active GB2514008B (en) 2013-05-03 2014-05-02 Display system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1308086.6A Ceased GB201308086D0 (en) 2013-05-03 2013-05-03 Display system

Country Status (1)

Country Link
GB (2) GB201308086D0 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2324150A (en) * 1997-04-11 1998-10-14 Seos Displays Ltd Calibrating target projector in image display apparatus
US20070242233A1 (en) * 2006-04-13 2007-10-18 Nokia Corporation Relating to image projecting
US20070273795A1 (en) * 2006-04-21 2007-11-29 Mersive Technologies, Inc. Alignment optimization in image display systems employing multi-camera image acquisition
US20080089611A1 (en) * 2006-10-17 2008-04-17 Mcfadyen Doug Calibration Technique For Heads Up Display System
US20080129894A1 (en) * 2006-12-02 2008-06-05 Electronics And Telecommunications Research Institute Geometric calibration apparatus for correcting image distortions on curved screen, and calibration control system and method using the same
US20120176415A1 (en) * 2011-01-06 2012-07-12 Telenav, Inc. Graphical display system with adaptive keystone mechanism and method of operation thereof
WO2013057714A1 (en) * 2011-10-20 2013-04-25 Imax Corporation Invisible or low perceptibility of image alignment in dual projection systems

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017004859A1 (en) 2017-05-19 2018-11-22 Daimler Ag Method for calibrating a projection geometry of a head-up display and associated calibration device
DE102017004859B4 (en) 2017-05-19 2019-03-14 Daimler Ag Method for calibrating a projection geometry of a head-up display and associated calibration device

Also Published As

Publication number Publication date
GB201308086D0 (en) 2013-06-12
GB201407819D0 (en) 2014-06-18
GB2514008B (en) 2015-09-30

Similar Documents

Publication Publication Date Title
Raskar et al. Quadric transfer for immersive curved screen displays
US10275898B1 (en) Wedge-based light-field video capture
CN101572787B (en) Computer vision precision measurement based multi-projection visual automatic geometric correction and splicing method
CN103929604A (en) Projector array splicing display method
EP2806404B1 (en) Image conversion for signage
CN105308503A (en) System and method for calibrating a display system using a short throw camera
CN103559737A (en) Object panorama modeling method
CN111062869B (en) Multi-channel correction splicing method for curved curtain
KR101553273B1 (en) Method and Apparatus for Providing Augmented Reality Service
CN110505468B (en) Test calibration and deviation correction method for augmented reality display equipment
Van Baar et al. Seamless multi-projector display on curved screens
CN105137705A (en) Method and device for creating virtual dome screen
KR101842141B1 (en) 3 dimensional scanning apparatus and method therefor
Ahmed et al. Geometric correction for uneven quadric projection surfaces using recursive subdivision of Bézier patches
JP2015534299A (en) Automatic correction method of video projection by inverse transformation
GB2514008A (en) Display system
EP0473310A2 (en) Oblique photographic data base generation
CN106846472B (en) Method and device for generating image map based on panoramic map
Xizuo et al. Multi-Projector Calibration Based on Virtual Viewing Space
KR101847996B1 (en) Image projection method for a curved projection area and projection system therefor
KR20170030926A (en) Method and system for omnidirectional environmental projection with Single Projector and Single Spherical Mirror
JPH0154749B2 (en)
Liu et al. Construct low-cost multi-projector tiled display system for marine simulator
Raskar et al. Multi-projector imagery on curved surfaces
CN103942363A (en) Method for configuring optical loads of deep space probe