WO2011004145A1 - An image processing method and device - Google Patents

An image processing method and device

Info

Publication number
WO2011004145A1
WO2011004145A1 (PCT/GB2010/001292)
Authority
WO
WIPO (PCT)
Prior art keywords
raster data
operable
image processing
data
processing apparatus
Application number
PCT/GB2010/001292
Other languages
French (fr)
Inventor
Graham Olive
Phillip Small
Original Assignee
Thales Holdings Uk Plc
Application filed by Thales Holdings Uk Plc
Publication of WO2011004145A1

Classifications

    • G06T5/80
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63G MERRY-GO-ROUNDS; SWINGS; ROCKING-HORSES; CHUTES; SWITCHBACKS; SIMILAR DEVICES FOR PUBLIC AMUSEMENT
    • A63G31/00 Amusement arrangements
    • A63G31/16 Amusement arrangements creating illusions of travel
    • G06T5/94
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • G09B9/02 Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/08 Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
    • G09B9/30 Simulation of view from aircraft
    • G09B9/301 Simulation of view from aircraft by computer-processed or -generated image
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/39 Control of the bit-mapped memory
    • G09G5/395 Arrangements specially adapted for transferring the contents of the bit-mapped memory to the screen
    • G09G5/397 Arrangements specially adapted for transferring the contents of two or more bit-mapped memories to the screen simultaneously, e.g. for mixing or overlay
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/10 Mixing of images, i.e. displayed pixel being the result of an operation, e.g. adding, on the corresponding input pixels

Definitions

  • the present invention concerns methods of and apparatus for processing images, such as for use in visual displays in simulators, and dynamic simulators in particular.
  • the aim of a simulator is to provide a platform on which technology or environments are realistically recreated. In many applications, a high 'fidelity' is required if simulators are to be effective for use in training, visualisation or research. The fidelity of images generated and displayed in simulators has a significant impact on the fidelity of the simulator.
  • Light points are a particular challenge.
  • Scenes that are visualised and displayed in a simulator may involve highly adapted and, often, very bright lighting.
  • One example may be airfield runway lighting.
  • the runway is provided with bright lights which appear to a pilot as intense points of light.
  • General purpose image display technology may have a limited dynamic range with which to recreate realistic intensity of points of light to be displayed. Therefore, enhancements to the light points for visual displays of simulators are required if a high fidelity simulator is to be provided.
  • CRT Cathode Ray Tube
  • Calligraphic light point technology provides the ability to display high intensity points by focusing an electron beam on electron-sensitive phosphor in the CRT. The intensity of the light point is varied by means of the duration for which the electron beam is focused. This achieves a high dynamic range for the intensity of a light point.
  • the resolution and refresh rate dictate the time for which any given pixel can be focussed upon. The colour of any given pixel can be varied, but the intensity of any given pixel cannot be varied significantly enough to produce a high intensity light point.
  • Calligraphic light point technology overcomes this limitation by providing an additional scan interval dedicated to the creation of light points.
  • Calligraphic light point technology has an additional advantage in that it is immune to fluctuations in intensity which may be caused by raster aliasing and similar raster related effects manifest in graphics hardware rendering.
  • a calligraphic light point traversing an image will tend to have constant area and brightness, for example.
  • Calligraphic CRT displays suffer a number of shortcomings in their application to simulators, and dynamic flight simulators in particular.
  • calligraphic CRT displays are relatively complex devices involving high-tension voltage sources. As a result, they are relatively bulky and have a significant weight.
  • the mass of a typical calligraphic CRT display projector may be upwards of 200 kg.
  • three or more such projectors need to be mounted on a moving platform.
  • the disadvantages of a display with a mass of 600kg on a movable platform are readily apparent, particularly as modern flight simulators require movable platforms capable of highly dynamic movement.
  • CRT displays allow a relatively limited number of light points to be drawn in a calligraphic interval. Due to the shortcomings of calligraphic CRTs, it is desirable to replace them with lightweight and low cost "Light Valve" (LV) projector display technologies such as Liquid Crystal Display (LCD), Liquid Crystal on Silicon (LCoS) and Digital Light Processing™ (DLP).
  • LV Light Valve
  • LCD Liquid Crystal Display
  • LCoS Liquid Crystal on Silicon
  • DLP Digital Light Processing™
  • these are raster-only devices and therefore have limited dynamic range for light points. Additionally, as light points need to be generated entirely within a raster signal, aliasing and other raster related spatial or temporal effects may lead to variations in the intensity, shape or size of a light point. The area of a raster-only light point will consist of pixels on separate rows of the raster.
  • Light Valve displays provide limited fidelity for simulator displays. Unless the lightpoint features can be enhanced this may limit their application in high fidelity simulators.
  • aspects of the present invention provide an apparatus for processing raster data, the apparatus including means to receive first raster data defining a background scene, means to post-warp the first raster data, means to receive pre-warped second raster data defining light points for the background scene and means to combine the post-warped first raster data and pre-warped second raster data.
  • the second raster data defining light points may be generated or processed independently of effects occurring during post-warping.
  • the light points may benefit from precision floating point mathematics and area based algorithms to maintain realistic characteristics as they traverse a scene represented by the first raster data.
  • aspects of the invention provide image processing of these light points in the pre-warped domain.
  • This image processing may include floating point mathematics.
  • This processing may also include constant area or constant energy algorithms.
  • Other aspects of the present invention provide an apparatus for combining first raster data defining background and second raster data defining light points, the apparatus including means to attenuate the first raster data dependent on the second raster data and means to add the second raster data to the attenuated first raster data.
  • the attenuation allows the dynamic range of the combined light points and background data to, in effect, be increased around the light points.
  • the attenuation may also mitigate bleed of colour of background images into light points. This may allow the light points to maintain pure colour. This may be particularly beneficial in some specific instances, such as when a red light point is against a background of a snow-covered runway. In this instance, bleed of colour might result in a pink light point rather than a pure red one.
  • the term 'raster data' relates broadly to data in a raster format, the data relating to images, image streams or scenes, whether static or dynamic.
  • the term 'geometric domain' refers broadly to a reference or coordinate domain for specifying geometric positions within an image or scene, for example, wherein raster data may be transformed from one geometric domain to another by a suitable geometric transformation. This transformation may also be referred to as warping, post-warping, image mapping, or geometric mapping.
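The transformation between geometric domains can be illustrated with a minimal numpy sketch of post-warping by inverse mapping, where each output pixel looks up its source coordinate. The `barrel` mapping, the array sizes and the nearest-neighbour sampling are illustrative assumptions, not details taken from the patent; a real system would use measured calibration data for the screen and mirror geometry.

```python
import numpy as np

def post_warp(src, inverse_map):
    """Transform raster data between geometric domains by inverse
    mapping: each output pixel looks up its source coordinate."""
    h, w = src.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    sy, sx = inverse_map(ys, xs)
    # Nearest-neighbour sampling, clamped to the source raster.
    sy = np.clip(np.rint(sy).astype(int), 0, h - 1)
    sx = np.clip(np.rint(sx).astype(int), 0, w - 1)
    return src[sy, sx]

def barrel(ys, xs, h=64, w=64, k=0.1):
    """Toy non-linear mapping standing in for real screen/mirror
    geometry."""
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    dy, dx = ys - cy, xs - cx
    r2 = (dy / cy) ** 2 + (dx / cx) ** 2
    return cy + dy * (1.0 + k * r2), cx + dx * (1.0 + k * r2)

img = np.zeros((64, 64, 3))
img[32, 32] = 1.0                 # a single bright pixel
warped = post_warp(img, barrel)   # same data, non-linear domain
```

The same inverse-mapping structure serves for post-warping background raster data, while pre-warped light point data would be generated directly in the output domain and so never passes through this resampling step.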
  • an image processing apparatus operable to provide raster data for a display apparatus, the image processing apparatus including: a geometric transformer operable to receive first raster data in a first geometric domain and to transform said first raster data to a second geometric domain;
  • a second raster data generator operable to generate second raster data in the second geometric domain
  • the apparatus may comprise a first raster data generator operable to generate the first raster data.
  • the raster data may represent a scene to be displayed in a simulator.
  • the first raster data may represent a background, without light points, of a scene to be displayed by a simulator.
  • the second raster data may represent light points for the scene to be displayed.
  • the apparatus may comprise an attenuator operable on the first raster data once transformed to the second geometric domain, the attenuator being operable to attenuate the transformed first raster data dependent on the second raster data.
  • the apparatus may comprise a spatial filtering unit operable to apply spatial filtering so as to expand features represented in the second raster data.
  • the apparatus may comprise a colour processor operable to apply a weighted summation function to colour components of the second raster data prior to spatial filtering.
  • the second raster data generator may be operable to generate raster data corresponding to light points of a scene for a simulator.
  • the second raster data generator may be operable to generate raster data dependent on draw list data.
  • the draw-list data may comprise instructions for the generation of light points.
  • the first raster data generator may be operable to provide draw-list data to the second raster data generator.
  • the first geometric domain may be a linear domain such as might be used by a computer display.
  • the second geometric domain may be a non-linear domain.
  • the second geometric domain may be suitable for projection by a given projector used in an Out-The-Window display.
  • the apparatus may include a geometric transformation controller operable to generate instructions for geometrical transformation of data between the first and second geometric domains. These instructions may be used by the second data generator to generate second raster data in the second geometric domain.
  • the first raster data may use a limited portion of an available display intensity dynamic range. The portion may be 50%. Alternatively, the portion may be another proportion.
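The idea of reserving part of the display range for light points can be sketched numerically; the 50% figure is the example given above, while the array shapes and random background are arbitrary assumptions.

```python
import numpy as np

HEADROOM = 0.5   # portion of the display range given to the background

rng = np.random.default_rng(0)
background = rng.random((4, 4, 3))      # full-range background scene
light_points = np.zeros((4, 4, 3))
light_points[2, 2] = 1.0                # one full-intensity light point

# Background confined to its portion of the range; light points may
# use the whole range, so a full light point is at least twice as
# bright as the brightest background pixel.
scaled_bg = background * HEADROOM
combined = np.clip(scaled_bg + light_points, 0.0, 1.0)
```

With this headroom, the brightest background pixel never exceeds 0.5 of the display range, guaranteeing a minimum 2:1 intensity margin for any full light point relative to the background.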
  • the portion may be calculated to allow a given dynamic range for light points, with respect to a background image.
  • the apparatus may include a feature controller operable to apply algorithms to the second data to control features represented within the second raster data. These features may be light points.
  • the feature controller may be operable to control the position of the features.
  • the feature controller may be operable to apply floating-point mathematics to the features.
  • the feature controller may be operable to control the area of the features.
  • the feature controller may be operable to apply area-based algorithms to the features.
  • the feature controller may be operable to apply anti-aliasing algorithms to the features.
  • the area of light points may be controlled to be constant in the geometric domain in which the light points will be displayed. It will be understood by the skilled reader that "constant" is to be read in the context of changes in area intrinsic in a dynamic scene to be displayed. For example, a light point which is traversing a scene which includes an approaching runway will be constant in terms of traversing a raster image, but will gradually expand as the runway is progressively approached.
  • the algorithms are applied to the second raster data in the second geometric domain, so the effect of the algorithms will occur in that domain directly.
  • the second raster data generator may be operable to generate data corresponding to light points of a scene for a simulator.
  • an image processing apparatus operable to combine raster data, the apparatus including: an attenuator operable on first raster data to provide attenuated first raster data; and
  • a raster data combiner operable to combine the attenuated first raster data with second raster data
  • the attenuation unit may be operable to apply attenuation to the first raster data dependent upon the second raster data.
  • the apparatus may include a geometric transformation unit operable to transform the first raster data from the first geometric domain to the second geometric domain.
  • the attenuator may be operable on the first raster data once transformed.
  • the attenuation unit may be operable to apply attenuation to the transformed first raster data dependent on second raster data in the second geometric domain.
  • the apparatus may include a spatial filtering unit operable on the second raster data.
  • the attenuation unit may be operable to apply attenuation dependent on the spatially filtered second raster data.
  • the spatial filtering unit may be operable to apply a Gaussian spatial filter.
  • a Gaussian spatial filter will cause the applied attenuation to form expanded attenuated regions around features, such as light points, represented in the second raster data. This may tend to prevent colour bleed. This may also tend to generate a larger apparent dynamic range for features such as light points.
  • the spatial filtering unit may be operable to apply an N×N spatial filter.
  • the spatial filter may be a 3x3 spatial filter.
  • the attenuation applied may be dependent on a spatially filtered function of colour components of the second raster data.
  • the apparatus may include a colour processor operable to apply a function to colour components of the second raster data and provide colour processed raster data to the spatial filtering unit.
  • the function may be a weighted sum of colour components. The weighted sum may generate monochromatic raster data to be supplied to the filtering unit to generate monochromatic filtered raster data.
  • the filtering unit and/or the colour component processing unit may be operable to apply filter coefficients with polarity and/or normalisation suitable for attenuating said first raster data when it is multiplied by the data output from the filter.
  • monochromatic data with suitable polarity provides a scene which corresponds to the light point scene but which has expanded light points and has an opposite polarity, in effect.
  • This scene is used to attenuate the background scene to increase the effective contrast of the light points relative to the raster data around the light points. This provides a higher fidelity, or contrast, for light points. This may also mitigate or prevent colour bleed between background and light points. This may also increase the sharpness of light points due to a subtle border being provided by attenuation of background raster data, around each light point.
  • the attenuator may be operable to attenuate the first raster data by multiplying it with data from the spatial filtering unit.
  • an image processing apparatus operable to combine raster data, the apparatus including:
  • a first data input operable to receive first raster data defined in a first geometric domain
  • a geometric transformer operable to transform said first raster data to a second geometric domain
  • a second data input operable to receive second raster data in the second geometric domain
  • a raster data combiner operable to combine first and second raster data in the second geometric domain.
  • the combined image may have spatial positioning of the second raster data with the first raster data preserved.
  • an image processing apparatus for providing raster data with enhanced light points for a display apparatus, the image processing apparatus including:
  • a geometric transformation unit operable to provide geometrically transformed background raster data
  • an attenuator operable to apply an attenuation filter to the geometrically transformed background raster data to provide attenuated geometrically transformed background raster data, the attenuation filter determined dependent on the light point raster data, and
  • a raster data combiner operable to combine light point raster data to the attenuated geometrically transformed background raster data.
  • the combination of the background raster data with geometrically transformed light point raster data may be so as to preserve spatial positioning of light points relative to the background.
  • the attenuation filter may be operable to attenuate the geometrically transformed background data using raster data which is provided by spatially filtering the light point raster data.
  • the apparatus may include a light point raster data generating unit operable to generate light point raster data corresponding to light point characteristics and geometric transformation of background raster data.
  • This embodiment also applies attenuation and light point raster data to background data which has already been geometrically transformed.
  • the geometric transformation unit may be operable to apply a transformation according to the geometric characteristics of a given non-linear display apparatus.
  • a display apparatus for a simulator including: a raster display apparatus operable to project images onto a non-linear display screen; and an image processing apparatus as set out above, operable to provide raster data to the raster display apparatus.
  • a method of processing image data for a display apparatus is also provided.
  • the method may include applying algorithms to control features represented in the second raster data.
  • the invention provides a raster data combiner apparatus for combining received first and second raster data, the apparatus including:
  • an attenuator operable to attenuate the first raster data, wherein said attenuation is dependent on the second raster data
  • a raster data combiner operable to combine the second raster data with the attenuated first raster data.
  • the method may include applying geometric image transformation to the background raster data prior to attenuation being applied.
  • the light point raster data provided may be representative of corresponding geometric image transformations.
  • a computer program product which may be embodied in a passive storage medium, and which configures a computer, or combination of computer and dedicated hardware, to operate as an apparatus defined in any one of the paragraphs above.
  • a computer program product which may be embodied in a passive storage medium, and which configures a computer to perform a method defined in any one of the paragraphs above.
  • the term computer may encompass a computer with dedicated hardware.
  • aspects of the invention may be provided in the form of a computer program product, which may be embodied in a passive storage medium such as an optical or magnetic medium, or in an electronic medium such as a mass storage device (e.g. a FLASH memory), or in a hardware device implemented to achieve execution of instructions in accordance with the invention, such as an ASIC, an FPGA, a DSP or the like.
  • Figure 1 depicts a dynamic simulator with which an image processing apparatus according to a specific embodiment of the invention is used
  • Figure 2 is a schematic diagram of an image processing apparatus according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of an image processing apparatus in accordance with an alternative specific embodiment of the present invention, depicting in more detail a merge unit as depicted in Figure 2;
  • Figure 4 is a flow diagram depicting a process applied in accordance with the specific embodiment depicted in Figure 3;
  • Figure 5 is a flow diagram depicting a process applied in accordance with the same specific embodiment as depicted in Figures 3 and 4;
  • Figure 6 depicts an image processing apparatus in accordance with a further alternative specific embodiment of the invention.
  • Figure 7 depicts the generation, geometric mapping and combining of raster data in accordance with a specific embodiment of the present invention such as that depicted in Figures 2, or 3;
  • Figure 8 is a flow diagram depicting a process applied in accordance with a specific embodiment of the present invention such as that depicted in Figures 2 or 3.
  • FIG. 1 schematically depicts a dynamic simulator 1, such as a flight simulator.
  • the simulator 1 has a Back Projection Screen (BPS) 2 onto which a projector 3 projects an image. The image is then relayed by a mirror 4 to a viewing area 5.
  • BPS Back Projection Screen
  • the shape of the screen 2 and mirror 4, combined with a suitably geometrically mapped scene from the projector 3, produces a collimated Out-The-Window (OTW) display at the viewing area 5.
  • the simulator 1 of the specific embodiment depicted has an image processing apparatus 6 to provide the raster-only projector 3 with raster images with enhanced light point characteristics. These images are updated in real-time to represent a simulated vista.
  • the projector 3 of this specific embodiment is a raster-only projector.
  • This raster data is provided in a form which is geometrically transformed, or warped, for the particular screen 2 and mirror 4.
  • the ray paths 7 depict how the scene is projected via the projector 3, screen 2 and mirror 4.
  • Motion system jacks 8 provide actuation for the dynamic simulator 1.
  • FIG. 2 depicts an image processing apparatus 10 according to a specific embodiment of the present invention.
  • the image processing apparatus 10 forms part of the scene generator 6.
  • An image generator 11 generates background raster data for scenes to be displayed on the screen 2, to represent vistas outside a cockpit for example.
  • a light point generator 13 generates light point raster data.
  • the light point generator 13 communicates with the image generator 11 to receive draw list data from which light point data is generated. This data is eventually combined with the background raster data to provide an image to be projected by the raster-only projector 3.
  • the skilled reader will be aware of various light point generator algorithms which may be suitable.
  • the light point generator 13 generates raster data directly from received image data.
  • the image generator may simply supply generated image data to both the light point generator and the warping unit 12.
  • the image generator may be replaced with an input to receive image raster data.
  • a geometric transformation or warping unit 12 communicates with the image generator 11 to receive background image data.
  • the unit 12 is capable of applying geometric transformation, or post-warping, or image mapping, of background raster data provided by the image generator 11.
  • a geometric transformation control unit 14 communicates with both the background warping unit 12 and the light point generator 13 to provide geometric instructions, so that each can transform or generate its respective image data into the geometric domain to be used by the projector 3.
  • a merge unit 15 communicates with both the warping unit 12 and the light point generator unit 13.
  • a raster-only, or fixed matrix, projector 16 which may correspond to the projector 3 shown in figure 1, communicates with the merge unit 15 to project combined raster data.
  • In use, the image generator 11 generates an un-warped background image in the form of raster data.
  • An un-warped scene will typically be in a linear geometric reference domain.
  • the light point generator 13 receives light point drawing list data from the image generator, frame synchronisation data from the image generator and image correction geometric mapping control data from the geometric mapping control unit 14.
  • the warping unit 12 transforms raster data from the typically linear, un-warped domain to a typically non-linear warped domain used at the input of the projector 16. This transformation may be referred to as post-warping.
  • the light point generator 13 generates pre-warped light point images in the form of raster data in a non-linear domain.
  • the equivalent information is used by the warping unit 12, which is controlled by the geometric mapping control unit 14 to produce post-warped background image data.
  • the light point generator receives data from both the controller 14 and the image generator 11.
  • the warped background image data and the pre-warped light point image data are merged by the merge unit 15.
  • the merge unit 15 provides an enhanced light point background image, or a background image with overlaid light points.
  • FIG. 3 is a schematic diagram of an image processing apparatus, part of which, according to a specific embodiment, serves as the merge unit 15 of Figure 2.
  • the merge unit 15 has an input for light point raster data 21 and an input for background raster data 22. These have colour components 21a-21c and 22a-22c respectively.
  • a light point image frame buffer 19 may be provided for the merge unit 15.
  • a background image frame buffer 20 may also be provided.
  • Figure 3 also shows the warping unit 12. It will be apparent to the skilled reader that, in some embodiments of the invention, the light point image frame buffer 19 and background image frame buffer 20 may be integrated into the light point generator 13 and image generator 11, as shown in Figure 2.
  • the light point data buffer 19 is synchronised with the background frame buffer 20. If provided these will be synchronised by any suitable means known to the skilled reader. It will be apparent that in embodiments with integrated buffers, the synchronisation may be done within the light point generator 13 and image generator 11.
  • the merge unit 15 has an attenuation unit 25 able to apply attenuation to the background raster data 22 to provide an attenuated background raster data signal 26.
  • a filter 27 provides a spatially filtered attenuation raster data 28.
  • the attenuation raster data 28 is dependent on the light point raster data 21 as the data 28 is generated by applying spatial filtering to raster data derived from the light point raster data 21.
  • the polarity of the spatial filtering is selected so that the data 28 attenuates background raster data through a multiplication operation.
  • the attenuating unit 25 multiplies the background raster data 22 by the attenuation raster data 28 using a set of multipliers 29a-29c.
  • the data 28 may, therefore, be referred to as a raster light attenuation signal.
  • the array is generated by using a 3x3 filter array of coefficients fitting a Gaussian distribution. Other sizes of array may be used in alternative embodiments. Alternatives to Gaussian distributions may be used in other embodiments also. These distributions will be selected to expand or blur the light point data so that attenuation dependent on the light points will act to attenuate the background raster data 22 in a region around the light points. Suitable distributions to use as alternatives to a Gaussian distribution will be apparent to the skilled reader.
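A hypothetical construction of such a 3x3 filter array of coefficients fitting a Gaussian distribution, applied to monochromatic light point data, might look like the following numpy sketch. The sigma value and the normalisation convention (centre coefficient equal to 1.0, so a full light point stays at full value at its centre) are assumptions for illustration, not figures from the patent.

```python
import numpy as np

def gaussian_kernel_3x3(sigma=1.0):
    """3x3 filter array of coefficients fitting a Gaussian
    distribution, normalised so the centre coefficient is 1.0."""
    ax = np.array([-1.0, 0.0, 1.0])
    g = np.exp(-ax ** 2 / (2.0 * sigma ** 2))
    kernel = np.outer(g, g)
    return kernel / kernel.max()

def spatial_filter(mono, kernel):
    """Apply the 3x3 kernel to monochromatic light point raster data,
    expanding each light point's footprint into its neighbourhood."""
    h, w = mono.shape
    padded = np.pad(mono, 1)
    out = np.zeros_like(mono)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return np.clip(out, 0.0, 1.0)

mono = np.zeros((7, 7))
mono[3, 3] = 1.0                      # one full-intensity light point
blurred = spatial_filter(mono, gaussian_kernel_3x3())
```

A full-intensity point stays at 1.0 at its centre while its eight neighbours receive partial values, so attenuation derived from `blurred` acts on a region around the point, and a brighter point cuts a larger, deeper hole into the background than a dim one.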
  • the algorithms employed in the real-time filter 27 maintain the anti-aliased quality of the light point data ensuring the background attenuation 28 is free from spatial or temporal aliasing artefacts.
  • the design of the real-time filter array 27 ensures that the size and level of the attenuation array 28 is a dynamic and variable function of the corresponding light point feature (i.e. larger brighter points cut a larger diameter, deeper hole into the background compared with smaller, dimmer points).
  • a larger filter array or alternative filter weightings are able to create a larger and/or darker border between the light point feature and the background image. This technique is effective in enhancing light point contrast, and in maintaining light point colour purity and perceived sharpness, all of which are critical for effective pilot training.
  • the merge unit 15 has a colour processing unit 32 which applies a weighted sum to the colour components 21a-21c of the light point raster data 21 to generate monochromatic light point raster data.
  • the weighting components are 0.7, 0.7 and 0.1 for red, green and blue respectively. These may be adjustable in some specific embodiments envisaged. In other embodiments these components may vary.
  • the colour processor weightings applied to colour components 21a-21c are chosen to match the human eye spectral response to ensure the degree of attenuation is commensurate with the light point data RGB values. For example a Blue light of a given value is perceived by the eye as being of a lower brightness than say a Green or Red light of equal value.
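The weighted summation to monochromatic data can be sketched as follows, using the 0.7, 0.7 and 0.1 red, green and blue weights of this embodiment; the array layout and clipping are assumptions.

```python
import numpy as np

# Red, green and blue weights from the described embodiment, chosen
# to approximate the eye's spectral response.
WEIGHTS = np.array([0.7, 0.7, 0.1])

def to_mono(light_points):
    """Weighted sum of colour components -> monochromatic data."""
    return np.clip(light_points @ WEIGHTS, 0.0, 1.0)

lp = np.zeros((2, 2, 3))
lp[0, 0] = [0.0, 0.0, 1.0]   # pure blue light point
lp[1, 1] = [1.0, 0.0, 0.0]   # pure red light point
mono = to_mono(lp)
```

The blue point yields 0.1 while the red point yields 0.7, so a blue point, perceived as dimmer by the eye, attracts proportionately less background attenuation.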
  • the merging unit 15 has a series of combined adder units 30a-30c to add the light point raster data 21 to the attenuated background raster data 26 and provide an enhanced background raster data in which background raster data 22 has been enhanced with light point raster data 21. Since the attenuation function is only applied to the background data the light point data remains unmodified and consequently retains its original sharpness and resolution. The subsequent merging function therefore creates a composite image which preserves the original quality of the light point raster data ensuring there is no loss of fidelity or quality.
  • the process performed by the merge unit 15 is illustrated in Figure 4.
  • the process 40 begins at step 41 when light point raster data 21 and background raster data 22 are received by the merge unit 15.
  • the light point raster data 21 is pre-warped into the geometric domain used for projection.
  • warping or geometric mapping is applied to the background raster data 22. This warping corresponds to the pre-warping of the light point raster data.
  • attenuation is applied to the warped background raster data 22. This attenuation is applied by multiplying the data 22 by an attenuation array 28 derived from the light point raster data 21.
  • the light point raster data was generated in a pre-warped format which corresponds to the geometric domain to which background raster data 22 was warped.
  • the light point raster data 21 and background raster data 22 can be overlaid in the warped geometric domain. In the present embodiment the attenuation array is derived by applying Gaussian spatial filtering to a weighted sum of colour components of the light point raster data 21.
  • the coefficient values have a symmetric Gaussian profile.
  • the polarity and normalisation of the attenuation values is such that 1.0 corresponds to the absence of a light point and 0.0 corresponds to a full light point. Therefore, a multiplication operation will cause maximum attenuation for a full light point.
  • the light point raster data 21 is combined or added to the attenuated background raster data 26 by the combiner unit 30 which performs an adding operation.
  • the process ends at step 45 where background raster data enhanced with light point raster data is provided as raster data with enhanced light points 31.
  • FIG. 5 is a flow diagram depicting the attenuation process carried out at step 43 of Figure 4. The process begins at step 51. At step 52 colour components of the light point raster data are received by the merge unit 15.
  • the weighted sum 33 is calculated to produce monochromatic light point data.
  • the spatial filter 27 applies spatial filtering to the weighted sum 33 of the light point raster data 21 to produce an attenuation array 28.
  • at step 55 the product of the background raster data 22 and the attenuation array 28 is found, to provide the attenuated background raster data 26.
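The attenuation and merge steps above (weighted sum, spatial filter, multiply, add) can be sketched end to end on a tiny monochrome example. The 3x3 Gaussian kernel values, zero padding at the borders and the final clamp are illustrative assumptions; the real filter array 27 and adder units 30a-30c are hardware functional units.

```python
# A normalised 3x3 Gaussian kernel, as a stand-in for the filter array 27.
GAUSS_3X3 = [[1/16, 2/16, 1/16],
             [2/16, 4/16, 2/16],
             [1/16, 2/16, 1/16]]

def spatial_filter(mono, kernel=GAUSS_3X3):
    """3x3 convolution with zero padding, expanding each light point."""
    h, w = len(mono), len(mono[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += kernel[dy + 1][dx + 1] * mono[yy][xx]
            out[y][x] = acc
    return out

def merge(background, light_points):
    """Attenuate the background around light points, then add the points."""
    expanded = spatial_filter(light_points)
    # Polarity: 1.0 where there is no light point, towards 0.0 at a point.
    attenuation = [[1.0 - v for v in row] for row in expanded]
    attenuated = [[b * a for b, a in zip(brow, arow)]
                  for brow, arow in zip(background, attenuation)]
    # Add the unmodified light points and clamp to the display range.
    return [[min(1.0, bg + lp) for bg, lp in zip(brow, lrow)]
            for brow, lrow in zip(attenuated, light_points)]
```

A single full-intensity light point cuts a darkened border into a uniform background while the point itself is added back unattenuated, preserving its sharpness.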
  • FIG. 6 provides a schematic diagram of a light point image generator, such as 13, according to another specific embodiment of the invention.
  • a merge unit 81, similar to the merge unit 15, is in communication with a raster image generator 82 and a light point image frame store 83.
  • the image generator 82 and the frame store 83 are synchronised. For clarity, functional units relating to geometric mapping or warping are omitted from Figure 6.
  • the light point image frame store 83 communicates with a light point rendering unit 84.
  • the light point rendering unit 84 applies constant area anti-aliasing algorithms to the light points to avoid fluctuations, jumping and other artefacts of light points as they traverse a raster-based image, for example.
  • the light point rendering unit 84 also applies high precision, floating point mathematics to the light point rendering.
  • the light points are rendered in the same geometric domain as they will be projected.
  • the light point image frame store 83 also communicates with a halo rendering unit 85 which renders halos and atmospheric dispersion effects for light points.
  • the light point frame store 83, light point rendering unit 84, and halo rendering unit 85 of this specific embodiment perform a similar function to the light point generator 13 shown in Figure 2.
  • the light point rendering unit 84 renders light points using data relating to light point range 86, light point intensity 87, light point colour 88, light point type 89, light point position 90, and atmospheric transmittance 91.
  • the light point position conforms to the typically non-linear geometric domain used by a projector 3.
  • the types of data 86 to 91 will be apparent to the skilled reader.
  • the halo rendering unit 85 uses the same data as the light point rendering unit 84.
  • the light point image frame store 83 combines light point data provided by the light point rendering unit 84 with rendered halo data provided by the halo rendering unit 85.
  • the light point frame store synchronises with the image generator 82. Any suitable means of synchronising known to the skilled reader may be used, but frame buffers are used in the specific embodiment described here.
  • the light point frame store 83 provides light point raster data synchronised with raster data of a background image from the image generator 82 to the merge unit 81.
  • a weighting and filtering unit 94 and multiplying unit 95 attenuate the background image raster data from the image generator 82.
  • An adding unit 96 combines this attenuated background image raster data with the light point raster data from the frame store 83.
  • the merge unit 81 outputs raster data attenuated according to the weighting provided by unit 95, which is dependent on the light point raster data and halo raster data provided by the frame store 83 after filtering by the unit 94.
  • Figure 7 gives a simple depiction of the preparation, according to a preferred aspect of the present invention, of a scene to be displayed in a simulator.
  • the scene is typically displayed using a projector 3.
  • Raster data used by projector 3 typically conforms to a non-linear geometric domain selected for a given OTW mirror 4.
  • Shown in Figure 7 is a background scene 101 corresponding to raster data generated by an image generator 11.
  • the background scene 101 is represented here in a snap-shot of time for simplicity.
  • a simulator will typically use dynamic scenes or image streams.
  • the background scene 101 is generated in a first, linear geometric domain as it might appear on a two dimensional display. This is illustrated by the straight perspective lines formed by runway strips, for example.
  • the background scene 101 shown does not feature any light points even though they may eventually feature in the scene to be displayed.
  • light points corresponding to the background scene 101 will feature as a draw list (not depicted here).
  • Also shown in Figure 7 is the background scene 102 after the raster data representing it has been warped, post-warped, or geometrically transformed, to the geometric domain suitable for the given OTW projector 3.
  • Also shown in Figure 7 is a light point scene 103 rendered by the light point generator 13. This scene is shown as pre-warped, to the second geometric domain, suitable for the OTW projector 3. This is illustrated by the runway lights curving outward in the same manner as the runway strips after post-warping.
  • Figure 7 depicts the post-warped background scene and pre-warped light points merged as 104.
  • Figure 8 is a flow diagram of a process 200 corresponding to those of Figures 4 and 5 combined and described with reference to Figure 7. The process starts at step 201.
  • the image generator renders raster data for a background scene 101.
  • the raster data is generated in a linear geometric domain.
  • the image generator such as depicted by 11 in Figure 2 generates draw list data (not shown) for light points 103 which will feature in an image to be projected but do not feature in the background scene 101.
  • only a limited portion of the available dynamic range is used for this raster data. This provides increased contrast between the background scene and light points added later in the process.
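The limited-dynamic-range idea above can be sketched as follows. The 50% fraction and the function name are illustrative assumptions (the description elsewhere notes the portion may be programmable):

```python
# Confine the background scene to a programmable fraction of the display's
# intensity range, reserving the remaining headroom for light points that
# are added later in the process.
BACKGROUND_FRACTION = 0.5  # illustrative; may be any programmable percentage

def limit_background(value, fraction=BACKGROUND_FRACTION):
    """Scale a background intensity in [0.0, 1.0] into [0.0, fraction]."""
    return max(0.0, min(1.0, value)) * fraction
```

Light points subsequently merged into the scene may use the full range, so even a fully bright background pixel remains visibly dimmer than a full-intensity light point.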
  • warping or geometric mapping, instructions are generated by the warping control unit 14. These instructions define transformations from a first geometric domain in which the raster data for background scene is generated to a second geometric domain used by a projector, such as shown in Figure 1.
  • the raster data for the background scene 101 is post-warped or transformed using the geometric mapping instructions provided by the warping control unit 14.
  • the post-warped background scene is shown as 102.
  • the light point raster data for the light points 103 is generated using the draw list data provided by the image generator 11 and also the geometric mapping instruction data provided by the geometric mapping unit, such as depicted by 14 in Figure 2.
  • the light point raster data therefore conforms to the geometric domain used by the projector and may be considered to be pre-warped.
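Pre-warping light point positions into the projector's geometric domain can be sketched as below. The simple radial mapping is a hypothetical stand-in for the real projector/mirror geometry, and both function names are assumptions.

```python
# Render draw-list positions directly in the projector's non-linear domain
# ("pre-warping"), so no post-warp resampling ever touches the light points.

def warp_to_projector_domain(x, y, k=0.1):
    """Map normalised screen coordinates (centred on 0, 0) through a
    hypothetical radial distortion, as a projector mapping might."""
    r2 = x * x + y * y
    scale = 1.0 + k * r2
    return x * scale, y * scale

def prewarp_draw_list(draw_list):
    """Compute warped positions for every (x, y) entry in a draw list."""
    return [warp_to_projector_domain(x, y) for (x, y) in draw_list]
```

Points further from the optical centre are displaced more, and the light points are then rendered at these warped positions with full precision rather than being rendered in a linear domain and resampled afterwards.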
  • image processing algorithms are applied to the light point raster data to control the characteristics of the light points, or other features such as halos, and prevent fluctuations in size, aliasing or other artefacts being introduced by raster rendering or warping.
  • since this data conforms to the geometric domain used by the projector, the effect of any algorithms will be realised in that domain directly and will not be affected by post-warping.
  • constant area algorithms are applied. A light point which traverses a simulator scene, but which represents a light at a constant distance, will maintain a constant energy, or area, and intensity, and will traverse steadily. Any suitable constant energy or area algorithms known to the reader may be used.
  • the area of the light points 103, which have been rendered as pre-warped raster data, is maintained as constant.
  • the algorithms of this embodiment will allow intended variations of the light point which correspond to the evolving scene.
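One way a constant area or constant energy algorithm can work is sketched below: a point at a sub-pixel position distributes a fixed unit of energy over its 2x2 pixel neighbourhood by bilinear coverage, so total brightness never fluctuates as the point drifts across pixel boundaries. This is an illustrative technique, not necessarily the patented algorithm.

```python
# Constant-energy splat for a light point at sub-pixel position (px, py).
# However the point moves, the four weights always sum to exactly one unit
# of energy, avoiding the row-to-row flicker of naive raster rendering.

def splat_constant_energy(px, py):
    """Return {(ix, iy): weight} for the 2x2 pixels covering (px, py)."""
    ix, iy = int(px), int(py)
    fx, fy = px - ix, py - iy  # fractional offsets within the pixel
    return {
        (ix,     iy):     (1 - fx) * (1 - fy),
        (ix + 1, iy):     fx * (1 - fy),
        (ix,     iy + 1): (1 - fx) * fy,
        (ix + 1, iy + 1): fx * fy,
    }
```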
  • other algorithms are applied to control the position of light points, or other features. These algorithms apply high precision floating-point mathematics such as are well known by the skilled reader.
  • at step 208 the raster data for the background scene 101 and the light points are merged to form an image with enhanced light points.
  • the process ends at step 209.
  • Some embodiments are implemented using a computer. Other embodiments are implemented using hardware such as Graphics Processor Units (GPUs), Field Programmable Gate Arrays (FPGAs) or even Application Specific Integrated Circuits (ASICs). Other specific embodiments use a combination of these.
  • Other embodiments are implemented by a computer program product or computer processor readable media, such as magnetic, optical or holographic storage media, or solid state memory media. These provide instructions or code which configure a computer, processor and/or peripheral device to implement the embodiments. Other embodiments are implemented using a combination of computers and dedicated hardware.
  • the merge units such as depicted by 15 or 81, and their associated data buffers, such as 19, 20 and 83 are implemented using FPGAs to create the functional units described, and the remaining aspects are implemented using computers with peripherals as understood to be appropriate to the skilled reader.
  • the merge units, such as depicted by 15 or 81 are provided as component devices to provide merge functionality for a system which includes separate image generators.
  • the second raster data may be other than light points.
  • the second raster data may define aeroplanes or vehicles represented as small images against a background.
  • symbology used in the display may be represented as second raster data. For example, symbology used in Head-Up Displays may be represented as the second raster data.
  • the images may particularly benefit from pre-warping and the avoidance of raster related effects and such effects exacerbated by post-warping.

Abstract

An image processing apparatus and method provides raster image data with light point enhancements. In one aspect the apparatus generates first raster data in a first geometric domain and second raster data in a second geometric domain. The apparatus transforms the first raster data from the first domain to the second domain and combines the transformed first raster data with the second raster data. In another aspect the apparatus has an attenuation unit which applies attenuation to the first raster data. The attenuation is applied dependent upon second raster data. The combination of the transformed first raster data and second raster data provides enhanced contrast, colour representation, and preserved spatial positioning of the second raster data within the background image provided by the first raster data.

Description

AN IMAGE PROCESSING METHOD AND DEVICE
Field of the Invention
The present invention concerns methods of and apparatus for processing images, such as for use in visual displays in simulators, and dynamic simulators in particular.
Background of the Invention
The aim of a simulator is to provide a platform on which technology or environments are realistically recreated. In many applications, a high 'fidelity' is required if simulators are to be effective for use in training, visualisation or research. The fidelity of images generated and displayed in simulators has a significant impact on the fidelity of the simulator.
One particular challenge in creating high fidelity images for simulators relates to lighting effects. Light points are a particular challenge. Scenes that are visualised and displayed in a simulator may involve highly adapted and, often, very bright lighting. One example may be airfield runway lighting. To assist pilots navigating an aeroplane safely onto a runway, the runway is provided with bright lights which appear to a pilot as intense points of light. General purpose image display technology may have a limited dynamic range with which to recreate realistic intensity of points of light to be displayed. Therefore, enhancements to the light points for visual displays of simulators are required if a high fidelity simulator is to be provided.
Until recently, high-end image display technology used for simulators involved Cathode Ray Tube (CRT) projectors with calligraphic light point technology. Calligraphic light point technology provides the ability to display high intensity points by focusing an electron beam on electron sensitive phosphor in the CRT tube. The intensity of the light point is varied by means of the duration for which the electron beam is focused. This achieves a high dynamic range for the intensity of a light point. In non-calligraphic raster scan display devices the resolution and refresh rate dictates the time for which any given pixel can be focussed upon. The colour of any given pixel can be varied but the intensity of any given pixel cannot be varied significantly to produce a high intensity light point. Calligraphic light point technology overcomes this limitation by providing an additional scan interval dedicated to the creation of light points.
Calligraphic light point technology has an additional advantage in that it is immune to fluctuations in intensity which may be caused by raster aliasing and similar raster related effects manifest in graphics hardware rendering. A calligraphic light point traversing an image will tend to have constant area and brightness, for example.
Calligraphic CRT displays suffer a number of shortcomings in their application to simulators, and dynamic flight simulators in particular. First, the addition of high intensity light points often leads to accelerated CRT tube degradation, which involves additional maintenance costs in replacing CRT tubes. Secondly, calligraphic CRT displays are relatively complex devices involving high-tension voltage sources. As a result, they are relatively bulky and have a significant weight. The mass of a typical calligraphic CRT display projector may be upwards of 200 kg. For application to dynamic simulators, three or more such projectors need to be mounted on a moving platform. The disadvantages of a display with a mass of 600kg on a movable platform are readily apparent, particularly as modern flight simulators require movable platforms capable of highly dynamic movement. Thirdly, CRT displays allow a relatively limited number of light points to be drawn in a calligraphic interval. Due to the shortcomings of calligraphic CRTs, it is desirable to replace them with lightweight and low cost "Light Valve" (LV) projector display technologies such as Liquid Crystal Display (LCD), Liquid Crystal on Silicon (LCoS) and Digital Light Processing™ (DLP). However, these are raster-only devices and therefore have limited dynamic range for light points. Additionally, as light points need to be generated entirely within a raster signal, aliasing and other raster related spatial or temporal effects may lead to variations in the intensity, shape or size of a light point. The area of a raster only light point will consist of pixels on separate rows of the raster. Therefore, this may result in fluctuations in the appearance of a light point as it traverses a display. These effects are exacerbated when an image is post-warped, after rendering in a linear geometric domain, as required to match the geometric domain used by an out-of-the-window display.
As a result of these limitations, Light Valve displays provide limited fidelity for simulator displays. Unless the light point features can be enhanced, this may limit their application in high fidelity simulators.
Summary of the Invention
Aspects of the present invention provide an apparatus for processing raster data, the apparatus including means to receive first raster data defining a background scene, means to post-warp the first raster data, means to receive pre-warped second raster data defining light points for the background scene and means to combine the post-warped first raster data and pre-warped second raster data.
This may allow the second raster data defining light points to be generated or processed independently of effects occurring during post-warping. The light points may benefit from precision floating point mathematics and area based algorithms to maintain realistic characteristics as they traverse a scene represented by the first raster data.
Other aspects of the invention provide image processing of these light points in the pre-warped domain. This image processing may include floating point mathematics. This processing may also include constant area or constant energy algorithms. Other aspects of the present invention provide an apparatus for combining first raster data defining background and second raster data defining light points, the apparatus including means to attenuate the first raster data dependent on the second raster data and means to add the second raster data to the attenuated first raster data. The attenuation allows the dynamic range of the combined light points and background data to, in effect, be increased around the light points. The attenuation may also mitigate bleed of colour of background images into light points. This may allow the light points to maintain pure colour. This may be particularly beneficial in some specific instances, such as when a red light point is against a background of a snow-covered runway. In this instance bleed of colour might result in a pink, rather than red, light point.
As used herein terms such as 'an attenuator' or 'a raster data generator' and similar terms do not exclude the possibility of the provision of more than one of these units.
As used herein the term 'raster data' relates broadly to data in a raster format, the data relating to images, image streams or scenes, whether static or dynamic. As used herein the term 'geometric domain' refers broadly to a reference or coordinate domain for specifying geometric positions within an image or scene, for example, wherein raster data may be transformed from one geometric domain to another by a suitable geometric transformation. This transformation may also be referred to as warping, post-warping, image mapping, or geometric mapping.
In one aspect of the invention there is provided an image processing apparatus operable to provide raster data for a display apparatus, the image processing apparatus including: a geometric transformer operable to receive first raster data in a first geometric domain and to transform said first raster data to a second geometric domain;
a second raster data generator operable to generate second raster data in the second geometric domain; and
a raster data combiner operable to combine the first and the second raster data in the second geometric domain. The apparatus may comprise a first raster data generator operable to generate the first raster data.
The raster data may represent a scene to be displayed in a simulator. The first raster data may represent a background, without light points, of a scene to be displayed by a simulator. The second raster data may represent light points for the scene to be displayed. The apparatus may comprise an attenuator operable on the first raster data once transformed to the second geometric domain, the attenuator being operable to attenuate the transformed first raster data dependent on the second raster data. The apparatus may comprise a spatial filtering unit operable to apply spatial filtering so as to expand features represented in the second raster data.
The apparatus may comprise a colour processor operable to apply a weighted summation function to colour components of the second raster data prior to spatial filtering.
The second raster data generator may be operable to generate raster data corresponding to light points of a scene for a simulator. The second raster data generator may be operable to generate raster data dependent on draw list data. The draw-list data may comprise instructions for the generation of light points.
The first data generator may be operable to provide draw-list data for data corresponding to light points to be generated by the second raster data generator.
The first geometric domain may be a linear domain such as might be used by a computer display. The second geometric domain may be a non-linear domain. The second geometric domain may be suitable for projection by a given projector used in an Out-The- Window display.
The apparatus may include a geometric transformation controller operable to generate instructions for geometrical transformation of data between the first and second geometric domains. These instructions may be used by the second data generator to generate second raster data in the second geometric domain. The first raster data may use a limited portion of an available display intensity dynamic range. The portion may be 50%. Alternatively, the portion may be another programmable percentage. The portion may be calculated to allow a given dynamic range for light points, with respect to a background image.
The apparatus may include a feature controller operable to apply algorithms to the second data to control features represented within the second raster data. These features may be light points.
The feature controller may be operable to control the position of the features. The feature controller may be operable to apply floating-point mathematics to the features.
The feature controller may be operable to control the area of the features. The feature controller may be operable to apply area-based algorithms to the features. The feature controller may be operable to apply anti-aliasing algorithms to the features. The area of light points may be controlled to be constant in the geometric domain in which the light points will be displayed. It will be understood by the skilled reader that "constant" is to be read in the context of changes in area intrinsic in a dynamic scene to be displayed. For example, a light point which is traversing a scene which includes an approaching runway will be constant in terms of traversing a raster image, but will gradually expand as the runway is progressively approached.
The algorithms are applied to the second raster data in the second geometric domain, so the effect of the algorithms will occur in that domain directly.
The second raster data generator may be operable to generate data corresponding to light points of a scene for a simulator. In another aspect of the invention there is provided an image processing apparatus operable to combine raster data, the apparatus including: an attenuator operable on first raster data to provide attenuated first raster data; and
a raster data combiner operable to combine the attenuated first raster data with second raster data,
wherein the attenuation unit may be operable to apply attenuation to the first raster data dependent upon the second raster data.
The apparatus may include a geometric transformation unit operable to transform the first raster data from the first geometric domain to the second geometric domain. The attenuator may be operable on the first raster data once transformed.
The attenuation unit may be operable to apply attenuation to the transformed first raster data dependent on second raster data in the second geometric domain. The apparatus may include a spatial filtering unit operable on the second raster data.
The attenuation unit may be operable to apply attenuation dependent on the spatially filtered second raster data. The spatial filtering unit may be operable to apply a Gaussian spatial filter. A Gaussian spatial filter will cause attenuation applied to form expanded attenuated regions around features, such as light points, represented in the second raster data. This may tend to prevent colour bleed. This may also tend to generate a larger apparent dynamic range for features such as light points.
The spatial filtering unit may be operable to apply a NxN spatial filter. The spatial filter may be a 3x3 spatial filter.
The attenuation applied may be dependent on a spatially filtered function of colour components of the second raster data. The apparatus may include a colour processor operable to apply a function to colour components of the second raster data and provide colour processed raster data to the spatial filtering unit. The function may be a weighted sum of colour components. The weighted sum may generate monochromatic raster data to be supplied to the filtering unit to generate monochromatic filtered raster data.
The filtering unit and/or the colour component processing unit may be operable to apply filter coefficients with polarity and/or normalisation suitable to attenuate said first raster data if multiplied by the data output from the filter. Thus monochromatic data with suitable polarity provides a scene which corresponds to the light point scene but which has expanded light points and has an opposite polarity, in effect. This scene is used to attenuate the background scene to increase the effective contrast of the light points relative to the raster data around the light points. This provides a higher fidelity, or contrast, for light points. This may also mitigate or prevent colour bleed between background and light points. This may also increase the sharpness of light points due to a subtle border being provided by attenuation of background raster data, around each light point.
The attenuator may be operable to attenuate the first raster data by multiplying it with data from the spatial filtering unit.
In another aspect of the invention there is provided an image processing apparatus operable to combine raster data, the apparatus including:
a first data input operable to receive first raster data defined in a first geometric domain;
a geometric transformer operable to transform said first raster data to a second geometric domain;
a second data input operable to receive second raster data in the second geometric domain; a raster data combiner operable to combine first and second raster data in the second geometric domain.
The combined image may have spatial positioning of the second raster data with the first raster data preserved.
In another aspect of the invention there is provided an image processing apparatus for providing raster data with enhanced light points for a display apparatus, the image processing apparatus including:
a geometric transformation unit operable to provide geometrically transformed background raster data;
an attenuator operable to apply an attenuation filter to the geometrically transformed background raster data to provide attenuated geometrically transformed background raster data, the attenuation filter determined dependent on the light point raster data, and
a raster data combiner operable to combine light point raster data to the attenuated geometrically transformed background raster data.
The combination of the background raster data with geometrically transformed light point raster data may be so as to preserve spatial positioning of light points relative to the background.
The attenuation filter may be operable to attenuate the geometrically transformed background data using raster data which is provided by spatially filtering the light point raster data.
The apparatus may include a light point raster data generating unit operable to generate light point raster data corresponding to light point characteristics and geometric transformation of background raster data.
This embodiment also applies attenuation and light point raster data to background data which has already been geometrically transformed. The geometric transformation unit may be operable to apply a transformation according to the geometric characteristics of a given non-linear display apparatus. In another aspect of the invention there is a display apparatus for a simulator including: a raster display apparatus operable to project images onto a non-linear display screen; and
an image processing apparatus in accordance with any one of the preceding claims.
In another aspect of the invention there is provided a method of processing image data for a display apparatus, the method including:
receiving first raster data in a first geometric domain;
transforming said first raster data to a second geometric domain;
receiving second raster data in the second geometric domain; and
combining first and second raster data in the second geometric domain.
In another aspect of the invention there is provided a method of processing image data for a display apparatus, the method including:
attenuating first raster data; and
combining the attenuated first raster data with second raster data, and wherein attenuation of the first raster data is dependent upon second raster data.
The method may include applying algorithms to control features represented in the second raster data.
In another aspect the invention provides a raster data combiner apparatus for combining received first and second raster data, the apparatus including:
an attenuator operable to attenuate the first raster data, wherein said attenuation is dependent on the second raster data; and
a raster data combiner operable to combine the second raster data with the attenuated first raster data. In another aspect of the invention there is provided a method for providing raster data with enhanced light points for display apparatus, the method including:
applying attenuation filtering to background raster data to provide attenuated background raster data, said attenuation filtering being dependent on light point raster data; and
combining the light point raster data with the attenuated background raster data.
The method may include applying geometric image transformation to the background raster data prior to attenuation being applied. The light point raster data provided may be representative of corresponding geometric image transformations.
In another aspect of the invention there is provided a computer program product which may be embodied in a passive storage medium, and which configures a computer, or combination of computer and dedicated hardware, to operate as an apparatus defined in any one of the paragraphs above.
In another aspect of the invention there is provided a computer program product which may be embodied in a passive storage medium, and which configures a computer to perform a method defined in any one of the paragraphs above. Here the term computer may encompass a computer with dedicated hardware.
Aspects of the invention may be provided in the form of a computer program product, which may be embodied in a passive storage medium such as an optical or magnetic medium, or in an electronic medium such as a mass storage device (e.g. a FLASH memory), or in a hardware device implemented to achieve execution of instructions in accordance with the invention, such as an ASIC, an FPGA, a DSP or the like.
Further aspects, features and advantages of the invention will become apparent to the reader of the following description of specific embodiments of the invention, provided by way of example only, with reference to the accompanying drawings, in which:
Brief Description of the Drawings
Figure 1 depicts a dynamic simulator with which an image processing apparatus according to a specific embodiment of the invention is used;
Figure 2 is a schematic diagram of an image processing apparatus according to an embodiment of the present invention;
Figure 3 is a schematic diagram of an image processing apparatus in accordance with an alternative specific embodiment of the present invention, this embodiment depicting, in more detail, a merge unit as shown in Figure 2;
Figure 4 is a flow diagram depicting a process applied in accordance with the specific embodiment depicted in Figure 3;
Figure 5 is a flow diagram depicting a process applied in accordance with the same specific embodiment as depicted in Figures 3 and 4;
Figure 6 depicts an image processing apparatus in accordance with a further alternative specific embodiment of the invention;
Figure 7 depicts the generation, geometric mapping and combining of raster data in accordance with a specific embodiment of the present invention, such as that depicted in Figures 2 or 3;
Figure 8 is a flow diagram depicting a process applied in accordance with a specific embodiment of the present invention such as that depicted in Figures 2 or 3.
Detailed Description of the Preferred Embodiments of the Invention
Figure 1 schematically depicts a dynamic simulator 1, such as a flight simulator. The simulator 1 has a Back Projection Screen (BPS) 2 onto which a projector 3 projects an image. The image is then relayed by a mirror 4 to a viewing area 5. As is well known to the skilled reader, the shape of the screen 2 and mirror 4, combined with a suitably geometrically mapped scene from the projector 3, produces a collimated Out-The-Window (OTW) display at the viewing area 5. The simulator 1 of the specific embodiment depicted has an image processing apparatus 6 to provide the raster-only projector 3 with raster images having enhanced light point characteristics. These images are updated in real time to represent a simulated vista. The raster data is provided in a form which is geometrically transformed, or warped, for the particular screen 2 and mirror 4. The ray paths 7 depict how the scene is projected via the projector 3, screen 2 and mirror 4. Motion system jacks 8 provide actuation for the dynamic simulator 1.
Figure 2 depicts an image processing apparatus 10 according to a specific embodiment of the present invention. The image processing apparatus 10 forms part of the image processing apparatus 6 of Figure 1. An image generator 11 generates background raster data for scenes to be displayed on the screen 2, to represent vistas outside a cockpit for example. A light point generator 13 generates light point raster data. The light point generator 13 communicates with the image generator 11 to receive draw list data from which light point data is generated. This data is eventually combined with the background raster data to provide an image to be projected by the raster-only projector 3. The skilled reader will be aware of various light point generator algorithms which may be suitable.
In an alternative embodiment the light point generator 13 generates raster data directly from received image data. In this embodiment the image generator may simply supply generated image data to both the light point generator and the warping unit 12. In another embodiment the image generator may be replaced with an input to receive image raster data.
A geometric transformation or warping unit 12 communicates with the image generator 11 to receive background image data. The unit 12 is capable of applying geometric transformation, or post-warping, or image mapping, to background raster data provided by the image generator 11. A geometric transformation control unit 14 communicates with both the background warping unit 12 and the light point generator 13 to provide geometric instructions for transforming or generating their respective image data in the geometric domain to be used by the projector 3. A merge unit 15 communicates with both the warping unit 12 and the light point generator unit 13. A raster-only, or fixed matrix, projector 16, which may correspond to the projector 3 shown in Figure 1, communicates with the merge unit 15 to project combined raster data.
In use, the image generator 11 generates an un-warped background image in the form of raster data. An un-warped scene will typically be in a linear geometric reference domain. The light point generator 13 receives light point drawing list data and frame synchronisation data from the image generator 11, and image correction geometric mapping control data from the geometric mapping control unit 14. The warping unit 12 transforms raster data from the typically linear, un-warped domain to a typically non-linear warped domain used at the input of the projector 16. This transformation may be referred to as post-warping. The light point generator 13 generates pre-warped light point images in the form of raster data in the non-linear domain. The same geometric mapping information is used by the warping unit 12, which is controlled by the geometric mapping control unit 14 to produce post-warped background image data. The warped background image data and the pre-warped light point image data are merged by the merge unit 15. The merge unit 15 provides an enhanced light point background image, or a background image with overlaid light point enhancements, for a fixed matrix raster projector, or raster-only projector, 16.
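The post-warping performed by the warping unit 12 can be sketched as an inverse pixel mapping: for every output pixel in the projector's (second) geometric domain, look up the source coordinates in the linear (first) domain. The NumPy sketch below is hypothetical; the function name, the inverse-map representation and the nearest-neighbour sampling are assumptions rather than details from the source.

```python
import numpy as np

def post_warp(image, inverse_map):
    """Transform raster data from the linear (un-warped) domain into the
    projector's non-linear domain.  inverse_map = (rows, cols) gives, for
    each output pixel, the source coordinates to sample in the un-warped
    image.  Nearest-neighbour sampling is used for brevity; a real warping
    unit would interpolate."""
    rows, cols = inverse_map
    rows = np.clip(np.rint(rows).astype(int), 0, image.shape[0] - 1)
    cols = np.clip(np.rint(cols).astype(int), 0, image.shape[1] - 1)
    return image[rows, cols]
```

An identity map returns the image unchanged; in practice the map would encode the screen and mirror geometry supplied by the geometric mapping control unit 14.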
Figure 3 is a schematic diagram of an image processing apparatus, part of which, according to a specific embodiment, serves as the merge unit 15 of Figure 2. The merge unit 15 has an input for light point raster data 21 and an input for background raster data 22. These have colour components 21a-21c and 22a-22c respectively. A light point image frame buffer 19 may be provided for the merge unit 15. A background image frame buffer 20 may also be provided. Figure 3 also shows the warping unit 12. It will be apparent to the skilled reader that, in some embodiments of the invention, the light point image frame buffer 19 and background image frame buffer 20 may be integrated into the light point generator 13 and image generator 11, as shown in Figure 2.
The light point data buffer 19 is synchronised with the background frame buffer 20. If provided these will be synchronised by any suitable means known to the skilled reader. It will be apparent that in embodiments with integrated buffers, the synchronisation may be done within the light point generator 13 and image generator 11.
The merge unit 15 has an attenuation unit 25 able to apply attenuation to the
background raster data 22 to provide an attenuated background raster data signal 26. A filter 27 provides a spatially filtered attenuation raster data 28. The attenuation raster data 28 is dependent on the light point raster data 21 as the data 28 is generated by applying spatial filtering to raster data derived from the light point raster data 21. The polarity of the spatial filtering is selected so that the data 28 attenuates background raster data through a multiplication operation.
The attenuating unit 25 multiplies the background raster data 22 by the attenuation raster data 28 using a set of multipliers 29a-29c. The data 28 may, therefore, be referred to as a raster light attenuation signal.
In the specific embodiment described here, the attenuation array is generated using a 3x3 filter array of coefficients fitting a Gaussian distribution. Other sizes of array may be used in alternative embodiments. Alternatives to Gaussian distributions may also be used in other embodiments. These distributions will be selected to expand or blur the light point data, so that attenuation dependent on the light points will act to attenuate the background raster data 22 in a region around the light points. Suitable distributions to use as alternatives to a Gaussian distribution will be apparent to the skilled reader. The algorithms employed in the real-time filter 27 maintain the anti-aliased quality of the light point data, ensuring the background attenuation data 28 is free from spatial or temporal aliasing artefacts. The design of the real-time filter 27 ensures that the size and level of the attenuation data 28 is a dynamic and variable function of the corresponding light point feature (i.e. larger, brighter points cut a larger-diameter, deeper hole into the background compared with smaller, dimmer points). A larger filter array or alternative filter weightings can create a larger and/or darker border between the light point feature and the background image. This technique is effective in enhancing light point contrast, and in maintaining light point colour purity and perceived sharpness, all of which are critical for effective pilot training.
The merge unit 15 has a colour processing unit 32 which applies a weighted sum to the colour components 21a-21c of the light point raster data 21 to generate monochromatic light point raster data. In this embodiment, the weighting components are 0.7, 0.7 and 0.1 for red, green and blue respectively. These may be adjustable in some envisaged embodiments. In other embodiments these components may vary.
In another embodiment the colour processor weightings applied to the colour components 21a-21c are chosen to match the human eye's spectral response, to ensure the degree of attenuation is commensurate with the light point RGB values. For example, a blue light of a given value is perceived by the eye as having a lower brightness than a green or red light of equal value.
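By way of illustration only (the function below is a hypothetical NumPy sketch, not the patented implementation), the weighted colour sum, 3x3 Gaussian filtering and polarity inversion described above might be realised as:

```python
import numpy as np

# 3x3 Gaussian-profile coefficients, normalised to sum to 1.
GAUSS_3X3 = np.array([[1, 2, 1],
                      [2, 4, 2],
                      [1, 2, 1]]) / 16.0

def attenuation_array(light_rgb):
    """Derive an attenuation array (cf. data 28) from light point raster
    data with values in [0, 1].  In the result, 1.0 means no light point
    (no attenuation under multiplication) and 0.0 means a full light point."""
    # Weighted colour sum (0.7, 0.7, 0.1 for R, G, B in this embodiment).
    mono = np.clip(light_rgb @ np.array([0.7, 0.7, 0.1]), 0.0, 1.0)
    # 3x3 spatial filtering expands each point into its neighbourhood.
    padded = np.pad(mono, 1)
    blurred = np.zeros_like(mono)
    h, w = mono.shape
    for dy in range(3):
        for dx in range(3):
            blurred += GAUSS_3X3[dy, dx] * padded[dy:dy + h, dx:dx + w]
    # Invert polarity so that multiplying the background by this array
    # attenuates it most strongly at the light points.
    return 1.0 - np.clip(blurred, 0.0, 1.0)
```

A full-intensity point thus cuts a hole whose depth falls off with the Gaussian profile around it, matching the "larger, brighter points cut a deeper hole" behaviour described above.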
The merge unit 15 has a series of adder units 30a-30c to add the light point raster data 21 to the attenuated background raster data 26 and provide enhanced background raster data in which the background raster data 22 has been enhanced with the light point raster data 21. Since the attenuation function is applied only to the background data, the light point data remains unmodified and consequently retains its original sharpness and resolution. The subsequent merging function therefore creates a composite image which preserves the original quality of the light point raster data, ensuring there is no loss of fidelity or quality.
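The multiply-then-add path through the multipliers 29a-29c and adders 30a-30c can be sketched per colour channel. This is a hypothetical NumPy rendering, assuming values normalised to [0, 1]:

```python
import numpy as np

def merge(light_rgb, background_rgb, attenuation):
    # Multipliers 29a-29c: attenuate the background per channel; the
    # per-pixel attenuation array (1.0 = no light point, 0.0 = full
    # light point) broadcasts across the three colour components.
    attenuated = background_rgb * attenuation[..., None]
    # Adders 30a-30c: add the unmodified light point data, so its
    # original sharpness and resolution are preserved.
    return np.clip(attenuated + light_rgb, 0.0, 1.0)
```

Only the background passes through the multiplication stage, which is why the composite retains the full fidelity of the light point raster data.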
The process performed by the merge unit 15 is illustrated in Figure 4. The process 40 begins at step 41 when light point raster data 21 and background raster data 22 are received by the merge unit 15. The light point raster data 21 is pre-warped into the geometric domain used for projection.
At step 42, warping or geometric mapping is applied to the background raster data 22. This warping corresponds to the pre-warping of the light point raster data. At step 43, attenuation is applied to the warped background raster data 22. This attenuation is applied by multiplying the data 22 by an attenuation array 28 derived from the light point raster data 21. The light point raster data was generated in a pre-warped format which corresponds to the geometric domain to which the background raster data 22 was warped. Therefore, the light point raster data 21 and background raster data 22 can be overlaid in the warped geometric domain. In the present embodiment the attenuation array is derived by applying Gaussian spatial filtering to a weighted sum of colour components of the light point raster data 21.
In the specific embodiment described here, the coefficient values have a symmetric Gaussian profile. The polarity and normalisation of the attenuation values are such that 1.0 corresponds to the absence of a light point and 0.0 corresponds to a full light point. Therefore, a multiplication operation will cause maximum attenuation for a full light point. At step 44, the light point raster data 21 is combined with, or added to, the attenuated background raster data 26 by the combiner unit 30, which performs an adding operation. The process ends at step 45, where background raster data enhanced with light point raster data is provided as raster data with enhanced light points 31.
Figure 5 is a flow diagram depicting the attenuation process carried out at step 43 of Figure 4. The process begins at step 51. At step 52 colour components of the light point raster data are received by the merge unit 15.
At step 53 the weighted sum 33 is calculated to produce monochromatic light point data.
At step 54 the spatial filter 27 applies spatial filtering to the weighted sum 33 of the light point raster data 21 to produce an attenuation array 28.
At step 55 the product of the background raster data 22 and the attenuation array 28 is found, to provide the attenuated background raster data 26.
The process terminates at step 56 with attenuated background raster data 26 being available to be combined with light point raster data 21.
Figure 6 provides a schematic diagram of a light point image generator, such as the generator 13 of Figure 2, according to another specific embodiment of the invention. A merge unit 81, similar to the merge unit 15, is in communication with a raster image generator 82 and a light point image frame store 83. The image generator 82 and the frame store 83 are synchronised. For clarity, functional units relating to geometric mapping or warping are omitted from Figure 6.
The light point image frame store 83 communicates with a light point rendering unit 84. The light point rendering unit 84 applies constant area anti-aliasing algorithms to the light points to avoid fluctuations, jumping and other artefacts of light points as they traverse a raster-based image, for example. The light point rendering unit 84 also applies high-precision, floating-point mathematics to the light point rendering. The light points are rendered in the same geometric domain as that in which they will be projected. The light point image frame store 83 also communicates with a halo rendering unit 85 which renders halos and atmospheric dispersion effects for light points. The light point frame store 83, light point rendering unit 84, and halo rendering unit 85 of this specific embodiment perform a similar function to the light point generator 13 shown in Figure 2.
The light point rendering unit 84 renders light points using data relating to light point range 86, light point intensity 87, light point colour 88, light point type 89, light point position 90, and atmospheric transmittance 91. The light point position conforms to the (typically non-linear) geometric domain used by a projector 3. The types of data 86 to 91 will be apparent to the skilled reader. The halo rendering unit 85 uses the same data as the light point rendering unit 84.
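The per-point inputs 86 to 91 can be pictured as one record per light point. The field names and types in the sketch below are illustrative assumptions only, not definitions from the source:

```python
from dataclasses import dataclass

@dataclass
class LightPoint:
    """Hypothetical per-point record consumed by a light point rendering
    unit such as 84 (and shared with a halo rendering unit such as 85)."""
    position: tuple    # (x, y) in the projector's non-linear domain (90)
    range_m: float     # light point range (86)
    intensity: float   # light point intensity (87)
    colour: tuple      # RGB light point colour (88)
    point_type: str    # light point type (89)
    transmittance: float  # atmospheric transmittance (91)
```

Grouping the data this way simply makes explicit that both rendering units operate from the same inputs.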
The light point image frame store 83 combines light point data provided by the light point rendering unit 84 with rendered halo data provided by the halo rendering unit 85. The light point frame store synchronises with the image generator 82. Any suitable means of synchronising known to the skilled reader may be used, but frame buffers are used in the specific embodiment described here.
The light point frame store 83 provides light point raster data, synchronised with raster data of a background image from the image generator 82, to the merge unit 81. A weighting and filtering unit 94 and a multiplying unit 95 attenuate the background image raster data from the image generator 82. An adding unit 96 combines this attenuated background image raster data with the light point raster data from the frame store 83. The merge unit 81 thus outputs raster data in which the background has been attenuated by the multiplying unit 95, in dependence on the light point raster data and halo raster data provided by the frame store 83 after filtering by the unit 94.
Figure 7 gives a simple depiction of the preparation, according to an aspect of the present invention, of a scene to be displayed in a simulator. The scene is typically displayed using a projector 3. Raster data used by the projector 3 typically conforms to a non-linear geometric domain selected for a given OTW mirror 4.
Shown in Figure 7 is a background scene 101 corresponding to raster data generated by an image generator 11. The background scene 101 is represented here as a snapshot in time for simplicity. The skilled person will recognise that a simulator will typically use dynamic scenes or image streams.
The background scene 101 is generated in a first, linear geometric domain, as it might appear on a two-dimensional display. This is illustrated by the straight perspective lines formed by the runway strips, for example. The background scene 101 shown does not feature any light points, even though they may eventually feature in the scene to be displayed. In this embodiment, light points corresponding to the background scene 101 will feature as a draw list (not depicted here).
Also shown in Figure 7 is the background scene 102 after the raster data representing it has been warped, post-warped, or geometrically transformed, to the geometric domain suitable for the given OTW projector 3.
Also shown in Figure 7 is a light point scene 103 rendered by the light point generator 13. This scene is shown pre-warped to the second geometric domain, suitable for the OTW projector 3. This is illustrated by the runway lights curving outward in the same manner as the runway strips after post-warping.
Finally, Figure 7 depicts the post-warped background scene and pre-warped light points merged as scene 104.
Figure 8 is a flow diagram of a process 200 corresponding to those of Figures 4 and 5 combined, described with reference to Figure 7. The process starts at step 201. At step 202, the image generator renders raster data for a background scene 101. The raster data is generated in a linear geometric domain.
At step 203, the image generator, such as that depicted at 11 in Figure 2, generates draw list data (not shown) for light points 103 which will feature in an image to be projected but do not feature in the background scene 101. In some embodiments of the present invention only a limited portion of the available dynamic range is used for this raster data. This provides increased contrast between the background scene and the light points added later in the process.
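The limited-dynamic-range idea can be sketched as pre-scaling the background into a fraction of full scale before full-range light points are merged in. The function name and the 0.8 fraction below are illustrative; the source specifies neither:

```python
import numpy as np

def reserve_headroom(background, fraction=0.8):
    """Map background raster values in [0, 1] into [0, fraction],
    leaving headroom so that later-added full-scale light points stand
    out against the background.  The fraction is an assumption for
    illustration; the patent does not give a value."""
    return np.clip(background, 0.0, 1.0) * fraction
```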
At step 204, warping, or geometric mapping, instructions are generated by the warping control unit 14. These instructions define transformations from a first geometric domain in which the raster data for background scene is generated to a second geometric domain used by a projector, such as shown in Figure 1.
At step 205, the raster data for the background scene 101 is post-warped, or transformed, using the geometric mapping instructions provided by the warping control unit 14. The post-warped background scene is shown as 102. At step 206, the light point raster data for the light points 103 is generated using the draw list data provided by the image generator 11 and the geometric mapping instruction data provided by the geometric mapping control unit, such as that depicted at 14 in Figure 2. The light point raster data therefore conforms to the geometric domain used by the projector and may be considered to be pre-warped.
At step 207, image processing algorithms are applied to the light point raster data to control the characteristics of the light points, or other features such as halos, and to prevent fluctuations in size, aliasing or other artefacts being introduced by raster rendering or warping. As this data conforms to the geometric domain used by the projector, the effect of any algorithms will be realised in that domain directly and will not be affected by post-warping. In one embodiment, constant area algorithms are applied. A light point which traverses a simulator scene, but which represents a light at a constant distance, will maintain a constant energy, or area, and intensity, and will traverse steadily. Any suitable constant energy or constant area algorithms known to the skilled reader may be used. In this embodiment of the present invention the area of the light points 103, which have been rendered as pre-warped raster data, is maintained constant. As will be understood by the skilled reader, the algorithms of this embodiment will allow intended variations of the light points which correspond to the evolving scene. In this embodiment, other algorithms are applied to control the position of light points, or other features. These algorithms apply high-precision floating-point mathematics, such as is well known to the skilled reader.
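One standard technique that satisfies this constant-energy requirement, offered as an illustration rather than as the patented algorithm, is bilinear "splatting" of each light point over its four surrounding pixels:

```python
import numpy as np

def splat_light_point(frame, x, y, energy=1.0):
    """Deposit a light point at sub-pixel position (x, y) by bilinear
    weighting over the four surrounding pixels.  The four weights always
    sum to 1, so the deposited energy (intensity times area) is
    independent of the fractional position; the point neither flickers
    nor jumps as it traverses the raster."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - x0, y - y0
    for dy, wy in ((0, 1.0 - fy), (1, fy)):
        for dx, wx in ((0, 1.0 - fx), (1, fx)):
            yy, xx = y0 + dy, x0 + dx
            if 0 <= yy < frame.shape[0] and 0 <= xx < frame.shape[1]:
                frame[yy, xx] += energy * wy * wx
    return frame
```

Because the weights sum to one for any fractional offset, the rendered point's total energy is constant frame to frame, which is the behaviour the constant-area algorithms above are said to guarantee.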
At step 208 the raster data for the background scene 101 and the light points are merged to form an image with enhanced light points. The process ends at step 209.
Some embodiments are implemented using a computer. Other embodiments are implemented using hardware such as raster Graphics Processor Units (GPUs), Field Programmable Gate Arrays (FPGAs) or even Application Specific Integrated Circuits (ASICs). Other specific embodiments use a combination of these.
Other embodiments are implemented by a computer program product or computer-processor-readable media, such as magnetic, optical or holographic storage media, or solid state memory media. These provide instructions or code which configure a computer, processor and/or peripheral device to implement the embodiments. Other embodiments are implemented using a combination of computers and dedicated hardware. In one specific embodiment the merge units, such as those depicted at 15 or 81, and their associated data buffers, such as 19, 20 and 83, are implemented using FPGAs to create the functional units described, and the remaining aspects are implemented using computers with peripherals as understood to be appropriate by the skilled reader. In another specific embodiment the merge units, such as those depicted at 15 or 81, are provided as component devices to provide merge functionality for a system which includes separate image generators.
In a further embodiment the second raster data may represent features other than light points. In one embodiment the second raster data may define aeroplanes or vehicles represented as small images against a background. Alternatively, symbology used in the display may be represented as second raster data. For example, symbology used in Head-Up Displays (HUDs) may be represented as the second raster data. In this embodiment the images may particularly benefit from pre-warping and the avoidance of raster-related effects, and of such effects exacerbated by post-warping.
While the invention has been described in terms of what are at present its preferred embodiments, it will be apparent to those skilled in the art that various changes can be made to these embodiments without departing from the scope of the invention, which is defined by the claims appended hereto. The reader will appreciate that the foregoing is but one example of implementation of the present invention, and that further aspects, features, variations and advantages may arise from using the invention in different embodiments. The scope of protection is intended to be provided by the claims appended hereto, which are to be interpreted in the light of the description with reference to the drawings and not to be limited thereby.

CLAIMS:
1. An image processing apparatus operable to provide raster data for a display apparatus, the image processing apparatus comprising:
a geometric transformer operable to receive first raster data in a first geometric domain and to transform said first raster data to a second geometric domain;
a second raster data generator operable to generate second raster data in the second geometric domain; and
a raster data combiner operable to combine the first and the second raster data in the second geometric domain.
2. An image processing apparatus in accordance with claim 1 comprising an attenuator operable on the first raster data once transformed to the second geometric domain, the attenuator being operable to attenuate the transformed first raster data dependent on the second raster data.
3. An image processing apparatus in accordance with claim 2 comprising a spatial filtering unit operable to apply spatial filtering to the second raster data to provide spatially filtered second raster data for the attenuator.
4. An image processing apparatus in accordance with claim 3 wherein the spatial filtering unit is operable to apply spatial filtering so as to expand features represented in the second raster data.
5. An image processing apparatus in accordance with claim 3 or claim 4 comprising a colour processor operable to apply a weighted summation function to colour components of the second raster data prior to spatial filtering.
6. An image processing apparatus in accordance with any one of claims 1 to 5, wherein the second raster data generator is operable to apply one or more image processing algorithms to light points represented within the second raster data.
7. An image processing apparatus in accordance with claim 6 wherein the one or more image processing algorithms control the position of the light points.
8. An image processing apparatus in accordance with claim 6 or claim 7 wherein the one or more image processing algorithms control the area of the light points.
9. An image processing apparatus in accordance with claim 6, 7 or 8 wherein the one or more image processing algorithms control the energy of the light points.
10. An image processing apparatus in accordance with any one of the preceding claims wherein the second geometric domain corresponds to a geometrically non-linear display apparatus.
11. An image processing apparatus in accordance with any one of the preceding claims including a transform controller operable to generate instructions for
transformation of data between the first and second geometric domains.
12. An image processing apparatus in accordance with any preceding claim wherein the second raster data generator is operable to generate data corresponding to light points of a scene for a simulator.
13. An image processing apparatus in accordance with any preceding claim wherein the second raster data generator is operable to generate features representative of one or more atmospheric effects for light points of a scene for a simulator.
14. An image processing apparatus in accordance with any one of claims 1 to 13 comprising a first data generator operable to provide the first raster data and to provide draw list data for light points to be generated by the second raster data generator.
15. An image processing apparatus in accordance with any one of claims 1 to 13 comprising a first data generator operable to provide the first raster data using a portion of an available dynamic range.
16. An image processing apparatus operable to provide raster data for a display apparatus, the image processing apparatus comprising:
an attenuator operable on transformed first raster data to provide attenuated first raster data; and
a raster data combiner operable to combine the attenuated first raster data with second raster data,
wherein the attenuator is operable to apply attenuation to the first raster data dependent upon the second raster data.
17. An image processing apparatus in accordance with claim 16 comprising a geometric transform unit operable to transform the first raster data from the first geometric domain to the second geometric domain.
18. An image processing apparatus in accordance with claim 17 wherein the attenuator is operable to apply attenuation to the first raster data after it has been transformed from the first geometric domain to the second geometric domain.
19. An image processing apparatus in accordance with any one of claims 16 to 18 including a spatial filtering unit operable on the second raster data, wherein the attenuator is operable to apply attenuation dependent on spatially filtered second raster data.
20. An image processing apparatus in accordance with claim 19 wherein the spatial filtering unit is operable on the second raster data to apply filtering so as to expand features represented in the second raster data to provide expanded features, whereby the attenuator applies attenuation so as to attenuate the first raster data according to the expanded features, to form a region of attenuated raster data around each feature represented in the second raster data.
21. An image processing apparatus in accordance with claim 19 or claim 20, wherein the spatial filtering unit is operable to apply a Gaussian spatial filter, whereby the attenuator is operable to apply attenuation dependent on Gaussian spatially filtered second raster data.
22. An image processing apparatus as claimed in any one of claims 16 to 21 wherein the attenuator is operable to attenuate the first raster data by multiplying it with data provided by the spatial filtering unit.
23. An image processing apparatus operable to provide raster data for a display apparatus, the image processing apparatus comprising:
a first data input operable to receive first raster data defined in a first geometric domain;
a geometric transformer operable to transform said first raster data to a second geometric domain;
a second data input operable to receive second raster data in the second geometric domain; and
a raster data combiner operable to combine first and second raster data in the second geometric domain.
24. A display apparatus for a simulator comprising:
a raster display apparatus operable to project images onto a non-linear display screen; and
an image processing apparatus in accordance with any one of the preceding claims.
25. A method of processing image data for a display apparatus, the method comprising:
receiving first raster data in a first geometric domain;
transforming said first raster data to a second geometric domain;
receiving second raster data in the second geometric domain; and
combining first and second raster data in the second geometric domain.
26. A method of processing image data for a display apparatus, the method comprising:
attenuating first raster data; and
combining the attenuated first raster data with second raster data,
wherein attenuation of the first raster data is dependent upon second raster data.
27. A method in accordance with claim 25 or claim 26 comprising applying algorithms to control features represented in the second raster data.
28. A computer program product operable to configure a computer to provide the apparatus of any one of claims 1 to 23.
29. A computer program product operable to configure a computer to perform the method of any one of claims 25 to 27.
PCT/GB2010/001292 2009-07-09 2010-07-05 An image processing method and device WO2011004145A1 (en)

Applications Claiming Priority (2)

- GB0911963.7, priority date 2009-07-09
- GB0911963A (GB2471708A), filed 2009-07-09, priority date 2009-07-09: Image combining with light point enhancements and geometric transforms

Publications (1)

Publication Number Publication Date
WO2011004145A1 true WO2011004145A1 (en) 2011-01-13

Family

ID=41022425

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2010/001292 WO2011004145A1 (en) 2009-07-09 2010-07-05 An image processing method and device

Country Status (2)

Country Link
GB (1) GB2471708A (en)
WO (1) WO2011004145A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220366799A1 (en) * 2021-05-14 2022-11-17 Rockwell Collins, Inc. Neuromorphic cameras for aircraft

Citations (5)

Publication number Priority date Publication date Assignee Title
US5363475A (en) * 1988-12-05 1994-11-08 Rediffusion Simulation Limited Image generator for generating perspective views from data defining a model having opaque and translucent features
US20030231261A1 (en) * 2002-06-12 2003-12-18 Bassi Zorawar S. Short throw projection system and method
WO2006047447A2 (en) * 2004-10-22 2006-05-04 Fakespace Labs, Inc. Rear projection imaging system with image warping distortion correction system and associated method
US20060109270A1 (en) * 2002-11-01 2006-05-25 Cae Inc. Method and apparatus for providing calligraphic light point display
US20060284866A1 (en) * 2003-08-13 2006-12-21 Thales Method and device for the generation of specific elements of an image and method and device for the generation of overall images comprising said specific elements

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
GB2079116A (en) * 1980-06-24 1982-01-13 Kaiser Aerospace & Electronics Distortion correction circuit in a display system
JPH0642709B2 (en) * 1989-05-22 1994-06-01 大日本印刷株式会社 Method of forming combined image
US5526442A (en) * 1993-10-04 1996-06-11 Hitachi Medical Corporation X-ray radiography method and system
GB0315116D0 (en) * 2003-06-27 2003-07-30 Seos Ltd Image display apparatus for displaying composite images
CN1985266B (en) * 2004-07-26 2010-05-05 奥普提克斯晶硅有限公司 Panoramic vision system and method
US20090122195A1 (en) * 2007-11-09 2009-05-14 Van Baar Jeroen System and Method for Combining Image Sequences


Also Published As

Publication number Publication date
GB0911963D0 (en) 2009-08-19
GB2471708A (en) 2011-01-12

Similar Documents

Publication Publication Date Title
CN107306332B (en) Occlusive direct view augmented reality system, computing device and method
US6809731B2 (en) System and method for rendering high-resolution critical items
US20020180727A1 (en) Shadow buffer control module method and software construct for adjusting per pixel raster images attributes to screen space and projector features for digital warp, intensity transforms, color matching, soft-edge blending, and filtering for multiple projectors and laser projectors
JP5451319B2 (en) Image processing apparatus, image processing method, program, and storage medium
US8629868B1 (en) Systems and methods for simulating depth of field on a computer generated display
CN104954715A (en) GPU (graphics processing unit) acceleration based video display method adopting multi-projector splicing fusion on special-shaped screens
JP6371092B2 (en) Video processing apparatus and projector apparatus using the same
JP6897681B2 (en) Information processing equipment, information processing methods, and programs
US9754347B2 (en) Method and device for simulating a wide field of view
JP6335924B2 (en) Video display device
WO2014162533A1 (en) Video display device
WO2011004145A1 (en) An image processing method and device
Majumder et al. Computer graphics optique: Optical superposition of projected computer graphics
US20020158877A1 (en) Shadow buffer control module method and software construct for adjusting per pixel raster images attributes to screen space and projector features for digital wrap, intensity transforms, color matching, soft-edge blending and filtering for multiple projectors and laser projectors
US10249078B1 (en) System and method for simulating infrared (IR) light halos in a computer graphics display
CN108961173B (en) H 5-end-based augmented reality optimization method
KR20220112495A (en) Image projection system and method of the same
EP3350796B1 (en) Night vision goggles aided flight simulator system and method
Smit et al. Non-uniform crosstalk reduction for dynamic scenes
US11941408B2 (en) Encoding stereo splash screen in static image
Sweet et al. 120 HERTZ–THE NEW 60 FOR FLIGHT SIMULATION?
US20230388472A1 (en) Information processing apparatus, information processing method, and program
Holmes Large screen color CRT projection system with digital correction
TWI704526B (en) Gaming system with expanded vision
Advani et al. Dynamic properties of visual cueing systems in flight simulators

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 10735039

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10735039

Country of ref document: EP

Kind code of ref document: A1