GB2558312A - Vortex ring based display - Google Patents

Vortex ring based display

Info

Publication number
GB2558312A
Authority
GB
United Kingdom
Prior art keywords
projection screen
image
projector
vortex ring
display system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1622468.5A
Other versions
GB201622468D0 (en)
Inventor
Jacques Achille Charlier Olivier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Barco NV
Original Assignee
Barco NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Barco NV filed Critical Barco NV
Priority to GB1622468.5A priority Critical patent/GB2558312A/en
Publication of GB201622468D0 publication Critical patent/GB201622468D0/en
Priority to PCT/EP2017/084567 priority patent/WO2018122211A2/en
Publication of GB2558312A publication Critical patent/GB2558312A/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191Testing thereof
    • H04N9/3194Testing thereof including sensor feedback
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/0816Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements
    • G02B26/0833Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements the reflecting element being a micromechanical device, e.g. a MEMS mirror, DMD
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/56Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/10Projectors with built-in or built-on screen
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/54Accessories
    • G03B21/56Projection screens
    • G03B21/562Screens moving during projection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor
    • H04N5/7416Projection arrangements for image reproduction, e.g. using eidophor involving the use of a spatial light modulator, e.g. a light valve, controlled by a video signal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2206/00Systems for exchange of information between different pieces of apparatus, e.g. for exchanging trimming information, for photo finishing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A projector screen 42 is susceptible to movement during projection; the system comprises means 46 to detect the position of the screen; and projector 44 comprises a light valve 45 with more pixels than the image to be projected 41, wherein a subset of the pixels used to project the image is a function of the determined location of the screen. The screen may be a vortex ring transporting one of smoke, water vapour, water droplets, dust, or humid air. An optical sensor may be used to detect the position of the screen, it may be able to detect wavelengths below 300nm and above 700nm. Heater or cooler elements may change the temperature of the screen. Markers may be projected onto the screen in order to detect its position. A (Kalman) filter, may be used to detect the location of the screen. The projector may project an image onto a second screen.

Description

(54) Title of the Invention: Vortex ring based display
Abstract Title: Digital projector comprising means to determine position of projection screen
Fig. 4A
At least one drawing originally filed was informal and the print reproduced here is taken from a later filed formal copy.
[Drawings GB2558312A_D0001 to GB2558312A_D0025 (Figures 1 to 8, sheets 1/11 to 11/11) omitted.]
VORTEX RING BASED DISPLAY
The present application relates to display systems and to methods of constructing and operating them. The display system may comprise a non-solid projection screen such as a smoke curtain, a water curtain and, in particular, vortex rings carrying smoke, water droplets, fog, etc.
Background
It is known from the art how to project images on vortex rings of smoke or water vapor.
In JP11184420, "EDDY RING GENERATING DEVICE, DISPLAY DEVICE USING IT AND ITS METHOD", an eddy ring generating device has plural blow-out holes blowing out eddy rings at regular time intervals. The eddy rings advance side by side and are generated continuously from the blow-out holes to form plural lines of eddy rings. A projector projects images on the eddy rings, and a display is formed in the space separating the blow-out holes and ...
Tracking of the vortex rings is not necessary. The position of the projector remains unchanged. Successive vortex rings cross the same light ray, each vortex ring replacing a previous vortex ring on which (part of) an image was being projected.
In “Vortex Ring Based Display” (Tokuda, Y., Nishimura, K., Suzuki, Y., Tanikawa, T., and Michitaka, H. Vortex ring based display. Virtual Systems and Multimedia (VSMM), 2010, 51—54.) Yutaka Tokuda et al describe a “mid-air projection display based on vortex ring”. Tokuda suggests using a depth camera to track the vortex to enable interaction with users.
Neither Tokuda et al in their article nor JP11184420 disclose:
- How to track the position of a vortex moving in a direction with a component perpendicular to the direction of projection.
- How to control the position of one or more vortices.
- How to modify a projection system to keep projecting on a given vortex while the vortex is moving.
- How to use vortex rings as projection surfaces without interfering with the projection of a movie on a fixed projection surface.
There is a need for improvement in the art.
Summary of the invention.
In a first aspect of the invention, a display system comprises a first projection screen whose position is susceptible to change during projection and a digital projector to project images on the first projection screen, the display system being characterized in that it comprises means to determine the position of the projection screen; in that the image to be projected on the projection screen has fewer pixels than the light valve of the projector; and in that the subset of pixels of the light valve used to project the image on the projection screen is a function of the determined position of said projection screen.
It is an advantage of the invention that the position and/or attitude of the projector need not be modified, e.g. by means of a motorized platform, to keep projecting the image.
It is a further advantage of the invention that only synchronization between the sequence of images and the position of the first projection screen is necessary.
In a further aspect of the invention, the position of the first projection screen is determined by an optical sensor. An optical sensor does not require contact with the first projection screen, allowing the invention to be applied to non-solid projection screens such as smoke curtains, water curtains and, in particular, vortex rings carrying smoke, water droplets, fog, etc.
In a further aspect of the invention, the means to determine the position of the projection screen is an optical sensor that can detect wavelengths below 390 nm and above 700 nm.
The optical sensor is for instance an infrared camera, also known as a thermal camera.
The optical sensor can also be a camera with a filter that blocks the wavelengths below 700 nm. A near infrared light source (e.g. a NIR light emitting diode) illuminates the first projection screen. It is an advantage of that aspect of the invention that the first projection screen can be better discriminated from its environment, decreasing the complexity of the image processing required to determine the position of the first projection screen.
In a further aspect of the invention, at least one of a heater element or a cooler element changes the temperature of the first projection screen. It is an advantage of that aspect of the invention that it will further facilitate detection of the first projection screen.
For instance, if the temperature difference between the first projection screen and the other objects in its environment is increased, the difference in pixel value between the pixels related to the first projection screen and the pixels related to other objects in the environment of the first projection screen (including the background) will be increased as well. The larger the difference in pixel value, the easier it will be to isolate the first projection screen, for instance by applying thresholding to the images taken by the thermal camera.
In a further aspect of the invention, a mapping function and/or a neural network is used to determine which subset of pixels of the light valve must be used to project a given image on the first projection screen.
In another aspect of the invention, invisible markers are projected on the first projection screen. For instance, infrared images can be projected on the first projection screen. The infrared markers can for instance be swept across the space in front of the projector in a scanning fashion. The markers can be projected by a distinct projector, or by the same projector if it can project visible images as well as near infrared images (as e.g. disclosed in US9077915B2, "Interweaving of IR and visible images").
In another aspect of the invention a filter estimates the position of the projection screen.
For instance, if a first position of the projection screen is known at a given instant t1, and its speed and acceleration are known around the same instant t1, it is possible to anticipate the position of the projection screen at a later time t2 > t1 without having to use the sensor.
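As an illustrative sketch (not part of the patent text; the function name is hypothetical), the anticipation step described above amounts to a constant-acceleration extrapolation along one axis:

```python
def predict_position(p, v, a, dt):
    """Extrapolate one coordinate of the projection screen from
    instant t1 to t2 = t1 + dt, assuming the acceleration a is
    constant over the interval.

    p, v, a: position, velocity and acceleration at t1 (one axis).
    """
    # Standard kinematics: p(t2) = p + v*dt + (1/2)*a*dt^2
    return p + v * dt + 0.5 * a * dt ** 2
```

The same formula is applied independently to each of the X, Y and Z coordinates of the screen.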
It is an advantage of that aspect of the invention that a simpler sensor can be used (for instance, the sensor might be limited to detecting the exit of a vortex ring from the vortex ring generator when the first projection screen is a vortex ring loaded with e.g. smoke).
When markers sweep the space in front of the projector to detect the projection screen, anticipating its position makes it possible to limit the scan to a region surrounding the estimated position.
The filter can be based on empirical models of how a vortex ring moves across a projection room. When the first projection screen is "solid", the known equations of mechanics can be used to anticipate the position of the first projection screen.
In particular, the filter to estimate the position of the projection screen can be a filter like a Kalman filter, which uses a system's dynamics model (e.g. physical laws of motion), known control inputs to that system, and multiple sequential measurements (such as from sensors) to form an estimate of the system's varying quantities (its state, in this case a position) that is better than the estimate obtained from any single measurement alone. Noisy sensor data, approximations in the equations that describe the system's evolution, and external factors that are not accounted for all place limits on how well it is possible to determine the system's state. A filter of this type, such as a Kalman filter, deals effectively with the uncertainty due to noisy sensor data and, to some extent, also with random external factors.

The Kalman filter produces an estimate of the state of the system, i.e. the position, as a weighted average of the system's predicted state and of the new measurement. The purpose of the weights is that values with better (i.e. smaller) estimated uncertainty are trusted more. The weights are calculated from the covariance, a measure of the estimated uncertainty of the prediction of the system's state. The result of the weighted average is a new state estimate that lies between the predicted and the measured state, and has a better estimated uncertainty than either alone. This process is repeated at every time step, with the new estimate and its covariance informing the prediction used in the following iteration. This means that a filter like a Kalman filter works recursively and requires only the last best guess of a system's state, rather than its entire history, to calculate a new state, i.e. a new position.
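A minimal one-dimensional sketch of the predict/update cycle described above (illustrative only; the function name and the noise variances q and r are hypothetical, and a real tracker would also propagate velocity and acceleration in the predict step):

```python
def kalman_step(x, p, z, q=0.01, r=0.25):
    """One predict/update cycle of a 1-D Kalman filter.

    x, p : previous position estimate and its variance
    z    : new (noisy) position measurement from the sensor
    q, r : process and measurement noise variances (illustrative values)

    A static motion model is assumed for brevity.
    """
    # Predict: the state is unchanged, only the uncertainty grows
    # by the process noise.
    p = p + q
    # Update: the Kalman gain weights the measurement by its
    # relative certainty (smaller variance is trusted more).
    k = p / (p + r)
    x = x + k * (z - x)   # weighted average of prediction and measurement
    p = (1 - k) * p       # fused estimate is more certain than either input
    return x, p
```

Iterating this step on successive sensor readings implements the recursion described above: only the last estimate and its variance are carried forward, not the whole measurement history.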
In another aspect of the invention, the pixels of the light valve not used to project an image on the first projection screen are used to project a second image or sequence of images on a second projection screen.
It is an advantage of that aspect of the invention that a background image or sequence of images can be projected on a second screen having a fixed position behind the first projection screen. The first projection screen may cast a shadow on the second projection screen, emphasizing the different planes in which the first and second projection screens are positioned, thereby enhancing the projection by conferring on it a three-dimensional character. The background image can be modified as a function of the position of the first projection screen to correct for visual artifacts caused by the first projection screen (including but not limited to shadows, or discontinuities caused by the subset of pixels not available to project the second image or sequence of images on the second screen).
In a further aspect of the invention, the distance between the objective of the projector and the first projection screen is at least 10 times the distance between the sensor used to detect the first projection screen and the objective of the projector.
In a further aspect of the invention, the distance between the objective of the projector and the first projection screen is at least 100 times the distance between the sensor used to detect the first projection screen and the objective of the projector.
In a further aspect of the invention, the distance between the objective of the projector and the first projection screen is at least 1000 times the distance between the sensor used to detect the first projection screen and the objective of the projector.
It is an advantage of those aspects of the invention that they simplify the determination of the position of the first projection screen.
In a further aspect of the invention the first projection screen is a vortex ring transporting a material to form an image when lit by the projector, the vortex ring being generated by a first vortex ring generator.
It is an advantage of that aspect of the invention that the first projection screen will need no mechanical support that would interfere with projection on a second screen.
The material transported by the vortex ring to form an image can be at least one of smoke, water vapor, water droplets, dust, and humid air.
It is an advantage of that aspect of the invention that front projection and back projection are possible by choosing a different material.
Smoke may be better suited for front projection. Water droplets are well suited for back projection, in particular when the size of the droplets is comparable to the wavelengths of the light used by the projector. Indeed, in that case, Mie scattering favors transmission of the light in a restricted angle, thereby allowing multiview images to be displayed, that is, different perspectives of an object, each perspective being projected on the water droplets by a different projector at a different angle.
In another aspect of the invention, the first vortex ring generator and a second vortex ring generator generate vortex rings that collide with one another. The head-on collision of two vortex rings will give rise to a projection screen having a different shape and velocity than the vortex rings used to generate it.
In particular, if the vortex rings that collide are similar in shape, geometry and velocity, their head-on collision will distribute the material they contain (smoke, fog, water droplets) across a disc that is more or less stationary and has a larger diameter than the vortex rings.
It is an advantage of that aspect of the invention that it is possible to generate a projection screen larger than the vortex rings, thereby making it possible to place a projection screen at any position in front of e.g. a second projection screen while minimizing interference with projection on that second screen.
In another aspect of the invention, the vortex rings contain different materials that generate smoke, fog or water droplets only when mixed together during a collision of the two vortex rings. It is an advantage of that aspect of the invention that it further reduces the visual artefacts that the vortex rings would cause at positions different from the position where a projection screen composed of smoke, fog, water droplets, etc. is desired.
Brief description of the figures.
Figure 1 illustrates how the image projected on a first projection screen, of which the position may change, relates to the pixels of the light valve used for projection.
Figure 2 illustrates back projection and front projection on a vortex ring.
Figures 3A, 3B and 3C illustrate how a light valve can be used to sweep the space in front of a projector.
Figures 4A and 4B illustrate how an image or sequence of images can be projected on a moving projection screen whose position was unknown at the time the images were generated.
Figure 5 illustrates a potential problem related to the determination of the position of the first projection screen.
Figure 6 illustrates how the distance between projector and first projection screen affects the determination of the position of the first projection screen.
Figure 7 illustrates how two sensors can be used to determine the position of the first projection screen in space.
Figures 8, 9 and 10 illustrate how a disc-like projection screen can be generated during the head-on collision of two vortex rings.
Definitions.
Coordinates. Unless mentioned otherwise, the position of a point in space will be determined in a Cartesian system of axes and the coordinates will be named X, Y and Z as is customary. In several examples, the system of axes will be fixed to a projector. Unless mentioned otherwise, the X and Y axes are perpendicular to the optical axis of the projector under consideration and the Z axis will be parallel to the optical axis of the projector.
Degree Of Freedom (DOF). The degrees of freedom (DOF) of a rigid body are defined as the number of independent movements the rigid body has.
If the rigid body is unable to translate, it has at most 3 degrees of freedom, associated with rotations around different axes. These 3 DOF are for instance pitch, roll and yaw, which determine the attitude of the body.
Digital Projector. An image projector that uses a (digital) light valve like a DMD, LC or LCOS panel driven by electric signals, e.g. a sequence of binary or digital electric signals, to form an image on a projection screen.
Light Valve. Examples of light valves (also known as spatial light modulators) are DMDs (digital micro-mirror devices, where micro-mirrors assembled into an array reflect light selectively in certain directions as a function of their state), LC or Liquid Crystal panels, and LCOS or liquid crystal on silicon panels.
Vortex Ring. Also called a toroidal vortex, is a torus-shaped vortex in a fluid or gas; that is, a region where the fluid mostly spins around an imaginary axis line that forms a closed loop. The dominant flow in a vortex ring is said to be toroidal, more precisely poloidal.
Maximum Light Cone. The maximum light cone of a projector is the light cone formed by the projected light rays when at least all the pixels at the periphery of the spatial light modulator are ON.
Nominal. Being according to plan.
Pixel. Pixel will be used to designate a picture element (i.e. part of an image) or an element of a light valve or spatial light modulator. In particular for a Digital Micro-mirror Device (or DMD), pixel can designate a micro-mirror of the array of micro-mirrors of the DMD.
Pixel Value.
In photography and computing, a grayscale or greyscale digital image is an image in which the value of each pixel is a single sample, that is, it carries only intensity information.
Images of this sort, also known as black-and-white, are composed exclusively of shades of gray, varying from black at the weakest intensity to white at the strongest.
In computing, although the grayscale can be computed through rational numbers, image pixels are stored in binary, quantized form. Some early grayscale monitors can only show up to sixteen (4-bit) different shades, but today grayscale images (as photographs) intended for visual display (both on screen and printed) are commonly stored with 8 bits per sampled pixel, which allows 256 different intensities (i.e., shades of gray) to be recorded, typically on a non-linear scale. The precision provided by this format is barely sufficient to avoid visible banding artifacts, but very convenient for programming because a single pixel then occupies a single byte.
Each of the pixels that represents an image stored inside a computer has a pixel value which describes how bright that pixel is, and/or what color it should be. For a grayscale image, the pixel value is a single number that represents the brightness of the pixel. The most common pixel format is the byte image, where this number is stored as an 8-bit integer giving a range of possible values from 0 to 255. Typically zero is taken to be black, and 255 is taken to be white. Values in between make up the different shades of gray.
To represent color images, separate red, green and blue components must be specified for each pixel (assuming an RGB color space), and so the pixel 'value' is actually a vector of three numbers. Often the three different components are stored as three separate 'grayscale' images known as color planes (one for each of red, green and blue), which have to be recombined when displaying or processing.
Multi-spectral images can contain even more than three components for each pixel, and by extension these are stored in the same kind of way, as a vector pixel value, or as separate color planes.
The actual grayscale or color component intensities for each pixel may not actually be stored explicitly. Often, all that is stored for each pixel is an index into a colormap in which the actual intensity or colors can be looked up.
Although simple 8-bit integers or vectors of 8-bit integers are the most common sorts of pixel values used, some image formats support different types of value, for instance 32-bit signed integers or floating point values. Such values are extremely useful in image processing as they allow processing to be carried out on the image where the resulting pixel values are not necessarily 8-bit integers. If this approach is used then it is usually necessary to set up a colormap which relates particular ranges of pixel values to particular displayed colors.
Thresholding. The simplest thresholding methods replace each pixel in an image with a black pixel if the image intensity I(i, j) (with i and j the coordinates of the pixel in the pixel array that forms an image) is less than some fixed constant T (that is, I(i, j) < T), or with a white pixel if the image intensity is greater than that constant.
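As a minimal sketch of this operation (illustrative only; the function name is hypothetical, and the image is represented as a plain list of rows of 0-255 intensities):

```python
def threshold(image, t):
    """Binarize a grayscale image.

    image : list of rows, each a list of 0-255 intensity values
    t     : fixed threshold constant T

    Pixels with intensity below t become 0 (black); the rest become
    255 (white). Applied to a thermal image, a suitable t isolates a
    heated (or cooled) projection screen from its environment.
    """
    return [[0 if v < t else 255 for v in row] for row in image]
```

For example, `threshold([[10, 200], [90, 130]], 128)` yields `[[0, 255], [0, 255]]`.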
Visible Spectrum. The visible spectrum is the portion of the electromagnetic spectrum that is visible to the human eye. Electromagnetic radiation in this range of wavelengths is called visible light or simply light. A typical human eye will respond to wavelengths from about 390 to 700 nm.
Description of Example of Embodiments.
The present invention will be described with respect to particular embodiments and with reference to certain drawings but the invention is not limited thereto but only by the claims. The drawings described are only schematic and are non-limiting. In the drawings, the size of some of the elements may be exaggerated and not drawn on scale for illustrative purposes. Where the term comprising is used in the present description and claims, it does not exclude other elements or steps. Furthermore, the terms first, second, third and the like in the description and in the claims, are used for distinguishing between similar elements and not necessarily for describing a sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances and that the embodiments of the invention described herein are capable of operation in other sequences than described or illustrated herein.
At an instant t0, an image 11 of a sequence of images 14 is projected on a vortex ring 12 at a position P0 in front of a projector 13 and moving with a velocity V1 across the projection field of the projector 13 (not shown). The vortex can be used either as a front projection surface or as a back projection surface (see figure 2). The image 11 has fewer pixels than the light valve 15 of the projector 13. This is illustrated on figure 1 by superposing the pixels of image 11 on the light valve 15. In the example of figure 1, the image 11 fits in a rectangle N1 pixels wide and M1 pixels high.
At an instant t1, an image 11B of the sequence of images 14 must be projected on the vortex ring 12, which is now at a position P1 in front of projector 13, with P1 different from P0.
For each position of the vortex ring, the light rays corresponding to an image 11 and emitted by the projector 13 must impinge on the vortex ring.
A straightforward solution would be to mount the projector 13 on a motorized platform with at least 1 or 2 DOF, the position of the platform being continuously adapted as a function of the position of the vortex in order for the image 11 to be projected right on the vortex ring 12.
In one embodiment of the present invention, instead of modifying the position of the projector, it is the state of the light valve of the projector 13 that is adapted as a function of the position of the vortex 12, even if the image content of image 11 remains unchanged. By using different subsets of pixels of the light valve to project an image 11, the light rays associated with image 11 can sweep an area in front of the projector 13. The position of the vortex 12 can be determined by any of the methods described later or known to the art.
In a first example of embodiment, we assume that the image 11 to be projected on the vortex ring 12 fits in a rectangle 16 that is N1 pixels wide and M1 pixels high.
The pixel data (intensity, color) is known for each pixel at a position (x, y) in the rectangle 16, for each image 11 of the sequence of images 14.
The rectangle 16 has fewer pixels than the light valve 15 of the projector 13. The light valve or spatial light modulator 15 is N2 pixels wide and M2 pixels high.
The position of the vortex ring being known with respect to the projector, offset values Ox and Oy are determined. Ox and Oy are the coordinates of a pixel of the spatial light modulator 15 that will generate a light ray that intersects the vortex ring when it is at position P0. The image data of image 11 is offset by Ox and Oy before being sent to the spatial light modulator 15: the pixel with coordinates (0, 0) in rectangle 16 will correspond to pixel (Ox, Oy) on the light valve 15. The pixel with coordinates (N1-1, M1-1) in the rectangle 16 will correspond to pixel (Ox + N1-1, Oy + M1-1) on the light valve 15.
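The offsetting described above amounts to copying the N1 x M1 image rectangle into an otherwise dark N2 x M2 light-valve frame at (Ox, Oy). A minimal sketch in Python/NumPy (illustrative only and not part of the original disclosure; the function name, the array layout with rows as y, and the error handling are assumptions):

```python
import numpy as np

def place_image_on_light_valve(image, valve_shape, ox, oy):
    """Copy an (M1, N1) image into a zeroed (M2, N2) light-valve frame so
    that image pixel (0, 0) lands on light-valve pixel (ox, oy)."""
    m1, n1 = image.shape
    m2, n2 = valve_shape
    if ox + n1 > n2 or oy + m1 > m2:
        raise ValueError("offset image does not fit on the light valve")
    frame = np.zeros(valve_shape, dtype=image.dtype)
    frame[oy:oy + m1, ox:ox + n1] = image  # rows index y, columns index x
    return frame
```

With the figure 1 numbers, image pixel (N1-1, M1-1) then lands on valve pixel (Ox + N1-1, Oy + M1-1), as stated above.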
Figures 3A to 3C illustrate how a ray of light 31B can sweep the space in front of a projector 33 merely by changing the state of e.g. the micro-mirrors on a reflective spatial light modulator 35. For the sake of simplicity, the spatial light modulator 35 is only 16 pixels wide and 9 pixels high. An active pixel is represented as a hashed square on the spatial light modulator 35.
On figure 3A, the micro-mirror at position (0, 4) is active (all the others are in the off-state). The ray of light (or more precisely the bundle of light rays) 31 is reflected by the micro-mirror (0, 4) towards the projection optics 32 and exits the projector 33 as 31B.
On figure 3B, the micro-mirror at position (7, 4) is active. The ray of light (or more precisely the bundle of light rays) 31 is reflected by the micro-mirror (7, 4) towards the projection optics 32 and exits the projector 33 as 31B.
On figure 3C, the micro-mirror at position (15, 4) is active. The ray of light (or more precisely the bundle of light rays) 31 is reflected by the micro-mirror (15, 4) towards the projection optics 32 and exits the projector 33 as 31B.
Figures 4A and 4B illustrate how an image 41 can be projected on a moving vortex ring 42 without changing the position of the projector 44. We take as an example a still image to stress the fact that the state of the pixels of the spatial light modulator 45 is changed to keep projecting the still image 41 on the moving vortex ring regardless of the position of the vortex ring as it moves in front of the projector 44. As was the case for figures 3A to 3C and for the sake of simplicity, the spatial light modulator 45 is only 16 pixels wide and 9 pixels high. An active pixel (that is, a pixel that will project light on a screen) is represented as a hashed square on the spatial light modulator 45.
On figure 4A, the position P0 of the vortex ring 42 is determined by e.g. any of the methods described earlier in this application: a sensor 46 detects the position P0 of the vortex ring 42. Based on the position P0, offsets Ox and Oy are determined (by processing means integrated with the sensor 46 or by processing means not shown on figure 4A). The image data sent to the spatial light modulator concerning the image 41 is in a rectangle whose corners have coordinates (Ox, Oy), (Ox + 2, Oy), (Ox, Oy + 2) and (Ox + 2, Oy + 2). On figure 4A, Ox = 0 and Oy = 3.
On figure 4B, the vortex ring has moved and is at position P1. The sensor 46 detects the position P1 of the vortex ring 42. New offsets Ox' and Oy' are determined and the image data related to image 41 is applied to pixels of the spatial light modulator 45 in a rectangular region whose corners have coordinates (Ox', Oy'), (Ox' + 2, Oy'), (Ox', Oy' + 2) and (Ox' + 2, Oy' + 2).
The data representing the image 41 to be projected on the vortex ring 42 is used to drive a subset of micro-mirrors of the spatial light modulator 45. That subset of micro-mirrors is determined by the size of the image 41 (the number of pixels in the image data encoding the image 41) and the position of the vortex ring 42 with respect to the projector 44.
Instead of a rectangle, the image 41 can be delimited by e.g. a circle, an oval...
The procedure is similar whether the projector 44 uses e.g. 3 light valves, one for each color R, G and B or a single light valve (each color being projected sequentially). The data relative to the image 41 is split into R, G and B data and the offsets Ox and Oy are applied for the R, G and B image data respectively.
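Applying the same offsets Ox and Oy to each of the R, G and B planes, as described above, could be sketched as follows (an illustrative sketch, not part of the original disclosure; the function name and the NumPy channel layout are assumptions):

```python
import numpy as np

def place_channels(rgb_image, valve_shape, ox, oy):
    """Apply the same offsets Ox, Oy to the R, G and B planes of an
    (M1, N1, 3) image, producing one light-valve frame per color plane.
    The three frames go to three light valves, or sequentially to one."""
    m1, n1, _ = rgb_image.shape
    frames = []
    for c in range(3):  # c = 0, 1, 2 for R, G, B
        frame = np.zeros(valve_shape, dtype=rgb_image.dtype)
        frame[oy:oy + m1, ox:ox + n1] = rgb_image[:, :, c]
        frames.append(frame)
    return frames
```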
Figure 5 illustrates a possible issue imposing constraints on the sensor 46.
On figure 5, the vortex rings 50, 51 and 52 are moving in a direction perpendicular to the plane of the figure (which coincides with the xz plane of the right-handed reference system xyz seen on figure 5). For the sake of simplicity, we consider vortex rings 50, 51 and 52 moving in a direction parallel to the y axis of the reference system.
As illustrated on figure 5, at a given instant, vortex rings at different distances from the projector and the sensor will be seen at different angles by the sensor, while they can be reached by the same (bundle of) light ray(s) 410. Although the lines of sight 420A, 420B and 420C joining a point of each vortex ring to the sensor 46 differ as a function of the distance of the vortex ring to the sensor 46, the same light rays 410 will reach all three rings, and therefore the same offsets Ox and Oy must be used to determine which subset of pixels of the light valve must be activated to project a given image on the vortex rings 50, 51 and 52.
It appears therefore necessary to know the three coordinates x, y and z of a vortex ring in space to correctly determine the offsets Ox and Oy.
Figure 6 shows a particular case for which it may be sufficient to know only two coordinates of a vortex ring to determine the offsets Ox and Oy.
If the sensor 46 is close enough to the projector 44, the lines of sight 420A, 420B and 420C are almost identical when the vortex rings are far enough from the projector 44 and the sensor 46. The 2D images of the vortex rings 50, 51 and 52 will be almost indistinguishable from each other, and only two parameters (e.g. the x and y coordinates of a vortex ring as seen by the sensor 46) will determine which offsets Ox and Oy must be used to project on a given vortex ring. The same offsets will be valid for the vortex rings that are intersected by the same ray of light (e.g. 410 on figure 6) regardless of their distance to the projector.
Another possibility is to integrate the sensor 46 in the projector 44 as is the case in US20150219500A1 “Projection system with safety detection”.
Another possibility is to project on vortex rings that follow a known trajectory. For instance, the vortex ring generator can generate vortex rings that will move along a direction for which the z coordinate is known (as determined by the vortex ring generator configuration) and will be constant or almost constant (as in the example of figures 5 and 6). In such cases, by combining the measurements of the sensor 46 with the known z coordinate, the position of the vortex rings in space is known and the offsets Ox and Oy can be determined.
The offsets Ox and Oy can for instance be determined with a look-up table or by means of mapping functions Ox = f(x, y, z) and Oy = g(x, y, z), where Ox and Oy are the offsets to be used to project an image right onto the vortex ring and x, y, z are the coordinates of the vortex ring. Instead of look-up tables and analytical functions, fuzzy logic, trained neural networks or self-organizing neural networks can also be used.
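A nearest-neighbour look-up table standing in for the mapping functions f and g could be sketched as follows (illustrative only, not part of the original disclosure; the patent leaves the exact form of the table and functions open, and the nearest-neighbour rule is an assumption):

```python
def build_offset_lut(calibration):
    """calibration: list of ((x, y, z), (ox, oy)) pairs measured with a
    target. Returns a function mapping a vortex-ring position to the
    offsets recorded at the nearest calibrated position."""
    def lookup(x, y, z):
        # pick the calibration entry whose position is closest (squared
        # Euclidean distance) to the queried vortex-ring position
        best = min(calibration,
                   key=lambda e: (e[0][0] - x) ** 2
                               + (e[0][1] - y) ** 2
                               + (e[0][2] - z) ** 2)
        return best[1]
    return lookup
```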
If the sensor 46 is a time of flight camera, the three coordinates x, y and z of the vortex ring are known.
Simplifications are possible. If, for instance, the size of the image projected on the vortex ring is small compared to the dimensions of the vortex ring, the size of the look-up table can be reduced: the space in front of the projector can be divided into a number NB of zones, where NB is smaller than the number of pixels of the light valve of the projector 44 and NB is smaller than or equal to the number of pixels of the sensor 46 (when sensor 46 is a camera) or, in general, the number of positions that can be discriminated by the sensor 46.
Training a neural network or setting up a look-up table to map the coordinates of a vortex ring (or, more generally, of a screen that can occupy an arbitrary position in the field of projection of projector 44) to offsets can be done with targets that can be detected by the sensor 46.
A known (bundle of) ray(s) of light is projected by projector 44. The coordinates (i, j) of the pixel of the light valve that generates that ray of light are known. The target is positioned to intersect the ray of light and its coordinates are determined by the sensor 46 (either the sensor determines the x, y and z coordinates or, as discussed earlier, the z coordinate is imposed by design). The operation is repeated for several positions of the target.
For each position occupied by the target, the offsets Ox and Oy are the coordinates (i, j) of the pixel of the light valve that was used to generate the ray of light intersecting the target at said position.
One has thus established a look-up table that associates offsets Ox and Oy with positions of a vortex ring on which to project an image (or more generally a projection screen that can occupy any position within the maximum projection cone of projector 44).
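The calibration procedure above can be sketched as a loop over probed light-valve pixels (an illustrative sketch, not part of the original disclosure; `fire_pixel` and `measure_target` are hypothetical callbacks standing in for the projector and the sensor 46):

```python
def calibrate(fire_pixel, measure_target, pixels_to_probe):
    """For each probed light-valve pixel (i, j): light only that pixel,
    let the target intersect the resulting ray, measure the target
    position with the sensor, and record position -> (Ox, Oy) = (i, j)."""
    table = {}
    for (i, j) in pixels_to_probe:
        fire_pixel(i, j)        # projector activates only pixel (i, j)
        pos = measure_target()  # sensor reads the target coordinates
        table[pos] = (i, j)     # offsets for a screen at this position
    return table
```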
The look-up table can also be used to train a neural network. This can be useful to limit the number of positions (or training points) at which a target has to be placed. The trained neural network will interpolate the values of the offsets Ox and Oy between the training points.
The offsets Ox and Oy can be applied differently as a function of the reference point (the point of the projection screen that is used to determine the coordinates of the projection screen).
If the coordinates (x, y, z) correspond to a corner of the projection screen, the offsets will be applied to the image in a similar fashion as in the example of figures 4A and 4B.
If the coordinates (x, y, z) correspond to the center of the vortex ring (or more generally the projection screen) and if one wants the image to be centered, the offsets will be applied to the center pixel of the image. In the case of the image 41 on figure 4A, this means that the pixel at the center of image 41 would correspond to the pixel (0, 3) instead of the lower left corner pixel of image 41. In the case of the image 41 on figure 4B, this means that the pixel at the center of image 41 would correspond to the pixel (7, 3) instead of the lower left corner pixel of image 41. In other words, the subset of pixels will be different as a function of the point of reference on the projection screen and the desired relative position of the image on the projection screen.
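Converting a center-referenced offset into the offset of the image's lower-left pixel, as discussed above, could be sketched as follows (illustrative only; the integer halving for odd-sized images and the parameter names are assumptions):

```python
def offsets_for_reference(ox, oy, n1, m1, reference="corner"):
    """Return the light-valve offset of the image's lower-left pixel.
    'corner': (ox, oy) already addresses the lower-left image pixel.
    'center': (ox, oy) addresses the center pixel of the N1 x M1 image,
    so the lower-left pixel is shifted back by half the image size."""
    if reference == "corner":
        return ox, oy
    if reference == "center":
        return ox - n1 // 2, oy - m1 // 2
    raise ValueError("unknown reference point: %s" % reference)
```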
Instead of a single sensor 46, two sensors 46A and 46B (that can be identical) can be used to easily measure the position of a projection screen.
The first sensor 46A can for instance be positioned as was the case on figures 5 and 6, and the second sensor 46B can be positioned with its optical axis perpendicular to that of sensor 46A, as seen on figure 7.
Two or more projectors can be used to project image data on a vortex ring. The two projectors can be synchronized (synchronization of the projectors can be done according to a method disclosed in US20160119507A1 "SYNCHRONIZED MEDIA SERVERS AND PROJECTORS" for instance). It can be interesting to use two or more projectors projecting different perspectives of an object on a travelling vortex ring used as back projection screen and filled with water droplets of a size adequate for Mie scattering. Thanks to the Mie scattering, the images being projected will not be visible at the same time from a fixed position. The vortex ring can also be filled with water vapour.
The same sensor 46, or sensors 46A and 46B, can be used for the two or more projectors, but the mapping functions, the look-up table or the neural network must be tuned for each projector.
It is another purpose of the invention to determine the position of the travelling projection screen. The travelling projection screen can be used in conjunction with a larger projection screen. When images are projected on both projection screens, it may be difficult to detect the travelling projection screen easily and without contact.
In an aspect of the invention, the temperature of the travelling projection screen is modified to allow its detection by an infra-red sensor. The infrared sensor can be an infrared camera.
If the travelling projection screen is solid (e.g. a fabric stretched on a frame), heater resistors can be integrated in the frame. By dissipating more heat than surrounding objects, the travelling projection screen will be easier to detect by the infra-red sensor.
If the infra-red sensor is an infrared camera, the pixel values of pixels corresponding to (part) of the travelling projection screen will be higher than for other objects in the field of view of the infrared camera. This may enhance the identification of the travelling projection screen on the image formed by the infrared camera after thresholding.
Alternatively, the travelling projection screen can be cooled to a temperature lower than the temperature of other objects in the field of view of the infrared sensor.
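Locating a heated screen on a thresholded infrared image, as described above, could be sketched as follows (illustrative only, not part of the original disclosure; the centroid criterion is an assumption, since the text only mentions thresholding; for a cooled screen the comparison would be inverted):

```python
import numpy as np

def locate_heated_screen(ir_image, threshold):
    """Threshold an infrared camera image and return the centroid
    (row, col) of the pixels hotter than the surroundings, taken as the
    position of the heated travelling projection screen, or None if no
    pixel exceeds the threshold."""
    mask = ir_image > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```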
If the travelling projection screen is not solid, for instance if it is a vortex ring filled with fog, smoke, water droplets ... the gas and/or the fog, smoke or water particles that will be used to form the vortex ring can be heated or cooled in the vortex ring generator.
A vortex ring generator is described in e.g. US 20130214054 “Generator apparatus for producing vortex rings entrained with charged particles” where the material contained in a reservoir is distributed in a controlled fashion to form a vortex ring containing said material.
The vortex ring generator described in US 20130214054 can be modified for the purpose of the invention by adding a cooling and/or heating element to bring the material contained in the reservoir to a desired temperature. To facilitate the detection of the vortex ring transporting the material (like smoke, fog or water droplets), the temperature of the material in the reservoir is set higher or lower than the temperature of the objects that are in the field of view of the sensor 46 or sensors like 46A and 46B. This can be particularly relevant when the vortex ring is used as a back projection screen. Indeed, in that case, the vortex ring will be between the sensor and the audience (human beings at a temperature of approximately 37 degrees C).
In some instances, it may be desirable to let a temporary projection screen appear for a limited amount of time anywhere in front of a larger projection screen without disturbing the projection (noise of motors used to deploy the temporary projection screen, support structure that may cast shadows on the large projection screen).
Vortex rings may offer a solution. It is known in the art that when two vortex rings collide head-on, the material contained in the vortex rings will spread across an expanding disc that is perpendicular to the direction along which the vortices were travelling. This is illustrated on figures 8, 9 and 10.
On figure 8, a first vortex ring 82A moves with a velocity V1 towards a second vortex ring 82B, which moves towards 82A with a velocity V2 that is close to or equal to -V1. Both vortices are moving along the direction D. The vortices 82A and 82B have similar dimensions (i.e. their dimensions vary by less than 50%, preferably less than 25% and more preferably less than 10%).
Figure 9 shows the vortices right before their collision.
Figure 10 shows the result of the collision. The material that was in the vortex rings is spreading across a disc 102. The disc expands in a plane perpendicular to the direction D, but the velocity along the direction D of the material that was in the vortex rings is less than the velocity of either of the vortices 82A and 82B. When the vortices that collide are identical, the velocity of the material forming the disc 102 along the direction D is zero or close to zero.
On figure 10, the arrows pointing outwards from the center of the disc represent the velocity of the material (fog, smoke, water droplets ...) in different parts of the disc. The disc expands and will ultimately dissipate. Dissipation can nevertheless take too long, in which case the same or another vortex ring generator can be used to accelerate the dissipation of the disc.
Shooting one or more vortex rings into the disc 102 will accelerate its dissipation. The vortex ring that is used to hasten dissipation of the disc 102 can be a vortex ring charged with ionized material. An example of a vortex ring generator where the vortex ring carries ionized gas or particles is described in US 20130214054. Using a vortex ring carrying ionized gas is advantageous when the material of the disc is smoke. When the vortex ring approaches and collides with the disc 102, electrical charges will be transferred to the smoke particles, which will be drawn to the objects in the vicinity, thereby accelerating dissipation of the disc 102.
In some instances, the vortex rings 82A and 82B should be invisible, yet their collision should produce a disc that can be used as a front or rear projection screen.
Let us take a first vortex ring 82A carrying a gas (e.g. air) at a temperature T1 and a second vortex ring 82B carrying humid air at a temperature T2, with T2 > T1. During a head-on collision between vortex rings 82A and 82B, the material they carry will mix. The temperature of the humid air that was carried by vortex ring 82B will decrease and the humidity will condense, forming water droplets that can be used as a projection screen.
Vortex rings carrying water droplets are less likely to interfere with other projectors, in particular when the size of the water droplets is comparable to the wavelengths of the light used for projection. Indeed, in Mie scattering, the light is diffused preferentially in the direction opposite to the projector. The consequence of this is that if a vortex ring carrying water droplets passes in between a large projection screen and the projector projecting on the large projection screen, little of the light striking the vortex ring is scattered back towards the audience, so the vortex ring causes little visible disturbance of the projection on the large screen.
A method similar to the previous one can be used to project images at the same time on a large screen and a vortex ring with the same projector.
In that case, the audience is facing the large projection screen and the vortex ring, with the projector behind the audience. The vortex ring must carry a material that allows front projection.
The projector receives at least two streams of image data: a first image stream to be projected on the large projection screen and a second image stream to be projected on the vortex ring. As was the case previously, a subset of pixels of the light valve is used to project the second image stream on the vortex ring and the remaining pixels of the light valve are used to project part of the first image stream on the large projection screen.
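Splitting the light valve between the two image streams, as described above, could be sketched as follows (illustrative only, not part of the original disclosure; the simple rectangular overlay and the function name are assumptions):

```python
import numpy as np

def compose_frame(large_screen_image, vortex_image, ox, oy):
    """Overlay the small vortex-ring image at (ox, oy) on the frame for
    the large projection screen: those light-valve pixels carry the
    second stream, all remaining pixels keep carrying the first stream."""
    frame = large_screen_image.copy()
    m1, n1 = vortex_image.shape
    frame[oy:oy + m1, ox:ox + n1] = vortex_image
    return frame
```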


Claims
1. A display system comprising a first projection screen whose position is susceptible to change during projection, and a digital projector with a light valve to project images on the first projection screen, the display system being characterized in that it comprises means to determine the position of the projection screen; in that the image to be projected on the projection screen has fewer pixels than the number of pixels of the light valve of the projector; and in that the subset of pixels of the light valve used to project the image on the projection screen is a function of the determined position of said projection screen.
2. A display system according to claim 1, further characterized in that the means to determine the position of the projection screen is an optical sensor.
3. A display system according to any of the preceding claims, further characterized in that the means to determine the position of the projection screen is an optical sensor that can detect wavelengths below 390 nm and above 700 nm.
4. A display system according to any of the preceding claims, further characterized in that at least one of a heater element or a cooler element changes the temperature of the projection screen.
5. A display system according to any of the preceding claims, further characterized in that the means to determine the position of the projection screen use markers projected on the projection screen.
6. A display system according to any of the preceding claims, further characterized in that at least one of a look-up table, a mapping function and/or a neural network and/or artificial intelligence and/or machine learning method is used to determine the subset of pixels of the light valve.
7. A display system according to any of the preceding claims, further characterized in that a filter estimates the position of the projection screen.
8. A display system according to any of the preceding claims, further characterized in that a Kalman filter estimates the position of the projection screen.
9. A display system according to any of the preceding claims, further characterized in that the pixels of the light valve not used to project an image on the first projection screen are used to project an image on a second projection screen.
10. A display system according to any of the preceding claims, further characterized in that the distance between the objective of the projector and the projection screen is at least 10 times or more the distance between the sensor and the objective.
11. A display system according to any of the preceding claims, further characterized in that the distance between the projector objective and the projection screen is at least 100 times or more the distance between the sensor and the objective.
12. A display system according to any of the preceding claims, further characterized in that the distance between the projector objective and the projection screen is at least 1000 times or more the distance between the sensor and the objective.
13. A display system according to any of the preceding claims, further characterized in that the projection screen is a vortex ring transporting a material to form an image when lit by the projector, the vortex ring being generated by a first vortex ring generator.
14. A display system according to claim 13, further characterized in that the projection screen is a vortex ring transporting at least one of smoke, water vapor, water droplets, dust, and humid air.
15. A display system according to claim 13 or 14, further characterized in that the first vortex ring generator and a second vortex ring generator acting together generate vortex rings that collide with one another to generate a projection screen having different speed, geometry and dimensions than the vortex rings generated by the first and second vortex ring generators.
16. A method of operating a display system comprising a first projection screen whose position is susceptible to change during projection, and a digital projector with a light valve to project images on the first projection screen, the method comprising: determining the position of the projection screen; wherein an image to be projected on the projection screen has fewer pixels than the number of pixels of the light valve of the projector; and wherein the subset of pixels of the light valve used to project the image on the projection screen is a function of the determined position of said projection screen.
17. A method according to claim 16, wherein determining the position of the projection screen is done with an optical sensor.
18. A method according to claim 16 or 17, wherein determining the position of the projection screen is done with an optical sensor that can detect wavelengths below 390 nm and above 700 nm.
19. A method according to any of the claims 16 to 18, further comprising changing the temperature of the projection screen.
20. A method according to any of the claims 16 to 19, wherein determining the position of the projection screen uses markers projected on the projection screen.
21. A method according to any of the claims 16 to 20, wherein determining a subset of pixels of the light valve is done with at least one of a look-up table, a mapping function and/or a neural network, artificial intelligence or machine learning methods.
22. A method according to any of the claims 16 to 21, comprising a filter to obtain estimates of the position of the projection screen.
23. A method according to claim 22, wherein the filter is a Kalman filter for estimating the position of the projection screen.
24. A method according to any of the claims 16 to 23, wherein pixels of the light valve not used to project an image on the first projection screen are used to project an image on a second projection screen.
25. A method according to any of claims 16 to 24, wherein the distance between the objective of the projector and the projection screen is set to at least 10 times or more the distance between the sensor and the objective.
26. A method according to any of claims 16 to 25, wherein the distance between the projector objective and the projection screen is set to at least 100 times or more the distance between the sensor and the objective.
27. A method according to any of the claims 16 to 26, wherein the distance between the projector objective and the projection screen is set to at least 1000 times or more the distance between the sensor and the objective.
28. A method according to any of the claims 16 to 27, wherein the projection screen is a vortex ring transporting a material to form an image when lit by the projector, the vortex ring being generated by a first vortex ring generator.
29. A method according to claim 28, wherein the projection screen is a vortex ring transporting at least one of smoke, water vapor, water droplets, dust, and humid air.
30. A method according to claim 28 or 29, wherein the first vortex ring generator and a second vortex ring generator acting together generate vortex rings that collide with one another to generate a projection screen having different speed, geometry and dimensions than the vortex rings generated by the first and second vortex ring generators.
