WO2019207008A1 - Display apparatus, controller therefor and method of controlling the same - Google Patents


Info

Publication number
WO2019207008A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
user
display
image
position information
Prior art date
Application number
PCT/EP2019/060536
Other languages
French (fr)
Inventor
Ravinder DAHIYA
William Ringal Taube NAVARAJ
Original Assignee
University Court Of The University Of Glasgow
Priority date
Filing date
Publication date
Application filed by University Court Of The University Of Glasgow filed Critical University Court Of The University Of Glasgow
Publication of WO2019207008A1 publication Critical patent/WO2019207008A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This invention relates to a display apparatus, a controller for a display apparatus and a method of controlling a display apparatus. The display apparatus includes a display arranged to display a source image; at least one screen arranged in front of said display such that the projection of the image and its reflection by the screen causes a holographic image of a three-dimensional object to be displayed to a user viewing said screen from a position that is neither perpendicular to nor parallel to the plane of said screen; a position detection apparatus for detecting the position of a part of a user's body in proximity to said holographic image or to said screen and to generate position information relating to said detected position; an air source arranged to direct air to one or more locations in proximity to said screen; and a controller arranged to receive position information from the position detection apparatus and to control the air source so as to direct air in accordance with said received position information so as to provide sensory feedback to the detected part of the user's body. In this manner a user can be enabled to manipulate a holographic image using their hands or other body parts and to receive sensory feedback, particularly touch, hardness and/or temperature feedback, from their interaction.

Description

DISPLAY APPARATUS, CONTROLLER THEREFOR AND METHOD OF CONTROLLING THE SAME
Field of the Invention
The present invention relates to a display apparatus, a controller therefor and methods of controlling a display apparatus. It is particularly, but not exclusively, concerned with a display apparatus for displaying a holographic and/or pseudo-holographic image and detecting interaction of a user with that image and providing sensory feedback.
Background of the Invention
Interaction in the real 3D physical world involves 3D vision, touch and tactile feedback. There are various technologies for bringing some of these experiences into the digital world, each having its own advantages and disadvantages. An interactive system which could bring these experiences together, in an effective way, attracts great interest in multiple sectors, including the multi-billion-dollar entertainment industry, education, medicine, engineering, public demonstration, security, next-generation smart virtual assistance and avatars (telepresence).
In particular, a number of technologies exist for making holographic images. However, bringing these interactive experiences together in a system currently involves significant challenges in terms of technology complexity, cost, implementation and safety.
One of the most successful methods for making true 3D holograms in mid-air is based on Laser Plasma Technology [1]. However, safety issues associated with the use of high-power lasers, the harmful ionization they cause in mid-air, and the need to wear protective goggles can all affect user acceptance.
An interactive system which could bring holo-experience vision with touch and tactile feedback experiences together, in an effective way, finds great interest in multiple sectors including the entertainment industry, education, medical, engineering, public demonstration, security and avatar (telepresence) etc. However, bringing these interactive experiences together in a system involves significant challenges in terms of technology complexity, cost, implementation and safety.
An object of the present invention is to provide a display apparatus which can display a three-dimensional reproduction of an object and provide sensory feedback to a user who interacts with that three-dimensional reproduction. A further object of the present invention is to improve the production of a three-dimensional reproduction of an object and a user’s experience of and interaction with such reproductions.
Summary of the Invention
At their broadest, aspects of the present invention provide display apparatuses, controllers for display apparatuses and methods of controlling display apparatuses which generate holographic or pseudo-holographic images of an object and provide sensory feedback to a user who interacts with that object.
For the avoidance of doubt, the term “hologram” is used throughout to refer to both true holograms and to pseudo-holograms, and related terms (e.g. “holographic”) should be interpreted in the same manner.
A first aspect of the present invention provides a display apparatus, the display apparatus including: a display arranged to display a source image; at least one screen arranged in front of said display such that the projection of the image and its reflection by the screen causes a holographic image of a three-dimensional object to be displayed to a user viewing said screen from a position that is neither perpendicular to nor parallel to the plane of said screen; a position detection apparatus for detecting the position of a part of a user’s body in proximity to said holographic image or to said screen and to generate position information relating to said detected position; an air source arranged to direct air to one or more locations in proximity to said screen; and a controller arranged to receive position information from the position detection apparatus and to control the air source so as to direct air in accordance with said received position information so as to provide sensory feedback to the detected part of the user’s body.
In the display apparatus according to this aspect of the invention, the source image (which may be a planar image, or itself a hologram) can be reflected on one or many semi-reflective screen(s) to create a floating 3D image perception to the user from screens on one or multiple side(s), creating a holographic or pseudo-holographic effect.
In particular configurations the source image is displayed on a first surface (such as the display panel of an LCD/LED, or a stack of such display panels to create a segmented tensor display, or other display or a projection surface). Optionally, there may be provided an additional segmented optical overlay to scatter the 2D pixels to 3D voxels to offer depth perception and motion parallax. In particular configurations the screen or screens may be arranged such that they are neither perpendicular to nor parallel to the direction of light from said source image incident on said screen(s).
The display apparatus according to this aspect allows a user to view a three-dimensional image of an object and interact with that object whilst receiving sensory feedback. The sensory feedback may include, for example, sensations of touch or hardness based on the air source directing air to the location of the part of the user’s body which is “interacting” with the holographic object, thereby creating pressure on the user’s body akin to touch. A sensation of hardness can be recreated depending on the quantity and velocity of air directed to the user’s body and therefore the ease with which the user can move against the air.
In the configurations of the invention described further below and in the embodiments of the invention, it will be assumed that the display apparatus is configured such that the display and first screen are located at the top of the apparatus with the screen(s) below them.
References to below/above/side are to be construed accordingly. However, it will be readily appreciated that there are no particular limits on the orientation of the display apparatus and so this preferred orientation can be rotated about any axis without affecting the nature of the display apparatus or the user’s interaction with it.
Preferably there are a plurality of screens arranged in a pyramidal or partial pyramidal arrangement and the display and the screens are configured to cause a mutually-consistent holographic image of a three-dimensional object to be displayed when the apparatus is viewed by a user from a plurality of directions. Where there is a plurality of screens, the screens are preferably arranged so that each screen is directly opposed to another of the screens such that the two screens are identical and opposite across the position where the holographic image appears. This can prevent or reduce the extent to which reflections from other screens appear in the screen closest to the user.
Using a multi-sided pyramid arrangement of screens can allow different views of the object to be displayed to viewers from different directions. There may for example be provided different information tags on different sides. This can allow a user (or multiple users) to view the holographic image of the object from a number of different directions. Where there is a plurality of screens, the source image displayed may be a composite image made up of the individual images to be displayed on each of the screens arranged in front of the source image. In particular embodiments, the screens are arranged based on a square-based pyramid structure, thereby allowing the source image to be made up from orthogonal views of the object. In certain embodiments, there are three screens which are arranged to form three of the sloping sides of a square-based pyramid, such that the fourth sloping side is left open, thereby allowing a user easy access to the interior of the pyramid so that they can interact with the holographic image as if it were a real object.
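The composition of a source image from per-screen views can be sketched as follows. The layout below (four views arranged around a blank centre of a square canvas, one per sloping side) and the per-view rotations are illustrative assumptions, not a layout mandated by the apparatus; the correct rotations depend on the physical orientation of the display and screens.

```python
import numpy as np

def compose_pyramid_source(views, side, channels=3):
    """Compose one square source image from up to four per-screen views.

    views: dict mapping 'front'/'back'/'left'/'right' (assumed names) to
    (side, side, channels) arrays.  Each view is rotated so that, after
    reflection in its sloping screen, the object appears upright to a
    viewer on that side.
    """
    canvas = np.zeros((3 * side, 3 * side, channels), dtype=np.uint8)
    if 'front' in views:   # below the centre, nearest the viewer
        canvas[2 * side:, side:2 * side] = views['front']
    if 'back' in views:    # above the centre, rotated 180 degrees
        canvas[:side, side:2 * side] = np.rot90(views['back'], 2)
    if 'left' in views:
        canvas[side:2 * side, :side] = np.rot90(views['left'], -1)
    if 'right' in views:
        canvas[side:2 * side, 2 * side:] = np.rot90(views['right'], 1)
    return canvas
```

The three-screen embodiment described above is supported by simply omitting one of the four views, leaving that side of the canvas blank.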
The air source preferably includes a plurality of air source elements, each arranged to direct air to a location in proximity to said screen, and the controller in such embodiments is arranged to activate one or more of said air source elements depending on the position information. The plurality of air source elements may be arranged in one or more arrays, for example above or below the screen(s). Where the elements are arranged in an array, each element may be configured to primarily direct air to a particular location (or, more accurately, along a particular column) in or around the apparatus. Thus, for sensory feedback to a part of a user’s body in a particular position, one of the elements may be activated. Depending on the dimensions of the apparatus and the configuration of the air source elements, diffusion of the ejected air between the element and the part of the user’s body which is to experience the sensory feedback may mean that, in certain situations, several elements are activated at the same time in order to provide the desired sensory feedback from the cumulative effect of the air from all of the activated elements. In certain embodiments a plurality of air source elements is arranged such that a first group of said air source elements are arranged to direct air in a direction opposite or substantially opposite to the direction that a second group of said air source elements are arranged to direct air. For example, there may be two arrays of air source elements positioned above and below the screens, and/or arrays to the side of the screens (although such an arrangement would only be practical if there were no further screens arranged to allow viewing from that side). Such arrangements can allow the feeling of touch to be generated on the appropriate side of the user’s body part that is interacting with the displayed object.
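The element-selection step described above can be sketched as follows. Since each element primarily directs air along a fixed column, selection reduces to comparing the x-y footprint of the detected body part against the element positions; the diffusion-radius model and function names are assumptions for illustration.

```python
import numpy as np

def select_air_elements(finger_pos, element_grid, spread_radius):
    """Return indices of air source elements to activate for one detected point.

    finger_pos: (x, y, z) of the tracked body part.
    element_grid: (N, 3) array of element positions; each element directs
    air along a fixed column, so only the x-y footprint is compared.
    spread_radius: how far, in the x-y plane, an element's jet is assumed
    to have diffused by the time it reaches the interaction region.
    Several elements may fall within the radius and be fired together, so
    the desired feedback arises from their cumulative effect.
    """
    dx = element_grid[:, 0] - finger_pos[0]
    dy = element_grid[:, 1] - finger_pos[1]
    in_range = np.hypot(dx, dy) <= spread_radius
    return np.nonzero(in_range)[0]
```

For the opposed-group arrangement, the same selection could be run separately over the upper and lower arrays, firing whichever group lies on the appropriate side of the detected body part.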
In certain embodiments, the air source is arranged to direct air of different temperatures and the controller is arranged to control the temperature of air directed by the air source. This can provide “hot” or “cold” sensory feedback to the user. The selection of whether hot or cold air (or something in between) is provided may be determined by which part of the displayed object the user is interacting with and/or by the nature of the object being interacted with. The provision of air of different temperatures can allow an enhanced level of realism to be associated with the sensory feedback as the user will “feel” the interaction at the expected temperature for the object being displayed.
The control of the temperature of the air from the air source may be achieved by supplying the air source with air of different temperatures, or by use of fast-acting heating or cooling coils within the air source.
In certain embodiments the controller is arranged to receive information about said object being displayed and further is arranged to: determine, from said position information and the information about said object, whether the part of the user’s body is positioned proximate to or in the same three-dimensional position as a part of the object being displayed or not; and if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object, control the air source to direct air to said position to cause a sensation of touch, hardness and/or temperature to the user.
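The determination described above can be sketched as a proximity test between the detected body-part position and sample points on the displayed object's surface. The threshold, the command fields, and the flow-to-hardness mapping are all illustrative assumptions rather than values from the patent.

```python
import numpy as np

AIR_ON_DISTANCE = 0.01   # metres; assumed threshold for "proximate"

def touch_feedback_command(finger_pos, object_points, object_temp_c=None):
    """Decide whether to fire the air source for one detected position.

    object_points: (N, 3) sample points on the surface of the displayed
    object, in the same coordinate frame as the position detection
    apparatus.  Returns None (no feedback) or a dict describing the
    required air jet.
    """
    dists = np.linalg.norm(object_points - np.asarray(finger_pos), axis=1)
    nearest = dists.min()
    if nearest > AIR_ON_DISTANCE:
        return None
    command = {
        'target': tuple(finger_pos),
        # Firmer feedback the closer the finger sits to the surface,
        # recreating a sensation of hardness via flow rate.
        'flow': float(np.clip(1.0 - nearest / AIR_ON_DISTANCE, 0.2, 1.0)),
    }
    if object_temp_c is not None:
        command['air_temp_c'] = object_temp_c   # hot/cold feedback
    return command
```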
In other embodiments the controller is arranged to receive information about said object being displayed and further is arranged to: determine, from said position information and the information about said object, whether the part of the user’s body is in proximity to or touching the screen in a position corresponding or proximate to the position where a part of the object being displayed appears on the screen; and if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object appearing on the screen, control the air source to direct air to the position of the part of the user’s body to cause a sensation of touch, hardness and/or temperature to the user.
Any known position detection approach can be used for the position detection apparatus.
In certain embodiments the position detection apparatus includes at least one infrared emitter arranged at an edge of the screen to transmit infrared radiation into said screen such that it is substantially totally internally reflected within said screen, and a detector arranged to detect infrared radiation which is emitted from said screen as a result of interaction by a user interfering with said total internal reflection.
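In such a frustrated-total-internal-reflection arrangement, a touch appears to the detector as a bright IR blob where the finger contacts the screen. A minimal detection pass might look like the following; the threshold and minimum blob size are assumptions to be calibrated per apparatus, and a production system would use an optimized connected-component routine.

```python
import numpy as np

def detect_touch_points(ir_frame, threshold=200, min_pixels=9):
    """Locate touches from an IR camera frame in an FTIR setup.

    Where a finger contacts the screen it frustrates the total internal
    reflection and IR light escapes, appearing as a bright blob.  This is
    a simple 4-connected flood fill over thresholded pixels; it returns
    the (row, col) centroid of each sufficiently large blob.
    """
    mask = ir_frame >= threshold
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1
        stack = [seed]
        while stack:
            r, c = stack.pop()
            if not (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]):
                continue
            if not mask[r, c] or labels[r, c]:
                continue
            labels[r, c] = current
            stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    centroids = []
    for lab in range(1, current + 1):
        rows, cols = np.nonzero(labels == lab)
        if rows.size >= min_pixels:   # reject single-pixel noise
            centroids.append((rows.mean(), cols.mean()))
    return centroids
```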
In certain embodiments the position detection apparatus includes at least one infrared emitter arranged to emit infrared radiation to positions where the image of said object appears to a user, and a detector arranged to detect infrared radiation resulting from the interaction of the part of the user’s body with infrared radiation emitted from said emitter.
Preferably the screen(s) of the display apparatus are specifically adapted for use in the display apparatus. Preferably the material of the screen and the relationship of the screen to the source image (i.e. the angle between the plane of the screen and a direction of light travel from the source image) are such that between 50-75% of the visible light incident from the source image is reflected by the screen. The user’s experience with the screen(s) will be significantly improved by screens with higher reflectance and this can also allow the apparatus to be used more effectively in higher ambient light conditions.
More preferably the material of the screen and the positioning of the screen are such that at least 20% of the visible light incident on the screen in a direction perpendicular to the direction of light travel from the source image to the screen is transmitted through the screen. This level of transmittance allows the screen to have a “see-through” effect.
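Coating reflectance of the kind plotted in Figure 3 can be modelled with the standard single-layer thin-film (Fresnel/Airy) formula, letting a designer sweep refractive index and thickness toward a reflectance target. The example below is a sketch at normal incidence with textbook values (a quarter-wave MgF2 layer on glass); the specific indices and thicknesses are illustrative, not values from the patent, and a real semi-reflective screen coating would typically use a multi-layer stack.

```python
import cmath
import math

def thin_film_reflectance(n1, n2, n3, d_nm, wavelength_nm):
    """Normal-incidence reflectance of a single coating layer.

    A layer of index n2 and thickness d_nm sits between an ambient
    medium (n1) and a substrate (n3).  Interface amplitudes follow the
    Fresnel formula and interfere through the film's phase thickness.
    """
    r12 = (n1 - n2) / (n1 + n2)
    r23 = (n2 - n3) / (n2 + n3)
    beta = 2 * math.pi * n2 * d_nm / wavelength_nm   # phase thickness
    phase = cmath.exp(-2j * beta)
    r = (r12 + r23 * phase) / (1 + r12 * r23 * phase)
    return abs(r) ** 2

# Quarter-wave MgF2 (n = 1.38) on glass (n = 1.52) at 550 nm: reflectance
# drops well below that of bare glass.
R = thin_film_reflectance(1.0, 1.38, 1.52, 550 / (4 * 1.38), 550)
```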
The three-dimensional configuration of the displayed holographic reflection may be enhanced by characteristics of the display. For example, in some embodiments the display is a segmented lenticular display. In other embodiments the display is a segmented stereoscopic display. Such displays allow three dimensional characteristics of an image to be displayed either with or without the aid of additional viewing devices such as glasses.
Preferably the display has a narrow viewing cone such that a user viewing the displayed object is unable to view the image on the source display. It can be distracting for a user viewing the holographic image to also be able to see the source image(s), particularly where there is a plurality of source images intended for display on different screens of the apparatus. By providing a display with a narrow viewing cone, a user located to one side of the apparatus and viewing the holographic image can be prevented from also seeing the source image.
One manner in which the display may be configured to provide a narrow viewing cone is to have a micro-louvered surface on the display. Where there is a plurality of screens, the configuration of such micro-louvered surface may be adapted to the arrangement of the screens.
Preferably the display apparatus includes a further controller, wherein the further controller is arranged to: receive the position information and to detect changes in the position information; and adjust the source image based on the position information or change in position information, such that a user is able to manipulate and/or move the holographic image of the object using a part of their body. The further controller may form part of, or consist of the same physical elements as, the controller of the display apparatus. In such arrangements, the user can interact with the object displayed in the holographic image whilst receiving sensory feedback from their interactions such as touch, temperature, pressure and/or hardness based on the nature of their interaction with the object.
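One such manipulation can be sketched as mapping a horizontal drag of the tracked body part to a rotation of the displayed object before the per-screen views are re-rendered. The sensitivity constant, function names, and the yaw-only mapping are illustrative assumptions; a full controller would also map, for example, pinch distance to scale.

```python
import numpy as np

DRAG_TO_RADIANS = 5.0   # sensitivity; an assumed tuning constant

def update_object_yaw(prev_pos, new_pos, yaw):
    """Turn a horizontal drag of the tracked body part into a yaw change,
    so the user can 'spin' the hologram by hand."""
    dx = new_pos[0] - prev_pos[0]
    return (yaw + DRAG_TO_RADIANS * dx) % (2 * np.pi)

def rotate_view(points, yaw):
    """Apply the yaw to the object's 3D points (rotation about the
    vertical axis) before re-rendering each screen's view."""
    c, s = np.cos(yaw), np.sin(yaw)
    rot = np.array([[c, -s, 0.0],
                    [s, c, 0.0],
                    [0.0, 0.0, 1.0]])
    return points @ rot.T
```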
The manipulation of the object/image may be based on the detection of touch, touch-gesture or 3D gesture movements, or any combination thereof.
The apparatus of the present aspect may include any combination of some, all or none of the above described preferred and optional features.
A second aspect of the present invention provides a controller for a display apparatus, the display apparatus having: a display arranged to display a source image; at least one screen arranged in front of said display such that the projection of the image and its reflection by the screen causes a holographic image of a three-dimensional object to be displayed to a user viewing said screen from a position that is neither perpendicular to nor parallel to the plane of said screen; a position detection apparatus for detecting the position of a part of a user’s body in proximity to said holographic image or to said screen and to generate position information relating to said detected position; and an air source arranged to direct air to one or more locations in proximity to said screen, wherein the controller is arranged to: receive position information from the position detection apparatus and to control the air source so as to direct air in accordance with said received position information so as to provide sensory feedback to the detected part of the user’s body.
The controller of this aspect preferably works with a display apparatus in which the source image (which may be a planar image, or itself a hologram) created by said apparatus can be reflected on one or many semi-reflective screen(s) to create a floating 3D image perception to the user from slanted screens on one or multiple side(s), creating a holographic effect.
In particular configurations the source image is displayed on a first surface (such as the display panel of LCD(s)/LED(s), including the possibility of a stack of such display panels, or other display or a projection surface).
In particular configurations the screen or screens may be arranged such that they are neither perpendicular to nor parallel to the direction of light incident from said source image on the screen(s).
The controller according to this aspect allows a user to view a three-dimensional image of an object and interact with that object whilst receiving sensory feedback. The sensory feedback may include, for example, sensations of touch or hardness based on the air source directing air to the location of the part of the user’s body which is “interacting” with the holographic object, thereby creating pressure on the user’s body akin to touch. A sensation of hardness can be recreated as a function of the quantity and velocity of air directed to the user’s body and therefore the ease with which the user can move against the air.
Preferably the controller is further arranged to configure, from data regarding the three-dimensional structure of the object, a source image such that projection of the image and its reflection by the screen or screens causes the pseudo-holographic image of the object to be displayed. This may involve producing multiple representations of the object, one for each screen wherein each representation presents a different view of the object due to the different angle from which the object will be viewed on that screen. This may also involve further image manipulation, for example to create segmented lenticular images or segmented stereoscopic images on the screen to enhance the three-dimensional reproduction of the image of the object.
In certain embodiments the controller is further arranged to: determine, from said position information and the data regarding the three-dimensional structure of the object, whether the part of the user’s body is positioned proximate to or in the same three-dimensional position as a part of the object being displayed or not; and if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object, control the air source to direct air to said position to cause a sensation of touch, hardness and/or temperature to the user.
In certain embodiments the controller is further arranged to: determine, from said position information and the data regarding the three-dimensional structure of the object, whether the part of the user’s body is in proximity to or touching the screen in a position corresponding to or proximate to the position where a part of the object being displayed appears on the screen; and if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object appearing on the screen, control the air source to direct air to the position of the part of the user’s body to cause a sensation of touch, hardness and/or temperature to the user.
Preferably the controller is further arranged to: receive the position information and to detect changes in the position information; adjust the source image based on the position information or change in position information, such that a user is able to manipulate and/or move the holographic image of the object using a part of their body. In such arrangements, the user can interact with the object displayed in the holographic image whilst receiving sensory feedback from their interactions such as touch, temperature, pressure and/or hardness based on the nature of their interaction with the object.
The manipulation of the object/image may be based on the detection of touch, touch-gesture or 3D gesture movements, or any combination thereof.
The controller of the present aspect may include any combination of some, all or none of the above described preferred and optional features.
A further aspect of the present invention provides a method of controlling a display apparatus, the display apparatus having: a display arranged to generate a source image; at least one screen arranged in front of said display; and an air source arranged to direct air to one or more locations in proximity to said screen, the method including the steps of:
displaying a source image such that the projection of the image and its reflection by the screen causes a holographic image of a three-dimensional object to be displayed to a user viewing said screen from a position that is neither perpendicular to nor parallel to the plane of said screen; detecting the position of a part of a user’s body in proximity to said holographic image or to said screen and generating position information relating to said detected position; and controlling the air source so as to direct air in accordance with said received position information so as to provide sensory feedback to the detected part of the user’s body.
Preferably the method further includes the step of: configuring, from data regarding the three-dimensional structure of the object, a source image such that projection of the image and its reflection by the screen causes the holographic image of the object to be displayed. This may involve producing multiple representations of the object, one for each screen wherein each representation presents a different view of the object due to the different angle from which the object will be viewed on that screen. This may also involve further image manipulation, for example to create segmented lenticular images or segmented stereoscopic images to enhance the three-dimensional reproduction of the image of the object.
In certain embodiments the method further includes the steps of: determining, from said position information and the data regarding the three-dimensional structure of the object, whether the part of the user’s body is positioned proximate to or in the same three-dimensional position as a part of the object being displayed or not; and if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object, controlling the air source to direct air to said position to cause a sensation of touch, hardness and/or temperature to the user.
In certain embodiments the method further includes the steps of: determining, from said position information and the data regarding the three-dimensional structure of the object, whether the part of the user’s body is in proximity to or touching the screen in a position corresponding to the position where a part of the object being displayed appears on the screen or proximate to such position; and if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object appearing on the screen, controlling the air source to direct air to the position of the part of the user’s body to cause a sensation of touch, hardness and/or temperature to the user.
Preferably the method further includes the steps of: optionally detecting changes in the position information; and adjusting the source image based on the position information or change in position information, such that a user is able to manipulate and/or move the holographic image of the object using a part of their body.
In such arrangements, the user can interact with the object displayed in the holographic image whilst receiving sensory feedback from their interactions such as touch, temperature, pressure and/or hardness based on the nature of their interaction with the object.
The manipulation of the object/image may be based on the detection of touch, touch-gesture or 3D gesture movements, or any combination thereof.
The method of the present aspect may include any combination of some, all or none of the above described preferred and optional features. The method of the present aspect is preferably carried out on a display apparatus according to the above first aspect or using a controller according to the above second aspect, including some, all or none of the preferred and optional features of those aspects, but need not be so.
Further aspects of the present invention include computer programs which, when run on a computer, carry out the method of the above aspect, and computer program products having such programs contained thereon.
Brief Description of the Drawings
Embodiments of the invention will now be described by way of example with reference to the accompanying drawings in which:
Figure 1 shows an apparatus according to an embodiment of the present invention;
Figure 2 shows the optical specification for the pyramid material in the embodiment of Figure 1;
Figure 3 shows the typical weighted reflectance for coatings of various refractive indices and thicknesses, for materials to be used in embodiments of the present invention;
Figure 4(a) shows the functional components of the hand-tracking process and haptic feedback control used in an embodiment of the present invention;
Figure 4(b) shows the configuration of a prototype single-element haptic feedback unit forming part of an embodiment of the present invention;
Figure 5 shows an apparatus according to another embodiment of the present invention;
Figures 6(a) and (b) respectively show the use of segmented lenticular image formation and segmented stereoscopic image formation as part of the 3-dimensional rendering of the display in an embodiment of the present invention; and
Figure 7 shows, schematically, the use of segmented micro-louvers as part of an embodiment of the present invention.
Detailed Description
Figure 1 shows an apparatus 10 according to an embodiment of the present invention. This apparatus provides a touch-interactive pseudo-holographic display. Graphical output (images, videos or live graphics) from a projector 11 is displayed on a planar display 12. This amounts to a scalable segmented lenticular/stereoscopic image-forming system (typically comprising a projector, lenses/optics and a screen).
The image from the planar display 12 is reflected through one or more semi-reflective screen(s) 13 arranged at an angle to the planar display (in the embodiment shown in Figure 1 there are four such screens arranged in a pyramidal fashion with the bottom side left open for access). This is referred to herein as a see-through reflective pyramid screen with an optical coating (semi-reflective in the visible spectrum, bi-directionally transparent, and refractive for image formation in the IR).
The light emitted from the 2D display is reflected by the screens, giving a floating-object perception to an observer viewing the slanted screens from any side. With a suitably-formed image on the screen, this creates a "pseudo-holographic" effect (known colloquially as a "Pepper's Ghost" hologram). Such holographic images are, strictly speaking, not "true" holograms as they do not offer both depth perception and motion parallax. As discussed further below, the apparatus can also make use of true holograms generated above the pyramidal arrangement and reproduced inside the pyramid for safe user interaction. The term "hologram" (and equivalent terms) will be used herein to refer to both true holograms and pseudo-holograms, both of which can be created or projected by devices such as that shown in Figure 1.
The apparatus shown in Figure 1 provides for scaling of the displayed image. By using a rear projection screen 12 and a projector 11 (which may be, for example, a wide-angle projector for compactness), the primary planar image can be formed, which on further reflection through the angled semi-reflective screens 13 results in the Pepper's Ghost effect. This scaling method enables the formation of a large-scale Pepper's Ghost display.
In a different configuration, the apparatus can make use of direct volumetric true holograms generated above the pyramidal screen inside a safe enclosure; when reflected by the screens, these give a safe, true 3D image of the volumetric hologram, whilst allowing interaction in terms of touch, gesture and tactile feedback.
The apparatus of Figure 1 has a multi-touch/multi-gesture interface 14 on all sides of the pyramid screens 13. Conventional touch interfaces such as capacitive or acoustic-based interfaces could be used on top of the projection screens; however, in standard implementations, these would block the 3D image formed inside the pyramid. Another option is to use resistive touch interfaces. However, resistive interfaces do not allow multi-touch operation, so gesture recognition and control (e.g. pinch-zooming) is not possible.
Accordingly, the touch interface in the apparatus shown in Figure 1 is a scalable infrared touch interface adapted to the pyramidal Pepper's Ghost display, which does not interfere with or block the display mechanism to any noticeable degree. Infrared light from LEDs 15 is projected through the edges of the semi-reflective screens 13 (at the base of the pyramid in the case of a multi-screen arrangement such as that shown in Figure 1, thereby providing edge illumination). The emitted infrared radiation undergoes Total Internal Reflection (TIR) between the two planar surfaces of each screen. Reflections from the internal junction edges of the pyramid are blocked using an IR filter.
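As an illustration of the edge-illumination principle, Snell's law gives the angle beyond which injected infrared light stays trapped between the two screen faces. The sketch below is illustrative only; the n = 1.49 value is a typical figure for acrylic and is an assumption, not a value taken from this disclosure:

```python
import math

def critical_angle_deg(n_screen: float, n_outside: float = 1.0) -> float:
    """Critical angle for total internal reflection at a screen/air
    interface, measured from the surface normal (Snell's law)."""
    return math.degrees(math.asin(n_outside / n_screen))

# For acrylic (n ~ 1.49), rays striking a face at more than ~42 degrees
# from the normal are trapped between the two planar surfaces of the screen.
theta_c = critical_angle_deg(1.49)
```

Edge-injected light travels nearly parallel to the screen faces, i.e. well beyond this critical angle, which is why it remains confined until a touch frustrates the reflection.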
When a user touches the screen on any of the four planes, the TIR is frustrated and the infrared light is reflected out of the screen and reaches the base of the pyramid. There, an infrared camera 16 captures the image formed, which is processed by a processor 17. Specific gesture and voxel processing hardware for the pyramidal holographic interface is used in order to provide a real-time response and "move" the projected image in accordance with the user's interaction.
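A touch event therefore appears to the base camera as a compact bright blob. The following is one plausible way to extract blob centroids from a captured IR frame; the threshold, minimum blob area and pure-Python flood fill are illustrative choices, not the specific gesture/voxel processing hardware described above:

```python
from collections import deque

def touch_blobs(frame, thresh=200.0, min_area=4):
    """Return (row, col) centroids of bright blobs in an IR camera frame.

    `frame` is a 2D list of intensities. Each fingertip that frustrates
    the TIR shows up as a compact bright region, so a simple
    threshold-and-label pass is enough for a sketch.
    """
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for r in range(h):
        for c in range(w):
            if frame[r][c] < thresh or seen[r][c]:
                continue
            # BFS flood fill over the 4-connected bright region
            q, pixels = deque([(r, c)]), []
            seen[r][c] = True
            while q:
                y, x = q.popleft()
                pixels.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < h and 0 <= nx < w
                            and frame[ny][nx] >= thresh and not seen[ny][nx]):
                        seen[ny][nx] = True
                        q.append((ny, nx))
            if len(pixels) >= min_area:
                ys = [p[0] for p in pixels]
                xs = [p[1] for p in pixels]
                blobs.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return blobs
```

In a deployed system, each centroid would then be mapped back onto the relevant pyramid face to yield a 3D touch coordinate.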
Two arrays of acoustic/air-flow transducers 18 are arranged to face each other in different planes, above and below the pyramidal screens 13. These transducers give a bi-directional touch-and-feel of the 3D graphics to the user's hands, providing tactile or haptic feedback to the user in terms of pressure, hardness, etc. The number of acoustic transducers shown in Figure 1 is for illustration purposes only; the exact number and configuration may be adapted and changed between embodiments. By creating pressure/pressure-waves from transducers on one side, the feeling of touch can be replicated in mid-air. The feeling of pressure and hardness of 3D virtual objects can be recreated by pressure/pressure-waves from both planes. Further, the feeling of temperature can be recreated using heated air from specific air-flow nozzles. Overall, this means that the apparatus can provide touch, temperature, pressure and hardness interaction without any safety implications for the user.
Determination of interaction points
To do this, the volumetric points of interaction of fingers and hands with the display/apparatus need to be determined. In the embodiment shown in Figure 1, this is achieved using the infrared light from the edge illumination 15. The arrays of infrared LEDs 15 around the edge of the pyramid 13 described above also illuminate the user's hand(s) in proximity to the pyramid. The motion of a hand in front of a slanted screen causes a refracted infrared image to be formed inside the screen. This image is received by the infrared camera or cameras 16 underneath the pyramid structure. Using image processing, the position and orientation of the fingers/hands in 3D space can be obtained from the images. Further, when the screen is touched, the intensity at the point of touch increases, resulting in a bright blob, so that both touch and gesture can be interpreted. In US 8638989 B2, three infrared LEDs and two cameras are used to detect motion gestures by direct imaging underneath the motion. Since the structure in the arrangement of Figure 1 is three-dimensional, the points of touch (single or multiple) also have to be mapped onto the three-dimensional structure. In order to make the reflected 3D-like image visible in daylight, special interference filters/optical coatings are used as described below.
Screen coatings
The specifications for the optical coatings used in this embodiment are shown in Figure 2. The screen 13 of the pyramid should reflect 50-75% of light directly incident from the top of the pyramid. At least 20% of the visible light incident on the slanted pyramidal screen (i.e. light parallel to the ground plane) is transmitted through the screen to give a see-through effect. For infrared, the screen has bidirectional transmittance.
The primary reason for requiring a dark ambient environment in existing pseudo-hologram displays is the poor reflectance of the materials used for the screens (for example acrylic). While the refractive index (RI) of acrylic is suitable for Frustrated Total Internal Reflection (FTIR), it does not offer good reflectance. A similar issue arises if glass is used in place of acrylic: typically glass offers a mere ~3.5% reflectance and acrylic ~3.75%.
The user experience is significantly improved by a material with reflectance of >10% and little or no reduction in the transmittance of the screen material, to preserve the floating effect and "in-hand" interaction. Therefore, ideally, a high reflectance for the hologram screen is provided by using a material with a higher refractive index. Alternatively, special optical coatings can be used to achieve the same performance with low-cost plastic (such as acrylic) or glass screens.
Depending upon the precise requirements, thin-film single-layer optical coatings or multilayer optical coatings can be used. The typical weighted reflectance for coatings of various refractive indices and thicknesses is given in Figure 3, based on an acrylic substrate. To achieve >10% reflectance, specific thicknesses (>40 nm) and refractive indices (>1.75) can be chosen as desired. Such materials can be realized by special thin-film coatings (for example, silicon-rich oxides, silicon nitrides or silicon-rich nitrides/oxynitrides, SiOxNy, etc.) deposited by various thin-film deposition tools (for example, by chemical vapor deposition) on top of acrylic or glass substrates.
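Reflectance figures of this kind can be estimated with the standard characteristic-matrix (transfer-matrix) method for thin films at normal incidence. The sketch below is illustrative only — lossless layers, a single wavelength of 550 nm and n = 1.49 for the acrylic substrate are assumptions — and is not the weighted-reflectance optimization behind Figure 3:

```python
import cmath
import math

def stack_reflectance(layers, n_sub, wavelength_nm, n_inc=1.0):
    """Normal-incidence reflectance of lossless thin-film layers on a
    substrate, via the characteristic (transfer) matrix method.

    `layers` is a list of (refractive_index, thickness_nm) tuples,
    listed from the incidence (air) side down towards the substrate.
    """
    m00, m01, m10, m11 = 1.0, 0.0, 0.0, 1.0  # start from identity matrix
    for n, d in layers:
        delta = 2 * math.pi * n * d / wavelength_nm  # phase thickness
        c, s = cmath.cos(delta), cmath.sin(delta)
        # characteristic matrix of a single layer at normal incidence
        a00, a01, a10, a11 = c, 1j * s / n, 1j * n * s, c
        m00, m01, m10, m11 = (m00 * a00 + m01 * a10, m00 * a01 + m01 * a11,
                              m10 * a00 + m11 * a10, m10 * a01 + m11 * a11)
    num = n_inc * (m00 + m01 * n_sub) - (m10 + m11 * n_sub)
    den = n_inc * (m00 + m01 * n_sub) + (m10 + m11 * n_sub)
    return abs(num / den) ** 2

# Bare acrylic reflects only ~3.9% at 550 nm; a single 70 nm layer of
# RI 2.2 (e.g. silicon-rich nitride) lifts this well above the 10% target.
r_bare = stack_reflectance([], 1.49, 550.0)
r_coated = stack_reflectance([(2.2, 70.0)], 1.49, 550.0)
```

A visible-region weighted reflectance such as the 37.6% quoted below would be obtained by repeating this calculation across the visible spectrum and averaging against a luminosity weighting.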
It should be noted that these optimization calculations assume a constant refractive index; the actual refractive index of the deposited coating may vary slightly with wavelength. Such layers have been developed by the inventors in the past for use in solar cells and could be repurposed for this application (see references [2] and [3]). Similarly, multilayer films can be used to achieve better performance, where progressively lower refractive index materials are used in the coating (for example, a silicon-rich nitride of RI 2.2 and thickness 70 nm followed by RI 1.8 and thickness 100 nm will give a visible-region weighted reflectance of 37.6%).
Aero-transducers
Figure 1 shows how an array of acoustic/aero-transducers may be placed around the display device. The transducers could alternatively be provided in a curved configuration rather than the planar configuration shown. Here the operation of one element of the array is described; scaling up to the full array is a matter of replication and increased processing power.
In this description, a single haptic element in the array will be referred to as a "haptel" (haptic element, cf. pixel). A simple prototype of a single haptel used to demonstrate the concept is shown in Figure 4(b). The system diagram of the implemented haptel is shown in Figure 4(a).
The image from the volumetric/non-volumetric display 21 is projected onto the semi-transparent single-/multi-sided display 22 (the pyramid in the embodiment shown in Figure 1) to create a floating rendering 23. As the user attempts to maneuver the virtual object with their hand, the position of the hand or fingertip is tracked by the hand-tracking system 24. The positions are continuously sent to the rendering and control application running on a computer 25. When the fingers appear to be touching the virtual object, the control application obtains the coordinates of the finger(s) through the tracking mechanism, allowing the source of air to follow the movements of the finger as it manipulates the virtual object.
Thus, the haptel tracker follows the hand and, as soon as a feeling of touch (e.g. pressure, temperature) is needed, the air source is triggered to deliver the intended force (typically 0-2 bar) or heat (typically 0-40°C) to the target. The hot/cold feeling is generated by selecting hot or cold air to be blown from the nozzle, as controlled by a hot/cold controller 26. The specifications for pressure and temperature can be changed as required by the target application and are controlled by a microcontroller 27 in the interface unit 28, which also controls the on/off state of the individual haptel.
An example implemented aero-haptel interface in conjunction with the pyramidal display screen 13 is shown in Figure 4(b). The inlet to the aero-nozzle 31 comes from a compressed hot/cold air reservoir through inlet pipe 32. Servos 33 control the flow of air through the nozzle 31. The air is released to obtain the desired pressure and temperature at the target point (e.g. a fingertip) to create the haptic sensation. The configuration in Figure 4(b) can be scaled up to provide a full array of haptels.
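The decision logic of Figure 4(a) — fire the air source only when the tracked fingertip coincides with the virtual object — can be sketched as follows. The class and function names are hypothetical, and the virtual object is approximated as a bounding sphere purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class HaptelCommand:
    on: bool
    pressure_bar: float   # the text quotes a typical 0-2 bar range
    temperature_c: float  # the text quotes a typical 0-40 degC range

def haptel_for_finger(finger_xyz, obj_center_xyz, obj_radius,
                      pressure_bar=1.0, temperature_c=30.0):
    """Trigger a haptel when the tracked fingertip enters the virtual
    object's volume (approximated here as a sphere); otherwise keep
    the nozzle closed."""
    dist = sum((f - o) ** 2
               for f, o in zip(finger_xyz, obj_center_xyz)) ** 0.5
    if dist <= obj_radius:
        return HaptelCommand(True, pressure_bar, temperature_c)
    return HaptelCommand(False, 0.0, temperature_c)
```

In a full array, the same test would run per haptel, with the nearest nozzle selected to follow the fingertip as it moves.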
Display options
The display 12 of the apparatus is also modified from a standard projector/screen configuration in order to align with the pyramidal arrangement and give the user an experience of depth perception.
In this regard, the display may be a segmented lenticular display as illustrated in Figure 6(a) or a segmented stereoscopic display as illustrated in Figure 6(b). By these mechanisms the user can experience depth perception compared with a conventional Pepper's Ghost display.
In the case of a segmented lenticular display, the user does not need special eyewear to perceive depth. On the other hand, special eyewear is needed in the case of a segmented stereoscopic display. A segmented lenticular rendering can be obtained by using segmented lenticular optics in any of the following places: inside or as part of the projector; on the top planar screen area; or directly on top of the pyramid.
The lenses require a specific arrangement, with different orientations in different directions, to form the regions of image formation. The lenticular system can be designed to accommodate different numbers of views depending upon the size/scale of the display. The present disclosure also encompasses an alignment correction mechanism for the segmented display, which differs from that of normal lenticular displays.
A schematic illustration of this is shown in Figure 6(a), where a lenticular scheme segmented into four areas is shown. In a practical demonstration, a lenticular film was aligned with an LCD display at 2540 DPI.
In the case of stereoscopic displays (Figure 6(b)), the effect can be achieved using a rendering machine for 3D graphics, or live-recorded with special cameras. This kind of display can use any of the known stereoscopic display technologies, such as anaglyph. In the example illustrated in Figure 6(b), a stereoscopic anaglyph rendering of a 3D heart was carried out by assuming two cameras with red and cyan colour filtering in 3D rendering software. The user can experience depth perception in such an arrangement by wearing correspondingly arranged anaglyph glasses.
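The anaglyph composition itself is a simple per-channel merge of the two filtered views: red from the left-eye image, green and blue (cyan) from the right-eye image. A minimal sketch, using nested lists of RGB tuples purely for illustration:

```python
def red_cyan_anaglyph(left_rgb, right_rgb):
    """Combine two camera views into a red/cyan anaglyph frame.

    `left_rgb` and `right_rgb` are [row][col] grids of (r, g, b) tuples.
    The red channel is taken from the left view and the green/blue
    (cyan) channels from the right view, matching the two colour-filtered
    virtual cameras described for Figure 6(b).
    """
    return [[(lpix[0], rpix[1], rpix[2])
             for lpix, rpix in zip(lrow, rrow)]
            for lrow, rrow in zip(left_rgb, right_rgb)]
```

Viewed through matching red/cyan glasses, each eye then receives only its own view, producing the depth perception described above.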
For portable or smaller systems, segmented lenticular/stereoscopic active display systems are used, as illustrated in Figure 5. In the embodiment shown in Figure 5, the configuration is very similar to that of the embodiment shown in Figure 1, but the projector and screen have been replaced by an active display 11a which incorporates the stereoscopic/lenticular elements discussed above. Other features of Figure 5, being the same as in Figure 1, are not described here again.
Another feature of displays used in some embodiments of the present invention is the use of a narrow-angle active display or display optics. This allows the user to see only the reflections from the four faces and not the direct image, which improves the user's 3D experience as they are not distracted by the image rendering on the planar screen.
The source of the hologram display ideally has a narrow-viewing cone for better user experience and to allow them to see only the reflected image from the slanted screen and not directly from the projector or main display screen. This is especially important for larger (e.g. life-size) holograms as the (large) main display screen could be a distraction. By using a narrow-viewing cone display, the user will see only the reflected rendering from the angled screens of the apparatus and not the image(s) on the main display screen.
Traditionally, a narrow-viewing angle is considered to be a disadvantage for LCD and LED screens, but this effect can be used in embodiments of the present invention and/or a purpose-built narrow-angle display can be used.
A narrow-angle display can be achieved by using rectangular diffraction blocking filters. This can be achieved in a number of ways.
The first is to use segmented micro-louvering, as illustrated in Figure 7. A micro-louver film, with perpendicular arrangements meeting at the diagonals, is arranged on the screen in the rectangular/pyramidal arrangement shown on the left-hand side of Figure 7. The cross-section of the louvered film is shown on the right-hand side. The spacing between the louvers is of the order of tens of microns. Commercially available micro-louvered film can be used for this purpose, but needs to be arranged in the manner shown in Figure 7 to match the pyramidal structure of the apparatus (or correspondingly for different structural configurations of the screens). The micro-louvered film can be stacked, if desired, with the lenticular/stereoscopic films.
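The viewing cone passed by such a film follows from simple geometry: light leaving at more than atan(pitch/depth) from the louver axis strikes an opaque wall. A small sketch — the 50 µm pitch and 150 µm depth are assumed example values, not figures from this disclosure:

```python
import math

def louver_half_angle_deg(pitch_um: float, depth_um: float) -> float:
    """Half-angle of the viewing cone passed by a micro-louver film.

    Rays tilted by more than atan(pitch/depth) from the louver axis are
    absorbed by the opaque louver walls, so a fine pitch with deep
    louvers yields a usefully narrow viewing cone.
    """
    return math.degrees(math.atan(pitch_um / depth_um))

# e.g. 50 um pitch with 150 um deep louvers passes roughly a +/-18 deg cone
half_angle = louver_half_angle_deg(50.0, 150.0)
```

Stacking such a film over the main display thus hides the source image from oblique viewing positions, as reported in the test below.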
A screen with micro-louvered film arranged as shown in Figure 7 was tested with the apparatus of the embodiment shown in Figure 1. The display screen above the pyramid structure appeared completely black to the user.
In an alternative arrangement, the polarizer of an LCD display can be optimized for a narrower viewing angle. LCD displays normally have a twisted nematic (TN) crystal between two opposing polarizers (horizontal and vertical), in which the twisting of the polarization changes when a voltage is applied. The main factor limiting the viewing angle is the thickness of the twisted nematic crystal. While the major effort of researchers working on LCDs is to broaden the viewing angle, a narrow viewing angle is advantageous in the present application, as it restricts the user from seeing the source image. Accordingly, the parameters of the TN cell can be optimized to achieve a narrow viewing angle. Again, this should be implemented in a segmented way, with the portions for the different sides of the pseudo-holographic display each having a different polarization twist. An apparatus using such a display was tested: the image on the display screen above the pyramid structure appeared blurry and colour-distorted unless viewed vertically downwards.
In a further alternative, a one-sided mirror was used with the reflective side facing the pyramid structure. This arrangement caused reflection of the hologram in the one-sided mirror. As a result, this is an inferior solution which does not give a good user experience, but it is still better than being able to see the full projected image on the display screen.
Implementations and uses
Display apparatuses such as those described in the above embodiments may be used in a variety of ways and with different additional functionality as desired.
For example, the displays may be controlled by and/or interface with external devices such as mobile phones, remote controls, EEG controllers, or a connected computer or other electronic device (the connection may be wired or wireless according to known protocols).
The apparatus (or suitable parts of it) may be rotatable and/or movable.
The apparatus may interface with communication devices (e.g. Internet-type communications such as audio or video calls, or mobile telephony devices) to provide a calling experience with tactile elements (e.g. "shaking hands" remotely). Motion parallax can also be added to the images using motion sensors such as the Microsoft Kinect.
Other non-limiting uses that are currently envisaged include use as a platform for interactive CAD (Computer Aided Design) work, use in virtual interactive games, and interactive marketing demonstrations.
Virtual haptic control of various appliances not usually within human reach or scale, such as atomic force microscopes, hazardous-environment maneuvering, and space-based object control and manipulation, is also envisaged; the apparatus allows remote haptic or sensory feedback to the user in such situations.
While the invention has been described in conjunction with the exemplary embodiments described above, many equivalent modifications and variations will be apparent to those skilled in the art when given this disclosure. Accordingly, the exemplary embodiments of the invention set forth above are considered to be illustrative and not limiting. Various changes to the described embodiments may be made without departing from the spirit and scope of the invention. For example, and without limitation, progress in micro/nanoelectronics technology is expected to lead to large transparent display screens which could be adapted to provide hologram-like effects. Such screens may thus take the place of one or more of the screen(s) forming the reflection for user interaction.
In particular, although the methods of the above embodiments have been described as being implemented on the systems of the embodiments described, the methods and systems of the present invention need not be implemented in conjunction with each other, but can be implemented on alternative systems or using alternative methods respectively.
References
[1] "Laser produced 3D display in the air", Hidei Kimura, Taro Uchiyama, and Hiroyuki Yoshikawa, ACM SIGGRAPH 2006 Emerging Technologies (SIGGRAPH '06), ACM, New York, NY, Article 20.
[2] "Efficiency enhancement of silicon solar cells with silicon nanocrystals embedded in PECVD silicon nitride matrix", William R. Taube et al., Solar Energy Materials and Solar Cells 101, 32-35.
[3] "Plasma enhanced chemical vapor deposited (PECVD) silicon-rich-nitride thin films for improving silicon solar cells efficiency", A. Kumar, W. R. Taube et al., Int. J. Sci. Eng. Technol. 1(4), 111-116.
All references referred to above are hereby incorporated by reference.

Claims

1. A display apparatus, the display apparatus including:
a display arranged to display a source image;
at least one screen arranged in front of said display such that the projection of the image and its reflection by the screen causes a holographic image of a three-dimensional object to be displayed to a user viewing said screen from a position that is neither perpendicular to nor parallel to the plane of said screen;
a position detection apparatus for detecting the position of a part of a user’s body in proximity to said holographic image or to said screen and to generate position information relating to said detected position;
an air source arranged to direct air to one or more locations in proximity to said screen; and
a controller arranged to receive position information from the position detection apparatus and to control the air source so as to direct air in accordance with said received position information so as to provide sensory feedback to the detected part of the user’s body.
2. A display apparatus according to claim 1 wherein there are a plurality of screens arranged in a pyramidal or partial pyramidal arrangement and the display and the screens are configured to cause a mutually-consistent holographic image of a three-dimensional object to be displayed when the apparatus is viewed by a user from a plurality of directions.
3. A display apparatus according to claim 1 or claim 2 wherein the air source includes a plurality of air source elements, each arranged to direct air to a location in proximity to said screen, and the controller is arranged to activate one or more of said air source elements depending on the position information, preferably with the plurality of air source elements being arranged in an array.
4. A display apparatus according to claim 3 wherein the plurality of air source elements are arranged such that a first group of said air source elements are arranged to direct air in a direction opposite or substantially opposite to the direction that a second group of said air source elements are arranged to direct air.
5. A display apparatus according to any one of the preceding claims wherein the air source is arranged to direct air of different temperatures and the controller is arranged to control the temperature of air directed by the air source.
6. A display apparatus according to any one of the preceding claims wherein the controller is arranged to receive information about said object being displayed and further is arranged to:
determine, from said position information and the information about said object, whether the part of the user’s body is positioned in the same three-dimensional position as a part of the object being displayed or not, or proximate to such three-dimensional position; and
if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object, control the air source to direct air to said position to cause a sensation of touch, hardness and/or temperature to the user.
7. A display apparatus according to any one of claims 1 to 5 wherein the controller is arranged to receive information about said object being displayed and further is arranged to: determine, from said position information and the information about said object, whether the part of the user’s body is in proximity to or touching the screen in a position corresponding to or proximate to the position where a part of the object being displayed appears on the screen; and
if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object appearing on the screen, control the air source to direct air to the position of the part of the user’s body to cause a sensation of touch, hardness and/or temperature to the user.
8. A display apparatus according to any one of the preceding claims wherein the position detection apparatus includes at least one infrared emitter arranged at an edge of the screen to transmit infrared radiation into said screen such that it is substantially totally internally reflected within said screen, and a detector arranged to detect infrared radiation which is emitted from said screen as a result of interaction by a user interfering with said total internal reflection.
9. A display apparatus according to any one of the preceding claims wherein the position detection apparatus includes at least one infrared emitter arranged to emit infrared radiation to positions where the image of said object appears to a user, and a detector arranged to detect infrared radiation resulting from the interaction of the part of the user's body with infrared radiation emitted from said emitter.
10. A display apparatus according to any one of the preceding claims wherein the material of the screen and the relationship of the screen to the source image are such that between 50% and 75% of the visible light incident from the source image is reflected by the screen.
11. A display apparatus according to claim 10 wherein the material of the screen and the positioning of the screen are such that at least 20% of the visible light incident on the screen in a direction perpendicular to the direction of light from said source image incident on the screen is transmitted through the screen.
12. A display apparatus according to any one of the preceding claims wherein the display is a segmented lenticular display or a segmented stereoscopic display.
13. A display apparatus according to any one of the preceding claims wherein the display has a narrow viewing cone such that a user viewing the displayed object is unable to view an image on the source display.
14. A display apparatus according to claim 13 wherein the display has a micro-louvered surface to narrow the viewing cone of the display.
15. A display apparatus according to any one of the preceding claims having a further controller, wherein the further controller is arranged to:
receive the position information and to detect changes in the position information; adjust the source image based on the position information or change in position information, such that a user is able to manipulate and/or move the holographic image of the object using a part of their body.
16. A controller for a display apparatus, the display apparatus having:
a display arranged to display a source image;
at least one screen arranged in front of said display such that the projection of the image and its reflection by the screen causes a holographic image of a three-dimensional object to be displayed to a user viewing said screen from a position that is neither perpendicular to nor parallel to the plane of said screen;
a position detection apparatus for detecting the position of a part of a user’s body in proximity to said holographic image or to said screen and to generate position information relating to said detected position; and
an air source arranged to direct air to one or more locations in proximity to said screen,
wherein the controller is arranged to:
receive position information from the position detection apparatus and to control the air source so as to direct air in accordance with said received position information so as to provide sensory feedback to the detected part of the user’s body.
17. A controller according to claim 16 wherein the controller is further arranged to configure, from data regarding the three-dimensional structure of the object, a source image such that projection of the image and its reflection by the screen causes the holographic image of the object to be displayed.
18. A controller according to claim 17 wherein the controller is further arranged to:
determine, from said position information and the data regarding the three- dimensional structure of the object, whether the part of the user’s body is positioned proximate to or in the same three-dimensional position as a part of the object being displayed or not; and
if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object, control the air source to direct air to said position to cause a sensation of touch, hardness and/or temperature to the user.
19. A controller according to claim 17 wherein the controller is further arranged to:
determine, from said position information and the data regarding the three- dimensional structure of the object, whether the part of the user’s body is in proximity to or touching the screen in a position corresponding to the position where a part of the object being displayed appears on the screen; and
if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object appearing on the screen, control the air source to direct air to the position of the part of the user’s body to cause a sensation of touch, hardness and/or temperature to the user.
20. A controller according to any one of claims 16-19, wherein the controller is further arranged to:
receive the position information and to detect changes in the position information; adjust the source image based on the position information or change in position information, such that a user is able to manipulate and/or move the holographic image of the object using a part of their body.
21. A method of controlling a display apparatus, the display apparatus having:
a display arranged to display a source image;
at least one screen arranged in front of said display; and
an air source arranged to direct air to one or more locations in proximity to said screen,
the method including the steps of:
displaying a source image such that the projection of the image and its reflection by the screen causes a holographic image of a three-dimensional object to be displayed to a user viewing said screen from a position that is neither perpendicular to nor parallel to the plane of said screen;
detecting the position of a part of a user's body in proximity to said holographic image or to said screen and generating position information relating to said detected position; and
controlling the air source so as to direct air in accordance with said received position information so as to provide sensory feedback to the detected part of the user’s body.
22. A method according to claim 21 further including the step of: configuring, from data regarding the three-dimensional structure of the object, a source image such that projection of the image and its reflection by the screen causes the holographic image of the object to be displayed.
23. A method according to claim 22 further including the steps of:
determining, from said position information and the data regarding the three- dimensional structure of the object, whether the part of the user’s body is positioned proximate to or in the same three-dimensional position as a part of the object being displayed or not; and
if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object, controlling the air source to direct air to said position to cause a sensation of touch, hardness and/or temperature to the user.
24. A method according to claim 22 further including the steps of:
determining, from said position information and the data regarding the three- dimensional structure of the object, whether the part of the user’s body is in proximity to or touching the screen in a position corresponding to or proximate to the position where a part of the object being displayed appears on the screen; and
if it is determined that the part of the user’s body is positioned proximate to or in the same position as a part of the object appearing on the screen, controlling the air source to direct air to the position of the part of the user’s body to cause a sensation of touch, hardness and/or temperature to the user.
25. A method according to any one of claims 21-24 further including the steps of:
optionally detecting changes in the position information; and
adjusting the source image based on the position information or change in position information, such that a user is able to manipulate and/or move the holographic image of the object using a part of their body.
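The adjusting step of claim 25 reduces to re-rendering the source image from geometry translated by the displacement of the tracked body part. A hedged sketch of the geometric part only; the function and data layout are assumptions:

```python
def drag_object(vertices, prev_pos, new_pos):
    """Translate every vertex of the displayed object by the displacement of
    the tracked body part between two detections, so that re-rendering the
    source image from the returned vertices makes the hologram follow the
    user's hand. Positions and vertices are (x, y, z) tuples."""
    dx = new_pos[0] - prev_pos[0]
    dy = new_pos[1] - prev_pos[1]
    dz = new_pos[2] - prev_pos[2]
    return [(x + dx, y + dy, z + dz) for (x, y, z) in vertices]
```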
PCT/EP2019/060536 2018-04-25 2019-04-24 Display apparatus, controller therefor and method of controlling the same WO2019207008A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1806775.1 2018-04-25
GBGB1806775.1A GB201806775D0 (en) 2018-04-25 2018-04-25 Display apparatus, controller therefor and method of controlling the same

Publications (1)

Publication Number Publication Date
WO2019207008A1 true WO2019207008A1 (en) 2019-10-31

Family

ID=62236220

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2019/060536 WO2019207008A1 (en) 2018-04-25 2019-04-24 Display apparatus, controller therefor and method of controlling the same

Country Status (2)

Country Link
GB (1) GB201806775D0 (en)
WO (1) WO2019207008A1 (en)


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150323991A1 (en) * 2014-05-07 2015-11-12 International Business Machines Corporation Sensory holograms

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BIMBER O ET AL: "The virtual showcase", IEEE COMPUTER GRAPHICS AND APPLICATIONS, IEEE SERVICE CENTER, NEW YORK, NY, US, vol. 21, no. 6, 1 November 2001 (2001-11-01), pages 48 - 55, XP011093931, ISSN: 0272-1716, DOI: 10.1109/38.963460 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110085083A (en) * 2019-06-05 2019-08-02 南京航空航天大学 Micro-airflow jet array virtual control platform
CN110085083B (en) * 2019-06-05 2024-03-15 南京航空航天大学 Micro-airflow jet array virtual control platform
WO2021223265A1 (en) * 2020-05-07 2021-11-11 南京钟山虚拟现实技术研究院有限公司 Holographic projection imaging interactive system
WO2022024745A1 (en) * 2020-07-31 2022-02-03 Agc株式会社 Display unit

Also Published As

Publication number Publication date
GB201806775D0 (en) 2018-06-06

Similar Documents

Publication Publication Date Title
Monnai et al. HaptoMime: mid-air haptic interaction with a floating virtual screen
US10241344B1 (en) Advanced retroreflecting aerial displays
US20200409529A1 (en) Touch-free gesture recognition system and method
US8502816B2 (en) Tabletop display providing multiple views to users
US11598919B2 (en) Artificial reality system having Bragg grating
Hirsch et al. BiDi screen: a thin, depth-sensing LCD for 3D interaction using light fields
US8704822B2 (en) Volumetric display system enabling user interaction
US20100328306A1 (en) Large format high resolution interactive display
TWI543038B (en) Device, system, and method for projection of images onto tangible user interfaces
WO2019207008A1 (en) Display apparatus, controller therefor and method of controlling the same
Butler et al. Vermeer: direct interaction with a 360 viewable 3D display
US20120013613A1 (en) Tools for Use within a Three Dimensional Scene
US10366642B2 (en) Interactive multiplane display system with transparent transmissive layers
EP3205088B1 (en) Telepresence experience
US10116914B2 (en) Stereoscopic display
TW201104494A (en) Stereoscopic image interactive system
Benko Beyond flat surface computing: challenges of depth-aware and curved interfaces
Yoshida et al. RePro3D: Full-parallax 3D display with haptic feedback using retro-reflective projection technology
WO2002084637A1 (en) Control of depth movement between screens of multilevel display
Yasugi et al. Development of aerial interface by integrating omnidirectional aerial display, motion tracking, and virtual reality space construction
Chan et al. On top of tabletop: A virtual touch panel display
Adhikarla et al. Design and evaluation of freehand gesture interaction for light field display
Matsumaru et al. Three-dimensional aerial image interface, 3DAII
WO2009112722A9 (en) Oled lcd hybrid interactive device with microstructured plates
Christou et al. Touch interactive 3D surfaces

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19719854

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19719854

Country of ref document: EP

Kind code of ref document: A1