WO2022158209A1 - Spatially floating image display device - Google Patents
Spatially floating image display device
- Publication number
- WO2022158209A1 (PCT/JP2021/046981)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- floating image
- display device
- light
- user
- image display
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/50—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
- G02B30/56—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
-
- G—PHYSICS
- G02—OPTICS
- G02F—OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
- G02F1/00—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
- G02F1/01—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
- G02F1/13—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
- G02F1/133—Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
- G02F1/1333—Constructional arrangements; Manufacturing methods
- G02F1/1335—Structural association of cells with optical devices, e.g. polarisers or reflectors
- G02F1/1336—Illuminating devices
- G02F1/133602—Direct backlight
- G02F1/133603—Direct backlight with LEDs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09F—DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
- G09F9/00—Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/346—Image reproducers using prisms or semi-transparent mirrors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/20—Arrangements for obtaining desired frequency or directional characteristics
- H04R1/32—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only
- H04R1/40—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/20—Arrangements for obtaining desired frequency or directional characteristics
- H04R1/32—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only
- H04R1/40—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers
- H04R1/403—Arrangements for obtaining desired frequency or directional characteristics for obtaining desired directional characteristic only by combining a number of identical transducers loud-speakers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R3/00—Circuits for transducers, loudspeakers or microphones
Definitions
- the present invention relates to the technology of a spatially floating image display device.
- as spatially floating information display systems, there are already known image display devices that display spatially floating images toward the outside, and display methods that present spatially floating images the user can operate as a user interface or man-machine interface. In contrast to conventional liquid crystal touch panels, which require the user to touch a physical display surface with a finger, this type of spatially floating image can be used as a non-contact user interface.
- Patent Document 1 discloses a display and operating device capable of reliably giving a feeling of operation to an operator who operates an insubstantial operation-target image displayed in midair.
- the spatially floating image, which is the image displayed by the spatially floating image display device, is displayed as if it were floating in space.
- the user can visually recognize the floating image.
- when a floating image is used as a non-contact user interface, the user cannot obtain a definite tactile sensation of "contacting an object" (for example, the touch feeling of pressing a push button configured as a GUI image on a conventional touch panel screen). Erroneous operations and erroneous inputs are therefore easily induced: it may be determined that no touch operation occurred even though the user intended to touch the button, or that a touch operation occurred even though the user did not intend to touch the button.
- An object of the present invention is, in relation to spatially floating image display technology, to provide a technique that improves user-friendliness, visibility, and operability, and that is suitable for preventing or reducing erroneous operations and erroneous inputs when the generated spatially floating image is used as a non-contact user interface.
- a spatially floating image display device according to a representative embodiment forms a spatially floating image and includes: a display device that displays an image; a retroreflective member that retroreflects image light from the display device, the spatially floating image being formed from the reflected light of the retroreflective member; a sensor that detects operations, including the position of the user's finger, with respect to the plane of the spatially floating image or a spatial region containing an object displayed on that plane; and a tactile sensation generator that, based on the information detected by the sensor, produces a tactile sensation in the finger by forming sound pressure with ultrasonic waves in the vicinity of the finger position.
- according to the representative embodiment of the present invention, usability for the user is improved, visibility and operability are improved, and a technology suitable for preventing or reducing erroneous operations and erroneous inputs is provided. Problems, configurations, effects, and the like other than the above are described in [Mode for Carrying Out the Invention].
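The tactile sensation generator described above forms a sound-pressure focal point with ultrasonic waves near the detected finger position. A common way to realize such a focal point (the document does not specify the drive scheme at this level of detail) is an ultrasonic phased array: each transducer is driven with a phase offset so that all wavefronts arrive in phase at the focal point. A minimal sketch, assuming a 40 kHz transducer grid; the element layout and all names are illustrative:

```python
import math

SPEED_OF_SOUND = 346.0  # m/s in air at roughly room temperature
FREQ = 40_000.0         # typical ultrasonic haptics carrier frequency (Hz)

def element_phases(elements, focus):
    """Phase offset (radians) for each transducer so that all waves
    arrive in phase at the focal point, producing a pressure maximum."""
    dists = [math.dist(e, focus) for e in elements]
    d_max = max(dists)
    k = 2 * math.pi * FREQ / SPEED_OF_SOUND  # wavenumber
    # Delay elements closer to the focus so their wavefronts coincide
    # with the wavefront of the farthest element.
    return [(k * (d_max - d)) % (2 * math.pi) for d in dists]

# Example: a 4x4 grid of transducers at 10 mm pitch in the z = 0 plane,
# focused 20 cm above the centre of the array.
elements = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
phases = element_phases(elements, focus=(0.015, 0.015, 0.20))
```

In practical ultrasonic haptics the carrier is additionally amplitude-modulated at a low frequency (tens to hundreds of Hz) so that the skin can perceive the pressure as a vibration.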
- FIG. 1 shows a functional block configuration example of a spatially floating image display device according to an embodiment of the present invention.
- a configuration example of the main part of the spatially floating image display device according to one embodiment is shown.
- a configuration example of a retroreflective member is shown.
- an example of incidence and reflection of light rays on a retroreflective member is shown.
- a schematic explanatory diagram shows a normal image and ghost images in a spatially floating image display device.
- another configuration example of the main part of the spatially floating image display device according to one embodiment is shown.
- a configuration example of a light-shielding member in the spatially floating image display device according to one embodiment is shown.
- another configuration example of the light-shielding member in the spatially floating image display device according to one embodiment is shown.
- a configuration example of a non-contact user interface using a floating image in the spatially floating image display device according to one embodiment is shown.
- an example of the user's line-of-sight direction with respect to the spatially floating image of FIG. 9 is shown.
- another configuration example of a non-contact user interface using a spatially floating image in the spatially floating image display device according to one embodiment is shown.
- an example of the user's line-of-sight direction with respect to the spatially floating image of FIG. 11 is shown.
- an arrangement example of super-directional speakers in the spatially floating image display device according to one embodiment is shown.
- another arrangement example of super-directional speakers in the spatially floating image display device according to one embodiment is shown.
- an arrangement example of a spatially floating image, a super-directional speaker, and a camera in the spatially floating image display device according to one embodiment is shown.
- another arrangement example of a spatially floating image, a super-directional speaker, and a camera in the spatially floating image display device according to one embodiment is shown.
- a configuration of the spatially floating image display device according to one embodiment, including the user, the spatially floating image, the fingertip tactile sensation generation device, and the like, is shown as viewed from the side.
- a configuration example of the fingertip tactile sensation generation device (fingertip tactile sensation generation unit) in the spatially floating image display device according to one embodiment is shown.
- a configuration example of an inductance circuit in the spatially floating image display device according to one embodiment is shown.
- a configuration example of phase groups on the plane of the ultrasonic element array in the spatially floating image display device according to one embodiment is shown.
- a configuration example of a non-contact user interface using a spatially floating image, and an arrangement example of the fingertip tactile sensation generation device and the like, in the spatially floating image display device according to one embodiment are shown.
- a configuration example of a non-contact user interface using a spatially floating image, and another arrangement example of the fingertip tactile sensation generation device and the like, in the spatially floating image display device according to one embodiment are shown.
- another arrangement example of the fingertip tactile sensation generation device in the spatially floating image display device according to one embodiment is shown.
- an explanatory diagram relates to light source diffusion characteristics of a display device in one embodiment.
- an explanatory diagram relates to light source diffusion characteristics of a display device in one embodiment.
- a configuration example of a display device in one embodiment is shown.
- a cross-sectional view shows a configuration example of a light source device in one embodiment.
- a cross-sectional view shows a configuration example of a light source device in one embodiment.
- a layout diagram shows the configuration of the main part of a spatially floating image display device according to one embodiment.
- a cross-sectional view shows a configuration example of a video display device in the spatially floating video display device of one embodiment.
- a cross-sectional view shows a configuration example of a light source device in one embodiment.
- a cross-sectional view shows a configuration example of a light source device in one embodiment.
- a cross-sectional view shows a configuration example of a light source device in one embodiment.
- a cross-sectional view shows a configuration example of a light source device in one embodiment.
- an enlarged cross-sectional view shows a light guide in one embodiment.
- an explanatory diagram relates to diffusion characteristics of a display device in one embodiment.
- an explanatory diagram relates to diffusion characteristics of a display device in one embodiment.
- a cross-sectional view shows a configuration example of a display device in one embodiment.
- the hardware main body for these functions is a processor, or a controller, device, computer, or system composed of a processor and the like.
- in a computer, a processor executes processing according to a program read into memory, while appropriately using resources such as the memory and a communication interface.
- the processor is composed of, for example, a semiconductor device such as a CPU or GPU.
- a processor is composed of devices and circuits capable of performing predetermined operations.
- the processing can be implemented not only by software program processing but also by dedicated circuits; an FPGA, ASIC, CPLD, or the like can be used as the dedicated circuit.
- the program may be pre-installed as data on the target computer, or may be distributed to the target computer as data from the program source and installed.
- the program source may be a program distribution server on a communication network or a non-transitory computer-readable storage medium.
- a program may consist of a plurality of program modules.
- the computer system may be composed of multiple computers.
- the computer system may consist of a client server system or a cloud computing system.
- in a prior-art spatially floating image display device, there is a configuration in which an image display device such as an organic EL panel or a liquid crystal panel, serving as a high-resolution color display image source, is combined with a retroreflective member.
- in this configuration, the image light is diffused over a wide angle.
- the retroreflection portion 2a constituting the retroreflective member 2 is a hexahedron, as shown in the figures. For this reason, in the prior-art spatially floating image display device, obliquely incident light is reflected in addition to the normally reflected light. Due to this light, as shown in FIG. 5, a plurality of ghost images, from a first ghost image G1 to a sixth ghost image G6, are generated in addition to the regular image R1, which is the regular spatially floating image 3. This impairs the image quality of the spatially floating image.
- FIG. 5 shows an example of how a spatially floating image is viewed from a normal user's viewpoint (an eye point at the standard correct position).
- ghost images G1 to G6 are generated in addition to the regular image.
- the embodiment prevents such ghost images so that only the regular image R1 is obtained.
- the occurrence of ghost images is not only annoying for the user; there is also the possibility that a person other than the intended user (for example, another person in the vicinity of the user) will see a ghost image having the same content as the floating image, which has been a significant problem.
- furthermore, a problem peculiar to floating images has been identified: due to deviation between the floating image and the user's line of sight, the object (for example, a push button) that the user intends to select and operate is not detected, and an erroneous input, such as a selection operation on a different object, occurs.
- the prior-art example can thus induce inputs contrary to the user's intention in devices and systems to which a non-contact user interface using spatially floating images is applied. For example, when applied to a bank ATM device, this could cause a serious error such as an incorrect amount of money being entered.
- the frequency of erroneous input as described above may increase.
- erroneous inputs and operations as described above are a serious problem from the viewpoint of the future spread and application of spatially floating image display devices to various uses.
- the spatially floating image display device of the embodiment has a configuration that eliminates the ghost images that significantly reduce the visibility of the spatially floating image in the prior-art device, and that improves the brightness and visibility of the spatially floating image.
- the spatially floating image display device of the embodiment also has a function that, when the user performs an operation such as touching the surface of the spatially floating image with a fingertip at the location where the image is displayed in midair, generates a sense of physical contact (such as a touch feeling).
- for example, the spatially floating image is used as a non-contact user interface that requires touching at least one button. In this case, when the user touches the button with a fingertip, the spatially floating image display device uses this function to generate and give the fingertip the feeling of touching a physical button.
- in addition, when the user performs a touch operation on the surface of the spatially floating image, the spatially floating image display device of the embodiment outputs a sound corresponding to the portion touched by the fingertip (for example, the number written on the touched button) from the vicinity of that location.
- an object is an element or part constituting a floating image or a graphical user interface (GUI), and is a visible image that has no physical substance other than air.
- FIG. 1 shows a functional block configuration example of a spatially floating image display device according to one embodiment.
- the spatially floating image display device 1000 of FIG. 1 includes a retroreflective portion 1101, an image display unit 1102, a light guide 1104, a light source 1105, a power supply 1106, a control unit 1110, a video signal input unit 1131, a communication unit 1132, an audio signal input unit 1133, an aerial operation detection sensor 1351, an aerial operation detection unit 1350, a fingertip tactile sensation generation unit (in other words, a touch sensation generation unit) 1230, an audio signal output unit 1240, a super-directional speaker 1242, a normal speaker 1243, a video control unit 1160, a storage unit 1170, an imaging unit 1180, and the like. These elements are interconnected through a bus or the like.
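As a rough illustration of how these blocks cooperate, the following sketch wires an aerial operation detection unit to the tactile sensation and audio outputs in a simple polling loop. The class and method names are hypothetical stand-ins for the numbered units, not identifiers from this document:

```python
class FloatingImageDisplay:
    """Minimal sketch of the control flow among the blocks above."""

    def __init__(self, detector, tactile_gen, audio_out):
        self.detector = detector        # aerial operation detection unit (1350)
        self.tactile_gen = tactile_gen  # fingertip tactile sensation generation unit (1230)
        self.audio_out = audio_out      # audio signal output unit (1240)

    def poll(self):
        """One cycle: read the sensor and, on a touch, give tactile and
        audible feedback near the detected fingertip position."""
        touch = self.detector.sense()   # None, or (object_name, fingertip_xyz)
        if touch is None:
            return None
        name, pos = touch
        self.tactile_gen.focus_at(pos)  # ultrasonic sound pressure at the fingertip
        self.audio_out.announce(name)   # e.g. speak the touched button's label
        return name
```

A real device would run `poll()` on a timer or a sensor interrupt rather than on demand; the sketch only shows the data flow among the units.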
- Each main component of the spatially floating image display device 1000 is accommodated in the housing 1190.
- the imaging unit 1180 and the aerial operation detection sensor 1351 may be provided as part of the housing 1190 or outside the housing 1190.
- the retroreflective portion 1101 in FIG. 1 corresponds to the retroreflective member 2.
- the retroreflective portion 1101 retroreflects the light modulated by the image display unit 1102.
- the spatially floating image 3 is formed by the light output from the spatially floating image display device 1000 out of the reflected light from the retroreflective portion 1101.
- the image display unit 1102 in FIG. 1 corresponds to the liquid crystal display panel 11 in FIG. 2 and corresponds to a color display image source.
- the light source 1105 in FIG. 1 corresponds to the light source device 13 in FIG. 2; the image display unit 1102 and the light source 1105 together correspond to the display device 1 (image display device) in FIG. 2.
- the image display unit 1102 is a display unit that generates an image by modulating transmitted light, based on a video signal input under the control of the video control unit 1160.
- a transmissive liquid crystal panel is used.
- a reflective liquid crystal panel or a DMD (Digital Micromirror Device: registered trademark) panel using a method of modulating reflected light may be used.
- a light source 1105 generates light for the image display unit 1102, and is a solid-state light source such as an LED light source or a laser light source.
- the power supply 1106 converts AC current input from the outside into DC current to power the light source 1105. In addition, the power supply 1106 supplies necessary DC current to each part in the spatially floating image display device 1000.
- the light guide 1104 guides the light generated by the light source 1105 to illuminate the image display unit 1102.
- a combination of the light guide 1104 and the light source 1105 can also be called a backlight of the image display section 1102 .
- Various methods are conceivable for the combination of the light guide 1104 and the light source 1105 .
- a specific configuration example of the combination of the light guide 1104 and the light source 1105 will be described later.
- the mid-air operation detection sensor 1351 is a sensor for detecting operations on the floating image 3 by the finger UH of the user U.
- the mid-air operation detection sensor 1351 senses a range that overlaps with the entire display range of the floating image 3, for example.
- the mid-air operation detection sensor 1351 may sense only a range that overlaps with at least a part of the display range of the floating image 3.
- Specific examples of the aerial operation detection sensor 1351 include distance sensors using invisible light such as infrared rays, invisible light lasers, or ultrasonic waves.
- the aerial operation detection sensor 1351 may be configured to detect position coordinates on a two-dimensional plane corresponding to the main plane of the floating image 3 by combining a plurality of sensors.
- the aerial operation detection sensor 1351 may be configured by a ToF (Time Of Flight) LiDAR (Light Detection and Ranging) or an image sensor (in other words, a camera).
- the mid-air operation detection sensor 1351 only needs to perform sensing for detecting an operation such as a touch operation on an object displayed as the space floating image 3 by the user's U finger UH. Existing technology can also be applied to such sensing.
- the mid-air operation detection unit 1350 acquires a sensing signal (in other words, detection information) from the mid-air operation detection sensor 1351 and, based on that signal, determines whether or not the finger UH of the user U is touching an object in the floating image 3, calculates the position where the fingertip and the object are in contact, and the like.
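The touch determination described above can be sketched as follows. This is an illustrative outline only; the function name, coordinate convention, and threshold are assumptions, not the device's actual interface.

```python
# Minimal sketch of the touch determination performed by the mid-air
# operation detection unit. Names and thresholds are illustrative.

def detect_touch(sensor_points, plane_z=0.0, threshold=0.005):
    """Given 3-D points (x, y, z) reported by the sensor, decide whether
    any point has reached the display plane (z = plane_z) of the floating
    image, and return the contact position (x, y) if so."""
    for x, y, z in sensor_points:
        # A point within `threshold` metres of the plane counts as a touch.
        if abs(z - plane_z) <= threshold:
            return True, (x, y)
    return False, None
```

For example, a fingertip reported at 4 mm from the display plane would be classified as a touch at its (x, y) position.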
- the aerial operation detection unit 1350 may be configured by a circuit such as an FPGA. Alternatively, part of the functions of the aerial operation detection unit 1350 may be realized by software, for example by a spatial operation detection program executed by the processor of the control unit 1110.
- the aerial operation detection sensor 1351 and the aerial operation detection unit 1350 may be built into the floating image display device 1000 or may be provided separately outside it. When provided separately, they are configured to transmit information and signals to the floating image display device 1000 (for example, to a control device described later) via a wired or wireless communication connection path or signal transmission path. Providing the aerial operation detection sensor 1351 and the aerial operation detection unit 1350 separately makes it possible to build a system in which the mid-air operation detection function can be added as an option to a floating image display device main body that lacks that function.
- alternatively, only the aerial operation detection sensor 1351 may be provided separately, with the aerial operation detection unit 1350 incorporated in the floating image display device. This configuration is advantageous when it is desired to place the mid-air operation detection sensor 1351 more freely relative to the installation position of the spatially floating image display device.
- Elements such as the aerial operation detection sensor 1351, the imaging unit 1180 (especially the camera), and the super-directional speaker and ultrasonic element array described later are basically arranged at fixed positions and orientations designed in advance, but may also be configured so that the user can change and adjust their positions and orientations.
- the imaging unit 1180 is configured using, for example, a camera having an image sensor, and captures images of the space near the floating image 3 and/or the body (head, face, eyes, arms, fingers, etc.) of the user U who operates the floating image 3.
- a plurality of imaging units 1180 may be provided.
- the imaging unit 1180 may be provided as a stereo camera with two or more cameras.
- the imaging unit 1180 may be an imaging unit with a depth sensor.
- the imaging unit 1180 may assist the detection, by the aerial operation detection sensor 1351 and the aerial operation detection unit 1350, of the user U's operation on the floating image 3. For example, by using the captured image, detection processing by the mid-air operation detection unit 1350 can be facilitated.
- the aerial operation detection sensor 1351 is an object intrusion sensor that detects whether or not an object has entered the intrusion detection plane, targeting a plane including the display surface of the floating image 3 (also referred to as an intrusion detection plane).
- alternatively, it may be configured as a contact detection sensor or the like that detects whether or not a finger UH is in contact with the surface of the spatially floating image 3.
- information such as how far an object (for example, a finger UH) that has not yet entered the intrusion detection plane is from that plane, or how close it has approached, may be difficult or impossible to detect with the aerial operation detection sensor 1351 alone.
- in that case, the distance between the object and the intrusion detection plane can be calculated with higher accuracy by using depth information about the object computed from the captured image of the camera of the imaging unit 1180, depth information from a depth sensor, or the like.
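The object-to-plane distance used here is a standard point-to-plane computation. A minimal sketch (names are illustrative), assuming the intrusion detection plane is given by a point on the plane and a normal vector:

```python
import numpy as np

def distance_to_plane(point, plane_point, plane_normal):
    """Signed distance from an object position (e.g. a fingertip located
    via depth data) to the intrusion detection plane, defined by a point
    on the plane and its normal vector."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)          # normalize the plane normal
    p = np.asarray(point, dtype=float)
    p0 = np.asarray(plane_point, dtype=float)
    return float(np.dot(p - p0, n))    # projection onto the normal
```

A positive value means the object is still in front of the plane; zero (within tolerance) means it has reached it.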
- Various types of information including such distances detected and calculated using various types of sensor devices can be effectively used for various types of display control and the like for the spatially floating image 3 .
- the aerial operation detection unit 1350 may detect an operation of the floating image 3 by the user U based on the captured image of the imaging unit 1180 without using the aerial operation detection sensor 1351 .
- the mid-air operation detection sensor 1351 need not be placed near the floating image 3 and the user U as shown in FIG. 2; it may instead be placed at a position facing the floating image 3 and the user U (a position from which the two-dimensional plane of the floating image 3 can be captured), such as the position of the imaging unit 1180.
- the image capturing unit 1180 captures an image of the face of the user U who operates the floating image 3, and the control unit 1110 performs identification processing (for example, user authentication based on face recognition) of the user U based on the captured image.
- the imaging unit 1180 may simply implement a function like a human sensor.
- depending on the installation environment, the operation performed by the user U may be peeped at by another person. Such peeping should be prevented in order to ensure the confidentiality of the contents of the spatially floating image 3 and of the operation.
- the image capturing unit 1180 may be configured to capture an image of a range including the user U who operates the spatially floating image 3 and the surrounding area, and to discriminate whether or not another person is peeping at the operation.
- the operation input unit 1107 is, for example, an operation button or a remote controller light-receiving unit, and inputs a signal for an operation different from the air operation on the floating image 3 by the user U.
- the operation input unit 1107 may be used by a person other than the user U who operates the spatially floating image 3, such as an administrator, to operate the spatially floating image display device 1000.
- the video signal input unit 1131 inputs video data from a connected external video output device.
- the audio signal input unit 1133 inputs audio data from a connected external audio output device.
- the audio signal output unit 1240 can output audio signals based on audio data input to the audio signal input unit 1133 .
- the audio signal output unit 1240 may output an audio signal based on data pre-recorded in the storage unit 1170 or on data built into the non-volatile memory 1108. Examples include audio data for numbers and character strings, and audio data for operation sounds, error warning sounds, and the like.
- the audio data includes data for the generation of audio signals associated with the spatially floating image 3 and objects.
- An example of the operation sound is a sound (such as “pop”) output when an object such as a push button in the floating image 3 is touch-operated.
- the audio signal output unit 1240 may be connected to a speaker (normal speaker 1243 in FIG. 1) that outputs normal audio in the audible band.
- the normal speaker 1243 may be used for audio that does not require confidentiality.
- the audio signal output unit 1240 may be connected to the super-directional speaker 1242 in FIG.
- Superdirectional speaker 1242 in FIG. 1 corresponds to superdirectional speaker 30 in FIG.
- the super-directional speaker 1242 is composed of an array in which a plurality of ultrasonic output elements capable of generating, for example, ultrasonic signals of about 40 kHz are arranged on a plane. In this case, generally speaking, the more ultrasonic output elements that are used, the louder the sound volume obtained by the super-directional speaker.
- a super-directional speaker is a speaker that outputs super-directional sound so that only people (corresponding ears) existing in a specific limited space area can hear sound in the audible band.
- super-directional loudspeakers have the property that their sound cannot be heard by people (corresponding ears) outside a specific limited spatial region.
- the principle of a super-directional speaker will be briefly explained.
- ultrasonic waves travel more linearly than sounds in the audible band (for example, human speech). Therefore, by using the 40 kHz ultrasonic signal as a carrier and modulating it with an audio signal in the audible band (for example, by amplitude modulation), it is possible to make the audio audible only in a specific limited spatial region.
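The amplitude modulation described above can be sketched numerically as follows; the sample rate, test tone, and modulation depth are illustrative assumptions, not device parameters. (In a parametric speaker of this kind, the audible sound is recovered from the modulated ultrasound through nonlinear propagation in air.)

```python
import numpy as np

fs = 192_000                                  # sample rate high enough for a 40 kHz carrier
t = np.arange(0, 0.01, 1 / fs)                # 10 ms of signal
carrier = np.sin(2 * np.pi * 40_000 * t)      # 40 kHz ultrasonic carrier
audio = np.sin(2 * np.pi * 440 * t)           # audible 440 Hz tone (example content)
m = 0.8                                       # modulation depth (illustrative)
am = (1 + m * audio) * carrier                # amplitude-modulated drive signal
```

The envelope of `am` stays within [1 - m, 1 + m], so the ultrasonic elements are never over-driven for m < 1.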
- the spatial floating image display device 1000 identifies the positions of the user U's face, eyes, ears, or the like in space by using the imaging unit 1180 (for example, multiple cameras). Then, the spatially floating image display device 1000 can perform control so that the sound from the super-directional speaker 1242 can be heard only in a specific region near the user U's ear according to the specific result. Specifically, the spatially floating image display device 1000 controls the phase (in other words, delay time) of each ultrasonic signal input to each ultrasonic output element forming the super-directional speaker 1242 . This allows the sound to be heard only in a specific limited spatial region.
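The phase (delay) control described above amounts to focusing a phased array at the detected ear position. A minimal sketch, assuming a simple point-focus model and the speed of sound in air; the function name and geometry are illustrative:

```python
import numpy as np

C = 343.0  # speed of sound in air, m/s

def focus_delays(element_positions, target):
    """Per-element delays (seconds) so that ultrasound emitted from the
    element array arrives at `target` (e.g. near the user's ear)
    simultaneously. The farthest element fires first (delay 0)."""
    pos = np.asarray(element_positions, dtype=float)
    dist = np.linalg.norm(pos - np.asarray(target, dtype=float), axis=1)
    flight = dist / C                 # time of flight per element
    return flight.max() - flight      # elements nearer the target wait longer
```

Driving each element with its computed delay makes the wavefronts add coherently only around the target point, which is the mechanism that confines the audible region.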
- even if the plurality of ultrasonic output elements of the super-directional speaker 1242 are arranged not on a plane but, for example, on a concave curved surface, the sound can likewise be made audible only in the above-mentioned specific limited spatial region.
- Super-directional speaker 1242 may be configured as part of housing 1190 or may be configured as a separate body from housing 1190 . Specific examples will be described later.
- the fingertip tactile sensation generation unit 1230 has the function of generating and imparting a tactile sensation to the fingertip when the user U operates the spatially floating image 3 with the finger UH.
- when a touch operation by the user U's fingertip on an object displayed as the spatially floating image 3 is detected, the fingertip tactile sensation generation unit 1230 generates and imparts to the fingertip a sensation as if something other than air were present there.
- the fingertip tactile sense generation unit 1230 is configured using, for example, an ultrasonic element array in which a plurality of ultrasonic output elements are arranged on a plane. An example of the fingertip tactile sense generation unit 1230 will be described later.
- the fingertip tactile sense generation unit 1230 uses information on the touch position of the fingertip of the user U on the object in the floating image 3 detected by the air operation detection sensor 1351 and the air operation detection unit 1350 .
- the fingertip tactile sense generation unit 1230 has a function capable of emitting ultrasonic waves with a predetermined strength of sound pressure toward the touch position from the ultrasonic element array. This ultrasonic wave has very strong directivity. When this ultrasonic wave hits the fingertip of the user U, the user U can get a touch feeling as if he/she touched some real object with his/her fingertip.
- the fingertip tactile sense generation unit 1230 has a function of modulating the ultrasonic signal with an audio signal in the audible band, in other words, a function of superimposing sound on the ultrasonic wave.
- as a result, a sound based on the audio signal is emitted from the vicinity of the fingertip where the user U touches the object. This sound propagates in all directions, and the user U can hear it accompanying the touch operation. Therefore, the user U not only gets a tactile sensation on the fingertip, but can also more reliably recognize, from the sound emitted near the fingertip, that the touch operation was performed.
- the fingertip tactile sense generation unit 1230 may be configured as part of the housing 1190 or may be configured as a device having a housing separate from the housing 1190 . Fingertip tactile sensation generating section 1230 may be arranged at a predetermined position outside housing 1190, or may be configured such that its position and orientation can be variably adjusted.
- the fingertip tactile sense generation unit 1230 in FIG. 1 corresponds to the fingertip tactile sense generation device 6 in FIG. 17 described later.
- the non-volatile memory 1108 stores and holds various data used in the spatial floating image display device 1000 .
- the data stored in the non-volatile memory 1108 includes video/image data for displaying the spatial floating video 3, for example.
- This data includes data for constructing objects (characters, buttons, icons, etc.) to be displayed as at least part of the spatial floating image 3 .
- This data may include data for various operations, object layout information, object metadata, control information, related information, and the like.
- the memory 1109 stores image data to be displayed as the spatial floating image 3, data for controlling the spatial floating image display device 1000, and the like.
- the video/image may be a still image or a moving image.
- the control unit 1110 corresponds to the controller of the spatial floating image display device 1000, and controls the operation of each connected unit.
- the control unit 1110 has a processor and cooperates with a program stored in the memory 1109 to perform arithmetic processing based on information acquired from each unit in the spatial floating image display device 1000 .
- the communication unit 1132 communicates with external devices, servers, etc. via a wired or wireless communication interface. Various data such as video/image data and audio data are transmitted and received through communication via the communication unit 1132 .
- the spatial floating image display device 1000 may acquire instructions, image data, and the like from the outside via the communication unit 1132, and may output/store information on the operation result of the user and the like to the outside.
- the storage unit 1170 is a storage device that records various data and information such as video/image data and audio data.
- Storage unit 1170 may hold data similar to nonvolatile memory 1108 .
- various data/information such as video/image data and audio data may be recorded in the storage unit 1170 in advance before product shipment.
- the storage unit 1170 may record various data and information such as video/image data and audio data acquired from an external device, server, or the like via the communication unit 1132 .
- the audio data recorded in the storage unit 1170 can be output as audio through the audio signal output unit 1140, for example.
- Video/image data and audio data are associated in a predetermined manner.
- a certain push button object is an object that displays characters such as "A", "1", "Yes", "No", and "Redo".
- the audio data associated with the image data of such an object includes audio data for uttering the displayed characters in response to a predetermined operation on the object.
- another object may display no characters; even in that case, audio data for a sound produced in response to a predetermined operation on that object can be associated with it.
- a video control unit 1160 performs various controls related to video signals input to the video display unit 1102 .
- the video control unit 1160 creates a video signal, switches the video signal, and the like.
- the video control unit 1160 performs control such as selecting which of the memory 1109, the storage unit 1170, the video signal input unit 1131, the communication unit 1132, and the like is used as the source of the video signal to be input to the video display unit 1102.
- the video control unit 1160 can also perform control such that, for example, a superimposed video signal is generated by superimposing the video signal in the memory 1109 and the video signal input from the video signal input unit 1131, and the superimposed video signal is input to the video display unit 1102. As a result, a composite image based on the superimposed video signal can be formed as the spatially floating image 3.
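The superimposition of two video signals can be illustrated with a simple per-pixel blend; the equal weighting and 8-bit frame format here are illustrative assumptions, not the device's specified behavior.

```python
import numpy as np

def superimpose(base, overlay, alpha=0.5):
    """Blend two same-sized 8-bit frames into one superimposed frame,
    in the spirit of combining an internal video signal with an external
    input signal. `alpha` is the weight of the overlay (assumed value)."""
    base = base.astype(np.float32)
    overlay = overlay.astype(np.float32)
    out = (1 - alpha) * base + alpha * overlay
    return np.clip(out, 0, 255).astype(np.uint8)
```

Feeding the blended frame to the display panel would then form the composite image as the floating image.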
- the video control unit 1160 may control image processing for the video signal of each source.
- examples of image processing include scaling processing for enlarging, reducing, or transforming an image; brightness adjustment processing for changing brightness; contrast adjustment processing for changing the contrast curve of an image; and Retinex processing, which decomposes an image into light components and changes the weighting of each component.
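As an illustration of the Retinex processing mentioned above, a minimal single-scale sketch: the blur kernel and its size are arbitrary choices, and a box blur stands in for the Gaussian usually used.

```python
import numpy as np

def box_blur(img, k=15):
    """Crude separable box blur standing in for the Gaussian typically
    used in Retinex (kept numpy-only for brevity)."""
    kernel = np.ones(k) / k
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, rows)

def single_scale_retinex(img):
    """Estimate the illumination with a blur and keep the reflectance:
    R = log(I) - log(blur(I))."""
    img = img.astype(np.float64) + 1.0   # avoid log(0)
    return np.log(img) - np.log(box_blur(img))
```

On a uniformly lit region the output is near zero, while edges and texture (the reflectance component) are emphasized.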
- the image control unit 1160 may also apply, to the image signal input to the image display unit 1102, special effect image processing or the like for assisting an aerial operation such as a touch operation on the spatially floating image 3 by the user U.
- the special effect video processing is performed, for example, based on the detection result of the touch operation by the aerial operation detection unit 1350 and the captured image of the user U by the imaging unit 1180 .
- Examples of special effect images include an animation in which, when a push button object is touched, the button sinks in the depth direction so as to emphasize the touch operation, and an animation in which ripples spread on the surface of the spatially floating image 3 around the button.
- the spatially floating image display device 1000 (particularly the image control unit 1160) creates data for displaying the spatially floating image 3 (in other words, a video signal) based on the video/image data in the memory 1109, the storage unit 1170, and the like, and inputs it to the video display unit 1102 of the video display device 1.
- the image light generated and emitted by the image display unit 1102 is reflected via the retroreflection unit 1101 and emitted toward the outside of the spatially floating image display device 1000 as image light having high directivity.
- a spatially floating image 3 is output and formed at a predetermined position outside.
- a regular user U facing this spatially floating image 3 can view it favorably from the direction of arrow A, from an eye point UP (in other words, the eyes) corresponding to a predetermined standard position in space.
- the spatial floating image display device 1000 is equipped with various functions. However, the spatially floating image display device 1000 does not need to have all of these functions, and only needs to have at least the function of forming the spatially floating image 3, and various forms are possible.
- FIG. 2 shows the configuration of the main part of the spatial floating image display device 1000 of one embodiment, the configuration example of the retroreflector 1101, and the like.
- FIG. 2 shows the configuration when viewed from the side in the direction in which the spatial floating image display device 1000 and the regular user U face each other in space.
- a display device 1 (image display device)
- the display device 1 includes a liquid crystal display panel 11 and a light source device 13 that generates specific polarized light having narrow-angle diffusion characteristics.
- the image light of the specific polarized wave from the display device 1 is reflected by the polarization separation member 101 having a film that selectively reflects the image light of the specific polarized wave provided on the transparent member 100 (for example, glass), and is retroreflected. Incident on the member 2 .
- a retroreflective member 2 is arranged in the other diagonal direction of the transparent member 100 (a direction having an angle B with respect to the horizontal plane).
- the polarization separation member 101 is a sheet-shaped polarization separation member adhered to the rear surface side (lower surface side in the vertical direction) of the transparent member 100 .
- a ⁇ /4 plate 2b as a wavelength plate is provided on the image light incident surface of the retroreflective member 2.
- the image light passes through the ⁇ /4 plate 2b twice, when it enters the retroreflective member 2 and when it exits. Thereby, the image light undergoes polarization conversion from the specific polarized wave (in other words, one polarized wave) to the other polarized wave.
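The double pass through the λ/4 plate can be checked with Jones calculus. This sketch assumes the fast axis at 45° to the incident linear polarization and, for simplicity, treats the retroreflection itself as an identity:

```python
import numpy as np

# Jones matrix of a quarter-wave plate with its fast axis at 45 degrees
QWP45 = (1 / np.sqrt(2)) * np.array([[1, -1j],
                                     [-1j, 1]])

v_in = np.array([1, 0], dtype=complex)   # incident specific (e.g. horizontal) polarization

# Two passes through the lambda/4 plate act together as a half-wave plate:
v_out = QWP45 @ QWP45 @ v_in

# v_out is the orthogonal (vertical) linear polarization, so after the
# round trip the light transmits through the polarization separating
# member instead of being reflected back.
```

This is why the returning image light, now of the other polarized wave, passes through the polarization separating member toward the outside.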
- the polarization separating member 101 that selectively reflects the image light of the specific polarized wave has the property of transmitting the polarized light of the other polarized wave after the polarization conversion.
- the image light of the other polarized wave after the polarization conversion is transmitted through the polarization separation member 101 to the outside.
- the image light transmitted through the polarization separation member 101 forms a space floating image 3 which is a real image at a predetermined position outside the transparent member 100 .
- the spatially floating image 3 in this example is arranged such that the main plane is oblique with respect to the horizontal plane of the transparent member 100 at an angle C (an angle corresponding to the angle B).
- the aerial operation detection sensor 1351 in this example is arranged at a position near the transparent member 100 on the extension of the plane of the floating image 3 as shown in the drawing.
- the main plane of the spatial floating image 3 also corresponds to the range in which the touch operation is possible.
- the light forming the spatially floating image 3 is a set of light rays converging from the retroreflective member 2 onto the optical image of the spatially floating image 3. These rays travel straight even after passing through the optical image of the spatially floating image 3. Therefore, unlike the diffuse image light formed on a screen by a general projector or the like, the spatially floating image 3 is an image having high directivity. Accordingly, in the configuration of FIG. 2, when the user U (at the corresponding eye point UP) views the spatially floating image 3 from the direction of arrow A, which corresponds to the direction of the highly directional image light (indicated by the dashed-dotted arrow), the spatially floating image 3 is perceived as a bright image.
- from directions other than the arrow A, however, the spatially floating image 3 is not visible as an image at all. This characteristic is very suitable for a system that displays video requiring high security, or highly confidential video that should not be seen by a person facing the user U.
- upon retroreflection, the polarization axes of the reflected image light may become uneven.
- part of the image light whose polarization axes are not aligned is reflected by the polarization separation member 101 described above and returns to the display device 1 .
- this returned light may generate a ghost image (FIG. 5) and degrade the image quality of the spatially floating image 3. Therefore, in this embodiment, an absorptive polarizing plate 12 is provided on the image display surface of the display device 1.
- the polarization separating member 101 may be formed of, for example, a reflective polarizing plate or a metal multilayer film that reflects a specific polarized wave.
- a super-directional speaker 30 (corresponding to the super-directional speaker 1242 in FIG. 1) is arranged at a predetermined position above or near the transparent member 100 shown in FIG. 2 so as not to block the optical path that forms the spatially floating image 3.
- the super-directive speaker 30 outputs super-directive sound in the direction of the dashed-dotted arrow in the drawing (that is, the direction toward the ear UE).
- with this, the sound in the audible band can be heard only in a very limited spatial region near the ear UE of the user U, while the people around the user U cannot hear the sound. This characteristic is particularly suitable for reading out voice information with high confidentiality (for example, a personal identification number or an amount of money).
- FIG. 3 shows the surface shape of a retroreflective member manufactured by Nippon Carbide Industry Co., Ltd. as an example of the configuration of a typical retroreflective member 2 .
- the surface of this retroreflective member has hexagonal prisms as a plurality of regularly arranged retroreflective portions 2a (in other words, retroreflective elements).
- a light beam incident on the inside of the hexagonal prism is reflected by the wall surface and the bottom surface of the hexagonal prism and emitted as retroreflected light in a direction corresponding to the incident light.
- a normal image R1 as illustrated in FIG. 5 is formed as the spatially floating image 3.
- as shown in FIG. 4, depending on the image light obliquely incident on the retroreflective member 2 (hexagonal prisms) among the image light from the display device 1, ghost images G1 to G6 are formed separately from the normal image R1.
- the spatially floating image display device 1000 of the embodiment displays the spatially floating image 3 which is a real image based on the image displayed on the display device 1 .
- the resolution of the spatially floating image 3 largely depends on the resolution of the liquid crystal display panel 11 as well as the diameter D and the pitch P of the hexagonal prisms that are the retroreflection portions 2a of the retroreflection member 2 shown in FIG.
- depending on these parameters, the effective resolution of the spatially floating image 3 may be reduced to about 1/3.
- a roll press method is a molding method in which the retroreflective portions 2a are aligned and formed on a film. In this method, the inverse of the shape to be formed is made on the surface of a roll, a UV-curable resin is applied onto the base material for fixing, the material is passed between the rolls to shape the required form, and the resin is cured by ultraviolet irradiation. A retroreflective member 2 having the desired shape is thereby obtained.
- the display device 1 is configured using a liquid crystal display panel 11 and a light source device 13 that generates specific polarized light having narrow-angle diffusion characteristics, which will be described later.
- FIG. 6 shows another example of the main configuration of the spatially floating image display device according to one embodiment.
- the display device 1 includes a liquid crystal display panel 11 as an image display element, and a light source device 13 for generating light of a specific polarized wave having narrow-angle diffusion characteristics.
- the liquid crystal display panel 11 is selected from panels ranging from a small one with a screen size of about 5 inches to a large one with a screen size exceeding 80 inches. Image light from the liquid crystal display panel 11 is reflected toward the retroreflecting member 2 by a polarization separating member 101 such as a reflective polarizing plate.
- a ⁇ /4 plate 2b is provided on the light incident surface of the retroreflecting member 2, and polarization conversion (converting a specific polarized wave into the other polarized wave) is performed by passing the image light twice.
- the image light after the polarization conversion passes through the polarization separating member 101 and forms the space floating image 3 which is a real image outside the transparent member 100 .
- retroreflection may cause the polarization axes to become uneven, and part of the image light is reflected and returned to the display device 1 .
- an absorptive polarizing plate 12 is provided on the image display surface of the display device 1 .
- the image light is transmitted through the absorptive polarizing plate 12 and the reflected light is absorbed by the absorptive polarizing plate 12 .
- deterioration in image quality due to the ghost image of the spatially floating image 3 is prevented.
- it is preferable to provide an absorptive polarizing plate 12B on the surface of the transparent member 100 (the external light incident surface), as shown in the figure.
- the polarized light separating member 101 can be composed of, for example, a reflective polarizing plate or a metal multilayer film that reflects a specific polarized wave.
- the difference between the configuration of FIG. 6 and that of FIG. 2 is that a light shielding member 24 and a light shielding member 23, which block oblique image light other than the normal image light forming the spatially floating image 3, are provided in the optical path between the polarization separating member 101 and the liquid crystal display panel 11.
- a light shielding member 24 is arranged at a distance L2 near the liquid crystal display panel 11 , and a light shielding member 23 is arranged near the polarization separation member 101 .
- a light shielding member 22 for shielding oblique image light other than normal image light is also provided in the middle of the optical path between the retroreflection member 2 and the polarization separation member 101 .
- a light blocking member 22 is arranged at a distance L1 from the retroreflective member 2 . These light blocking members block oblique image light that causes ghost images. Broken line portions of the light shielding members 22, 23, and 24 indicate transmission (non-light shielding) regions. As a result, the occurrence of ghost images as described above can be further suppressed.
- the inventors have confirmed through experiments that the light shielding effect can be enhanced by providing the light shielding member 24 and the light shielding member 23 in the space between the liquid crystal display panel 11 and the polarization separation member 101 .
- it was confirmed that, by setting the inner diameter of the light shielding members 23 and 24 to 110% of the area through which the normal image light flux forming the spatially floating image 3 passes, the members could be manufactured and assembled within the range of component accuracy and mechanical tolerances.
- when the light shielding member 22 provided between the retroreflection member 2 and the polarization separation member 101 is installed such that the distance L1 between it and the retroreflection member 2 is 50% or less of the distance between the retroreflection member 2 and the polarization separation member 101, the occurrence of ghost images can be further reduced.
- the light shielding member 22 is installed at a position where the distance L1 between the light shielding member 22 and the retroreflecting member 2 is 30% or less of the distance between the retroreflecting member 2 and the polarization separation member 101, it will be visually visible. The occurrence of ghost images can be reduced to a practically acceptable level. It was confirmed that the provision of the three light shielding members 22, 23 and 24 described above can further reduce the level of the ghost image.
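The placement guidelines above can be summarized as a small sketch. Only the numeric ratios (110% inner diameter, the 50% and 30% thresholds for L1) come from the text; the function names and classifications are illustrative assumptions.

```python
def shield_inner_diameter(beam_diameter_mm: float, margin: float = 1.10) -> float:
    """Inner diameter for shielding members 23/24: 110% of the normal image
    light flux area, to stay within component accuracy and assembly tolerance."""
    return beam_diameter_mm * margin

def ghost_reduction_level(l1_mm: float, retro_to_separator_mm: float) -> str:
    """Classify placement of shielding member 22 by the ratio of L1 to the
    distance between retroreflective member 2 and polarization separator 101."""
    ratio = l1_mm / retro_to_separator_mm
    if ratio <= 0.30:
        return "ghosts reduced to a practically acceptable level"
    if ratio <= 0.50:
        return "ghost occurrence further reduced"
    return "outside the recommended range"

print(round(shield_inner_diameter(100.0), 6))  # 110.0 mm inner diameter
print(ghost_reduction_level(30.0, 120.0))      # ratio 0.25 -> acceptable level
```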
- FIG. 7 shows another configuration example of the light shielding member for reducing the occurrence of ghost images in the spatially floating image display device of one embodiment.
- FIG. 7A shows the cross-sectional shape of the light shielding member 25 in this embodiment.
- FIG. 7B shows the planar shape of the light shielding member 25 (the surface seen in the direction perpendicular to the optical axis), including the effective area of the light shielding member 25.
- The area of the region 27 through which the normal image light flux passes is set smaller than the area of the inner diameter of the outer frame 25a (the portion having the light-blocking property) of the light shielding member 25.
- FIG. 8 shows another example of a light shielding member that reduces ghost image generation in the spatially floating image display device according to one embodiment.
- FIG. 8A shows the cross-sectional shape of the light shielding member 26 in this embodiment.
- FIG. 8B shows the planar shape of the light shielding member 26.
- The effective area of the light shielding member 26 is shown to be substantially the same size as the region 27 through which the regular image light flux forming the spatially floating image 3 passes.
- As shown in the figure, beams 26b are provided extending inward from the outer frame 26a of the light shielding member 26; the tips of the beams 26b reach the outline of the region 27.
- That is, the area of the region 27 through which the normal image light flux passes is smaller than the area of the inner diameter of the outer frame 26a of the light shielding member 26, and is equal to the area inscribed by the beams 26b.
- In the configurations described above, the shape of the main surface of the retroreflective member 2 is a planar shape directly facing the display device 1 (taking reflections in the optical path into account).
- The shape of the retroreflective member 2 may instead be changed from this planar shape into a concave or convex surface having a radius of curvature of, for example, 200 mm or more. With this configuration, even if a ghost image is generated by oblique image light reflected by the retroreflective member 2, the ghost image can be moved away from the visual field of the user U so that it cannot be viewed.
- However, if the radius of curvature is set to 100 mm or less, a new problem arises: of the light reflected by the peripheral portion of the retroreflective member 2 (the periphery of the area where the λ/4 plate 2b is arranged in FIG. 2 and elsewhere, which mainly reflects the image light), the amount reflected normally decreases, so the amount of peripheral light in the resulting spatially floating image 3 decreases. Therefore, in order to reduce the ghost image to a practically acceptable level, it is preferable to select and apply the above-mentioned technical means, or to use them in combination.
- FIG. 9 is an explanatory diagram of a video display method for preventing erroneous input in the spatially floating video display device according to one embodiment.
- FIG. 9 shows a display example of the spatially floating image 3 on the spatially floating image display device 1000, and shows how the spatially floating image 3 is viewed from the user U's viewpoint.
- This example of the spatially floating image 3 corresponds to a non-contact user interface that has a plurality of objects such as numeric keys and allows input of numbers and the like.
- The display device 1 includes the liquid crystal display panel 11 and the light source device 13 that generates specific polarized light having narrow-angle diffusion characteristics. The liquid crystal display panel 11 can be selected from a range of panel sizes, from small panels to large ones. The image light from the liquid crystal display panel 11 is reflected toward the retroreflecting member 2 by the polarization separation member 101, for example a reflective polarizing plate.
- FIG. 10 is a schematic diagram showing how the user U visually recognizes the spatially floating image 3 when the spatially floating image display device 1000 is viewed from the side.
- It shows the cases where the user U's eye point UP is at three different heights (upper, middle, and lower) and the spatially floating image 3 is viewed along the line-of-sight directions LU, LM, and LL, respectively.
- the angles A, B, and C described above (FIG. 2) are about 45 degrees.
- Since the spatially floating image 3 is formed by light rays with high directivity, when it is viewed from the proper viewing direction shown in FIG. 10, the spatially floating image 3 is visually recognized as a bright image.
- When the user U operates the spatially floating image display device 1000 at the correct position, with the line of sight in the viewing direction LM, the entire spatially floating image 3 is visible as shown in the figure.
- In a configuration combining the light source device 13 having narrow-angle diffusion characteristics with the liquid crystal display panel 11 as the image display element, the optimal position of the user U's eye corresponds to the line-of-sight direction LM in FIG. 10.
- It turned out that viewing is difficult from other positions, for example along the upper line-of-sight direction LU in FIG. 10. Considering this from the opposite point of view, there is little risk that someone other than the authorized user U peeks at the spatially floating image 3, which is effective from the viewpoint of security and the like.
- the inventor studied a method that allows easy touch operations on the spatially floating image 3 at the optimum viewing position.
- A portion of the housing 50 (the housing 1190 in FIG. 1) is provided with a camera unit 55 (corresponding to the imaging section 1180 in FIG. 1) for judging whether the standing position of the user U is suitable.
- the spatial floating image display device 1000 uses the camera unit 55 to determine the position and orientation of the face of the user U in space and, if necessary, the position of the pupil (eyepoint UP).
- Based on the detected position, the spatially floating image display device 1000 changes the display state of the optimal viewing position display sections 3a, 3b, 3c, and 3d in the spatially floating image 3 shown in FIG. 9.
- For example, the display state of the optimal viewing position display sections on the four sides is switched according to the detected position, such as turning off the viewing position display section 3a corresponding to the upper side.
- A state in which the user U can see all of the optimal viewing position display sections 3a, 3b, 3c, and 3d on the four sides, that is, a state in which they look like a picture frame, corresponds to a preferable viewing state.
- By such display control, the line of sight of the user U can be guided to the line-of-sight direction LM corresponding to the optimal position.
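The frame-guidance behavior above can be sketched as follows. The threshold value and the mapping of sections 3a-3d to the four sides are illustrative assumptions; the patent only states that sections are switched so that the full "picture frame" is visible only from the optimal position.

```python
def frame_sections(eye_dx: float, eye_dz: float, tol: float = 0.05):
    """eye_dx / eye_dz: horizontal / vertical offset of the detected eye point
    UP from the optimal position (normalized units). Returns which of the four
    optimal-viewing-position display sections to light."""
    return {
        "3a_top":    eye_dz <= tol,    # hidden when the eye is too high
        "3b_bottom": eye_dz >= -tol,   # hidden when the eye is too low
        "3c_left":   eye_dx >= -tol,
        "3d_right":  eye_dx <= tol,
    }

def at_optimal_position(sections: dict) -> bool:
    # All four sections visible -> looks like a picture frame -> good position.
    return all(sections.values())

print(at_optimal_position(frame_sections(0.0, 0.0)))  # True: optimal position
print(at_optimal_position(frame_sections(0.0, 0.2)))  # False: eye too high
```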
- The spatially floating image display device 1000 may also automatically adjust the brightness of the displayed image, and thus of the spatially floating image 3, based on information about the brightness of the outside world detected by the camera unit 55. This makes it possible to reduce power consumption and improve visibility.
- In the examples of FIGS. 11 and 12, a member having a physical entity (something other than air) for position recognition, such as a transparent structural member 60 made of plastic, is provided at the position where the spatially floating image 3 is displayed.
- In this example, frame-shaped transparent structural members 60 are arranged on the four sides of the outer periphery of the spatially floating image 3 (including a frame display portion similar to the one described above).
- The transparent structural member 60 has, for example, a lower side portion fixed to one side on the front of the housing 50.
- a non-transparent frame-shaped structural member made of metal or the like may be provided.
- In that case, the user U can easily recognize the non-transparent structural member, so the display position of the spatially floating image 3 can be recognized quickly even from a horizontal or obliquely upward direction, for example several meters away. After that recognition, the user U can operate the spatially floating image 3 with a finger from the front position.
- a TOF sensor 56 (TOF type distance sensor) is provided as the aerial operation detection sensor 1351 .
- the TOF sensor 56 senses the distance between an object such as a finger of the user U and the TOF sensor 56, the position of the object, and the like.
- This TOF sensor 56 is arranged below the spatial floating image 3 .
- As shown in the figure, the TOF sensor 56 is arranged in a portion of the housing 50 (50a) corresponding to the lower side portion of the transparent structural member 60.
- This TOF sensor 56 can detect the state such as the touch position of the fingertip by scanning the entire area within the plane of the spatially floating image 3 .
- The TOF sensor 56 and the control unit 1110 using it can detect and measure the position coordinates (for example, the touch position) of an object such as a finger in the coordinate system of the plane of the spatially floating image 3, and can also detect states of the object's movement, such as its direction and speed.
- the TOF sensor 56 has a configuration in which a plurality of combinations of infrared light emitters and light receivers are linearly arranged as shown in FIG. 11, for example, in order to read the distance and position on a two-dimensional plane.
- The TOF sensor 56 receives, at its light receiving unit, the light emitted from the infrared light emitting unit and reflected by the object; the distance to the object is obtained by multiplying the time difference from emission to reception by the speed of light and halving the result for the round trip. Further, the coordinate on the plane of the spatially floating image 3 can be read from the coordinate of the pair, among the plurality of infrared light emitting and receiving units on the line, at which the time difference is smallest.
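The TOF principle just described can be sketched minimally: distance is the round-trip time multiplied by the speed of light and halved, and the in-plane coordinate is the index of the emitter/receiver pair with the shortest round trip. Function names and the example timings are illustrative.

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance from sensor to object given the emit-to-receive time
    (round trip, so the product is halved)."""
    return C * round_trip_s / 2.0

def touch_coordinate(round_trips_s: list) -> int:
    """Index of the linear emitter/receiver pair with the smallest time
    difference, i.e. the in-plane coordinate nearest the finger."""
    return min(range(len(round_trips_s)), key=round_trips_s.__getitem__)

# A fingertip 15 cm away gives a round trip of about 1 ns:
rt = 2 * 0.15 / C
print(tof_distance_m(rt))                      # ≈ 0.15 m
print(touch_coordinate([4e-9, 1e-9, 3e-9]))    # pair 1 is closest to the finger
```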
- a method using a configuration including an infrared laser light generating unit and an imaging unit using an infrared camera may be used.
- In this method, the irradiation area of the infrared laser light generated from the infrared laser light generating unit is shaped like a thin sheet, and is placed close to the display surface of the spatially floating image 3 with a gap of, for example, several millimeters or less.
- This irradiation area of the infrared laser light is also called a "laser sheet".
- The infrared camera of the imaging section captures the light reflected by an object, such as a finger of the user U, from the laser sheet formed by the infrared laser light generating section (in other words, the laser sheet generating section). The control unit 1110 can then identify the position of the finger on the laser sheet by analyzing the captured image generated by the imaging section.
- A configuration may also be used in which a plurality of sensors capable of detecting the position coordinates of an object on a two-dimensional plane, as described above, are stacked in the direction perpendicular to the surface of the spatially floating image 3 (in other words, the depth or front-back direction). As a result, information such as the position coordinates of the object can be obtained in the three-dimensional region including the direction perpendicular to the spatially floating image 3. That is, the state of an operation such as a finger touch on the spatially floating image 3 can be obtained in more detail.
- the inventor also studied a display method that allows the user U to visually recognize the floating image 3 more clearly.
- If the layout is designed so that a part of the spatially floating image 3 overlaps the outer frame 50a of the housing 50 on the outer periphery of the transparent member 100, which is the window through which the image light is emitted, the image can be visually recognized more clearly.
- For example, the layout of the entire optical system may be designed so that the lower end of the spatially floating image 3 overlaps the outer frame 50a.
- <Example 3 of Spatially Floating Image Display Device: Super-Directional Speaker>
- The inventor examined a configuration that prevents erroneous input and ensures that an input operation, such as a touch operation on the object the user wants to select, can be performed reliably.
- the inventors particularly studied a method of suitably outputting voice corresponding to the user's input operation, for example, a method of assisting user's operation by voice or performing operation guidance.
- In this configuration, a voice corresponding to an object (for example, a number button) selected by the user by a touch operation or the like is generated.
- Specifically, the inventor studied a sound output method using a super-directional speaker (the super-directional speaker 1242 in FIG. 1 and the super-directional speaker 30 in FIG. 2) that allows only the authorized user to hear the sound.
- FIG. 13 shows a configuration example in which the super-directional speaker 30 is arranged on the outer frame 50a of the housing 50 in the spatial floating image display device 1000 of one embodiment.
- An application example is an ATM device.
- This configuration example is a configuration in which the sound from the super-directional speaker 30 is output directly toward the user's U ear UE.
- the superdirective sound propagation path is a straight path, in other words a non-reflective path.
- the super-directional loudspeaker 30 creates a highly directional sound field 31 .
- the sound emitted from the super-directional speaker 30 directly reaches the area near the user's U ear UE.
- a region in the vicinity of the ear UE of the user U at the regular position is a region in which a strong sound field 31 is formed. Therefore, only the user U can hear the voice, and others cannot hear it.
- FIG. 14 shows a configuration example in which the super-directional speaker 30 is arranged on the wall 57 on the opposite side of the user U in the spatial floating image display device 1000 of one embodiment.
- In this configuration, the sound emitted from the super-directional speaker 30 is first reflected by the plane of the transparent member 100 (for example, glass), and the reflected sound reaches the area near the ear UE of the user U.
- The super-directional speaker 30 is arranged so as not to overlap the optical path for forming the spatially floating image 3 (the path from the retroreflective member 2 described above through the transparent member 100 to the spatially floating image 3, and the path of the image light toward the eye point UP). Therefore, the super-directional speaker 30 does not block the optical path for forming the spatially floating image 3. Further, when these configurations are compared, the direction of the audio path is more similar to that of the image light path in FIG. 14 than in FIG. 13. Therefore, the configuration of FIG. 14 is more effective in that, from the position of the regular user U (the position in the line-of-sight direction LM), the sound is easily heard as if it were being emitted from the spatially floating image 3.
- The super-directional speaker 30 in the embodiment may also apply a technique of controlling the sound field, such as forming it at a desired three-dimensional position.
- a configuration may be applied in which a plurality of ultrasonic output elements are arranged on a concave curved surface instead of on a flat surface, and the curvature of the concave surface is changed.
- it is possible to control the sound field such as the three-dimensional position where the sound output from the super-directional speaker 30 can be heard.
- The spatially floating image display device 1000 of one embodiment has the configuration shown in FIG. 15.
- In this configuration, one super-directional speaker 30 is arranged at a position on the far side of the housing 50 as seen from the user, at the central position in the left-right direction.
- the configuration examples shown in FIGS. 13 and 14 can be similarly applied.
- a camera 55CL is arranged on the left side and a camera 55CR is arranged on the right side of the super-directional speaker 30 as seen from the user.
- the cameras 55CL and 55CR are configuration examples of the imaging unit 1180 described above.
- the control unit 1110 controls the phase difference (or time difference) of the ultrasonic signals input to the plurality of ultrasonic output elements forming the superdirective speaker 30 .
- an optimal sound field can be formed so that the sound can be heard only in the vicinity of the user's face or ears.
- the user can hear the sound from the super-directional speaker 30 without being heard by others.
- the configuration as shown in FIG. 15 is very suitable from the viewpoint of security.
- The super-directional speaker 30 and the cameras 55CL and 55CR may be provided separately from the housing 50 of the spatially floating image display device 1000, or may be fixed at predetermined positions on the housing 50.
- The spatially floating image display device 1000 of one embodiment has the configuration shown in FIG. 16.
- In this configuration, a camera is built into the housing of each super-directional speaker 30; in other words, the super-directional speaker and the camera are integrated.
- a super-directive speaker 30L is arranged on the left side and a super-directive speaker 30R is arranged on the right side of the back side as viewed from the user.
- The super-directional speaker 30L and the super-directional speaker 30R each have a camera section 55C built into the housing; in other words, each is an integrated speaker/camera unit.
- This configuration is not only superior in terms of space efficiency due to the integration, but also enables stereo imaging with the two left and right cameras (camera sections 55C). Therefore, the position of the user's face, and the distance from the ultrasonic speakers to that position, can be calculated based on the images from the respective camera sections 55C.
- the speaker/camera unit may be fixed at a predetermined position of the housing 50.
- the distance between the left and right units is always constant. Therefore, it is possible to more accurately calculate the positional relationship between the user, the camera, and the super-directional speaker.
- the position of the sound field formed by the two superdirective speakers 30L and 30R can be calculated with high accuracy. This makes it possible to accurately set the focal region of the sound field formed by the two super-directional speakers 30L and 30R, that is, the region where only authorized users can hear the sound.
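The depth computation enabled by the fixed left/right baseline can be sketched with the classic pinhole stereo relation. The baseline, focal length, and disparity values below are illustrative assumptions, not figures from the patent.

```python
def stereo_depth_mm(baseline_mm: float, focal_px: float, disparity_px: float) -> float:
    """Pinhole stereo depth: Z = f * B / d, valid because the two camera
    sections 55C are fixed to the housing at a constant baseline B."""
    if disparity_px <= 0:
        raise ValueError("face must be detected in both camera images")
    return focal_px * baseline_mm / disparity_px

# Example: assumed 300 mm baseline, 800 px focal length, 400 px disparity:
print(stereo_depth_mm(300.0, 800.0, 400.0))  # 600.0 mm to the user's face
```

With the face depth and its image coordinates known, the focal region of the sound field formed by the two super-directional speakers can then be aimed at the ears.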
- Thereby, the sound produced by the sound field formed by the two super-directional speakers 30L and 30R can be heard, for example, only by the authorized user of the ATM device, and cannot be heard by other people near the user (for example, to the left, right, or behind).
- a sound field is formed by the left super-directional speaker 30L for the user's left ear, and a sound field by the right super-directional speaker 30R is formed for the right ear.
- the user's face position is determined by a stereo camera.
- The method is not limited to this; a method of specifying the positions of the face, eyes, or the like using a heat sensor or the like instead of a camera may also be used.
- Next, as a spatially floating image display device according to an embodiment, a configuration will be described that generates a tactile sensation at the fingertip when the spatially floating image is operated as a non-contact user interface.
- When the user touches an object (for example, a push button) displayed as a spatially floating image, this function generates a touch sensation as if the user were actually touching the object with the fingertip.
- This function is realized by using the fingertip tactile sensation generator 1230 in FIG. 1 described above.
- FIG. 17 shows a schematic configuration of a spatially floating image display device 1000 according to the present embodiment, including a user U and the like, viewed from the side.
- the spatial coordinate system and directions are indicated by (X, Y, Z).
- the Z direction is the vertical direction and the up-down direction
- the X direction and the Y direction are the horizontal directions
- the X direction is the left-right direction as seen from the user U
- the Y direction is the front-back direction and the depth direction.
- the coordinate system and direction in the spatial floating image 3 are indicated by (x, y, z).
- the x direction and the y direction are two orthogonal directions that constitute the main two-dimensional plane of the spatial floating image 3.
- The x direction is the horizontal direction (in other words, the horizontal direction within the screen), and the y direction is the vertical direction (the up-down direction within the screen).
- the z-direction is a direction perpendicular to the two-dimensional plane, and is a front-rear direction related to penetration and approach of the finger UH.
- This spatial floating image display device 1000 is implemented as a part of, for example, an ATM device of a bank.
- the space floating image 3 is formed obliquely from the position on the front side of the housing 50 as seen from the user U, with the angle C described above (FIG. 2) being about 45 degrees.
- the user U visually recognizes the spatially floating image 3 from an eye point UP corresponding to the eye.
- The position and line-of-sight direction of the user U's eye point UP correspond to the optimal line-of-sight direction LM described above (FIG. 10) and to the optical axis a3.
- The user U performs an operation such as a touch operation on the spatially floating image 3 with a finger UH (particularly the fingertip Uf), for example at a position P1.
- This position P1 can be expressed either as position coordinates (X, Y, Z) in space or as position coordinates (x, y) in the coordinate system of the two-dimensional plane of the spatially floating image 3, and the two representations can be converted into each other.
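The conversion between the two coordinate systems can be sketched as follows, assuming the image plane is tilted by the angle C (about 45 degrees) about the horizontal x axis. The plane origin in device space and the tilt value are illustrative assumptions.

```python
import math

def plane_to_space(x: float, y: float, tilt_deg: float = 45.0,
                   origin=(0.0, 0.0, 1000.0)):
    """Map in-plane coordinates (x, y) [mm] of the spatially floating image 3
    to device-space coordinates (X, Y, Z) [mm]."""
    t = math.radians(tilt_deg)
    x0, y0, z0 = origin
    # x runs along the device X axis; y climbs the tilted plane,
    # contributing to both depth (Y) and height (Z).
    return (x0 + x, y0 + y * math.cos(t), z0 + y * math.sin(t))

X, Y, Z = plane_to_space(10.0, 100.0)
print(round(X, 3), round(Y, 3), round(Z, 3))  # 10.0 70.711 1070.711
```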
- The fingertip tactile sensation generation device 6 is provided as an implementation example corresponding to the fingertip tactile sensation generation unit 1230 of FIG. 1.
- This fingertip tactile sensation generation device 6 is provided as a separate body on the outside of the main housing 50 of the spatially floating image display device 1000.
- the fingertip tactile sensation generating device 6 includes a housing separate from the housing 50.
- the housing has, for example, a rectangular parallelepiped shape, and the output plane of the ultrasonic element array 61 is arranged on the surface thereof.
- An ultrasonic signal generating circuit 62 and the like are built in the housing.
- the ultrasonic signal generating circuit 62 is connected to the control device 10 through wired or wireless signal lines/communications.
- The fingertip tactile sensation generating device 6 stands vertically in the Y direction on a base 51 corresponding to the upper surface or outer frame of the housing 50 (a part of the housing 50 or an optional part).
- the video display device 1, the retroreflective member 2, and the like are housed and fixed as the components described above.
- the image display device 1 includes a light source device 13, a liquid crystal display panel 11, an absorption polarizing plate 12, and the like.
- the retroreflective member 2 is provided with a ⁇ /4 plate 2b.
- the control device 10 and the light shielding member 120 are provided inside the housing 50 .
- the light shielding member 120 has the function as shown in FIG. 6 described above.
- the control device 10 is an implementation example of elements such as the control unit 1110 and the video control unit 1160 in FIG. 1, and can be implemented as a control board or the like. Elements such as the image display device 1, the sensor 4, the camera 5, and the fingertip tactile sense generation device 6 are connected to the control device 10 via signal lines and communication. Note that the control device 10 may communicate with an external device (for example, a server) to exchange data.
- The sensor 4 is an implementation example of the aerial operation detection sensor 1351 in FIG. 1; a TOF sensor, for example, can be applied. As shown in the figure, the sensor 4 is arranged at a position on the front side of the housing 50 in the Y direction, on an extension of the plane of the spatially floating image 3, and the direction of its detection axis is parallel to a direction (the y direction) in that plane.
- the sensors 4 may be arranged in the center in the X direction, for example, as in FIG. 11, or two sensors may be arranged on the left and right sides.
- the angle C between the spatial floating image 3 and the sensor 4 is about 45 degrees in this example.
- the camera 5 is an implementation example of the imaging unit 1180 in FIG.
- the camera 5 is installed at a position on the back side of the housing 50 in the Y direction, as shown.
- One or more cameras 5 are arranged in the X direction, and in particular, two left and right cameras may constitute a stereo camera.
- The imaging direction of the camera 5 is set so that it can capture the face UF of the user U at the regular position, as indicated by the dashed-dotted arrow, and its imaging range (in other words, angle of view) covers the spatially floating image 3 and the face UF.
- The fingertip tactile sensation generation device 6 includes an ultrasonic element array 61 and an ultrasonic signal generation circuit 62, the details of which are shown in FIG. 18.
- the ultrasonic element array 61 is an array in which a plurality of ultrasonic elements 63 are arranged on a plane as shown in FIG. 18, and outputs ultrasonic waves.
- the ultrasonic signal generation circuit 62 is a circuit that generates an ultrasonic driving signal c5 for driving the ultrasonic element array 61 under the control of the control device 10 .
- The figure shows a case where phase-controlled ultrasonic waves emitted from the ultrasonic element array 61 form a focal point of sound pressure at the position (for example, position P1) where the fingertip Uf of the user U touches an object on the plane (x-y) of the spatially floating image 3.
- the path of the ultrasonic waves output from the ultrasonic element array 61 of the fingertip tactile sensation generating device 6 is the path reflected by the transparent member 100 .
- a group of ultrasonic waves from the ultrasonic element array 61 is once reflected by the surface of the transparent member 100 and forms a focus at a position corresponding to the object of the spatially floating image 3 and the fingertip Uf as shown.
- Ultrasonic waves are first output from the ultrasonic element array 61 obliquely downward along the axis a4.
- the angle of this axis a4 is the same angle as the angle A of the axis a1 from which image light is emitted from the image display device 1.
- the ultrasonic wave is reflected (substantially totally reflected) by the upper surface of the transparent member 100, and becomes a path obliquely upward as indicated by the axis a5.
- the angle of this axis a5 is the same angle as the angle B of the axis a2 of the image light emitted from the retroreflective member 2.
- the ultrasonic wave reaches the central position P1 of the spatially floating image 3 on the axis a5.
- The fingertip tactile sensation generation device 6 is arranged at the illustrated position outside the housing 50 so as not to block the optical path of the image light from the retroreflective member 2 and the like inside the housing 50.
- the ultrasonic waves from the ultrasonic element array 61 are irradiated from the back side of the spatially floating image 3 in a substantially vertical direction.
- Thereby, the ultrasonic waves can be applied to the fingertip Uf of the user U along the axis a5, from the direction substantially perpendicular to the plane of the spatially floating image 3, which makes the touch sensation more favorable than irradiation from other directions.
- the ultrasonic element array 61 is configured to be able to form, like a focal point, a region where the sound pressure of ultrasonic waves is relatively high based on control.
- FIG. 17 shows the case where the focal point of the ultrasound is formed at the central position P1 of the spatially floating image 3 .
- the ultrasonic element array 61 can form a focal point on a region of high sound pressure at a predetermined distance in the direction of the output ultrasonic wave path (axis a4 and axis a5).
- the fingertip tactile sense generating device 6 can variably control the position of the focal point of this ultrasonic wave.
- the focal point of the ultrasonic wave can be formed so as to match the desired area corresponding to the touch position of the fingertip Uf.
- the sound pressure of this ultrasonic wave can give a tactile sensation such as a touch feeling to the fingertip Uf when the user U performs a touch operation.
- the performance and position of the ultrasonic element array 61 are determined so that the focal point of the maximum sound pressure can be formed at a predetermined distance on the path of the ultrasonic waves between the ultrasonic element array 61 and the spatially floating image 3. and orientation etc. are designed.
- the predetermined distance is designed as a suitable distance based on the performance of the ultrasonic element 63 .
- the surface of the ultrasonic element array 61 is designed in size, shape, number of elements, etc. so as to cover the formation of focal points in all touch-operable areas on the surface of the floating image 3 .
- The camera 5 of the imaging unit 1180 may also be used for a function of detecting that the user U has come to a predetermined position in front of the spatially floating image display device 1000, for example by detecting the face of the user U. For example, when the spatially floating image display device 1000 detects, based on the camera 5, that a person such as the user U has come to the predetermined position, it may start predetermined control (for example, display of the spatially floating image 3 and audio output), and when it detects that the person has left the predetermined position, it may stop the predetermined control.
- FIG. 18 shows a configuration example of the fingertip tactile sense generation device 6 corresponding to the fingertip tactile sense generation unit 1230 .
- the ultrasonic element array 61 has an array configuration in which a plurality of ultrasonic elements 63 are arranged on a plane at approximately equal intervals so as to generate ultrasonic waves having a frequency of about 40 kHz, for example.
- Let N be the number of ultrasonic elements 63; for example, N = 223.
- The array of these ultrasonic elements 63 constitutes an ultrasonic phased array, that is, an array of ultrasonic elements in which the position where the ultrasonic waves form a focal point can be controlled.
- the shape of the arrangement of the plurality of ultrasonic elements 63 as the ultrasonic element array 61 is not limited to concentric circles, and may be square, rectangular, polygonal, or the like, for example.
- although in this example the plurality of ultrasonic elements 63 are arranged at approximately equal intervals without gaps, the arrangement is not limited to this.
- An example of the applicable ultrasonic element 63 (in other words, an ultrasonic transducer) is MA40S4S manufactured by Murata Manufacturing Co., Ltd.
- in this element, the piezoelectric ceramic is housed in a cylindrical case with a diameter of about 1 cm, and two terminals protrude from the case.
- piezoelectric ceramics expand and contract when voltage is applied, changing their shape.
- when an ultrasonic AC voltage with a frequency of, for example, 40 kHz is applied to the piezoelectric ceramic, the ceramic generates ultrasonic waves at the frequency of that AC voltage.
- the basic performance of the MA40S4S, used as the ultrasonic element 63 in this embodiment, is that when an ultrasonic voltage of 10 Vrms with a frequency of 40 kHz is applied between the terminals, a sound pressure of about 120 dB is obtained at a position 30 cm from the output side of the element.
- so that the ultrasonic waves emitted by the ultrasonic elements 63 constituting the ultrasonic element array 61 strengthen or weaken each other, the phase (in other words, the delay time) of the drive signal applied to each ultrasonic element 63 is changed.
- an arbitrary point on the ultrasonic element array 61 corresponds to a position of an ultrasonic element 63 in the plane of the array, expressed in the coordinate system (Ax, Ay) shown in FIG. 18.
- for example, the area containing the central point, indicated by the solid-line circle in the drawing, can be made the point where the sound pressure is strongest (corresponding to the area where the focal point is formed).
- by controlling the ultrasonic drive signal c5, the region of strongest sound pressure can instead be set, for example, to the region indicated by the dashed circle in the drawing (for example, on the upper side in the Ay direction); that is, the point of strongest sound pressure can be formed as a focal point at an arbitrary position.
- the fingertip tactile sensation generating device 6 controls the ultrasonic drive signal c5 so as to form the point of strongest sound pressure as a focal point at a position a predetermined distance along the ultrasonic path (axes a4 and a5).
- the point of strongest sound pressure can be formed as a focal point not only at the position P1 in the plane of the spatially floating image 3 but also at positions shifted to some extent in the z-direction before and after it.
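The focusing principle described above, in which each element is delayed so that all wavefronts arrive at the focal point in phase, can be sketched as follows. This is an illustrative sketch only: the 3x3 element layout, the speed of sound, and the function name are assumptions, not values taken from the embodiment.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature (assumed)

def element_delays(element_positions, focal_point):
    """Per-element firing delays (seconds) so that every element's wave
    arrives at the focal point in phase: elements farther from the focus
    fire earlier, i.e. receive a smaller delay."""
    distances = [math.dist(p, focal_point) for p in element_positions]
    d_max = max(distances)
    # The farthest element fires immediately (delay 0); nearer ones wait.
    return [(d_max - d) / SPEED_OF_SOUND for d in distances]

# Hypothetical 3x3 array on the z = 0 plane with 1 cm pitch,
# focal point 20 cm above the array centre (on the ultrasonic path).
pitch = 0.01
elements = [(ix * pitch, iy * pitch, 0.0)
            for ix in range(-1, 2) for iy in range(-1, 2)]
delays = element_delays(elements, (0.0, 0.0, 0.20))
```

Moving the focal point, for example off-centre as in the dashed-circle region of FIG. 18, only requires recomputing the same delays for a new focal coordinate.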
- the floating image display device 1000 (in particular the control device 10) detects the touch position (in other words, the fingertip position) of the user U's fingertip Uf on the surface of the floating image 3. The device then controls the output of ultrasonic waves from the fingertip tactile sense generation device 6 to match the touch position, so that the region of highest sound pressure is formed near the fingertip Uf. As a result, the user U feels the sound pressure of the ultrasonic waves at the fingertip Uf during the touch operation; that is, the user U obtains a touch feeling as if touching an object in the air where nothing is actually present.
- the ultrasonic signal generation circuit 62 of FIG. 18 has a circuit group that, under the control of the control device 10, generates the ultrasonic drive signal c5 and supplies it to the ultrasonic element array 61. This circuit group generates an ultrasonic drive signal c5 to be applied to each ultrasonic element 63 constituting the ultrasonic element array 61.
- An ultrasonic drive signal c5 is input from two terminals for each ultrasonic element 63 .
- the circuit group of the ultrasonic signal generation circuit 62 includes, in order from the input side to the output side, an ultrasonic carrier signal generation circuit 621, a rectangular wave generation circuit 622, a phase shift circuit 623, an amplitude (AM) modulation circuit 624, and an inductance circuit 625.
- the ultrasonic carrier signal generation circuit 621 is an oscillation circuit that generates an ultrasonic carrier signal c1 having a frequency of, for example, 40 kHz.
- the generated ultrasonic carrier signal c1 is input to the rectangular wave generation circuit 622.
- the rectangular wave generation circuit 622 converts the ultrasonic carrier signal c1 into an ultrasonic carrier signal c2 that is a rectangular wave.
- the rectangular-wave ultrasonic carrier signal c2 output from the rectangular wave generation circuit 622 is input to the phase shift circuit 623.
- the phase shift circuit 623 is a circuit that generates ultrasonic carrier signals c3, rectangular waves in the ultrasonic band having a plurality of types (e.g., eight types) of different phases.
- the phase shift circuit 623 is a circuit that generates a signal for forming sound pressure (focus) according to the fingertip position.
- “having different phases” is synonymous with “having different delay times.” That is, for example, the eight types of ultrasonic signals c3 are ultrasonic signals having eight types of different delay times.
- each ultrasonic carrier signal c3 is a signal whose phase is selected by control from among the plurality of types (e.g., eight types). By controlling the phase, or equivalently the delay time, in this way, a focal point where the sound pressure is strongest can be formed at an arbitrary point above the ultrasonic element array 61.
- the sensor detection information b1 from the input terminal is input to the phase shift circuit 623 and the inductance circuit 625 .
- the sensor detection information b1 is detection information from the sensor 4 in FIG. 17 relating to an operation, such as a touch, by the user U on an object of the spatially floating image 3, or information obtained by processing that detection information with a processor or the like.
- the sensor detection information b1 includes information such as the position of the fingertip Uf in FIG. 17, for example the position coordinates (X, Y, Z) or (x, y) of the position P1. In this example, the two-dimensional coordinates (x, y) of the fingertip Uf, with the spatially floating image 3 regarded as a two-dimensional plane, are used as the sensor detection information b1.
- the phase shift circuit 623 uses the sensor detection information b1 (fingertip position information, etc.) to control the phase of the ultrasonic signal, that is, to change the phase of the ultrasonic drive signal c5 input to each ultrasonic element 63. As a result, for the ultrasonic waves emitted from the entire ultrasonic element array 61, a signal is generated that forms the point of maximum sound pressure (the focal point) according to the touch position or the like.
- the phase shift circuit 623 can be configured with, for example, a shift register. By changing the number of stages of the shift register, the number of phase types is not limited to eight and can be any number.
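The shift-register approach amounts to quantizing each ideal continuous delay onto one of eight taps of the 40 kHz carrier period. A minimal numerical sketch of that quantization (the function name and the rounding scheme are assumptions, not details from the embodiment):

```python
import math

CARRIER_FREQ = 40_000.0          # Hz, as in the embodiment
PERIOD = 1.0 / CARRIER_FREQ      # 25 microseconds
N_PHASES = 8                     # the embodiment's example tap count

def quantize_delay(delay_s):
    """Map a continuous delay onto one of N_PHASES discrete taps, as an
    N_PHASES-stage shift register clocked at N_PHASES x carrier would.
    Returns (tap index, realized delay in seconds)."""
    phase_fraction = (delay_s % PERIOD) / PERIOD        # in [0, 1)
    tap = round(phase_fraction * N_PHASES) % N_PHASES   # 0 .. 7
    return tap, tap * PERIOD / N_PHASES

tap, realized = quantize_delay(7.3e-6)   # an arbitrary example delay
```

A longer shift register (more taps) reduces the quantization error between the ideal and realized delay, which is why the text notes that the number of phase types can be increased.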
- the ultrasonic carrier signals c3 having the plurality of phases (eight types) output from the phase shift circuit 623 are input to the AM modulation circuit 624.
- the AM modulation circuit 624 has the function of superimposing an audio signal on the ultrasonic carrier signal c3; that is, it is a circuit for AM-modulating the ultrasonic carrier signal c3 with the audio signal.
- the audio signal b2 is input to the AM modulation circuit 624 from the input terminal.
- the audio signal b2 is a modulating audio signal for modulating the ultrasonic carrier signal c3.
- the AM modulation circuit 624 AM-modulates the ultrasonic carrier signal c3 with the audio signal b2 to obtain a modulated signal c4 (a modulated ultrasonic carrier signal).
- the audio signal b2 input to the AM modulation circuit 624 is a signal for generating a sound associated with an object on the floating image 3 that the user U has touched or otherwise operated with the fingertip Uf.
- this sound is, for example, an audible-band voice reading out the number (e.g., "1") displayed on the push-button object.
- the audio signal b2 may also be an audio signal such as a predetermined operation sound or an error warning sound for informing the user U that the object has been operated.
- the modulated ultrasonic signal c4 (modulated ultrasonic carrier signal) output from the AM modulation circuit 624 is input to the inductance circuit 625 .
- the inductance circuit 625 is a circuit composed of, for example, a coil, and generates N ultrasonic driving signals c5 corresponding to the N ultrasonic elements 63 based on the modulated ultrasonic signal c4.
- the generated N ultrasonic drive signals c5 are supplied to the N ultrasonic elements 63 of the ultrasonic element array 61 .
- since the AM modulation circuit 624 performs AM modulation with the audio signal b2, the ultrasonic waves emitted from the ultrasonic element array 61 have the audio signal superimposed on them.
- as a result, a sound corresponding to the audio signal b2 is emitted from the vicinity of the location where the user U operated the object of the floating image 3 with the fingertip Uf (for example, the touched position P1): when the ultrasonic waves hit the fingertip Uf, the sound is demodulated.
- the sound emitted from that point basically propagates in all directions and reaches the user's U ear UE.
- when the user U touches an object, the user U can obtain the above-described touch feeling and hear the sound related to the object from the vicinity of the object. Together, these allow the user U to recognize more reliably that the object has been touched.
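The AM modulation step described above, in which an audible signal rides on the 40 kHz carrier and is demodulated back into audible sound near the fingertip, can be sketched numerically as follows. The sample rate, modulation depth, and function name are illustrative assumptions, not values from the embodiment.

```python
import math

CARRIER_FREQ = 40_000.0    # Hz, the ultrasonic carrier (c3)
SAMPLE_RATE = 400_000.0    # Hz, an assumed simulation rate (10x carrier)

def am_modulate(audio, depth=0.8):
    """AM-modulate the 40 kHz carrier with an audible signal whose
    samples lie in [-1, 1]. Demodulation of the envelope near the focal
    point recovers the audible signal (the parametric-speaker effect)."""
    out = []
    for n, a in enumerate(audio):
        carrier = math.sin(2 * math.pi * CARRIER_FREQ * n / SAMPLE_RATE)
        out.append((1.0 + depth * a) * carrier)   # envelope x carrier
    return out

# 1 kHz test tone, 1 ms long, standing in for the voice signal b2.
tone = [math.sin(2 * math.pi * 1_000 * n / SAMPLE_RATE) for n in range(400)]
c4 = am_modulate(tone)     # the modulated ultrasonic signal
```

With no audio input (`audio` all zeros) the output is the bare carrier, matching the text's note that no sound is emitted when AM modulation is not performed.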
- FIG. 19 shows a configuration example of the inductance circuit 625, for the case of generating the ultrasonic drive signals c5 having the eight types of phases.
- the inductance circuit 625 is configured with a plurality of variable inductances whose inductance components can be varied.
- FIG. 19 shows an example in which eight ultrasonic signals c4 (modulated ultrasonic carrier signals) are input to the inductance circuit 625.
- the ultrasonic element array 61 is divided into eight regions based on the eight types of phase control.
- FIG. 20 shows an example in which the surface of the ultrasonic element array 61 is divided into eight regions corresponding to eight types of phases. In this example, eight regions are formed concentrically around the center point of the array.
- Ultrasonic driving signals c5 having different phases are input to the respective regions.
- One selected phase is associated with each region (in other words, phase group).
- the central area indicated by the solid line is the maximum sound pressure area (point M), and the sound pressure is lower toward the outer periphery in the radial direction.
- the plurality of ultrasonic elements 63 forming one phase-group region is also referred to as an ultrasonic element group, and the number of ultrasonic elements 63 forming the group is m; for example, m = 7 in the central region shown.
- let L be the inductance component of the inductance circuit 625 of FIG. 19, and let C be the capacitance component obtained by multiplying the capacitance of one ultrasonic element 63 by the number m of elements in the ultrasonic element group. Then, the inductance component L and the capacitance component C form an LC resonance circuit.
- the capacitance components C1 to C8 corresponding to the eight regions of the eight types of phases are illustrated as an equivalent circuit 1900 of the ultrasonic element group.
- the N signals are illustrated as signal lines grouped into eight phase groups.
- the input terminals 1902 actually have the aforementioned two input terminals for each ultrasonic element 63 of the phase group.
- Inductance circuit 625 has inductance components L1-L8 connected to capacitance components C1-C8.
- the inductance components L1 to L8 are adjusted so that the resonance frequency f of the LC resonance circuit is 40 kHz. The resonance frequency is determined by f = 1/(2π√(LC)).
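The resonance relation f = 1/(2π√(LC)) can be checked numerically. The per-element capacitance below is a hypothetical value chosen only to illustrate the calculation; the embodiment does not state the element capacitance.

```python
import math

def resonant_frequency(L, C):
    """Resonance frequency of an LC circuit: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

def inductance_for(f, C):
    """Solve the same relation for L, given a target frequency and
    the total capacitance C of the ultrasonic element group."""
    return 1.0 / (C * (2.0 * math.pi * f) ** 2)

# Hypothetical numbers: m = 7 elements of ~2.4 nF each, in parallel.
C_group = 7 * 2.4e-9                    # total capacitance C (farads)
L_needed = inductance_for(40_000.0, C_group)  # inductance for f = 40 kHz
```

Each of L1 to L8 would be tuned this way against its own group's capacitance (C1 to C8), which is why variable inductances are used.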
- the fingertip tactile sensation generating device 6 uses the ultrasonic phased array formed by the ultrasonic element array 61 to form a point or small area of maximum sound pressure (point M in FIG. 20) as the focal point.
- in the phase control using the phase shift circuit 623, when the distance between a given ultrasonic element 63 of the ultrasonic element array 61 and the point M is long, the phase is advanced; when the distance between the ultrasonic element 63 and the point M is short, the phase is delayed.
- as a result, the ultrasonic waves emitted from the ultrasonic elements 63 strengthen or weaken each other, so that the area of the point M, where the sound pressure level of the ultrasonic waves is maximum, can be formed as the focal point.
- as a more specific example of the sound pressure level: when an ultrasonic signal of 10 Vrms is applied to each ultrasonic element 63, it was confirmed that a sound pressure of about 0.6 gf (gram-force) is formed at a position about 20 cm above the ultrasonic element array 61 (along the ultrasonic path, perpendicular to the output-side plane). Note that this distance of about 20 cm is an example corresponding to the performance of the elements used, and the design is not limited to this. For example, to form a similar sound pressure at a longer distance of 30 cm, the voltage applied to the ultrasonic elements 63 should be raised to about 17 to 18 Vrms.
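As a rough sanity check on such figures, ideal coherent summation of N in-phase elements raises the focal pressure by 20·log10(N) dB over a single element. This is a textbook upper bound that ignores phase-quantization and directivity losses; the single-element figure is from the MA40S4S performance quoted above, and N = 223 follows the array example, but the calculation itself is only illustrative.

```python
import math

def focal_spl_estimate(spl_single_db, n_elements):
    """Upper-bound focal SPL for an ideal phased array: N in-phase
    pressure amplitudes add linearly, giving a 20*log10(N) dB gain
    over one element. Real arrays achieve somewhat less."""
    return spl_single_db + 20.0 * math.log10(n_elements)

# One MA40S4S gives ~120 dB SPL at 30 cm at 10 Vrms; assume N = 223.
spl = focal_spl_estimate(120.0, 223)
```

The resulting level (well above 160 dB at the focus in the ideal case) is consistent with such arrays producing a perceptible radiation force of a fraction of a gram-force on the skin.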
- the standard position where the maximum sound pressure is formed as described above is designed to match the plane of the spatially floating image 3 .
- the user U can obtain a sufficient operational feeling as a touch feeling when performing a touch operation on an object in the plane of the spatially floating image 3 with his or her fingertip Uf.
- the touch operation is an operation of touching an object by moving the fingertip Uf in the z-direction so as to penetrate from the front to the back of the plane of the floating image 3 .
- when the ultrasonic signal c3 is AM-modulated by the audio signal b2 (for example, a voice reading out a number) in the AM modulation circuit 624, the sound can be emitted from the vicinity of the point M of maximum sound pressure.
- that is, the audio signal is demodulated and emitted as a related sound from the user U's fingertip Uf and the vicinity of the object.
- when the fingertip Uf touches (in other words, is positioned at) an object formed by the spatially floating image 3, where nothing actually exists, the user U can nevertheless obtain a touch feeling as if physically touching something with the fingertip Uf.
- at the same time, the user U can hear the object-related sound emitted from the vicinity of the fingertip Uf. Thereby, the user U can more reliably recognize, from both touch and sound, that the operation performed was a touch operation.
- the ultrasonic signal generation circuit 62 generates an ultrasonic signal for each ultrasonic element 63 in the ultrasonic element array 61, and AM-modulates with the audio signal the signals of at least those ultrasonic elements 63 corresponding to the area where sound pressure is generated by the ultrasonic waves.
- when the audio signal b2 is not input to the AM modulation circuit 624, that is, when AM modulation is not performed, no sound is emitted from the location of the fingertip Uf.
- the control device 10 may set the audio signal b2 in association with the object or the operation, according to the content of the object of the spatially floating image 3. Further, for example, when operation A (for example, touch) and operation B (for example, swipe) are both possible for object A, sound A may be emitted when operation A is performed and sound B when operation B is performed.
- although the ultrasonic signal generation circuit 62 in FIG. 18 is illustrated as an example of a functional block configuration using analog circuits, the implementation is not limited to this.
- the ultrasonic signal generation circuit 62 may be implemented entirely with digital signal processing circuits.
- each circuit portion may be implemented with one or more dedicated circuits.
- each circuit unit may be implemented by software program processing.
- FIG. 21 shows an example of the positional relationship between the spatially floating image 3 and the fingertip tactile sensation generating device 6 when the user views the spatially floating image 3 produced by the spatially floating image display device 1000.
- the fingertip tactile sense generation device 6 is installed at the left and right central positions of the stand 52 (plate-shaped in this example, but not limited to this), and the cameras (5L, 5R) are arranged at the left and right positions.
- a linear sensor 4 is arranged on the front side of the base 51 of the housing 50, and a frame-shaped transparent structural member 60 is fixed obliquely upward from that point.
- the spatially floating image 3 is formed inside this frame.
- an object OB1 is displayed as part of the spatial floating image 3.
- the object OB1 is displayed, for example, at the central position within the plane (x-y) of the spatial floating image 3. Object OB1 is, for example, a push-button object, and is defined in advance as an object that receives a touch operation (or tap operation).
- FIG. 21 schematically shows how the user U touches the object OB1 with the index finger of the right hand as the fingertip Uf.
- the fingertip tactile sensation generation device 6 controls the phase of the ultrasonic drive signal c5 input to each ultrasonic element 63 of the ultrasonic element array 61, so that ultrasonic waves are generated from the array such that the maximum sound pressure occurs at the position of the user U's fingertip Uf on the spatially floating image 3, that is, at the touch position of the object OB1.
- a sound pressure of, for example, about 0.6 gf is generated at the fingertip Uf, and the user U can obtain a touch feeling of touching an object at the fingertip Uf, which is the touch position.
- the spatially floating image display device 1000 of the present embodiment is configured with the positional relationship between the spatially floating image 3 and the fingertip tactile sense generation device 6 shown in FIGS. 17 and 21. As a result, the fingertip tactile sensation generation device 6 can generate an ultrasonic touch sensation on the fingertip Uf of the user U without obstructing the optical path (such as the axis a2) that forms the spatially floating image 3.
- the touch position between the fingertip Uf of the user U and the object OB1 can be detected by the aerial operation detection unit 1350. The control device 10 grasps information such as the touch position (for example, the position P1 in FIG. 17) output by the aerial operation detection unit 1350.
- the fingertip tactile sense generation device 6 phase-controls the output of the ultrasonic waves based on the touch-position coordinate information received from the control device 10. That is, as shown in FIG. 17, a group of ultrasonic waves from the ultrasonic element array 61 is emitted toward the touch position, so that a focal point of sound pressure is formed at the touch position and a touch feeling is generated at the fingertip Uf. A touch feeling can be generated by similar control at any position in the main plane (x-y) of the spatial floating image 3 in FIG. 21.
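One control cycle of this detect-then-focus behaviour, from the sensed fingertip coordinates to an enable/focus command for the ultrasonic array, might look like the following sketch. The touch-band threshold, the command dictionary shape, and the function name are assumptions for illustration only.

```python
def haptics_command(sensed, plane_z=0.0, touch_band=0.01):
    """Map one sensor sample (x, y, z in metres, or None when nothing is
    detected) to a haptics command: focus ultrasound on the fingertip
    only while it is within touch_band of the floating-image plane."""
    if sensed is None:
        return {"enabled": False}          # no fingertip: array idle
    x, y, z = sensed
    if abs(z - plane_z) > touch_band:
        return {"enabled": False}          # hovering, not yet touching
    # Touch detected: aim the sound-pressure focal point at the fingertip.
    return {"enabled": True, "focus": (x, y, plane_z)}

# Fingertip 4 mm in front of the image plane: counts as a touch here.
cmd = haptics_command((0.02, -0.01, 0.004))
```

The phase shifts for the array would then be computed from `cmd["focus"]` on every cycle, so the focal point tracks the fingertip as it moves within the image plane.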
- when generating the touch feeling for the object OB1, the control device 10 and the fingertip tactile sense generation device 6 supply a predetermined audio signal corresponding to the object OB1 as the modulating audio signal b2 input to the AM modulation circuit 624 of FIG. 18. As a result, the audio signal is superimposed on the ultrasonic wave group, a predetermined sound is emitted from the vicinity of the touch position of the object OB1 by the fingertip Uf, and the user U can hear that sound.
- the predetermined audio signal associated with the object OB1 may be, for example, an operation sound (such as "pong") indicating that the push button was pressed, a voice reading out a number (such as "1") or a symbol written on the push button (for example, "ichi"), or a guidance voice or the like that is not written on the push button but is associated with it.
- FIG. 22 shows an example of installation of the fingertip tactile sensation generating device 6 and the super-directional speaker 30 in this form of combined use.
- in this form, left and right super-directional speakers 30L and 30R similar to those shown in FIG. 13 are provided in addition to the fingertip tactile sensation generating device 6 configuration described above.
- the control device 10 controls the fingertip tactile sense generation device 6 to generate the tactile sensation and voice, and controls the super-directional speakers 30 to generate sound.
- the user U can obtain a touch feeling at the fingertip Uf, and can hear the sound associated with the object OB1 from the super-directional speakers 30 (30L, 30R) near the user U, in a manner audible only to the user U and inaudible to others.
- the combined use mode is particularly effective when applied to a highly confidential system, such as an ATM device.
- the above mode of combined use may be applied to systems that do not require a high level of confidentiality, such as ticket vending machines at stations.
- in that case, a station-name selection button or the like is used as the object of the floating image 3, and when it is operated, a voice reading the station name is output as the sound corresponding to the object.
- for that audio output, either output by the fingertip tactile sense generation device 6 or output by the super-directional speaker 30 may be used.
- the super-directional speaker 30 when the super-directional speaker 30 is used, there is no fear that other people around the user U may overhear the information such as the station name, and a privacy-friendly ticket vending machine system can be configured.
- when transmitting a given sound to the user U, the super-directional speaker 30 is turned on and used; at that time, the sound output of the ultrasonic element array 61 is turned off and not used.
- the output from the super-directional speaker 30 and the sound output from the ultrasonic element array 61 are used selectively, for example according to the degree of confidentiality of the target sound.
- the super-directional speaker 30 is used when outputting a type of sound that should be highly confidential (for example, the sound of a personal identification number, etc.).
- the ultrasonic element array 61 is used when outputting a type of sound that does not require high confidentiality (for example, an operation sound).
- the floating image display device 1000 may use the camera 5 of the imaging unit 1180 to detect the face position and the like when outputting audio to the user U with the super-directional speaker 30, and output the sound from the super-directional speaker 30 toward the detected face position. This makes the super-directional speaker 30 more effective.
- a user U who views and operates the floating image 3 as a non-contact user interface can more reliably view objects such as push buttons in the floating image 3 without ghost images. Furthermore, when the user U touches an object, the user U obtains a touch feeling similar to that of touching a physical button, and can hear the sound associated with the object emitted from the vicinity of the fingertip. According to the embodiment, it is thus possible to provide a non-contact user interface that minimizes the risk of contact infection, has excellent visibility and operability, and reduces erroneous operations and erroneous inputs.
- with the spatially floating image display device of the embodiment, high-resolution, high-luminance image information can be displayed as an image floating in space.
- when this spatially floating image is used as a non-contact user interface including objects such as push buttons, the user can operate it easily, without the anxiety of infection from touching physical buttons.
- when the user touches an object displayed as a floating image with the fingertip, the user obtains a sensation (touch feeling, etc.) as if touching a physical button, and can hear sounds such as numbers associated with the object from the fingertip and the vicinity of the object. As a result, erroneous input by the user on the spatially floating image can be prevented or reduced.
- the fingertip tactile sense generation unit 1230 can generate a touch feeling on the fingertip.
- although the user U can visually recognize the object of the floating image 3, nothing but air actually exists at that position, so it is otherwise difficult to get a real feeling of operation. With the fingertip tactile sense generation unit 1230, when the user U touches the object, the user U perceives the sensation of touching something, and can recognize both visually and tactilely that the touch operation was performed (in other words, that the device side accepted the touch operation/input). Therefore, a non-contact user interface that is more suitable than conventional ones and superior in confidentiality of information can be realized.
- in addition, the fingertip tactile sense generation device 6 modulates the ultrasonic signal with the audio signal, thereby superimposing a sound related to the object on the ultrasonic waves. Thus, when the user U touches an object, not only is a touch feeling generated, but the sound related to the object can be heard from the vicinity of the fingertip Uf. That is, when the user U operates an object of the spatially floating image 3, the user U can recognize reliably, from sight, touch, and hearing, that the operation was performed, and a more suitable non-contact user interface can be realized.
- the spatially floating image display device of the embodiment, and the spatially floating images it produces, can be applied as a non-contact user interface for various applications such as ticketing, reception, authentication, and settlement in systems used by an unspecified number of users, for example public facilities such as stations, government offices, and hospitals, facilities such as banks and department stores, and devices such as cash registers and elevators. Examples of GUI objects include push buttons representing station names, personal identification numbers, destination floors, and the like; the interface is applicable not only to buttons but also to various objects such as slide bars.
- Such a contactless user interface eliminates the need for users to touch physical panels and buttons with their fingertips, thereby minimizing the risk of contact infection and allowing them to use applications without feeling uneasy.
- the visibility and operability of the non-contact user interface using the spatially floating image can be greatly improved compared to the conventional art.
- the spatially floating image display device of the embodiment includes the image display device 1 as an image source and the retroreflection member 2, and the divergence angle of the image light emitted toward the outside is made small, that is, acute.
- this spatially floating image display device has high efficiency of light utilization, and can greatly reduce the generation of ghost images that occur in addition to the main spatially floating image, which has been a problem with the conventional retroreflection method. It is possible to obtain a clear spatial floating image.
- This spatially floating image display device also includes an image display device 1 including a unique light source device 13 . As a result, it is possible to provide a novel and highly usable spatial floating image display device capable of significantly reducing power consumption.
- FIG. 23 shows the configuration of the spatially floating image display device 1000 of the modification as viewed from the side.
- this modification differs from the configuration of FIG. 17 in that the output path of the ultrasonic waves from the fingertip tactile sense generation device 6 is not the reflection type described above but a horizontal, straight path; it is configured so as to form a focal point on the fingertip Uf from the back side of the spatially floating image 3.
- the fingertip tactile sense generation device 6 is separate from the housing 50 and is installed on the wall 57 behind the floating image display device 1000 (or on the surface of another device or the like).
- the output plane of the ultrasonic element array 61 is in the vertical direction, and the output axis a4 of ultrasonic waves is in the horizontal direction. Even in this configuration, the path of the ultrasonic wave is designed so as not to overlap with the path of the image light (axis a2, etc.) from the retroreflective member 2, etc. in the housing 50 and not to obstruct them.
- the irradiation angle of the ultrasonic waves (in particular, of the sound-pressure focal point) with respect to the plane (x-y) of the spatially floating image 3 is, for example, about 45 degrees. Even with this configuration, sound pressure can be formed on the fingertip Uf, so a sufficient touch feeling or the like can be given.
- the ultrasonic element array 61 and the like may be provided inside the housing 50 .
- the ultrasonic element array 61 is arranged so as not to block elements such as the retroreflective member 2 inside the housing 50 .
- the transparent member 100 or a part of the housing 50 is provided with an opening or the like through which the ultrasonic waves pass.
- FIG. 24 shows a state in which the spatial floating image 3 and the like are viewed from the side in another modified example.
- in this modification, various operations other than touch operations are accepted as operations on the floating image 3 and its objects.
- FIG. 24 shows a spatially floating image area 3V as a three-dimensional area that includes the main plane (xy) of the spatially floating image 3 and extends outward.
- the spatially floating image area 3V is, for example, an area having a predetermined distance K1 in the front and rear directions in the z-direction perpendicular to the plane.
- the spatial floating image display device 1000 accepts various operations by the user U's finger UH on this spatial floating image area 3V.
- the spatially floating image display device 1000 detects the position, movement, etc. of the finger UH based on sensing by the sensor 4, the camera 5, etc., and determines various operations.
- the sensor 4 or the like detects the operation state including the three-dimensional position and motion of the finger UH with respect to the surface of the spatially floating image 3 or the spatially floating image region 3V.
- the distance in the z direction between the fingertip Uf and the surface of the spatially floating image 3 is, for example, the distance K2.
- the distance in the z direction between the fingertip Uf and the surface of the spatially floating image 3 is, for example, the distance K3.
- the control device 10 measures such a distance within the range of the predetermined spatial floating image area 3V, and obtains the positional coordinates (X, Y, Z) of the fingertip Uf in the space.
- the control device 10 can grasp the movement of the finger UH and the like by obtaining the position at each point in time series.
- finger UH operations include swipe operations, flick operations, and pinch operations.
- when applied to a swipe operation on an object, the position of focal point formation by the ultrasonic waves may be controlled so as to follow the movement of the fingertip Uf in the x and y directions during the swipe.
- These examples of operations correspond to operations on existing touch panels, but are not limited to these, and arbitrary operations (such as gestures) can be defined.
- the operation by the finger UH is not limited to the operation by one finger, and the operation by two fingers or the whole hand is similarly possible.
- the object used to operate the spatially floating image 3 is not limited to the finger UH; an object such as a pen held in the hand can be used similarly (in that case, however, no tactile sensation can be generated at the fingertip).
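As an illustrative sketch of how the operation determination described above might look in software (the thresholds, function names, and units here are assumptions for illustration; the document does not specify an algorithm), a time series of fingertip positions sensed by the sensor 4 or camera 5 can be reduced to an operation label:

```python
import math

# Illustrative gesture classifier for fingertip positions (X, Y, Z) sampled
# over time. K1 and the speed/distance cutoffs are hypothetical values.
K1 = 30.0                # mm: assumed front/rear depth of region 3V
SWIPE_MIN_DIST = 40.0    # mm: minimum in-plane travel to count as a swipe
FLICK_MIN_SPEED = 300.0  # mm/s: minimum mean speed to count as a flick

def classify(samples):
    """samples: list of (t, x, y, z); z is distance from the image plane."""
    inside = [s for s in samples if abs(s[3]) <= K1]
    if len(inside) < 2:
        return "none"          # finger never entered region 3V
    t0, x0, y0, _ = inside[0]
    t1, x1, y1, _ = inside[-1]
    dist = math.hypot(x1 - x0, y1 - y0)
    speed = dist / (t1 - t0) if t1 > t0 else 0.0
    if speed >= FLICK_MIN_SPEED:
        return "flick"
    if dist >= SWIPE_MIN_DIST:
        return "swipe"
    return "touch"

# A slow 60 mm horizontal drag close to the image plane reads as a swipe.
track = [(0.0, 0, 0, 5), (0.2, 20, 0, 5), (0.4, 40, 0, 5), (0.6, 60, 0, 5)]
print(classify(track))  # swipe
```

In a real device the same position stream would also drive the ultrasonic focal point, as described for the swipe operation above.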
- the control of focus formation by the ultrasonic waves from the fingertip tactile sensation generation device 6 is not limited to control within the plane (xy) described above; control in the z-direction perpendicular to the plane is also possible.
- for example, when the fingertip Uf is at distance K2 or K3 from the surface, the ultrasonic waves may be focused according to that distance; based on the phase control described above, movement of the focal point in the z-direction is possible.
- when the finger UH does not touch the main surface of the spatially floating image 3 but is sufficiently close to it, the device may determine that a predetermined operation such as a touch operation has been performed and generate a tactile sensation accordingly.
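The phase control mentioned above corresponds to standard phased-array focusing: each element is delayed so that all wavefronts arrive at the focal point simultaneously. The sketch below is generic (the element layout, 40 kHz frequency, and speed of sound are assumed values; the document does not give its control equations):

```python
import math

SPEED_OF_SOUND = 346_000.0  # mm/s in air (assumed, ~25 degrees C)
FREQ = 40_000.0             # Hz, a typical airborne ultrasonic frequency

def phase_delays(elements, focus):
    """Phase delay (radians) per element so the array focuses at `focus`.

    elements: list of (x, y, z) transducer positions in mm
    focus:    (x, y, z) target point in mm, e.g. the fingertip Uf
    """
    dists = [math.dist(e, focus) for e in elements]
    far = max(dists)
    # Elements closer to the focus fire later (larger delay) so that all
    # wavefronts arrive at the focal point at the same time.
    return [2 * math.pi * FREQ * (far - d) / SPEED_OF_SOUND for d in dists]

# Three elements along x; focus 100 mm in front of the center element.
elems = [(-10.0, 0.0, 0.0), (0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
delays = phase_delays(elems, (0.0, 0.0, 100.0))
# The symmetric outer elements get equal delay; the center element, being
# closest to the focus, gets the largest delay.
```

Moving the focal point in z (for example, to distance K2 or K3) simply means recomputing the delays with the new focal point.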
- FIG. 25 is a graph showing, with the panel size (screen ratio 16:10) as a parameter, the viewing distance L from the user to the panel and the convergence angles for the panel long side and the panel short side.
- if the panel is used vertically, the convergence angle can be set according to the short side. For example, if a 22-inch panel is used vertically and the viewing distance is 0.8 m, the convergence angle is 10 degrees, and the image light from the four corners of the screen can be effectively directed to the user.
- likewise, the image light from the four corners of the screen can be effectively directed to the user if the convergence angle is set to 7 degrees.
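The figures above can be reproduced with simple trigonometry if the convergence angle is read as the half-angle subtended at the user by the panel side facing them (this is our interpretation of the graph, not a formula stated in the document):

```python
import math

def convergence_angle_deg(diagonal_inch, aspect=(16, 10), distance_m=0.8,
                          vertical=True):
    """Half-angle subtended at the user by the panel's horizontal extent.

    With a 16:10 panel used vertically (portrait), the short side spans
    the horizontal direction.
    """
    w, h = aspect
    diag_m = diagonal_inch * 0.0254
    unit = diag_m / math.hypot(w, h)
    side_m = h * unit if vertical else w * unit
    return math.degrees(math.atan((side_m / 2) / distance_m))

# A 22-inch 16:10 panel used vertically at 0.8 m: about 10.5 degrees,
# consistent with the "10 degrees" cited above.
print(round(convergence_angle_deg(22), 1))
```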
- the overall brightness of the screen can be improved.
- the light around the periphery of the screen of the liquid crystal display panel 11 is directed inward so that it heads toward the user. Furthermore, taking the average distance between the eyes of an adult as 65 mm, the luminance difference in the horizontal direction of the screen of the spatially floating image caused by the parallax between the left eye and the right eye was obtained using the viewing distance as a parameter; the results are shown in the figure. When the shortest viewing distance in normal use is 0.8 m, the brightness difference due to parallax corresponds to the sum of the viewing angle difference (5 degrees) and the convergence angle (7 degrees) on the long side shown in the graph, so a light source device having characteristics such that the relative luminance does not fall below 50% at 12 degrees may be used.
- since the user's line of sight can shift, if the more stringent brightness difference due to parallax in the long-side direction is taken into account, the overall brightness of the screen can be improved.
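The 12-degree criterion can be checked arithmetically from the two values given above: the binocular parallax angle at the shortest viewing distance plus the long-side convergence angle (a simple sketch; the 65 mm and 0.8 m inputs come from the text):

```python
import math

EYE_SEPARATION_M = 0.065  # average adult interocular distance (from the text)
VIEW_DISTANCE_M = 0.8     # shortest viewing distance in normal use

# Angle subtended by the interocular distance at the viewing distance.
parallax_deg = math.degrees(math.atan(EYE_SEPARATION_M / VIEW_DISTANCE_M))

# Rounded parallax (about 5 degrees) plus the long-side convergence
# angle (7 degrees) gives the 12-degree design criterion.
total_deg = round(parallax_deg) + 7
print(round(parallax_deg, 1), total_deg)  # 4.6 12
```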
- the grid-structure reflective polarizing plate according to the embodiment (for example, the reflective polarizing plate constituting the polarization separation member 101 in FIG. 2) has degraded characteristics for light from a direction perpendicular to the polarization axis, and its characteristics in the horizontal direction are similarly degraded for oblique light. It is therefore desirable that the reflective polarizing plate have specifications aligned with the polarization axis, and the light source device 13 of the embodiment, which can emit the image light from the liquid crystal display panel 11 at a narrow angle, is an ideal light source for it.
- the light source device 13 capable of emitting image light from the liquid crystal display panel 11 at a narrower angle is used as the backlight of the liquid crystal display panel 11 . This makes it possible to provide high-contrast spatial floating images. This configuration example will be described below.
- the display device 1 includes a liquid crystal display panel 11 as an image display element and a light source device 13 as a light source.
- FIG. 27 shows the light source device 13 together with the liquid crystal display panel 11 as an exploded perspective view.
- the light from the light source device 13, which is a backlight device, gives the emission from the liquid crystal display panel 11 narrow-angle diffusion characteristics, as indicated by the direction of the arrow 3000; an illumination light flux with characteristics similar to those of laser light, with the plane of polarization aligned in one direction, is obtained.
- based on the illumination light flux, the liquid crystal display panel 11 emits image light modulated according to an input image signal. The image light is then reflected by the retroreflective member 2 and transmitted through the transparent member 100 to form the spatially floating image 3 as a real image.
- the display device 1 of FIG. 27 is configured with a liquid crystal display panel 11, a light direction conversion panel 54 for controlling the directivity of the light flux emitted from the light source device 13, and, if necessary, a narrow-angle diffusion plate (not shown). Polarizing plates are provided on both sides of the liquid crystal display panel 11, and image light of a specific polarized wave is emitted after the intensity of the light is modulated according to the image signal (arrow 3000). As a result, the desired image is projected toward the retroreflective member 2 through the light direction conversion panel 54 as highly directive light of a specific polarized wave, reflected by the retroreflective member 2, and directed to the user's eyes to form the spatially floating image 3.
- a protective cover 250 shown in FIGS. 28 and 29 may be provided on the surface of the light direction conversion panel 54 described above.
- in order to improve the utilization efficiency of the luminous flux (arrow 3000) emitted from the light source device 13 and to significantly reduce power consumption, the display device is configured with the light source device 13 and the liquid crystal display panel 11.
- light (arrow 3000) from the light source device 13 is projected toward the retroreflective member 2 and reflected by it; the directivity can then be controlled by a transparent sheet (not shown) provided on the surface of the transparent member 100 so that the spatially floating image is formed at a desired location.
- the transparent sheet is configured by optical parts such as a Fresnel lens or a linear Fresnel lens, so that the imaging position of the spatially floating image can be controlled while imparting high directivity.
- the image light from the display device 1 efficiently reaches the user with high directivity like laser light, and as a result, a high-quality spatial floating image can be displayed with high resolution.
- power consumption by the display device 1 including the LED elements 201 of the light source device 13 can be significantly reduced.
- FIG. 27 shows an example of a specific configuration of the display device 1.
- FIG. 28 is a cross-sectional view showing an example of a specific configuration of the light source device 13 (corresponding to the light source 1105 in FIG. 1) in FIG. 27.
- the liquid crystal display panel 11 and the light direction conversion panel 54 are arranged on the light source device 13.
- the case of the light source device 13 shown in FIG. 27 is formed of, for example, plastic, and the LED elements 201 and the light guide 203 (corresponding to the light guide 1104 in FIG. 1) are housed in it.
- in order to convert the divergent light from each LED element 201 into a substantially parallel light flux, the light receiving portion at the end face of the light guide 203 gradually increases in cross-sectional area, and has a lens shape whose effect is to gradually decrease the divergence angle through multiple total reflections as the light propagates inside.
- a liquid crystal display panel 11 is attached to the upper surface of the light guide 203 .
- An LED element 201 as a semiconductor light source and an LED substrate 202 on which a control circuit for the LED element 201 is mounted are attached to one side surface (the left end surface in this example) of the case of the light source device 13 .
- a heat sink which is a member for cooling the heat generated by the LED elements and the control circuit, may be attached to the outer surface of the LED substrate 202 .
- the liquid crystal display panel 11 is attached to a frame (not shown) on the upper surface of the case of the light source device 13, together with a flexible wiring board (FPC, not shown) and the like electrically connected to the liquid crystal display panel 11. That is, the liquid crystal display panel 11, which is a liquid crystal display element, together with the LED elements 201, which are solid-state light sources, modulates the intensity of transmitted light based on a control signal from a control circuit (not shown) constituting the electronic device, and thereby generates a display image. Since the image light generated at this time has a narrow diffusion angle and contains only a specific polarized wave component, an image display device close to a surface-emitting laser image source driven by a video signal can be obtained.
- since FIGS. 28 and 29 are cross-sectional views, only one of the LED elements 201 constituting the light source is shown.
- the shape of the light receiving end face 203a of the light guide 203 converts incident light into substantially parallel light (collimated light). For this reason, the light receiving portion on the end surface of the light guide 203 and the LED element 201 are attached while maintaining a predetermined positional relationship.
- Each of the light guides 203 is made of translucent resin such as acrylic.
- the LED light-receiving surface at the end of the light guide 203 has, for example, a conical convex outer peripheral surface obtained by rotating the parabolic cross section.
- a convex lens surface protruding outward (or a concave lens surface recessed inward) is provided at the center of the flat surface (not shown).
- the outer shape of the light receiving portion of the light guide 203 to which the LED element 201 is attached is a paraboloid forming a conical outer peripheral surface, and the angles of the reflective surface and the paraboloid are set so that light emitted from the LED element 201 in the peripheral direction strikes them at angles allowing total reflection.
- the LED elements 201 are arranged at predetermined positions on the surface of the LED board 202, which is the circuit board.
- the LED substrate 202 is fixed to the LED collimator (light-receiving end surface 203a) so that the LED elements 201 on the surface thereof are located in the central portion of the recesses described above.
- the shape of the light receiving end surface 203a of the light guide 203 makes it possible to extract the light emitted from the LED element 201 as substantially parallel light, thereby improving the utilization efficiency of the generated light.
- the light source device 13 is configured by attaching a light source unit, in which a plurality of LED elements 201 are arranged, to the light receiving end surface 203a, which is the light receiving portion provided on the end surface of the light guide 203. The divergent light from the LED elements 201 is converted into substantially parallel light by the lens shape of the light receiving end surface 203a, guided inside the light guide 203 as indicated by the arrow, and then emitted by the luminous flux direction conversion means 204 toward the liquid crystal display panel 11 arranged substantially parallel to the light guide 203.
- the luminous flux direction conversion means 204 is configured by shaping the surface of the light guide 203 or by providing portions with a different refractive index inside it, so that the light flux propagating in the light guide 203 is emitted toward the liquid crystal display panel 11 arranged substantially parallel to the light guide 203.
- with the viewing point facing the center of the screen at a distance equal to the diagonal dimension of the screen, there is no practical problem if the relative luminance of the peripheral part of the screen is 20% or more of that of the center, and the characteristics are even better if it exceeds 30%.
- the light source device 13 includes, for example, a light guide 203 formed of plastic or the like and provided with the luminous flux direction conversion means 204 on its surface or inside, LED elements 201 as light sources, a reflective sheet 205, a reflective polarizing plate 206, and lenticular lenses.
- a liquid crystal display panel 11 having polarizing plates on the light source light incident surface and the image light output surface is attached to the upper surface of the light source device 13 .
- a film or sheet-like reflective polarizing plate 49 is provided on the light source light incident surface (lower surface) of the liquid crystal display panel 11 corresponding to the light source device 13 .
- the reflective polarizing plate 49 selectively reflects one polarized wave (for example, the P wave) WAV2 of the natural light flux 210 emitted from the LED elements 201; the reflected light is then reflected by the reflective sheet 205 provided on one (lower) surface of the light guide 203 and heads toward the liquid crystal display panel 11 again. A λ/4 plate, which is a retardation plate, is provided between the reflective sheet 205 and the light guide 203, or between the light guide 203 and the reflective polarizing plate 49, so that the light reflected by the reflective sheet 205 passes through it twice and is converted into the polarized wave that is transmitted through the reflective polarizing plate 49.
- the image light beam (arrow 213 in FIG. 28) whose light intensity is modulated by the image signal in the liquid crystal display panel 11 is incident on the retroreflective member 2, and, as shown in the figure, a spatially floating image 3, which is a real image, is obtained outside.
- the light source device 13 of FIG. 29 also includes a light guide 203 formed of plastic or the like and provided with the luminous flux direction conversion means 204 on its surface or inside, LED elements 201 as light sources, a reflective sheet 205, a reflective polarizing plate 206, a lenticular lens, and the like.
- the liquid crystal display panel 11 is attached as an image display element, and has polarizing plates on the light source light entrance surface and the image light exit surface.
- a film or sheet-like reflective polarizing plate 49 is provided on the light source light incident surface (lower surface) of the liquid crystal display panel 11 corresponding to the light source device 13 .
- one polarized wave (for example, the S wave) WAV1 of the natural light flux emitted from the LED elements 201 is selectively reflected by the reflective polarizing plate 49, reflected by the reflective sheet 205 provided on one (lower) surface of the light guide 203, and heads toward the liquid crystal display panel 11 again.
- a λ/4 plate, which is a retardation plate, is provided between the reflective sheet 205 and the light guide 203, or between the light guide 203 and the reflective polarizing plate 49; the polarized light on one side is reflected by the reflective sheet 205 and passes through the λ/4 plate twice, thereby converting the reflected light flux from S-polarized light to P-polarized light.
- the image light flux (arrow 214 in FIG. 29) whose light intensity is modulated by the image signal in the liquid crystal display panel 11 is incident on the retroreflective member 2, and, as shown in the figure, generates a spatially floating image 3, which is a real image, outside.
- in the light source device 13 shown in FIGS. 28 and 29, in addition to the action of the polarizing plate provided on the light incident surface of the corresponding liquid crystal display panel 11, the reflective polarizing plate reflects the polarized component on one side.
- the resulting contrast ratio is the product of the reciprocal of the cross transmittance of the reflective polarizing plate and the reciprocal of the cross transmittance of the two polarizing plates attached to the liquid crystal display panel 11.
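As a numeric sketch of that relation (the transmittance values below are illustrative assumptions, not measurements from the document):

```python
# Contrast ratio as the product of the reciprocals of the two cross
# transmittances. Both values here are hypothetical examples.
cross_reflective = 0.02  # cross transmittance of the reflective polarizing plate
cross_panel = 0.001      # combined cross transmittance of the panel's polarizers

contrast_ratio = (1 / cross_reflective) * (1 / cross_panel)
print(contrast_ratio)  # 50 x 1000 = 50000
```

The multiplicative form is why adding the reflective polarizing plate raises the contrast well beyond what the panel's own polarizers achieve alone.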
- FIG. 30 shows another example of a specific configuration of the display device 1.
- the light source device 13 in FIG. 30 is similar to the light source device in FIG. 29 and the like.
- the light source device 13 is configured by housing an LED, a collimator, a synthetic diffusion block, a light guide, etc. in a case made of plastic or the like.
- a liquid crystal display panel 11 is attached to the upper surface of the light source device 13 .
- an LED element, which is a semiconductor light source, and an LED substrate on which a control circuit for the LED element is mounted are attached to the case.
- a heat sink 103 which is a member for cooling the heat generated by the LED elements and the control circuit, is attached to the outer surface of the LED substrate.
- a frame attached to the upper surface of the case holds the liquid crystal display panel 11 and the FPC or the like electrically connected to it. That is, the liquid crystal display panel 11, which is a liquid crystal display element, together with the LED element, which is a solid-state light source, generates a display image by modulating the intensity of transmitted light based on a control signal from a control circuit constituting the electronic device.
- the light source device of the display device 1 converts a divergent luminous flux of natural light (mixed P-polarized and S-polarized waves) from the LEDs (LED elements on the LED substrate) into a substantially parallel luminous flux with the LED collimator 15, and this substantially parallel luminous flux is reflected toward the liquid crystal display panel 11 by the reflective light guide 304.
- the reflected light is incident on the wave plate and the reflective polarizing plate 49 arranged between the liquid crystal display panel 11 and the reflective light guide 304 .
- a specific polarized wave (for example, the S-polarized wave) is reflected by the reflective polarizing plate 49, phase-converted by the wave plate, reflected back at the reflecting surface, passes through the retardation plate again, and is thereby converted into the polarized wave (for example, the P-polarized wave) that is transmitted through the reflective polarizing plate 49.
- the natural light from the LED is aligned to a specific polarized wave (for example, P-polarized wave), and the specific polarized wave is incident on the liquid crystal display panel 11, is luminance-modulated in accordance with the video signal, and displays the video on the panel surface.
- the configuration of FIG. 31 has a plurality of LEDs constituting the light source, as in the previous example, but since FIG. 31 is a vertical cross-sectional view, only one LED is illustrated. These LEDs are attached at predetermined positions with respect to the LED collimator 15 .
- Each LED collimator 15 is made of translucent resin such as acrylic or glass.
- the LED collimator 15 has a conical convex outer peripheral surface obtained by rotating the parabolic cross section, and has a concave portion at the top thereof with a convex portion (that is, a convex lens surface) formed in the center portion.
- the central portion of the planar portion has a convex lens surface that protrudes outward (or may be a concave lens surface that is recessed inward).
- the paraboloid that forms the conical outer peripheral surface of the LED collimator 15 is set within the range of angles that allow total internal reflection of the light emitted from the LED in the peripheral direction, or a reflective surface is formed on it.
- the configuration described above is similar to that of the light source device 13 of the image display device 1 shown in FIGS. 28 and 29 and the like. Furthermore, of the light converted into substantially parallel light by the LED collimator 15 and reflected by the reflective light guide 304, one polarized wave is transmitted through the reflective polarizing plate 49, while the other polarized wave passes through the reflective light guide 304 again and is reflected by the reflector 271 provided on the surface of the light guide that is not in contact with the liquid crystal display panel 11. At this time, the polarization is converted by passing twice through the retardation plate (λ/4 plate) 270 arranged between the reflector 271 and the liquid crystal display panel 11; the light then passes through the reflective light guide 304 again, is transmitted through the reflective polarizing plate 49 provided on its surface, and enters the liquid crystal display panel 11 with its polarization direction aligned. As a result, all the light from the light source can be used, and the light utilization efficiency is greatly improved (for example, doubled).
- conventionally, the emitted light from the liquid crystal display panel 11 had similar diffusion characteristics in both the horizontal direction of the screen (X-axis, (a) of FIG. 38) and the vertical direction of the screen (Y-axis, (b) of FIG. 38).
- in contrast, the diffusion characteristic of the emitted light flux from the liquid crystal display panel 11 of the present embodiment is, for example, as shown in Example 1 in FIG. 38: the viewing angle at which the luminance is 50% of the front-view luminance is 13 degrees, about 1/5 of the conventional 62 degrees.
- the viewing angle in the vertical direction is made uneven, and the reflection angle and the reflective-surface area of the reflective light guide are adjusted and optimized so that the viewing angle on the upper side is suppressed to about 1/3 of that on the lower side. As a result, compared to conventional liquid crystal TVs, the amount of image light directed toward the viewing direction is greatly increased, and the luminance is 50 times or more.
- in another example, the viewing angle at which the luminance is 50% of the front-view luminance is 5 degrees, which is about 1/12 of the conventional 62 degrees.
- in this case the viewing angle in the vertical direction is uniform, and the reflection angle and the reflective-surface area of the reflective light guide are optimized so that the viewing angle is suppressed to about 1/12 of the conventional angle.
- the amount of image light directed toward the viewing direction is greatly improved, and the luminance is 100 times or more.
- <Configuration example 1 of light source device> A configuration example of the optical system such as the light source device 13 housed in the case will be described in detail with reference to FIGS. 33 and 34 along with FIG. 32. FIGS. 32 to 34 show the LED elements 14 (14a, 14b) constituting the light source, which are attached to the LED collimator 15 at predetermined positions.
- Each of the LED collimators 15 is made of translucent resin such as acrylic.
- the LED collimator 15 has a conical convex outer peripheral surface 156 obtained by rotating the parabolic cross section, at the top of which a convex portion (that is, a convex lens surface) 157 is formed at the center.
- the paraboloid that forms the conical outer peripheral surface 156 of the LED collimator 15 is set within the range of angles that allow total internal reflection of the light emitted from the LED element 14 in the peripheral direction, or a reflective surface is formed on it.
- the LED elements 14 are arranged at predetermined positions on the surface of the LED board 102, which is the circuit board.
- This LED board 102 is arranged and fixed to the LED collimator 15 so that the LED elements 14 (14a, 14b) on its surface are positioned at the center of the recesses 153, respectively.
- in particular, the light emitted upward (to the right in the figure) from the central portion of the LED element 14 is condensed into parallel light by the two convex lens surfaces 157 and 154 formed in the outer shape of the LED collimator 15.
- the light emitted in the peripheral direction from other portions is reflected by the parabolic surface that forms the conical outer peripheral surface of the LED collimator 15 and is similarly condensed into parallel light.
- by means of the LED collimator 15, which has a convex lens in its central part and a parabolic surface in its peripheral part, almost all the light generated by the LED element 14 can be extracted as parallel light, making it possible to improve the utilization efficiency of the generated light.
- a polarization conversion element 2100 is provided on the light exit side of the LED collimator 15 .
- the polarization conversion element 2100 combines columnar translucent members having a parallelogram cross section (parallelogram prisms) with columnar translucent members having a triangular cross section (triangular prisms); a plurality of these are arranged in an array parallel to a plane perpendicular to the optical axis of the parallel light from the LED collimator 15.
- a polarizing beam splitter (PBS film) 211 and a reflective film 212 are alternately provided on the interface between the adjacent translucent members arranged in an array.
- a ⁇ /2 phase plate 213 is provided on the exit surface from which light that has entered and passed through the PBS film 211 is emitted.
- a rectangular synthesizing diffusion block 16, also shown in the figure, is provided beyond the polarization conversion element 2100. That is, the light emitted from the LED element 14 is collimated by the action of the LED collimator 15, enters the synthesizing diffusion block 16, and reaches the light guide 17 after being diffused by the texture 161 on the output side.
- the light guide 17 is a rod-shaped member with a substantially triangular cross section made of translucent resin such as acrylic.
- the light guide 17 has a light guide light entrance portion (including a light guide light entrance surface) 171 facing the output surface of the synthesizing diffusion block 16 via the first diffusion plate 18a, a light guide light reflecting portion (including a light guide light reflecting surface) 172 forming an inclined surface, and a light guide light emitting portion (including a light guide light emitting surface) 173 facing the liquid crystal display panel 11, which is a liquid crystal display element, via the second diffusion plate 18b.
- as shown in FIG. 36, which is a partially enlarged view of the light guide light reflecting portion (surface) 172 of the light guide 17, a large number of reflecting surfaces 172a and connecting surfaces 172b are alternately formed in a sawtooth shape.
- each reflecting surface 172a (a line segment rising to the right in the drawing) forms an angle αn (n: natural number, 1 to 130 in this example) with respect to the horizontal plane indicated by the dashed line in FIG. 36; here, the angle αn is set to 43 degrees or less (but 0 degrees or more).
- the light guide light entrance portion (surface) 171 is formed in a curved convex shape inclined toward the light source side. The parallel light from the output surface of the synthesizing diffusion block 16 is diffused by the first diffusion plate 18a and enters it; while being slightly bent (in other words, deflected) upward by the entrance portion 171, the light reaches the light guide light reflecting portion (surface) 172, is reflected there, and reaches the liquid crystal display panel 11 provided on the upper emitting surface.
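The 43-degree bound on the reflecting-surface angle is consistent with the critical angle for total internal reflection in an acrylic light guide. A quick check (the refractive index 1.49 is a typical value for acrylic/PMMA, not a figure given in this document):

```python
import math

N_ACRYLIC = 1.49  # assumed refractive index of acrylic (PMMA)

# Critical angle for total internal reflection at an acrylic-air interface.
critical_deg = math.degrees(math.asin(1 / N_ACRYLIC))
print(round(critical_deg, 1))  # just over 42 degrees, below the 43-degree bound
```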
- with the display device 1, it becomes possible to further improve the light utilization efficiency and the uniformity of the illumination characteristics, and to manufacture the device, including a modularized light source device for S-polarized waves, in a small size at low cost.
- the polarization conversion element 2100 is attached after the LED collimator 15, but its position is not limited to this; a similar effect can be obtained with other positions in the optical path leading to the liquid crystal display panel 11.
- the light guide light reflecting portion (surface) 172 has a large number of reflecting surfaces 172a and connecting surfaces 172b alternately formed in a sawtooth shape. Further, the light guide light emitting portion (surface) 173 is provided with a narrow-angle diffusion plate, and the diffused light flux is incident from an oblique direction on the light direction conversion panel 54 that controls the directivity characteristics, and then enters the liquid crystal display panel 11.
- the light direction changing panel 54 is provided between the light guide body light emitting portion 173 and the liquid crystal display panel 11, but even if it is provided on the emitting surface of the liquid crystal display panel 11, the same effect can be obtained.
- FIG. 35 shows a plurality of (two in this example) LED elements 14 (14a, 14b) constituting the light source, attached at predetermined positions with respect to the LED collimator 15 as in the previous example. Each of the LED collimators 15 is made of translucent resin such as acrylic. As in FIG. 33, this LED collimator 15 has a conical convex outer peripheral surface 156 obtained by rotating the parabolic cross section, at the top of which a convex portion (that is, a convex lens surface) 157 is formed at the center.
- the paraboloid that forms the conical outer peripheral surface 156 of the LED collimator 15 is set within the range of angles that allow total internal reflection of the light emitted from the LED element 14a in the peripheral direction, or a reflective surface is formed on it.
- the LED elements 14 are arranged at predetermined positions on the surface of the LED board 102, which is the circuit board.
- the LED substrate 102 is arranged and fixed to the LED collimator 15 so that the LED elements 14 on its surface are positioned in the central portion of the recesses 153 .
- by means of the LED collimator 15, which has a convex lens in its central portion and a parabolic surface formed in its peripheral portion, almost all of the light generated by the LED element 14 can be extracted as parallel light, making it possible to improve the utilization efficiency of the generated light.
- a light guide 170 is provided on the light emitting side of the LED collimator 15 via the first diffusion plate 18a.
- the light guide 170 is a rod-shaped member with a substantially triangular cross section made of a translucent resin such as acrylic.
- the light guide 170 includes a light guide light incident portion 171, which faces the output surface of the combined diffusion block 16 via the first diffusion plate 18a; a light guide light reflecting portion 172, which forms an inclined surface; and a light guide light emitting portion 173, which faces the liquid crystal display panel 11 with the reflective polarizing plate 200 interposed therebetween.
- for the reflective polarizing plate 200, if, for example, a material that reflects P-polarized light and transmits S-polarized light is selected, the P-polarized component of the natural light emitted from the LED light source is reflected by the reflective polarizing plate 200, passes through the λ/4 plate 201a provided on the light guide light reflecting portion 172 shown in FIG. 34, is reflected by the reflecting surface 201b, and passes through the λ/4 plate 201a again, whereby it is converted into S-polarized light. As a result, all light beams incident on the liquid crystal display panel 11 are unified into S-polarized light.
- conversely, if the reflective polarizing plate 200 is selected to reflect S-polarized light and transmit P-polarized light, the S-polarized component of the natural light emitted from the LED light source is reflected by the reflective polarizing plate 200, passes through the λ/4 plate 201a provided on the light guide light reflecting portion 172 shown in FIG. 34, is reflected by the reflecting surface 201b, and passes through the λ/4 plate 201a again, whereby it is converted into P-polarized light. As a result, all light beams incident on the liquid crystal display panel are unified into P-polarized light.
- Polarization conversion can also be achieved with the configuration described above.
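The conversion described above, reflection at the reflective polarizing plate followed by a double pass through the λ/4 plate and reflection at the mirror surface, can be checked with Jones calculus. The following is a minimal sketch, not taken from the publication itself: it assumes an ideal quarter-wave plate with its fast axis at 45° to the polarization axes, models the mirror reflection as an identity, and ignores global phases.

```python
# 2x2 Jones matrices represented as nested lists of Python complex numbers
def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(m, v):
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

# Quarter-wave plate with its fast axis at 45 degrees:
# Q = R(45) . diag(1, i) . R(-45), global phase omitted
Q45 = [[(1 + 1j) / 2, (1 - 1j) / 2],
       [(1 - 1j) / 2, (1 + 1j) / 2]]

# Plate -> mirror (modeled as identity) -> plate again acts as a
# half-wave plate at 45 degrees, which swaps the two linear polarizations.
double_pass = matmul(Q45, Q45)

s_pol = [0, 1]                    # S-polarized input beam
out = apply(double_pass, s_pol)   # leaves as [1, 0], i.e. P-polarized
```

Running the same product on a P-polarized input `[1, 0]` returns `[0, 1]`, confirming that the double pass swaps S- and P-polarization, as stated for both orientations of the reflective polarizing plate above.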
- ⁇ Configuration example 3 of light source device> Another example of the configuration of the optical system such as the light source device 13 will be described with reference to FIG.
- a diverging luminous flux of natural light (a mixture of P-polarized and S-polarized light) from the LEDs on the LED substrate 102 is converted into a substantially parallel luminous flux by the LED collimator 15 and reflected toward the liquid crystal display panel 11 by the reflective light guide 304.
- the reflected light enters the reflective polarizing plate 206 arranged between the liquid crystal display panel 11 and the reflective light guide 304.
- a specific polarized wave (for example, S-polarized light) is reflected by the reflective polarizing plate 206, passes through the surface connecting the reflecting surfaces of the reflective light guide 304, is reflected by the reflecting plate 271 arranged facing the opposite surface of the reflective light guide 304, passes through the retardation plate (λ/4 plate) 270 twice so that its polarization is converted, passes through the light guide and the reflective polarizing plate, enters the liquid crystal display panel 11, and is modulated into image light.
- by aligning the polarization plane of the specific polarized wave with that of the polarization-converted wave, the light utilization efficiency is doubled, and the degree of polarization (extinction ratio) of the reflective polarizing plate also contributes to the extinction ratio of the entire system. Therefore, by using the light source device 13 of the present embodiment, the contrast ratio of the information display system can be greatly improved.
- the natural light from the LED is aligned with a specific polarization (for example, P polarization).
- a plurality of LEDs (only one is shown in FIG. 31 because it is a longitudinal section) are provided to form a light source, and these are attached to the LED collimator 15 at predetermined positions.
- Each of the LED collimators 15 is made of translucent resin such as acrylic or glass.
- the LED collimator 15 has a conical convex outer peripheral surface obtained by rotating a parabolic cross section, and at its top has a recess in whose central portion a convex portion (that is, a convex lens surface) is formed.
- the central portion of the planar portion has a convex lens surface projecting outward (or a concave lens surface recessed inward).
- the paraboloid forming the conical outer peripheral surface of the LED collimator 18 is set within an angle range in which light emitted in the peripheral direction from the LED can be totally reflected inside it, or a reflective surface is formed thereon.
- the LEDs are arranged at predetermined positions on the surface of the LED board 102, which is the circuit board.
- the LED substrate 102 is arranged and fixed to the LED collimator 15 so that the LEDs on its surface are positioned at the center of the recess.
- of the light emitted from the LED, the light emitted from its central portion in particular is condensed by the two convex lens surfaces forming the outer shape of the LED collimator 15 and becomes parallel light.
- the light emitted in the peripheral direction from other portions is reflected by the paraboloid forming the conical outer peripheral surface of the LED collimator 15, and is similarly condensed into parallel light.
- with the LED collimator 18, which has a convex lens in its central portion and a paraboloid in its peripheral portion, almost all of the light generated by the LED can be extracted as parallel light, which makes it possible to improve the utilization efficiency of the generated light.
- ⁇ Configuration example 4 of light source device> Furthermore, another example of the configuration of the optical system such as the light source device 13 will be described with reference to FIG.
- two optical sheets 207 (in other words, diffusion sheets) that convert the diffusion characteristics in the vertical and horizontal directions of the drawing are used on the light exit side of the LED collimator 15, and the light from the LED collimator 15 is made incident between the two optical sheets 207.
- when the optical sheet 207 is composed of a single sheet, the vertical and horizontal diffusion characteristics are controlled by the fine shapes of its front and back surfaces. A plurality of diffusion sheets may also be used to share this action.
- the diffusion angle in the vertical direction of the screen of the light from the LED collimator 15 is matched to the width of the vertical face of the reflecting surface of the diffusion sheet, and in the horizontal direction the number of LEDs and the divergence angle from the optical element 500 should be optimally designed as design parameters so that the surface density of the luminous flux emitted from the liquid crystal display panel 11 is uniform. That is, in this configuration, the diffusion characteristics are controlled by the surface shapes of a plurality of diffusion sheets instead of by a light guide.
- the polarization conversion is performed by the same method as in the configuration example 3 of the light source device described above.
- a polarization conversion element 2100 (FIG. 30) may be provided between the LED collimator 15 and the optical sheet 207 to convert the polarization, and then the light from the light source may be made incident on the optical sheet 207 .
- if the reflective polarizing plate 206 described above is selected to reflect S-polarized light and transmit P-polarized light, it reflects the S-polarized component of the natural light emitted from the LED light source toward the position shown in the figure; this light passes through the retardation plate 270, is reflected by the reflecting plate 271, passes through the retardation plate 270 again, is converted into P-polarized light, and enters the liquid crystal display panel 11. The thickness of the retardation plate must be selected optimally according to the incident angle of the light beam on the retardation plate; the optimum value lies in the range from λ/16 to λ/4.
- the user can operate the image without anxiety about, for example, contact transmission of infectious diseases. If the technology according to this embodiment is applied to a system used by an unspecified number of users, it becomes possible to reduce the risk of contact transmission of infectious diseases and to provide a non-contact user interface that can be used without anxiety. In this way, it contributes to Goal 3, "Good health and well-being," of the Sustainable Development Goals (SDGs) advocated by the United Nations.
- with the technology according to the present embodiment, by making the divergence angle of the emitted image light small and aligning it to a specific polarized wave, only regularly reflected light can be efficiently reflected by the retroreflective member, so that a bright and clear spatial floating image with high light utilization efficiency is obtained. The technology according to the present embodiment can thus provide a highly usable non-contact user interface capable of significantly reducing power consumption. In this way, it contributes to Goal 9, "Industry, innovation and infrastructure," and Goal 11, "Sustainable cities and communities," of the Sustainable Development Goals (SDGs) advocated by the United Nations. Furthermore, the technique according to the present embodiment makes it possible to form a spatially floating image with image light of high directivity (straightness).
- SYMBOLS: 1... display (image display device), 2... retroreflective member, 3... spatial floating image, 4... sensor (aerial operation detection sensor), 5... camera, 6... fingertip tactile sensation generation device, 10... control device, 11... liquid crystal display panel, 12... absorptive polarizing plate, 13... light source device, 50... housing, 61... ultrasonic element array, 62... ultrasonic signal generation circuit, 100... transparent member, 101... polarization separation member, 1000... spatial floating image display device, U... user, Uf... fingertip.
Abstract
Description
As a conventional spatial floating image display device, there is a configuration in which an image display device such as an organic EL panel or a liquid crystal panel, serving as a high-resolution color image source, is combined with a retroreflective member. In such a conventional device, the image light diffuses at a wide angle. Moreover, the retroreflective portions 2a constituting the retroreflective member 2 are hexahedral, as shown in FIGS. 3 and 4. Therefore, in addition to the regularly reflected light, image light obliquely incident on the retroreflective member 2 (the plurality of retroreflective portions 2a) as shown in FIG. 4 generates, as shown in FIG. 5, a plurality of ghost images, from a first ghost image G1 to a sixth ghost image G6, in addition to the regular image R1 that is the proper spatial floating image 3. This has degraded the image quality of the spatial floating image.
FIG. 1 shows an example of the functional block configuration of a spatial floating image display device according to an embodiment. The spatial floating image display device 1000 of FIG. 1 includes a retroreflective portion 1101, an image display portion 1102, a light guide 1104, a light source 1105, a power supply 1106, an operation input portion 1107, a nonvolatile memory 1108, a memory 1109, a control portion 1110, a video signal input portion 1131, an audio signal input portion 1133, a communication portion 1132, an aerial operation detection sensor 1351, an aerial operation detection portion 1350, a fingertip tactile sensation generation portion (in other words, a touch sensation generation portion) 1230, an audio signal output portion 1240, a superdirectional speaker 1242, a normal speaker 1243, a video control portion 1160, a storage portion 1170, an imaging portion 1180, and the like. These elements are interconnected via a bus or the like. The main components of the spatial floating image display device 1000 are housed in a housing 1190. The imaging portion 1180 and the aerial operation detection sensor 1351 may be provided as part of the housing 1190 or outside the housing 1190.
FIG. 2 shows the configuration of the main part of the spatial floating image display device 1000 according to an embodiment, together with a configuration example of the retroreflective portion 1101. FIG. 2 shows the configuration as seen from the side with respect to the direction in which the spatial floating image display device 1000 and an authorized user U face each other in space. As shown in FIG. 2, a display device 1 (image display device) that diverges image light of a specific polarization at a narrow angle is provided in an oblique direction (a direction forming an angle A with the horizontal plane) with respect to a transparent member 100 arranged on the horizontal plane. The display device 1 includes a liquid crystal display panel 11 and a light source device 13 that generates light of a specific polarization having narrow-angle diffusion characteristics.
FIG. 6 shows another example of the main configuration of the spatial floating image display device according to an embodiment. The display device 1 includes a liquid crystal display panel 11 as an image display element and a light source device 13 that generates light of a specific polarization having narrow-angle diffusion characteristics. The liquid crystal display panel 11 is selected from panels ranging from small ones with a screen size of about 5 inches to large ones exceeding 80 inches. The image light from the liquid crystal display panel 11 is reflected toward the retroreflective member 2 by a polarization separation member 101 such as a reflective polarizing plate.
FIG. 9 is an explanatory diagram of an image display method for preventing erroneous input in the spatial floating image display device according to an embodiment. FIG. 9 shows a display example of the spatial floating image 3 on the spatial floating image display device 1000, as seen from the viewpoint of the user U. This example of the spatial floating image 3 corresponds to a non-contact user interface that has a plurality of objects such as a numeric keypad and allows input of numbers and the like. As described above, the display device 1 includes the liquid crystal display panel 11 and the light source device 13 that generates light of a specific polarization having narrow-angle diffusion characteristics, and can be configured with a panel selected from small ones with a screen size of about 5 inches to large ones exceeding 80 inches. For example, the image light from the liquid crystal display panel 11 is reflected toward the retroreflective member 2 by the polarization separation member 101 such as a reflective polarizing plate.
Next, the present inventors examined a configuration that, when an object displayed as a spatial floating image formed by the spatial floating image display device is used as a non-contact user interface, prevents erroneous input such as a touch operation on an object other than the one the user intends to select, so that input operations can be performed reliably. To this end, they examined in particular a method of suitably outputting sound in response to the user's input operation, for example a method of providing voice-based user operation assistance or operation guidance.
Next, as a spatial floating image display device according to an embodiment, a configuration that generates a tactile sensation at the fingertip during an operation on the spatial floating image, when the spatial floating image is applied as a non-contact user interface, will be described. The present embodiment has a function of generating, as a touch sensation, a feeling as if the user had actually touched some physical object when the user performs a touch operation on an object (for example, a push button) displayed as the spatial floating image. This function is realized using the fingertip tactile sensation generation portion 1230 of FIG. 1 described above.
FIG. 17 shows a schematic configuration of the spatial floating image display device 1000 of the present embodiment as seen from the side, including the user U. For the purposes of explanation, the spatial coordinate system and directions are denoted by (X, Y, Z). The Z direction is the vertical (up-down) direction; the X and Y directions are horizontal, with the X direction being the left-right direction as seen from the user U and the Y direction being the front-rear (depth) direction. The coordinate system and directions within the spatial floating image 3 are denoted by (x, y, z). The x and y directions are the two orthogonal directions forming the main two-dimensional plane of the spatial floating image 3, with the x direction being the horizontal direction within the screen and the y direction the vertical direction within the screen. The z direction is perpendicular to that two-dimensional plane and is the front-rear direction related to the entry or approach of the user's fingers UH. This spatial floating image display device 1000 is implemented, for example, as part of a bank ATM or the like.
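In this coordinate system, the aerial operation detection reduces to a threshold test: a touch on a displayed object is registered when the sensed fingertip position crosses the z = 0 image plane inside the object's bounds. The following is a minimal sketch of such a hit test; the button layout, units, and touch threshold are illustrative assumptions, not taken from the publication.

```python
from dataclasses import dataclass

@dataclass
class Button:
    name: str
    x0: float; x1: float   # horizontal extent in the image plane (mm, assumed)
    y0: float; y1: float   # vertical extent in the image plane (mm, assumed)

def hit_test(buttons, fx, fy, fz, touch_depth=0.0):
    """Return the touched button, or None.

    (fx, fy, fz) is the sensed fingertip position in the floating-image
    coordinate system; fz > 0 is in front of the image plane, and
    fz <= touch_depth counts as a touch.
    """
    if fz > touch_depth:
        return None          # fingertip has not yet reached the image plane
    for b in buttons:
        if b.x0 <= fx <= b.x1 and b.y0 <= fy <= b.y1:
            return b
    return None

# Hypothetical numeric-keypad fragment: two 20 mm buttons side by side
keypad = [Button("5", 10, 30, 10, 30), Button("6", 40, 60, 10, 30)]
assert hit_test(keypad, 20, 20, -1).name == "5"   # fingertip pierced button "5"
assert hit_test(keypad, 20, 20, 5) is None        # still in front of the plane
```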
FIG. 18 shows a configuration example of the fingertip tactile sensation generation device 6 corresponding to the fingertip tactile sensation generation portion 1230. The ultrasonic element array 61 has an array configuration in which a plurality of ultrasonic elements 63 are arranged at substantially equal intervals on a plane so as to be able to generate ultrasonic waves at a frequency of, for example, around 40 kHz. Let N be the number of ultrasonic elements 63; as one example, N = 223. In this example, in the ultrasonic element array 61, the plurality (N = 223) of ultrasonic elements 63 are arranged concentrically to form a circular array. The array of these ultrasonic elements 63 constitutes an ultrasonic phased array, that is, an ultrasonic element array capable of controlling the position at which the ultrasonic focal point is formed.
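Controlling the focal point of such a phased array amounts to driving each element with a phase that compensates its path length to the desired focus, so that all wavefronts arrive there in phase. The following is a minimal sketch of that phase computation; the array geometry and speed of sound are illustrative assumptions, and continuous phases are used here for simplicity, whereas the signal generation circuit described later selects one of several discrete phases per element.

```python
import math

SPEED_OF_SOUND = 346.0   # m/s in air at roughly 25 degC (assumed)
FREQ = 40_000.0          # 40 kHz carrier, as in the embodiment
WAVELENGTH = SPEED_OF_SOUND / FREQ

def focus_phases(elements, focus):
    """Return the drive phase in radians, in [0, 2*pi), for each element so
    that waves from all elements arrive at `focus` in phase.

    elements: list of (x, y, z) element positions in metres
    focus:    (x, y, z) desired focal point in metres
    """
    dists = [math.dist(e, focus) for e in elements]
    d_max = max(dists)
    # Advance each element's phase by the travel-distance shortfall relative
    # to the farthest element; arrival phases then coincide at the focus.
    return [(2 * math.pi * (d_max - d) / WAVELENGTH) % (2 * math.pi)
            for d in dists]

# Example: a hypothetical 3x3 planar array with 10 mm pitch,
# focusing 0.2 m above the array centre
pitch = 0.01
elems = [(ix * pitch, iy * pitch, 0.0)
         for ix in range(-1, 2) for iy in range(-1, 2)]
phases = focus_phases(elems, (0.0, 0.0, 0.2))
```

Moving the focal point to follow the detected fingertip position is then just a matter of recomputing these phases each sensor frame.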
FIG. 21 shows an example of the positional relationship between the spatial floating image 3 and the fingertip tactile sensation generation device 6 when the spatial floating image 3 produced by the spatial floating image display device 1000 of FIG. 17 is viewed from the viewpoint of the authorized user U. In FIG. 21, the fingertip tactile sensation generation device 6 is installed at the center, in the left-right direction, of a stand 52 (plate-shaped in this example, but not limited thereto), and cameras (5L, 5R) are arranged at the left and right positions. A linear sensor 4 is arranged on the near side of the base 51 of the housing 50; a frame-shaped transparent structural member 60 is fixed obliquely above it, and the spatial floating image 3 is formed within the plane of the transparent structural member 60.
As a modification of the above embodiment, a form in which the fingertip tactile sensation generation device 6 of FIG. 17 and the like and the superdirectional speaker 30 of FIG. 13 and the like described above are combined and used together is also possible. FIG. 22 shows an installation example of the fingertip tactile sensation generation device 6 and the superdirectional speakers 30 in this combined form. In this example, in addition to a configuration of the fingertip tactile sensation generation device 6 similar to that of FIG. 21, left and right superdirectional speakers 30L and 30R similar to those of FIG. 13 are provided.
As described above, the spatial floating image display device 1000 of the embodiment (Example 4) provides the following effects. The user U, who visually recognizes and operates the spatial floating image 3 serving as a non-contact user interface, can more reliably recognize objects such as push buttons in the spatial floating image 3, without ghost images. Furthermore, when performing a touch operation on such an object, the user U can obtain a touch sensation close to that of touching a physical button. Furthermore, when performing a touch operation on the object, the user can hear a sound associated with that object emitted from the vicinity of the fingertip. According to the embodiment, it is possible to provide a non-contact user interface that minimizes the risk of contact transmission, is excellent in visibility and operability, and can reduce erroneous operations and erroneous inputs.
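The sound heard from the vicinity of the fingertip is produced by modulating the ultrasonic carrier with an audio signal; with amplitude modulation, nonlinear self-demodulation in air makes the modulation envelope audible near the focal point. The following is a minimal sketch of such an amplitude modulator; the sample rate and modulation depth are illustrative assumptions, not values from the publication.

```python
import math

CARRIER_HZ = 40_000.0    # ultrasonic carrier, as in the embodiment
SAMPLE_RATE = 192_000    # assumed drive-signal sample rate
DEPTH = 0.8              # assumed modulation depth

def am_modulate(audio, sample_rate=SAMPLE_RATE, depth=DEPTH):
    """Amplitude-modulate the ultrasonic carrier with an audio waveform.

    audio: sequence of samples in [-1, 1]. Returns the normalized drive
    waveform (1 + depth * audio[n]) * carrier[n] / (1 + depth).
    """
    out = []
    for n, a in enumerate(audio):
        carrier = math.sin(2 * math.pi * CARRIER_HZ * n / sample_rate)
        out.append((1.0 + depth * a) * carrier / (1.0 + depth))  # |out| <= 1
    return out

# Hypothetical 1 kHz guidance tone, 5 ms long
tone = [math.sin(2 * math.pi * 1000 * n / SAMPLE_RATE)
        for n in range(SAMPLE_RATE // 200)]
drive = am_modulate(tone)
```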
The following are also possible as various modifications of the embodiment of Example 4 above.
Next, detailed configurations and characteristics of image display devices and the like applicable to the spatial floating image display devices of the embodiments described above will be explained. When a large liquid crystal display panel is used as the liquid crystal display panel 11 of FIG. 2, the panel may be curved inward so that, when the user faces the center of the screen, light from the periphery of the screen is directed toward the user's eyes. This improves the uniformity of screen brightness over the entire surface.
The grid-structure reflective polarizing plate of the embodiments (for example, the reflective polarizing plate constituting the polarization separation member 101 of FIG. 2) exhibits degraded characteristics for light arriving from directions perpendicular to the polarization axis. For this reason, the reflective polarizing plate is desirably specified along the polarization axis, and the light source device 13 of the embodiments, which can emit the image light from the liquid crystal display panel 11 at a narrow angle, is an ideal light source. The horizontal characteristics are likewise degraded for obliquely incident light. In view of these characteristics, the configuration example of the present embodiment uses, as the backlight of the liquid crystal display panel 11, the light source device 13 capable of emitting the image light from the liquid crystal display panel 11 at a narrower angle. This makes it possible to provide a spatial floating image with high contrast. This configuration example is described below.
A configuration example of the display device 1 will be described with reference to FIG. 27. This display device 1 includes the liquid crystal display panel 11 as an image display element together with the light source device 13 constituting its light source. In FIG. 27, the light source device 13 is shown together with the liquid crystal display panel 11 in an exploded perspective view. As indicated by the arrow 3000, the liquid crystal display panel 11 receives from the light source device 13, which is a backlight device, an illumination luminous flux with narrow-angle diffusion characteristics, that is, with strong directivity (in other words, straightness) and with its polarization plane aligned in one direction, similar in character to laser light. Based on this illumination luminous flux, the liquid crystal display panel 11 emits image light modulated according to the input video signal. The image light is then reflected by the retroreflective member 2, passes through the transparent member 100, and forms the spatial floating image 3 as a real image.
As described above, FIG. 27 shows an example of the specific configuration of the display device 1. FIG. 28 is a cross-sectional view showing an example of the specific configuration of the light source device 13 of FIG. 27 (corresponding to the light source 1105 of FIG. 1). As shown in FIG. 28, the liquid crystal display panel 11 and the light direction conversion panel 54 are arranged on the light source device 13 of FIG. 27. The light source device 13 is formed of, for example, plastic on the case shown in FIG. 27, and houses therein LED elements 201 and a light guide 203 (corresponding to the light guide 1104 of FIG. 1). As also shown in FIG. 28 and elsewhere, the end face of the light guide 203 has a lens shape whose cross-sectional area gradually increases toward the face opposite the light receiving portion in order to convert the divergent light from each LED element 201 into a substantially parallel luminous flux, so that the divergence angle is gradually reduced by multiple total internal reflections as the light propagates inside. The liquid crystal display panel 11 is attached to the upper surface of the light guide 203. An LED substrate 202, on which the LED elements 201, which are semiconductor light sources, and the control circuit for the LED elements 201 are mounted, is attached to one side face (the left end face in this example) of the case of the light source device 13. A heat sink, a member for cooling the heat generated by the LED elements and the control circuit, may be attached to the outer face of the LED substrate 202.
FIG. 30 shows another example of the specific configuration of the display device 1. The light source device 13 of FIG. 30 is similar to the light source device of FIG. 29 and the like. This light source device 13 is configured by housing LEDs, a collimator, a combined diffusion block, a light guide, and the like in a case made of, for example, plastic. The liquid crystal display panel 11 is attached to the upper surface of the light source device 13. An LED substrate, on which LED elements, which are semiconductor light sources, and their control circuit are mounted, is attached to one side face of the case of the light source device 13. A heat sink 103, a member for cooling the heat generated by the LED elements and the control circuit, is attached to the outer face of the LED substrate.
Next, another example of the specific configuration of the display device 1 will be described with reference to FIG. 31. In the light source device of this display device 1, a divergent luminous flux of natural light (a mixture of P-polarized and S-polarized light) from the LEDs (LED elements on the LED substrate) is converted into a substantially parallel luminous flux by the LED collimator 15, and the substantially parallel luminous flux is reflected toward the liquid crystal display panel 11 by the reflective light guide 304. The reflected light enters the wave plate and the reflective polarizing plate 49 arranged between the liquid crystal display panel 11 and the reflective light guide 304. A specific polarized wave (for example, S-polarized light) is reflected by the reflective polarizing plate 49, has its phase converted by the wave plate, returns to the reflecting surface, passes through the retardation plate again, and is thereby converted into the polarization (for example, P-polarized light) that passes through the reflective polarizing plate 49.
Next, a configuration example of the optical system such as the light source device 13 housed in the case will be described in detail with reference to FIG. 32 together with FIGS. 33 and 34. FIGS. 32 to 34 show LED elements 14 (14a, 14b) constituting the light source, which are attached at predetermined positions with respect to the LED collimator 15. Each LED collimator 15 is formed of a translucent resin such as acrylic. As also shown in FIG. 34, the LED collimator 15 has a conical convex outer peripheral surface 156 obtained by rotating a parabolic cross section, and at its top has a recess 153 in whose central portion a convex portion (that is, a convex lens surface) 157 is formed. At the center of its planar portion there is a convex lens surface 154 projecting outward (or, alternatively, a concave lens surface recessed inward). The paraboloid forming the conical outer peripheral surface 156 of the LED collimator 15 is set within an angle range in which light emitted in the peripheral direction from the LED element 14 can be totally reflected inside it, or a reflective surface is formed thereon.
<Configuration example 2 of light source device>
Another example of the configuration of the optical system such as the light source device 13 will be described with reference to FIG. 31. In this example, as shown in FIG. 31, a divergent luminous flux of natural light (a mixture of P-polarized and S-polarized light) from the LEDs on the LED substrate 102 is converted into a substantially parallel luminous flux by the LED collimator 15 and reflected toward the liquid crystal display panel 11 by the reflective light guide 304. The reflected light enters the reflective polarizing plate 206 arranged between the liquid crystal display panel 11 and the reflective light guide 304. A specific polarized wave (for example, S-polarized light) is reflected by the reflective polarizing plate 206, passes through the surface connecting the reflecting surfaces of the reflective light guide 304, is reflected by the reflecting plate 271 arranged facing the opposite surface of the reflective light guide 304, passes through the retardation plate (λ/4 plate) 270 twice so that its polarization is converted, passes through the light guide and the reflective polarizing plate, enters the liquid crystal display panel 11, and is modulated into image light. At this time, by aligning the polarization plane of the specific polarized wave with that of the polarization-converted wave, the light utilization efficiency becomes twice the usual value, and the degree of polarization (extinction ratio) of the reflective polarizing plate also contributes to the extinction ratio of the entire system. Therefore, by using the light source device 13 of the present embodiment, the contrast ratio of the information display system can be greatly improved.
Furthermore, another example of the configuration of the optical system such as the light source device 13 will be described with reference to FIG. 39. In the configuration of FIG. 39, two optical sheets 207 (in other words, diffusion sheets) that convert the diffusion characteristics in the vertical and horizontal directions of the drawing are used on the light exit side of the LED collimator 15, and the light from the LED collimator 15 is made incident between the two optical sheets 207. When the optical sheet 207 is composed of a single sheet, the vertical and horizontal diffusion characteristics are controlled by the fine shapes of its front and back surfaces. A plurality of diffusion sheets may also be used to share this action. Using the front and back surface shapes of the optical sheets 207, the diffusion angle in the vertical direction of the screen of the light from the LED collimator 15 is matched to the width of the vertical face of the reflecting surface of the diffusion sheet, and in the horizontal direction the number of LEDs and the divergence angle from the optical element 500 should be optimally designed as design parameters so that the surface density of the luminous flux emitted from the liquid crystal display panel 11 is uniform. That is, in this configuration, the diffusion characteristics are controlled by the surface shapes of a plurality of diffusion sheets instead of by a light guide. In the present embodiment, polarization conversion is performed by the same method as in configuration example 3 of the light source device described above. Alternatively, a polarization conversion element 2100 (FIG. 30) may be provided between the LED collimator 15 and the optical sheets 207 to perform polarization conversion before the light from the light source is made incident on the optical sheets 207.
Claims (12)
- A spatial floating image display device that forms a spatial floating image, comprising:
a display device that displays an image; and
a retroreflective member that retroreflects image light from the display device,
wherein the spatial floating image is formed based on reflected light from the retroreflective member,
the device further comprising:
a sensor for detecting an operation state, including the position of a user's finger or an object held by the user, with respect to a spatial region including the plane of the spatial floating image or an object displayed on that plane; and
a tactile sensation generation device that, based on information detected by the sensor, generates a tactile sensation on the finger or the held object by forming ultrasonic sound pressure in the vicinity of the position of the finger or the held object.
- The spatial floating image display device according to claim 1, wherein the tactile sensation generation device generates sound from the vicinity of the position of the finger or the object held by the user by modulating the ultrasonic signal with an audio signal.
- The spatial floating image display device according to claim 1, wherein the tactile sensation generation device comprises:
an ultrasonic element array in which a plurality of ultrasonic elements are arranged; and
an ultrasonic signal generation circuit that generates, based on information detected by the sensor, an ultrasonic drive signal to be input to each ultrasonic element of the ultrasonic element array,
wherein the ultrasonic signal generation circuit generates the ultrasonic drive signals so as to give each ultrasonic element one ultrasonic signal selected from among ultrasonic signals having a plurality of different phases.
- The spatial floating image display device according to claim 1, comprising, as the object, an object that accepts a touch operation by the finger or the object held by the user, wherein the ultrasonic sound pressure is formed in the vicinity of the touch position of the finger or the held object on the plane of the spatial floating image.
- The spatial floating image display device according to claim 3, comprising:
a housing in which the display device and the retroreflective member are housed; and
a transparent member provided in a part of the housing and transmitting the reflected light from the retroreflective member,
wherein the ultrasonic element array is arranged at a position outside the transparent member such that the ultrasonic waves emitted from the ultrasonic element array follow a path in which they are reflected by the transparent member and then strike the plane of the spatial floating image from its back side.
- The spatial floating image display device according to claim 3, comprising:
a housing in which the display device and the retroreflective member are housed; and
a transparent member provided in a part of the housing and transmitting the reflected light from the retroreflective member,
wherein the ultrasonic element array is arranged at a position outside the transparent member such that the ultrasonic waves emitted from the ultrasonic element array follow a path in which they strike the plane of the spatial floating image from its back side without being reflected by the transparent member.
- The spatial floating image display device according to claim 2, wherein the modulation is amplitude modulation.
- The spatial floating image display device according to claim 2, wherein the audio signal is an audio signal of a predetermined sound associated with the plane of the spatial floating image or an object thereon, or with an operation on the plane or the object.
- The spatial floating image display device according to claim 1, further comprising a superdirectional speaker that outputs superdirectional sound toward the vicinity of the user's face, wherein a tactile sensation is generated on the user's finger or the object held by the user by the tactile sensation generation device, and superdirectional sound is output toward the vicinity of the user's face by the superdirectional speaker.
- The spatial floating image display device according to claim 2, further comprising a superdirectional speaker that outputs superdirectional sound toward the vicinity of the user's face, wherein, when outputting a type of sound that should be kept highly confidential, the superdirectional sound is output toward the vicinity of the user's face by the superdirectional speaker, and, when outputting a type of sound that need not be kept highly confidential, sound is generated from the vicinity of the position of the user's finger or the held object by the tactile sensation generation device.
- The spatial floating image display device according to claim 9 or 10, further comprising an imaging device that images the vicinity of the face of the user viewing and operating the spatial floating image, wherein the position of the user's face is detected based on an image from the imaging device, and the output of the superdirectional sound from the superdirectional speaker is controlled so as to form an audible region of the superdirectional sound in the vicinity of the face position.
- The spatial floating image display device according to claim 1, wherein the display device comprises:
a liquid crystal display panel that displays an image; and
a light source device that supplies light of a specific polarization direction to the liquid crystal display panel,
wherein the retroreflective member retroreflects an image luminous flux having a narrow divergence angle as the image light from the liquid crystal display panel, and
the device comprises a light blocking member arranged in the space on the optical path connecting the liquid crystal display panel and the retroreflective member, the light blocking member preventing image light from the liquid crystal display panel having a divergence angle exceeding a specific angle from entering the retroreflective member.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/265,266 US20240036634A1 (en) | 2021-01-25 | 2021-12-20 | Air floating video display apparatus |
CN202180078748.1A CN116530099A (zh) | 2021-01-25 | 2021-12-20 | 空间悬浮影像显示装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-009640 | 2021-01-25 | ||
JP2021009640A JP2022113411A (ja) | 2021-01-25 | 2021-01-25 | 空間浮遊映像表示装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022158209A1 true WO2022158209A1 (ja) | 2022-07-28 |
Family
ID=82549378
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/046981 WO2022158209A1 (ja) | 2021-01-25 | 2021-12-20 | 空間浮遊映像表示装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240036634A1 (ja) |
JP (1) | JP2022113411A (ja) |
CN (1) | CN116530099A (ja) |
WO (1) | WO2022158209A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102630676B1 (ko) * | 2022-09-05 | 2024-01-29 | 주식회사 정완컴퍼니 | Lidar 센서를 활용한 인터렉티브 홀로그램 장치 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5720684B2 (ja) * | 2010-07-23 | 2015-05-20 | 日本電気株式会社 | 立体表示装置及び立体表示方法 |
JP2017131340A (ja) * | 2016-01-26 | 2017-08-03 | フィールズ株式会社 | 娯楽空間制御装置、娯楽空間生成システム、娯楽空間制御方法およびコンピュータプログラム |
JP2017142370A (ja) * | 2016-02-10 | 2017-08-17 | 三菱電機株式会社 | 空中映像表示装置 |
WO2018043673A1 (ja) * | 2016-08-31 | 2018-03-08 | 国立大学法人宇都宮大学 | 表示装置及び空中像の表示方法 |
JP2018195143A (ja) * | 2017-05-18 | 2018-12-06 | 株式会社デンソーテン | 制御装置、入力システムおよび制御方法 |
JP2019133284A (ja) * | 2018-01-30 | 2019-08-08 | コニカミノルタ株式会社 | 非接触式入力装置 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8985774B2 (en) * | 2007-03-30 | 2015-03-24 | National Institute Of Information And Communication Technology | Floating image interaction device and its program |
CN111316193B (zh) * | 2018-09-18 | 2023-09-15 | 谷歌有限责任公司 | 显示助理设备 |
-
2021
- 2021-01-25 JP JP2021009640A patent/JP2022113411A/ja active Pending
- 2021-12-20 WO PCT/JP2021/046981 patent/WO2022158209A1/ja active Application Filing
- 2021-12-20 CN CN202180078748.1A patent/CN116530099A/zh active Pending
- 2021-12-20 US US18/265,266 patent/US20240036634A1/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5720684B2 (ja) * | 2010-07-23 | 2015-05-20 | 日本電気株式会社 | 立体表示装置及び立体表示方法 |
JP2017131340A (ja) * | 2016-01-26 | 2017-08-03 | フィールズ株式会社 | 娯楽空間制御装置、娯楽空間生成システム、娯楽空間制御方法およびコンピュータプログラム |
JP2017142370A (ja) * | 2016-02-10 | 2017-08-17 | 三菱電機株式会社 | 空中映像表示装置 |
WO2018043673A1 (ja) * | 2016-08-31 | 2018-03-08 | 国立大学法人宇都宮大学 | 表示装置及び空中像の表示方法 |
JP2018195143A (ja) * | 2017-05-18 | 2018-12-06 | 株式会社デンソーテン | 制御装置、入力システムおよび制御方法 |
JP2019133284A (ja) * | 2018-01-30 | 2019-08-08 | コニカミノルタ株式会社 | 非接触式入力装置 |
Also Published As
Publication number | Publication date |
---|---|
US20240036634A1 (en) | 2024-02-01 |
JP2022113411A (ja) | 2022-08-04 |
CN116530099A (zh) | 2023-08-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9298255B2 (en) | Transmissive display apparatus and operation input method | |
JP2012503230A (ja) | 複数のカメラを有するタッチスクリーンディスプレイ | |
KR102649783B1 (ko) | 도파관을 사용한 조향 가능한 하이브리드 디스플레이 | |
US20090309842A1 (en) | Touch Control Virtual Screen Apparatus | |
WO2022138297A1 (ja) | 空間浮遊映像表示装置 | |
WO2022158209A1 (ja) | 空間浮遊映像表示装置 | |
US20170090518A1 (en) | Electronic apparatus | |
KR102127863B1 (ko) | 원통형 스크린장치에서 영상을 조정하는 방법 | |
WO2022137940A1 (ja) | 空間浮遊映像表示装置 | |
WO2022113745A1 (ja) | 空間浮遊映像表示装置 | |
WO2023276921A1 (ja) | 空中浮遊映像表示装置 | |
JP2022097901A (ja) | 空間浮遊映像表示装置 | |
JP7172207B2 (ja) | 入力装置 | |
KR20120025335A (ko) | 적외선 터치스크린 장치 | |
KR20200039995A (ko) | 공간 터치 감지 방법 및 이를 수행하는 표시 장치 | |
WO2023112463A1 (ja) | 空間浮遊映像情報表示システム | |
WO2023243181A1 (ja) | 空間浮遊映像情報表示システム | |
JP2022089271A (ja) | 空間浮遊映像表示装置 | |
JP2023006618A (ja) | 空間浮遊映像表示装置 | |
WO2023068021A1 (ja) | 空中浮遊映像表示システム | |
WO2022270384A1 (ja) | 空中浮遊映像表示システム | |
WO2024062749A1 (ja) | 空中浮遊映像表示装置 | |
WO2023162690A1 (ja) | 空中浮遊映像表示装置 | |
WO2023085069A1 (ja) | 空中浮遊映像表示装置 | |
US20240045227A1 (en) | Air floating video display apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21921310 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180078748.1 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18265266 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21921310 Country of ref document: EP Kind code of ref document: A1 |