WO2024101245A1 - Aerial display device - Google Patents

Aerial display device

Info

Publication number
WO2024101245A1
Authority
WO
WIPO (PCT)
Prior art keywords
aerial
light
display device
detection
display
Prior art date
Application number
PCT/JP2023/039476
Other languages
French (fr)
Japanese (ja)
Inventor
康宏 代工
Original Assignee
Toppanホールディングス株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toppanホールディングス株式会社 filed Critical Toppanホールディングス株式会社
Publication of WO2024101245A1 publication Critical patent/WO2024101245A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/02Simple or compound lenses with non-spherical faces
    • G02B3/08Simple or compound lenses with non-spherical faces with discontinuous faces, e.g. Fresnel lens
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/04Prisms
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/08Mirrors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means

Definitions

  • the present invention relates to an aerial display device.
  • Aerial display devices capable of displaying images and videos as aerial images are being researched, and are expected to become a new human-machine interface.
  • An aerial display device, for example, is equipped with a two-sided corner reflector array in which two-sided corner reflectors are arranged in an array, and reflects light emitted from the display surface of a display element to form a real image in the air.
  • the display method using a two-sided corner reflector array is aberration-free, and can display a real image (aerial image) in a plane-symmetrical position.
  • Patent Document 1 discloses an optical element in which a transparent square prism protruding from the surface of a transparent flat plate is used as a two-sided corner reflector, and multiple square prisms are arranged in an array on a flat surface.
  • Patent Document 2 discloses an optical element in which each of the first and second light control panels is formed by arranging multiple planar light reflecting sections vertically inside a transparent flat plate, and the first and second light control panels are arranged so that the planar light reflecting sections are orthogonal to each other.
  • the optical elements of Patent Documents 1 and 2 reflect light emitted from a display element twice from orthogonal reflecting surfaces to generate an aerial image.
  • the observer can touch the aerial image displayed by the aerial display device without touching the device.
  • the sensing element of the aerial display device detects objects present in the area of the aerial image and recognizes that the observer has touched the aerial image.
  • when the observer's viewpoint moves relative to the aerial display device, the aerial image may move in tandem with the viewpoint.
  • the aerial image and the detection area for detecting the aerial image become misaligned.
  • the sensing element may recognize that an aerial image other than the one the observer intended has been touched.
  • the present invention provides an aerial display device that can more accurately detect touch operations on an aerial image by an observer.
  • an aerial display device comprising: a display element for displaying an image; an optical element arranged to receive light from the display element and reflecting the light from the display element to the opposite side to the display element to form an aerial image in the air; and a sensing element for forming a detection area in a spatial region overlapping the aerial image and detecting an object in the detection area, the aerial image including first and second aerial images arranged along a first direction, the detection area including first and second detection areas corresponding to the first and second aerial images, respectively, the first detection area including a plurality of first partial areas each extending in a second direction perpendicular to the first direction and aligned in the first direction, the second detection area including a plurality of second partial areas each extending in the second direction and aligned in the first direction, and the sensing element determining which of the first and second aerial images has been touched based on the relationship between the number of first partial areas turned on among the plurality of first partial areas and the number of second partial areas turned on among the plurality of second partial areas.
  • an aerial display device in which, if the number of turned-on first partial areas among the plurality of first partial areas is m, and the number of turned-on second partial areas among the plurality of second partial areas is n, the sensing element determines that the first aerial image has been touched if m>n, and determines that the second aerial image has been touched if m<n.
  • an aerial display device in which the sensing element includes a light-emitting section that emits light toward the detection area and a light-receiving section that receives the light reflected by the object.
  • an aerial display device in which the optical element includes a planar substrate and a plurality of optical elements disposed below the substrate, each extending in the first direction and aligned in the second direction, each of the plurality of optical elements having an entrance surface and a reflection surface that are each inclined with respect to the normal direction of the substrate and are in contact with each other.
  • an aerial display device according to the first aspect, further comprising an orientation control element disposed between the display element and the optical element, which transmits oblique light components of the light from the display element.
  • an aerial display device in which the orientation control element includes a plurality of transparent members and a plurality of light-shielding members that are alternately arranged, and the plurality of light-shielding members are inclined with respect to the normal to the orientation control element.
  • an aerial display device in which the display element and the optical element are arranged parallel to each other.
  • the present invention provides an aerial display device that can more accurately detect touch operations on an aerial image by an observer.
  • FIG. 1 is a perspective view of an aerial display device according to a first embodiment of the present invention.
  • FIG. 2 is a side view of the aerial display device shown in FIG. 1 in the XZ plane.
  • FIG. 3 is a perspective view illustrating the appearance of the aerial display device.
  • FIG. 4A is a plan view of the orientation control element shown in FIG. 1.
  • FIG. 4B is a cross-sectional view of the orientation control element taken along line AA' in FIG. 4A.
  • FIG. 5 is a perspective view of the optical element shown in FIG. 1.
  • FIG. 6 is a block diagram of an aerial display device.
  • FIG. 7 is a perspective view illustrating the state of light reflection in an optical element.
  • FIG. 8 is a side view of the XZ plane for explaining how light is reflected in the optical element.
  • FIG. 9 is a side view of the YZ plane for explaining how light is reflected in the optical element.
  • FIG. 10 is a diagram for explaining the angular conditions of the incident surface and the reflecting surface in the optical element.
  • FIG. 11 is a diagram illustrating a state in which an observer touches an aerial image.
  • FIG. 12 is a diagram for explaining the state of an aerial image when an observer moves the viewpoint up and down.
  • FIG. 13 is a diagram illustrating an example of an aerial image displayed by the aerial display device.
  • FIG. 14 is a diagram illustrating an example of a detection region formed by a sensing element.
  • FIG. 15 is a diagram for explaining the detailed configuration of a part of the detection region.
  • FIG. 16 is a schematic diagram illustrating the detection operation by the sensing element.
  • FIG. 17 is a flowchart explaining the overall operation of the aerial display device.
  • FIG. 18 is a flowchart illustrating the overall operation of the aerial display device according to the second embodiment of the present invention.
  • FIG. 19 is a flowchart illustrating the overall operation of the aerial display device according to the third embodiment of the present invention.
  • FIG. 1 is a perspective view of the aerial display device 1 according to the first embodiment of the present invention.
  • the X direction is the direction along one side of the aerial display device 1
  • the Y direction is the direction perpendicular to the X direction in the horizontal plane
  • the Z direction is the direction perpendicular to the XY plane (also called the normal direction).
  • Fig. 2 is a side view of the aerial display device 1 shown in Fig. 1 in the XZ plane.
  • Fig. 3 is a perspective view illustrating the appearance of the aerial display device 1.
  • the aerial display device 1 is a device that displays images (including videos).
  • the aerial display device 1 displays an aerial image in the air above its own light emission surface. "Displaying an aerial image" has the same meaning as "forming an aerial image."
  • the light emission surface of the aerial display device 1 refers to the upper surface of the component that is arranged in the uppermost layer among the multiple components that make up the aerial display device 1 and are arranged on the optical path.
  • An aerial image is a real image that is formed in the air.
  • the aerial display device 1 comprises an illumination element (also called a backlight) 10, a display element 20, an orientation control element 30, an optical element 40, a sensing element 50, and a housing 60.
  • the illumination element 10, the display element 20, the orientation control element 30, and the optical element 40 are arranged in this order along the Z direction and parallel to one another.
  • the illumination element 10, the display element 20, the orientation control element 30, and the optical element 40 are fixed at specific positions by fixing members (not shown) with specific intervals between them.
  • the lighting element 10 emits illumination light and outputs the illumination light toward the display element 20.
  • the lighting element 10 includes a light source unit 11, a light guide plate 12, and a reflective sheet 13.
  • the lighting element 10 is, for example, a side light type lighting element.
  • the lighting element 10 constitutes a surface light source.
  • the lighting element 10 may be configured so that the light intensity reaches a peak in an oblique direction at an angle θ1, which will be described later.
  • the light source unit 11 is disposed to face the side of the light guide plate 12.
  • the light source unit 11 emits light toward the side of the light guide plate 12.
  • the light source unit 11 includes a plurality of light-emitting elements, for example, white LEDs (Light Emitting Diodes).
  • the light guide plate 12 guides the illumination light from the light source unit 11 and emits the illumination light from its upper surface.
  • the reflection sheet 13 reflects the illumination light emitted from the bottom surface of the light guide plate 12 back toward the light guide plate 12.
  • the lighting element 10 may be provided with a member (including a prism sheet and a diffusion sheet) that improves optical characteristics on the upper surface of the light guide plate 12.
  • the display element 20 is a transmissive display element.
  • the display element 20 is composed of, for example, a liquid crystal display element.
  • the drive mode of the display element 20 is not particularly limited, and TN (Twisted Nematic) mode, VA (Vertical Alignment) mode, homogeneous mode, or the like can be used.
  • the display element 20 receives illumination light emitted from the illumination element 10.
  • the display element 20 transmits the illumination light from the illumination element 10 and performs optical modulation. Then, the display element 20 displays a specific image on its own screen.
  • the orientation control element 30 has a function of reducing unnecessary light.
  • the unnecessary light is a light component that does not contribute to generating an aerial image, and includes a light component that transmits through the optical element 40 in the normal direction.
  • the orientation control element 30 is configured to transmit light components in a predetermined angular range centered on an oblique direction at an angle θ1 with respect to the normal direction, and to block light components outside the above-mentioned angular range.
  • the area of the orientation control element 30 is set to be equal to or larger than the area of the display element 20. A detailed configuration of the orientation control element 30 will be described later.
  • the optical element 40 reflects light incident from the bottom side to the top side.
  • the optical element 40 also reflects light incident obliquely from the bottom side, for example, in the front direction (normal direction).
  • the area of the optical element 40 is set to be equal to or larger than the area of the display element 20.
  • the detailed configuration of the optical element 40 will be described later.
  • the optical element 40 forms an aerial image 2 in the air.
  • the aerial image 2 is parallel to the element surface of the optical element 40 and is a two-dimensional image.
  • the element surface refers to a virtual plane on which the optical element 40 extends in the in-plane direction.
  • the element surface has the same meaning as in-plane.
  • the element surfaces of the other elements have the same meaning.
  • An observer 3 standing in front of the optical element 40 can view the aerial image 2.
  • the sensing element 50 is disposed above the optical element 40 and on one side of the aerial display device 1.
  • the sensing element 50 is disposed, for example, at approximately the same level as the aerial image 2.
  • the sensing element 50 may also be disposed below the aerial image 2 so that the light emitted from the sensing element 50 crosses the aerial image 2 diagonally.
  • the sensing element 50 is fixed to a specific position by a fixing member (not shown).
  • the sensing element 50 forms a detection area 53 in a two-dimensional spatial region that includes part or all of the aerial image 2 generated by the aerial display device 1.
  • the sensing element 50 detects an object (body) present in the detection area 53.
  • the sensing element 50 emits infrared light to the detection area 53 and detects the reflected light reflected by the object.
  • the sensing element 50 includes a light-emitting unit that emits infrared light toward the detection area 53 and a light-receiving unit (sensor) that detects the reflected light reflected by the object.
  • the sensing element 50 is composed of, for example, a line sensor in which multiple light-emitting elements and multiple light-receiving elements are alternately arranged in a row.
  • the line sensor scans space along a line using infrared light, and can thereby scan a two-dimensional space spanned by the direction in which the multiple light-emitting elements are arranged and the direction in which the light travels.
  • the direction of the infrared light emitted by the sensing element 50 (emission angle from the XY plane) can be set appropriately.
  • the housing 60 houses the lighting element 10, the display element 20, the orientation control element 30, the optical element 40, and the sensing element 50.
  • the housing 60 has an opening at the top that exposes the optical element 40.
  • the sensing element 50 is attached to one side of the housing 60.
  • the sensing element 50 is not necessarily housed in the housing 60, and may be located at any position outside the housing 60.
  • the sensing element 50 can be located in a position optimal for detection operations.
  • FIG. 4A is a plan view of the orientation control element 30 shown in Fig. 1.
  • Fig. 4B is a cross-sectional view of the orientation control element 30 taken along the line AA' in Fig. 4A.
  • the substrate 31 is configured to be planar in the XY plane and has a rectangular parallelepiped shape.
  • the substrate 31 transmits light.
  • a plurality of transparent members 33 are provided, each extending in the Y direction and aligned in the X direction.
  • a plurality of light-shielding members 34 are provided, each extending in the Y direction and aligned in the X direction.
  • the plurality of transparent members 33 and the plurality of light-shielding members 34 are arranged alternately such that adjacent ones are in contact with each other.
  • a substrate 32 is provided on the plurality of transparent members 33 and the plurality of light-shielding members 34.
  • the substrate 32 is configured to be planar in the XY plane and has a rectangular parallelepiped shape.
  • the substrate 32 transmits light.
  • the transparent member 33 extends in an oblique direction at an angle θ1 with respect to the normal direction of the substrate 31 in the XZ plane.
  • the transparent member 33 has a parallelogram shape in the XZ plane, with its side surfaces inclined by the angle θ1.
  • the transparent member 33 transmits light.
  • the light blocking member 34 extends in an oblique direction at an angle θ1 with respect to the normal direction of the substrate 31 in the XZ plane.
  • the light blocking member 34 has a parallelogram shape in the XZ plane, with its side surfaces inclined by the angle θ1.
  • the light blocking member 34 blocks light.
  • the thickness of the light blocking member 34 is set to be thinner than the thickness of the transparent member 33.
  • Two adjacent light blocking members 34 are positioned so that their ends slightly overlap in the Z direction.
  • Glass or transparent resin (including acrylic resin) is used for the substrates 31 and 32 and the transparent member 33.
  • resin mixed with black dye or pigment is used for the light blocking member 34.
  • the orientation control element 30 may be constructed by omitting one or both of the substrates 31 and 32.
  • the function of the orientation control element 30 can be realized if multiple transparent members 33 and multiple light blocking members 34 are arranged alternately.
  • the orientation control element 30 configured in this manner can transmit the display light so that the light intensity in the oblique direction at an angle θ1 with respect to the normal direction becomes a peak.
  • the angle θ1 is set, for example, to be equal to or greater than 10 degrees and equal to or less than 60 degrees.
  • the orientation control element 30 is configured, for example, to block light components outside the range of 30° ± 30° with respect to the normal direction.
  • the orientation control element 30 is configured, for example, to block light components outside the range of 30° ± 20° with respect to the normal direction.
  • the orientation control element 30 may be disposed between the lighting element 10 and the display element 20. Also, the orientation control element 30 may be omitted when constructing the aerial display device 1.
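  • as a minimal illustration (not part of the patent), the orientation control element 30 can be modeled as a simple angular pass-band centered on the angle θ1; in the sketch below, the function name, the 30° center, and the ±20° half-width are illustrative assumptions based on the example values above.

```python
def is_transmitted(ray_angle_deg: float,
                   theta1_deg: float = 30.0,
                   half_width_deg: float = 20.0) -> bool:
    """Model the orientation control element 30 as an angular pass-band.

    A light component is treated as transmitted only when its angle from the
    normal direction lies within theta1 +/- half_width (e.g. 30 deg +/- 20 deg);
    everything else is treated as blocked by the inclined light-shielding
    members 34.
    """
    return abs(ray_angle_deg - theta1_deg) <= half_width_deg

# Example: a ray at 35 deg from the normal passes, a near-normal ray does not.
print(is_transmitted(35.0))  # True
print(is_transmitted(5.0))   # False
```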
  • FIG. 5 is a perspective view of the optical element 40 shown in Fig. 1.
  • Fig. 5 also shows an enlarged view of a portion of the optical element 40.
  • the enlarged view in Fig. 5 is a side view in the XZ plane.
  • the optical element 40 includes a substrate 41 and a plurality of optical elements 42.
  • the substrate 41 is configured to be planar in the XY plane and has a rectangular parallelepiped shape.
  • a plurality of optical elements 42 are provided on the bottom surface of the substrate 41.
  • Each of the plurality of optical elements 42 is composed of a triangular prism.
  • the optical element 42 is arranged so that the three side surfaces of the triangular prism extend parallel to the Y direction, and one of the side surfaces is in contact with the substrate 41.
  • the plurality of optical elements 42 each extend in the Y direction and are arranged side by side in the X direction. In other words, the plurality of optical elements 42 have a sawtooth shape in the XZ plane.
  • Each of the multiple optical elements 42 has an incident surface 43 and a reflective surface 44.
  • the left side surface is the incident surface 43
  • the right side surface is the reflective surface 44.
  • the incident surface 43 is a surface onto which light from the display element 20 is incident.
  • the reflective surface 44 is a surface that reflects, inside the optical element 42, the light that has entered the optical element 42 from the outside through the incident surface 43.
  • the incident surface 43 and the reflective surface 44 form an angle θp.
  • the substrate 41 and the optical element 42 are made of a transparent material.
  • the optical element 42 is, for example, formed integrally with the substrate 41 using the same transparent material as the substrate 41.
  • the substrate 41 and the optical element 42 may be formed separately, and the optical element 42 may be adhered to the substrate 41 using a transparent adhesive.
  • the transparent material that constitutes the substrate 41 and the optical element 42 may be glass or a transparent resin (including an acrylic resin).
  • the optical element 40 configured in this way internally reflects incident light and forms a real image in the air.
  • the optical element 40 also forms an aerial image 2 at a position in front of the element surface.
  • the block configuration of the aerial display device 1 will be described with reference to FIG. 6.
  • Figure 6 is a block diagram of the aerial display device 1.
  • the aerial display device 1 includes a control unit 70, a memory unit 71, an input/output interface (input/output IF) 72, a display unit 73, a sensing element 50, and an input unit 74.
  • the control unit 70, the memory unit 71, and the input/output interface 72 are connected to each other via a bus 75.
  • the input/output interface 72 is connected to the display unit 73, the sensing element 50, and the input unit 74.
  • the input/output interface 72 performs interface processing for each of the display unit 73, the sensing element 50, and the input unit 74 in accordance with a predetermined standard.
  • the display unit 73 includes an illumination element 10 and a display element 20.
  • the display unit 73 displays an image.
  • the sensing element 50 includes a light-emitting unit 51 and a light-receiving unit 52.
  • the light-emitting unit 51 emits infrared light toward the detection area.
  • the light-receiving unit 52 detects the light reflected by the object.
  • the control unit 70 is composed of one or more processors, such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit).
  • the control unit 70 realizes various functions by executing programs stored in the memory unit 71.
  • the control unit 70 includes a display processing unit 70A, an information processing unit 70B, and a detection position calculation unit 70C.
  • the display processing unit 70A controls the operation of the display unit 73 (specifically, the lighting element 10 and the display element 20).
  • the display processing unit 70A controls the on and off of the lighting element 10.
  • the display processing unit 70A transmits an image signal to the display element 20, causing the display element 20 to display an image.
  • the information processing unit 70B generates an image to be displayed by the aerial display device 1.
  • the information processing unit 70B can use image data stored in the memory unit 71.
  • the information processing unit 70B may also obtain image data from outside using a communication function (not shown).
  • the detection position calculation unit 70C controls the operation of the sensing element 50.
  • the detection position calculation unit 70C controls the light emitting unit 51 included in the sensing element 50 to emit infrared light, and forms a detection area consisting of infrared light in a specified spatial region.
  • the detection position calculation unit 70C calculates the position of the target object based on multiple detection signals sent from the light receiving unit 52 included in the sensing element 50.
  • the memory unit 71 includes non-volatile memory devices such as a ROM (Read Only Memory), HDD (Hard Disk Drive), and SSD (Solid State Drive), and volatile memory devices such as a RAM (Random Access Memory) and registers.
  • the memory unit 71 stores programs executed by the control unit 70.
  • the memory unit 71 stores various data necessary for the control of the control unit 70.
  • the memory unit 71 stores data for images displayed by the aerial display device 1.
  • the input unit 74 includes, for example, a touch panel or buttons, and receives information input by the user.
  • the information processing unit 70B is capable of selecting an image to be displayed on the display unit 73 based on the information received by the input unit 74.
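  • as a reading aid for the block configuration above, the following hypothetical sketch shows one way the control unit 70, its three processing units, and the memory unit 71 could be organized in software; all class and method names are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass, field

@dataclass
class MemoryUnit:                                        # memory unit 71
    images: dict = field(default_factory=dict)           # image data to display
    detection_areas: dict = field(default_factory=dict)  # area name -> geometry

@dataclass
class ControlUnit:                                       # control unit 70
    memory: MemoryUnit

    # display processing unit 70A: drives the display unit 73
    def show_image(self, name: str) -> None:
        image = self.memory.images[name]
        # ... send the image signal to the display element 20 via the I/O interface 72 ...

    # information processing unit 70B: selects or generates the image to display
    def select_image(self, user_input: str) -> str:
        return user_input if user_input in self.memory.images else "default"

    # detection position calculation unit 70C: interprets signals from the
    # light-receiving unit 52 and maps them to a position in a detection area
    def locate_object(self, sensor_signals: list) -> tuple:
        # ... compute the object position from the received signals ...
        return (0.0, 0.0)
```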
  • the arrows in Fig. 2 indicate optical paths.
  • light emitted from an arbitrary point "o" of the display element 20 is incident on the orientation control element 30.
  • a light component at an angle θ1 (including a light component in a predetermined angle range centered on the angle θ1) is transmitted through the orientation control element 30.
  • the light transmitted through the orientation control element 30 is incident on the optical element 40.
  • the optical element 40 reflects the incident light to the opposite side to the orientation control element 30, and forms an aerial image 2 in the air.
  • Figure 7 is a perspective view illustrating how light is reflected in the optical element 40.
  • Figure 8 is a side view of the XZ plane illustrating how light is reflected in the optical element 40.
  • Figure 8 is a view of the optical element 40 when both eyes of the observer 3 (i.e., the line connecting both eyes) are parallel to the X direction.
  • Figure 9 is a side view of the YZ plane illustrating how light is reflected in the optical element 40.
  • Figure 9 is a view of the optical element 40 when both eyes of the observer 3 are parallel to the Y direction.
  • the critical angle is the smallest incident angle above which total reflection occurs.
  • the critical angle is the angle with respect to the perpendicular to the incident surface.
  • light emitted from point "o" that is incident on the reflecting surface 44 of the optical element 42 at an angle larger than the critical angle is totally reflected by the reflecting surface 44, and the reflected light is focused in the air to generate an aerial image.
  • light emitted from point "o" that is incident on the reflecting surface 44 at an angle smaller than the critical angle is not reflected by the reflecting surface 44 and does not form an image in the air, so it does not contribute to the generation of an aerial image.
  • the condition for observer 3 to be able to see the aerial image is that both eyes of observer 3 are parallel to the X direction or close to it (for example, within ±10 degrees of the X direction). Also, if observer 3's eyes are parallel to the X direction or close to it and the viewpoint is moved along the Y direction, the aerial image can always be recognized.
  • FIG. 10 is a diagram explaining the angular conditions of the incident surface 43 and the reflecting surface 44 of the optical element 40.
  • the angle of the incident surface 43 with respect to the Z direction (the direction perpendicular to the element surface) is θ2
  • the angle of the reflecting surface 44 with respect to the Z direction is θ3
  • the angle between the incident surface 43 and the reflecting surface 44 is θp.
  • the refractive index of the material of optical element 40 is np, and the refractive index of air is 1.
  • the incident angle at incident surface 43 is θ4, and the refraction angle is θ5.
  • the incident angle at the top surface of optical element 40 is θ8, and the refraction angle is θ9.
  • the refraction angle θ9 is the emission angle.
  • the emission angle θ9 is expressed by the following equation (2).
  • θ9 = sin⁻¹(np * sin(sin⁻¹((1/np) * sin(90° - (θ1 + θ2))) + θ2 + 2θ3 - 90°))  ... (2)
  • the critical angle at the reflecting surface 44 is expressed by the following formula (3).
  • critical angle ≤ θ6 (θ7)
  • critical angle = sin⁻¹(1/np)  ... (3)
  • the angle of incidence θ6 on the reflecting surface 44 is set to be larger than the critical angle at the reflecting surface 44.
  • the angle θ3 of the reflecting surface 44 is set so that the angle of incidence of light incident on the reflecting surface 44 is larger than the critical angle.
  • the angle ⁇ 2 of the incident surface 43 is set so that the light incident on the incident surface 43 is not totally reflected by the incident surface 43.
  • the angle θ2 of the incident surface 43 is set so that the angle of incidence of the light incident on the incident surface 43 is smaller than the critical angle.
  • the angle between the element surface of the optical element 40 and the surface of the aerial image 2, and the distance between the element surface of the optical element 40 and the surface of the aerial image 2, can be adjusted by optimally setting the angle θ1 of light incident on the optical element 40, the refractive index of the optical element 40, the angle θ2 of the incident surface 43 of the optical element 40, and the angle θ3 of the reflecting surface 44 of the optical element 40.
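  • to make equations (2) and (3) concrete, the short calculation below evaluates the emission angle θ9 and the critical angle for one set of assumed parameters; the values of np, θ1, θ2 and θ3 are illustrative choices, not values specified by the patent.

```python
import math

def emission_angle_deg(n_p: float, theta1: float, theta2: float, theta3: float) -> float:
    """Emission angle theta9 per equation (2); all angles in degrees."""
    # Refraction at the entrance surface 43 (Snell's law), with theta4 = 90 - (theta1 + theta2)
    theta5 = math.degrees(math.asin((1.0 / n_p) * math.sin(math.radians(90.0 - (theta1 + theta2)))))
    # Incidence angle at the top surface after reflection at the reflecting surface 44
    theta8 = theta5 + theta2 + 2.0 * theta3 - 90.0
    # Refraction out into air at the top surface gives the emission angle theta9
    return math.degrees(math.asin(n_p * math.sin(math.radians(theta8))))

def critical_angle_deg(n_p: float) -> float:
    """Critical angle per equation (3)."""
    return math.degrees(math.asin(1.0 / n_p))

# Illustrative parameters: an acrylic-like refractive index and one plausible geometry.
n_p, theta1, theta2, theta3 = 1.49, 30.0, 10.0, 25.0
print(f"theta9         = {emission_angle_deg(n_p, theta1, theta2, theta3):.1f} deg")  # about 1.4 deg (near the front direction)
print(f"critical angle = {critical_angle_deg(n_p):.1f} deg")                          # about 42.2 deg
```

  • with these particular (assumed) values the exit ray leaves close to the front (normal) direction, which is consistent with the statement above that the optical element 40 can reflect obliquely incident light toward the front direction; in addition, the geometry must satisfy formula (3), i.e. the incidence angle θ6 on the reflecting surface 44 must exceed the critical angle.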
  • Figure 11 is a diagram explaining how the observer 3 touches the aerial image 2.
  • the Y direction is the vertical direction (direction perpendicular to the ground)
  • the X direction is the horizontal direction (direction parallel to the ground)
  • the Z direction is the front direction of the observer 3 (direction parallel to the line of sight).
  • the up and down direction of the observer 3 is along the Y direction.
  • the aerial display device 1 forms aerial images 2A and 2B in the air.
  • the aerial images 2A and 2B are, for example, rectangular push buttons.
  • the aerial images 2A and 2B are displayed in different two-dimensional spatial regions.
  • the sensing element 50 forms detection areas 53A and 53B in areas that roughly overlap the aerial images 2A and 2B, respectively.
  • Detection areas 53A and 53B are formed with infrared light.
  • the shape of the aerial image is predetermined as the image to be displayed by the aerial display device 1, and information regarding the shape of the aerial image is stored in the memory unit 71 in association with information regarding the image displayed by the aerial display device 1.
  • the area occupied by the aerial image 2A is set as detection area 53A, and information regarding detection area 53A is stored in the memory unit 71.
  • a detection area is set for each of the multiple aerial images.
  • the observer 3 touches the aerial image 2A or 2B with his/her finger 3A.
  • the sensing element 50 detects the finger 3A of the observer 3 present in the detection area 53A or the detection area 53B based on the control of the control unit 70.
  • the detection position calculation unit 70C calculates the position of the object based on the signal sent from the light receiving unit 52 of the sensing element 50. If the calculated position is present in the detection area 53A, for example, the detection position calculation unit 70C determines that the aerial image 2A has been touched.
  • Figure 12 is a diagram explaining the state of the aerial images when observer 3 moves his/her viewpoint up and down.
  • when observer 3 moves his/her viewpoint upward, the aerial images 2A, 2B viewed by observer 3 also move upward.
  • Figure 12 illustrates an example in which aerial images 2A, 2B move upward.
  • when observer 3 moves his/her viewpoint downward, the aerial images 2A, 2B viewed by observer 3 also move downward.
  • the information on detection areas 53A and 53B is managed in advance by the control unit 70 based on the aerial image to be displayed, and detection areas 53A and 53B are always set to the same position. Therefore, when viewed by the observer 3, there may be a misalignment between the aerial images 2A and 2B and the detection areas 53A and 53B.
  • FIG. 13 is a diagram illustrating an example of an aerial image displayed by the aerial display device 1.
  • the aerial display device 1 displays four aerial images 2A to 2D.
  • Each of the aerial images 2A to 2D is, for example, a rectangular push button.
  • the aerial images 2A to 2D are arranged in a rectangle.
  • the aerial images 2A to 2D are, for example, colored differently from each other.
  • FIG. 14 is a diagram illustrating an example of a detection area formed by the sensing element 50.
  • the sensing element 50 forms four detection areas 53A-53D corresponding to the four aerial images 2A-2D, respectively.
  • the outer shape of the detection area 53A has the same shape as the aerial image 2A.
  • the detection area 53A is positioned so as to overlap with the aerial image 2A and is positioned at approximately the same level as the aerial image 2A.
  • the level of the aerial image 2A and the detection area 53A refers to the height from the light emission surface of the aerial display device 1.
  • the configuration of the detection areas 53B-53D is also similar to that of the detection area 53A.
  • FIG. 15 is a diagram for explaining the detailed configuration of some of the detection regions (detection regions 53A, 53B).
  • Detection region 53A includes multiple detection partial regions 54A, each extending in the X direction and aligned in the Y direction. Each of the multiple detection partial regions 54A has a rectangular shape.
  • the width (length in the Y direction) of the detection partial region 54A can be set as appropriate; a narrower width is preferable from the viewpoint of improving detection accuracy.
  • the interval between adjacent detection partial regions 54A can be set as appropriate; a narrower interval is preferable from the viewpoint of improving detection accuracy.
  • the sensing element 50 can detect an object for each detection partial region 54A.
  • detection region 53B includes multiple detection partial regions 54B.
  • the configuration of detection regions 53C and 53D is also the same as that of detection region 53A.
  • the width and interval of the detection partial regions included in detection regions 53A to 53D are the same.
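  • the layout just described can be pictured with the following sketch (not from the patent), which builds a detection region as a stack of narrow partial regions extending in the X direction and aligned in the Y direction; the dimensions and names are assumptions, and the on_count helper is reused in the detection sketch further below.

```python
from dataclasses import dataclass

@dataclass
class PartialRegion:
    y_min: float       # lower edge along the Y direction
    y_max: float       # upper edge along the Y direction
    on: bool = False   # True while this partial region is detecting an object

def build_detection_region(y0: float, width: float, gap: float, count: int) -> list:
    """Stack `count` partial regions of height `width`, separated by `gap`, starting at y0."""
    regions, y = [], y0
    for _ in range(count):
        regions.append(PartialRegion(y, y + width))
        y += width + gap
    return regions

def on_count(regions: list) -> int:
    """Number of partial regions currently on (the quantities m and n used below)."""
    return sum(1 for r in regions if r.on)

# Example: detection region 53A modeled as ten 2 mm strips separated by 1 mm gaps.
region_53a = build_detection_region(y0=0.0, width=2.0, gap=1.0, count=10)
```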
  • FIG. 16 is a schematic diagram explaining the detection operation by the sensing element 50. Two detection areas 53A and 53B are shown in FIG. 16. FIG. 16 shows the detection areas 53A and 53B as viewed from the opposite side to the observer 3 (the aerial display device 1 side).
  • "a detection partial region is on" refers to the state in which that detection partial region is detecting an object.
  • the number m of turned-on detection partial regions 54A is equal to or greater than 0 and equal to or less than the total number of detection partial regions 54A.
  • the number n of turned-on detection partial regions 54B is equal to or greater than 0 and equal to or less than the total number of detection partial regions 54B.
  • the touch operation by the observer 3 can be determined more accurately.
  • FIG. 17 is a flowchart explaining the overall operation in the aerial display device 1.
  • the aerial display device 1 displays two aerial images 2A, 2B and forms two detection areas 53A, 53B.
  • the control unit 70 displays the aerial images 2A and 2B (step S100).
  • the display processing unit 70A displays an image on the screen of the display element 20.
  • the optical element 40 reflects light from the display element 20 and forms the aerial images 2A and 2B in the air.
  • the sensing element 50 executes a sensing operation (step S101).
  • the light-emitting unit 51 included in the sensing element 50 emits infrared light to the detection areas 53A and 53B including the areas where the aerial images 2A and 2B are displayed.
  • the sensing element 50 monitors whether or not an object is present within the detection areas 53A and 53B (step S102). That is, the light receiving unit 52 included in the sensing element 50 monitors the infrared light reflected by the object.
  • the detection position calculation unit 70C determines the number of ON detection partial areas based on the signal detected by the sensing element 50 (step S103). Specifically, the detection position calculation unit 70C determines the relationship between the number m of ON detection partial areas 54A among all detection partial areas 54A included in the detection area 53A and the number n of ON detection partial areas 54B among all detection partial areas 54B included in the detection area 53B. The detection position calculation unit 70C compares the number m with the number n and determines whether the number m is larger or smaller than the number n.
  • if the number m is larger than the number n (step S104: Yes), the detection position calculation unit 70C determines that the first aerial image (corresponding to aerial image 2A) has been touched (step S105).
  • if the number m is smaller than the number n (step S106: Yes), the detection position calculation unit 70C determines that the second aerial image (corresponding to aerial image 2B) has been touched (step S107).
  • if the number m is equal to the number n (step S106: No), the detection position calculation unit 70C determines that there is no touch operation by the observer 3 (step S108).
  • the touch operation on the aerial image 2 by the observer 3 is detected.
  • the aerial display device 1 executes an operation according to the touch operation by the observer 3.
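  • the decision flow of steps S102 to S108 can be summarized in a short sketch; this is only an illustration of the comparison rule (it reuses the hypothetical on_count helper from the earlier sketch) and not the patent's actual implementation.

```python
def judge_touch(region_53a: list, region_53b: list):
    """Steps S103-S108: decide which aerial image, if any, was touched."""
    m = on_count(region_53a)   # turned-on partial regions 54A in detection area 53A
    n = on_count(region_53b)   # turned-on partial regions 54B in detection area 53B
    if m > n:                          # step S104: Yes
        return "first aerial image"    # step S105 (aerial image 2A)
    if m < n:                          # step S106: Yes
        return "second aerial image"   # step S107 (aerial image 2B)
    return None                        # m == n (or no object detected): no touch (step S108)
```

  • even if the observer's finger overlaps both detection areas because the aerial image has shifted with the viewpoint, the detection area containing more turned-on partial regions wins the comparison, which is the mechanism the determination above relies on.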
  • a detection area 53 is formed corresponding to a certain aerial image 2, and this detection area 53 is composed of a plurality of linear detection partial areas 54. Then, the touch operation of the observer 3 is determined according to the number of detection partial areas 54 that are turned on by an object. This makes it possible to recognize the touch operation of the observer 3 on the targeted aerial image 2 even if an object exists in both of two adjacent detection areas 53. As a result, it is possible to realize an aerial display device 1 that can more accurately detect the touch operation of the observer 3 on the aerial image 2.
  • the aerial display device 1 can display an aerial image 2 in the air by reflecting light emitted from the display element 20 by the optical element 40.
  • the aerial display device 1 can also display the aerial image 2 in the front direction, parallel to the element surface of the optical element 40. It is also possible to realize an aerial display device 1 that can improve display quality.
  • when the observer 3 looks at the optical element 40 with both eyes parallel to or close to the X direction (i.e., the direction in which the multiple optical elements 42 are arranged), the observer 3 can see an aerial image. In addition, when the observer 3 moves the viewpoint along the Y direction with both eyes parallel to or close to the X direction, the observer 3 can always see an aerial image. In addition, a wider viewing angle can be achieved when the observer 3's eyes are parallel to or close to the X direction.
  • the multiple elements that make up the aerial display device 1 can be arranged in parallel. This makes it possible to realize an aerial display device 1 that can be made smaller in size in the Z direction.
  • in the second embodiment, when the number m of turned-on detection partial areas 54A is the same as the number n of turned-on detection partial areas 54B, the observer 3 is made aware that it is impossible to determine which of the two aerial images 2A, 2B he or she is touching.
  • FIG. 18 is a flowchart explaining the overall operation of the aerial display device 1 according to the second embodiment of the present invention.
  • the operations from steps S100 to S107 are the same as those in the first embodiment.
  • if the number m is equal to the number n (step S106: No), the control unit 70 blinks the first and second aerial images (aerial images 2A, 2B) (step S200). Specifically, the display processing unit 70A blinks the image on the display element 20. The optical element 40 forms the blinking first and second aerial images (aerial images 2A, 2B). This operation makes it possible for the observer 3 to recognize that it is impossible to determine which of the first and second aerial images has been touched. This operation also makes it possible to prompt the observer 3 to move his/her finger 3A to the target aerial image.
  • the touch operation of the observer 3 can be determined more accurately.
  • the other effects are the same as those of the first embodiment.
  • instead of being blinked, the two aerial images being detected may be enlarged by a predetermined ratio.
  • FIG. 19 is a flowchart explaining the overall operation of the aerial display device 1 according to the third embodiment of the present invention.
  • the operations from step S100 to S107 are the same as those in the first embodiment.
  • if the number m is equal to the number n (step S106: No), the detection position calculation unit 70C waits for a fixed time (step S201). During the fixed time of step S201, the detection position calculation unit 70C stops determining whether the aerial image has been touched. This operation makes it possible for the observer 3 to recognize that it is not possible to determine whether the first or second aerial image has been touched. This operation also makes it possible to encourage the observer 3 to move the finger 3A to the target aerial image.
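  • the second and third embodiments differ from the first only in how the m = n case (step S106: No) is handled; the sketch below shows the variants side by side, where the blink, enlarge, and wait calls are placeholders and not interfaces defined by the patent.

```python
import time

def handle_equal_counts(variant: str, display, wait_seconds: float = 1.0) -> None:
    """Alternative handling when m == n (step S106: No)."""
    if variant == "blink":          # second embodiment (step S200)
        display.blink(["aerial image 2A", "aerial image 2B"])
    elif variant == "enlarge":      # enlargement variant of the second embodiment
        display.enlarge(["aerial image 2A", "aerial image 2B"], ratio=1.2)
    elif variant == "wait":         # third embodiment (step S201)
        time.sleep(wait_seconds)    # suspend the touch determination for a fixed time
```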
  • the touch operation of the observer 3 can be determined more accurately.
  • the other effects are the same as those of the first embodiment.
  • in the above embodiments, the display element 20 and the optical element 40 are arranged parallel to each other. However, the arrangement is not limited to this, and the display element 20 may be arranged diagonally relative to the optical element 40.
  • the angle between the display element 20 and the optical element 40 is set in the range greater than 0 degrees and less than 45 degrees.
  • the orientation control element 30 can be omitted.
  • the optical element 40 may be configured as a dihedral corner reflector array in which dihedral corner reflectors are arranged in an array.
  • when a dihedral corner reflector array is used, an aerial image is formed at a plane-symmetrical position.
  • the left side of the optical element 42 is defined as the incident surface 43, and the right side is defined as the reflective surface 44.
  • the incident surface 43 and the reflective surface 44 may be configured in reverse. In this case, the function of the aerial display device 1 described in the embodiment will also be reversed.
  • a liquid crystal display element is used as an example of the display element 20, but the present invention is not limited to this.
  • the display element 20 can also be a self-luminous organic EL (electroluminescence) display element or a micro LED (Light Emitting Diode) display element.
  • a micro LED display element is a display element that uses LEDs to emit the R (red), G (green), and B (blue) that make up a pixel. When a self-luminous display element 20 is used, the lighting element 10 is not necessary.
  • the present invention is not limited to the above-described embodiments, and can be modified in various ways during implementation without departing from the gist of the invention.
  • the embodiments may also be implemented in appropriate combination, in which case the combined effects can be obtained.
  • the above-described embodiments include various inventions, and various inventions can be extracted by combinations selected from the multiple constituent elements disclosed. For example, if the problem can be solved and an effect can be obtained even if some constituent elements are deleted from all the constituent elements shown in the embodiments, the configuration from which these constituent elements are deleted can be extracted as an invention.
  • 1...aerial display device, 2...aerial image, 3...observer, 10...illumination element, 11...light source unit, 12...light guide plate, 13...reflective sheet, 20...display element, 30...orientation control element, 31...substrate, 32...substrate, 33...transparent member, 34...light-shielding member, 40...optical element, 41...substrate, 42...optical element, 43...incident surface, 44...reflective surface, 50...sensing element, 51...light-emitting unit, 52...light-receiving unit, 53...detection area, 54...detection partial area, 60...housing, 70...control unit, 70A...display processing unit, 70B...information processing unit, 70C...detection position calculation unit, 71...memory unit, 72...input/output interface, 73...display unit, 74...input unit, 75...bus.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optical Elements Other Than Lenses (AREA)

Abstract

This aerial display device comprises: a display element (20) that displays an image; an optical element (40) that is disposed so as to receive light from the display element (20), the optical element (40) forming an aerial image in the air by reflecting light from the display element (20) to a side opposite the display element (20); and a sensing element (50) that forms a detection region in a spatial region overlapping the aerial image and detects an object in the detection region. The detection region includes first and second detection regions corresponding to first and second aerial images, respectively. The first detection region includes a plurality of first partial regions, and the second detection region includes a plurality of second partial regions. On the basis of a relationship between the number of first partial regions that are turned on and the number of second partial regions that are turned on, the sensing element (50) determines which of the first and second aerial images was touched.

Description

Aerial Display Device
The present invention relates to an aerial display device.
Aerial display devices capable of displaying images and videos as aerial images are being researched, and are expected to become a new human-machine interface. An aerial display device, for example, is equipped with a two-sided corner reflector array in which two-sided corner reflectors are arranged in an array, and reflects light emitted from the display surface of a display element to form a real image in the air. The display method using a two-sided corner reflector array is aberration-free, and can display a real image (aerial image) in a plane-symmetrical position.
Patent Document 1 discloses an optical element in which a transparent square prism protruding from the surface of a transparent flat plate is used as a two-sided corner reflector, and multiple square prisms are arranged in an array on a flat surface. Patent Document 2 discloses an optical element in which each of the first and second light control panels is formed by arranging multiple planar light reflecting sections vertically inside a transparent flat plate, and the first and second light control panels are arranged so that the planar light reflecting sections are orthogonal to each other. The optical elements of Patent Documents 1 and 2 reflect light emitted from a display element twice from orthogonal reflecting surfaces to generate an aerial image.
The observer can touch the aerial image displayed by the aerial display device without touching the device. The sensing element of the aerial display device detects objects present in the area of the aerial image and recognizes that the observer has touched the aerial image.
When the observer's viewpoint moves relative to the aerial display device, the aerial image may move in tandem with the viewpoint. In this case, the aerial image and the detection area for detecting the aerial image become misaligned. As a result, the sensing element may recognize that an aerial image other than the one the observer intended has been touched.
Japanese Patent Publication No. 2011-191404 (Patent Document 1); Japanese Patent Publication No. 2011-175297 (Patent Document 2)
The present invention provides an aerial display device that can more accurately detect touch operations on an aerial image by an observer.
According to a first aspect of the present invention, there is provided an aerial display device comprising: a display element for displaying an image; an optical element arranged to receive light from the display element and reflecting the light from the display element to the opposite side to the display element to form an aerial image in the air; and a sensing element for forming a detection area in a spatial region overlapping the aerial image and detecting an object in the detection area, the aerial image including first and second aerial images arranged along a first direction, the detection area including first and second detection areas corresponding to the first and second aerial images, respectively, the first detection area including a plurality of first partial areas each extending in a second direction perpendicular to the first direction and aligned in the first direction, the second detection area including a plurality of second partial areas each extending in the second direction and aligned in the first direction, and the sensing element determining which of the first and second aerial images has been touched based on the relationship between the number of first partial areas turned on among the plurality of first partial areas and the number of second partial areas turned on among the plurality of second partial areas.
According to a second aspect of the present invention, there is provided an aerial display device according to the first aspect, in which, if the number of turned-on first partial areas among the plurality of first partial areas is m, and the number of turned-on second partial areas among the plurality of second partial areas is n, the sensing element determines that the first aerial image has been touched if m>n, and determines that the second aerial image has been touched if m<n.
According to a third aspect of the present invention, there is provided an aerial display device according to the second aspect, in which the sensing element determines that no touch operation has been performed when m=n.
According to a fourth aspect of the present invention, there is provided an aerial display device according to the second aspect, in which the display element blinks the first and second aerial images when the sensing element determines that m=n.
According to a fifth aspect of the present invention, there is provided an aerial display device according to the second aspect, in which the display element enlarges the first and second aerial images when the sensing element determines that m=n.
According to a sixth aspect of the present invention, there is provided an aerial display device according to the second aspect, in which the sensing element waits for a fixed time when m=n, and after the fixed time has elapsed, determines the magnitude relationship between the number m and the number n.
According to a seventh aspect of the present invention, there is provided an aerial display device according to the first aspect, in which the sensing element includes a light-emitting section that emits light toward the detection area and a light-receiving section that receives the light reflected by the object.
According to an eighth aspect of the present invention, there is provided an aerial display device according to the first aspect, in which the optical element includes a planar substrate and a plurality of optical elements disposed below the substrate, each extending in the first direction and aligned in the second direction, each of the plurality of optical elements having an entrance surface and a reflection surface that are each inclined with respect to the normal direction of the substrate and are in contact with each other.
According to a ninth aspect of the present invention, there is provided an aerial display device according to the first aspect, further comprising an orientation control element disposed between the display element and the optical element, which transmits oblique light components of the light from the display element.
According to a tenth aspect of the present invention, there is provided an aerial display device according to the ninth aspect, in which the orientation control element includes a plurality of transparent members and a plurality of light-shielding members that are alternately arranged, and the plurality of light-shielding members are inclined with respect to the normal to the orientation control element.
According to an eleventh aspect of the present invention, there is provided an aerial display device according to the first aspect, in which the display element and the optical element are arranged parallel to each other.
The present invention provides an aerial display device that can more accurately detect touch operations on an aerial image by an observer.
FIG. 1 is a perspective view of an aerial display device according to a first embodiment of the present invention.
FIG. 2 is a side view of the aerial display device shown in FIG. 1 in the XZ plane.
FIG. 3 is a perspective view illustrating the appearance of the aerial display device.
FIG. 4A is a plan view of the orientation control element shown in FIG. 1.
FIG. 4B is a cross-sectional view of the orientation control element taken along line AA' in FIG. 4A.
FIG. 5 is a perspective view of the optical element shown in FIG. 1.
FIG. 6 is a block diagram of an aerial display device.
FIG. 7 is a perspective view illustrating the state of light reflection in an optical element.
FIG. 8 is a side view of the XZ plane for explaining how light is reflected in the optical element.
FIG. 9 is a side view of the YZ plane for explaining how light is reflected in the optical element.
FIG. 10 is a diagram for explaining the angular conditions of the incident surface and the reflecting surface in the optical element.
FIG. 11 is a diagram illustrating a state in which an observer touches an aerial image.
FIG. 12 is a diagram for explaining the state of an aerial image when an observer moves the viewpoint up and down.
FIG. 13 is a diagram illustrating an example of an aerial image displayed by the aerial display device.
FIG. 14 is a diagram illustrating an example of a detection region formed by a sensing element.
FIG. 15 is a diagram for explaining the detailed configuration of a part of the detection region.
FIG. 16 is a schematic diagram illustrating the detection operation by the sensing element.
FIG. 17 is a flowchart explaining the overall operation of the aerial display device.
FIG. 18 is a flowchart illustrating the overall operation of the aerial display device according to the second embodiment of the present invention.
FIG. 19 is a flowchart illustrating the overall operation of the aerial display device according to the third embodiment of the present invention.
The following describes the embodiments with reference to the drawings. However, the drawings are schematic or conceptual, and the dimensions and ratios of each drawing are not necessarily the same as those in reality. Furthermore, even when the same parts are shown in different drawings, the dimensional relationships and ratios between them may be different. In particular, the following embodiments are examples of devices and methods for embodying the technical ideas of the present invention, and the technical ideas of the present invention are not specified by the shape, structure, arrangement, etc. of the components. In the following description, elements having the same functions and configurations are given the same reference numerals, and duplicate descriptions will be omitted.
[1] First embodiment
[1-1] Configuration of the aerial display device 1
Fig. 1 is a perspective view of the aerial display device 1 according to the first embodiment of the present invention. In Fig. 1, the X direction is the direction along one side of the aerial display device 1, the Y direction is the direction perpendicular to the X direction in the horizontal plane, and the Z direction is the direction perpendicular to the XY plane (also called the normal direction). Fig. 2 is a side view of the aerial display device 1 shown in Fig. 1 in the XZ plane. Fig. 3 is a perspective view illustrating the appearance of the aerial display device 1.
The aerial display device 1 is a device that displays images (including videos). The aerial display device 1 displays an aerial image in the air above its own light emission surface. "Displaying an aerial image" has the same meaning as "forming an aerial image." The light emission surface of the aerial display device 1 refers to the upper surface of the component that is arranged in the uppermost layer among the multiple components that make up the aerial display device 1 and are arranged on the optical path. An aerial image is a real image that is formed in the air.
The aerial display device 1 comprises an illumination element (also called a backlight) 10, a display element 20, an orientation control element 30, an optical element 40, a sensing element 50, and a housing 60. The illumination element 10, the display element 20, the orientation control element 30, and the optical element 40 are arranged in this order along the Z direction and parallel to one another. The illumination element 10, the display element 20, the orientation control element 30, and the optical element 40 are fixed at specific positions by fixing members (not shown) with specific intervals between them.
The lighting element 10 emits illumination light and outputs the illumination light toward the display element 20. The lighting element 10 includes a light source unit 11, a light guide plate 12, and a reflective sheet 13. The lighting element 10 is, for example, a side light type lighting element. The lighting element 10 constitutes a surface light source. The lighting element 10 may be configured so that the light intensity reaches a peak in an oblique direction at an angle θ1, which will be described later.
 光源部11は、導光板12の側面に向き合うように配置される。光源部11は、導光板12の側面に向けて光を発光する。光源部11は、例えば白色LED(Light Emitting Diode)からなる複数の発光素子を含む。導光板12は、光源部11からの照明光を導光し、照明光を自身の上面から出射する。反射シート13は、導光板12の底面から出射した照明光を、再び導光板12に向けて反射する。照明素子10は、導光板12の上面に、光学特性を向上させる部材(プリズムシート、及び拡散シートを含む)を備えていてもよい。 The light source unit 11 is disposed to face the side of the light guide plate 12. The light source unit 11 emits light toward the side of the light guide plate 12. The light source unit 11 includes a plurality of light-emitting elements, for example, white LEDs (Light Emitting Diodes). The light guide plate 12 guides the illumination light from the light source unit 11 and emits the illumination light from its upper surface. The reflection sheet 13 reflects the illumination light emitted from the bottom surface of the light guide plate 12 back toward the light guide plate 12. The lighting element 10 may be provided with a member (including a prism sheet and a diffusion sheet) that improves optical characteristics on the upper surface of the light guide plate 12.
 表示素子20は、透過型の表示素子である。表示素子20は、例えば液晶表示素子で構成される。表示素子20の駆動モードについては特に限定されず、TN(Twisted Nematic)モード、VA(Vertical Alignment)モード、又はホモジニアスモードなどを用いることができる。表示素子20は、照明素子10から出射された照明光を受ける。表示素子20は、照明素子10からの照明光を透過して光変調を行う。そして、表示素子20は、自身の画面に特定の画像を表示する。 The display element 20 is a transmissive display element. The display element 20 is composed of, for example, a liquid crystal display element. The drive mode of the display element 20 is not particularly limited, and TN (Twisted Nematic) mode, VA (Vertical Alignment) mode, homogeneous mode, or the like can be used. The display element 20 receives illumination light emitted from the illumination element 10. The display element 20 transmits the illumination light from the illumination element 10 and performs optical modulation. Then, the display element 20 displays a specific image on its own screen.
 配向制御素子30は、不要光を低減する機能を有する。不要光とは、空中像を生成するのに寄与しない光成分であり、法線方向に光学素子40を透過する光成分を含む。配向制御素子30は、法線方向に対して角度θの斜め方向を中心として所定の角度範囲の光成分を透過するとともに、上記角度範囲以外の光成分を遮光するように構成される。配向制御素子30の面積は、表示素子20の面積以上に設定される。配向制御素子30の詳細な構成については後述する。 The orientation control element 30 has a function of reducing unnecessary light. The unnecessary light is a light component that does not contribute to generating an aerial image, and includes a light component that transmits through the optical element 40 in the normal direction. The orientation control element 30 is configured to transmit light components in a predetermined angular range centered on an oblique direction at an angle θ 1 with respect to the normal direction, and to block light components outside the above-mentioned angular range. The area of the orientation control element 30 is set to be equal to or larger than the area of the display element 20. A detailed configuration of the orientation control element 30 will be described later.
 光学素子40は、底面側から入射した光を上面側に反射する。また、光学素子40は、底面側から斜めに入射した入射光を、例えば正面方向(法線方向)に反射する。光学素子40の面積は、表示素子20の面積以上に設定される。光学素子40の詳細な構成については後述する。光学素子40は、空中に空中像2を結像する。空中像2は、光学素子40の素子面に平行であり、2次元の画像である。素子面とは、光学素子40が面内方向に広がる仮想的な平面を言う。素子面は、面内と同じ意味である。その他の素子の素子面についても同様の意味である。光学素子40の正面にいる観察者3は、空中像2を視認することができる。 The optical element 40 reflects light incident from the bottom side to the top side. The optical element 40 also reflects light incident obliquely from the bottom side, for example, in the front direction (normal direction). The area of the optical element 40 is set to be equal to or larger than the area of the display element 20. The detailed configuration of the optical element 40 will be described later. The optical element 40 forms an aerial image 2 in the air. The aerial image 2 is parallel to the element surface of the optical element 40 and is a two-dimensional image. The element surface refers to a virtual plane on which the optical element 40 extends in the in-plane direction. The element surface has the same meaning as in-plane. The element surfaces of the other elements have the same meaning. An observer 3 standing in front of the optical element 40 can view the aerial image 2.
 センシング素子50は、光学素子40の上方かつ空中表示装置1の一側部に配置される。センシング素子50は、例えば、空中像2とおおよそ同じレベルに配置される。センシング素子50は、センシング素子50から出射される光が空中像2を斜めに横切るように、空中像2より下方に配置してもよい。センシング素子50は、図示せぬ固定部材で特定の位置に固定される。 The sensing element 50 is disposed above the optical element 40 and on one side of the aerial display device 1. The sensing element 50 is disposed, for example, at approximately the same level as the aerial image 2. The sensing element 50 may also be disposed below the aerial image 2 so that the light emitted from the sensing element 50 crosses the aerial image 2 diagonally. The sensing element 50 is fixed to a specific position by a fixing member (not shown).
 センシング素子50は、空中表示装置1が生成した空中像2の一部又は全部を含む2次元の空間領域に、検知領域53を形成する。センシング素子50は、検知領域53に存在する対象物(物体)を検知する。センシング素子50は、検知領域53に赤外光を出射し、対象物で反射された反射光を検知する。センシング素子50は、検知領域53に向けて赤外光を発光する発光部と、対象物で反射された反射光を検知する受光部(センサ)とを含む。センシング素子50は、例えば、複数の発光素子と複数の受光素子とが交互に一列に並んだラインセンサで構成される。ラインセンサは、赤外光を用いてライン状に空間をスキャンすることが可能であり、複数の発光素子が並んだ方向と光が進む方向とからなる2次元の空間をスキャンすることが可能である。センシング素子50が出射する赤外光の方向(XY面からの出射角度)は、適宜設定可能である。 The sensing element 50 forms a detection area 53 in a two-dimensional spatial region that includes part or all of the aerial image 2 generated by the aerial display device 1. The sensing element 50 detects an object (body) present in the detection area 53. The sensing element 50 emits infrared light to the detection area 53 and detects the reflected light reflected by the object. The sensing element 50 includes a light-emitting unit that emits infrared light toward the detection area 53 and a light-receiving unit (sensor) that detects the reflected light reflected by the object. The sensing element 50 is composed of, for example, a line sensor in which multiple light-emitting elements and multiple light-receiving elements are alternately arranged in a row. The line sensor can scan space in a line using infrared light, and can scan a two-dimensional space consisting of the direction in which multiple light-emitting elements are arranged and the direction in which light travels. The direction of the infrared light emitted by the sensing element 50 (emission angle from the XY plane) can be set appropriately.
 筐体60は、照明素子10、表示素子20、配向制御素子30、光学素子40、及びセンシング素子50を収容する。筐体60は、上部に光学素子40を露出する開口部を有する。また、筐体60の一側部には、センシング素子50が取り付けられる。センシング素子50は、筐体60に収容されず、筐体60の外の任意の位置に配置してもよい。センシング素子50は、検知動作に最適な箇所に配置することが可能である。 The housing 60 houses the lighting element 10, the display element 20, the orientation control element 30, the optical element 40, and the sensing element 50. The housing 60 has an opening at the top that exposes the optical element 40. In addition, the sensing element 50 is attached to one side of the housing 60. The sensing element 50 is not necessarily housed in the housing 60, and may be located at any position outside the housing 60. The sensing element 50 can be located in a position optimal for detection operations.
 [1-1-1] 配向制御素子30の構成
 図4Aは、図1に示した配向制御素子30の平面図である。図4Bは、図4AのA-A´線に沿った配向制御素子30の断面図である。
[1-1-1] Configuration of the orientation control element 30 Fig. 4A is a plan view of the orientation control element 30 shown in Fig. 1. Fig. 4B is a cross-sectional view of the orientation control element 30 taken along the line A-A' in Fig. 4A.
 基材31は、XY面において平面状に構成され、直方体を有する。基材31は、光を透過する。 The substrate 31 is configured to be planar in the XY plane and has a rectangular parallelepiped shape. The substrate 31 transmits light.
 基材31上には、それぞれがY方向に延び、X方向に並んだ複数の透明部材33が設けられる。また、基材31上には、それぞれがY方向に延び、X方向に並んだ複数の遮光部材34が設けられる。複数の透明部材33と複数の遮光部材34とは、隣接するもの同士が接するようにして交互に配置される。 On the base material 31, a plurality of transparent members 33 are provided, each extending in the Y direction and aligned in the X direction. Also, on the base material 31, a plurality of light-shielding members 34 are provided, each extending in the Y direction and aligned in the X direction. The plurality of transparent members 33 and the plurality of light-shielding members 34 are arranged alternately such that adjacent ones are in contact with each other.
 複数の透明部材33及び複数の遮光部材34上には、基材32が設けられる。基材32は、XY面において平面状に構成され、直方体を有する。基材32は、光を透過する。 A substrate 32 is provided on the plurality of transparent members 33 and the plurality of light-shielding members 34. The substrate 32 is configured to be planar in the XY plane and has a rectangular parallelepiped shape. The substrate 32 transmits light.
 透明部材33は、XZ面において、基材31の法線方向に対して角度θの斜め方向に延びる。透明部材33は、XZ面において、側面が角度θだけ傾いた平行四辺形である。透明部材33は、光を透過する。 The transparent member 33 extends in an oblique direction at an angle θ 1 with respect to the normal direction of the base material 31 in the XZ plane. The transparent member 33 is a parallelogram with a side surface inclined by the angle θ 1 in the XZ plane. The transparent member 33 transmits light.
 遮光部材34は、XZ面において、基材31の法線方向に対して角度θの斜め方向に延びる。遮光部材34は、XZ面において、側面が角度θだけ傾いた平行四辺形である。遮光部材34は、光を遮光する。遮光部材34の厚みは、透明部材33の厚みより薄く設定される。 The light blocking member 34 extends in an oblique direction at an angle θ 1 with respect to the normal direction of the base material 31 in the XZ plane. The light blocking member 34 is a parallelogram with side surfaces inclined by an angle θ 1 in the XZ plane. The light blocking member 34 blocks light. The thickness of the light blocking member 34 is set to be thinner than the thickness of the transparent member 33.
 隣接する2個の遮光部材34は、Z方向において互いの端部が若干重なるように配置される。 Two adjacent light blocking members 34 are positioned so that their ends slightly overlap in the Z direction.
 基材31、32、及び透明部材33としては、ガラス、又は透明な樹脂(アクリル樹脂を含む)が用いられる。遮光部材34としては、例えば、黒色の染料又は顔料が混入された樹脂が用いられる。 Glass or transparent resin (including acrylic resin) is used for the base materials 31 and 32 and the transparent member 33. For example, resin mixed with black dye or pigment is used for the light blocking member 34.
 なお、基材31、32の一方又は両方を省略して、配向制御素子30を構成してもよい。複数の透明部材33と複数の遮光部材34とが交互に配置されていれば、配向制御素子30の機能を実現できる。 The orientation control element 30 may be constructed by omitting one or both of the substrates 31 and 32. The function of the orientation control element 30 can be realized if multiple transparent members 33 and multiple light blocking members 34 are arranged alternately.
 このように構成された配向制御素子30は、法線方向に対して角度θの斜め方向の光強度がピークになるように、表示光を透過することができる。角度θは、例えば10度以上60度以下に設定される。例えば、配向制御素子30は、法線方向に対して30°±30°の範囲以外の光成分を遮光するように構成される。望ましくは、配向制御素子30は、法線方向に対して30°±20°の範囲以外の光成分を遮光するように構成される。 The orientation control element 30 configured in this manner can transmit the display light so that the light intensity in the oblique direction at an angle θ 1 with respect to the normal direction becomes a peak. The angle θ 1 is set, for example, to be equal to or greater than 10 degrees and equal to or less than 60 degrees. For example, the orientation control element 30 is configured to block light components other than the range of 30°±30° with respect to the normal direction. Desirably, the orientation control element 30 is configured to block light components other than the range of 30°±20° with respect to the normal direction.
 なお、変形例として、配向制御素子30は、照明素子10と表示素子20との間に配置してもよい。また、配向制御素子30を省略して、空中表示装置1を構成してもよい。 As a modified example, the orientation control element 30 may be disposed between the lighting element 10 and the display element 20. Also, the orientation control element 30 may be omitted when constructing the aerial display device 1.
 [1-1-2] 光学素子40の構成
 図5は、図1に示した光学素子40の斜視図である。図5には、光学素子40の一部を拡大した拡大図も図示している。図5の拡大図は、XZ面における側面図である。
[1-1-2] Configuration of the Optical Element 40 Fig. 5 is a perspective view of the optical element 40 shown in Fig. 1. Fig. 5 also shows an enlarged view of a portion of the optical element 40. The enlarged view in Fig. 5 is a side view in the XZ plane.
 光学素子40は、基材41、及び複数の光学要素42を備える。基材41は、XY面において平面状に構成され、直方体を有する。 The optical element 40 includes a substrate 41 and a plurality of optical elements 42. The substrate 41 is configured to be planar in the XY plane and has a rectangular parallelepiped shape.
 基材41の底面には、複数の光学要素42が設けられる。複数の光学要素42の各々は、三角柱で構成される。光学要素42は、三角柱の3個の側面がXY面と平行になるように配置され、1つの側面が基材41に接する。複数の光学要素42は、それぞれがY方向に延び、X方向に並んで配置される。換言すると、複数の光学要素42は、XZ面において鋸歯状を有する。 A plurality of optical elements 42 are provided on the bottom surface of the substrate 41. Each of the plurality of optical elements 42 is composed of a triangular prism. The optical element 42 is arranged so that three side surfaces of the triangular prism are parallel to the XY plane, and one side surface is in contact with the substrate 41. The plurality of optical elements 42 each extend in the Y direction and are arranged side by side in the X direction. In other words, the plurality of optical elements 42 have a sawtooth shape in the XZ plane.
 複数の光学要素42の各々は、入射面43及び反射面44を有する。Y方向から見て、左側の側面が入射面43であり、右側の側面が反射面44である。入射面43は、表示素子20からの光が入射する面である。反射面44は、入射面43に外部から入射した光を、光学要素42の内部で反射する面である。入射面43と反射面44とは、角度θを有する。 Each of the multiple optical elements 42 has an incident surface 43 and a reflective surface 44. When viewed from the Y direction, the left side surface is the incident surface 43, and the right side surface is the reflective surface 44. The incident surface 43 is a surface onto which light from the display element 20 is incident. The reflective surface 44 is a surface that reflects light that has been incident on the incident surface 43 from the outside, within the optical element 42. The incident surface 43 and the reflective surface 44 form an angle θ p .
 基材41及び光学要素42は、透明材料で構成される。光学要素42は、例えば、基材41と同じ透明材料によって基材41と一体的に形成される。基材41と光学要素42とを個別に形成し、透明な接着材を用いて基材41に光学要素42を接着してもよい。基材41及び光学要素42を構成する透明材料としては、ガラス、又は透明な樹脂(アクリル樹脂を含む)が用いられる。 The substrate 41 and the optical element 42 are made of a transparent material. The optical element 42 is, for example, formed integrally with the substrate 41 using the same transparent material as the substrate 41. The substrate 41 and the optical element 42 may be formed separately, and the optical element 42 may be adhered to the substrate 41 using a transparent adhesive. The transparent material that constitutes the substrate 41 and the optical element 42 may be glass or a transparent resin (including an acrylic resin).
 このように構成された光学素子40は、入射光を内部で反射して、空中に実像を結像する。また、光学素子40は、素子面の正面の位置に、空中像2を結像する。 The optical element 40 configured in this way internally reflects incident light and forms a real image in the air. The optical element 40 also forms an aerial image 2 at a position in front of the element surface.
 [1-1-3] 空中表示装置1のブロック構成
 図6は、空中表示装置1のブロック図である。空中表示装置1は、制御部70、記憶部71、入出力インターフェース(入出力IF)72、表示部73、センシング素子50、及び入力部74を備える。制御部70、記憶部71、及び入出力インターフェース72は、バス75を介して互いに接続される。
[1-1-3] Block configuration of the aerial display device 1 Figure 6 is a block diagram of the aerial display device 1. The aerial display device 1 includes a control unit 70, a memory unit 71, an input/output interface (input/output IF) 72, a display unit 73, a sensing element 50, and an input unit 74. The control unit 70, the memory unit 71, and the input/output interface 72 are connected to each other via a bus 75.
 入出力インターフェース72は、表示部73、センシング素子50、及び入力部74に接続される。入出力インターフェース72は、表示部73、センシング素子50、及び入力部74のそれぞれに対して、所定の規格に応じたインターフェース処理を行う。 The input/output interface 72 is connected to the display unit 73, the sensing element 50, and the input unit 74. The input/output interface 72 performs interface processing for each of the display unit 73, the sensing element 50, and the input unit 74 in accordance with a predetermined standard.
 表示部73は、照明素子10、及び表示素子20を備える。表示部73は、画像を表示する。 The display unit 73 includes an illumination element 10 and a display element 20. The display unit 73 displays an image.
 センシング素子50は、発光部51、及び受光部52を備える。発光部51は、検知領域に向けて赤外光を発光する。受光部52は、対象物で反射された反射光を検知する。 The sensing element 50 includes a light-emitting unit 51 and a light-receiving unit 52. The light-emitting unit 51 emits infrared light toward the detection area. The light-receiving unit 52 detects the light reflected by the object.
 制御部70は、CPU(Central Processing Unit)やMPU(Micro Processing Unit)等の1つ以上のプロセッサにより構成される。制御部70は、記憶部71に格納されたプログラムを実行することで各種機能を実現する。制御部70は、表示処理部70A、情報処理部70B、及び検知位置算出部70Cを備える。 The control unit 70 is composed of one or more processors, such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit). The control unit 70 realizes various functions by executing programs stored in the memory unit 71. The control unit 70 includes a display processing unit 70A, an information processing unit 70B, and a detection position calculation unit 70C.
 表示処理部70Aは、表示部73(具体的には、照明素子10、及び表示素子20)の動作を制御する。表示処理部70Aは、照明素子10のオン及びオフを制御する。表示処理部70Aは、表示素子20に画像信号を送信し、表示素子20に画像を表示させる。 The display processing unit 70A controls the operation of the display unit 73 (specifically, the lighting element 10 and the display element 20). The display processing unit 70A controls the on and off of the lighting element 10. The display processing unit 70A transmits an image signal to the display element 20, causing the display element 20 to display an image.
 情報処理部70Bは、空中表示装置1が表示する画像を生成する。情報処理部70Bは、記憶部71に格納された画像データを用いることが可能である。情報処理部70Bは、図示せぬ通信機能を用いて外部から画像データを取得してもよい。 The information processing unit 70B generates an image to be displayed by the aerial display device 1. The information processing unit 70B can use image data stored in the memory unit 71. The information processing unit 70B may also obtain image data from outside using a communication function (not shown).
 検知位置算出部70Cは、センシング素子50の動作を制御する。検知位置算出部70Cは、センシング素子50に含まれる発光部51が赤外光を出射するように制御し、所定の空間領域に赤外光からなる検知領域を形成する。検知位置算出部70Cは、センシング素子50に含まれる受光部52から送られる複数の検知信号に基づいて、対象物の位置を算出する。 The detection position calculation unit 70C controls the operation of the sensing element 50. The detection position calculation unit 70C controls the light emitting unit 51 included in the sensing element 50 to emit infrared light, and forms a detection area consisting of infrared light in a specified spatial region. The detection position calculation unit 70C calculates the position of the target object based on multiple detection signals sent from the light receiving unit 52 included in the sensing element 50.
 記憶部71は、ROM(Read Only Memory)、HDD(Hard Disk Drive)、及びSSD(Solid State Drive)等の不揮発性記憶装置と、RAM(Random Access Memory)、及びレジスタ等の揮発性記憶装置とを含む。記憶部71は、制御部70が実行するプログラムを格納する。記憶部71は、制御部70の制御に必要な各種データを格納する。記憶部71は、空中表示装置1が表示する画像のデータを格納する。 The memory unit 71 includes non-volatile memory devices such as a ROM (Read Only Memory), HDD (Hard Disk Drive), and SSD (Solid State Drive), and volatile memory devices such as a RAM (Random Access Memory) and registers. The memory unit 71 stores programs executed by the control unit 70. The memory unit 71 stores various data necessary for the control of the control unit 70. The memory unit 71 stores data for images displayed by the aerial display device 1.
 入力部74は、例えばタッチパネルやボタンなどを含み、ユーザが入力した情報を受け付ける。情報処理部70Bは、入力部74が受け付けた情報に基づいて、表示部73に表示する画像を選択することが可能である。 The input unit 74 includes, for example, a touch panel or buttons, and receives information input by the user. The information processing unit 70B is capable of selecting an image to be displayed on the display unit 73 based on the information received by the input unit 74.
 [1-2] 空中表示装置1の動作
 次に、上記のように構成された空中表示装置1の動作について説明する。
[1-2] Operation of the aerial display device 1 Next, the operation of the aerial display device 1 configured as described above will be described.
 図2の矢印は、光路を示している。図2に示すように、表示素子20の任意の点“o”から出射された光は、配向制御素子30に入射する。表示素子20から出射された光のうち角度θの光成分(角度θを中心とした所定の角度範囲の光成分を含む)は、配向制御素子30を透過する。配向制御素子30を透過した光は、光学素子40に入射する。光学素子40は、入射光を、配向制御素子30と反対側に反射し、空中に空中像2を結像する。 The arrows in Fig. 2 indicate optical paths. As shown in Fig. 2, light emitted from an arbitrary point "o" of the display element 20 is incident on the orientation control element 30. Of the light emitted from the display element 20, a light component at an angle θ1 (including a light component in a predetermined angle range centered on the angle θ1 ) is transmitted through the orientation control element 30. The light transmitted through the orientation control element 30 is incident on the optical element 40. The optical element 40 reflects the incident light to the opposite side to the orientation control element 30, and forms an aerial image 2 in the air.
 図7は、光学素子40における光の反射の様子を説明する斜視図である。図8は、光学素子40における光の反射の様子を説明するXZ面の側面図である。図8は、観察者3の両目(すなわち、両目を結ぶ線)がX方向に平行な状態で光学素子40を見た図である。図9は、光学素子40における光の反射の様子を説明するYZ面の側面図である。図9は、観察者3の両目がY方向に平行な状態で光学素子40を見た図である。 Figure 7 is a perspective view illustrating how light is reflected in the optical element 40. Figure 8 is a side view of the XZ plane illustrating how light is reflected in the optical element 40. Figure 8 is a view of the optical element 40 when both eyes of the observer 3 (i.e., the line connecting both eyes) are parallel to the X direction. Figure 9 is a side view of the YZ plane illustrating how light is reflected in the optical element 40. Figure 9 is a view of the optical element 40 when both eyes of the observer 3 are parallel to the Y direction.
 表示素子20の任意の点“o”から出射された光は、光学素子40の入射面43に入射し、反射面44に到達する。反射面44の法線方向に対して臨界角よりも大きい角度で反射面44に到達した光は、反射面44で全反射され、光学素子40の光学要素42が形成されている側と反対側の平面から出射される。臨界角とは、その入射角を超えると全反射する最少の入射角である。臨界角は、入射面の垂線に対する角度である。 Light emitted from an arbitrary point "o" of the display element 20 enters the incident surface 43 of the optical element 40 and reaches the reflecting surface 44. Light that reaches the reflecting surface 44 at an angle greater than the critical angle with respect to the normal direction of the reflecting surface 44 is totally reflected by the reflecting surface 44 and is emitted from the flat surface opposite the side on which the optical elements 42 of the optical element 40 are formed. The critical angle is the smallest incident angle above which total reflection occurs. The critical angle is the angle with respect to the perpendicular to the incident surface.
 図8のXZ面では、点“o”から出射された光は、光学要素42の反射面44で全反射され、その光は空中で結像されて空中像を生成する。 In the XZ plane of FIG. 8, the light emitted from point "o" is totally reflected by the reflecting surface 44 of the optical element 42, and the light is focused in the air to generate an aerial image.
 図9のYZ面では、点“o”から出射された光は、光学要素42の反射面44で反射されず、その光は空中で結像することがないため空中像の生成に寄与しない。 In the YZ plane of FIG. 9, the light emitted from point "o" is not reflected by the reflecting surface 44 of the optical element 42, and the light does not form an image in the air, so it does not contribute to the generation of an aerial image.
 すなわち、観察者3が空中像を視認できる条件は、観察者3の両眼がX方向に平行、又はそれに近い状態(例えばX方向に対して±10度)である。また、観察者3の両眼がX方向に平行、又はそれに近い状態でY方向に沿って視点を移動した場合、空中像を常に認識することができる。 In other words, the condition for observer 3 to be able to see the aerial image is that both eyes of observer 3 are parallel to the X direction or close to it (for example, ±10 degrees with respect to the X direction). Also, if observer 3's eyes are parallel to the X direction or close to it and the viewpoint is moved along the Y direction, the aerial image can always be recognized.
 図10は、光学素子40における入射面43及び反射面44の角度条件を説明する図である。 FIG. 10 is a diagram explaining the angular conditions of the incident surface 43 and the reflecting surface 44 of the optical element 40.
 Z方向(素子面に垂直な方向)に対する入射面43の角度をθ2、Z方向に対する反射面44の角度をθ3、入射面43と反射面44とのなす角度をθpとする。角度θpは、以下の式(1)で表される。
 θp=θ2+θ3   ・・・(1)
The angle of the incident surface 43 with respect to the Z direction (the direction perpendicular to the element surface) is θ2, the angle of the reflecting surface 44 with respect to the Z direction is θ3, and the angle between the incident surface 43 and the reflecting surface 44 is θp. The angle θp is expressed by the following formula (1).
θp = θ2 + θ3   ... (1)
 配向制御素子30から角度θ1で出射された光は、入射面43に入射する。光学素子40の材料の屈折率をnp、空気の屈折率を1とする。入射面43における入射角をθ4、屈折角をθ5とする。反射面44における入射角をθ6、反射角をθ7(=θ6)とする。光学素子40の上面における入射角をθ8、屈折角をθ9とする。屈折角θ9が出射角である。出射角θ9は、以下の式(2)で表される。
 θ9=sin-1(np*sin(sin-1((1/np)*sin(90°-(θ1+θ2)))+θ2+2θ3-90°))   ・・・(2)
Light emitted from the orientation control element 30 at an angle θ1 is incident on the incident surface 43. The refractive index of the material of the optical element 40 is np, and the refractive index of air is 1. The angle of incidence at the incident surface 43 is θ4, and the angle of refraction is θ5. The angle of incidence at the reflecting surface 44 is θ6, and the angle of reflection is θ7 (= θ6). The angle of incidence at the upper surface of the optical element 40 is θ8, and the angle of refraction is θ9. The refraction angle θ9 is the exit angle. The exit angle θ9 is expressed by the following formula (2).
θ9 = sin-1(np * sin(sin-1((1/np) * sin(90° - (θ1 + θ2))) + θ2 + 2θ3 - 90°))   ... (2)
 反射面44における臨界角は、以下の式(3)で表される。
 臨界角<θ6(=θ7)
 臨界角=sin-1(1/np)   ・・・(3)
The critical angle at the reflecting surface 44 is expressed by the following formula (3).
Critical angle < θ6 (= θ7)
Critical angle = sin-1(1/np)   ... (3)
 すなわち、反射面44における入射角θは、反射面44における臨界角より大きく設定される。換言すると、反射面44の角度θは、反射面44に入射する光の入射角が臨界角より大きくなるように設定される。 That is, the angle of incidence θ6 on the reflecting surface 44 is set to be larger than the critical angle on the reflecting surface 44. In other words, the angle θ3 of the reflecting surface 44 is set so that the angle of incidence of light incident on the reflecting surface 44 is larger than the critical angle.
 また、入射面43に入射した光は、入射面43で全反射されないように設定される。すなわち、入射面43の角度θは、入射面43に入射する光の入射角が臨界角より小さくなるように設定される。 Moreover, the angle θ2 of the incident surface 43 is set so that the light incident on the incident surface 43 is not totally reflected by the incident surface 43. In other words, the angle θ2 of the incident surface 43 is set so that the angle of incidence of the light incident on the incident surface 43 is smaller than the critical angle.
 光学素子40の素子面と空中像2の面との角度、及び光学素子40の素子面と空中像2の面との距離は、光学素子40に入射する光の角度θ、光学素子40の屈折率、光学素子40の入射面43の角度θ、光学素子40の反射面44の角度θを最適に設定することで調整が可能である。 The angle between the element surface of the optical element 40 and the surface of the aerial image 2, and the distance between the element surface of the optical element 40 and the surface of the aerial image 2 can be adjusted by optimally setting the angle θ 1 of light incident on the optical element 40, the refractive index of the optical element 40, the angle θ 2 of the incident surface 43 of the optical element 40, and the angle θ 3 of the reflecting surface 44 of the optical element 40.
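 As a rough numerical check of formulas (1) to (3), the following Python sketch evaluates the apex angle θp, the exit angle θ9, and the critical angle for one set of angles. The function name, the sample angles, and the refractive index are illustrative assumptions and are not values taken from this embodiment; the internal incidence angle θ6 on the reflecting surface 44 is not derived here and still has to be checked against the critical angle of formula (3).

    import math

    def aerial_exit_angle(theta1_deg, theta2_deg, theta3_deg, n_p):
        """Evaluate formulas (1)-(3) for one optical element 42.

        theta1: angle of the light entering the incident surface 43,
        theta2: tilt of the incident surface 43, theta3: tilt of the
        reflecting surface 44 (all in degrees, measured from the normal
        of the element surface); n_p: refractive index of the element.
        """
        t1, t2, t3 = (math.radians(v) for v in (theta1_deg, theta2_deg, theta3_deg))

        theta_p = theta2_deg + theta3_deg                      # formula (1)
        critical = math.degrees(math.asin(1.0 / n_p))          # formula (3)

        # Formula (2): refraction at surface 43, then exit through the flat upper surface.
        theta5 = math.asin((1.0 / n_p) * math.sin(math.pi / 2 - (t1 + t2)))
        theta8 = theta5 + t2 + 2 * t3 - math.pi / 2
        theta9 = math.degrees(math.asin(n_p * math.sin(theta8)))
        return theta_p, theta9, critical

    # Illustrative values only: light at 30 deg, faces tilted 10 deg and 25 deg, index 1.5.
    apex, exit_deg, crit = aerial_exit_angle(30.0, 10.0, 25.0, 1.5)
    print(f"theta_p = {apex:.1f} deg, theta9 = {exit_deg:.2f} deg, critical = {crit:.1f} deg")

 With these assumed values the exit angle comes out close to the normal direction, which is consistent with the aerial image 2 being formed in front of the element surface.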
 [1-3] 対象物の検知動作
 次に、対象物の検知動作について説明する。
[1-3] Object Detection Operation Next, the object detection operation will be described.
 図11は、観察者3が空中像2をタッチする様子を説明する図である。図11において、Y方向が鉛直方向(地面に直交する方向)であり、X方向が水平方向(地面に平行な方向)であり、Z方向が観察者3の正面方向(視線に平行な方向)である。観察者3の上下方向は、Y方向に沿った方向である。 Figure 11 is a diagram explaining how the observer 3 touches the aerial image 2. In Figure 11, the Y direction is the vertical direction (direction perpendicular to the ground), the X direction is the horizontal direction (direction parallel to the ground), and the Z direction is the front direction of the observer 3 (direction parallel to the line of sight). The up and down direction of the observer 3 is along the Y direction.
 空中表示装置1は、空中に空中像2A、2Bを結像する。空中像2A、2Bは、例えば、四角形の押しボタンである。空中像2A、2Bは、互いに異なる2次元の空間領域に表示される。 The aerial display device 1 forms aerial images 2A and 2B in the air. The aerial images 2A and 2B are, for example, rectangular push buttons. The aerial images 2A and 2B are displayed in different two-dimensional spatial regions.
 センシング素子50は、制御部70の制御に基づいて、空中像2A、2Bにおおよそ重なる領域にそれぞれ、検知領域53A、53Bを形成する。検知領域53A、53Bは、赤外光で形成される。空中像の形状は、空中表示装置1が表示する画像として予め決められており、空中像の形状に関する情報は、空中表示装置1が表示する画像の情報に関連付けて、記憶部71に格納されている。空中像2Aが占める領域は、検知領域53Aとして設定され、検知領域53Aの情報は、記憶部71に格納されている。同様に、複数の空中像の各々について、検知領域が設定される。 Under the control of the control unit 70, the sensing element 50 forms detection areas 53A and 53B in areas that roughly overlap the aerial images 2A and 2B, respectively. Detection areas 53A and 53B are formed with infrared light. The shape of the aerial image is predetermined as the image to be displayed by the aerial display device 1, and information regarding the shape of the aerial image is stored in the memory unit 71 in association with information regarding the image displayed by the aerial display device 1. The area occupied by the aerial image 2A is set as detection area 53A, and information regarding detection area 53A is stored in the memory unit 71. Similarly, a detection area is set for each of the multiple aerial images.
 観察者3は、自身の指3Aで空中像2A又は空中像2Bをタッチする。センシング素子50は、制御部70の制御に基づいて、検知領域53A又は検知領域53Bに存在する観察者3の指3Aを検知する。具体的には、検知位置算出部70Cは、センシング素子50の受光部52から送られる信号に基づいて、対象物の位置を算出する。検知位置算出部70Cは、算出した位置が例えば検知領域53Aに存在する場合、空中像2Aがタッチされたと判定する。 The observer 3 touches the aerial image 2A or 2B with his/her finger 3A. The sensing element 50 detects the finger 3A of the observer 3 present in the detection area 53A or the detection area 53B based on the control of the control unit 70. Specifically, the detection position calculation unit 70C calculates the position of the object based on the signal sent from the light receiving unit 52 of the sensing element 50. If the calculated position is present in the detection area 53A, for example, the detection position calculation unit 70C determines that the aerial image 2A has been touched.
 図12は、観察者3が視点を上下に動かした場合の空中像の様子を説明する図である。観察者3が空中表示装置1の正面から上側に視点を移動した場合、観察者3が視認する空中像2A、2Bも上側に移動する。図12は、空中像2A、2Bが上側に移動した場合を例示している。観察者3が空中表示装置1の正面から下側に視点を移動した場合、観察者3が視認する空中像2A、2Bも下側に移動する。 Figure 12 is a diagram explaining the state of the aerial images when observer 3 moves his/her viewpoint up and down. When observer 3 moves his/her viewpoint from the front of the aerial display device 1 to the upper side, the aerial images 2A, 2B viewed by observer 3 also move upward. Figure 12 illustrates an example in which aerial images 2A, 2B move upward. When observer 3 moves his/her viewpoint from the front of the aerial display device 1 to the lower side, the aerial images 2A, 2B viewed by observer 3 also move downward.
 一方、検知領域53A及び検知領域53Bの情報は、表示する空中像に基づいて制御部70により予め管理されており、検知領域53A及び検知領域53Bは、常に同じ位置に設定される。よって、観察者3から見て、空中像2A、2Bと検知領域53A、53Bとがそれぞれずれる場合がある。 On the other hand, the information on detection areas 53A and 53B is managed in advance by the control unit 70 based on the aerial image to be displayed, and detection areas 53A and 53B are always set to the same position. Therefore, when viewed by the observer 3, there may be a misalignment between the aerial images 2A and 2B and the detection areas 53A and 53B.
 図13は、空中表示装置1が表示する空中像の一例を説明する図である。空中表示装置1は、4個の空中像2A~2Dを表示する。空中像2A~2Dの各々は、例えば、四角形の押しボタンである。空中像2A~2Dは、四角形に配置される。空中像2A~2Dは、例えば、互いに異なる色に着色される。 FIG. 13 is a diagram illustrating an example of an aerial image displayed by the aerial display device 1. The aerial display device 1 displays four aerial images 2A to 2D. Each of the aerial images 2A to 2D is, for example, a rectangular push button. The aerial images 2A to 2D are arranged in a rectangle. The aerial images 2A to 2D are, for example, colored differently from each other.
 図14は、センシング素子50が形成する検知領域の一例を説明する図である。センシング素子50は、4個の空中像2A~2Dにそれぞれ対応する4個の検知領域53A~53Dを形成する。検知領域53Aの外形は、空中像2Aと同じ形状を有する。検知領域53Aは、平面視において、空中像2Aと重なるように配置されるとともに、空中像2Aとおおよそ同じレベルに配置される。空中像2A及び検知領域53Aのレベルとは、空中表示装置1の光出射面からの高さを意味する。検知領域53B~53Dの構成も、検知領域53Aと同様である。 FIG. 14 is a diagram illustrating an example of a detection area formed by the sensing element 50. The sensing element 50 forms four detection areas 53A-53D corresponding to the four aerial images 2A-2D, respectively. The outer shape of the detection area 53A has the same shape as the aerial image 2A. In a planar view, the detection area 53A is positioned so as to overlap with the aerial image 2A and is positioned at approximately the same level as the aerial image 2A. The level of the aerial image 2A and the detection area 53A refers to the height from the light emission surface of the aerial display device 1. The configuration of the detection areas 53B-53D is also similar to that of the detection area 53A.
 図15は、一部の検知領域(検知領域53A、53B)の詳細な構成を説明する図である。検知領域53Aは、それぞれがX方向に延び、Y方向に並んだ複数の検知部分領域54Aを備える。複数の検知部分領域54Aの各々は、長方形を有する。検知部分領域54Aの幅(Y方向の長さ)は、適宜設定可能であり、検知精度を向上させる観点から、より狭い方が望ましい。隣接する検知部分領域54Aの間隔は、適宜設定可能であり、検知精度を向上させる観点から、より狭い方が望ましい。センシング素子50は、検知部分領域54Aごとに、対象物を検知することが可能である。同様に、検知領域53Bは、複数の検知部分領域54Bを備える。検知領域53C、53Dの構成も、検知領域53Aと同様である。検知領域53A~53Dに含まれる検知部分領域の幅及び間隔はそれぞれ同じである。 FIG. 15 is a diagram for explaining the detailed configuration of some of the detection regions ( detection regions 53A, 53B). Detection region 53A includes multiple detection partial regions 54A, each extending in the X direction and aligned in the Y direction. Each of the multiple detection partial regions 54A has a rectangular shape. The width (length in the Y direction) of the detection partial region 54A can be set as appropriate, and it is preferable that it is narrower from the viewpoint of improving detection accuracy. The interval between adjacent detection partial regions 54A can be set as appropriate, and it is preferable that it is narrower from the viewpoint of improving detection accuracy. The sensing element 50 can detect an object for each detection partial region 54A. Similarly, detection region 53B includes multiple detection partial regions 54B. The configuration of detection regions 53C and 53D is also the same as that of detection region 53A. The width and interval of the detection partial regions included in detection regions 53A to 53D are the same.
 図16は、センシング素子50による検知動作を説明する模式図である。図16には、2個の検知領域53A、53Bを示している。図16は、検知領域53A、53Bを観察者3と反対側(空中表示装置1側)から見た図である。 FIG. 16 is a schematic diagram explaining the detection operation by the sensing element 50. Two detection areas 53A and 53B are shown in FIG. 16. FIG. 16 shows the detection areas 53A and 53B as viewed from the opposite side to the observer 3 (the aerial display device 1 side).
 観察者3の指3Aが検知領域53A、53Bの間の領域を含むようにして検知領域53A、53Bの両方に存在しているものとする。検知領域53Aにおいて、m本の検知部分領域54Aがオンしている。検知領域53Bにおいて、n本の検知部分領域54Bがオンしている。“検知部分領域がオン”とは、検知部分領域が対象物を検知している状態をいう。mは、0以上かつ全検知部分領域54Aの数以下である。nは、0以上かつ全検知部分領域54Bの数以下である。 It is assumed that the finger 3A of the observer 3 is present in both detection regions 53A and 53B, including the area between the detection regions 53A and 53B. In the detection region 53A, m detection partial regions 54A are on. In the detection region 53B, n detection partial regions 54B are on. "Detection partial region is on" refers to the state in which the detection partial region is detecting an object. m is greater than or equal to 0 and less than or equal to the total number of detection partial regions 54A. n is greater than or equal to 0 and less than or equal to the total number of detection partial regions 54B.
 センシング素子50は、対象物を以下のように判定する。
・「m>n」の場合は、空中像2Aがタッチされている。「m>n」は、対象物のうち検知領域53Aに存在する領域が、検知領域53Bに存在する領域より大きいことを意味する。
・「m<n」の場合は、空中像2Bがタッチされている。「m<n」は、対象物のうち検知領域53Bに存在する領域が、検知領域53Aに存在する領域より大きいことを意味する。
・「m=n」の場合は、タッチ操作が無いと判定する。
The sensing element 50 determines the object as follows.
If “m>n”, the aerial image 2A is touched. “m>n” means that the area of the object present in the detection area 53A is larger than the area present in the detection area 53B.
When “m<n”, the aerial image 2B is touched. “m<n” means that the area of the object that exists in the detection area 53B is larger than the area that exists in the detection area 53A.
If "m=n", it is determined that there is no touch operation.
 このような対象物の判定処理により、観察者3のタッチ操作をより正確に判定できる。 By performing this type of object determination process, the touch operation by the observer 3 can be determined more accurately.
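 Written as code, the determination rule above amounts to the following minimal sketch; the function name and return values are hypothetical and only for illustration.

    def judge_touch(m, n):
        """Decide which aerial image was touched from the number of detection
        partial regions that are on: m for detection region 53A, n for 53B."""
        if m > n:
            return "2A"   # the object occupies more of detection region 53A
        if m < n:
            return "2B"   # the object occupies more of detection region 53B
        return None       # m == n: treated as no touch operation

    # Example in the spirit of Fig. 16: the finger covers both regions unevenly.
    print(judge_touch(m=4, n=2))  # -> "2A"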
 [1-4] 全体動作の流れ
 次に、空中表示装置1における全体動作の流れについて説明する。図17は、空中表示装置1における全体動作を説明するフローチャートである。以下の説明では、図16に対応した動作について説明する。すなわち、空中表示装置1は、2個の空中像2A、2Bを表示するとともに、2個の検知領域53A、53Bを形成するものとする。
[1-4] Flow of Overall Operation Next, the flow of overall operation in the aerial display device 1 will be described. Figure 17 is a flowchart explaining the overall operation in the aerial display device 1. In the following explanation, the operation corresponding to Figure 16 will be explained. That is, the aerial display device 1 displays two aerial images 2A, 2B and forms two detection areas 53A, 53B.
 制御部70は、空中像2A、2Bを表示する(ステップS100)。表示処理部70Aは、表示素子20の画面に画像を表示させる。光学素子40は、表示素子20からの光を反射し、空中に空中像2A、2Bを結像する。 The control unit 70 displays the aerial images 2A and 2B (step S100). The display processing unit 70A displays an image on the screen of the display element 20. The optical element 40 reflects light from the display element 20 and forms the aerial images 2A and 2B in the air.
 続いて、センシング素子50は、センシング動作を実行する(ステップS101)。センシング素子50に含まれる発光部51は、空中像2A、2Bが表示された領域を含む検知領域53A、53Bに、赤外光を出射する。 Then, the sensing element 50 executes a sensing operation (step S101). The light-emitting unit 51 included in the sensing element 50 emits infrared light to the detection areas 53A and 53B including the areas where the aerial images 2A and 2B are displayed.
 続いて、センシング素子50は、検知領域53A、53B内に対象物が存在するか否かを監視している(ステップS102)。すなわち、センシング素子50に含まれる受光部52は、対象物で反射された赤外光を監視している。 Then, the sensing element 50 monitors whether or not an object is present within the detection areas 53A and 53B (step S102). That is, the light receiving unit 52 included in the sensing element 50 monitors the infrared light reflected by the object.
 センシング素子50が対象物を検知した場合(S102=Yes)、検知位置算出部70Cは、センシング素子50により検知された信号に基づいて、オンしている検知部分領域の数を判定する(ステップS103)。具体的には、検知位置算出部70Cは、検知領域53Aに含まれる全ての検知部分領域54Aのうちオンしている検知部分領域54Aの数mと、検知領域53Bに含まれる全ての検知部分領域54Bのうちオンしている検知部分領域54Bの数nとの関係を判定する。検知位置算出部70Cは、数mと数nとを比較し、数mと数nとの大小関係を判定する。 If the sensing element 50 detects an object (S102 = Yes), the detection position calculation unit 70C determines the number of ON detection partial areas based on the signal detected by the sensing element 50 (step S103). Specifically, the detection position calculation unit 70C determines the relationship between the number m of ON detection partial areas 54A among all detection partial areas 54A included in the detection area 53A and the number n of ON detection partial areas 54B among all detection partial areas 54B included in the detection area 53B. The detection position calculation unit 70C compares the number m with the number n and determines whether the number m is larger or smaller than the number n.
 「m>n」である場合(ステップS104=Yes)、検知位置算出部70Cは、第1空中像(空中像2Aに対応する)がタッチされたと判定する(ステップS105)。 If "m>n" is true (step S104 = Yes), the detection position calculation unit 70C determines that the first aerial image (corresponding to aerial image 2A) has been touched (step S105).
 「m<n」である場合(ステップS106=Yes)、検知位置算出部70Cは、第2空中像(空中像2Bに対応する)がタッチされたと判定する(ステップS107)。 If "m<n" (step S106 = Yes), the detection position calculation unit 70C determines that the second aerial image (corresponding to aerial image 2B) has been touched (step S107).
 「m=n」である場合(ステップS106=No)、検知位置算出部70Cは、観察者3によるタッチ操作が無いと判定する(ステップS108)。 If "m = n" (step S106 = No), the detection position calculation unit 70C determines that there is no touch operation by the observer 3 (step S108).
 このようにして、観察者3による空中像2へのタッチ操作が検知される。その後、空中表示装置1は、観察者3のタッチ操作に応じた動作を実行する。 In this way, the touch operation on the aerial image 2 by the observer 3 is detected. After that, the aerial display device 1 executes an operation according to the touch operation by the observer 3.
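 The flow of Fig. 17 can be sketched as follows. This is a simplified, hypothetical rendering of steps S101 to S108: read_frame stands in for the sensing result and is assumed to return a mapping from each detection partial region to an on/off flag; it is not an API of the actual device.

    import time

    def count_on(frame, region_ids):
        """Count the detection partial regions of one detection region that are on."""
        return sum(1 for rid in region_ids if frame.get(rid, False))

    def detection_loop(read_frame, region_53a, region_53b, on_touch):
        """Steps S101-S108 of Fig. 17 (sensing side only; display step S100 is assumed done)."""
        while True:
            frame = read_frame()                 # S101: scan the detection regions
            if not any(frame.values()):          # S102: no object detected, keep monitoring
                time.sleep(0.01)
                continue
            m = count_on(frame, region_53a)      # S103: count partial regions that are on
            n = count_on(frame, region_53b)
            if m > n:                            # S104 -> S105: first aerial image touched
                on_touch("2A")
            elif m < n:                          # S106 -> S107: second aerial image touched
                on_touch("2B")
            # m == n: S108, treated as no touch operation
            time.sleep(0.01)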
 [1-5] 第1実施形態の効果
 第1実施形態では、ある空中像2に対応して検知領域53を形成し、この検知領域53を、複数のライン状の検知部分領域54で構成する。そして、対象物によりオンした検知部分領域54の数に応じて、観察者3のタッチ操作を判定するようにしている。これにより、隣接する2個の検知領域53の両方に対象物が存在する場合でも、観察者3が目標とする空中像2へのタッチ操作を認識することができる。ひいては、観察者3による空中像2へのタッチ操作をより正確に検知することが可能な空中表示装置1を実現できる。
[1-5] Effects of the First Embodiment In the first embodiment, a detection area 53 is formed corresponding to a certain aerial image 2, and this detection area 53 is composed of a plurality of linear detection partial areas 54. Then, the touch operation of the observer 3 is determined according to the number of detection partial areas 54 that are turned on by an object. This makes it possible to recognize the touch operation of the observer 3 on the targeted aerial image 2 even if an object exists in both of two adjacent detection areas 53. As a result, it is possible to realize an aerial display device 1 that can more accurately detect the touch operation of the observer 3 on the aerial image 2.
 また、空中表示装置1は、表示素子20から出射された光を光学素子40で反射させることで、空中に空中像2を表示することができる。また、空中表示装置1は、その正面方向において、光学素子40の素子面に平行に空中像2を表示することができる。また、表示品質を向上させることが可能な空中表示装置1を実現できる。 The aerial display device 1 can display an aerial image 2 in the air by reflecting light emitted from the display element 20 by the optical element 40. The aerial display device 1 can also display the aerial image 2 in the front direction, parallel to the element surface of the optical element 40. It is also possible to realize an aerial display device 1 that can improve display quality.
 また、観察者3の両眼がX方向(すなわち、複数の光学要素42が並ぶ方向)に平行、又はそれに近い状態で光学素子40を見た場合に、観察者3は、空中像を視認することができる。また、観察者3の両眼がX方向に平行、又はそれに近い状態でY方向に沿って視点を移動した場合、空中像を常に視認することができる。また、観察者3の両眼がX方向に平行、又はそれに近い状態において、より広い視野角を実現できる。 In addition, when the observer 3 looks at the optical element 40 with both eyes parallel to or close to the X direction (i.e., the direction in which the multiple optical elements 42 are arranged), the observer 3 can see an aerial image. In addition, when the observer 3 moves the viewpoint along the Y direction with both eyes parallel to or close to the X direction, the observer 3 can always see an aerial image. In addition, a wider viewing angle can be achieved when the observer 3's eyes are parallel to or close to the X direction.
 また、空中表示装置1を構成する複数の素子を平行に配置することができる。これにより、Z方向に小型化が可能な空中表示装置1を実現できる。 In addition, the multiple elements that make up the aerial display device 1 can be arranged in parallel. This makes it possible to realize an aerial display device 1 that can be made smaller in size in the Z direction.
 [2] 第2実施形態
 第2実施形態は、オンしている検知部分領域54Aの数mと、オンしている検知部分領域54Bの数nとが同じである場合、2個の空中像2A、2Bのどちらをタッチしているか判定できないことを観察者3に認識させるようにしている。
[2] Second embodiment In the second embodiment, when the number m of turned-on detection partial areas 54A is the same as the number n of turned-on detection partial areas 54B, the observer 3 is made aware that it is impossible to determine which of the two aerial images 2A, 2B he or she is touching.
 図18は、本発明の第2実施形態に係る空中表示装置1における全体動作を説明するフローチャートである。ステップS100~S107までの動作は、第1実施形態と同じである。 FIG. 18 is a flowchart explaining the overall operation of the aerial display device 1 according to the second embodiment of the present invention. The operations from steps S100 to S107 are the same as those in the first embodiment.
 「m=n」である場合(ステップS106=No)、制御部70は、第1及び第2空中像(空中像2A、2B)を点滅させる(ステップS200)。具体的には、表示処理部70Aは、表示素子20の画像を点滅させる。光学素子40は、点滅した第1及び第2空中像(空中像2A、2B)を結像する。この動作により、第1及び第2空中像のどちらがタッチされたか判定できないことを観察者3に認識させることが可能である。また、この動作により、観察者3が指3Aを目標の空中像にずらすように促すことが可能である。 If "m = n" (step S106 = No), the control unit 70 blinks the first and second aerial images ( aerial images 2A, 2B) (step S200). Specifically, the display processing unit 70A blinks the image on the display element 20. The optical element 40 forms the blinking first and second aerial images ( aerial images 2A, 2B). This operation makes it possible for the observer 3 to recognize that it is impossible to determine which of the first and second aerial images has been touched. This operation also makes it possible to prompt the observer 3 to move his/her finger 3A to the target aerial image.
 続いて、ステップS103に移行する。その後、「m>n」であるか、又は「m<n」であるかが判定される。 Then, the process proceeds to step S103. It is then determined whether "m>n" or "m<n" holds.
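 A minimal sketch of this tie handling in Fig. 18 is shown below. The callables read_counts (standing in for the m/n counting of step S103) and blink_images (asking the display side to blink the two candidate images) are hypothetical, and the retry limit is an added safeguard that the flowchart itself does not specify.

    def resolve_tie_by_blinking(read_counts, blink_images, max_retries=20):
        """Second embodiment: when m == n, blink aerial images 2A and 2B (S200)
        to prompt the observer to shift the finger, then re-evaluate (S103)."""
        for _ in range(max_retries):
            m, n = read_counts()
            if m > n:
                return "2A"            # S105
            if m < n:
                return "2B"            # S107
            blink_images("2A", "2B")   # S200: feedback to the observer
        return None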
 第2実施形態によれば、観察者3のタッチ操作をより正確に判定できる。その他の効果は、第1実施形態と同じである。 According to the second embodiment, the touch operation of the observer 3 can be determined more accurately. The other effects are the same as those of the first embodiment.
 なお、変形例として、検知対象の2個の空中像を点滅させる動作に替えて、検知対象の2個の空中像を所定の割合だけ拡大するようにしてもよい。 As a modified example, instead of blinking the two aerial images of the detection target, the two aerial images of the detection target may be enlarged by a predetermined ratio.
 [3] 第3実施形態
 第3実施形態は、オンしている検知部分領域54Aの数mと、オンしている検知部分領域54Bの数nとが同じである場合、タッチ操作の判定を一定時間だけ待機するようにしている。
[3] Third embodiment In the third embodiment, when the number m of turned-on detection partial areas 54A is the same as the number n of turned-on detection partial areas 54B, the determination of the touch operation is waited for a certain period of time.
 図19は、本発明の第3実施形態に係る空中表示装置1における全体動作を説明するフローチャートである。ステップS100~S107までの動作は、第1実施形態と同じである。 FIG. 19 is a flowchart explaining the overall operation of the aerial display device 1 according to the third embodiment of the present invention. The operations from step S100 to S107 are the same as those in the first embodiment.
 「m=n」である場合(ステップS106=No)、検知位置算出部70Cは、一定時間だけ待機する(ステップS201)。ステップS201の一定時間の間、検知位置算出部70Cは、空中像のタッチ操作の判定を停止する。この動作により、第1及び第2空中像のどちらがタッチされたか判定できないことを観察者3に認識させることが可能である。また、この動作により、観察者3が指3Aを目標の空中像にずらすように促すことが可能である。 If "m = n" (step S106 = No), the detection position calculation unit 70C waits for a fixed time (step S201). During the fixed time of step S201, the detection position calculation unit 70C stops determining whether the aerial image has been touched. This operation makes it possible for the observer 3 to recognize that it is not possible to determine whether the first or second aerial image has been touched. This operation also makes it possible to encourage the observer 3 to move the finger 3A to the target aerial image.
 続いて、ステップS103に移行する。その後、「m>n」であるか、又は「m<n」であるかが判定される。 Then, the process proceeds to step S103. It is then determined whether "m>n" or "m<n" holds.
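 The corresponding sketch for Fig. 19 waits instead of giving visual feedback and, like the flowchart, keeps looping until the counts differ. The wait time is an assumed value, since the embodiment only specifies a certain period of time, and read_counts again stands in for the counting of step S103.

    import time

    def resolve_tie_by_waiting(read_counts, wait_seconds=0.5):
        """Third embodiment: when m == n, suspend the judgment for a fixed time
        (S201), then compare the counts again (S103)."""
        while True:
            m, n = read_counts()
            if m != n:
                return "2A" if m > n else "2B"   # S105 / S107
            time.sleep(wait_seconds)             # S201: stop judging, letting the finger move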
 第3実施形態によれば、観察者3のタッチ操作をより正確に判定できる。その他の効果は、第1実施形態と同じである。 According to the third embodiment, the touch operation of the observer 3 can be determined more accurately. The other effects are the same as those of the first embodiment.
 [4] 変形例
 上記実施形態では、観察者3の上下方向(Y方向)に隣接する2個の空中像のどちらがタッチされたかを判定するようにしている。しかし、これに限定されず、X方向に隣接する2個の空中像についても、検知部分領域の数に応じてどちらがタッチされたかを判定するようにしてもよい。
[4] Modifications In the above embodiment, it is determined which of two aerial images adjacent in the up-down direction (Y direction) of the observer 3 has been touched. However, the present invention is not limited to this, and it may also be determined which of two aerial images adjacent in the X direction has been touched, depending on the number of detection partial regions.
 上記実施形態では、表示素子20と光学素子40とを平行に配置している。しかし、これに限定されず、光学素子40に対して表示素子20を斜めに配置してもよい。表示素子20と光学素子40との角度は、0度より大きく45度より小さい範囲に設定される。この変形例では、配向制御素子30を省略できる。 In the above embodiment, the display element 20 and the optical element 40 are arranged parallel to each other. However, this is not limited to this, and the display element 20 may be arranged diagonally relative to the optical element 40. The angle between the display element 20 and the optical element 40 is set in the range greater than 0 degrees and less than 45 degrees. In this modified example, the orientation control element 30 can be omitted.
 上記実施形態で説明した光学素子40に限定されず、他の種類の結像素子を用いることも可能である。例えば、2面コーナーリフレクタがアレイ状に配列された2面コーナーリフレクタアレイで光学素子40を構成してもよい。2面コーナーリフレクタアレイを用いた場合は、面対称の位置に空中像が結像される。 It is not limited to the optical element 40 described in the above embodiment, and other types of imaging elements can also be used. For example, the optical element 40 may be configured as a dihedral corner reflector array in which dihedral corner reflectors are arranged in an array. When a dihedral corner reflector array is used, an aerial image is formed at a plane-symmetrical position.
 上記実施形態では、光学要素42の左側の側面が入射面43、右側の側面が反射面44として定義している。しかし、これに限定されず、入射面43と反射面44とを逆に構成してもよい。この場合、実施形態で説明した空中表示装置1の作用も左右が逆になる。 In the above embodiment, the left side of the optical element 42 is defined as the incident surface 43, and the right side is defined as the reflective surface 44. However, this is not limited to this, and the incident surface 43 and the reflective surface 44 may be configured in reverse. In this case, the function of the aerial display device 1 described in the embodiment will also be reversed.
 上記実施形態では、表示素子20として液晶表示素子を例に挙げて説明しているが、これに限定されるものではない。表示素子20は、自発光型である有機EL(electroluminescence)表示素子、又はマイクロLED(Light Emitting Diode)表示素子などを用いることも可能である。マイクロLED表示素子は、画素を構成するR(赤)、G(緑)、B(青)をそれぞれLEDで発光させる表示素子である。自発光型の表示素子20を用いる場合、照明素子10は不要である。 In the above embodiment, a liquid crystal display element is used as an example of the display element 20, but the present invention is not limited to this. The display element 20 can also be a self-luminous organic EL (electroluminescence) display element or a micro LED (Light Emitting Diode) display element. A micro LED display element is a display element that uses LEDs to emit the R (red), G (green), and B (blue) that make up a pixel. When a self-luminous display element 20 is used, the lighting element 10 is not necessary.
 本発明は、上記実施形態に限定されるものではなく、実施段階ではその要旨を逸脱しない範囲で種々に変形することが可能である。また、各実施形態は適宜組み合わせて実施してもよく、その場合組み合わせた効果が得られる。更に、上記実施形態には種々の発明が含まれており、開示される複数の構成要件から選択された組み合わせにより種々の発明が抽出され得る。例えば、実施形態に示される全構成要件からいくつかの構成要件が削除されても、課題が解決でき、効果が得られる場合には、この構成要件が削除された構成が発明として抽出され得る。 The present invention is not limited to the above-described embodiments, and can be modified in various ways during implementation without departing from the gist of the invention. The embodiments may also be implemented in appropriate combination, in which case the combined effects can be obtained. Furthermore, the above-described embodiments include various inventions, and various inventions can be extracted by combinations selected from the multiple constituent elements disclosed. For example, if the problem can be solved and an effect can be obtained even if some constituent elements are deleted from all the constituent elements shown in the embodiments, the configuration from which these constituent elements are deleted can be extracted as an invention.
 1…空中表示装置、2…空中像、3…観察者、10…照明素子、11…光源部、12…導光板、13…反射シート、20…表示素子、30…配向制御素子、31…基材、32…基材、33…透明部材、34…遮光部材、40…光学素子、41…基材、42…光学要素、43…入射面、44…反射面、50…センシング素子、51…発光部、52…受光部、53…検知領域、54…検知部分領域、60…筐体、70…制御部、70A…表示処理部、70B…情報処理部、70C…検知位置算出部、71…記憶部、72…入出力インターフェース、73…表示部、74…入力部、75…バス。
 
1...aerial display device, 2...aerial image, 3...observer, 10...illumination element, 11...light source unit, 12...light guide plate, 13...reflective sheet, 20...display element, 30...orientation control element, 31...substrate, 32...substrate, 33...transparent member, 34...light-shielding member, 40...optical element, 41...substrate, 42...optical element, 43...incident surface, 44...reflective surface, 50...sensing element, 51...light-emitting unit, 52...light-receiving unit, 53...detection area, 54...detection partial area, 60...housing, 70...control unit, 70A...display processing unit, 70B...information processing unit, 70C...detection position calculation unit, 71...memory unit, 72...input/output interface, 73...display unit, 74...input unit, 75...bus.

Claims (11)

  1.  画像を表示する表示素子と、
     前記表示素子からの光を受けるように配置され、前記表示素子からの光を、前記表示素子と反対側に反射し、空中に空中像を結像する光学素子と、
     前記空中像と重なる空間領域に検知領域を形成し、前記検知領域内の対象物を検知するセンシング素子と、
     を具備し、
     前記空中像は、第1方向に沿って配置された第1及び第2空中像を含み、
     前記検知領域は、前記第1及び第2空中像にそれぞれ対応した第1及び第2検知領域を含み、
     前記第1検知領域は、それぞれが前記第1方向に直交する第2方向に延び、前記第1方向に並んだ複数の第1部分領域を含み、
     前記第2検知領域は、それぞれが前記第2方向に延び、前記第1方向に並んだ複数の第2部分領域を含み、
     前記センシング素子は、前記複数の第1部分領域のうちオンした第1部分領域の数と、前記複数の第2部分領域のうちオンした第2部分領域の数との関係に基づいて、前記第1及び第2空中像のどちらがタッチされたかを判定する
     空中表示装置。
    An aerial display device comprising:
    a display element that displays an image;
    an optical element arranged to receive light from the display element, the optical element reflecting the light from the display element toward a side opposite to the display element to form an aerial image in the air; and
    a sensing element that forms a detection region in a spatial region overlapping the aerial image and detects an object within the detection region,
    wherein the aerial image includes first and second aerial images arranged along a first direction,
    the detection region includes first and second detection regions corresponding to the first and second aerial images, respectively,
    the first detection region includes a plurality of first partial regions each extending in a second direction orthogonal to the first direction and arranged side by side in the first direction,
    the second detection region includes a plurality of second partial regions each extending in the second direction and arranged side by side in the first direction, and
    the sensing element determines which of the first and second aerial images has been touched based on a relationship between the number of first partial regions that are on among the plurality of first partial regions and the number of second partial regions that are on among the plurality of second partial regions.
  2.  前記複数の第1部分領域のうちオンした第1部分領域の数をm、前記複数の第2部分領域のうちオンした第2部分領域の数をnとすると、
     前記センシング素子は、m>nである場合に、前記第1空中像がタッチされたと判定し、m<nである場合に、前記第2空中像がタッチされたと判定する
     請求項1に記載の空中表示装置。
    Let m be the number of first partial regions that are turned on among the plurality of first partial regions, and n be the number of second partial regions that are turned on among the plurality of second partial regions.
    The aerial display device according to claim 1 , wherein the sensing element determines that the first aerial image has been touched if m>n, and determines that the second aerial image has been touched if m<n.
  3.  The aerial display device according to claim 2, wherein the sensing element determines that no touch operation has been performed when m = n.
  4.  The aerial display device according to claim 2, wherein the display element causes the first and second aerial images to blink when the sensing element determines that m = n.
  5.  The aerial display device according to claim 2, wherein the display element enlarges the first and second aerial images when the sensing element determines that m = n.
  6.  The aerial display device according to claim 2, wherein, when m = n, the sensing element waits for a predetermined period of time and, after the predetermined period of time has elapsed, determines the magnitude relationship between the numbers m and n.
  7.  The aerial display device according to claim 1, wherein the sensing element includes a light-emitting unit that emits light toward the detection area and a light-receiving unit that receives light reflected by the object.
  8.  The aerial display device according to claim 1, wherein the optical element includes a planar substrate and a plurality of optical elements provided below the substrate, each of the optical elements extending in the first direction and arranged side by side in the second direction, and
     each of the plurality of optical elements has an incident surface and a reflecting surface that are each inclined with respect to a normal direction of the substrate and are in contact with each other.
  9.  The aerial display device according to claim 1, further comprising an orientation control element disposed between the display element and the optical element, the orientation control element transmitting obliquely directed light components of the light from the display element.
  10.  The aerial display device according to claim 9, wherein the orientation control element includes a plurality of transparent members and a plurality of light-shielding members arranged alternately, and
     the plurality of light-shielding members are inclined with respect to a normal to the orientation control element.
  11.  The aerial display device according to claim 1, wherein the display element and the optical element are arranged parallel to each other.
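
Claims 1 to 6 recite a simple counting rule for deciding which of two side-by-side aerial images has been touched: the sensing element counts how many partial detection areas are turned on in each of the two detection areas and compares the counts. The following is a minimal, non-normative sketch of that decision logic in Python; the function and type names are hypothetical (they do not appear in the publication), and it assumes the sensing element reports each partial detection area as a boolean on/off state.

```python
from enum import Enum
from typing import Sequence


class TouchResult(Enum):
    FIRST_IMAGE = "first aerial image touched"    # m > n (claim 2)
    SECOND_IMAGE = "second aerial image touched"  # m < n (claim 2)
    NO_TOUCH = "no touch operation"               # m = n (claim 3)


def decide_touch(first_area_states: Sequence[bool],
                 second_area_states: Sequence[bool]) -> TouchResult:
    """Compare the numbers of turned-on partial detection areas.

    first_area_states / second_area_states are hypothetical on/off states of
    the partial detection areas belonging to the first / second detection
    area; the claims only require that the turned-on areas can be counted.
    """
    m = sum(first_area_states)   # turned-on first partial areas
    n = sum(second_area_states)  # turned-on second partial areas

    if m > n:
        return TouchResult.FIRST_IMAGE
    if m < n:
        return TouchResult.SECOND_IMAGE
    return TouchResult.NO_TOUCH  # m = n; claims 4-6 describe follow-up actions
```

When the counts are equal (m = n), the dependent claims describe alternative follow-up behaviors rather than a single mandated one: blinking the aerial images (claim 4), enlarging them (claim 5), or waiting a fixed time and then re-evaluating the magnitude relationship between m and n (claim 6); a caller of a sketch like the one above would implement whichever of these the device adopts.
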
PCT/JP2023/039476 2022-11-07 2023-11-01 Aerial display device WO2024101245A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022178245A JP2024067866A (en) 2022-11-07 2022-11-07 Aerial Display Device
JP2022-178245 2022-11-07

Publications (1)

Publication Number Publication Date
WO2024101245A1 true WO2024101245A1 (en) 2024-05-16

Family

ID=91032936

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/039476 WO2024101245A1 (en) 2022-11-07 2023-11-01 Aerial display device

Country Status (2)

Country Link
JP (1) JP2024067866A (en)
WO (1) WO2024101245A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017125984A1 (en) * 2016-01-21 2017-07-27 パナソニックIpマネジメント株式会社 Aerial display device
JP2021139932A (en) * 2020-03-02 2021-09-16 凸版印刷株式会社 Aerial display device
JP2022063376A (en) * 2020-10-12 2022-04-22 凸版印刷株式会社 Aerial display device

Also Published As

Publication number Publication date
JP2024067866A (en) 2024-05-17
