CN115917399A - Virtual image display device - Google Patents

Virtual image display device

Info

Publication number
CN115917399A
Authority
CN
China
Prior art keywords
light
illumination light
unit
virtual image
stage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180049844.3A
Other languages
Chinese (zh)
Inventor
南原孝启
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Corp
Original Assignee
Denso Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp filed Critical Denso Corp
Publication of CN115917399A
Legal status: Pending

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/0006Arrays
    • G02B3/0037Arrays characterized by the distribution or form of lenses
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/02Diffusing elements; Afocal elements
    • G02B5/0205Diffusing elements; Afocal elements characterised by the diffusing properties
    • G02B5/021Diffusing elements; Afocal elements characterised by the diffusing properties the diffusion taking place at the element's surface, e.g. by means of surface roughening or microprismatic structures
    • G02B5/0215Diffusing elements; Afocal elements characterised by the diffusing properties the diffusion taking place at the element's surface, e.g. by means of surface roughening or microprismatic structures the surface having a regular structure
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/02Diffusing elements; Afocal elements
    • G02B5/0273Diffusing elements; Afocal elements characterized by the use
    • G02B5/0278Diffusing elements; Afocal elements characterized by the use used in transmission
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/1336Illuminating devices
    • G02F1/133602Direct backlight
    • G02F1/133606Direct backlight including a specially adapted diffusing, scattering or light controlling members
    • G02F1/133607Direct backlight including a specially adapted diffusing, scattering or light controlling members the light controlling member including light directing or refracting elements, e.g. prisms or lenses
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/1336Illuminating devices
    • G02F1/133602Direct backlight
    • G02F1/133603Direct backlight with LEDs

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Nonlinear Science (AREA)
  • Mathematical Physics (AREA)
  • Chemical & Material Sciences (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Instrument Panels (AREA)
  • Optical Elements Other Than Lenses (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)

Abstract

A virtual image display device of the present invention includes: an illumination unit (40) that emits white illumination light; an image forming unit (20) that forms an image by transmitting the illumination light and emits display light of the image; and a light condensing unit (30) that condenses the illumination light onto the image forming unit (20). The image forming unit (20) has a plurality of pixel regions (212) arranged in an array. The light condensing unit (30) has: front-stage lens portions (312), a plurality of which are arranged, each forming a wave-shaped diffusing surface (313) that diffuses the illumination light independently incident on each pixel region (212) within the angle space in which that illumination light is condensed; and rear-stage lens portions, a plurality of which are arranged at the rear stage of the front-stage lens portions (312), each having refracting surface portions formed intermittently in a direction orthogonal to the optical axis (A1), the refracting surface portions condensing, by refraction, the illumination light independently incident on each pixel region (212) into the angle space.

Description

Virtual image display device
Cross Reference to Related Applications
The present application is based on Japanese Patent Application No. 2020-122212, filed in Japan on July 16, 2020, and the contents of that basic application are incorporated herein by reference in their entirety.
Technical Field
The present disclosure relates to a virtual image display device.
Background
Conventionally, there is known a virtual image display device that reflects display light by a light-transmitting member and displays a virtual image formed by the display light so as to be visually confirmed.
For example, a virtual image display device disclosed in patent document 1 includes a display that forms an image by transmitting illumination light condensed by illumination lenses and emits display light of the image. The illumination lenses have an arrangement structure in which a plurality of illumination lenses are arranged such that lens surface portions, which converge the illumination light by refraction, are formed intermittently in a direction orthogonal to the respective optical axes. This can suppress uneven illuminance on the display.
Patent document 1: japanese patent No. 6237249
However, in the virtual image display device disclosed in patent document 1, a discontinuous surface whose surface shape changes discontinuously is formed at the boundaries of the arrangement structure. As a result, the illumination light incident on the boundaries of the arrangement structure generates edge light that is sharply emitted depending on the position of the viewpoint of the viewer, which may degrade the illumination quality and thereby impair the visibility of the virtual image.
Disclosure of Invention
Accordingly, an object of the present disclosure is to provide a virtual image display device that improves the visibility of a virtual image.
Hereinafter, technical means of the present disclosure for solving the problems will be described.
One aspect of the present disclosure is a virtual image display device that reflects display light by a light-transmitting member and displays a virtual image formed by the display light in a visually recognizable manner, the device including:
an illumination unit that emits white illumination light;
an image forming unit that forms an image by transmitting illumination light and emits display light of the image; and
a condensing unit that condenses the illumination light to the image forming unit,
the image forming unit has a plurality of pixel regions arranged,
the light-collecting unit has:
front-stage lens portions, a plurality of which are arranged, each forming a wave-shaped diffusing surface that diffuses the illumination light independently incident on each pixel region within the angle space in which that illumination light is condensed; and
rear-stage lens portions, a plurality of which are arranged at the rear stage of the front-stage lens portions, each having refracting surface portions formed intermittently in a direction orthogonal to the optical axis, the refracting surface portions condensing, by refraction, the illumination light independently incident on each pixel region into the angle space.
In the rear-stage lens portions, a plurality of which are arranged in the light condensing unit of such an embodiment, the refracting surface portions are formed intermittently in the direction orthogonal to the optical axis so as to condense, by refraction, the illumination light independently incident on each of the arranged pixel regions of the image forming unit into its angle space. Accordingly, in the front-stage lens portions, a plurality of which are arranged in the light condensing unit of this embodiment, the diffusing surface is formed so as to diffuse the illumination light independently incident on each pixel region within the angle space in which that illumination light is condensed. The illumination light subjected to this diffusing and condensing action in each front-stage lens portion therefore spreads apparently within the angle space and enters the corresponding rear-stage lens portion. As a result, edge light that is sharply emitted at the boundaries between the rear-stage lens portions is unlikely to be generated. This improves the illumination quality and thus the visibility of the virtual image.
Drawings
Fig. 1 is a schematic diagram showing the overall configuration of a virtual image display device according to a first embodiment.
Fig. 2 is a cross-sectional view showing a detailed configuration of the virtual image display device according to the first embodiment.
Fig. 3 is a view taken along line III-III of fig. 2.
Fig. 4 is a view taken along line IV-IV of fig. 2.
Fig. 5 is a perspective view of the front stage lens array of fig. 2.
Fig. 6 is an enlarged sectional view of fig. 2.
Fig. 7 is a view taken along line VII-VII of fig. 2.
Fig. 8 is an enlarged sectional view taken along line VIII-VIII of fig. 2.
Fig. 9 is an enlarged sectional view of fig. 2.
Fig. 10 is a view taken along line X-X of fig. 2.
Fig. 11 is a schematic diagram for explaining an example of illumination of the image forming panel of fig. 3.
Fig. 12 is a perspective view of the front stage lens array of the second embodiment.
Fig. 13 is a cross-sectional view of a virtual image display device according to a third embodiment, corresponding to fig. 2.
Fig. 14 is a cross-sectional view showing the virtual image display device according to the third embodiment in correspondence with fig. 6.
Fig. 15 is a cross-sectional view showing a modification of fig. 2.
Fig. 16 is an enlarged sectional view of fig. 15.
Fig. 17 is a cross-sectional view showing a modification of fig. 2.
Fig. 18 is a cross-sectional view showing a modification of fig. 2.
Detailed Description
Hereinafter, a plurality of embodiments will be described with reference to the drawings. In the embodiments, corresponding components are given the same reference numerals, and redundant description may be omitted. When only a part of a configuration is described in an embodiment, the configuration of another embodiment described earlier can be applied to the remaining part of that configuration. In addition to the combinations of configurations explicitly described in the respective embodiments, configurations of a plurality of embodiments may also be partially combined even without explicit description, as long as the combination causes no particular problem.
As shown in fig. 1, the virtual image display device according to the first embodiment is a head-up display (hereinafter, referred to as HUD) 100 mounted on a vehicle 1 and housed in an instrument panel 2 of the vehicle 1. Here, the vehicle 1 is broadly interpreted to include various vehicles such as an airplane, a ship, and a non-moving game machine body, in addition to an automobile and a railway vehicle, for example. In particular, the vehicle 1 of the present embodiment is a four-wheeled automobile. The front, rear, up, down, left, and right directions of the HUD100 are defined with reference to the vehicle 1 on the horizontal plane.
The HUD100 projects display light of an image toward the windshield 3 of the vehicle 1. The display light reflected by the windshield 3 reaches the visual confirmation area EB set in the interior of the vehicle 1. An occupant whose viewpoint EP inside the vehicle 1 is located in the visual confirmation area EB perceives the display light that has reached the visual confirmation area EB as a virtual image VRI. In this way, the HUD100 displays the virtual image VRI so that it can be visually confirmed by the visual confirmation person 4, an occupant of the vehicle 1, thereby allowing the visual confirmation person 4 to recognize various kinds of information. Examples of the various information displayed as the virtual image VRI by the HUD100 include information indicating the state of the vehicle 1, such as the vehicle speed and the remaining fuel level, as well as visual field assistance information, road information, and navigation information.
The visual confirmation area EB is a spatial area that can be visually confirmed by the visual confirmation person 4 when the virtual image VRI displayed by the HUD100 satisfies a predetermined specification (for example, when the virtual image VRI as a whole has a predetermined luminance or higher), and is also called an Eye movement range (Eye box). Typically, the visual confirmation area EB is set to overlap with the eye movement range set in the vehicle 1. The eye movement range is set to a virtual ellipsoid based on a visual line range statistically representing the spatial distribution of the viewpoint EP of the visual confirmation person 4.
The windshield 3 is a transparent member formed in a transparent plate shape, for example, from glass, synthetic resin, or the like. The windshield 3 is positioned above the dashboard 2 and defines the interior and exterior of the vehicle 1. The windshield 3 is inclined in a posture of being separated from the instrument panel 2 as it goes from the front to the rear. The rear surface of the windshield 3, which is the indoor side, is formed in a smooth concave or planar shape as a reflection surface 3a on which the display light is projected from the HUD100 and reflected.
The windshield 3 may also be configured by providing a reflection-type holographic optical element so that diffraction reflection at interference fringes is used instead of surface reflection. Instead of the windshield 3, a combiner serving as the light transmitting member may be provided in the interior of the vehicle 1, and the reflecting surface 3a may be provided on the combiner.
As shown in fig. 1, the HUD100 includes a light guide unit 10, an image forming unit 20, a light condensing unit 30, and an illumination unit 40.
The light guide unit 10 constitutes an optical path L from the image forming unit 20 to the windshield 3 and guides the display light projected from the image forming unit 20 toward the windshield 3. The light guide unit 10 preferably has a magnifying function of enlarging, at a predetermined optical magnification, the image formed by the image forming unit 20 into the virtual image VRI visually confirmed by the visual confirmation person 4. This is because the magnifying action of the light guide unit 10 allows the device to be made compact.
The light guide unit 10 having such a function includes at least one optical member 11. The light guide unit 10 is configured by combining one flat mirror (or curved mirror) 11a as the optical member 11 and one concave mirror 11 b. Here, the concave mirror 11b provides the above-described amplification. Alternatively, for example, the light guide unit 10 may be configured by combining one convex mirror and one concave mirror as the optical member 11, or may be configured by one concave mirror as the optical member 11. The optical member 11 constituting the light guide unit 10 may be either fixed or movable.
The image forming unit 20 forms an image that can be formed as a virtual image VRI outside the vehicle 1, and emits display light of the formed image toward the light guide unit 10. As shown in fig. 1 and 2, the image forming unit 20 includes an image display panel 21 and a diffusion panel 22.
The image display panel 21 is formed in a plate shape as a whole. The image display panel 21 is a transmissive TFT liquid crystal panel using a thin film transistor. The image display panel 21 is an active matrix type having a plurality of liquid crystal pixels arranged two-dimensionally. The illumination light from the illumination unit 40 is incident on an incident surface 210, which is one surface of the image display panel 21, through the condensing unit 30. The display light of the image is emitted from the emission surface 211 on the opposite side of the image display panel 21 toward the light guide unit 10 on the light path L. The image display panel 21 displays an image formed as the display light.
In the image display panel 21 having such a function, a pair of plate-shaped polarizers and a liquid crystal layer sandwiched between these polarizers are stacked in a plate thickness direction. Each polarizer has mutually orthogonal transmission and blocking axes along both faces 210, 211 of the image display panel 21. Each polarizer transmits polarized light at an azimuth of the transmission axis and absorbs polarized light at an azimuth of the blocking axis. The liquid crystal layer is configured to be capable of adjusting the polarization of the transmitted illumination light in accordance with the applied voltage for each liquid crystal pixel. The ratio of light transmitted through the polarizer on the emission side, that is, the transmittance is adjusted for each liquid crystal pixel by polarization adjustment in the liquid crystal layer, thereby forming an image. Here, in particular, in the image display panel 21, a color filter is provided in each liquid crystal pixel, whereby a color image can be formed.
As shown in fig. 2 and 3, a plurality of pixel regions 212 are set in the image display panel 21 and are two-dimensionally arranged in predetermined numbers in the Xa direction and the Ya direction, which are perpendicular to each other. Each pixel region 212 is defined as a rectangular image forming region in which a plurality of liquid crystal pixels are two-dimensionally arranged in each of the Xa direction and the Ya direction. The number of pixel regions 212 arranged in the Xa direction may be smaller than, larger than, or the same as the number of pixel regions 212 arranged in the Ya direction; in the first embodiment shown in fig. 3 it is smaller.
As shown in fig. 1, the diffusion panel 22 is formed in a plate shape or a film shape as a whole by a hard transparent material such as glass or resin. The diffusion panel 22 is disposed substantially parallel along the incident surface 210 of the image display panel 21. The diffusion panel 22 provides a diffusion effect to the illumination light incident toward the image display panel 21. The diffusion panel 22 may be formed integrally with the image display panel 21 by providing the incident surface 210 of the panel 21 with fine irregularities.
The light condensing unit 30 shown in fig. 1 and 2 condenses the illumination light from the illumination unit 40 to the image forming unit 20. The light collecting unit 30 includes a front lens array 31 and a rear lens array 32.
As shown in fig. 2 and 4, the front stage lens array 31 is formed in a plate shape as a whole from a hard transparent material such as glass or resin. The front stage lens array 31 is a plano-convex lens array. The front stage lens array 31 has a plurality of front-stage lens portions 312 two-dimensionally arrayed in predetermined numbers in the Xb direction and the Yb direction, which are orthogonal to each other. The number of front-stage lens portions 312 arrayed in the Xb direction matches the number of pixel regions 212 arrayed in the Xa direction. The number of front-stage lens portions 312 arrayed in the Yb direction matches the number of pixel regions 212 arrayed in the Ya direction. With these configurations, each front-stage lens portion 312 corresponds one-to-one to one of the pixel regions 212.
Illumination light from the illumination unit 40 is incident on a front stage incident surface 310 which is one surface of a front stage lens section 312. The illumination light incident on the front-stage incident surface 310 is emitted from the front-stage emission surface 311, which is the opposite surface of each front-stage lens portion 312, toward the rear-stage lens array 32.
The front stage incident surface 310 of each front-stage lens portion 312 shown in fig. 2, 5, and 6 is a plane substantially perpendicular to the optical axis A1, which is orthogonal to the Xb direction and the Yb direction. In each front-stage lens portion 312, the front stage emission surface 311 has a composite surface structure in which a diffusion surface 313 is synthesized on a virtual base surface Sb. As shown in fig. 6, the virtual base surface Sb defined for each front-stage lens portion 312 has a convex surface shape smoothly curved in any direction including the Xb direction and the Yb direction. Each front-stage lens portion 312 condenses the illumination light emitted toward the rear stage lens array 32 into the angle space θ in accordance with this convex virtual base surface Sb.
In order to exert such a light condensing effect, the function Zb representing the convex shape of the virtual base surface Sb is given, for example, by the following expression 1. In expression 1, c is the curvature giving the convex shape, r is the radial distance (radius) from the optical axis A1 to an arbitrary point of the convex shape, k is a conic constant, and αi is a free-form surface coefficient.
[ formula 1]
(Expression 1 appears only as an image in the original publication; it defines Zb in terms of the curvature c, the radial distance r from the optical axis A1, the conic constant k, and the free-form surface coefficients αi.)
The diffusion surface 313 defined for each front-stage lens portion 312 has a wave surface shape in which a wave travels outward in the Yb direction from a virtual plane α that includes the optical axis A1 and extends in the Xb direction. Each front-stage lens portion 312 diffuses the illumination light emitted toward the rear stage lens array 32 within the angle space θ by means of this wave-shaped diffusion surface 313.
In order to exhibit such a diffusion effect, the function Zw representing the wave surface shape of the diffusion surface 313 is given, for example, by the following expression 2, which defines a one-dimensional plane wave surface. In expression 2, Ay is the maximum amplitude of the wave surface in the Yb direction, Yw is the separation distance in the Yb direction from the virtual plane α to an arbitrary point of the wave surface shape, and λy is the wavelength of the wave in the Yb direction.
[ formula 2]
Zw=Ay·cos(2π·Yw/λy)
As described above, the function Zc representing the composite surface structure of the diffusion surface 313 and the virtual base surface Sb in the front stage emission surface 311 of each front-stage lens portion 312 is given by the following expression 3. With the above configuration, each front-stage lens portion 312 diffuses the illumination light independently incident on the corresponding pixel region 212 of the image forming unit 20 within the angle space θ in which that illumination light is condensed.
[ formula 3]
Zc=Zb+Zw
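For illustration only, and not as part of the patent disclosure, the following Python sketch evaluates the composite sag Zc = Zb + Zw of expression 3 on a grid. Because expression 1 is reproduced only as an image in this publication, the sketch assumes the conventional even-aspheric sag form for the convex base Zb; only expression 2 (the one-dimensional plane wave surface) is taken directly from the text, and all parameter values are placeholders.

import numpy as np

def aspheric_sag(r, c, k, alphas):
    """Assumed even-aspheric form standing in for the convex base Zb of expression 1.

    r      : radial distance from the optical axis A1
    c      : curvature giving the convex shape
    k      : conic constant
    alphas : free-form coefficients alpha_i (hypothetical ordering of the terms)
    """
    z = c * r**2 / (1.0 + np.sqrt(1.0 - (1.0 + k) * c**2 * r**2))
    for i, a in enumerate(alphas):
        z += a * r**(2 * i + 4)
    return z

def plane_wavefront(yw, ay, lam_y):
    """One-dimensional plane wave surface of expression 2: Zw = Ay*cos(2*pi*Yw/lam_y)."""
    return ay * np.cos(2.0 * np.pi * yw / lam_y)

# Placeholder grid and parameters (not taken from the patent).
xb, yb = np.meshgrid(np.linspace(-1.0, 1.0, 201), np.linspace(-1.0, 1.0, 201))
r = np.hypot(xb, yb)                       # radial distance from the optical axis A1
zb = aspheric_sag(r, c=0.5, k=-1.0, alphas=[0.01])
zw = plane_wavefront(yb, ay=0.005, lam_y=0.2)
zc = zb + zw                               # composite surface of expression 3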
As shown in fig. 2 and 7, the rear stage lens array 32 is formed in a plate shape as a whole from a hard transparent material such as glass or resin. The rear stage lens array 32 includes a plurality of rear-stage lens portions 322 two-dimensionally arranged in predetermined numbers in the Xc direction and the Yc direction, which are orthogonal to each other. The number of rear-stage lens portions 322 arranged in the Xc direction matches the number of pixel regions 212 arranged in the Xa direction and the number of front-stage lens portions 312 arranged in the Xb direction. The number of rear-stage lens portions 322 arranged in the Yc direction matches the number of pixel regions 212 arranged in the Ya direction and the number of front-stage lens portions 312 arranged in the Yb direction. With these configurations, each rear-stage lens portion 322 corresponds one-to-one to one of the pixel regions 212 and one of the front-stage lens portions 312.
Each rear-stage lens portion 322 is located at the rear stage of the corresponding front-stage lens portion 312 and shares its optical axis A1. The image display panel 21 and the diffusion panel 22 are disposed obliquely with respect to the optical axes A1 of the front-stage lens portions 312 and the rear-stage lens portions 322. With this inclined arrangement, the Xa direction of the image display panel 21 is inclined toward the lens arrays 31 and 32 with respect to the Xb direction of the front stage lens array 31 and the Xc direction of the rear stage lens array 32. On the other hand, the Ya direction of the image display panel 21 is substantially parallel to the Yb direction of the front stage lens array 31 and the Yc direction of the rear stage lens array 32.
The illumination light from each corresponding front lens unit 312 is incident on a rear incident surface 320 which is one surface of each rear lens unit 322 shown in fig. 7 to 9. The illumination light incident on the rear-stage incident surface 320 is emitted from the rear-stage emission surface 321, which is the opposite surface of each rear-stage lens unit 322, toward the corresponding pixel region 212.
In each rear-stage lens portion 322 shown in fig. 8, the rear-stage incident surface 320 forms a composite surface structure in which forward refractive surface portions 323 and backward refractive surface portions 324 are arranged alternately from the optical axis A1 outward in the Xc direction. The plurality of forward refractive surface portions 323 are formed in stripes that are intermittently separated from one another in the Xc direction and extend in the Yc direction (see fig. 7). Each forward refractive surface portion 323 corresponds to one of the divided portions obtained by dividing a virtual base surface Si1 at a constant width in the Xc direction. Here, the virtual base surface Si1 is defined, for example, as a convex surface shape protruding toward the incident side. The plurality of backward refractive surface portions 324 are likewise formed in stripes that are intermittently separated from one another in the Xc direction and extend in the Yc direction (see fig. 7). Each backward refractive surface portion 324 corresponds to one of the divided portions obtained by dividing a virtual base surface Si2 into a plurality of segments in the Xc direction. Here, the virtual base surface Si2 is defined, for example, as a valley-shaped slope recessed toward the emission side. In this composite surface structure, each forward refractive surface portion 323 condenses the illumination light toward the optical axis A1 in the Xc direction by refraction and parallelizes it with respect to the optical axis A1, while each backward refractive surface portion 324 refracts the illumination light in the direction opposite to that of the forward refractive surface portions 323 and mixes it into the parallelized light. Here, parallelization means bringing the illumination light close to a parallel light flux; the illumination light need not be a perfectly parallel light flux.
In each rear-stage lens portion 322 of the first embodiment, the Xc direction, in which the forward refractive surface portions 323 and the backward refractive surface portions 324 are formed alternately and intermittently, corresponds to the left-right direction Dh of the virtual image VRI (see fig. 1). A mountain-shaped boundary 328 is formed between the backward refractive surface portion 324 in each rear-stage lens portion 322 and the backward refractive surface portion 324 in another rear-stage lens portion 322 adjacent in the Xc direction. Further, a valley-shaped boundary 328 may be formed between the forward refractive surface portion 323 in each rear-stage lens portion 322 and the forward refractive surface portion 323 in another rear-stage lens portion 322 adjacent in the Xc direction.
In each rear-stage lens portion 322 shown in fig. 9, the rear-stage emission surface 321 has a composite surface structure in which forward refractive surface portions 325 and backward refractive surface portions 326 are arranged alternately from the optical axis A1 outward in the Yc direction. The plurality of forward refractive surface portions 325 are formed in stripes that are intermittently separated from one another in the Yc direction and extend in the Xc direction (see fig. 7). Each forward refractive surface portion 325 corresponds to one of the divided portions obtained by dividing a virtual base surface So1 into a plurality of pieces in the Yc direction. Here, the virtual base surface So1 is defined, for example, as a convex surface shape protruding toward the emission side. The plurality of backward refractive surface portions 326 are likewise formed in stripes that are intermittently separated from one another in the Yc direction and extend in the Xc direction (see fig. 7). Each backward refractive surface portion 326 corresponds to one of the divided portions obtained by dividing a virtual base surface So2 at a constant width in the Yc direction. Here, the virtual base surface So2 is defined, for example, as a valley-shaped slope recessed toward the incident side. In this composite surface structure, each forward refractive surface portion 325 condenses the illumination light toward the optical axis A1 in the Yc direction by refraction and parallelizes it with respect to the optical axis A1, while each backward refractive surface portion 326 refracts the illumination light in the direction opposite to that of the forward refractive surface portions 325 and mixes it into the parallelized light.
In each rear-stage lens portion 322 of the first embodiment, the Yc direction, in which the forward refractive surface portions 325 and the backward refractive surface portions 326 are formed alternately and intermittently, corresponds to the vertical direction Dv of the virtual image VRI (see fig. 1). A mountain-shaped boundary 329 is formed between the backward refractive surface portion 326 in each rear-stage lens portion 322 and the backward refractive surface portion 326 in another rear-stage lens portion 322 adjacent in the Yc direction. Further, a valley-shaped boundary 329 may be formed between the forward refractive surface portion 325 in each rear-stage lens portion 322 and the forward refractive surface portion 325 in another rear-stage lens portion 322 adjacent in the Yc direction.
According to the above configuration, the light collecting unit 30 collects the illumination light that is independently incident on each of the pixel regions 212 in the image forming unit 20 by the cooperation of the front lens portion 312 and the rear lens portion 322 corresponding to the pixel regions 212, respectively.
The illumination unit 40 shown in fig. 1 and 2 emits illumination light for illuminating the image forming unit 20 via the light condensing unit 30. As shown in fig. 1, 2, and 10, the illumination unit 40 includes a plurality of light source units 402 arranged two-dimensionally at a predetermined number in each of the Xd direction and Yd direction orthogonal to each other. The number of light source sections 402 arranged in the Xd direction matches the number of pixel regions 212 arranged in the Xa direction, the number of front-stage lens sections 312 arranged in the Xb direction, and the number of rear-stage lens sections 322 arranged in the Xc direction. The number of light source sections 402 arranged in the Yd direction matches the number of pixel regions 212 arranged in the Ya direction, the number of front lens sections 312 arranged in the Yb direction, and the number of rear lens sections 322 arranged in the Yc direction. With these configurations, each light source section 402 is associated with any one of the pixel regions 212, any one of the front lens sections 312, and any one of the rear lens sections 322 in a 1: 1 relationship.
Each light source unit 402 is composed of a light source element that emits white illumination light independently. The light source element of each light source section 402 is, for example, an LED bare chip using YAG or KSF. The light source elements of the light source units 402 can independently set the illuminance of illumination light by variable adjustment of the light emission intensity.
As shown in fig. 6, the light source element of each light source section 402 is disposed on the common optical axis A1 of the corresponding front-stage lens portion 312 and rear-stage lens portion 322. Here, in the direction along the optical axis A1, the light source element of each light source section 402 is disposed closer to the light condensing unit 30 than the combined focal point of the corresponding lens portions 312 and 322. At the same time, in the direction along the optical axis A1 of the corresponding front-stage lens portion 312, the light source element of each light source section 402 is disposed closer to the light condensing unit 30 than the focal point Pb of the virtual base surface Sb of that lens portion 312.
In each light source section 402 shown in fig. 2, the directions of the maximum intensity peaks of the light emission intensities of the light source elements are set substantially parallel to the optical axes Al of the corresponding front-stage lens section 312 and rear-stage lens section 322. In this setting, the Xd direction of the illumination unit 40 is defined to be substantially parallel along the Xb direction of the front stage lens array 31 and the Xc direction of the rear stage lens array 32, and is defined to be inclined from the Xa direction of the image display panel 21. On the other hand, the Yd direction of the illumination unit 40 is defined to be substantially parallel along the Ya direction of the image display panel 21, the Yb direction of the front stage lens array 31, and the Yc direction of the rear stage lens array 32.
With the above configuration, the illumination light emitted from the light source element of each light source section 402 is sequentially incident on the corresponding front lens section 312 and rear lens section 322. That is, the illumination unit 40 emits illumination light that is independently incident on each corresponding group of the lens portions 312 and 322 from the light source element of the light source portion 402 that independently corresponds to each lens portion 312 and 322.
As shown in white in fig. 11, the pixel region 212 corresponding to a light source section 402 whose light source element emits light at the maximum intensity is illuminated by transmitting white light at the maximum illuminance. As shown by the dotted hatching in fig. 11, the pixel region 212 corresponding to a light source section 402 whose light source element emits light at an intensity lower than the maximum intensity is illuminated by transmitting white light at an illuminance lower than the maximum illuminance. As shown by the cross hatching in fig. 11, the pixel region 212 corresponding to a light source section 402 whose light source element is turned off becomes a substantially non-display region through which no illumination light is transmitted.
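As a purely illustrative sketch, not described in the patent, the per-region illumination of fig. 11 can be modeled as an array of independently set light source intensities, one per light source section 402; the array size and all values below are hypothetical.

import numpy as np

# Hypothetical 3 x 5 array of light source sections (Yd rows x Xd columns);
# each entry is the relative emission intensity of one light source element.
intensity = np.array([
    [1.0, 1.0, 0.0, 0.4, 1.0],   # 1.0 = maximum intensity, 0.0 = turned off
    [1.0, 0.4, 0.0, 0.4, 1.0],
    [1.0, 1.0, 0.0, 0.4, 1.0],
])

# Because each light source section corresponds one-to-one to a pixel region (212),
# the transmitted illuminance of each region scales with its source intensity.
max_illuminance = 10000.0            # placeholder illuminance value
region_illuminance = intensity * max_illuminance
non_display = intensity == 0.0       # regions that become substantially non-display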
(Effect)
The operational effects of the first embodiment described above will be described below.
In the rear-stage lens portions 322, a plurality of which are arranged in the light condensing unit 30 of the first embodiment, the forward refractive surface portions 325 are formed intermittently in the Yc direction, which is the direction orthogonal to the optical axis A1, so as to condense, by refraction, the illumination light independently incident on each of the arranged pixel regions 212 of the image forming unit 20 into its angle space θ. Accordingly, in the front-stage lens portions 312, a plurality of which are arranged in the light condensing unit 30 of the first embodiment, the diffusion surface 313 is formed so as to diffuse the illumination light independently incident on each pixel region 212 within the angle space θ in which that illumination light is condensed. The illumination light subjected to this diffusing and condensing action in each front-stage lens portion 312 therefore spreads apparently within the angle space θ and enters the corresponding rear-stage lens portion 322. As a result, edge light that is sharply emitted at the boundaries 329 between the rear-stage lens portions 322 is unlikely to be generated. This improves the illumination quality and thus the visibility of the virtual image VRI.
The diffusion surface 313 of the first embodiment has a wave surface shape in which the wave travels in the Yb direction, which lies along the orthogonal direction Yc. Thus, in the front-stage lens portions 312, the apparent spread of the illumination light can be promoted by the diffusing and condensing action in the Yb direction along the Yc direction in which the rear-stage lens portions 322 are arranged. This effectively suppresses the edge light caused by the boundaries 329 between the rear-stage lens portions 322 and improves the visibility of the virtual image VRI.
According to the first embodiment, the Yc direction, which is the orthogonal direction in which the rear-stage lens portions 322 are arranged, corresponds to the vertical direction Dv of the virtual image VRI. Consequently, changes in the luminance of the display light are unlikely to occur in the vertical direction Dv of the virtual image VRI, in which the eyes of the viewer 4 observing the virtual image VRI move less readily than in the horizontal direction Dh, in which they move readily. This can improve the visibility of the virtual image VRI in combination with the suppression of the edge light.
In the illumination unit 40 of the first embodiment, the plurality of light source sections 402 are arranged so as to emit illumination light that is independently incident to each group of the front-stage lens sections 312 and the rear-stage lens sections 322, respectively. Thus, the illumination light entering the corresponding front lens portion 312 from each light source portion 402 and subjected to the diffusion and condensation effect apparently spreads in the angle space θ and enters the corresponding rear lens portion 322. This can suppress incidence of edge light on each pixel region 212 due to the boundary 329 between the rear lens portions 322, thereby improving the visibility of the virtual image VRI.
In each rear-stage lens portion 322 of the first embodiment, forward refractive surface portions 323 and 325, which parallelize the illumination light by refraction, and backward refractive surface portions 324 and 326, which refract the illumination light so as to mix it into the parallelized light, are formed alternately in the Xc and Yc directions, which are the orthogonal directions. Thus, owing to the condensing action in the rear-stage lens portions 322 accompanied by mixing into the parallelized light, in addition to the diffusing and condensing action in the front-stage lens portions 312, the illumination light is unlikely to generate edge light at the boundaries 328 and 329 between the rear-stage lens portions 322. This effectively suppresses edge light and improves the visibility of the virtual image VRI.
(second embodiment)
As shown in fig. 12, the second embodiment is a modification of the first embodiment. In the second embodiment, the number of elements 212, 312, 322, and 402 arranged in the Xa, Xb, Xc, and Xd directions is larger than the number of elements 212, 312, 322, and 402 arranged in the Ya, Yb, Yc, and Yd directions (fig. 12 shows only the front-stage lens portions 312).
In the second embodiment, the dimension of the visual confirmation area EB in the left-right direction (the direction perpendicular to the paper surface in fig. 1 of the first embodiment), corresponding to the left-right direction Dh of the virtual image VRI, is larger than its dimension in the up-down direction (the vertical direction in fig. 1 of the first embodiment), corresponding to the up-down direction Dv of the virtual image VRI. This makes it possible to display a virtual image VRI whose visibility is improved over the laterally long visual confirmation area EB, making effective use of the laterally long windshield 3 of the vehicle 1.
(third embodiment)
As shown in fig. 13 and 14, the third embodiment is a modification of the second embodiment. In the third embodiment, the front stage emission surface 311 of each front-stage lens portion 3312 has a composite surface structure in which a diffusion surface 3313, different from that of the first embodiment, is synthesized on a virtual base surface Sb similar to that of the first embodiment. The diffusion surface 3313 defined for each front-stage lens portion 3312 has a wave surface shape in which a wave travels from the optical axis A1 in at least the Xb direction and the Yb direction. Each front-stage lens portion 3312 diffuses the illumination light emitted toward the rear stage lens array 32 within the angle space θ by means of this wave-shaped diffusion surface 3313.
In order to exhibit such a diffusion effect, the function Zw representing the wave surface shape of the diffusion surface 3313 may be given, for example, by the following expression 4, which defines a two-dimensional plane wave surface. In expression 4, Ax and Ay are the maximum amplitudes of the wave surface shape in the Xb and Yb directions, respectively. Xw is the separation distance in the Xb direction from the virtual plane β to an arbitrary point of the wave surface shape; here, the virtual plane β is defined as a plane that includes the optical axis A1, extends in the Yb direction, and is orthogonal to the virtual plane α. Yw is the separation distance in the Yb direction from the virtual plane α to an arbitrary point of the wave surface shape. λx and λy are the wavelengths of the wave in the Xb and Yb directions, respectively.
[ formula 4]
Zw=Ax·cos(2π·Xw/λx)+Ay·cos(2π·Yw/λy)
The function Zw representing the wave surface shape of the diffusion surface 3313 may also be given, for example, by the following expression 5, which defines a non-attenuating spherical wave surface. In expression 5, A is the maximum amplitude of the wave surface shape in any direction around the optical axis A1, including the Xb and Yb directions; Xw and Yw are the separation distances in the Xb and Yb directions from the virtual planes β and α, respectively, to an arbitrary point of the wave surface shape; and λ is the wavelength of the wave in any direction around the optical axis A1, including the Xb and Yb directions.
[ formula 5]
Zw = A·cos[2π·{(Xw² + Yw²)^(1/2)}/λ]
The function Zw representing the wave surface shape of the diffusion surface 3313 may also be given, for example, by the following expression 6, which defines an attenuating spherical wave surface. In expression 6, A is the maximum amplitude of the wave surface shape in any direction around the optical axis A1, including the Xb and Yb directions; Xw and Yw are the separation distances in the Xb and Yb directions from the virtual planes β and α, respectively, to an arbitrary point of the wave surface shape; and λ is the wavelength of the wave in any direction around the optical axis A1, including the Xb and Yb directions.
[ formula 6]
Zw = [A/{(Xw² + Yw²)^(1/2)}]·cos[2π·{(Xw² + Yw²)^(1/2)}/λ]
The function Zw representing the wave surface shape of the diffusion surface 3313 may also be given, for example, by the following expressions 7 to 9, which define a sinc wave surface. In expressions 8 and 9, Ax and Ay are the maximum amplitudes of the wave surface shape in the Xb and Yb directions, respectively; Xw and Yw are the separation distances in the Xb and Yb directions from the virtual planes β and α, respectively, to an arbitrary point of the wave surface shape; and λx and λy are the wavelengths of the wave in the Xb and Yb directions, respectively.
[ formula 7]
Zw=Zx+Zy
[ formula 8]
Zx=Ax·{sin(2π·Xw/λx)}/(2π·Xw/λx)
[ formula 9]
Zy=Ay·{sin(2π·Yw/λy)}/(2π·Yw/λy)
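For illustration only, and not as part of the patent disclosure, the following Python sketch evaluates the alternative diffusing-surface wave functions Zw of expressions 4 to 9 on a common grid so that the isotropic and anisotropic choices can be compared numerically. All amplitude and wavelength values are placeholder assumptions; np.sinc is used for expressions 8 and 9 because numpy defines sinc(t) = sin(πt)/(πt).

import numpy as np

# Placeholder grid over one front-stage lens portion (arbitrary units).
xw, yw = np.meshgrid(np.linspace(-1.0, 1.0, 201), np.linspace(-1.0, 1.0, 201))

# Placeholder amplitudes and wavelengths (not values from the patent).
ax, ay = 0.010, 0.005          # anisotropic choice: Ax > Ay, as in the third embodiment
lam_x, lam_y, lam = 0.25, 0.25, 0.25
a = 0.008

# Expression 4: two-dimensional plane wave surface.
zw_plane = ax * np.cos(2 * np.pi * xw / lam_x) + ay * np.cos(2 * np.pi * yw / lam_y)

# Expressions 5 and 6: non-attenuating and attenuating spherical wave surfaces.
rho = np.hypot(xw, yw)                                   # (Xw^2 + Yw^2)^(1/2)
zw_sphere = a * np.cos(2 * np.pi * rho / lam)
with np.errstate(divide="ignore"):                       # amplitude diverges at rho = 0
    zw_sphere_att = (a / rho) * np.cos(2 * np.pi * rho / lam)

# Expressions 7 to 9: sinc wave surface (np.sinc(t) = sin(pi*t)/(pi*t)).
zw_sinc = ax * np.sinc(2 * xw / lam_x) + ay * np.sinc(2 * yw / lam_y)

Setting ax larger than ay, as in the third embodiment, makes the diffusion stronger in the direction that can correspond to the left-right direction Dh of the virtual image.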
The function Zw representing the wave surface shape of the diffusion surface 3313 may also be given, for example, by the following expression 10, which defines a composite two-dimensional plane wave surface. In expression 10, N is the number of wave surfaces to be combined, and j is an integer from 1 to N used as a suffix. Axj and Ayj are the maximum amplitudes of the j-th wave surface shape in the Xb and Yb directions, respectively; Xw and Yw are the separation distances in the Xb and Yb directions from the virtual planes β and α, respectively, to an arbitrary point of the wave surface shape; and λx and λy are the wavelengths of the wave in the Xb and Yb directions, respectively. The case where N = 1 in expression 10 corresponds to expression 4.
[ formula 10]
(Expression 10 appears only as an image in the original publication; it is a sum over j = 1 to N of plane wave surface terms of the form of expression 4 with amplitudes Axj and Ayj.)
Here, in the case of expressions 4, 7 to 9, and 10 described above, the maximum amplitudes Ax and Axj in the Xb direction may be smaller than, larger than, or the same as the maximum amplitudes Ay and Ayj in the Yb direction. When the maximum amplitudes Ax and Axj differ from the maximum amplitudes Ay and Ayj, an anisotropic diffusion effect is given to the illumination light. In particular, in the third embodiment, the maximum amplitudes Ax and Axj in the Xb direction, which can correspond to the left-right direction Dh of the virtual image VRI, are set larger than the maximum amplitudes Ay and Ayj in the Yb direction, which can correspond to the up-down direction Dv of the virtual image VRI. Thus, in the third embodiment, a virtual image VRI with high visibility can be provided by efficiently diffusing the illumination light that becomes the display light toward the laterally long visual confirmation area EB described in the second embodiment.
Further, the diffusing surface 3313 according to the third embodiment has a wave surface shape in which waves travel at least in Xb and Yb directions along Xc and Yc directions which are a pair of orthogonal directions orthogonal to each other. Thus, in the front-stage lens portion 3312, the apparent spread of the illumination light can be promoted by the diffusion condensing action in the Xb and Yb directions along the Xc and Yc directions in which the rear-stage lens portions 322 are arranged. This effectively suppresses edge light caused by the boundaries 328 and 329 between the rear-stage lens portions 322, and improves the visibility of the virtual image VRI.
(other embodiments)
While the embodiments have been described above, the present disclosure is not limited to these embodiments, and various embodiments can be applied to the present disclosure without departing from the scope of the present disclosure.
As shown in fig. 15 and 16, in a modification, at least one of the rear-stage incident surface 320 and the rear-stage emission surface 321 of the rear stage lens array 32 may be formed as a Fresnel lens surface (figs. 15 and 16 show an example in which only the rear-stage emission surface 321 is such a surface). In this case, connecting surface portions 327, which connect the forward refractive surface portions 323 or the forward refractive surface portions 325 substantially in parallel with the optical axis A1, are formed instead of at least one of the backward refractive surface portions 324 and the backward refractive surface portions 326. In this case, a valley-shaped boundary 328 or boundary 329 is formed between at least one of the forward refractive surface portion 323 and the forward refractive surface portion 325 in each rear-stage lens portion 322 and the forward refractive surface portion 323 or the forward refractive surface portion 325 in another rear-stage lens portion 322 adjacent in the Xc direction or the Yc direction.
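Purely as an illustration, not taken from the patent, the sketch below shows one simplified way to picture the Fresnel-lens-surface modification of figs. 15 and 16: a smooth convex base profile is folded into intermittent forward-refracting facets whose sag is reset at the start of each strip, so that the resets play the role of connecting surface portions running roughly parallel to the optical axis. The profile, pitch, and curvature are hypothetical.

import numpy as np

def fresnel_fold(x, base_sag, pitch):
    """Fold a smooth base profile into intermittent refracting facets.

    Each facet keeps the local slope (and hence the refracting action) of the
    base profile, while its sag is reset at the start of every strip of width
    `pitch`; the vertical resets correspond loosely to connecting surface
    portions running roughly parallel to the optical axis.
    """
    strip_start = np.floor(x / pitch) * pitch
    return base_sag(x) - base_sag(strip_start)

# Hypothetical convex base profile and facet pitch (arbitrary units).
base = lambda x: 0.05 * x**2
x = np.linspace(-5.0, 5.0, 2001)
facet_sag = fresnel_fold(x, base, pitch=1.0)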
As shown in fig. 17, in the modification, the image display panel 21 and the diffusion panel 22 may be disposed substantially perpendicular to the optical axis Al of each front-stage lens unit 312 and each rear-stage lens unit 322. The pre-stage lens array 31 of the modification may be a TIR lens array in which a composite surface structure of the diffusion surfaces 313 and 3313 and the virtual base surface Sb is formed on the lens portions 312 and 3312.
As shown in fig. 18, in the lens portions 312 and 3312 of the modified example, a composite surface structure (an example of the diffusion surface 313) of the diffusion surfaces 313 and 3313 and the virtual base surface Sb may be formed on the front stage incident surface 310. In this case, the virtual base surface Sb of the synthesized diffusion surfaces 313 and 3313 similar to the first embodiment may be a plane surface substantially perpendicular to the optical axis Al. In this case, the front-stage emission surface 311 of the lens portions 312 and 3312 may have a convex surface shape smoothly curved in any direction including the Xb direction and the Yb direction.
In the image display panel 21 of the modification, the pixel regions 212 may be one-dimensionally arranged in a line in one of the Xa direction and the Ya direction. In the front lens array 31 of the modification, the front lens portions 312 may be one-dimensionally arranged in a line in one of the Xb direction and the Yb direction. In the second-stage lens array 32 of the modification, the second-stage lens portions 322 may be arranged one-dimensionally in one row in one of the Xc direction and the Yc direction. In a modification, the light source units 402 may be arranged one-dimensionally in a row in one of the Xd direction and the Yd direction.
The Xa direction and the Ya direction in the image display panel 21 of the modification may be replaced with each other. The Xc direction and the Yc direction in the subsequent-stage lens array 32 of the modification may be replaced with each other. The Yd direction in the light source units 402, 2402, and 3402 according to the modification may correspond to the left-right direction Dh (see fig. 1 of the first embodiment) of the virtual image VRI.

Claims (7)

1. A virtual image display device (100) which reflects display light by a light transmitting member (3) and displays a virtual image (VRI) formed by the display light in a visually recognizable manner, the virtual image display device comprising:
an illumination unit (40) that emits white illumination light;
an image forming unit (20) that forms an image by transmitting the illumination light and emits the display light of the image; and
a light collecting unit (30) for collecting the illumination light to the image forming unit,
the image forming unit has a plurality of pixel regions (212) arranged,
the light-collecting unit includes:
a front lens unit (312, 3312) in which a plurality of the front lens units are arranged to form a diffusing surface (313, 3313) in a wave shape, the diffusing surface diffusing the illumination light that is independently incident on each of the pixel regions in an angle space (θ) in which the illumination light that is independently incident on each of the pixel regions is converged; and
and a rear-stage lens unit (322), a plurality of which are arranged in a rear stage of each of the front-stage lens units such that refraction surface sections (323, 325) are formed intermittently in directions (Xc, Yc) orthogonal to an optical axis (A1), the refraction surface sections converging, by refraction, the illumination light that is independently incident into the angle space for each of the pixel regions.
2. The virtual image display apparatus of claim 1,
the diffusing surface (313) has a wave surface shape in which waves travel in a direction (Yb) along the orthogonal direction (Yc).
3. The virtual image display apparatus of claim 1,
the diffusing surface (3313) has a wave surface shape in which waves travel in directions (Xb, Yb) along a pair of orthogonal directions (Xc, Yc) orthogonal to each other.
4. The virtual image display apparatus according to claim 3,
the maximum amplitudes (Ax, Axj, Ay, Ayj) of the waves differ between the directions (Xb, Yb) in which the waves travel along the pair of orthogonal directions orthogonal to each other.
5. The virtual image display apparatus according to any one of claims 2 to 4,
the orthogonal direction includes a direction corresponding to a vertical direction (Dv) of the virtual image.
6. The virtual image display apparatus according to any one of claims 1 to 5,
the rear-stage lens unit alternately forms, in the orthogonal directions (Xc, Yc), forward refractive surface sections (323, 325) as the refractive surface sections, which parallelize the illumination light by refraction, and backward refractive surface sections (324, 326), which refract the illumination light so as to mix it into the parallelized light.
7. The virtual image display apparatus according to any one of claims 1 to 6,
the illumination unit has a light source unit (402), and a plurality of the light source units are arranged to emit the illumination light that is incident on each of the groups of the front-stage lens units and the rear-stage lens units independently.
CN202180049844.3A 2020-07-16 2021-05-28 Virtual image display device Pending CN115917399A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2020122212A JP7318604B2 (en) 2020-07-16 2020-07-16 virtual image display
JP2020-122212 2020-07-16
PCT/JP2021/020562 WO2022014180A1 (en) 2020-07-16 2021-05-28 Virtual-image display device

Publications (1)

Publication Number Publication Date
CN115917399A true CN115917399A (en) 2023-04-04

Family

ID=79555151

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180049844.3A Pending CN115917399A (en) 2020-07-16 2021-05-28 Virtual image display device

Country Status (4)

Country Link
JP (1) JP7318604B2 (en)
CN (1) CN115917399A (en)
DE (1) DE112021003813T5 (en)
WO (1) WO2022014180A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023118016A (en) * 2022-02-14 2023-08-24 株式会社小糸製作所 Optical member, light source device, and head-up display

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009004696A1 (en) 2007-06-29 2009-01-08 Toppan Printing Co., Ltd. Lens sheet, optical sheet for display, and backlight unit and display apparatus using the same
JP4946678B2 (en) 2007-07-06 2012-06-06 凸版印刷株式会社 Lens sheet, optical sheet for display, backlight unit using the same, and display device
JP6237249B2 (en) 2014-01-15 2017-11-29 株式会社デンソー Illumination lens, illumination unit, and head-up display device
JP6287605B2 (en) 2014-06-09 2018-03-07 株式会社デンソー Head-up display device and lighting unit thereof
JP6579126B2 (en) 2016-05-25 2019-09-25 株式会社デンソー Head-up display device and image projection unit
EP3575846A1 (en) 2018-05-30 2019-12-04 Visteon Global Technologies Inc. Picture generation unit for head-up display
JP7051730B2 (en) 2019-01-31 2022-04-11 滲透工業株式会社 Manufacturing method of spheroidal graphite cast iron products

Also Published As

Publication number Publication date
WO2022014180A1 (en) 2022-01-20
JP2022018831A (en) 2022-01-27
DE112021003813T5 (en) 2023-04-27
JP7318604B2 (en) 2023-08-01

Similar Documents

Publication Publication Date Title
US10156778B2 (en) Display device for projecting image in midair using light guide plate
JP6579212B2 (en) Head-up display device
CN109154721B (en) Head-up display device and image projection unit
JP6319355B2 (en) Head-up display device
WO2017154993A1 (en) Stereoscopic display device
EP3447561A1 (en) Head-up display device
CN111051960A (en) Head-up display device and image projection unit
CN115917399A (en) Virtual image display device
WO2022130823A1 (en) Virtual image display device
US20120249968A1 (en) Autostereoscopic display
WO2017145557A1 (en) Head-up display device
JP7426625B2 (en) Optical systems, lighting systems, display systems and moving objects
WO2022014179A1 (en) Virtual-image display device
WO2020158258A1 (en) Virtual image display device
WO2022004223A1 (en) Virtual image display device
WO2018173955A1 (en) Image display device, virtual-image display device, and moving body
JP7375629B2 (en) virtual image display device
JP7342790B2 (en) virtual image display device
WO2022019101A1 (en) Head-up display device and diffusion plate for head-up display
JP2024003030A (en) Optical system, illuminating system, display system, and moving body
JP2004177924A (en) Display device and image display method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination