WO2019160160A1 - Head-up display, head-up display system, and moving body - Google Patents

Head-up display, head-up display system, and moving body

Info

Publication number
WO2019160160A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
sub
visible region
eye
Prior art date
Application number
PCT/JP2019/006084
Other languages
English (en)
Japanese (ja)
Inventor
Kaoru Kusafuka (草深 薫)
Original Assignee
Kyocera Corporation (京セラ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Corporation
Publication of WO2019160160A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 - Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/302 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • H04N13/317 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using slanted parallax optics
    • H04N13/346 - Image reproducers using prisms or semi-transparent mirrors
    • H04N13/363 - Image reproducers using image projection screens

Definitions

  • the present invention relates to a head-up display, a head-up display system, and a moving object.
  • the head-up display of the present disclosure includes a display surface, an optical element, a projection optical system, and a controller.
  • the display surface has a plurality of subpixels. The plurality of sub-pixels are arranged in a grid pattern along a first direction and a second direction substantially orthogonal to the first direction.
  • the display surface has a plurality of band-like regions. The plurality of band-like regions extend in a prescribed direction on the display surface.
  • the optical element defines a light beam direction of image light for each of a plurality of band-like regions. Image light is emitted from the sub-pixels.
  • the projection optical system reflects the image light so that a virtual image of an image displayed on the display surface is formed.
  • the controller controls the display surface.
  • the belt-like region includes a first visible region and a second visible region.
  • the first visible region emits image light that reaches the first eye of the user when the light ray direction is defined by the optical element.
  • the second visible region emits image light that reaches a second eye different from the first eye of the user when a light ray direction is defined by the optical element.
  • the controller displays a black image on a first sub-pixel at least partially included in the first visible region.
  • the controller controls the image displayed on a part of the second sub-pixels, at least a part of which is included in the second visible region, so that it can be switched between an observation image having arbitrary luminance and the black image.
  • the head-up display of the present disclosure includes a display surface, an optical element, a projection optical system, and a controller.
  • the display surface has a plurality of subpixels. The plurality of sub-pixels are arranged in a grid pattern along a first direction and a second direction substantially orthogonal to the first direction.
  • the display surface has a plurality of band-like regions. The plurality of band-like regions extend in a prescribed direction on the display surface.
  • the optical element defines a light beam direction of image light for each of a plurality of band-like regions. Image light is emitted from the sub-pixels.
  • the projection optical system reflects the image light so that a virtual image of an image displayed on the display surface is formed.
  • the controller controls the display surface.
  • the belt-like region includes a first visible region and a second visible region.
  • the first visible region emits image light that reaches the first eye of the user when the light ray direction is defined by the optical element.
  • the second visible region emits image light that reaches a second eye different from the first eye of the user when a light ray direction is defined by the optical element.
  • the controller displays a black image on a first sub-pixel at least partially included in the first visible region.
  • the controller controls the image displayed on the sub-pixel in contact with the first sub-pixel, among the second sub-pixels at least a part of which is included in the second visible region, so that it can be switched between an observation image having arbitrary luminance and the black image.
  • the head-up display of the present disclosure includes a display surface, an optical element, a projection optical system, and a controller.
  • the display surface has a plurality of subpixels. The plurality of sub-pixels are arranged in a grid pattern along a first direction and a second direction substantially orthogonal to the first direction.
  • the display surface has a plurality of band-like regions. The plurality of band-like regions extend in a prescribed direction on the display surface.
  • the optical element defines a light beam direction of image light for each of a plurality of band-like regions. Image light is emitted from the sub-pixels.
  • the projection optical system reflects the image light so that a virtual image of an image displayed on the display surface is formed.
  • the controller controls the display surface.
  • the belt-like region includes a first visible region and a second visible region.
  • the first visible region emits image light that reaches the first eye of the user when the light ray direction is defined by the optical element.
  • the second visible region emits image light that reaches a second eye different from the first eye of the user when a light ray direction is defined by the optical element.
  • the controller controls a repeating unit, consisting of first sub-pixels at least partly included in the first visible region and second sub-pixels at least partly included in the second visible region, so as to switch between a first display state and a second display state.
  • the first display state is a state in which the number of subpixels that display the black image is greater than the number of subpixels that display the observation image.
  • the second display state is a state in which the number of sub-pixels displaying the black image is the same as the number of sub-pixels displaying the observation image.
  • the head-up display system of the present disclosure includes an illuminance measuring instrument and a head-up display.
  • the illuminance measuring instrument measures illuminance.
  • the head-up display includes a display surface, an optical element, a projection optical system, and a controller.
  • the display surface has a plurality of subpixels. The plurality of sub-pixels are arranged in a grid pattern along a first direction and a second direction substantially orthogonal to the first direction.
  • the display surface has a plurality of band-like regions. The plurality of band-like regions extend in a prescribed direction on the display surface.
  • the optical element defines a light beam direction of image light for each of a plurality of band-like regions. Image light is emitted from the sub-pixels.
  • the projection optical system reflects the image light so that a virtual image of an image displayed on the display surface is formed.
  • the controller controls the display surface.
  • the belt-like region includes a first visible region and a second visible region.
  • the first visible region emits image light that reaches the first eye of the user when the light ray direction is defined by the optical element.
  • the second visible region emits image light that reaches a second eye different from the first eye of the user when a light ray direction is defined by the optical element.
  • the controller displays a black image on a first sub-pixel at least partially included in the first visible region.
  • the controller controls the image displayed on a part of the second sub-pixels, at least a part of which is included in the second visible region, so that it can be switched between an observation image having arbitrary luminance and the black image.
  • the moving body of the present disclosure includes a head-up display.
  • the head-up display includes a display surface, an optical element, a projection optical system, and a controller.
  • the display surface has a plurality of subpixels. The plurality of sub-pixels are arranged in a grid pattern along a first direction and a second direction substantially orthogonal to the first direction.
  • the display surface has a plurality of band-like regions. The plurality of band-like regions extend in a prescribed direction on the display surface.
  • the optical element defines a light beam direction of image light for each of a plurality of band-like regions. Image light is emitted from the sub-pixels.
  • the projection optical system reflects the image light so that a virtual image of an image displayed on the display surface is formed.
  • the controller controls the display surface.
  • the belt-like region includes a first visible region and a second visible region.
  • the first visible region emits image light that reaches the first eye of the user when the light ray direction is defined by the optical element.
  • the second visible region emits image light that reaches a second eye different from the first eye of the user when a light ray direction is defined by the optical element.
  • the controller displays a black image on a first sub-pixel at least partially included in the first visible region.
  • the controller controls the image displayed on a part of the second sub-pixels, at least a part of which is included in the second visible region, so that it can be switched between an observation image having arbitrary luminance and the black image.
  • FIG. 1 is a diagram illustrating a schematic configuration of a head-up display system according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram showing a schematic configuration of the head-up display shown in FIG. 1.
  • FIG. 3 is a diagram showing an example of the display panel shown in FIG. 2 viewed from the depth direction.
  • FIG. 4 is a diagram showing an example of the parallax barrier shown in FIG. 2 viewed from the depth direction.
  • FIG. 5 is a diagram showing an example of the display panel and the parallax barrier shown in FIG. 2 viewed from the parallax barrier side with the left eye.
  • FIG. 6 is a diagram showing an example of the display panel and the parallax barrier shown in FIG. 2 viewed from the parallax barrier side with the right eye.
  • FIG. 7 is a diagram for explaining the relationship between the virtual image shown in FIG. 1 and the user's eyes.
  • FIG. 8 is a diagram for explaining the first virtual image visually recognized by the user's left eye.
  • FIG. 9 is a diagram for explaining the first virtual image visually recognized by the right eye of the user.
  • FIG. 10 is a diagram for explaining an example of a virtual image visually recognized by each of the user's left eye and right eye.
  • FIG. 11 is a diagram for explaining an example when the number of subpixels for displaying a black image is increased and the position of the parallax barrier is changed in the example illustrated in FIG. 5.
  • FIG. 12 is a diagram for explaining another example of the virtual image visually recognized by the left eye and the right eye of the user.
  • FIG. 13 is a diagram illustrating a schematic configuration of a head-up display system according to an embodiment of the present disclosure.
  • the head-up display system 1 includes an illuminance measuring instrument 2 and a head-up display 3, as shown in FIG. 1.
  • the head-up display 3 is also referred to as HUD (Head Up Display) 3.
  • the head-up display system 1 may be mounted on the moving body 100, as shown in FIG. 1.
  • “Moving object” in the present disclosure includes vehicles, ships, and aircraft.
  • “Vehicle” in the present disclosure includes, but is not limited to, automobiles and industrial vehicles, and may include railway vehicles, vehicles for daily living, and fixed-wing aircraft that taxi on runways.
  • the automobile includes, but is not limited to, a passenger car, a truck, a bus, a two-wheeled vehicle, a trolley bus, and the like, and may include other vehicles that travel on the road.
  • Industrial vehicles include industrial vehicles for agriculture and construction. Industrial vehicles include but are not limited to forklifts and golf carts. Industrial vehicles for agriculture include, but are not limited to, tractors, tillers, transplanters, binders, combines, and lawn mowers.
  • Industrial vehicles for construction include, but are not limited to, bulldozers, scrapers, excavators, crane trucks, dump trucks, and road rollers. Vehicles include those that travel by human power.
  • the vehicle classification is not limited to the above.
  • an automobile may include an industrial vehicle capable of traveling on a road, and the same vehicle may be included in a plurality of classifications.
  • Ships in the present disclosure include marine jets, boats, and tankers.
  • the aircraft in the present disclosure includes fixed wing aircraft and rotary wing aircraft.
  • the illuminance measuring instrument 2 is disposed in the vicinity of the projection optical system 110, which forms part of the head-up display 3 and is described in detail later.
  • the illuminance measuring instrument 2 is not limited to the vicinity of the projection optical system 110, and can be arranged at another position of the moving body 100.
  • the illuminance measuring instrument 2 is configured to detect the illuminance of the surrounding environment of the projection optical system 110.
  • the illuminance measuring instrument 2 can also be used as another device or component provided in the moving body 100.
  • the HUD 3 includes a display device 4 and a projection optical system 110.
  • a part of the configuration of the HUD 3 may be shared with other devices or parts included in the moving body 100.
  • Other devices or parts included in the moving body 100 that are also used as a part of the configuration of the HUD 3 may be referred to as a HUD module.
  • the projection optical system 110 can include a first optical member 111 and a second optical member 112.
  • the first optical member 111 is configured to reflect the image light emitted from the display device 4 to reach a predetermined region of the second optical member 112.
  • the first optical member 111 may include one or more mirrors and lenses.
  • when the first optical member 111 includes a mirror, for example, the mirror included in the first optical member 111 may be a concave mirror.
  • in FIG. 1, the first optical member 111 is shown as a single mirror.
  • the present invention is not limited to this, and the first optical member 111 may be configured by combining one or more mirrors, lenses, and other optical elements.
  • the second optical member 112 reflects the image light emitted from the display device 4 and reflected by the first optical member 111 to reach the user's left eye (first eye) and right eye (second eye).
  • the windshield of the moving body 100 may also be used as the second optical member 112 of the HUD 3. Therefore, the HUD 3 advances the light emitted from the display device 4 along the optical path A to the left eye and the right eye of the user. The user can visually recognize the light that has reached along the optical path A as the virtual image 120.
  • the display device 4 includes an irradiator 5, a display panel 6, a parallax barrier 7 as an optical element, and a controller 8.
  • the display device 4 can be accommodated in a dashboard of the mobile object 100, for example.
  • the irradiator 5 can be arranged on one surface side of the display panel 6.
  • the irradiator 5 is configured to irradiate the display panel 6 in a plane.
  • the irradiator 5 can include a light source, a light guide plate, a diffusion plate, a diffusion sheet, and the like.
  • the irradiator 5 can emit irradiation light from a light source, and can make the irradiation light uniform in the surface direction of the display panel 6 by a light guide plate, a diffusion plate, a diffusion sheet, or the like.
  • the irradiator 5 can be configured to emit the uniformed light toward the display panel 6.
  • the display panel 6 may be a display panel such as a transmissive liquid crystal display panel.
  • the display panel 6 has a plate-shaped display surface 61. As shown in FIG. 3, the display surface 61 has a plurality of subpixels. The plurality of subpixels are arranged in a lattice shape along a first direction and a second direction substantially orthogonal to the first direction. A direction orthogonal to the first direction and the second direction is referred to as a third direction.
  • the first direction may be referred to as the horizontal direction.
  • the second direction may be referred to as the vertical direction.
  • the third direction may be referred to as the depth direction.
  • the first direction, the second direction, and the third direction are not limited to these. In the drawing, the first direction is represented as the x-axis direction, the second direction is represented as the y-axis direction, and the third direction is represented as the z-axis direction.
  • Each subpixel can correspond to any of R (Red), G (Green), and B (Blue).
  • the three subpixels R, G, and B can constitute one pixel as a set.
  • one pixel may also be referred to as one picture element.
  • the color of each subpixel is not limited to R, G, and B, and may include other colors such as white, for example.
  • the number of subpixels constituting one pixel is not limited to three, and may be any number of one or more.
  • the horizontal direction is, for example, a direction in which a plurality of subpixels constituting one pixel are arranged.
  • the vertical direction is, for example, a direction in which subpixels of the same color are arranged.
  • the display panel 6 is not limited to a transmissive display panel, and a self-luminous display panel can also be used.
  • the transmissive display panel may include a MEMS (Micro Electro Mechanical Systems) shutter type display panel.
  • the self-luminous display panel can include an organic EL (electro-luminescence) display panel and an inorganic EL display panel.
  • the display device 4 may not include the irradiator 5.
  • the light emitted from the subpixel can be expressed as image light.
  • when the display panel 6 is a transmissive display panel, the image light can be the light that is emitted from the irradiator 5 and transmitted through the sub-pixels.
  • the image light can have one of R, G, and B colors.
  • when the display panel 6 is a self-luminous display panel, the image light can be the light emitted from the sub-pixels themselves.
  • the display surface 61 includes a plurality of band-like regions.
  • the plurality of strip regions include a first subpixel group Pg1 and a second subpixel group Pg2. Each of the plurality of strip-like regions extends in a specified direction having an inclination with respect to the y-axis on the display surface 61.
  • the first sub-pixel group Pg1 and the second sub-pixel group Pg2 are alternately and repeatedly arranged in the horizontal direction.
  • the second subpixel group Pg2 is adjacent to the first subpixel group Pg1 in the horizontal direction.
  • the first subpixel group Pg1 includes four subpixels P1 to P4 arranged in succession, two in the horizontal direction and two in the vertical direction.
  • the parallax barrier 7 is positioned a predetermined distance away from the display panel 6 on the opposite side of the display panel 6 from the irradiator 5.
  • the parallax barrier 7 has a strip-shaped light reducing surface 71 as shown in FIG. 4 for reducing image light.
  • the parallax barrier 7 has a plurality of dimming surfaces 71.
  • the parallax barrier 7 defines a translucent region 72 between two dimming surfaces 71 adjacent to each other among the plurality of dimming surfaces 71.
  • the light transmissive regions 72 and the light reducing surfaces 71 are alternately arranged in a direction orthogonal to the direction in which the light transmissive regions 72 and the light reducing surfaces 71 extend.
  • the end portion of the translucent area 72 defines the light beam direction of the image light for each of the plurality of band-shaped areas extending in the defined direction.
  • the end of the belt-like region can cross over a plurality of subpixels. In the band-like region, the length of the section for one pixel along the horizontal direction is shorter than the length of the section for one pixel along the vertical direction.
  • the light transmissive region 72 can be configured to have a higher light transmittance than the light reducing surface 71.
  • the light reducing surface 71 can be configured to have a light transmittance lower than that of the light transmitting region 72.
  • the translucent region 72 can be configured to transmit light incident on the parallax barrier 7.
  • the translucent region 72 may transmit light with a transmittance equal to or higher than the first predetermined value.
  • the first predetermined value may be 100%, for example, or a value close to 100%. If the image light emitted from the display surface 61 is in a range in which the image light can be satisfactorily viewed, the first predetermined value may be a value of 100% or less, such as 80% or 50%.
  • the light-reducing surfaces 71 are portions that block light incident on the parallax barrier 7 and do not transmit it. In other words, the light-reducing surfaces 71 block the image displayed by the display device 4.
  • the dimming surface 71 may block light with a transmittance equal to or less than the second predetermined value.
  • the second predetermined value may be 0%, for example, or a value close to 0%.
  • the first predetermined value may be a value smaller than 50%, for example, 10%, as long as sufficient contrast with light transmitted through the light reducing surface 71 can be secured.
  • a sufficient contrast ratio may be, for example, 100: 1.
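  • As a small numerical illustration of the two transmittance thresholds above, the following sketch (hypothetical Python; the transmittance values are placeholders, not taken from the disclosure) checks whether a given pair of transmittances yields a sufficient contrast ratio such as the 100:1 mentioned above:

```python
def contrast_ratio(open_transmittance, dim_transmittance):
    """Contrast ratio between a light-transmitting region 72 and a
    light-reducing surface 71, given their light transmittances (0.0 to 1.0)."""
    return open_transmittance / dim_transmittance

# Placeholder values: light-transmitting regions pass 80% of the light,
# light-reducing surfaces pass 0.5%.
ratio = contrast_ratio(open_transmittance=0.80, dim_transmittance=0.005)
print(f"contrast ratio = {ratio:.0f}:1")                  # 160:1
print("sufficient" if ratio >= 100 else "insufficient")   # example 100:1 criterion
```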
  • the dimming surface 71 may be composed of a film or a plate-like member having a transmittance less than the second predetermined value.
  • the translucent area 72 is composed of an opening provided in the film or plate member.
  • a film may be comprised with resin and may be comprised with another material.
  • the plate-like member may be made of resin or metal, or may be made of other materials.
  • the parallax barrier 7 is not limited to a film or a plate-like member, and may be composed of other types of members.
  • the film or plate-like member may be made of a base material that has a light-reducing property, or of a base material containing an additive that has a light-reducing property.
  • the parallax barrier 7 may be configured such that a light-reducing member partially overlaps a light-transmitting substrate.
  • the parallax barrier 7 may have a configuration in which a member having a light-reducing property is added to a part of a light-transmitting substrate.
  • the parallax barrier 7 may be composed of a liquid crystal shutter.
  • the liquid crystal shutter can control the light transmittance according to the applied voltage.
  • the liquid crystal shutter may be composed of a plurality of pixels and may control the light transmittance in each pixel.
  • the liquid crystal shutter can form a region having a high light transmittance or a region having a low light transmittance in an arbitrary shape.
  • the light transmitting region 72 may be a region having a transmittance equal to or higher than the first predetermined value.
  • the light reducing surface 71 may be a region having a transmittance equal to or lower than a second predetermined value.
  • the parallax barrier 7 includes a shutter panel that can be changed between a light transmission state and a light reduction state for each minute region.
  • the shutter panel includes, in addition to the liquid crystal shutter, a MEMS shutter panel employing MEMS (Micro Electro Mechanical Systems) shutters.
  • the parallax barrier 7 can be configured to change, for each of the plurality of band-like regions extending in the specified direction on the display surface 61, the light ray direction that is the propagation direction of the image light emitted from the sub-pixels. Specifically, the parallax barrier 7 can be configured to propagate a part of the image light emitted from the display surface 61 through the light-transmitting regions 72 toward the second optical member 112 so that this image light reaches the position of the user's left eye.
  • the parallax barrier 7 can likewise be configured to propagate another part of the image light emitted from the display surface 61 through the light-transmitting regions 72 toward the second optical member 112 so that this image light reaches the position of the user's right eye.
  • the second optical member 112 can be configured to cause the image light whose light beam direction is defined by the parallax barrier 7 to reach each eye of the user so that a virtual image of the display panel 6 is formed.
  • the image light emitted from the sub-pixels P1 to P4 included in the left-eye visible region 61L (first visible region) of the display surface 61 shown in FIG. 5 can reach the user's left eye via the parallax barrier 7 and the projection optical system 110.
  • image light emitted from the sub-pixels P5 to P8 included in the left-eye dimming region 62L, which is the region other than the left-eye visible region 61L, does not reach the user's left eye, or hardly reaches it.
  • to the user, it appears as if the second virtual image 700, which is a virtual image of the parallax barrier 7, exists and defines the direction of the image light coming from the first virtual image 600; the user recognizes the image on that basis.
  • the forward direction is the direction of the second optical member 112 as viewed from the user.
  • the front is a direction in which the moving body 100 normally moves.
  • FIG. 8 is a diagram for explaining the first virtual image 600 visually recognized by the user's left eye.
  • FIG. 9 is a diagram for explaining the first virtual image 600 visually recognized by the user's right eye.
  • the portion where the first virtual image 600 is not visually recognized due to the image light being blocked by the light reducing surface 71 is indicated by oblique lines, and is hereinafter referred to as a virtual image 701 of the light reducing surface 71.
  • the first virtual image 600 includes a part of virtual image subpixels P′1 to P′8, which are virtual images of the subpixels P1 to P8 on the display surface 61.
  • the horizontal length of the virtual image subpixels P′1 to P′8 is VHp
  • the vertical length is VVp.
  • Part of the image light emitted from the display surface 61 can pass through the light-transmitting region 72 of the parallax barrier 7 and be reflected by the projection optical system 110 as described above to reach the left eye.
  • the user's left eye can visually recognize the virtual image 601L of the left-eye visible region 61L that is part of the first virtual image 600.
  • Image light emitted from the display surface 61 other than the image light reaching the left eye can be reduced by the light reduction surface 71 of the parallax barrier 7.
  • the left eye does not visually recognize the virtual image 602L of the left-eye dimming region 62L that is a region other than the virtual image 601L of the left-eye visible region 61L in the first virtual image 600.
  • the image light that is emitted from the display surface 61 and is different from the image light reaching the left eye is transmitted through the light-transmitting regions 72 of the parallax barrier 7 and reflected by the projection optical system 110 as described above, so that it can reach the right eye.
  • accordingly, the user's right eye visually recognizes the virtual image 601R of the right-eye visible region 61R, which is a part of the first virtual image 600 and is different from the virtual image 601L of the left-eye visible region 61L.
  • Image light other than image light that reaches the right eye emitted from the display surface 61 can be dimmed by the dimming surface 71 of the parallax barrier 7.
  • the right eye ideally does not visually recognize the virtual image 602R of the right eye dimming region 62R that is a region other than the virtual image 601R of the right eye visible region 61R in the first virtual image 600.
  • the barrier pitch Bp is an arrangement interval of the parallax barrier 7 in the horizontal direction.
  • the barrier opening width Bw is the horizontal length of the translucent region 72.
  • the gap g is a distance between the display panel 6 and the parallax barrier 7.
  • the appropriate viewing distance VD is a distance between the second virtual image 700 and the user's eye.
  • in practice, the ratio of the distance between the second virtual image 700 and the user's eyes to the distance between the first virtual image 600 and the second virtual image 700 is much larger than the ratio depicted in FIG. 7.
  • the virtual image barrier pitch VBp is the arrangement interval, in the direction corresponding to the first direction, of the virtual images 701 of the light-reducing surfaces 71.
  • the virtual image gap Vg is a distance between the second virtual image 700 and the first virtual image 600.
  • the virtual image pitch Vk is a horizontal arrangement interval between the virtual image of the first subpixel group Pg1 and the virtual image of the second subpixel group Pg2.
  • E : VD = (Vk / 2) : Vg   Formula (1)
  • VD : VBp = (VD + Vg) : Vk   Formula (2)
  • here, E denotes the distance between the user's eyes (interocular distance).
  • the virtual image barrier opening width VBw is appropriately defined based on the appropriate viewing distance VD, the virtual image gap Vg, and the virtual image pitch Vk so that the virtual image 601L of the left-eye visible region 61L and the virtual image 601R of the right-eye visible region 61R do not overlap.
  • the virtual image barrier opening width VBw is a width corresponding to the width of the light transmitting region 72 in the second virtual image 700.
  • the image pitch k, the barrier pitch Bp, the barrier opening width Bw, and the gap g in the display device 4 are defined such that the virtual image barrier pitch VBp, the virtual image gap Vg, the virtual image barrier opening width VBw, and the virtual image pitch Vk satisfy the above conditions. These values are defined in consideration of the performance of the projection optical system 110 and its positional relationship with the display device 4.
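  • As a rough illustration of how Formulas (1) and (2) constrain the virtual-image quantities, the following sketch (hypothetical Python; the numeric values are placeholders, not from the disclosure) solves them for the virtual image pitch Vk and the virtual image barrier pitch VBp given an interocular distance E, an appropriate viewing distance VD, and a virtual image gap Vg:

```python
def virtual_image_parameters(E, VD, Vg):
    """Solve Formulas (1) and (2) for the virtual image pitch Vk and the
    virtual image barrier pitch VBp.

    Formula (1): E : VD = (Vk / 2) : Vg    ->  Vk  = 2 * E * Vg / VD
    Formula (2): VD : VBp = (VD + Vg) : Vk ->  VBp = VD * Vk / (VD + Vg)
    All lengths must share one unit (e.g. millimetres).
    """
    Vk = 2.0 * E * Vg / VD
    VBp = VD * Vk / (VD + Vg)
    return Vk, VBp

# Placeholder example values (not taken from the disclosure):
# interocular distance 62 mm, viewing distance 750 mm, virtual image gap 100 mm.
Vk, VBp = virtual_image_parameters(E=62.0, VD=750.0, Vg=100.0)
print(f"Vk  = {Vk:.3f} mm")   # horizontal period of one pair of sub-pixel groups
print(f"VBp = {VBp:.3f} mm")  # barrier pitch in the virtual image plane
```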
  • the controller 8 is connected to each component of the display device 4 and can be configured to control each component.
  • the controller 8 can be configured as a processor, for example.
  • the controller 8 may include one or more processors.
  • the processor may include a general-purpose processor that reads a specific program and executes a specific function, and a dedicated processor specialized for a specific process.
  • the dedicated processor may include an application-specific integrated circuit (ASIC: Application Specific Integrated Circuit).
  • the processor may include a programmable logic device (PLD: Programmable Logic Device).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the controller 8 may be an SoC (System-on-a-Chip) or a SiP (System-In-a-Package) in which one or a plurality of processors cooperate.
  • the controller 8 includes a storage unit, and may store various information or a program for operating each component of the display device 4 in the storage unit.
  • the storage unit may be configured by, for example, a semiconductor memory.
  • the controller 8 can be configured to control an image to be displayed on the display panel 6 based on a control signal received by the display device 4 or input to the display device 4 by a user operation.
  • the display device 4 can be configured to display a monocular image based on the control signal.
  • the display device 4 may be configured to display at least one of a two-dimensional image and a three-dimensional image in addition to a monocular image.
  • the display device 4 may be operable to switch a display image between a monocular image and a two-dimensional image and / or a three-dimensional image based on a control signal.
  • the controller 8 may be operable to display a black image on a sub-pixel (first sub-pixel) at least partially included in the left-eye visible region 61L, which emits image light that passes through the light-transmitting regions 72 and reaches the user's left eye.
  • the black image is an image having a predetermined luminance such as black.
  • the predetermined luminance can be the luminance of the lowest gradation among the gradations that the sub-pixel can display, or the luminance of a gradation comparable to it.
  • the controller 8 may be operable to display the observation image on a sub-pixel (second sub-pixel) at least partially included in the right-eye visible region 61R, which emits image light that passes through the light-transmitting regions 72 and reaches the right eye.
  • the observation image is an image having an arbitrary luminance to be observed by the user's right eye.
  • the first subpixel corresponds to a subpixel included in the first subpixel group Pg1 described above.
  • the second subpixel corresponds to a subpixel included in the above-described second subpixel group Pg2.
  • the controller 8 may be operable to display black images on the sub-pixels P1 to P4 at least partially included in the left-eye visible region 61L as shown in FIG.
  • a symbol “(B)” is attached to subpixels displaying a black image together with symbols P1 to P8.
  • the left eye of the user does not visually recognize or hardly recognize the virtual image at the positions of the virtual image subpixels P′1 to P′4.
  • the symbol “(B)” is attached to the virtual image subpixels corresponding to the subpixels P1 to P8 displaying the black image together with the symbols P′1 to P′8.
  • the black image has the luminance of the lowest gradation among the gradations that the sub-pixel can display, or the luminance of a gradation comparable to it. Therefore, when the user looks through the second optical member 112 in the direction corresponding to the black image, the user sees, or easily sees, only objects located on the opposite side of the second optical member 112 from the user. That is, the user does not visually recognize, or hardly recognizes, a virtual image at the position corresponding to a sub-pixel displaying the black image.
  • the position corresponding to the sub-pixel displaying the black image is described as the position of the virtual image sub-pixel.
  • the controller 8 may be operable to display observation images on the sub-pixels P5 to P8 at least partially included in the left-eye dimming region 62L, as shown in FIG. 5.
  • the observation image is displayed on the sub-pixels P5 to P8 at least partially included in the right-eye visible region 61R. Therefore, as shown in FIG. 9, the right eye of the user visually recognizes the virtual image of the observation image formed in the virtual image subpixels P′5 to P′8 of the virtual image 601R in the right-eye visible region 61R.
  • the user views the virtual image of the observation image only with the right eye.
  • the user can easily visually recognize the virtual image of the observation image only with the right eye. Therefore, it becomes difficult for the user to recognize the depth direction of the virtual image of the observation image, and it becomes easier to visually recognize the virtual image at the same time as an object far from the virtual image.
  • the left eye of the user does not visually recognize the virtual image of the observation image displayed in the left eye dimming area 62L.
  • however, image light from the left-eye dimming region 62L of the display surface 61 may leak through the light-reducing surfaces 71 of the parallax barrier 7.
  • in that case, the left eye may visually recognize the virtual image of the observation image formed at the virtual image sub-pixels P′5 to P′8 included in the virtual image 602L of the left-eye dimming region 62L. The inventors found that, as a result, the left eye sees an observation image that should be seen only by the right eye, and crosstalk occurs.
  • the inventors have found that the image light from the sub-pixel closer to the left-eye visible region 61L in the left-eye dimming region 62L is more likely to leak from the dimming surface 71.
  • the inventors also found that the lower the illuminance of the surrounding environment, the more easily the user's left eye visually recognizes the virtual image produced by the image light leaking from the left-eye dimming region 62L, which greatly affects the occurrence of crosstalk.
  • the controller 8 can be configured to acquire the illuminance measured by the illuminance measuring instrument 2, and may be operable to display a black image on the first sub-pixels and to control the second sub-pixels based on the illuminance. Specifically, the controller 8 may be operable to control the image displayed on a part of the second sub-pixels, at least a part of which is included in the right-eye visible region 61R, so that it can be switched between the observation image and the black image. At this time, the controller 8 may be operable to display the observation image on the second sub-pixels that do not display the black image.
  • the controller 8 may be operable to control the display state of the display surface 61 so as to be switchable between the first display state and the second display state.
  • the first display state is a state in which the number of subpixels that display a black image is greater than the number of subpixels that display an image for observation in a repeating unit of the first subpixel and the second subpixel.
  • the second display state is a state in which the number of subpixels displaying a black image is the same as the number of subpixels displaying an observation image.
  • when the illuminance measured by the illuminance measuring instrument 2 is equal to or greater than the first illuminance, the controller 8 may be operable to display a black image on the first sub-pixels, as described above.
  • the first illuminance is the lowest illuminance in the illuminance range in which crosstalk due to leakage of image light from the observation image is considered not to be recognized by the user.
  • in this case, the controller 8 may be operable to cause the second sub-pixels to display the observation image.
  • when the illuminance is less than the first illuminance, the controller 8 may likewise be operable to display the black image on the first sub-pixels, as described above.
  • the second illuminance is lower than the first illuminance.
  • the second illuminance is an illuminance within the illuminance range in which crosstalk caused by leakage of the image light of the observation image is considered to be recognized by the user.
  • the second illuminance is the lowest illuminance in an illuminance range in which crosstalk due to leakage of image light from the observation image is not recognized by the user when a part of the observation image is changed to a black image.
  • in this case, the controller 8 may further be operable to display a black image on at least one of the second sub-pixels within a minimum repeating unit of the first sub-pixels and the second sub-pixels.
  • the minimum repeating unit is the smallest unit in which the first sub-pixels and the second sub-pixels are repeatedly arranged; it is the range that includes one each of the sub-pixels P1 to P8 arranged in succession as shown in FIGS. 5 and 6.
  • for example, the controller 8 may set the image displayed on the sub-pixel in contact with the first sub-pixels, among the second sub-pixels, to the black image.
  • the controller 8 may display a black image on the sub pixel closest to the left eye visible region 61L among the second sub pixels.
  • the controller 8 may cause, among the plurality of sub-pixels, the sub-pixel closer to the right eye to display a black image.
  • the controller 8 may display a black image on the plurality of subpixels.
  • the controller 8 may display a black image on the subpixel P5 in addition to the subpixels P1 to P4.
  • the left eye of the user does not visually recognize the virtual image at the positions of the virtual image sub-pixels P′1 to P′4, as shown in FIG. 10.
  • the controller 8 causes the observation image to be displayed on the sub-pixel that does not display the black image among the second sub-pixels.
  • the controller 8 may display the observation image on the subpixels P6 to P8 among the subpixels P5 to P8.
  • the right eye of the user can visually recognize the virtual image of the observation image formed on the virtual image subpixels P′6 to P′8.
  • the right eye of the user does not visually recognize the virtual image at the position of the virtual image subpixel P′5 or is difficult to visually recognize it.
  • in this case, the portion of the observation image whose virtual image is visually recognized by the user's right eye decreases, which could lower visibility.
  • however, the lower the illuminance of the surrounding environment, the more easily the user's eyes can visually recognize the observation image even when the amount of image light is small. For this reason, the reduction in visibility of the virtual image of the observation image can be kept small.
  • the controller 8 may control the parallax barrier 7. Specifically, as shown in FIG. 11, the controller 8 may arrange the parallax barrier 7 so that the center of the left-eye visible region 61L is positioned at the horizontal center of the region where the black image is displayed. As a result, the sub-pixel closest to the left-eye visible region 61L among those displaying the observation image is separated from the left-eye visible region 61L. Therefore, crosstalk due to leakage of the observation image can be further reduced.
  • when the illuminance is less than the second illuminance, the controller 8 may be operable to display a black image on the first sub-pixels, as described above.
  • in this case, the controller 8 may be operable to display a black image on more of the second sub-pixels than in the case where the illuminance is greater than or equal to the second illuminance and less than the first illuminance.
  • for example, the controller 8 may display black images on the sub-pixels P5 and P8 in addition to the sub-pixels P1 to P4. Accordingly, as shown in FIG. 12, the user's left eye does not see, or does not easily see, the virtual image at the positions of the virtual image sub-pixels P′1 to P′4. Even if image light leaks from the left-eye dimming region 62L at this time, the user's left eye does not visually recognize, or hardly recognizes, the virtual image at the positions of the virtual image sub-pixels P′5 and P′8. For this reason, the likelihood that the virtual image of the observation image is visually recognized by the user's left eye because of leaked image light is further reduced. Therefore, the occurrence of crosstalk can be further reduced.
  • the controller 8 may be operable to display an observation image on a sub-pixel that does not display a black image among the second sub-pixels.
  • the controller 8 may display the observation image on the subpixels P6 and P7 among the subpixels P5 to P8.
  • in this case, the user's right eye can visually recognize the virtual image of the observation image formed at the virtual image sub-pixels P′6 and P′7 of the virtual image 602L of the left-eye dimming region 62L.
  • the user's right eye does not visually recognize the virtual image at the positions of the virtual image subpixels P′5 and P′8 or is difficult to visually recognize.
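  • Taken together, the illuminance-dependent switching described above amounts to a simple threshold rule. A minimal sketch follows (hypothetical Python; the function name, the lux thresholds, and the choice of P5 and P8 as the blacked-out second sub-pixels follow the examples above and are otherwise assumptions, not from the disclosure):

```python
def black_subpixels(illuminance, first_illuminance, second_illuminance):
    """Return which sub-pixels of the minimum repeating unit P1..P8 display
    the black image, following the illuminance thresholds described above.

    P1-P4 (first sub-pixels, left-eye visible region 61L): always black.
    P5-P8 (second sub-pixels, right-eye visible region 61R): observation
    image, except for the sub-pixels blacked out at low illuminance.
    """
    black = {"P1", "P2", "P3", "P4"}       # first sub-pixels always show the black image
    if illuminance < second_illuminance:
        black |= {"P5", "P8"}              # darkest case: black out P5 and P8 as well
    elif illuminance < first_illuminance:
        black |= {"P5"}                    # dim case: black out the sub-pixel nearest 61L
    # illuminance >= first_illuminance: all second sub-pixels show the observation image
    return sorted(black)

# Placeholder thresholds in lux (illustrative only):
for lux in (5.0, 30.0, 500.0):
    print(lux, black_subpixels(lux, first_illuminance=50.0, second_illuminance=10.0))
```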
  • the parallax barrier 7 can be configured by a liquid crystal shutter that can be controlled by the controller 8.
  • the controller 8 may be operable to display a two-dimensional image having no parallax on the display panel 6.
  • in this case, the controller 8 may be operable not to provide the light-reducing surfaces 71 on the parallax barrier 7.
  • the controller 8 can be operated so that the transmittance of the liquid crystal shutters constituting the parallax barrier 7 is uniformly equal to the transmittance of the light-transmitting region 72.
  • the image light of the two-dimensional image emitted from the display surface 61 can reach both the right eye and the left eye of the user. Therefore, the right eye and the left eye of the user can visually recognize the same two-dimensional image.
  • the controller 8 may be operable to display the left eye image on the first subpixel at least partially included in the left eye visible region 61L.
  • the controller 8 may be operable to display the right eye image on the second subpixel at least partially included in the right eye visible region 61R.
  • the left eye image and the right eye image are images having parallax with each other. When the left eye image is visually recognized by the left eye and the right eye image is visually recognized by the right eye, a stereoscopic image can be recognized by the user.
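  • For orientation, the three display modes mentioned above (monocular, two-dimensional, and three-dimensional) can be summarized in a small sketch (hypothetical Python; the mode names and return values are illustrative only, not from the disclosure):

```python
def configure_display(mode):
    """Return (barrier state, first sub-pixel content, second sub-pixel content)
    for the three display modes described in the text. Names are illustrative."""
    if mode == "monocular":
        # black image toward the left eye, observation image toward the right eye
        return ("parallax barrier active", "black image", "observation image")
    if mode == "2d":
        # the liquid-crystal barrier is driven uniformly transmissive,
        # so both eyes see the same two-dimensional image
        return ("uniformly transmissive", "2D image", "2D image")
    if mode == "3d":
        # parallax images: left-eye image and right-eye image
        return ("parallax barrier active", "left-eye image", "right-eye image")
    raise ValueError(f"unknown mode: {mode}")

print(configure_display("monocular"))
print(configure_display("3d"))
```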
  • as described above, the controller 8 is operable to display the black image on the first sub-pixels at least partially included in the left-eye visible region 61L.
  • the controller 8 may be configured to control so that an image to be displayed on a part of the second subpixel at least part of which is included in the right eye visible region 61R can be switched between the observation image and the black image. For this reason, the amount of image light of the observation image that leaks from the dimming surface 71 can be controlled. Therefore, the amount of image light of the observation image transmitted to the user's left eye is controlled, and the observation image that the left eye visually recognizes together with the black image can be reduced. Thereby, the occurrence of crosstalk can be reduced.
  • one alternative would be to adjust the barrier aperture ratio, which is the ratio of the barrier opening width Bw to the barrier pitch Bp.
  • in the present embodiment, however, it suffices to display either the observation image or the black image on any of the second sub-pixels included in the left-eye dimming region 62L; compared with the case where the barrier opening width Bw is changed, precise control is not required. Therefore, the occurrence of crosstalk can be easily reduced.
  • the controller 8 can be configured to control an image to be displayed on the second sub-pixel based on the illuminance measured by the illuminance measuring instrument 2.
  • the human eye recognizes the observation image more easily as the illuminance of the surrounding environment becomes lower. Therefore, by displaying either the black image or the observation image on the second sub-pixels according to the illuminance, the observation image can be reliably viewed by the right eye while crosstalk is appropriately reduced.
  • as described above, when the illuminance is less than the first illuminance, the controller 8 displays a black image on at least one of the second sub-pixels within the minimum repeating unit of the first sub-pixels and the second sub-pixels.
  • the controller 8 causes the observation image to be displayed on the second sub-pixels that do not display the black image. This reduces the likelihood that the virtual image of the observation image is visually recognized by the user's left eye due to leakage of its image light. Therefore, the occurrence of crosstalk can be reduced.
  • moreover, the lower the illuminance of the surrounding environment, the more easily an image is visually recognized even when the amount of image light is small. Therefore, a decrease in the visibility of the virtual image of the observation image by the right eye can be prevented even if the amount of its image light decreases.
  • the controller 8 displays a black image on the sub-pixel closest to the left-eye visible region 61L among the second sub-pixels.
  • when the illuminance is less than the second illuminance, which is lower than the first illuminance, the controller 8 may be operable to display a black image on more of the second sub-pixels than in the case where the illuminance is greater than or equal to the second illuminance.
  • this further reduces the likelihood that the virtual image of the observation image is visually recognized by the user's left eye due to leakage of its image light. Therefore, the occurrence of crosstalk can be further reduced.
  • as noted above, the lower the illuminance of the surrounding environment, the more easily an image is visually recognized even when the amount of image light is small, so a decrease in the visibility of the virtual image of the observation image by the right eye can be prevented even if the amount of its image light decreases.
  • the head-up display system 1 according to the second embodiment of the present disclosure includes an illuminance measuring instrument 2 and a head-up display 3, as shown in FIG. 13.
  • the head-up display system 1 according to the second embodiment is different from the first embodiment in that it further includes a detection device 9.
  • in the second embodiment, only the configuration different from the first embodiment will be described.
  • the configuration that is not described in the second embodiment is the same as that of the first embodiment.
  • the detection device 9 can be configured to detect the position of either the left eye or the right eye of the user and output it to the controller 8.
  • the detection device 9 may include a camera, for example.
  • the detection device 9 may photograph the user's face with a camera.
  • the detection device 9 may detect the position of at least one of the left eye and the right eye from a captured image of the camera.
  • the detection device 9 may detect the position of at least one of the left eye and the right eye as coordinates in a three-dimensional space from a captured image of one camera.
  • the detection device 9 may detect the position of at least one of the left eye and the right eye as coordinates in a three-dimensional space from the captured images of two or more cameras.
  • Detecting device 9 may not be equipped with a camera and may be connected to a camera outside the device.
  • the detection device 9 may include an input terminal for inputting a signal from a camera outside the device.
  • the camera outside the apparatus may be directly connected to the input terminal.
  • the camera outside the apparatus may be indirectly connected to the input terminal via a shared network.
  • the detection device 9 that does not include a camera may include an input terminal through which the camera inputs a video signal.
  • the detection device 9 that does not include a camera may detect the position of at least one of the left eye and the right eye from the video signal input to the input terminal.
  • the detection device 9 may include a sensor, for example.
  • the sensor may be an ultrasonic sensor or an optical sensor.
  • the detection device 9 may detect the position of the user's head using a sensor, and may detect the position of at least one of the left eye and the right eye based on the position of the head.
  • the detection device 9 may detect the position of at least one of the left eye and the right eye as coordinates in a three-dimensional space with one or more sensors.
  • the detection device 9 may detect the movement distance of the left eye and the right eye along the arrangement direction of both eyes based on the detection result of at least one position of the left eye and the right eye.
  • the head-up display system 1 does not have to include the detection device 9.
  • the controller 8 may include an input terminal that inputs a signal from a detection device outside the device.
  • the detection device outside the device may be connected to the input terminal.
  • the detection device outside the device may use an electric signal and an optical signal as a transmission signal for the input terminal.
  • the detection device outside the device may be indirectly connected to the input terminal via a shared network.
  • the controller 8 may receive the position of at least one of the left eye and the right eye acquired from a detection device outside the device.
  • the controller 8 can be configured to determine the positions of the left-eye visible region 61L and the right-eye visible region 61R based on the illuminance measured by the illuminance measuring instrument 2 and on the position of the user's eyes detected by the detection device 9.
  • for example, the controller 8 may determine the positions of the virtual image 601L of the left-eye visible region 61L and the virtual image 601R of the right-eye visible region 61R based on the detected eye positions. The controller 8 may then determine the positions of the left-eye visible region 61L and the right-eye visible region 61R from the positions of the virtual image 601L of the left-eye visible region 61L and the virtual image 601R of the right-eye visible region 61R.
  • in the first embodiment, the controller 8 is configured to control the second sub-pixels, at least a part of which is included in the left-eye dimming region 62L, based on the position of the left-eye visible region 61L when the user's eyes are at the reference position.
  • in the second embodiment, the controller 8 can be configured to control the second sub-pixels based on the position of the left-eye visible region 61L determined from the position of the user's eyes detected by the detection device 9.
  • the black image and the observation image can be appropriately displayed for each of the left eye and the right eye of the user based on the position of the left eye visible region 61L.
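  • A minimal sketch of how the eye position detected by the detection device 9 might be mapped to an updated left-eye visible region is given below (hypothetical Python; the linear shift model and all names and values are assumptions for illustration, not taken from the disclosure):

```python
def shift_visible_subpixels(base_visible, eye_offset_mm, vd_mm, vg_mm, vhp_mm, n_repeat=8):
    """Shift the indices (0-based, within the repeating unit P1..P8) of the
    sub-pixels forming the left-eye visible region 61L when the left eye moves
    horizontally by eye_offset_mm from the reference position.

    Assumes the visible region slides linearly with the eye, by roughly
    eye_offset * Vg / VD in the virtual image plane (similar triangles),
    expressed here in units of the virtual image sub-pixel width VHp.
    """
    shift = round(eye_offset_mm * vg_mm / vd_mm / vhp_mm)
    return [(i + shift) % n_repeat for i in base_visible]

# Reference position: P1-P4 (indices 0..3) are visible to the left eye.
# Placeholder geometry: VD = 750 mm, Vg = 100 mm, VHp = 0.2 mm.
print(shift_visible_subpixels([0, 1, 2, 3], eye_offset_mm=5.0,
                              vd_mm=750.0, vg_mm=100.0, vhp_mm=0.2))
```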
  • The display device 4 described above includes the parallax barrier 7 as the optical element, but the optical element is not limited thereto.
  • For example, the display device 4 may include a lenticular lens as the optical element.
  • In that case, the lenticular lens defines the traveling direction of the image light emitted from the sub-pixels P1 to P4 included in the left-eye visible region 61L so that the image light passes through the projection optical system 110 and reaches the user's left eye.
  • Likewise, the lenticular lens defines the traveling direction of the image light emitted from the sub-pixels P5 to P8 included in the right-eye visible region 61R so that the image light passes through the projection optical system 110 and reaches the user's right eye.
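
The eye-position processing described in the list above can be illustrated with a brief sketch. It is a minimal, non-authoritative example that assumes the detection device reports each eye position as (x, y, z) coordinates in its own frame; the function name, the coordinate frame, and the sample values are assumptions made only for illustration and are not taken from this disclosure.

```python
import numpy as np

def eye_movement_along_arrangement(prev_left, prev_right, cur_left, cur_right):
    """Illustrative only: estimate how far the eyes have moved along the
    direction in which the two eyes are arranged (the inter-ocular axis).

    All arguments are 3-D coordinates (x, y, z) in the detection device's
    frame; the names and the frame are assumptions, not from the patent.
    """
    prev_left, prev_right = np.asarray(prev_left, float), np.asarray(prev_right, float)
    cur_left, cur_right = np.asarray(cur_left, float), np.asarray(cur_right, float)

    # Unit vector along the arrangement direction of both eyes (left -> right).
    axis = cur_right - cur_left
    axis /= np.linalg.norm(axis)

    # Displacement of the midpoint between the eyes, projected onto that axis.
    prev_mid = (prev_left + prev_right) / 2.0
    cur_mid = (cur_left + cur_right) / 2.0
    return float(np.dot(cur_mid - prev_mid, axis))

# Example: the eyes shift 5 mm sideways along the inter-ocular axis.
d = eye_movement_along_arrangement(
    prev_left=(0, 0, 600), prev_right=(62, 0, 600),
    cur_left=(5, 0, 600), cur_right=(67, 0, 600))
print(f"movement along arrangement direction: {d:.1f} mm")  # ~5.0 mm
```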
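
A second minimal sketch illustrates the kind of control attributed to the controller 8 above: re-assigning sub-pixels to the left-eye and right-eye visible regions as the detected eye position shifts, and switching the second sub-pixels between the black image and an observation image of arbitrary luminance. The eight-sub-pixel period (P1 to P8), the shift constant, and the function names are assumptions made for this example only.

```python
# Illustrative controller logic. It assumes, as stated above, eight sub-pixels
# P1..P8 per barrier period, with P1-P4 nominally visible to the left eye and
# P5-P8 to the right eye when the eyes are at the reference position. The
# shift-per-millimetre constant is hypothetical.

BLACK = 0          # luminance of the black image
OBSERVATION = 255  # arbitrary luminance chosen for the observation image

SUBPIXELS_PER_PERIOD = 8
NOMINAL_LEFT = {0, 1, 2, 3}   # indices of P1-P4 at the reference position
SHIFT_PER_MM = 0.5            # assumed index shift per mm of eye movement

def visible_indices(eye_shift_mm: float) -> tuple[set[int], set[int]]:
    """Return (left-eye visible, right-eye visible) sub-pixel indices in one period."""
    shift = round(eye_shift_mm * SHIFT_PER_MM) % SUBPIXELS_PER_PERIOD
    left = {(i + shift) % SUBPIXELS_PER_PERIOD for i in NOMINAL_LEFT}
    right = set(range(SUBPIXELS_PER_PERIOD)) - left
    return left, right

def drive_period(eye_shift_mm: float, show_observation: bool) -> list[int]:
    """Luminance commands for one period: the sub-pixels assigned to the first
    (left-eye) visible region always show the black image, while the remaining
    (second) sub-pixels are switched between the black image and the
    observation image."""
    left, _right = visible_indices(eye_shift_mm)
    return [BLACK if i in left else (OBSERVATION if show_observation else BLACK)
            for i in range(SUBPIXELS_PER_PERIOD)]

print(drive_period(0.0, show_observation=True))   # P1-P4 black, P5-P8 observation
print(drive_period(4.0, show_observation=False))  # regions shifted, all black
```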

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Instrument Panels (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

This head-up display according to the present invention is provided with a display surface, an optical element, a projection optical system, and a controller. The display surface includes a plurality of sub-pixels. The plurality of sub-pixels are arranged in a grid pattern in a first direction and in a second direction approximately perpendicular to the first direction. The display surface includes a plurality of band-shaped regions. The plurality of band-shaped regions extend in a direction defined on the display surface. The optical element defines the ray direction of the image light for each of the plurality of band-shaped regions. The image light is emitted from the sub-pixels. The projection optical system reflects the image light to form a virtual image of the image displayed on the display surface. The controller controls the display surface. The band-shaped regions include a first visible region and a second visible region. In the first visible region, the ray direction is defined by the optical element such that image light that reaches a first eye of a user is emitted. In the second visible region, the ray direction is defined by the optical element such that image light that reaches a second eye of the user, different from the first eye, is emitted. The controller displays a black image on first sub-pixels, at least a part of which is included in the first visible region. The controller performs control such that an image displayed on some of the second sub-pixels, at least a part of which are included in the second visible region, can be switched between the black image and an observation image having an arbitrary luminance.
PCT/JP2019/006084 2018-02-19 2019-02-19 Head-up display, head-up display system, and moving body WO2019160160A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-027379 2018-02-19
JP2018027379A JP7129789B2 (ja) 2018-02-19 2019-02-19 Head-up display, head-up display system, and moving body

Publications (1)

Publication Number Publication Date
WO2019160160A1 true WO2019160160A1 (fr) 2019-08-22

Family

ID=67618987

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/006084 WO2019160160A1 (fr) 2018-02-19 2019-02-19 Head-up display, head-up display system, and moving body

Country Status (2)

Country Link
JP (1) JP7129789B2 (fr)
WO (1) WO2019160160A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116601548A (zh) 2020-12-14 2023-08-15 Kyocera Corporation Three-dimensional display device, image display system, and moving body
EP4283987A1 (fr) * 2021-02-26 2023-11-29 Kyocera Corporation Image display device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005331844A (ja) * 2004-05-21 2005-12-02 Toshiba Corp Stereoscopic image display method, stereoscopic image capturing method, and stereoscopic image display device
WO2012176445A1 (fr) * 2011-06-20 2012-12-27 Panasonic Corporation Image display device
JP2013076724A (ja) * 2011-09-29 2013-04-25 Japan Display West Co Ltd Display device, display panel, and electronic apparatus
JP2014110568A (ja) * 2012-12-03 2014-06-12 Sony Corp Image processing device, image processing method, and program
JP2015194709A (ja) * 2014-03-28 2015-11-05 Panasonic Intellectual Property Management Co., Ltd. Image display device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114503556A (zh) * 2019-10-01 2022-05-13 Kyocera Corporation Three-dimensional display device, controller, three-dimensional display method, three-dimensional display system, and moving body
CN114503555A (zh) * 2019-10-01 2022-05-13 Kyocera Corporation Three-dimensional display device, three-dimensional display system, and moving body
CN114730095A (zh) * 2019-11-27 2022-07-08 Kyocera Corporation Head-up display system and moving body
CN114730096A (zh) * 2019-11-27 2022-07-08 Kyocera Corporation Head-up display system and moving body
US11881130B2 (en) 2019-11-27 2024-01-23 Kyocera Corporation Head-up display system and moving body

Also Published As

Publication number Publication date
JP7129789B2 (ja) 2022-09-02
JP2019145967A (ja) 2019-08-29

Similar Documents

Publication Publication Date Title
WO2019160160A1 (fr) Head-up display, head-up display system, and moving body
JP6924637B2 (ja) Three-dimensional display device, three-dimensional display system, moving body, and three-dimensional display method
CN110832382B (zh) Image projection device and moving body
JP7100523B2 (ja) Display device, display system, and moving body
US20200053352A1 (en) Three-dimensional display apparatus, three-dimensional display system, head-up display system, and mobile body
JP2018120191A (ja) Three-dimensional display system, head-up display system, and moving body
US11616940B2 (en) Three-dimensional display device, three-dimensional display system, head-up display, and mobile object
WO2020090626A1 (fr) Image display device, image display system, and moving body
US20230004002A1 (en) Head-up display, head-up display system, and movable body
CN113614613B (zh) Stereoscopic virtual image display module, stereoscopic virtual image display system, and moving body
JP7227116B2 (ja) Three-dimensional display device, controller, three-dimensional display method, three-dimensional display system, and moving body
WO2020130049A1 (fr) Three-dimensional display device, head-up display system, and moving body
WO2019225400A1 (fr) Image display device, image display system, head-up display, and moving object
WO2020130048A1 (fr) Three-dimensional display device, head-up display system, and moving object
WO2019163817A1 (fr) Image display device, head-up display system, and moving body
WO2020004275A1 (fr) Three-dimensional display device, controller, three-dimensional display method, three-dimensional display system, and moving body
WO2022019154A1 (fr) Three-dimensional display device
JP7346587B2 (ja) Head-up display, head-up display system, and moving body
WO2020130047A1 (fr) Three-dimensional display device, three-dimensional display system, head-up display, and moving object
CN114730096A (zh) Head-up display system and moving body
JP7475191B2 (ja) Interocular distance measurement method and calibration method
WO2022149599A1 (fr) Three-dimensional display device
WO2021060011A1 (fr) Parallax barrier, three-dimensional (3D) display device, 3D display system, head-up display, and moving body
WO2021066111A1 (fr) Three-dimensional display device, three-dimensional display system, and moving object
JP2021056255 (ja) Parallax barrier, three-dimensional display device, three-dimensional display system, head-up display, and moving body

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19754455

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19754455

Country of ref document: EP

Kind code of ref document: A1