WO2020130049A1 - Three-dimensional display device, head-up display system, and moving body - Google Patents

Three-dimensional display device, head-up display system, and moving body

Info

Publication number
WO2020130049A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
image
eye
user
sub
Prior art date
Application number
PCT/JP2019/049677
Other languages
English (en)
Japanese (ja)
Inventor
Kaoru Kusafuka
Hideya Takahashi
Goro Hamagishi
Original Assignee
Kyocera Corporation
Public University Corporation Osaka
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Corporation and Public University Corporation Osaka
Publication of WO2020130049A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/122 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H04N13/125 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues for crosstalk reduction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/317 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using slanted parallax optics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/346 Image reproducers using prisms or semi-transparent mirrors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/363 Image reproducers using image projection screens
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/366 Image reproducers using viewer tracking

Definitions

  • the present disclosure relates to a three-dimensional display device, a head-up display system, and a moving body.
  • the three-dimensional display device of the present disclosure includes a display panel, an optical element, an acquisition unit, and a controller.
  • the display panel is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the optical element is configured to define a light ray direction of image light emitted from the display panel.
  • the acquisition unit is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the display panel includes a first display area configured to display the first image visually recognized by the first eye of the user, and a second display area configured to display the second image visually recognized by the second eye of the user.
  • the first display areas and the second display areas are alternately arranged on the display surface of the display panel.
  • the optical element includes a first light-transmissive region configured to transmit the image light at a first transmittance, and a second light-transmissive region configured to transmit the image light at a second transmittance.
  • the first light transmissive regions and the second light transmissive regions are alternately arranged in a plane of the optical element along the display surface of the display panel.
  • the controller is configured to set the width of each of the first display areas and the second display areas based on the observation distance.
  • the head-up display system of the present disclosure includes a display panel, an optical element, an acquisition unit, an optical member, and a controller.
  • the display panel is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the optical element is configured to define a light ray direction of image light emitted from the display panel.
  • the acquisition unit is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the optical member is configured to allow the user to visually recognize the image light emitted from the display panel as a virtual image.
  • the display panel includes a first display area configured to display the first image visually recognized by the first eye of the user, and a second display area configured to display the second image visually recognized by the second eye of the user.
  • the optical element includes a first light-transmissive region configured to transmit the image light at a first transmittance, and a second light-transmissive region configured to transmit the image light at a second transmittance.
  • the first light transmissive regions and the second light transmissive regions are alternately arranged in a plane of the optical element along the display surface of the display panel.
  • the mobile object of the present disclosure includes a head-up display system.
  • the head-up display system includes a display panel, an optical element, an acquisition unit, an optical member, and a controller.
  • the display panel is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the optical element is configured to define a light ray direction of image light emitted from the display panel.
  • the acquisition unit is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the optical member is configured to allow the user to visually recognize the image light emitted from the display panel as a virtual image.
  • the display panel includes a first display area configured to display the first image visually recognized by the first eye of the user, and a second display area configured to display the second image visually recognized by the second eye of the user.
  • the first display areas and the second display areas are alternately arranged on the display surface of the display panel.
  • the optical element includes a first light-transmissive region configured to transmit the image light at a first transmittance, and a second light-transmissive region configured to transmit the image light at a second transmittance.
  • the first light transmissive regions and the second light transmissive regions are alternately arranged in a plane of the optical element along the display surface of the display panel.
  • the controller is configured to set the width of each of the first display areas and the second display areas based on the observation distance.
  • FIG. 1 is a diagram showing an example of a three-dimensional display system according to an embodiment viewed from a vertical direction.
  • FIG. 2 is a diagram showing an example of the display panel shown in FIG. 1 viewed from the depth direction.
  • FIG. 3 is a diagram showing an example of the barrier shown in FIG. 1 viewed from the depth direction.
  • FIG. 4 is a diagram for explaining the left visible region in the display panel shown in FIG.
  • FIG. 5 is a diagram for explaining the right visible region in the display panel shown in FIG.
  • FIG. 6 is a schematic diagram showing the sub-pixels visually recognized by the left and right eyes of the user located at the proper viewing distance d.
  • FIG. 7 is a schematic diagram showing an example of sub-pixels that the left eye and the right eye of the user visually recognize when the observation distance Y is shorter than the proper viewing distance d.
  • FIG. 8 is a schematic diagram showing an example of sub-pixels visually recognized by the left and right eyes of a near user whose observation distance Y is half the proper viewing distance d (d/2).
  • FIG. 9 is a schematic diagram showing another example of sub-pixels visually recognized by the left and right eyes of a near user whose observation distance Y is half the proper viewing distance d (d/2).
  • FIG. 10 is a flowchart showing the processing of the three-dimensional display system according to the embodiment.
  • FIG. 11 is a schematic configuration diagram of a three-dimensional display device when the optical element is a lenticular lens.
  • FIG. 12 is a diagram showing an example of a head-up display system equipped with the three-dimensional display system according to this embodiment.
  • FIG. 13 is a diagram showing an example of a moving body equipped with the head-up display system.
  • An object of the present disclosure is to provide a three-dimensional display device, a head-up display system, and a moving body that allow a user to appropriately visually recognize a three-dimensional image.
  • a three-dimensional display system 100 is configured to include a detection device 1 and a three-dimensional display device 2 as shown in FIG.
  • the three-dimensional display system 100 displays an image on the display panel 5 of the three-dimensional display device 2. Part of the image light emitted from the display panel 5 is blocked by the barrier 6, so that different image lights reach the left and right eyes of the user.
  • the user can view the image stereoscopically because there is a parallax between the image viewed by the left eye and the image viewed by the right eye.
  • the three-dimensional display device 2 adjusts the width of the image displayed on the display panel 5 according to the distance between the user's eyes and the barrier 6 detected by the detection device 1.
  • the three-dimensional display system 100 can allow the user to appropriately visually recognize the three-dimensional image regardless of the change in the distance to the user.
  • the detection device 1 is configured to detect the position of the user's eyes.
  • the detection device 1 may detect the position of at least one of the left eye and the right eye of the user.
  • one eye of the user is also referred to as a first eye.
  • the other eye of the user is also referred to as the second eye.
  • the left eye is the first eye and the right eye is the second eye, but they may be reversed.
  • the position of the user's eyes is represented by, for example, coordinates in a three-dimensional space, but is not limited to this.
  • the detection device 1 may include, for example, a camera.
  • the detection device 1 may capture a user's face with a camera.
  • the detection device 1 may detect the position of the eyes of the user from the captured image of the face of the user.
  • the detection device 1 may detect the position of the user's eyes as the coordinates of the three-dimensional space from the image captured by one camera.
  • the detection device 1 may detect the position of the user's eyes as coordinates in a three-dimensional space from images captured by two or more cameras.
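The two-camera case above can be illustrated with standard stereo triangulation. This sketch is not taken from the disclosure: the function, the rectified-camera assumption, and all numeric parameters (focal length, baseline, pixel coordinates) are illustrative.

```python
# Hypothetical sketch: recovering an eye position as 3D coordinates from two
# horizontally separated, rectified cameras via standard stereo triangulation.

def triangulate_eye(x_left: float, x_right: float, y: float,
                    f: float, baseline: float) -> tuple:
    """Return (X, Y, Z) of a point seen at pixel column x_left in the left
    camera and x_right in the right camera (same row y, coordinates relative
    to the principal point; f in pixels, baseline in mm)."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must be in front of both cameras")
    z = f * baseline / disparity   # depth from disparity
    x = z * x_left / f             # back-project into 3D
    y3d = z * y / f
    return (x, y3d, z)

# Example: f = 800 px, baseline = 60 mm, 10 px disparity -> depth 4800 mm
print(triangulate_eye(410.0, 400.0, 120.0, 800.0, 60.0))
```

In practice the detected eye coordinates would then be handed to the acquisition unit 3 described below.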
  • the detection device 1 outputs the position of at least one of the left eye and the right eye of the user to the three-dimensional display device 2.
  • the detection device 1 need not include a camera and may instead be connected to a camera outside the device.
  • the detection device 1 may include an input terminal for inputting an image pickup signal from a camera outside the device.
  • the camera outside the device may be directly connected to the input terminal.
  • a camera outside the device may be indirectly connected to the input terminal via a shared network.
  • the detection device 1 may detect the position of the eye of the user from the video signal input to the input terminal.
  • the detection device 1 may include a sensor, for example.
  • the sensor may be an ultrasonic sensor or an optical sensor.
  • the detection device 1 may detect the position of the user's head with a sensor, and may detect the position of the user's eye based on the position of the head.
  • the detection device 1 may detect the position of the user's eyes as coordinates in the three-dimensional space by using one or more sensors.
  • the three-dimensional display system 100 does not have to include the detection device 1.
  • the three-dimensional display device 2 may include an input terminal for inputting a signal from the detection device outside the system.
  • the detection device outside the system may be directly connected to the input terminal.
  • the detection device outside the system may be indirectly connected to the input terminal via a shared network.
  • the three-dimensional display device 2 may acquire the position of the user's eyes from a detection device outside the system.
  • the three-dimensional display device 2 includes an acquisition unit 3, an illuminator 4, a display panel 5, a barrier 6, and a controller 7.
  • the acquisition unit 3 may be configured to acquire the position of at least one of the left eye and the right eye of the user detected by the detection device 1.
  • the acquisition unit 3 may include, for example, a communication module or the like.
  • the acquisition unit 3 may determine the distance between the user's eye and the barrier 6 from the acquired position of the user's eye.
  • the distance between the user's eyes and the barrier 6 may be the distance between the barrier 6 and at least one of the user's left and right eyes.
  • the distance between the user's eyes and the barrier 6 is also referred to as the user's observation distance.
  • the illuminator 4 can illuminate the display panel 5 in a plane.
  • the illuminator 4 may include a light source, a light guide plate, a diffusion plate, a diffusion sheet, and the like.
  • the illuminator 4 emits irradiation light from a light source, and homogenizes the irradiation light in the surface direction of the display panel 5 using a light guide plate, a diffusion plate, a diffusion sheet, or the like.
  • the illuminator 4 can emit uniformized light to the display panel 5.
  • the display panel 5 is, for example, a display panel such as a transmissive liquid crystal display panel, but is not limited to this. As shown in FIG. 2, the display panel 5 has a plurality of partitioned areas on a planar active area 51.
  • the active area 51 displays a mixed image.
  • the active area 51 is also referred to as a display surface.
  • the mixed image includes a left-eye image and a right-eye image having a parallax with respect to the left-eye image.
  • the left-eye image is also referred to as the first image.
  • the right eye image is also referred to as the second image.
  • the mixed image, which will be described in detail later, may further include a third image.
  • the partitioned area is an area partitioned by the grid-like black matrix 52 in the first direction and the second direction orthogonal to the first direction.
  • the direction orthogonal to the first direction and the second direction is referred to as the third direction.
  • the first direction may be referred to as the horizontal direction.
  • the second direction may be referred to as the vertical direction.
  • the third direction may be referred to as the depth direction.
  • the first direction, the second direction, and the third direction are not limited to these.
  • the first direction is represented as the x-axis direction
  • the second direction is represented as the y-axis direction
  • the third direction is represented as the z-axis direction.
  • the active area 51 includes a plurality of sub-pixels arranged in a grid along the horizontal and vertical directions.
  • each sub-pixel corresponds to one of the colors R (Red), G (Green), and B (Blue), and a set of three sub-pixels R, G, and B can constitute one pixel. One pixel is also referred to as one picture element.
  • the horizontal direction is, for example, a direction in which a plurality of subpixels forming one pixel are arranged.
  • the vertical direction is, for example, a direction in which subpixels of the same color are arranged.
  • the display panel 5 is not limited to a transmissive liquid crystal panel, and may be another display panel such as an organic EL (Electro-Luminescence) panel. When the display panel 5 is a self-luminous display panel, the three-dimensional display device 2 does not need to include the illuminator 4.
  • a plurality of sub-pixels arranged in the active area 51 can form a sub-pixel group Pg.
  • the sub-pixel group Pg is a minimum unit in which the controller 7, which will be described later, performs control for displaying an image in the active area 51.
  • the controller 7 causes the plurality of sub-pixels included in one sub-pixel group Pg to display the left-eye image or the right-eye image.
  • the number of subpixels displaying the left-eye image and the number of subpixels displaying the right-eye image may be the same.
  • the sub-pixel group Pg may be repeatedly arranged in the horizontal direction.
  • the sub-pixel group Pg may be repeatedly arranged adjacent to a position vertically displaced by one sub-pixel in the vertical direction.
  • a sub-pixel group Pg including twelve sub-pixels P1 to P12 arranged continuously, one in the vertical direction and twelve in the horizontal direction, is arranged.
  • the subpixels P(1) to P(2×n×b) included in all the subpixel groups Pg may be collectively controlled by the controller 7. For example, when switching the image displayed on the sub-pixel P1 from the left-eye image to the right-eye image, the controller 7 may simultaneously switch the image displayed on the sub-pixel P1 included in all the sub-pixel groups Pg from the left-eye image to the right-eye image. The controller 7 may change the number of subpixels included in the subpixel group Pg.
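The collective control described above can be pictured with a minimal sketch. Names and the particular left/right assignment are illustrative, not from the disclosure; it shows a sub-pixel group of 2×n = 12 sub-pixels (P1 to P12) in which one index is switched across all groups simultaneously.

```python
# Minimal sketch of a controller assigning left ('L') and right ('R') images
# inside sub-pixel groups of 2*n sub-pixels and switching one index at once.

def make_group(n: int, left_indices: set) -> list:
    """Label sub-pixels P1..P(2n) 'L' or 'R' for one sub-pixel group."""
    return ['L' if i in left_indices else 'R' for i in range(1, 2 * n + 1)]

n = 6
left = {1, 2, 3, 4, 5, 6}            # sub-pixels showing the left-eye image
groups = [make_group(n, left) for _ in range(3)]  # same pattern in every group

# Switching P1 from left to right happens in all groups at once:
left.discard(1)
groups = [make_group(n, left) for _ in range(3)]
print(groups[0])  # ['R', 'L', 'L', 'L', 'L', 'L', 'R', 'R', 'R', 'R', 'R', 'R']
```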
  • the barrier 6 is formed by a flat surface along the active area 51, and is arranged apart from the active area 51 by a predetermined distance (gap) g.
  • the barrier 6 is, for example, a parallax barrier, but is not limited to this and may be any optical element.
  • the barrier 6 may be located on the opposite side of the illuminator 4 with respect to the display panel 5.
  • the barrier 6 may be configured to define the light ray direction of the image light emitted from the display panel 5. As shown in FIG. 3, the barrier 6 has a plurality of light blocking surfaces 61 that block image light.
  • the plurality of light shielding surfaces 61 define a light transmitting area 62 between the light shielding surfaces 61 adjacent to each other.
  • the light transmitting region 62 has a higher light transmittance than the light shielding surface 61.
  • the light shielding surface 61 has a lower light transmittance than the light transmitting area 62.
  • the light transmitting area 62 is also referred to as a first light transmitting area.
  • the light shielding surface 61 is also referred to as a second light transmitting area.
  • the light-transmitting area 62 is a portion for transmitting light incident on the barrier 6.
  • the translucent region 62 may transmit light at the first transmittance.
  • the first transmittance is, for example, about 100%, but is not limited to this, and may be a value in a range in which the image light emitted from the display panel 5 can be visually recognized well.
  • the first transmittance may be, for example, 80% or more, or 50% or more.
  • the light-shielding surface 61 is a portion that blocks light incident on the barrier 6 and transmits almost none of it. That is, the light-shielding surface 61 blocks the image displayed in the active area 51 of the display panel 5 from reaching the eyes of the user.
  • the light shielding surface 61 may transmit light at the second transmittance.
  • the second transmittance is, for example, approximately 0%, but is not limited to this, and may be a value greater than 0% and a value close to 0% such as 0.5%, 1%, or 3%.
  • the first transmittance can be set to a value that is several times or more, for example, 10 times or more larger than the second transmittance.
  • the translucent region 62 may be a plurality of strip-shaped regions extending in a predetermined direction in the plane.
  • the translucent area 62 defines the light ray direction, which is the direction in which the image light emitted from the sub-pixels propagates.
  • the predetermined direction is a direction that forms a predetermined angle that is not 0 degree or 90 degrees with the vertical direction.
  • the light-transmitting regions 62 and the light-shielding surfaces 61 may extend in a predetermined direction along the active area 51 and may be repeatedly and alternately arranged in a direction orthogonal to the predetermined direction.
  • the barrier 6 may be configured to define the light ray direction of the image light emitted from the display panel 5 by the light shielding surface 61 and the light transmitting area 62. As shown in FIG. 1, the barrier 6 defines the image light emitted from the sub-pixels arranged in the active area 51, thereby defining the area on the active area 51 where the user's eyes can visually recognize.
  • the area in the active area 51 that emits the image light propagating to the position of the user's eyes is referred to as a visible area 51a.
  • a region in the active area 51 that emits image light propagating to the position of the left eye of the user is referred to as a left visible region 51aL.
  • the left visible area 51aL is also referred to as a first visible area.
  • a region in the active area 51 that emits image light propagating to the position of the right eye of the user is referred to as a right visible region 51aR.
  • the right visible region 51aR is also referred to as a second visible region.
  • the proper viewing distance d is also referred to as the OVD (Optimum Viewing Distance).
  • the barrier pitch Bp, which is the horizontal arrangement interval of the light-transmitting regions 62, and the gap g between the active area 51 and the barrier 6 are defined so that the following equations (1) and (2) hold, using the horizontal length Hp of a sub-pixel, the number n of sub-pixels of a one-eye image, the proper viewing distance d, and the interocular distance E.
  • E : d = (n × Hp) : g    Equation (1)
  • d : Bp = (d + g) : (2 × n × Hp)    Equation (2)
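Equations (1) and (2) can be solved in closed form for the gap g and the barrier pitch Bp. A minimal numeric sketch follows; the values for E, d, Hp, and n are illustrative assumptions, not taken from the disclosure.

```python
# Solve the parallax-barrier design equations for gap g and barrier pitch Bp:
#   E : d  = (n * Hp) : g          -> g  = d * n * Hp / E
#   d : Bp = (d + g) : (2 * n * Hp) -> Bp = 2 * n * Hp * d / (d + g)

def barrier_geometry(E: float, d: float, Hp: float, n: int):
    """E, d, Hp in mm; n = sub-pixels per one-eye image. Returns (g, Bp)."""
    g = d * n * Hp / E               # from equation (1)
    Bp = 2 * n * Hp * d / (d + g)    # from equation (2)
    return g, Bp

# Illustrative values: E = 62.4 mm, d = 750 mm, Hp = 0.05 mm, n = 6
g, Bp = barrier_geometry(62.4, 750.0, 0.05, 6)
print(round(g, 3), round(Bp, 3))  # g ≈ 3.606 mm, Bp ≈ 0.597 mm
```

Note that Bp comes out slightly smaller than 2×n×Hp, which is what makes the views converge at the proper viewing distance.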
  • the proper viewing distance d is the distance between the barrier 6 and at least one of the left eye and the right eye of the user at which the horizontal length of the visible region 51a is n sub-pixels.
  • the inter-eye distance E is the distance between the user's left eye and right eye.
  • the inter-eye distance E may be a value calculated from the position of the user's eyes or may be a preset value. When set in advance, the inter-eye distance E may be set to, for example, a value of 61.1 mm to 64.4 mm, a range calculated by research of the National Institute of Advanced Industrial Science and Technology.
  • Hp is the horizontal length of the sub-pixel as shown in FIG.
  • the barrier 6 may be composed of a member having the second transmittance.
  • the barrier 6 may be composed of, for example, a film or a plate-shaped member.
  • the light shielding surface 61 is made of a film or a plate member.
  • the translucent area 62 is composed of an opening provided in the film or the plate member.
  • the film is made of, for example, resin, but is not limited to this.
  • the plate member is made of, for example, resin or metal, but is not limited to this.
  • the barrier 6 may be made of a light-shielding base material, or may be made of a base material containing a light-shielding additive.
  • the barrier 6 may be composed of a liquid crystal shutter.
  • the liquid crystal shutter can control the light transmittance according to the applied voltage.
  • the liquid crystal shutter may include a plurality of pixels and may control the light transmittance of each pixel.
  • the liquid crystal shutter can form a region having a high light transmittance or a region having a low light transmittance in an arbitrary shape.
  • the light transmitting region 62 may be a region having the first transmittance.
  • the light shielding surface 61 may be a region having the second transmittance.
  • the barrier 6 having the above-described configuration allows the image light emitted from a part of the sub-pixels of the active area 51 to pass through the light-transmitting region 62 and propagate to the right eye of the user.
  • the barrier 6 can transmit the image light emitted from some other sub-pixels through the translucent region 62 and propagate to the left eye of the user.
  • An image visually recognized by the user's eye by propagating the image light to each of the user's left and right eyes will be described in detail with reference to FIGS. 4 and 5.
  • the left visible region 51aL shown in FIG. 4 is, as described above, the region of the active area 51 visually recognized by the left eye of the user when the image light transmitted through the light-transmitting region 62 of the barrier 6 reaches the left eye of the user.
  • the left invisible area 51bL is an area that the left eye of the user cannot visually recognize because the image light is blocked by the light blocking surface 61 of the barrier 6.
  • the left visible region 51aL includes half of the sub-pixel P1, all of the sub-pixels P2 to P6, and half of the sub-pixel P7.
  • the right visible region 51aR shown in FIG. 5 is the region of the active area 51 visually recognized by the right eye of the user when the image light from some other sub-pixels, transmitted through the light-transmitting region 62 of the barrier 6, reaches the right eye of the user.
  • the right invisible region 51bR is a region that the right eye of the user cannot visually recognize because the image light is blocked by the light blocking surface 61 of the barrier 6.
  • the right visible region 51aR includes half of the sub-pixel P7, all of the sub-pixels P8 to P12, and half of the sub-pixel P1.
  • the left eye and the right eye each visually recognize the image.
  • the left-eye image and the right-eye image are parallax images having a parallax with each other.
  • the left eye visually recognizes half of the left-eye image displayed in the sub-pixel P1, the whole of the left-eye images displayed in the sub-pixels P2 to P6, and half of the right-eye image displayed in the sub-pixel P7.
  • the right eye visually recognizes half of the right-eye image displayed in the sub-pixel P7, the whole of the right-eye images displayed in the sub-pixels P8 to P12, and half of the left-eye image displayed in the sub-pixel P1.
  • the sub-pixel displaying the left-eye image is labeled with “L”
  • the sub-pixel displaying the right-eye image is labeled with “R”.
  • the area of the left eye image that the user's left eye visually recognizes is the maximum, and the area of the right eye image is the minimum.
  • the area of the right-eye image visually recognized by the right eye of the user is maximum, and the area of the left-eye image is minimum.
  • the fact that the left eye of the user visually recognizes the right eye image or the right eye of the user visually recognizes the left eye image is also referred to as crosstalk. The user can visually recognize the three-dimensional image with the crosstalk reduced.
  • when the left-eye image and the right-eye image having a parallax with each other are displayed in the sub-pixels included in the left visible region 51aL and the right visible region 51aR respectively, the user located at the proper viewing distance d can visually recognize the image displayed on the display panel 5 as a three-dimensional image.
  • the left-eye image is displayed in the sub-pixels that are more than half visible by the left eye
  • the right-eye image is displayed in the sub-pixels that are more than half visible by the right eye.
  • the sub-pixels for displaying the left-eye image and the right-eye image are not limited to this, and may be appropriately determined based on the left visible region 51aL and the right visible region 51aR so as to reduce crosstalk according to the design of the active area 51, the barrier 6, and the like. For example, depending on the aperture ratio of the barrier 6 or the like, the left-eye image may be displayed on the sub-pixels viewed by the left eye at a predetermined rate or higher, and the right-eye image may be displayed on the sub-pixels viewed by the right eye at a predetermined rate or higher.
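The threshold rule just described can be sketched concretely. This is a hedged illustration, not the disclosed implementation: the function name, the visible-fraction values, and the 0.5 threshold are assumptions chosen to match the FIG. 4/5 example.

```python
# Sketch: display the left-eye image on sub-pixels whose area lies inside the
# left visible region at or above a threshold ratio, the right-eye image
# otherwise.

def assign_images(left_fraction: list, threshold: float = 0.5) -> list:
    """left_fraction[i] = fraction of sub-pixel P(i+1) inside the left
    visible region; returns 'L'/'R' labels for P1..P(2n)."""
    return ['L' if f >= threshold else 'R' for f in left_fraction]

# Illustrative fractions loosely following FIG. 4: P1 mostly visible to the
# left eye, P2-P6 fully visible, P7 mostly visible to the right eye.
fractions = [0.6, 1, 1, 1, 1, 1, 0.4, 0, 0, 0, 0, 0]
print(assign_images(fractions))  # ['L', 'L', 'L', 'L', 'L', 'L', 'R', ...]
```

Raising the threshold trades away brightness of each view in exchange for lower crosstalk, which mirrors the aperture-ratio trade-off mentioned above.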
  • the controller 7 is connected to each component of the three-dimensional display system 100 and can control each component.
  • the controller 7 is configured as a processor, for example.
  • the controller 7 may include one or more processors.
  • the processor may include a general-purpose processor that loads a specific program and executes a specific function, and a dedicated processor that is specialized for a specific process.
  • the dedicated processor may include an application-specific integrated circuit (ASIC: Application Specific Integrated Circuit).
  • the processor may include a programmable logic device (PLD: Programmable Logic Device).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the controller 7 may be a SoC (System-on-a-Chip) or a SiP (System In a Package) in which one or a plurality of processors cooperate.
  • the controller 7 may include a storage unit, and the storage unit may store various kinds of information, a program for operating each component of the three-dimensional display system 100, or the like.
  • the storage unit may be composed of, for example, a semiconductor memory.
  • the storage unit may function as a work memory of the controller 7.
  • The control of each component of the three-dimensional display system 100 by the controller 7 will be described below.
  • the controller 7 determines the left visible region 51aL and the right visible region 51aR in the active area 51 of the display panel 5 based on the positions of at least one of the left eye and the right eye of the user.
  • the controller 7 may determine that the right eye of the user is located at a position moved horizontally from the position of the left eye by a predetermined inter-eye distance E, based on the position of the left eye of the user.
  • the controller may determine the left visible region 51aL and the right visible region 51aR so that the image light that has passed through each transparent region 62 reaches the left eye and the right eye of the user.
  • the left visible region 51aL and the right visible region 51aR in one translucent region 62 have horizontal lengths of n subpixels. Therefore, as shown in FIG. 1, the left visible region 51aL and the right visible region 51aR do not overlap each other and are arranged alternately in the horizontal direction on the display surface of the display panel 5.
  • the controller 7 determines that the sub-pixel included in the left visible region 51aL is the left sub-pixel.
  • the left sub-pixel is, for example, a sub-pixel that includes a predetermined proportion or more in the left visible region 51aL.
  • the left subpixel is also referred to as a first subpixel or a first display area.
  • the controller 7 determines that the sub-pixel included in the right visible region 51aR is the right sub-pixel.
  • the right sub-pixel is, for example, a sub-pixel including a predetermined ratio or more in the right visible region 51aR.
  • the right subpixel is also referred to as a second subpixel or a second display area. As shown in FIG. 1, the left sub-pixels and the right sub-pixels do not overlap each other and are alternately arranged in the horizontal direction on the display surface of the display panel 5.
  • FIG. 6 is a diagram showing sub-pixels visually recognized by the eyes of the user located at the proper viewing distance d by the image light that has passed through one light-transmitting area 62a.
  • the sub-pixel group Pg will be described as including 12 sub-pixels P1 to P12 arranged continuously in the horizontal direction.
  • FIG. 6 shows a visual recognition region 70 in which the right eye or the left eye of the user, who is located away from the barrier 6 by the appropriate viewing distance d, can visually recognize a predetermined subpixel.
  • in each visual recognition region 70, six subpixels that are continuous in the horizontal direction are visually recognized.
  • the range that the left and right eyes of the user can visually recognize through the translucent region 62a is represented by broken lines.
  • when the left eye of the user is at the position L1 included in the visual recognition area 70A, the left eye visually recognizes the sub-pixels P1 to P6 through the light-transmitting area 62a.
  • when the position of the left eye of the user changes, the sub-pixels visually recognized by the left eye of the user also change.
  • when the left eye of the user is at the position L2 included in the visual recognition area 70B, the left eye visually recognizes the sub-pixels P2 to P7 via the light-transmitting area 62a.
  • Subpixels visually recognized by the user's eyes in adjacent visual recognition areas 70 have a difference of one subpixel.
  • in the three-dimensional display system 100, the barrier pitch Bp and the gap g are defined so that, of the 2×n subpixels of the subpixel group Pg arranged in the horizontal direction, n different subpixels are visually recognized by each of the left eye and the right eye of the user at the proper viewing distance d. That is, the three-dimensional display system 100 is configured so that the sub-pixel regions visually recognized by the left and right eyes of the user located at the proper viewing distance d differ by n sub-pixels.
  • therefore, in FIG. 6, the right eye, located at the position R1 included in the visual recognition region 70C, visually recognizes the sub-pixels P7 to P12.
  • the controller 7 sets the sub-pixels P1 to P6 as left sub-pixels and displays a left-eye image visually recognized by the left eye of the user.
  • the controller 7 sets the sub-pixels P7 to P12 as right sub-pixels and displays a right-eye image visually recognized by the right eye of the user.
  • the inter-eye distance E, which is the distance between the left eye and the right eye of the user, corresponds to the width of n visual recognition areas 70. That is, the width of one visual recognition area 70 is E/n.
  • the controller 7 may change the sub-pixels displaying the right-eye image or the left-eye image according to the position of the user's eye acquired by the acquisition unit 3. For example, suppose that the user moves in the horizontal direction at the proper viewing distance d and the left eye of the user moves from the position L1 to the position L2. At this time, the controller 7 determines, from the position of the left eye of the user, that the left eye is located in the visual recognition area 70B for visually recognizing the sub-pixels P2 to P7. The controller 7 determines that the right eye, which is separated from the left eye of the user by the inter-eye distance E, is located in the visual recognition area 70D for visually recognizing the sub-pixels P8 to P12 and P1.
  • the controller 7 sets the sub-pixels P2 to P7 as left sub-pixels and the sub-pixels P8 to P12 and P1 as right sub-pixels. As a result, the user can visually recognize the three-dimensional image in the state where the crosstalk is reduced.
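The region-by-region shift of the left and right sub-pixel sets described above can be sketched in a few lines. This is a hypothetical illustration, not the patent's implementation; the function name, the 12-sub-pixel group Pg with n = 6 regions per eye, and the example inter-eye distance of 62 mm are all assumptions made for the sketch.

```python
def assign_subpixels(left_eye_x, eye_distance_e, n=6, group_size=12):
    """Return (left, right) sub-pixel index lists (1-based, P1..P12).

    Each visual recognition region 70 has width E/n; each time the left eye
    crosses into an adjacent region, both sets shift by one sub-pixel.
    """
    region_width = eye_distance_e / n
    shift = int(left_eye_x // region_width)   # regions crossed by the left eye
    left = [(shift + i) % group_size + 1 for i in range(n)]
    right = [(shift + n + i) % group_size + 1 for i in range(n)]
    return left, right

# At position L1 (shift 0): left eye sees P1-P6, right eye sees P7-P12.
# One region over (L1 -> L2): left becomes P2-P7, right becomes P8-P12 and P1.
```

The wrap-around via the modulo mirrors how, in the example above, the right sub-pixels become P8 to P12 and P1 after the eye moves by one region.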
  • the controller 7 sets the width of each of the left subpixel and the right subpixel based on the observation distance.
  • the left and right eyes of the user are at the position L7-2 and the position R7-2, which are separated from the barrier 6 by the observation distance Y1.
  • the position L7-2 and the position R7-2 are positions closer to the barrier 6 along the depth direction than the positions L7-1 and R7-1, which are separated from the barrier 6 by the suitable viewing distance d.
  • the image light emitted from the display panel 5 and passing through the translucent region 62a travels along the optical path 71A and the optical path 71B to reach the left eye at the position L7-2 and the right eye at the position R7-2, respectively.
  • the optical path 71A passes through the position L7-1' included in the visual recognition area 70E at the proper viewing distance d.
  • the visual recognition region 70E corresponds to a region in which the sub-pixels P12 and P1 to P5 can be visually recognized in the plane separated from the barrier 6 by the appropriate visual distance d.
  • the fact that the optical path 71A intersects the visual recognition region 70E at the position L7-1' means that the left eye can visually recognize the sub-pixels P12 and P1 to P5. That is, even when the left eye is at the position L7-2, the controller 7 can identify the sub-pixels that the left eye can visually recognize by calculating which visual recognition region 70 the optical path 71A intersects. The left eye at the position L7-2 can see the sub-pixels P12 and P1 to P5.
  • the optical path 71B passes through the position R7-1 included in the visual recognition region 70C at the proper viewing distance d.
  • the visual recognition region 70C corresponds to a region in which the sub-pixels P7 to P12 can be visually recognized within a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the right eye at the position R7-2 can see the sub-pixels P7 to P12.
  • the left eye at the position L7-2 and the right eye at the position R7-2 both see the sub-pixel P12.
  • the sub-pixel P12 is represented by hatching.
  • the controller 7 sets the sub-pixels P12 and P1 to P5 as left sub-pixels and the sub-pixels P7 to P12 as right sub-pixels.
  • the controller 7 may set a sub pixel that is both a right sub pixel and a left sub pixel, such as the sub pixel P12, as the third sub pixel.
  • the ratio of the observation distance Y1 from the barrier 6 to the proper viewing distance d corresponds to the ratio of the distance between the position L7-2 and the position R7-2 to the distance between the position L7-1′ and the position R7-1.
  • the image visually recognized at the observation distance Y1 by the left and right eyes of the user, which are separated by the inter-eye distance E, can be regarded as the image visually recognized at the proper viewing distance d by left and right eyes whose separation is longer than the inter-eye distance E by the width of one visual recognition region 70, that is, by E/n.
  • FIG. 8 shows the sub-pixels visually recognized by the eyes of the user when the left and right eyes of the user are at the position L8-2 and the position R8-2, which are separated from the barrier 6 by the observation distance d/2.
  • the position L8-2 and the position R8-2 are positions closer to the barrier 6 along the depth direction from the position L8-1 and the position R8-1 at the proper viewing distance d.
  • the observation distance d/2 is half the suitable viewing distance d.
  • the left eye of the user at the position L8-2 visually recognizes the image light emitted from the display panel 5 and traveling along the optical path 71A.
  • the optical path 71A passes through the position L8-1' included in the visual recognition region 70F at the proper viewing distance d.
  • the visible region 70F corresponds to a region in which the sub-pixels P7 to P12 can be visually recognized in a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the left eye at the position L8-2 can visually recognize the sub-pixels P7 to P12.
  • the right eye of the user at the position R8-2 visually recognizes the image light emitted from the display panel 5 and traveling along the optical path 71B.
  • the optical path 71B passes through the position R8-1 included in the visual recognition area 70C at the proper viewing distance d.
  • the visual recognition region 70C corresponds to a region in which the sub-pixels P7 to P12 can be visually recognized within a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the right eye located at the position R8-2 can visually recognize the sub-pixels P7 to P12.
  • the left eye at the position L8-2 and the right eye at the position R8-2 both see the sub-pixels P7 to P12.
  • when the observation distance from the barrier 6 is d/2, the area of the image visually recognized by the left eye and the right eye of the user, which are separated by the inter-eye distance E, corresponds to the area of the image visually recognized at the proper viewing distance d by left and right eyes separated by the inter-eye distance 2E.
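The similar-triangle relation implied above can be checked numerically. The values below are purely illustrative (the patent gives no concrete figures), and the function is our own reading of the geometry: eyes separated by E at observation distance Y see the same display area as eyes separated by E·d/Y at the proper viewing distance d.

```python
def effective_eye_distance(e, d, y):
    """Eye separation at the proper viewing distance d that sees the same
    sub-pixel area as eyes separated by e at observation distance y."""
    return e * d / y

E, d = 62.0, 600.0   # assumed inter-eye distance and proper viewing distance (mm)
assert effective_eye_distance(E, d, d) == E           # at d: unchanged
assert effective_eye_distance(E, d, d / 2) == 2 * E   # at d/2: acts like 2E
assert effective_eye_distance(E, d, d / 3) == 3 * E   # at d/3: acts like 3E
```

The d/2 and d/3 cases reproduce the 2E and 3E correspondences stated in the text.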
  • the controller 7 may set the number of sub-pixels forming the one-eye image to twice the number of sub-pixels forming the one-eye image at the proper viewing distance d.
  • the controller 7 changes the number of horizontally continuous sub-pixels included in the sub-pixel group Pg from the 12 used at the proper viewing distance d to 24.
  • the changed sub-pixel group Pg includes 24 sub-pixels P1 to P24.
  • the left eye of the user at the position L8-2 which is separated from the barrier 6 by the observation distance d/2, visually recognizes the sub-pixels P7 to P12 through the translucent area 62a.
  • the right eye of the user at the position R8-2 views the sub-pixels P19 to P24 through the translucent area 62a.
  • the controller 7 sets the sub-pixels P1 to P12 as left sub-pixels and displays the left-eye image visually recognized by the left eye of the user.
  • the controller 7 sets the sub-pixels P13 to P24 as right sub-pixels and displays a right-eye image visually recognized by the right eye of the user.
  • the user can visually recognize the three-dimensional image in the state where the crosstalk is reduced.
  • when the observation distance from the barrier 6 is d/3, the regions of the images visually recognized by the left and right eyes of the user, which are separated by the inter-eye distance E, correspond to the regions of the images visually recognized at the proper viewing distance d by left and right eyes separated by the inter-eye distance 3E.
  • the controller 7 therefore triples the width of each of the right sub-pixel and the left sub-pixel when the observation distance of the user is d/3. Thereby, the controller 7 performs the same control as when the left eye and the right eye of the user are located at the proper viewing distance d, allowing the user to visually recognize the three-dimensional image with reduced crosstalk.
  • the controller 7 may widen the width of each of the right sub-pixel and the left sub-pixel when the observation distance of the user is shorter than the proper viewing distance d and falls below a predetermined distance. Specifically, when the observation distance of the user is shorter than 1/n of the proper viewing distance, the controller 7 may widen the width of each of the right subpixel and the left subpixel to n times the width of each of the right subpixel and the left subpixel at the proper viewing distance d. The controller 7 may calculate n satisfying the following expression (5) based on the observation distance Y of the user and the proper viewing distance d, and widen the width of each of the right subpixel and the left subpixel by n times. n may be a natural number of 2 or more. d/n ≥ Y > d/(n+1) … Expression (5)
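The choice of n in expression (5) reduces to a floor division, as the following sketch shows. The function name and the example proper viewing distance are our own; only the inequality d/n ≥ Y > d/(n+1) comes from the text.

```python
def width_multiplier(y, d):
    """Return n satisfying d/n >= y > d/(n+1); n = 1 means widths unchanged.

    Since the inequality is equivalent to d/y - 1 < n <= d/y, the unique
    solution is floor(d / y), clamped to 1 for observation distances above d.
    """
    return max(int(d // y), 1)

d = 600.0  # assumed proper viewing distance (illustrative value, mm)
# Y > d/2 -> n = 1 (unchanged); d/3 < Y <= d/2 -> n = 2; d/4 < Y <= d/3 -> n = 3
```

The n = 1, 2, 3 cases match the behavior described in the following three bullets.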
  • when n is 1, that is, when the observation distance Y is greater than d/2, the controller 7 does not change the width of the right subpixel and the width of the left subpixel from the widths at the proper viewing distance d.
  • when n is 2, that is, when the observation distance is equal to or less than d/2 and greater than d/3, the controller 7 doubles the width of the right sub-pixel and the width of the left sub-pixel relative to the widths at the proper viewing distance d.
  • when n is 3, that is, when the observation distance is equal to or less than d/3 and greater than d/4, the controller 7 triples the width of the right sub-pixel and the width of the left sub-pixel relative to the widths at the proper viewing distance d. In this way, the controller 7 changes the width of the right subpixel and the width of the left subpixel according to the observation distance Y.
  • when widening the widths of the right subpixels and the left subpixels by n times, the controller 7 may widen the widths of all the right subpixels and the left subpixels by n times, or may widen the widths of only some of them by n times.
  • the controller 7 may make the width of some of the right subpixels and the left subpixels that are widened by n times different from the width of the other right subpixels and the left subpixels. That is, when the observation distance of the user is shorter than 1/n of the proper viewing distance, the controller 7 may widen a part of the right subpixels and the left subpixels to n times the width of each of the right subpixel and the left subpixel at the proper viewing distance d, and may widen the remaining right subpixels and left subpixels to m times that width. m may be any number different from n.
  • the controller 7 may be configured to display the mixed image in the active area 51. Specifically, the controller 7 displays the left-eye image on the sub-pixel that is the left sub-pixel and not the right sub-pixel. The controller 7 displays the right-eye image on the sub-pixel that is the right sub-pixel and not the left sub-pixel. The controller 7 may display the third image on the third subpixel when the third subpixel that is the left subpixel and the right subpixel is present.
  • the controller 7 may be configured to set the brightness value of the third image displayed in the third sub-pixel to a predetermined value or less.
  • the controller 7 may display a black image as the third image.
  • the black image is an image having a predetermined luminance corresponding to black.
  • the predetermined luminance can be the luminance of the lowest gradation among the gradation levels displayable by the sub-pixels, or the luminance of a gradation equivalent thereto.
  • the controller 7 may display an average image having a luminance value that is an average value of the luminance values of the left-eye image and the right-eye image as the third image in the third subpixel.
  • the controller 7 may display, in the third subpixel, a left-eye image or a right-eye image as the third image based on the characteristics of the user.
  • the characteristic of the user is, for example, a characteristic regarding the dominant eye of the user.
  • the controller 7 may display the left-eye image or the right-eye image corresponding to the dominant eye based on information indicating the dominant eye of the user that is set in advance or input from the outside.
  • the controller 7 may display the left-eye image as the third image when the dominant eye of the user is the left eye, and may display the right-eye image as the third image when the dominant eye of the user is the right eye.
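The three options for the third image described above (black image, average image, dominant-eye image) can be sketched as a per-pixel luminance choice. The function name, the mode strings, and the integer-average blend are our own illustrative assumptions; the patent only states which image may be shown.

```python
def third_image_pixel(left_px, right_px, mode, dominant_eye="left"):
    """Choose the luminance shown on a third sub-pixel (seen by both eyes)."""
    if mode == "black":
        return 0                              # lowest displayable gradation
    if mode == "average":
        return (left_px + right_px) // 2      # mean of left/right luminance
    if mode == "dominant":
        return left_px if dominant_eye == "left" else right_px
    raise ValueError(f"unknown mode: {mode}")
```

For example, with left luminance 200 and right luminance 100, "average" yields 150 and "dominant" with a right-dominant user yields 100.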
  • Step S101 The controller 7 acquires the position of at least one of the first eye (for example, the left eye) and the second eye (for example, the right eye) of the user from the detection device 1.
  • Step S102 The controller 7 calculates the observation distance between the user's eye position and the barrier 6 from the acquired information on the user's eye position.
  • Step S103 The controller 7 determines the width of each of the left subpixel and the right subpixel based on the viewing distance of the user. When the user's viewing distance is shorter than the optimum viewing distance, the controller 7 sets the width of each of the left subpixel and the right subpixel according to the viewing distance.
  • Step S104 The controller 7 determines a subpixel that is a left subpixel and a subpixel that is a right subpixel based on the position of the user's eye.
  • the controller 7 may determine a sub-pixel that is both a left sub-pixel and a right sub-pixel to be a third sub-pixel.
  • Step S105 The controller 7 displays the left-eye image on the sub-pixel which is the left sub-pixel.
  • Step S106 The controller 7 displays the right-eye image on the sub-pixel which is the right sub-pixel.
  • Step S107 The controller 7 displays the third image on the third subpixel when there is a third subpixel which is a left subpixel and a right subpixel.
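Steps S101 to S107 above can be summarized as one control pass. Everything below is an illustrative skeleton of our own: the EyePosition stand-in, the region geometry, and the numeric values are assumptions, and in this simplified geometry the left and right sets never overlap, whereas the real device can produce third sub-pixels at intermediate distances (step S107).

```python
from dataclasses import dataclass

@dataclass
class EyePosition:
    x: float                    # horizontal eye position (mm)
    distance_to_barrier: float  # observation distance Y (mm)

def control_pass(eye, d_proper, e_eye):
    """Return (left, right, third) sets of 1-based sub-pixel indices."""
    y = eye.distance_to_barrier                 # S102: observation distance
    n_mul = max(int(d_proper // y), 1)          # S103: width multiplier n
    group = 12 * n_mul                          # widened sub-pixel group Pg
    half = group // 2
    region_w = e_eye / 6                        # width of one region 70
    shift = int(eye.x // region_w)              # S104: classify sub-pixels
    left = {(shift + i) % group + 1 for i in range(half)}
    right = {(shift + half + i) % group + 1 for i in range(half)}
    third = left & right                        # S104: both-eye sub-pixels
    # S105-S107: show left image, right image, and third image (if any)
    return left - third, right - third, third
```

At the proper viewing distance this yields P1-P6 left and P7-P12 right; at half that distance the group doubles to 24 sub-pixels, matching the d/2 example in the text.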
  • the three-dimensional display device 2 includes the display panel 5, optical elements such as the barrier 6, the acquisition unit 3, and the controller 7.
  • the display panel 5 is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the optical element is configured to define the light ray direction of the image light emitted from the display panel 5.
  • the acquisition unit 3 is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the display panel 5 includes a first display area configured to display a first image visually recognized by the first eye of the user, and a second display area configured to display a second image visually recognized by the second eye of the user. The first display areas and the second display areas are alternately arranged on the display panel 5.
  • the optical element includes a first light-transmitting region configured to transmit the image light with a first transmittance and a second light-transmitting region configured to transmit the image light with a second transmittance.
  • the first light transmissive regions and the second light transmissive regions are alternately arranged in the optical element.
  • the controller 7 is configured to set the width of each of the first display area and the second display area based on the observation distance. With this configuration, the three-dimensional display device 2 can adjust the image displayed on the display panel 5 according to the observation distance of the user.
  • the three-dimensional display device 2 can control the images visually recognized by the left and right eyes of the user so that crosstalk is reduced when the observation distance of the user is shorter than the proper viewing distance. Therefore, the three-dimensional display device 2 can allow the user to appropriately visually recognize the three-dimensional image regardless of changes in the distance to the user.
  • the controller 7 can widen the width of each of the first display area and the second display area when the observation distance becomes shorter than the first distance. With such a configuration, when the observation distance of the user is shorter than the proper viewing distance, the controller 7 widens the images visually recognized by the left and right eyes of the user, so that the crosstalk of those images can be reduced.
  • when the observation distance is shorter than 1/n of the proper viewing distance (n being a predetermined number of 2 or more), the controller 7 can widen the width of each of the first display region and the second display region to n times the width of each region at the proper viewing distance.
  • the controller 7 determines the observation distance of the user and, when it is shorter than the proper viewing distance, can perform the same control as when the user is at the proper viewing distance. As a result, an increase in the amount of calculation processing and the amount of data handled by the controller 7 in response to changes in the observation distance of the user can be reduced.
  • the controller 7 can make the width of a part of the first display areas and the second display areas that are widened by n times different from the width of the other first and second display areas. With such a configuration, the controller 7 can reduce, by adjusting the width of some display areas, the crosstalk of the images visually recognized by the left and right eyes of the user that may occur when the display areas are simply widened by n times.
  • the optical element is the barrier 6, but the optical element is not limited to this.
  • the optical element included in the three-dimensional display device 2 may be a lenticular lens 91.
  • the lenticular lens 91 is configured by arranging a plurality of vertically extending cylindrical lenses 92 in the horizontal direction.
  • the lenticular lens 91 can propagate the image light emitted from the sub-pixel of the left visible region 51aL so as to reach the position of the left eye of the user.
  • the lenticular lens 91 can propagate the image light emitted from the sub-pixel of the right visible region 51aR so as to reach the position of the right eye of the user.
  • the three-dimensional display system 100 has been described with the three-dimensional display device 2 and the detection device 1 as separate bodies, but the present invention is not limited to this.
  • the three-dimensional display device 2 may include the function provided by the detection device 1.
  • the three-dimensional display device 2 can detect the position of at least one of the left eye and the right eye of the user.
  • the three-dimensional display system 100 can be mounted on the head-up display system 400.
  • the head-up display system 400 is also referred to as a HUD (Head Up Display) 400.
  • the HUD 400 includes the three-dimensional display system 100, an optical member 410, and a projected member 420 having a projected surface 430.
  • the HUD 400 is configured to cause the image light emitted from the three-dimensional display system 100 to reach the projection target member 420 via the optical member 410.
  • the HUD 400 is configured to cause the image light reflected by the projection target member 420 to reach the left and right eyes of the user.
  • the HUD 400 can cause image light to travel from the three-dimensional display system 100 to the left and right eyes of the user along the optical path 440 indicated by the broken line.
  • the user can visually recognize the image light reaching the eyes along the optical path 440 as a virtual image 450.
  • the HUD 400 including the three-dimensional display system 100 may be mounted on the mobile body 10.
  • Part of the configuration of the HUD 400 may be shared with other devices or parts included in the moving body 10.
  • For example, the moving body 10 may use the windshield as the projection target member 420.
  • When part of the configuration of the HUD 400 is shared in this way, the remaining configuration may be referred to as a HUD module or a three-dimensional display component.
  • the HUD 400 and the three-dimensional display system 100 may be mounted on the moving body 10.
  • the “moving body” in the present disclosure includes a vehicle, a ship, and an aircraft.
  • the “vehicle” in the present disclosure includes, but is not limited to, automobiles and industrial vehicles, and may include railway vehicles, vehicles for daily living, and fixed-wing aircraft that travel on runways.
  • Vehicles include, but are not limited to, passenger cars, trucks, buses, motorcycles, and trolleybuses, and may include other vehicles traveling on roads.
  • Industrial vehicles include industrial vehicles for agriculture and construction.
  • Industrial vehicles include, but are not limited to, forklifts and golf carts.
  • Industrial vehicles for agriculture include, but are not limited to, tractors, tillers, transplanters, binders, combines, and lawn mowers.
  • Industrial vehicles for construction include, but are not limited to, bulldozers, scrapers, excavators, mobile cranes, dump trucks, and road rollers.
  • Vehicles include those that are driven manually.
  • the vehicle classification is not limited to the above.
  • an automobile may include an industrial vehicle that can travel on a road, and the same vehicle may be included in multiple classifications.
  • Ships in the present disclosure include personal watercraft (marine jets), boats, and tankers.
  • the aircraft in the present disclosure includes a fixed-wing aircraft and a rotary-wing aircraft.
  • Reference signs: 1 Detection device; 2 Three-dimensional display device; 3 Acquisition unit; 4 Illuminator; 5 Display panel; 6 Barrier; 7 Controller; 10 Moving body; 51 Active area; 51aL Left visible area; 51aR Right visible area; 51bL Left invisible area; 51bR Right invisible area; 51aLR Binocular visible area; 61 Light-shielding surface; 62 Translucent area; 70 Visual recognition area; 71A Optical path; 71B Optical path; 91 Lenticular lens; 92 Cylindrical lens; 100 Three-dimensional display system; 400 Head-up display system; 410 Optical member; 420 Projection target member; 430 Projection surface; 440 Optical path; 450 Virtual image

Abstract

According to the invention, a three-dimensional display device (2) is provided with a display panel (5), an optical element, an acquisition unit (3), and a controller (7). The display panel (5) is configured to display a mixed image including a first image and a second image. The optical element is configured to define a light-ray direction of image light emitted from the display panel (5). The acquisition unit (3) is configured to acquire the position of a first eye and/or a second eye of a user. The display panel (5) contains a first display region and a second display region arranged alternately on a display surface of the display panel (5), the first display region being configured to display a first image to be visually recognized by the first eye of the user, and the second display region being configured to display a second image to be visually recognized by the second eye of the user. If the distance between the first and/or second eye of the user and the optical element is less than a proper viewing distance, the controller (7) is configured to set the width of each of the first display region and the second display region according to an observation distance.
PCT/JP2019/049677 2018-12-21 2019-12-18 Dispositif d'affichage tridimensionnel, système d'affichage tête haute et corps mobile WO2020130049A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018240072A JP2020102772A (ja) 2018-12-21 2018-12-21 3次元表示装置、ヘッドアップディスプレイシステム、及び移動体
JP2018-240072 2018-12-21

Publications (1)

Publication Number Publication Date
WO2020130049A1 true WO2020130049A1 (fr) 2020-06-25

Family

ID=71101847

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/049677 WO2020130049A1 (fr) 2018-12-21 2019-12-18 Dispositif d'affichage tridimensionnel, système d'affichage tête haute et corps mobile

Country Status (2)

Country Link
JP (1) JP2020102772A (fr)
WO (1) WO2020130049A1 (fr)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012131887A1 (fr) * 2011-03-29 2012-10-04 株式会社 東芝 Dispositif d'affichage d'image tridimensionnelle
WO2015132828A1 (fr) * 2014-03-06 2015-09-11 パナソニックIpマネジメント株式会社 Procédé d'affichage d'image et appareil d'affichage d'image


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023199765A1 (fr) * 2022-04-12 2023-10-19 公立大学法人大阪 Dispositif d'affichage stéréoscopique
CN116074486A (zh) * 2023-03-21 2023-05-05 北京光谱印宝科技有限责任公司 裸眼3d显示装置
CN116074486B (zh) * 2023-03-21 2023-07-25 北京光谱印宝科技有限责任公司 裸眼3d显示装置

Also Published As

Publication number Publication date
JP2020102772A (ja) 2020-07-02

Similar Documents

Publication Publication Date Title
JP6924637B2 (ja) 3次元表示装置、3次元表示システム、移動体、および3次元表示方法
JP7129789B2 (ja) ヘッドアップディスプレイ、ヘッドアップディスプレイシステム、および移動体
US20200053352A1 (en) Three-dimensional display apparatus, three-dimensional display system, head-up display system, and mobile body
JP7188981B2 (ja) 3次元表示装置、3次元表示システム、ヘッドアップディスプレイ、及び移動体
WO2020130049A1 (fr) Dispositif d'affichage tridimensionnel, système d'affichage tête haute et corps mobile
JP7145214B2 (ja) 3次元表示装置、制御コントローラ、3次元表示方法、3次元表示システム、および移動体
CN114503556A (zh) 三维显示装置、控制器、三维显示方法、三维显示系统及移动体
WO2019225400A1 (fr) Dispositif d'affichage d'images, système d'affichage d'images, affichage tête haute, et objet mobile
JP7188888B2 (ja) 画像表示装置、ヘッドアップディスプレイシステム、および移動体
WO2020130048A1 (fr) Dispositif d'affichage tridimensionnel, système d'affichage tête haute et objet mobile
US11874464B2 (en) Head-up display, head-up display system, moving object, and method of designing head-up display
JP7336782B2 (ja) 3次元表示装置、3次元表示システム、ヘッドアップディスプレイ、及び移動体
CN114730096A (zh) 平视显示器系统以及移动体
WO2023228887A1 (fr) Dispositif d'affichage tridimensionnel, système d'affichage tête haute et corps mobile
WO2022149599A1 (fr) Dispositif d'affichage tridimensionnel
JP7475231B2 (ja) 3次元表示装置
WO2022163728A1 (fr) Dispositif d'affichage tridimensionnel
WO2021060011A1 (fr) Barrière de parallaxe, dispositif d'affichage en trois dimensions (3d), système d'affichage en 3d, afficheur tête haute, et corps mobile
EP4040787A1 (fr) Dispositif d'affichage tridimensionnel, système d'affichage tridimensionnel, et objet mobile
WO2021060012A1 (fr) Barrière de parallaxe, dispositif d'affichage tridimensionnel, système d'affichage tridimensionnel, affichage tête haute et objet mobile
CN116235241A (zh) 三维显示装置、三维显示方法、三维显示系统以及移动体
JP2021056255A (ja) パララックスバリア、3次元表示装置、3次元表示システム、ヘッドアップディスプレイ、および移動体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19898797

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19898797

Country of ref document: EP

Kind code of ref document: A1