WO2020130049A1 - Three-dimensional display device, head-up display system, and mobile body - Google Patents

Three-dimensional display device, head-up display system, and mobile body

Info

Publication number
WO2020130049A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
image
eye
user
sub
Prior art date
Application number
PCT/JP2019/049677
Other languages
French (fr)
Japanese (ja)
Inventor
Kaoru Kusafuka (草深 薫)
Hideya Takahashi (高橋 秀也)
Goro Hamagishi (濱岸 五郎)
Original Assignee
Kyocera Corporation (京セラ株式会社)
University Public Corporation Osaka (公立大学法人大阪)
Priority date
Filing date
Publication date
Application filed by Kyocera Corporation and University Public Corporation Osaka
Publication of WO2020130049A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/122 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H04N13/125 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues for crosstalk reduction
    • H04N13/30 Image reproducers
    • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H04N13/31 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • H04N13/317 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using slanted parallax optics
    • H04N13/346 Image reproducers using prisms or semi-transparent mirrors
    • H04N13/363 Image reproducers using image projection screens
    • H04N13/366 Image reproducers using viewer tracking

Definitions

  • the present disclosure relates to a three-dimensional display device, a head-up display system, and a moving body.
  • the three-dimensional display device of the present disclosure includes a display panel, an optical element, an acquisition unit, and a controller.
  • the display panel is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the optical element is configured to define a light ray direction of image light emitted from the display panel.
  • the acquisition unit is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the display panel includes a first display area configured to display the first image visually recognized by the first eye of the user, and a second display area configured to display the second image visually recognized by the second eye of the user.
  • the first display areas and the second display areas are alternately arranged on the display surface of the display panel.
  • the optical element includes a first light-transmissive region configured to transmit the image light at a first transmittance, and a second light-transmissive region configured to transmit the image light at a second transmittance.
  • the first light transmissive regions and the second light transmissive regions are alternately arranged in a plane of the optical element along the display surface of the display panel.
  • the controller is configured to set the width of each of the first display areas and the second display areas based on the observation distance.
  • the head-up display system of the present disclosure includes a display panel, an optical element, an acquisition unit, an optical member, and a controller.
  • the display panel is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the optical element is configured to define a light ray direction of image light emitted from the display panel.
  • the acquisition unit is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the optical member is configured to allow the user to visually recognize the image light emitted from the display panel as a virtual image.
  • the display panel includes a first display area configured to display the first image visually recognized by the first eye of the user, and a second display area configured to display the second image visually recognized by the second eye of the user.
  • the optical element includes a first light-transmissive region configured to transmit the image light at a first transmittance, and a second light-transmissive region configured to transmit the image light at a second transmittance.
  • the first light transmissive regions and the second light transmissive regions are alternately arranged in a plane of the optical element along the display surface of the display panel.
  • the mobile object of the present disclosure includes a head-up display system.
  • the head-up display system includes a display panel, an optical element, an acquisition unit, an optical member, and a controller.
  • the display panel is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the optical element is configured to define a light ray direction of image light emitted from the display panel.
  • the acquisition unit is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the optical member is configured to allow the user to visually recognize the image light emitted from the display panel as a virtual image.
  • the display panel includes a first display area configured to display the first image visually recognized by the first eye of the user, and a second display area configured to display the second image visually recognized by the second eye of the user.
  • the first display areas and the second display areas are alternately arranged on the display surface of the display panel.
  • the optical element includes a first light-transmissive region configured to transmit the image light at a first transmittance, and a second light-transmissive region configured to transmit the image light at a second transmittance.
  • the first light transmissive regions and the second light transmissive regions are alternately arranged in a plane of the optical element along the display surface of the display panel.
  • the controller is configured to set the width of each of the first display areas and the second display areas based on the observation distance.
  • FIG. 1 is a diagram showing an example of a three-dimensional display system according to an embodiment viewed from a vertical direction.
  • FIG. 2 is a diagram showing an example of the display panel shown in FIG. 1 viewed from the depth direction.
  • FIG. 3 is a diagram showing an example of the barrier shown in FIG. 1 viewed from the depth direction.
  • FIG. 4 is a diagram for explaining the left visible region in the display panel shown in FIG.
  • FIG. 5 is a diagram for explaining the right visible region in the display panel shown in FIG.
  • FIG. 6 is a schematic diagram showing the sub-pixels visually recognized by the left and right eyes of the user located at the proper viewing distance d.
  • FIG. 7 is a schematic diagram showing an example of sub-pixels that the left eye and the right eye of the user visually recognize when the observation distance Y is shorter than the proper viewing distance d.
  • FIG. 8 is a schematic diagram showing an example of sub-pixels visually recognized by the left and right eyes of a near user whose observation distance Y is d/2, half the proper viewing distance.
  • FIG. 9 is a schematic diagram showing another example of sub-pixels visually recognized by the left and right eyes of a near user whose observation distance Y is d/2, half the proper viewing distance.
  • FIG. 10 is a flowchart showing the processing of the three-dimensional display system according to the embodiment.
  • FIG. 11 is a schematic configuration diagram of a three-dimensional display device when the optical element is a lenticular lens.
  • FIG. 12 is a diagram showing an example of a head-up display system equipped with the three-dimensional display system according to this embodiment.
  • FIG. 13 is a diagram showing an example of a moving body equipped with the head-up display system.
  • An object of the present disclosure is to provide a three-dimensional display device, a head-up display system, and a moving body that allow a user to appropriately visually recognize a three-dimensional image.
  • a three-dimensional display system 100 is configured to include a detection device 1 and a three-dimensional display device 2 as shown in FIG.
  • the three-dimensional display system 100 displays an image on the display panel 5 of the three-dimensional display device 2. Part of the image light emitted from the display panel 5 is blocked by the barrier 6, so that different image lights reach the left and right eyes of the user.
  • the user can view the image stereoscopically because there is a parallax between the image viewed by the left eye and the image viewed by the right eye.
  • the three-dimensional display device 2 adjusts the width of the image displayed on the display panel 5 according to the distance between the user's eyes and the barrier 6 detected by the detection device 1.
  • the three-dimensional display system 100 can allow the user to appropriately visually recognize the three-dimensional image regardless of the change in the distance to the user.
  • Detecting device 1 is configured to detect the position of the user's eyes.
  • the detection device 1 may detect the position of at least one of the left eye and the right eye of the user.
  • one eye of the user is also referred to as a first eye.
  • the other eye of the user is also referred to as the second eye.
  • the left eye is the first eye and the right eye is the second eye, but they may be reversed.
  • the position of the user's eyes is represented by, for example, coordinates in a three-dimensional space, but is not limited to this.
  • the detection device 1 may include, for example, a camera.
  • the detection device 1 may capture a user's face with a camera.
  • the detection device 1 may detect the position of the eyes of the user from the captured image of the face of the user.
  • the detection device 1 may detect the position of the user's eyes as the coordinates of the three-dimensional space from the image captured by one camera.
  • the detection device 1 may detect the position of the user's eyes as coordinates in a three-dimensional space from images captured by two or more cameras.
  • the detection device 1 outputs the position of at least one of the left eye and the right eye of the user to the three-dimensional display device 2.
  • Detecting device 1 may not be equipped with a camera and may be connected to a camera outside the device.
  • the detection device 1 may include an input terminal for inputting an image pickup signal from a camera outside the device.
  • the camera outside the device may be directly connected to the input terminal.
  • a camera outside the device may be indirectly connected to the input terminal via a shared network.
  • the detection device 1 may detect the position of the eye of the user from the video signal input to the input terminal.
  • the detection device 1 may include a sensor, for example.
  • the sensor may be an ultrasonic sensor or an optical sensor.
  • the detection device 1 may detect the position of the user's head with a sensor, and may detect the position of the user's eye based on the position of the head.
  • the detection device 1 may detect the position of the user's eyes as coordinates in the three-dimensional space by using one or more sensors.
  • the three-dimensional display system 100 does not have to include the detection device 1.
  • the three-dimensional display device 2 may include an input terminal for inputting a signal from the detection device outside the system.
  • the detection device outside the system may be directly connected to the input terminal.
  • the detection device outside the system may be indirectly connected to the input terminal via a shared network.
  • the three-dimensional display device 2 may acquire the position of the user's eyes from a detection device outside the system.
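Where two or more cameras are used, the eye coordinates can be recovered by standard stereo triangulation. The sketch below is illustrative only and not part of the disclosure; the function name and all numeric values (focal length, baseline, pixel coordinates) are hypothetical, and a rectified camera pair is assumed.

```python
def triangulate_eye(x_left_px, x_right_px, y_px, focal_px, baseline_mm):
    """Estimate an eye's 3D position from a rectified stereo camera pair.

    x_left_px, x_right_px : horizontal pixel coordinate of the same eye
                            in the left and right camera images
    y_px                  : vertical pixel coordinate (equal in both
                            images after rectification)
    focal_px              : focal length expressed in pixels
    baseline_mm           : distance between the two camera centres
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("eye must lie in front of the camera pair")
    z = focal_px * baseline_mm / disparity  # depth from the cameras
    x = x_left_px * z / focal_px            # horizontal position
    y = y_px * z / focal_px                 # vertical position
    return (x, y, z)

# Illustrative values: 1000 px focal length, 120 mm baseline,
# 50 px disparity between the two images.
eye_xyz = triangulate_eye(400.0, 350.0, 100.0, 1000.0, 120.0)
```

A single-camera detection device would instead need a prior such as a known face model to recover depth, which is why the disclosure allows either one camera or two or more cameras.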
  • the three-dimensional display device 2 includes an acquisition unit 3, an illuminator 4, a display panel 5, a barrier 6, and a controller 7.
  • the acquisition unit 3 may be configured to acquire the position of at least one of the left eye and the right eye of the user detected by the detection device 1.
  • the acquisition unit 3 may include, for example, a communication module or the like.
  • the acquisition unit 3 may determine the distance between the user's eye and the barrier 6 from the acquired position of the user's eye.
  • the distance between the user's eyes and the barrier 6 may be the distance between the barrier 6 and at least one of the user's left and right eyes.
  • the distance between the user's eyes and the barrier 6 is also referred to as the user's observation distance.
  • the illuminator 4 can illuminate the display panel 5 in a plane.
  • the illuminator 4 may include a light source, a light guide plate, a diffusion plate, a diffusion sheet, and the like.
  • the illuminator 4 emits irradiation light from a light source, and uniformizes the irradiation light in the surface direction of the display panel 5 using a light guide plate, a diffusion plate, a diffusion sheet, or the like.
  • the illuminator 4 can emit uniformized light to the display panel 5.
  • the display panel 5 is, for example, a display panel such as a transmissive liquid crystal display panel, but is not limited to this. As shown in FIG. 2, the display panel 5 has a plurality of partitioned areas on a planar active area 51.
  • the active area 51 displays a mixed image.
  • the active area 51 is also referred to as a display surface.
  • the mixed image includes a left-eye image and a right-eye image having a parallax with respect to the left-eye image.
  • the left-eye image is also referred to as the first image.
  • the right eye image is also referred to as the second image.
  • the mixed image which will be described in detail later, may further include a third image.
  • the partitioned area is an area partitioned by the grid-like black matrix 52 in the first direction and the second direction orthogonal to the first direction.
  • the direction orthogonal to the first direction and the second direction is referred to as the third direction.
  • the first direction may be referred to as the horizontal direction.
  • the second direction may be referred to as the vertical direction.
  • the third direction may be referred to as the depth direction.
  • the first direction, the second direction, and the third direction are not limited to these.
  • the first direction is represented as the x-axis direction
  • the second direction is represented as the y-axis direction
  • the third direction is represented as the z-axis direction.
  • the active area 51 includes a plurality of sub-pixels arranged in a grid along the horizontal and vertical directions.
  • each sub-pixel corresponds to one of the colors R (Red), G (Green), and B (Blue), and one set of three sub-pixels R, G, and B can constitute one pixel. One pixel is also referred to as one picture element.
  • the horizontal direction is, for example, a direction in which a plurality of subpixels forming one pixel are arranged.
  • the vertical direction is, for example, a direction in which subpixels of the same color are arranged.
  • the display panel 5 is not limited to a transmissive liquid crystal panel, and may be another display panel such as an organic EL (Electro Luminescence). When the display panel 5 is a self-luminous display panel, the three-dimensional display device 2 does not need to include the illuminator 4.
  • a plurality of sub-pixels arranged in the active area 51 can form a sub-pixel group Pg.
  • the sub-pixel group Pg is a minimum unit in which the controller 7, which will be described later, performs control for displaying an image in the active area 51.
  • the controller 7 causes the plurality of sub-pixels included in one sub-pixel group Pg to display the left-eye image or the right-eye image.
  • the number of subpixels displaying the left-eye image and the number of subpixels displaying the right-eye image may be the same.
  • the sub-pixel group Pg may be repeatedly arranged in the horizontal direction.
  • the sub-pixel group Pg may be repeatedly arranged adjacent to a position vertically displaced by one sub-pixel in the vertical direction.
  • a sub-pixel group Pg including twelve sub-pixels P1 to P12 arranged continuously, one in the vertical direction and twelve in the horizontal direction, is arranged.
  • the sub-pixels P1 to P(2×n×b) included in all the sub-pixel groups Pg may be collectively controlled by the controller 7. For example, when switching the image displayed on the sub-pixel P1 from the left-eye image to the right-eye image, the controller 7 may simultaneously switch the image displayed on the sub-pixel P1 included in all the sub-pixel groups Pg from the left-eye image to the right-eye image. The controller 7 may change the number of sub-pixels included in the sub-pixel group Pg.
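The collective per-group control described above can be sketched as follows. The class and method names are hypothetical (not from the disclosure); n = 6 and b = 1 follow the twelve-sub-pixel example of FIG. 2.

```python
class SubPixelGroups:
    """Minimal model of per-group image control by the controller 7.

    Every sub-pixel group Pg contains 2*n*b sub-pixels (here n = 6,
    b = 1, i.e. P1..P12).  All groups share a single assignment list,
    so changing entry i switches sub-pixel P(i+1) in every group
    simultaneously, mirroring the collective control described above.
    (Class and method names are hypothetical.)
    """

    def __init__(self, n=6, b=1):
        self.size = 2 * n * b
        # Initially the first n*b sub-pixels show the left-eye image.
        self.assignment = ["L"] * (n * b) + ["R"] * (n * b)

    def switch(self, index, image):
        """Switch sub-pixel P(index+1) of every group to "L" or "R"."""
        self.assignment[index] = image

groups = SubPixelGroups()
groups.switch(0, "R")  # P1 now shows the right-eye image in all groups
```

Because a single list stands for every group, one `switch` call models the simultaneous update of the corresponding sub-pixel across the whole active area 51.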
  • the barrier 6 is formed by a flat surface along the active area 51, and is arranged apart from the active area 51 by a predetermined distance (gap) g.
  • the barrier 6 is, for example, a parallax barrier, but is not limited to this and may be any optical element.
  • the barrier 6 may be located on the opposite side of the illuminator 4 with respect to the display panel 5.
  • the barrier 6 may be configured to define the light ray direction of the image light emitted from the display panel 5. As shown in FIG. 3, the barrier 6 has a plurality of light blocking surfaces 61 that block image light.
  • the plurality of light shielding surfaces 61 define a light transmitting area 62 between the light shielding surfaces 61 adjacent to each other.
  • the light transmitting region 62 has a higher light transmittance than the light shielding surface 61.
  • the light shielding surface 61 has a lower light transmittance than the light transmitting area 62.
  • the light transmitting area 62 is also referred to as a first light transmitting area.
  • the light shielding surface 61 is also referred to as a second light transmitting area.
  • the light-transmitting area 62 is a portion for transmitting light incident on the barrier 6.
  • the translucent region 62 may transmit light at the first transmittance.
  • the first transmittance is, for example, about 100%, but is not limited to this, and may be a value in a range in which the image light emitted from the display panel 5 can be visually recognized well.
  • the first transmittance may be, for example, 80% or more, or 50% or more.
  • the light-shielding surface 61 is a portion that blocks light that enters the barrier 6 and hardly transmits it. That is, the light blocking surface 61 blocks the image displayed in the active area 51 of the display panel 5 from reaching the eyes of the user.
  • the light shielding surface 61 may transmit light at the second transmittance.
  • the second transmittance is, for example, approximately 0%, but is not limited to this, and may be a value greater than 0% and a value close to 0% such as 0.5%, 1%, or 3%.
  • the first transmittance can be set to a value that is several times or more, for example, 10 times or more larger than the second transmittance.
  • the translucent region 62 may be a plurality of strip-shaped regions extending in a predetermined direction in the plane.
  • the translucent area 62 defines the light ray direction, which is the direction in which the image light emitted from the sub-pixels propagates.
  • the predetermined direction is a direction that forms a predetermined angle that is not 0 degree or 90 degrees with the vertical direction.
  • the light-transmitting regions 62 and the light-shielding surfaces 61 may extend in a predetermined direction along the active area 51 and may be repeatedly and alternately arranged in a direction orthogonal to the predetermined direction.
  • the barrier 6 may be configured to define the light ray direction of the image light emitted from the display panel 5 by the light shielding surface 61 and the light transmitting area 62. As shown in FIG. 1, the barrier 6 defines the image light emitted from the sub-pixels arranged in the active area 51, thereby defining the area on the active area 51 where the user's eyes can visually recognize.
  • the area in the active area 51 that emits the image light propagating to the position of the user's eyes is referred to as a visible area 51a.
  • a region in the active area 51 that emits image light propagating to the position of the left eye of the user is referred to as a left visible region 51aL.
  • the left visible area 51aL is also referred to as a first visible area.
  • a region in the active area 51 that emits image light propagating to the position of the right eye of the user is referred to as a right visible region 51aR.
  • the right visible region 51aR is also referred to as a second visible region.
  • the proper viewing distance d is also referred to as OVD (Optimum Viewing Distance).
  • the barrier pitch Bp, which is the arrangement interval of the translucent regions 62 in the horizontal direction, and the gap g between the active area 51 and the barrier 6 are defined so that the following formulas (1) and (2) hold, using the horizontal length Hp of a sub-pixel, the number n of sub-pixels displaying a one-eye image, the proper viewing distance d, and the inter-eye distance E.
  • E : d = (n × Hp) : g  Formula (1)
  • d : Bp = (d + g) : (2 × n × Hp)  Formula (2)
  • the proper viewing distance d is the distance between the barrier 6 and at least one of the left eye and the right eye of the user at which the horizontal length of the visible region 51a is n sub-pixels.
  • the inter-eye distance E is the distance between the user's left eye and right eye.
  • the inter-eye distance E may be a value calculated from the position of the user's eyes, or may be a preset value. When set in advance, the inter-eye distance E may be set to, for example, 61.1 mm to 64.4 mm, a range calculated by research of the National Institute of Advanced Industrial Science and Technology.
  • Hp is the horizontal length of the sub-pixel as shown in FIG.
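Solving formulas (1) and (2) for the gap g and the barrier pitch Bp gives g = d·n·Hp/E and Bp = 2·n·Hp·d/(d+g). The sketch below is illustrative; the function name and all numeric values are assumptions, not values from the disclosure.

```python
def barrier_geometry(E, d, n, Hp):
    """Solve formulas (1) and (2) for the gap g and barrier pitch Bp.

    Formula (1): E : d = (n * Hp) : g   ->  g  = d * n * Hp / E
    Formula (2): d : Bp = (d + g) : (2 * n * Hp)
                                        ->  Bp = 2 * n * Hp * d / (d + g)
    """
    g = d * n * Hp / E
    Bp = 2 * n * Hp * d / (d + g)
    return g, Bp

# Illustrative values only (not from the disclosure): inter-eye
# distance E = 62.2 mm, proper viewing distance d = 750 mm, n = 6
# sub-pixels per one-eye image, sub-pixel width Hp = 0.05 mm.
g, Bp = barrier_geometry(62.2, 750.0, 6, 0.05)
```

Note that Bp comes out slightly smaller than 2 × n × Hp; this slight shortening of the barrier pitch relative to two image pitches is what makes the visible regions of adjacent translucent regions line up at the proper viewing distance d.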
  • the barrier 6 may be composed of a member having the second transmittance.
  • the barrier 6 may be composed of, for example, a film or a plate-shaped member.
  • the light shielding surface 61 is made of a film or a plate member.
  • the translucent area 62 is composed of an opening provided in the film or the plate member.
  • the film is made of, for example, resin, but is not limited to this.
  • the plate member is made of, for example, resin or metal, but is not limited to this.
  • the barrier 6 may be made of a light-shielding base material, or may be made of a base material containing a light-shielding additive.
  • the barrier 6 may be composed of a liquid crystal shutter.
  • the liquid crystal shutter can control the light transmittance according to the applied voltage.
  • the liquid crystal shutter may include a plurality of pixels and may control the light transmittance of each pixel.
  • the liquid crystal shutter can form a region having a high light transmittance or a region having a low light transmittance in an arbitrary shape.
  • the light transmitting region 62 may be a region having the first transmittance.
  • the light shielding surface 61 may be a region having the second transmittance.
  • the barrier 6 having the above-described configuration allows the image light emitted from a part of the sub-pixels of the active area 51 to pass through the transparent region 62 and be propagated to the right eye of the user.
  • the barrier 6 can transmit the image light emitted from some other sub-pixels through the translucent region 62 and propagate to the left eye of the user.
  • An image visually recognized by the user's eye by propagating the image light to each of the user's left and right eyes will be described in detail with reference to FIGS. 4 and 5.
  • the left visible region 51aL shown in FIG. 4 is, as described above, the area of the active area 51 that the left eye of the user visually recognizes when the image light transmitted through the translucent region 62 of the barrier 6 reaches the left eye of the user.
  • the left invisible area 51bL is an area that the left eye of the user cannot visually recognize because the image light is blocked by the light blocking surface 61 of the barrier 6.
  • the left visible region 51aL includes half of the sub-pixel P1, all of the sub-pixels P2 to P6, and half of the sub-pixel P7.
  • the right visible region 51aR shown in FIG. 5 is the area of the active area 51 that the right eye of the user visually recognizes when the image light from some other sub-pixels transmitted through the translucent region 62 of the barrier 6 reaches the right eye of the user.
  • the right invisible region 51bR is a region that the right eye of the user cannot visually recognize because the image light is blocked by the light blocking surface 61 of the barrier 6.
  • the right visible region 51aR includes half of the sub-pixel P7, all of the sub-pixels P8 to P12, and half of the sub-pixel P1.
  • the left eye and the right eye each visually recognize the image.
  • the left-eye image and the right-eye image are parallax images having a parallax with each other.
  • the left eye visually recognizes half of the left-eye image displayed in sub-pixel P1, the whole of the left-eye images displayed in sub-pixels P2 to P6, and half of the right-eye image displayed in sub-pixel P7.
  • the right eye visually recognizes half of the right-eye image displayed in sub-pixel P7, the whole of the right-eye images displayed in sub-pixels P8 to P12, and half of the left-eye image displayed in sub-pixel P1.
  • the sub-pixel displaying the left-eye image is labeled with “L”
  • the sub-pixel displaying the right-eye image is labeled with “R”.
  • the area of the left eye image that the user's left eye visually recognizes is the maximum, and the area of the right eye image is the minimum.
  • the area of the right-eye image visually recognized by the right eye of the user is maximum, and the area of the left-eye image is minimum.
  • the fact that the left eye of the user visually recognizes the right eye image or the right eye of the user visually recognizes the left eye image is also referred to as crosstalk. The user can visually recognize the three-dimensional image with the crosstalk reduced.
  • when the left-eye image and the right-eye image having a parallax with each other are displayed in the sub-pixels included in the left visible region 51aL and the right visible region 51aR respectively, the user located at the proper viewing distance d can visually recognize the image displayed on the display panel 5 as a three-dimensional image.
  • the left-eye image is displayed in the sub-pixels that are more than half visible by the left eye, and the right-eye image is displayed in the sub-pixels that are more than half visible by the right eye.
  • the sub-pixels for displaying the left-eye image and the right-eye image are not limited to this, and may be appropriately determined based on the left visible region 51aL and the right visible region 51aR so as to reduce crosstalk according to the design of the active area 51, the barrier 6, and the like. For example, depending on the aperture ratio of the barrier 6 or the like, the left-eye image may be displayed on the sub-pixels that are viewed by the left eye at a predetermined rate or higher, and the right-eye image may be displayed on the sub-pixels that are viewed by the right eye at a predetermined rate or higher.
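As an illustrative sketch (not part of the disclosure), the rule of displaying the left-eye image on sub-pixels viewed by the left eye at a predetermined rate or higher can be expressed as follows; the function name, the visible fractions, and the 50 % threshold are all hypothetical.

```python
def assign_images(left_visible_fraction, threshold=0.5):
    """Assign "L" (left-eye image) or "R" (right-eye image) to each
    sub-pixel from the fraction of its area lying in the left visible
    region 51aL; sub-pixels below the threshold are assumed to lie
    mostly in the right visible region 51aR.
    """
    return ["L" if f >= threshold else "R" for f in left_visible_fraction]

# Illustrative fractions for P1..P12: P1 mostly left-visible, P2..P6
# fully left-visible, P7 mostly right-visible, P8..P12 fully
# right-visible (values are hypothetical, not from the disclosure).
fractions = [0.6, 1.0, 1.0, 1.0, 1.0, 1.0, 0.4, 0.0, 0.0, 0.0, 0.0, 0.0]
labels = assign_images(fractions)
```

With these fractions, P1 to P6 receive the left-eye image and P7 to P12 the right-eye image, matching the equal split of left and right sub-pixels in the example of FIG. 2.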
  • the controller 7 is connected to each component of the three-dimensional display system 100 and can control each component.
  • the controller 7 is configured as a processor, for example.
  • the controller 7 may include one or more processors.
  • the processor may include a general-purpose processor that loads a specific program and executes a specific function, and a dedicated processor that is specialized for a specific process.
  • the dedicated processor may include an application-specific integrated circuit (ASIC: Application Specific Integrated Circuit).
  • the processor may include a programmable logic device (PLD: Programmable Logic Device).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the controller 7 may be either a SoC (System-on-a-Chip) or a SiP (System In a Package) in which one or a plurality of processors cooperate.
  • the controller 7 may include a storage unit, and the storage unit may store various kinds of information, a program for operating each component of the three-dimensional display system 100, or the like.
  • the storage unit may be composed of, for example, a semiconductor memory.
  • the storage unit may function as a work memory of the controller 7.
  • The control of each component of the three-dimensional display system 100 by the controller 7 will be described below.
  • the controller 7 determines the left visible region 51aL and the right visible region 51aR in the active area 51 of the display panel 5 based on the positions of at least one of the left eye and the right eye of the user.
  • the controller 7 may determine that the right eye of the user is located at a position horizontally moved from the position of the left eye by a predetermined inter-eye distance E, based on the position of the left eye of the user.
  • the controller may determine the left visible region 51aL and the right visible region 51aR so that the image light that has passed through each transparent region 62 reaches the left eye and the right eye of the user.
  • the left visible region 51aL and the right visible region 51aR corresponding to one translucent region 62 each have a horizontal length of n sub-pixels. Therefore, as shown in FIG. 1, the left visible region 51aL and the right visible region 51aR do not overlap each other and are arranged alternately in the horizontal direction on the display surface of the display panel 5.
  • the controller 7 determines that the sub-pixel included in the left visible region 51aL is the left sub-pixel.
  • the left sub-pixel is, for example, a sub-pixel of which a predetermined proportion or more is included in the left visible region 51aL.
  • the left subpixel is also referred to as a first subpixel or a first display area.
  • the controller 7 determines that the sub-pixel included in the right visible region 51aR is the right sub-pixel.
  • the right sub-pixel is, for example, a sub-pixel of which a predetermined proportion or more is included in the right visible region 51aR.
  • the right subpixel is also referred to as a second subpixel or a second display area. As shown in FIG. 1, the left sub-pixels and the right sub-pixels do not overlap each other and are alternately arranged in the horizontal direction on the display surface of the display panel 5.
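The predetermined-proportion rule above can be sketched in a few lines of Python. This is an illustrative sketch only, not part of the disclosure: the function name, the default 0.5 threshold, and the representation of each sub-pixel by the fraction of its width falling in the left visible region 51aL are all assumptions for the example.

```python
def classify_subpixels(left_fractions, threshold=0.5):
    """Label each sub-pixel 'L' (left sub-pixel) or 'R' (right sub-pixel)
    from the fraction of its width that falls inside the left visible
    region 51aL; the remainder is assumed to lie in the right visible
    region 51aR."""
    return ['L' if f >= threshold else 'R' for f in left_fractions]

# Six sub-pixels mostly in the left region, six mostly in the right,
# yielding the alternating L/R arrangement described in the text.
labels = classify_subpixels([1.0, 1.0, 0.9, 0.8, 0.7, 0.6,
                             0.4, 0.3, 0.2, 0.1, 0.0, 0.0])
```

A different threshold (the "predetermined rate" depending on the aperture ratio of the barrier 6) slots in via the `threshold` parameter.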
  • FIG. 6 is a diagram showing sub-pixels visually recognized by the eyes of the user located at the proper viewing distance d by the image light that has passed through one light-transmitting area 62a.
  • the sub-pixel group Pg will be described as including 12 sub-pixels P1 to P12 arranged continuously in the horizontal direction.
  • FIG. 6 shows a visual recognition region 70 in which the right eye or the left eye of the user, who is located away from the barrier 6 by the appropriate viewing distance d, can visually recognize a predetermined subpixel.
  • in each visual recognition region 70, six subpixels that are continuous in the horizontal direction are visually recognized.
  • the range of sub-pixels that the left and right eyes of the user can visually recognize through the translucent region 62a is represented by broken lines.
  • when the left eye of the user is at the position L1 included in the visual recognition area 70A, the left eye visually recognizes the sub-pixels P1 to P6 through the light-transmitting area 62a.
  • when the position of the left eye of the user changes, the sub-pixels visually recognized by the left eye also change.
  • when the left eye of the user is at the position L2 included in the visual recognition area 70B, the left eye visually recognizes the sub-pixels P2 to P7 via the light-transmitting area 62a.
  • Subpixels visually recognized by the user's eyes in adjacent visual recognition areas 70 have a difference of one subpixel.
  • in the three-dimensional display system 100, the barrier pitch Bp and the gap g are defined so that, of the 2×n sub-pixels arranged in the horizontal direction in the sub-pixel group Pg, n different sub-pixels are visually recognized by each of the left eye and the right eye of the user located at the proper viewing distance d. That is, the three-dimensional display system 100 is configured so that a difference of n sub-pixels is generated between the sub-pixel regions visually recognized by the left and right eyes of the user located at the proper viewing distance d.
  • therefore, in FIG. 6, the right eye, located at the position R1 included in the visual recognition area 70C, visually recognizes the sub-pixels P7 to P12.
  • the controller 7 sets the sub-pixels P1 to P6 as left sub-pixels and displays a left-eye image visually recognized by the left eye of the user.
  • the controller 7 sets the sub-pixels P7 to P12 as right sub-pixels and displays a right-eye image visually recognized by the right eye of the user.
  • the inter-eye distance E, which is the distance between the left eye and the right eye of the user, corresponds to the width of n visual recognition areas 70. That is, the width of one visual recognition area 70 is E/n.
  • the controller 7 may change the sub-pixel displaying the right-eye image or the left-eye image according to the position of the user's eye acquired by the acquisition unit 3. For example, assume that the user moves in the horizontal direction at the proper viewing distance d and the left eye of the user moves from the position L1 to the position L2. At this time, the controller 7 determines, for example, from the position of the left eye of the user, that the left eye is located in the visual recognition area 70B for visually recognizing the sub-pixels P2 to P7. The controller 7 determines that the right eye, which is away from the left eye by the inter-eye distance E, is located in the visual recognition area 70D for visually recognizing the sub-pixels P8 to P12 and P1.
  • the controller 7 sets the sub-pixels P2 to P7 as left sub-pixels and the sub-pixels P8 to P12 and P1 as right sub-pixels. As a result, the user can visually recognize the three-dimensional image in the state where the crosstalk is reduced.
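The mapping from eye position to left/right sub-pixel sets described above can be sketched as follows. This is an illustrative sketch under stated assumptions, not the patented implementation: function names are hypothetical, the horizontal eye coordinate is measured in the same units as E, and area indexing follows FIG. 6 (one visual recognition area is E/n wide, adjacent areas shift the visible set by one sub-pixel, and the right eye sits n areas from the left eye).

```python
def subpixels_seen(region_index, group_size=12, per_eye=6):
    """1-based sub-pixel numbers seen through the translucent area by an
    eye located in visual recognition area `region_index`; adjacent
    areas differ by exactly one sub-pixel, as in FIG. 6."""
    return [(region_index + i) % group_size + 1 for i in range(per_eye)]

def assign_left_right(left_eye_x, E, group_size=12):
    """Given the horizontal left-eye position and the inter-eye distance
    E, return the left and right sub-pixel lists.  One visual
    recognition area is E/n wide (n = sub-pixels per eye); the right
    eye is assumed to be n areas away from the left eye."""
    per_eye = group_size // 2
    region_w = E / per_eye            # width of one area: E/n
    k = int(left_eye_x // region_w)   # area containing the left eye
    return (subpixels_seen(k, group_size, per_eye),
            subpixels_seen(k + per_eye, group_size, per_eye))

# Left eye at position L1 (area 70A): left P1-P6, right P7-P12.
# After moving one area (to L2): left P2-P7, right P8-P12 and P1.
at_L1 = assign_left_right(0.0, 60.0)
at_L2 = assign_left_right(10.0, 60.0)
```

The modulo wrap reproduces the text's example of the right sub-pixels becoming P8 to P12 and P1 after the move.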
  • the controller 7 sets the width of each of the left sub-pixel and the right sub-pixel based on the observation distance.
  • the left and right eyes of the user are at the position L7-2 and the position R7-2, which are separated from the barrier 6 by the observation distance Y1.
  • the position L7-2 and the position R7-2 are positions closer to the barrier 6 along the depth direction than the positions L7-1 and R7-1, which are separated from the barrier 6 by the suitable viewing distance d.
  • the image light emitted from the display panel 5 and passing through the translucent region 62a travels along the optical path 71A and the optical path 71B to reach the left eye at the position L7-2 and the right eye at the position R7-2, respectively.
  • the optical path 71A passes through the position L7-1' included in the visual recognition area 70E at the proper viewing distance d.
  • the visual recognition region 70E corresponds to a region in which the sub-pixels P12 and P1 to P5 can be visually recognized in the plane separated from the barrier 6 by the appropriate visual distance d.
  • the optical path 71A intersecting the visual recognition region 70E at the position L7-1' means that the left eye can visually recognize the sub-pixels P12 and P1 to P5. That is, even when the left eye is at the position L7-2, the controller 7 can specify the sub-pixels that the left eye can visually recognize by calculating which visual recognition region 70 the optical path 71A intersects. The left eye at the position L7-2 can see the sub-pixels P12 and P1 to P5.
  • the optical path 71B passes through the position R7-1 included in the visual recognition region 70C at the proper viewing distance d.
  • the visual recognition region 70C corresponds to a region in which the sub-pixels P7 to P12 can be visually recognized within a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the right eye at the position R7-2 can see the sub-pixels P7 to P12.
  • the left eye at the position L7-2 and the right eye at the position R7-2 both see the sub-pixel P12.
  • the sub-pixel P12 is represented by hatching.
  • the controller 7 sets the sub-pixels P12 and P1 to P5 as left sub-pixels and the sub-pixels P7 to P12 as right sub-pixels.
  • the controller 7 may set a sub pixel that is both a right sub pixel and a left sub pixel, such as the sub pixel P12, as the third sub pixel.
  • the ratio of the observation distance Y1 to the proper viewing distance d corresponds to the ratio of the distance between the position L7-2 and the position R7-2 to the distance between the position L7-1' and the position R7-1.
  • the image visually recognized at the observation distance Y1 by the left and right eyes of the user, which are separated by the inter-eye distance E, can be considered to correspond to the image visually recognized at the proper viewing distance d by left and right eyes whose separation is longer than the inter-eye distance E by the width of one visual recognition region 70, that is, by E/n.
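The similar-triangles relation underlying this correspondence can be written out explicitly. This is an illustrative sketch only; the function name and the sample values for E and d are assumptions, not values from the disclosure.

```python
def projected_separation(E, d, Y):
    """Separation of the two eye rays where they cross the plane at the
    proper viewing distance d, by similar triangles: Y/d = E/separation
    (FIG. 7: the ratio of Y1 to d equals the ratio of the two eye
    separations)."""
    return E * d / Y

# At half the proper viewing distance, the projection of the eyes onto
# the proper-viewing-distance plane is separated by 2E, matching the
# FIG. 8 discussion of the observation distance d/2.
sep = projected_separation(65.0, 700.0, 350.0)
```

Setting the projected separation equal to E + E/n and solving for Y recovers the observation distance at which the visible sets shift by exactly one visual recognition region.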
  • FIG. 8 shows the sub-pixels viewed by the user's eyes when the left and right eyes of the user are at the position L8-2 and the position R8-2, which are separated from the barrier 6 by the observation distance d/2.
  • the position L8-2 and the position R8-2 are positions closer to the barrier 6 along the depth direction from the position L8-1 and the position R8-1 at the proper viewing distance d.
  • the observation distance d/2 is half the suitable viewing distance d.
  • the left eye of the user at the position L8-2 visually recognizes the image light emitted from the display panel 5 and traveling along the optical path 71A.
  • the optical path 71A passes through the position L8-1' included in the visual recognition region 70F at the proper viewing distance d.
  • the visible region 70F corresponds to a region in which the sub-pixels P7 to P12 can be visually recognized in a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the left eye at the position L8-2 can visually recognize the sub-pixels P7 to P12.
  • the right eye of the user at the position R8-2 visually recognizes the image light emitted from the display panel 5 and traveling along the optical path 71B.
  • the optical path 71B passes through the position R8-1 included in the visual recognition area 70C at the proper viewing distance d.
  • the visual recognition region 70C corresponds to a region in which the sub-pixels P7 to P12 can be visually recognized within a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the right eye located at the position R8-2 can visually recognize the sub-pixels P7 to P12.
  • the left eye at the position L8-2 and the right eye at the position R8-2 both see the sub-pixels P7 to P12.
  • when the observation distance from the barrier 6 is d/2, the area of the image visually recognized by the left and right eyes of the user, which are separated by the inter-eye distance E, corresponds to the area of the image visually recognized at the proper viewing distance d by left and right eyes separated by the inter-eye distance 2E.
  • in this case, the controller 7 may set the number of sub-pixels forming the image for one eye to twice the number of sub-pixels forming the image for one eye at the proper viewing distance d.
  • the controller 7 changes the number of sub-pixels that are included in the sub-pixel group Pg and continuously arranged in the horizontal direction from 12, as at the suitable viewing distance d, to 24.
  • the changed sub-pixel group Pg includes 24 sub-pixels P1 to P24.
  • the left eye of the user at the position L8-2 which is separated from the barrier 6 by the observation distance d/2, visually recognizes the sub-pixels P7 to P12 through the translucent area 62a.
  • the right eye of the user at the position R8-2 views the sub-pixels P19 to P24 through the translucent area 62a.
  • the controller 7 sets the sub-pixels P1 to P12 as left sub-pixels and displays the left-eye image visually recognized by the left eye of the user.
  • the controller 7 sets the sub-pixels P13 to P24 as right sub-pixels and displays a right-eye image visually recognized by the right eye of the user.
  • the user can visually recognize the three-dimensional image in the state where the crosstalk is reduced.
  • when the observation distance from the barrier 6 is d/3, the regions of the images visually recognized by the left and right eyes of the user, which are separated by the inter-eye distance E, correspond to the regions of the images visually recognized at the proper viewing distance d by left and right eyes separated by the inter-eye distance 3E.
  • the controller 7 therefore triples the width of the right sub-pixels and the left sub-pixels when the observation distance of the user is d/3. Thereby, the controller 7 performs the same control as when the left eye and the right eye of the user are located at the proper viewing distance d, so that the user can visually recognize the three-dimensional image with reduced crosstalk.
  • the controller 7 may widen the width of each of the right sub-pixel and the left sub-pixel when the observation distance of the user is shorter than the suitable viewing distance d, and in particular when it is shorter than a predetermined distance. Specifically, when the observation distance of the user is shorter than 1/n of the proper viewing distance, the controller 7 may widen the width of each of the right sub-pixel and the left sub-pixel to n times the width of each of the right sub-pixel and the left sub-pixel at the proper viewing distance d. The controller 7 may calculate n satisfying the following formula (5) based on the user's observation distance Y and the proper viewing distance d, and may widen the width of each of the right sub-pixel and the left sub-pixel by a factor of n. n may be a natural number of 2 or more. d/n ≥ Y > d/(n+1)   Formula (5)
  • when n is 1, that is, when the observation distance Y is larger than d/2, the controller 7 does not change the width of the right sub-pixel and the width of the left sub-pixel from the widths at the proper viewing distance d.
  • when n is 2, that is, when the observation distance is equal to or less than d/2 and greater than d/3, the controller 7 doubles the width of the right sub-pixel and the width of the left sub-pixel relative to the widths at the proper viewing distance d.
  • when n is 3, that is, when the observation distance is equal to or less than d/3 and greater than d/4, the controller 7 triples the width of the right sub-pixel and the width of the left sub-pixel relative to the widths at the proper viewing distance d. In this way, the controller 7 changes the width of the right sub-pixel and the width of the left sub-pixel according to the observation distance Y.
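Formula (5) pins n down uniquely for any observation distance below d, which can be checked with a short sketch. Illustrative only; the function name is an assumption, and the guard for Y ≥ d reflects the text's statement that the widths are unchanged at or beyond the proper viewing distance.

```python
import math

def width_multiplier(Y, d):
    """Multiplier n for the left/right sub-pixel widths, chosen so that
    d/n >= Y > d/(n+1)   (formula (5)).
    n = 1 means the widths stay as at the proper viewing distance d."""
    if Y >= d:
        return 1
    return math.floor(d / Y)
```

For example, with d = 700, an observation distance of 600 gives n = 1 (no change), 350 gives n = 2 (doubled widths), and 200 gives n = 3 (tripled widths), matching the three cases described above.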
  • when expanding the widths of the right sub-pixels and the left sub-pixels by a factor of n, the controller 7 may increase the widths of all of the right sub-pixels and the left sub-pixels, or may increase the widths of only some of them.
  • the controller 7 may make the width of some of the right sub-pixels and the left sub-pixels whose widths are expanded by a factor of n different from the width of the other right sub-pixels and left sub-pixels. That is, when the observation distance of the user is shorter than 1/n of the proper viewing distance, the controller 7 may expand the width of some of the right sub-pixels and the left sub-pixels to n times the width of each of the right sub-pixels and the left sub-pixels at the proper viewing distance d, and may expand the width of the other right sub-pixels and left sub-pixels to m times the width at the proper viewing distance d. m may be any number different from n.
  • the controller 7 may be configured to display the mixed image in the active area 51. Specifically, the controller 7 displays the left-eye image on the sub-pixel that is the left sub-pixel and not the right sub-pixel. The controller 7 displays the right-eye image on the sub-pixel that is the right sub-pixel and not the left sub-pixel. The controller 7 may display the third image on the third subpixel when the third subpixel that is the left subpixel and the right subpixel is present.
  • the controller 7 may be configured to set the brightness value of the third image displayed in the third sub-pixel to a predetermined value or less.
  • the controller 7 may display a black image as the third image.
  • the black image is an image having a predetermined low luminance, for example, black.
  • the predetermined luminance can be the luminance of the lowest gradation among the gradation levels displayable by the sub-pixels, or the luminance of a gradation comparable thereto.
  • the controller 7 may display an average image having a luminance value that is an average value of the luminance values of the left-eye image and the right-eye image as the third image in the third subpixel.
  • the controller 7 may display, in the third subpixel, a left-eye image or a right-eye image as the third image based on the characteristics of the user.
  • the characteristic of the user is, for example, a characteristic regarding the dominant eye of the user.
  • the controller 7 may display the left-eye image or the right-eye image corresponding to the dominant eye based on information indicating the dominant eye of the user that is set in advance or input from the outside.
  • the controller 7 may display the left-eye image as the third image when the dominant eye of the user is the left eye, and may display the right-eye image as the third image when the dominant eye of the user is the right eye.
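The three alternatives for the third image (black, average, dominant eye) amount to a small selection policy, sketched below. Illustrative only: the function and policy names are assumptions, luminance values are treated as integers, and 0 stands in for the lowest displayable gradation.

```python
def third_image_value(left_px, right_px, policy="black", dominant_eye=None):
    """Luminance value shown on a third sub-pixel (one that is both a
    left sub-pixel and a right sub-pixel), following the alternatives
    in the text: a black (lowest-gradation) image, the average of the
    two eye images, or the image for the user's dominant eye."""
    if policy == "black":
        return 0                       # lowest displayable gradation
    if policy == "average":
        return (left_px + right_px) // 2
    if policy == "dominant":
        return left_px if dominant_eye == "left" else right_px
    raise ValueError(f"unknown policy: {policy}")
```

The dominant-eye branch assumes the dominant-eye information has already been set in advance or input from the outside, as the text describes.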
  • Step S101 The controller 7 acquires the position of at least one of the first eye (for example, the left eye) and the second eye (for example, the right eye) of the user from the detection device 1.
  • Step S102 The controller 7 calculates the observation distance between the user's eye position and the barrier 6 from the acquired information on the user's eye position.
  • Step S103 The controller 7 determines the width of each of the left subpixel and the right subpixel based on the viewing distance of the user. When the user's viewing distance is shorter than the optimum viewing distance, the controller 7 sets the width of each of the left subpixel and the right subpixel according to the viewing distance.
  • Step S104 The controller 7 determines a subpixel that is a left subpixel and a subpixel that is a right subpixel based on the position of the user's eye.
  • the controller 7 may determine a sub-pixel that is both a left sub-pixel and a right sub-pixel to be a third sub-pixel.
  • Step S105 The controller 7 displays the left-eye image on the sub-pixel which is the left sub-pixel.
  • Step S106 The controller 7 displays the right-eye image on the sub-pixel which is the right sub-pixel.
  • Step S107 The controller 7 displays the third image on the third subpixel when there is a third subpixel that is both a left subpixel and a right subpixel.
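Steps S101 to S107 can be strung together as one control pass. This is a deliberately simplified sketch, not the disclosed implementation: the detected eye is reduced to an (x, z) pair, and the geometric calculation of visible sub-pixels (steps covered by FIGS. 6 to 8) is stubbed out behind a caller-supplied `visible_subpixels` function.

```python
def control_step(detected_eye, d, visible_subpixels):
    """One pass of steps S101-S107.  `detected_eye` is the (x, z)
    position of the first eye from the detection device (S101);
    `visible_subpixels(position, n)` stands in for the geometric
    calculation in the text and must return the sets of sub-pixels
    seen by the left and right eyes.  Returns the image assigned to
    each sub-pixel."""
    x, z = detected_eye                                  # S101
    Y = z                                                # S102: observation distance
    n = 1 if Y >= d else int(d // Y)                     # S103: formula (5)
    left_set, right_set = visible_subpixels((x, Y), n)   # S104
    frame = {}
    for p in left_set - right_set:
        frame[p] = "left-eye image"                      # S105
    for p in right_set - left_set:
        frame[p] = "right-eye image"                     # S106
    for p in left_set & right_set:
        frame[p] = "third image"                         # S107
    return frame
```

Using sets makes step S107 explicit: a sub-pixel in both sets is a third sub-pixel, like P12 in the FIG. 7 discussion.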
  • the three-dimensional display device 2 includes the display panel 5, optical elements such as the barrier 6, the acquisition unit 3, and the controller 7.
  • the display panel 5 is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the optical element is configured to define the light ray direction of the image light emitted from the display panel 5.
  • the acquisition unit 3 is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the display panel 5 includes a first display area configured to display a first image visually recognized by the user's first eye, and a second display area configured to display a second image visually recognized by the user's second eye. The first display areas and the second display areas are alternately arranged on the display panel 5.
  • the optical element includes a first light-transmitting region configured to transmit the image light with the first transmittance and a second light-transmitting region configured to transmit the image light with the second transmittance.
  • the first light transmissive regions and the second light transmissive regions are alternately arranged in the optical element.
  • the controller 7 is configured to set the width of each of the first display area and the second display area based on the observation distance. With this configuration, the three-dimensional display device 2 can adjust the image displayed on the display panel 5 according to the observation distance of the user.
  • the three-dimensional display device 2 can control the images visually recognized by the left and right eyes of the user so that the crosstalk is reduced when the viewing distance of the user is shorter than the proper viewing distance. Therefore, the three-dimensional display device 2 can appropriately allow the user to visually recognize the three-dimensional image regardless of the change in the distance from the user.
  • the controller 7 can widen the width of each of the first display area and the second display area when the observation distance becomes shorter than the first distance. With such a configuration, when the observation distance of the user is shorter than the proper viewing distance, the controller 7 widens the image regions visually recognized by the left and right eyes of the user, so that crosstalk in the images visually recognized by the left and right eyes can be reduced.
  • when the observation distance is shorter than 1/n of the proper viewing distance (n is a predetermined number of 2 or more), the controller 7 can increase the width of each of the first display region and the second display region to n times the width at the proper viewing distance.
  • the controller 7 determines the observation distance of the user, and when the observation distance is shorter than the proper viewing distance, the controller 7 can perform the same control as when the user is at the proper viewing distance. As a result, an increase in the amount of calculation processing and the amount of data handled by the controller 7 in response to changes in the observation distance of the user can be reduced.
  • the controller 7 can make the width of some of the first display areas and the second display areas whose widths are expanded by a factor of n different from the width of the other first display areas and second display areas.
  • in this way, when expanding the display areas by a factor of n, the controller 7 can reduce, for example, crosstalk in the images visually recognized by the left and right eyes of the user by adjusting the widths of some display areas.
  • the optical element is the barrier 6, but the optical element is not limited to this.
  • the optical element included in the three-dimensional display device 2 may be a lenticular lens 91.
  • the lenticular lens 91 is configured by arranging a plurality of vertically extending cylindrical lenses 92 in the horizontal direction.
  • the lenticular lens 91 can propagate the image light emitted from the sub-pixel of the left visible region 51aL so as to reach the position of the left eye of the user.
  • the lenticular lens 91 can propagate the image light emitted from the sub-pixel of the right visible region 51aR so as to reach the position of the right eye of the user.
  • in the three-dimensional display system 100, the three-dimensional display device 2 and the detection device 1 are described as separate bodies, but the configuration is not limited to this.
  • the three-dimensional display device 2 may include the function provided by the detection device 1.
  • the three-dimensional display device 2 can detect the position of at least one of the left eye and the right eye of the user.
  • the three-dimensional display system 100 can be mounted on the head-up display system 400.
  • the head-up display system 400 is also referred to as a HUD (Head Up Display) 400.
  • the HUD 400 includes the three-dimensional display system 100, an optical member 410, and a projected member 420 having a projected surface 430.
  • the HUD 400 is configured to cause the image light emitted from the three-dimensional display system 100 to reach the projection target member 420 via the optical member 410.
  • the HUD 400 is configured to cause the image light reflected by the projection target member 420 to reach the left and right eyes of the user.
  • the HUD 400 can cause image light to travel from the three-dimensional display system 100 to the left and right eyes of the user along the optical path 440 indicated by the broken line.
  • the user can visually recognize the image light arriving along the optical path 440 as a virtual image 450.
  • the HUD 400 including the three-dimensional display system 100 may be mounted on the mobile body 10.
  • Part of the configuration of the HUD 400 may be combined with other devices and parts included in the moving body 10.
  • the moving body 10 may also use the windshield as the projection target member 420.
  • such a configuration, in which part of the HUD 400 is shared with other devices or components of the moving body 10, may be referred to as a HUD module or a three-dimensional display component.
  • the HUD 400 and the three-dimensional display system 100 may be mounted on the moving body 10.
  • the “moving body” in the present disclosure includes a vehicle, a ship, and an aircraft.
  • the "vehicle" in the present disclosure includes, but is not limited to, an automobile and an industrial vehicle, and may include a railroad vehicle, a vehicle for daily living, and a fixed-wing aircraft traveling on a runway.
  • Vehicles include, but are not limited to, passenger cars, trucks, buses, motorcycles, and trolleybuses, and may include other vehicles traveling on roads.
  • Industrial vehicles include industrial vehicles for agriculture and construction.
  • Industrial vehicles include, but are not limited to, forklifts and golf carts.
  • Industrial vehicles for agriculture include, but are not limited to, tractors, tillers, transplanters, binders, combines, and lawn mowers.
  • Industrial vehicles for construction include, but are not limited to, bulldozers, scrapers, excavators, mobile cranes, dump trucks, and road rollers.
  • Vehicles include those that are driven manually.
  • the vehicle classification is not limited to the above.
  • an automobile may include an industrial vehicle that can travel on a road, and the same vehicle may be included in multiple classifications.
  • Ships in the present disclosure include marine jets, boats, and tankers.
  • the aircraft in the present disclosure includes a fixed-wing aircraft and a rotary-wing aircraft.
  • Reference signs: 1 detection device; 2 three-dimensional display device; 3 acquisition unit; 4 illuminator; 5 display panel; 6 barrier; 7 controller; 10 moving body; 51 active area; 51aL left visible area; 51aR right visible area; 51bL left invisible area; 51bR right invisible area; 51aLR binocular visible area; 61 light-shielding surface; 62 transparent area; 70 visual recognition area; 71A optical path; 71B optical path; 91 lenticular lens; 92 cylindrical lens; 100 three-dimensional display system; 400 head-up display system; 410 optical member; 420 projected member; 430 projected surface; 440 optical path; 450 virtual image

Abstract

A three-dimensional display device 2 is provided with a display panel 5, an optical element, an acquisition unit 3, and a controller 7. The display panel 5 is configured to display a mixed image including a first image and a second image. The optical element is configured to define a light ray direction of image light emitted from the display panel 5. The acquisition unit 3 is configured to acquire the position of at least one of a first eye and a second eye of a user. The display panel 5 includes a first display region and a second display region arranged alternately on a display surface of the display panel 5, the first display region being configured to display a first image to be visually recognized by the first eye of the user, the second display region being configured to display a second image to be visually recognized by the second eye of the user. If the distance between at least one of the first eye and the second eye of the user and the optical element is smaller than an appropriate viewing distance, the controller 7 is configured to set the width of each of the first display region and the second display region on the basis of an observation distance.

Description

Three-dimensional display device, head-up display system, and moving body
Cross-reference of related applications
 This application claims the priority of Japanese Patent Application No. 2018-240072 filed in Japan on December 21, 2018, and the entire disclosure of the earlier application is incorporated herein by reference.
 The present disclosure relates to a three-dimensional display device, a head-up display system, and a moving body.
 Conventionally, in order to perform three-dimensional display without using glasses, a three-dimensional display device is known that includes an optical element that causes part of the light emitted from a display panel to reach the right eye and another part of the light emitted from the display panel to reach the left eye (see Patent Document 1).
JP 2001-166259 A
 The three-dimensional display device of the present disclosure includes a display panel, an optical element, an acquisition unit, and a controller. The display panel is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image. The optical element is configured to define a light ray direction of image light emitted from the display panel. The acquisition unit is configured to acquire the position of at least one of the first eye and the second eye of the user. The display panel includes a first display area configured to display the first image visually recognized by the first eye of the user, and a second display area configured to display the second image visually recognized by the second eye of the user. The first display areas and the second display areas are alternately arranged on the display surface of the display panel. The optical element includes a first light-transmissive region configured to transmit the image light at a first transmittance, and a second light-transmissive region configured to transmit the image light at a second transmittance. The first light-transmissive regions and the second light-transmissive regions are alternately arranged in a plane of the optical element along the display surface of the display panel. When the observation distance between the position of at least one of the first eye and the second eye of the user and the optical element is shorter than the proper viewing distance, the controller is configured to set the width of each of the first display area and the second display area based on the observation distance.
 本開示のヘッドアップディスプレイシステムは、表示パネルと、光学素子と、取得部と、光学部材と、コントローラと、を備える。前記表示パネルは、第1画像と前記第1画像に対して視差を有する第2画像とを含む混合画像を表示するように構成されている。前記光学素子は、前記表示パネルから射出される画像光の光線方向を規定するように構成されている。前記取得部は、利用者の第1眼及び第2眼の少なくとも一方の位置を取得するように構成されている。前記光学部材は、前記表示パネルから射出される画像光を、前記利用者に虚像として視認させるように構成されている。前記表示パネルは、前記利用者の第1眼で視認させる前記第1画像を表示するように構成された第1表示領域と、前記利用者の第2眼で視認させる前記第2画像を表示するように構成された第2表示領域と、を含む。前記第1表示領域及び前記第2表示領域は前記表示パネルの表示面において交互に並ぶ。前記光学素子は、前記画像光を第1透過率で透過させるように構成された第1透光領域と、前記画像光を第2透過率で透過させるように構成された第2透光領域と、を含む。前記第1透光領域及び前記第2透光領域は前記表示パネルの表示面に沿う前記光学素子の平面において交互に並ぶ。前記コントローラは、前記利用者の第1眼及び第2眼の少なくとも一方の位置と前記光学素子との間の観察距離が適視距離より短い場合、前記観察距離に基づいて、第1表示領域及び第2表示領域の各々の幅を設定するように構成されている。 The head-up display system of the present disclosure includes a display panel, an optical element, an acquisition unit, an optical member, and a controller. The display panel is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image. The optical element is configured to define a light ray direction of image light emitted from the display panel. The acquisition unit is configured to acquire the position of at least one of a first eye and a second eye of a user. The optical member is configured to allow the user to visually recognize the image light emitted from the display panel as a virtual image. The display panel includes a first display area configured to display the first image to be visually recognized by the first eye of the user, and a second display area configured to display the second image to be visually recognized by the second eye of the user. The first display areas and the second display areas are alternately arranged on the display surface of the display panel. The optical element includes a first light-transmitting region configured to transmit the image light at a first transmittance, and a second light-transmitting region configured to transmit the image light at a second transmittance. The first light-transmitting regions and the second light-transmitting regions are alternately arranged in a plane of the optical element along the display surface of the display panel. When the observation distance between the optical element and the position of at least one of the first and second eyes of the user is shorter than the optimum viewing distance, the controller is configured to set the width of each of the first display area and the second display area based on the observation distance.
 本開示の移動体は、ヘッドアップディスプレイシステムを備える。前記ヘッドアップディスプレイシステムは、表示パネルと、光学素子と、取得部と、光学部材と、コントローラと、を備える。前記表示パネルは、第1画像と前記第1画像に対して視差を有する第2画像とを含む混合画像を表示するように構成されている。前記光学素子は、前記表示パネルから射出される画像光の光線方向を規定するように構成されている。前記取得部は、利用者の第1眼及び第2眼の少なくとも一方の位置を取得するように構成されている。前記光学部材は、前記表示パネルから射出される画像光を、前記利用者に虚像として視認させるように構成されている。前記表示パネルは、前記利用者の第1眼で視認させる前記第1画像を表示するように構成された第1表示領域と、前記利用者の第2眼で視認させる前記第2画像を表示するように構成された第2表示領域と、を含む。前記第1表示領域及び前記第2表示領域は前記表示パネルの表示面において交互に並ぶ。前記光学素子は、前記画像光を第1透過率で透過させるように構成された第1透光領域と、前記画像光を第2透過率で透過させるように構成された第2透光領域と、を含む。前記第1透光領域及び前記第2透光領域は前記表示パネルの表示面に沿う前記光学素子の平面において交互に並ぶ。前記コントローラは、前記利用者の第1眼及び第2眼の少なくとも一方の位置と前記光学素子との間の観察距離が適視距離より短い場合、前記観察距離に基づいて、第1表示領域及び第2表示領域の各々の幅を設定するように構成されている。 The mobile body of the present disclosure includes a head-up display system. The head-up display system includes a display panel, an optical element, an acquisition unit, an optical member, and a controller. The display panel is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image. The optical element is configured to define a light ray direction of image light emitted from the display panel. The acquisition unit is configured to acquire the position of at least one of a first eye and a second eye of a user. The optical member is configured to allow the user to visually recognize the image light emitted from the display panel as a virtual image. The display panel includes a first display area configured to display the first image to be visually recognized by the first eye of the user, and a second display area configured to display the second image to be visually recognized by the second eye of the user. The first display areas and the second display areas are alternately arranged on the display surface of the display panel. The optical element includes a first light-transmitting region configured to transmit the image light at a first transmittance, and a second light-transmitting region configured to transmit the image light at a second transmittance. The first light-transmitting regions and the second light-transmitting regions are alternately arranged in a plane of the optical element along the display surface of the display panel. When the observation distance between the optical element and the position of at least one of the first and second eyes of the user is shorter than the optimum viewing distance, the controller is configured to set the width of each of the first display area and the second display area based on the observation distance.
図1は、一実施形態における3次元表示システムを鉛直方向から見た例を示す図である。FIG. 1 is a diagram showing an example of a three-dimensional display system according to an embodiment viewed from the vertical direction. 図2は、図1に示す表示パネルを奥行方向から見た例を示す図である。FIG. 2 is a diagram showing an example of the display panel shown in FIG. 1 viewed from the depth direction. 図3は、図1に示すバリアを奥行方向から見た例を示す図である。FIG. 3 is a diagram showing an example of the barrier shown in FIG. 1 viewed from the depth direction. 図4は、図1に示す表示パネルにおける左可視領域を説明するための図である。FIG. 4 is a diagram for explaining the left visible region in the display panel shown in FIG. 1. 図5は、図1に示す表示パネルにおける右可視領域を説明するための図である。FIG. 5 is a diagram for explaining the right visible region in the display panel shown in FIG. 1. 図6は、適視距離dに位置する利用者の左眼及び右眼が視認するサブピクセルを示す模式図である。FIG. 6 is a schematic diagram showing the sub-pixels visually recognized by the left and right eyes of a user located at the optimum viewing distance d. 図7は、観察距離Yが適視距離dより近い利用者の左眼及び右眼が視認するサブピクセルの一例を示す模式図である。FIG. 7 is a schematic diagram showing an example of sub-pixels visually recognized by the left and right eyes of a user whose observation distance Y is shorter than the optimum viewing distance d. 図8は、観察距離Yが適視距離d/2である近い利用者の左眼及び右眼が視認するサブピクセルの一例を示す模式図である。FIG. 8 is a schematic diagram showing an example of sub-pixels visually recognized by the left and right eyes of a user whose observation distance Y is d/2, half the optimum viewing distance. 図9は、観察距離Yが適視距離d/2である近い利用者の左眼及び右眼が視認するサブピクセルの他の例を示す模式図である。FIG. 9 is a schematic diagram showing another example of sub-pixels visually recognized by the left and right eyes of a user whose observation distance Y is d/2, half the optimum viewing distance. 図10は、一実施形態における3次元表示システムの処理を表すフローチャートである。FIG. 10 is a flowchart showing the processing of the three-dimensional display system according to the embodiment. 図11は、光学素子をレンチキュラレンズとした場合の3次元表示装置の概略構成図である。FIG. 11 is a schematic configuration diagram of a three-dimensional display device in which the optical element is a lenticular lens. 図12は、本実施形態に係る3次元表示システムを搭載したヘッドアップディスプレイシステムの例を示す図である。FIG. 12 is a diagram showing an example of a head-up display system equipped with the three-dimensional display system according to this embodiment. 図13は、図12に示すヘッドアップディスプレイシステムを搭載した移動体の例を示す図である。FIG. 13 is a diagram showing an example of a mobile body equipped with the head-up display system shown in FIG. 12.
 3次元表示装置において、利用者に3次元画像を適切に視認させることが望まれている。本開示の目的は、利用者に3次元画像を適切に視認させることができる3次元表示装置、ヘッドアップディスプレイシステム、及び移動体を提供することにある。 In a three-dimensional display device, it is desired to allow a user to appropriately view a three-dimensional image. An object of the present disclosure is to provide a three-dimensional display device, a head-up display system, and a mobile body that allow a user to appropriately visually recognize a three-dimensional image.
 以下、本開示の一実施形態について、図面を参照して説明する。 Hereinafter, an embodiment of the present disclosure will be described with reference to the drawings.
 本開示の一実施形態にかかる3次元表示システム100は、図1に示すように、検出装置1と、3次元表示装置2とを含んで構成されている。3次元表示システム100は、3次元表示装置2の表示パネル5に画像を表示させる。表示パネル5から射出された画像光の一部がバリア6によって遮光されることによって、利用者の左眼と右眼とに各々異なる画像光が到達する。利用者は、左眼で見る画像と右眼で見る画像とに互いに視差があることで、画像を立体視できる。利用者が前後に移動した場合に、3次元表示装置2は、検出装置1によって検出された利用者の眼とバリア6との距離に応じて、表示パネル5に表示させる画像の幅を調整する。これによって、3次元表示システム100は、利用者との距離の変化によらず、利用者に3次元画像を適切に視認させることができる。 As shown in FIG. 1, a three-dimensional display system 100 according to an embodiment of the present disclosure includes a detection device 1 and a three-dimensional display device 2. The three-dimensional display system 100 displays an image on the display panel 5 of the three-dimensional display device 2. Part of the image light emitted from the display panel 5 is blocked by the barrier 6, so that different image light reaches each of the left and right eyes of the user. The user can view the image stereoscopically because there is a parallax between the image viewed by the left eye and the image viewed by the right eye. When the user moves back and forth, the three-dimensional display device 2 adjusts the width of the image displayed on the display panel 5 according to the distance between the user's eyes and the barrier 6 detected by the detection device 1. Thereby, the three-dimensional display system 100 can allow the user to appropriately visually recognize the three-dimensional image regardless of changes in the distance to the user.
 検出装置1は、利用者の眼の位置を検出するように構成されている。検出装置1は、利用者の左眼及び右眼の少なくとも一方の位置を検出してよい。以下、利用者の一方の眼は、第1眼とも称される。利用者の他方の眼は、第2眼とも称される。本開示では、左眼を第1眼とし、右眼を第2眼とするが、逆であってよい。利用者の眼の位置は、例えば、3次元空間の座標で表されるが、これに限られない。検出装置1は、例えば、カメラを備えてよい。検出装置1は、カメラによって利用者の顔を撮影してよい。検出装置1は、利用者の顔の撮影画像から利用者の眼の位置を検出してよい。検出装置1は、1つのカメラによる撮影画像から、利用者の眼の位置を3次元空間の座標として検出してよい。検出装置1は、2個以上のカメラによる撮影画像から、利用者の眼の位置を3次元空間の座標として検出してよい。検出装置1は、利用者の左眼及び右眼の少なくとも一方の位置を3次元表示装置2に出力する。 The detection device 1 is configured to detect the position of the user's eyes. The detection device 1 may detect the position of at least one of the left eye and the right eye of the user. Hereinafter, one eye of the user is also referred to as a first eye. The other eye of the user is also referred to as a second eye. In the present disclosure, the left eye is the first eye and the right eye is the second eye, but they may be reversed. The position of the user's eyes is represented by, for example, coordinates in a three-dimensional space, but is not limited to this. The detection device 1 may include, for example, a camera. The detection device 1 may capture the user's face with the camera. The detection device 1 may detect the position of the user's eyes from the captured image of the user's face. The detection device 1 may detect the position of the user's eyes as coordinates in the three-dimensional space from an image captured by one camera. The detection device 1 may detect the position of the user's eyes as coordinates in the three-dimensional space from images captured by two or more cameras. The detection device 1 outputs the position of at least one of the left eye and the right eye of the user to the three-dimensional display device 2.
 検出装置1は、カメラを備えず、装置外のカメラに接続されていてよい。検出装置1は、装置外のカメラからの撮像信号を入力する入力端子を備えてよい。装置外のカメラは、入力端子に直接的に接続されてよい。装置外のカメラは、共有のネットワークを介して入力端子に間接的に接続されてよい。検出装置1は、入力端子に入力された映像信号から利用者の眼の位置を検出してよい。 The detection device 1 may be connected to a camera outside the device instead of including a camera itself. The detection device 1 may include an input terminal for inputting an image pickup signal from the camera outside the device. The camera outside the device may be directly connected to the input terminal. The camera outside the device may be indirectly connected to the input terminal via a shared network. The detection device 1 may detect the position of the user's eyes from the video signal input to the input terminal.
 検出装置1は、例えば、センサを備えてよい。センサは、超音波センサ又は光センサなどであってよい。検出装置1は、センサによって利用者の頭部の位置を検出し、頭部の位置に基づいて利用者の眼の位置を検出してよい。検出装置1は、1個以上のセンサによって、利用者の眼の位置を3次元空間の座標として検出してよい。 The detection device 1 may include a sensor, for example. The sensor may be an ultrasonic sensor or an optical sensor. The detection device 1 may detect the position of the user's head with a sensor, and may detect the position of the user's eye based on the position of the head. The detection device 1 may detect the position of the user's eyes as coordinates in the three-dimensional space by using one or more sensors.
 3次元表示システム100は、検出装置1を備えなくてよい。3次元表示システム100が検出装置1を備えない場合、3次元表示装置2は、システム外の検出装置からの信号を入力する入力端子を備えてよい。システム外の検出装置は、入力端子に直接的に接続されてよい。システム外の検出装置は、共有のネットワークを介して入力端子に間接的に接続されてよい。3次元表示装置2は、システム外の検出装置から利用者の眼の位置を取得してよい。 The three-dimensional display system 100 does not have to include the detection device 1. When the three-dimensional display system 100 does not include the detection device 1, the three-dimensional display device 2 may include an input terminal for inputting a signal from the detection device outside the system. The detection device outside the system may be directly connected to the input terminal. The detection device outside the system may be indirectly connected to the input terminal via a shared network. The three-dimensional display device 2 may acquire the position of the user's eyes from a detection device outside the system.
 3次元表示装置2は、取得部3と、照射器4と、表示パネル5と、バリア6と、コントローラ7とを含んで構成されている。 The three-dimensional display device 2 includes an acquisition unit 3, an illuminator 4, a display panel 5, a barrier 6, and a controller 7.
 取得部3は、検出装置1によって検出された利用者の左眼及び右眼の少なくとも一方の位置を取得するように構成されてよい。取得部3は、例えば、通信モジュールなどを備えてよい。取得部3は、取得した利用者の眼の位置から、利用者の眼とバリア6との距離を判定してよい。利用者の眼とバリア6との距離は、利用者の左眼及び右眼の少なくとも一方とバリア6との距離であってよい。以下、利用者の眼とバリア6との距離は、利用者の観察距離とも称される。 The acquisition unit 3 may be configured to acquire the position of at least one of the left eye and the right eye of the user detected by the detection device 1. The acquisition unit 3 may include, for example, a communication module or the like. The acquisition unit 3 may determine the distance between the user's eye and the barrier 6 from the acquired position of the user's eye. The distance between the user's eyes and the barrier 6 may be the distance between the barrier 6 and at least one of the user's left and right eyes. Hereinafter, the distance between the user's eyes and the barrier 6 is also referred to as the user's observation distance.
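As an illustrative sketch only (not part of the disclosure), determining the observation distance from an eye position expressed as three-dimensional coordinates could look like the following. The function name `observation_distance` and the datum `barrier_z` are hypothetical; the sketch assumes the depth direction is the z axis, as in the drawings.

```python
def observation_distance(eye_pos, barrier_z=0.0):
    # eye_pos = (x, y, z); the third (depth) direction is the z axis in the
    # drawings, so the observation distance is taken as the separation along z
    # between the eye and the plane of the barrier 6. barrier_z is an assumed
    # reference coordinate for that plane.
    x, y, z = eye_pos
    return abs(z - barrier_z)
```

In practice the acquisition unit 3 would apply such a computation to whichever eye position (left, right, or both) the detection device 1 supplies.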
 照射器4は、表示パネル5を面的に照射しうる。照射器4は、光源と、導光板、拡散板、及び拡散シートなどとを含んで構成されてよい。照射器4は、光源により照射光を射出し、導光板、拡散板、及び拡散シートなどにより照射光を表示パネル5の面方向に均一化する。照射器4は均一化された光を表示パネル5に射出しうる。 The illuminator 4 can illuminate the display panel 5 in a planar manner. The illuminator 4 may include a light source, a light guide plate, a diffusion plate, a diffusion sheet, and the like. The illuminator 4 emits irradiation light from the light source, and uniformizes the irradiation light in the surface direction of the display panel 5 using the light guide plate, the diffusion plate, the diffusion sheet, and the like. The illuminator 4 can emit the uniformized light toward the display panel 5.
 表示パネル5は、例えば、透過型の液晶表示パネルなどの表示パネルであるが、これに限られない。図2に示すように、表示パネル5は、面状に形成されたアクティブエリア51上に複数の区画領域を有する。アクティブエリア51は、混合画像を表示する。アクティブエリア51は、表示面とも称される。混合画像は、左眼画像と、左眼画像に対して視差を有する右眼画像とを含む。以下、左眼画像は、第1画像とも称される。右眼画像は、第2画像とも称される。混合画像は、詳細は後述するが、第3画像を更に含んでよい。区画領域は、格子状のブラックマトリックス52により第1方向及び第1方向に直交する第2方向に区画された領域である。第1方向及び第2方向に直交する方向は第3方向と称される。第1方向は水平方向と称されてよい。第2方向は鉛直方向と称されてよい。第3方向は奥行方向と称されてよい。しかし、第1方向、第2方向、及び第3方向はそれぞれこれらに限られない。図面において、第1方向はx軸方向として表され、第2方向はy軸方向として表され、第3方向はz軸方向として表される。 The display panel 5 is, for example, a display panel such as a transmissive liquid crystal display panel, but is not limited to this. As shown in FIG. 2, the display panel 5 has a plurality of partitioned areas on a planar active area 51. The active area 51 displays a mixed image. The active area 51 is also referred to as a display surface. The mixed image includes a left-eye image and a right-eye image having a parallax with respect to the left-eye image. Hereinafter, the left-eye image is also referred to as the first image. The right eye image is also referred to as the second image. The mixed image, which will be described in detail later, may further include a third image. The partitioned area is an area partitioned by the grid-like black matrix 52 in the first direction and the second direction orthogonal to the first direction. The direction orthogonal to the first direction and the second direction is referred to as the third direction. The first direction may be referred to as the horizontal direction. The second direction may be referred to as the vertical direction. The third direction may be referred to as the depth direction. However, the first direction, the second direction, and the third direction are not limited to these. In the drawings, the first direction is represented as the x-axis direction, the second direction is represented as the y-axis direction, and the third direction is represented as the z-axis direction.
 区画領域の各々には、1つのサブピクセルが対応する。したがって、アクティブエリア51は、水平方向及び鉛直方向に沿って格子状に配列された複数のサブピクセルを備える。 Each partitioned area corresponds to one sub-pixel. Therefore, the active area 51 includes a plurality of sub-pixels arranged in a grid along the horizontal and vertical directions.
 各サブピクセルは、R(Red)、G(Green)、及びB(Blue)のいずれかの色に対応する、R、G、及びBの3つのサブピクセルを一組として、1ピクセルが構成されてよい。1ピクセルは、1画素とも称される。水平方向は、例えば、1ピクセルを構成する複数のサブピクセルが並ぶ方向である。鉛直方向は、例えば、同じ色のサブピクセルが並ぶ方向である。表示パネル5は、透過型の液晶パネルに限られず、有機EL(Electro Luminescence)など、他の表示パネルであってよい。表示パネル5が自発光型の表示パネルである場合、3次元表示装置2は照射器4を備えなくてよい。 Each sub-pixel corresponds to one of the colors R (Red), G (Green), and B (Blue), and one pixel may be composed of a set of three sub-pixels R, G, and B. One pixel is also referred to as a single picture element. The horizontal direction is, for example, the direction in which the plurality of sub-pixels forming one pixel are arranged. The vertical direction is, for example, the direction in which sub-pixels of the same color are arranged. The display panel 5 is not limited to a transmissive liquid crystal panel, and may be another display panel such as an organic EL (Electro Luminescence) panel. When the display panel 5 is a self-luminous display panel, the three-dimensional display device 2 does not need to include the illuminator 4.
 アクティブエリア51に配列された複数のサブピクセルは、サブピクセル群Pgを構成しうる。サブピクセル群Pgは、後述するコントローラ7がアクティブエリア51に画像を表示するための制御を行う最小単位である。コントローラ7は、1つのサブピクセル群Pgに含まれる複数のサブピクセルに、左眼画像又は右眼画像を表示させる。1つのサブピクセル群Pgにおいて、左眼画像を表示させるサブピクセルと、右眼画像を表示させるサブピクセルとは同数とされてよい。 A plurality of sub-pixels arranged in the active area 51 can form a sub-pixel group Pg. The sub-pixel group Pg is a minimum unit in which the controller 7, which will be described later, performs control for displaying an image in the active area 51. The controller 7 causes the plurality of sub-pixels included in one sub-pixel group Pg to display the left-eye image or the right-eye image. In one subpixel group Pg, the number of subpixels displaying the left-eye image and the number of subpixels displaying the right-eye image may be the same.
 アクティブエリア51において、サブピクセル群Pgは、水平方向に繰り返して配列されてよい。サブピクセル群Pgは、鉛直方向においては、水平方向に1サブピクセル分ずれた位置に隣接して繰り返して配列されてよい。サブピクセル群Pgは、所定の行及び列のサブピクセルを含んでよい。具体的には、サブピクセル群Pgは、鉛直方向にb個(b行)、水平方向に2×n個(2×n列)、連続して配列された2×n×b個のサブピクセルP(1)~P(2×n×b)を含んでよい。nは片眼画像を構成するサブピクセルの数であってよい。図2に示す例では、n=6、b=1である。アクティブエリア51には、鉛直方向に1個、水平方向に12個、連続して配列された12個のサブピクセルP1~P12を含むサブピクセル群Pgが配置される。 In the active area 51, the sub-pixel groups Pg may be repeatedly arranged in the horizontal direction. In the vertical direction, the sub-pixel groups Pg may be repeatedly arranged adjacent to positions shifted horizontally by one sub-pixel. The sub-pixel group Pg may include sub-pixels in predetermined rows and columns. Specifically, the sub-pixel group Pg may include 2×n×b sub-pixels P(1) to P(2×n×b), arranged consecutively in b rows in the vertical direction and 2×n columns in the horizontal direction. Here, n may be the number of sub-pixels forming a single-eye image. In the example shown in FIG. 2, n=6 and b=1. In the active area 51, sub-pixel groups Pg each including twelve sub-pixels P1 to P12 arranged consecutively, one in the vertical direction and twelve in the horizontal direction, are arranged.
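As a non-limiting sketch (not part of the disclosure), the repeating arrangement of the sub-pixel group Pg for the b=1 case of FIG. 2 can be modeled as follows. The helper `subpixel_label` is hypothetical, and the one-sub-pixel horizontal shift per row follows the description above.

```python
def subpixel_label(column, row, n=6):
    # Index P1..P(2*n) of a sub-pixel within the repeating group Pg (b = 1).
    # Horizontally the group repeats every 2*n sub-pixels; each successive
    # row is shifted by one sub-pixel, per the arrangement described above.
    return ((column + row) % (2 * n)) + 1
```

For n=6 this yields the twelve labels P1 to P12 repeating every twelve columns, with each row's pattern offset by one column relative to the row above.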
 全てのサブピクセル群Pgに含まれるサブピクセルP(1)~P(2×n×b)は、コントローラ7によって一括して制御されてよい。例えば、コントローラ7は、サブピクセルP1に表示させる画像を左眼画像から右眼画像に切り替える場合、全てのサブピクセル群Pgに含まれるサブピクセルP1に表示させる画像を左眼画像から右眼画像に同時的に切り替えてよい。コントローラ7は、サブピクセル群Pgに含まれるサブピクセルの数を変更してよい。 The sub-pixels P(1) to P(2×n×b) included in all the sub-pixel groups Pg may be collectively controlled by the controller 7. For example, when switching the image displayed on the sub-pixels P1 from the left-eye image to the right-eye image, the controller 7 may simultaneously switch the images displayed on the sub-pixels P1 included in all the sub-pixel groups Pg from the left-eye image to the right-eye image. The controller 7 may change the number of sub-pixels included in the sub-pixel group Pg.
 再び図1を参照して、バリア6は、アクティブエリア51に沿う平面により形成され、アクティブエリア51から所定距離(ギャップ)gだけ離れて配置されている。バリア6は、例えば、パララックスバリアであるが、これに限られず、任意の光学素子であってよい。バリア6は、表示パネル5に対して照射器4の反対側に位置してよい。 Referring again to FIG. 1, the barrier 6 is formed by a flat surface along the active area 51, and is arranged apart from the active area 51 by a predetermined distance (gap) g. The barrier 6 is, for example, a parallax barrier, but is not limited to this and may be any optical element. The barrier 6 may be located on the opposite side of the illuminator 4 with respect to the display panel 5.
 バリア6は、表示パネル5から射出される画像光の光線方向を規定するように構成されてよい。バリア6は、図3に示すように、複数の、画像光を遮光する遮光面61を有する。複数の遮光面61は、互いに隣り合う遮光面61の間の透光領域62を画定する。透光領域62は、遮光面61に比べて光透過率が高い。遮光面61は、透光領域62に比べて光透過率が低い。以下、透光領域62は、第1透光領域とも称される。遮光面61は、第2透光領域とも称される。 The barrier 6 may be configured to define the light ray direction of the image light emitted from the display panel 5. As shown in FIG. 3, the barrier 6 has a plurality of light blocking surfaces 61 that block image light. The plurality of light shielding surfaces 61 define a light transmitting area 62 between the light shielding surfaces 61 adjacent to each other. The light transmitting region 62 has a higher light transmittance than the light shielding surface 61. The light shielding surface 61 has a lower light transmittance than the light transmitting area 62. Hereinafter, the light transmitting area 62 is also referred to as a first light transmitting area. The light shielding surface 61 is also referred to as a second light transmitting area.
 透光領域62は、バリア6に入射する光を透過させる部分である。透光領域62は、第1透過率で光を透過させてよい。第1透過率は、例えば略100%であるが、これに限られず、表示パネル5から射出される画像光が良好に視認できる範囲の値であってよい。第1透過率は、例えば、80%以上、又は50%以上などとされうる。 The light-transmitting area 62 is a portion for transmitting light incident on the barrier 6. The translucent region 62 may transmit light at the first transmittance. The first transmittance is, for example, about 100%, but is not limited to this, and may be a value in a range in which the image light emitted from the display panel 5 can be visually recognized well. The first transmittance may be, for example, 80% or more, or 50% or more.
 遮光面61は、バリア6に入射する光を遮って殆ど透過させない部分である。即ち、遮光面61は、表示パネル5のアクティブエリア51に表示される画像が利用者の眼に到達することを遮る。遮光面61は、第2透過率で光を透過させてよい。第2透過率は、例えば略0%であるが、これに限られず、0%より大きく、0.5%、1%又は3%など、0%に近い値であってよい。第1透過率は、第2透過率より数倍以上、例えば、10倍以上大きい値とされうる。 The light-shielding surface 61 is a portion that blocks light that enters the barrier 6 and hardly transmits it. That is, the light blocking surface 61 blocks the image displayed in the active area 51 of the display panel 5 from reaching the eyes of the user. The light shielding surface 61 may transmit light at the second transmittance. The second transmittance is, for example, approximately 0%, but is not limited to this, and may be a value greater than 0% and a value close to 0% such as 0.5%, 1%, or 3%. The first transmittance can be set to a value that is several times or more, for example, 10 times or more larger than the second transmittance.
 バリア6において、透光領域62は、面内の所定方向に伸びる複数の帯状領域であってよい。透光領域62は、サブピクセルから射出される画像光が伝播する方向である、光線方向を規定する。所定方向は、鉛直方向と0度又は90度でない所定角度をなす方向である。透光領域62と遮光面61とは、アクティブエリア51に沿う所定方向に延び、所定方向と直交する方向に繰り返し交互に配列されてよい。 In the barrier 6, the translucent region 62 may be a plurality of strip-shaped regions extending in a predetermined direction in the plane. The translucent area 62 defines the light ray direction, which is the direction in which the image light emitted from the sub-pixels propagates. The predetermined direction is a direction that forms a predetermined angle that is not 0 degree or 90 degrees with the vertical direction. The light-transmitting regions 62 and the light-shielding surfaces 61 may extend in a predetermined direction along the active area 51 and may be repeatedly and alternately arranged in a direction orthogonal to the predetermined direction.
 バリア6は、遮光面61及び透光領域62によって、表示パネル5から射出される画像光の光線方向を規定するように構成されてよい。図1に示すように、バリア6が、アクティブエリア51に配列されたサブピクセルから射出された画像光を規定することによって、利用者の眼が視認可能なアクティブエリア51上の領域が定まる。以降において、アクティブエリア51内の領域のうち、利用者の眼の位置に伝播する画像光を射出するアクティブエリア51内の領域は可視領域51aと称される。また、利用者の左眼の位置に伝播する画像光を射出するアクティブエリア51内の領域は左可視領域51aLと称される。左可視領域51aLは、第1可視領域とも称される。利用者の右眼の位置に伝播する画像光を射出するアクティブエリア51内の領域は右可視領域51aRと称される。右可視領域51aRは、第2可視領域とも称される。 The barrier 6 may be configured to define the light ray direction of the image light emitted from the display panel 5 by the light-shielding surfaces 61 and the translucent regions 62. As shown in FIG. 1, the barrier 6 defines the image light emitted from the sub-pixels arranged in the active area 51, thereby determining the region on the active area 51 that the user's eyes can visually recognize. Hereinafter, among the regions in the active area 51, the region that emits the image light propagating to the position of the user's eyes is referred to as a visible region 51a. The region in the active area 51 that emits the image light propagating to the position of the user's left eye is referred to as a left visible region 51aL. The left visible region 51aL is also referred to as a first visible region. The region in the active area 51 that emits the image light propagating to the position of the user's right eye is referred to as a right visible region 51aR. The right visible region 51aR is also referred to as a second visible region.
 図1に示すように、利用者の左眼及び右眼は、バリア6から適視距離dだけ離れて位置すると仮定する。適視距離dは、OVD(Optimum Viewing Distance,最適観察距離)と称される。透光領域62の水平方向における配置間隔であるバリアピッチBp、及びアクティブエリア51とバリア6との間のギャップgは、サブピクセルの水平方向の長さHp、片眼画像を構成するサブピクセルの数n、適視距離d、及び眼間距離Eを用いた次の式(1)及び式(2)が成り立つように規定される。
 E:d=(n×Hp):g   式(1)
 d:Bp=(d+g):(2×n×Hp)   式(2)
As shown in FIG. 1, it is assumed that the left and right eyes of the user are located at the optimum viewing distance d from the barrier 6. The optimum viewing distance d is referred to as the OVD (Optimum Viewing Distance). The barrier pitch Bp, which is the horizontal arrangement interval of the translucent regions 62, and the gap g between the active area 51 and the barrier 6 are determined so that the following equations (1) and (2) hold, using the horizontal length Hp of a sub-pixel, the number n of sub-pixels forming a single-eye image, the optimum viewing distance d, and the interocular distance E.
E:d=(n×Hp):g Formula (1)
d:Bp=(d+g):(2×n×Hp) Formula (2)
 適視距離dは、可視領域51aの水平方向の長さがサブピクセルn個分となる利用者の左眼及び右眼の少なくとも一方とバリア6との間の距離である。眼間距離Eは、利用者の左眼と右眼との間の距離である。眼間距離Eは、利用者の眼の位置から算出された値であってよく、或いは予め設定された値であってよい。予め設定される場合、眼間距離Eは、例えば、産業技術総合研究所の研究によって算出された値である61.1mm~64.4mmの値とされてよい。Hpは、図2に示すような、サブピクセルの水平方向の長さである。 The optimum viewing distance d is the distance between the barrier 6 and at least one of the user's left and right eyes at which the horizontal length of the visible region 51a corresponds to n sub-pixels. The interocular distance E is the distance between the user's left eye and right eye. The interocular distance E may be a value calculated from the position of the user's eyes, or may be a preset value. When preset, the interocular distance E may be set to a value of 61.1 mm to 64.4 mm, values calculated by research of the National Institute of Advanced Industrial Science and Technology. Hp is the horizontal length of a sub-pixel, as shown in FIG. 2.
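Equations (1) and (2) can be solved for the gap g and the barrier pitch Bp. The following sketch (with illustrative numerical values, not values from the disclosure) assumes all lengths are in consistent units:

```python
def barrier_geometry(E, d, n, Hp):
    """Solve equations (1) and (2) for the gap g and the barrier pitch Bp.

    Eq. (1): E : d  = (n*Hp) : g        ->  g  = d*n*Hp / E
    Eq. (2): d : Bp = (d+g) : (2*n*Hp)  ->  Bp = 2*n*Hp*d / (d+g)
    """
    g = d * n * Hp / E              # gap between active area 51 and barrier 6
    Bp = 2 * n * Hp * d / (d + g)   # horizontal pitch of translucent regions 62
    return g, Bp
```

Because d/(d+g) is slightly less than 1, the barrier pitch Bp comes out slightly smaller than the group width 2×n×Hp, which is what makes the left and right visible regions converge at the eyes at the optimum viewing distance.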
 バリア6は、第2透過率を有する部材で構成されてよい。バリア6は、例えば、フィルム又は板状部材で構成されてよい。この場合、遮光面61は、フィルム又は板状部材で構成されている。透光領域62は、フィルム又は板状部材に設けられた開口で構成されている。フィルムは、例えば、樹脂で構成されているが、これに限られない。板状部材は、例えば、樹脂又は金属などで構成されているが、これに限られない。バリア6は、遮光性を有する基材で構成されてよく、或いは遮光性を有する添加物を含有する基材で構成されてよい。 The barrier 6 may be composed of a member having the second transmittance. The barrier 6 may be composed of, for example, a film or a plate-shaped member. In this case, the light shielding surface 61 is made of a film or a plate member. The translucent area 62 is composed of an opening provided in the film or the plate member. The film is made of, for example, resin, but is not limited to this. The plate member is made of, for example, resin or metal, but is not limited to this. The barrier 6 may be made of a light-shielding base material, or may be made of a base material containing a light-shielding additive.
 バリア6は、液晶シャッターで構成されてよい。液晶シャッターは、印加する電圧に応じて光の透過率を制御しうる。液晶シャッターは、複数の画素で構成され、各画素における光の透過率を制御してよい。液晶シャッターは、光の透過率が高い領域又は光の透過率が低い領域を任意の形状に形成しうる。バリア6が液晶シャッターで構成されている場合、透光領域62は、第1透過率を有する領域とされてよい。バリア6が液晶シャッターで構成されている場合、遮光面61は、第2透過率を有する領域とされてよい。 The barrier 6 may be composed of a liquid crystal shutter. The liquid crystal shutter can control the light transmittance according to the applied voltage. The liquid crystal shutter may include a plurality of pixels and may control the light transmittance of each pixel. The liquid crystal shutter can form a region having a high light transmittance or a region having a low light transmittance in an arbitrary shape. When the barrier 6 is composed of a liquid crystal shutter, the light transmitting region 62 may be a region having the first transmittance. When the barrier 6 is composed of a liquid crystal shutter, the light shielding surface 61 may be a region having the second transmittance.
 バリア6は、上述した構成を有することで、アクティブエリア51の一部のサブピクセルから射出された画像光を、透光領域62を通過させ利用者の右眼に伝搬させることができる。バリア6は、他の一部のサブピクセルから射出された画像光を、透光領域62を通過させ利用者の左眼に伝搬させることができる。画像光が利用者の左眼及び右眼の各々に伝播されることによって、利用者の眼に視認される画像について、図4及び図5を参照して詳細に説明する。 The barrier 6 having the above-described configuration allows the image light emitted from a part of the sub-pixels of the active area 51 to pass through the transparent region 62 and be propagated to the right eye of the user. The barrier 6 can transmit the image light emitted from some other sub-pixels through the translucent region 62 and propagate to the left eye of the user. An image visually recognized by the user's eye by propagating the image light to each of the user's left and right eyes will be described in detail with reference to FIGS. 4 and 5.
 図4に示す左可視領域51aLは、上述のように、バリア6の透光領域62を透過した画像光が利用者の左眼に到達することによって、利用者の左眼が視認するアクティブエリア51上の領域である。左不可視領域51bLは、バリア6の遮光面61によって画像光が遮られることによって、利用者の左眼が視認することのできない領域である。図4において、左可視領域51aLには、サブピクセルP1の半分と、サブピクセルP2~P6の全体と、サブピクセルP7の半分とが含まれる。 As described above, the left visible region 51aL shown in FIG. 4 is the region on the active area 51 that the user's left eye visually recognizes when the image light transmitted through the translucent regions 62 of the barrier 6 reaches the user's left eye. The left invisible region 51bL is a region that the user's left eye cannot visually recognize because the image light is blocked by the light-shielding surfaces 61 of the barrier 6. In FIG. 4, the left visible region 51aL includes half of the sub-pixel P1, all of the sub-pixels P2 to P6, and half of the sub-pixel P7.
 図5に示す右可視領域51aRは、バリア6の透光領域62を透過した他の一部のサブピクセルからの画像光が利用者の右眼に到達することによって、利用者の右眼が視認するアクティブエリア51上の領域である。右不可視領域51bRは、バリア6の遮光面61によって画像光が遮られることによって、利用者の右眼が視認することのできない領域である。図5において、右可視領域51aRには、サブピクセルP7の半分と、サブピクセルP8~P12の全体と、サブピクセルP1の半分とが含まれる。 The right visible region 51aR shown in FIG. 5 is the region on the active area 51 that the user's right eye visually recognizes when the image light from some other sub-pixels, transmitted through the translucent regions 62 of the barrier 6, reaches the user's right eye. The right invisible region 51bR is a region that the user's right eye cannot visually recognize because the image light is blocked by the light-shielding surfaces 61 of the barrier 6. In FIG. 5, the right visible region 51aR includes half of the sub-pixel P7, all of the sub-pixels P8 to P12, and half of the sub-pixel P1.
 サブピクセルP1~P6に左眼画像が表示され、サブピクセルP7~P12に右眼画像が表示されると、左眼及び右眼は各々画像を視認する。左眼画像及び右眼画像は互いに視差を有する視差画像である。具体的には、左眼は、サブピクセルP1に表示された左眼画像の半分と、サブピクセルP2~P6に表示された左眼画像の全体と、サブピクセルP7に表示された右眼画像の半分とを視認する。右眼は、サブピクセルP7に表示された右眼画像の半分と、サブピクセルP8~P12に表示された右眼画像の全体と、サブピクセルP1に表示された左眼画像の半分とを視認する。図4及び図5において、左眼画像を表示するサブピクセルには符号「L」が付され、右眼画像を表示するサブピクセルには符号「R」が付されている。 When the left-eye image is displayed on the sub-pixels P1 to P6 and the right-eye image is displayed on the sub-pixels P7 to P12, the left eye and the right eye each visually recognize an image. The left-eye image and the right-eye image are parallax images having a parallax with each other. Specifically, the left eye visually recognizes half of the left-eye image displayed on the sub-pixel P1, the entire left-eye images displayed on the sub-pixels P2 to P6, and half of the right-eye image displayed on the sub-pixel P7. The right eye visually recognizes half of the right-eye image displayed on the sub-pixel P7, the entire right-eye images displayed on the sub-pixels P8 to P12, and half of the left-eye image displayed on the sub-pixel P1. In FIGS. 4 and 5, the sub-pixels displaying the left-eye image are labeled “L”, and the sub-pixels displaying the right-eye image are labeled “R”.
 この状態において、利用者の左眼が視認する左眼画像の領域は最大となり、右眼画像の面積は最小となる。利用者の右眼が視認する右眼画像の領域は最大となり、左眼画像の面積は最小となる。利用者の左眼が右眼画像を視認し、或いは利用者の右眼が左眼画像を視認することは、クロストークとも称される。利用者は、クロストークが低減された状態で3次元画像を視認することができる。 In this state, the area of the left-eye image visually recognized by the left eye of the user is at its maximum, and the area of the right-eye image is at its minimum. The area of the right-eye image visually recognized by the right eye of the user is at its maximum, and the area of the left-eye image is at its minimum. The user's left eye visually recognizing the right-eye image, or the user's right eye visually recognizing the left-eye image, is also referred to as crosstalk. The user can visually recognize a three-dimensional image with crosstalk reduced.
 上述のように、互いに視差を有する左眼画像及び右眼画像が左可視領域51aL及び右可視領域51aRの各々に含まれるサブピクセルに表示されると、適視距離dに位置する利用者は表示パネル5に表示された画像を3次元画像として視認しうる。上述した構成では、左眼によって半分以上が視認されるサブピクセルに左眼画像が表示され、右眼によって半分以上が視認されるサブピクセルに右眼画像が表示された。これに限られず、左眼画像及び右眼画像を表示させるサブピクセルは、アクティブエリア51、及びバリア6などの設計に応じて、クロストークが低減されるように左可視領域51aL及び右可視領域51aRに基づいて適宜判定されてよい。例えば、バリア6の開口率などに応じて、左眼によって所定割合以上が視認されるサブピクセルに左眼画像を表示させ、右眼によって所定割合以上が視認されるサブピクセルに右眼画像を表示させてよい。 As described above, when the left-eye image and the right-eye image having a parallax with each other are displayed on the sub-pixels included in the left visible region 51aL and the right visible region 51aR, respectively, a user located at the proper viewing distance d can visually recognize the image displayed on the display panel 5 as a three-dimensional image. In the configuration described above, the left-eye image is displayed on the sub-pixels more than half of which are visible to the left eye, and the right-eye image is displayed on the sub-pixels more than half of which are visible to the right eye. The sub-pixels on which the left-eye image and the right-eye image are displayed are not limited to these, and may be determined as appropriate based on the left visible region 51aL and the right visible region 51aR so that crosstalk is reduced, according to the design of the active area 51, the barrier 6, and the like. For example, according to the aperture ratio of the barrier 6 or the like, the left-eye image may be displayed on the sub-pixels a predetermined proportion or more of which is visible to the left eye, and the right-eye image may be displayed on the sub-pixels a predetermined proportion or more of which is visible to the right eye.
 コントローラ7は、3次元表示システム100の各構成要素に接続され、各構成要素を制御しうる。コントローラ7は、例えばプロセッサとして構成されている。コントローラ7は、1以上のプロセッサを含んでよい。プロセッサは、特定のプログラムを読み込ませて特定の機能を実行する汎用のプロセッサ、及び特定の処理に特化した専用のプロセッサを含んでよい。専用のプロセッサは、特定用途向けIC(ASIC:Application Specific Integrated Circuit)を含んでよい。プロセッサは、プログラマブルロジックデバイス(PLD:Programmable Logic Device)を含んでよい。PLDは、FPGA(Field-Programmable Gate Array)を含んでよい。コントローラ7は、1つ又は複数のプロセッサが協働するSoC(System-on-a-Chip)、及びSiP(System In a Package)のいずれかであってよい。コントローラ7は、記憶部を備え、記憶部に各種情報、又は3次元表示システム100の各構成要素を動作させるためのプログラムなどを格納してよい。記憶部は、例えば半導体メモリなどで構成されてよい。記憶部は、コントローラ7のワークメモリとして機能してよい。 The controller 7 is connected to each component of the three-dimensional display system 100 and can control each component. The controller 7 is configured, for example, as a processor. The controller 7 may include one or more processors. The processors may include a general-purpose processor that loads a specific program to execute a specific function, and a dedicated processor specialized for specific processing. The dedicated processor may include an application-specific integrated circuit (ASIC). The processor may include a programmable logic device (PLD). The PLD may include an FPGA (Field-Programmable Gate Array). The controller 7 may be either an SoC (System-on-a-Chip) or a SiP (System In a Package) in which one or more processors cooperate. The controller 7 may include a storage unit, and the storage unit may store various kinds of information, programs for operating each component of the three-dimensional display system 100, and the like. The storage unit may be composed of, for example, a semiconductor memory. The storage unit may function as a work memory of the controller 7.
 以下、コントローラ7による、3次元表示システム100の各構成要素の制御について、説明する。 The control of each component of the three-dimensional display system 100 by the controller 7 will be described below.
(観察距離が適視距離である場合)
 図1を参照して、利用者が適視距離dに位置する場合における、コントローラ7による、3次元表示システム100の各構成要素の制御について、説明する。コントローラ7は、利用者の左眼及び右眼の少なくとも一方の位置に基づいて、表示パネル5のアクティブエリア51内の左可視領域51aL及び右可視領域51aRを判定する。コントローラ7は、例えば、利用者の左眼の位置に基づいて、左眼の位置から水平方向に所定の眼間距離Eだけ移動した位置に利用者の右眼があると判定してよい。コントローラは、各々の透光領域62を通過した画像光が利用者の左眼及び右眼に到達するように、左可視領域51aLと右可視領域51aRを判定してよい。
(When the observation distance is the proper viewing distance)
With reference to FIG. 1, the control of each component of the three-dimensional display system 100 by the controller 7 when the user is located at the proper viewing distance d will be described. The controller 7 determines the left visible region 51aL and the right visible region 51aR in the active area 51 of the display panel 5 based on the position of at least one of the left eye and the right eye of the user. For example, the controller 7 may determine, based on the position of the left eye of the user, that the right eye of the user is at a position moved horizontally from the left eye by a predetermined inter-eye distance E. The controller may determine the left visible region 51aL and the right visible region 51aR so that the image light that has passed through each translucent region 62 reaches the left eye and the right eye of the user.
 利用者の観察距離が適視距離dである場合、1つの透光領域62における左可視領域51aLと右可視領域51aRとは共に水平方向の長さがサブピクセルn個分となる。そのため、図1に示すように、左可視領域51aLと右可視領域51aRとは重ならず、表示パネル5の表示面において水平方向に交互に並ぶ。コントローラ7は、左可視領域51aLに含まれるサブピクセルを左サブピクセルと判定する。左サブピクセルは、例えば、左可視領域51aLに所定割合以上が含まれるサブピクセルである。左サブピクセルは、第1サブピクセル、又は第1表示領域とも称される。また、コントローラ7は、右可視領域51aRに含まれるサブピクセルを右サブピクセルと判定する。右サブピクセルは、例えば、右可視領域51aRに所定割合以上が含まれるサブピクセルである。右サブピクセルは、第2サブピクセル、又は第2表示領域とも称される。図1に示すように、左サブピクセルと右サブピクセルとは重ならず、表示パネル5の表示面において水平方向に交互に並ぶ。 When the observation distance of the user is the proper viewing distance d, the left visible region 51aL and the right visible region 51aR for one translucent region 62 both have a horizontal length of n sub-pixels. Therefore, as shown in FIG. 1, the left visible region 51aL and the right visible region 51aR do not overlap and alternate in the horizontal direction on the display surface of the display panel 5. The controller 7 determines that a sub-pixel included in the left visible region 51aL is a left sub-pixel. The left sub-pixel is, for example, a sub-pixel a predetermined proportion or more of which is included in the left visible region 51aL. The left sub-pixel is also referred to as a first sub-pixel or a first display region. Likewise, the controller 7 determines that a sub-pixel included in the right visible region 51aR is a right sub-pixel. The right sub-pixel is, for example, a sub-pixel a predetermined proportion or more of which is included in the right visible region 51aR. The right sub-pixel is also referred to as a second sub-pixel or a second display region. As shown in FIG. 1, the left sub-pixels and the right sub-pixels do not overlap and alternate in the horizontal direction on the display surface of the display panel 5.
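The classification just described, in which a sub-pixel counts as a left or right sub-pixel when at least a predetermined proportion of it falls inside the corresponding visible region, can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the fraction inputs and the one-half threshold are assumptions.

```python
# Hypothetical sketch: classify a sub-pixel from the fraction of its
# width that lies inside each eye's visible region. The one-half
# threshold (a "predetermined proportion") is an assumed example value.

def classify_subpixel(frac_in_left, frac_in_right, threshold=0.5):
    """Return 'left', 'right', 'both' (a third sub-pixel) or None."""
    is_left = frac_in_left >= threshold
    is_right = frac_in_right >= threshold
    if is_left and is_right:
        return "both"   # visible to both eyes (occurs at short distances)
    if is_left:
        return "left"
    if is_right:
        return "right"
    return None
```

At the proper viewing distance the two visible regions do not overlap, so the "both" case never occurs there; it corresponds to the third sub-pixels that appear at shorter observation distances, as described later in this section.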
 図6は、1つの透光領域62aを通過した画像光によって、適視距離dに位置する利用者の眼から視認されるサブピクセルを示す図である。以下、サブピクセル群Pgには水平方向に連続して配置された12個のサブピクセルP1~P12が含まれるものとして説明する。図6には、バリア6から適視距離dだけ離れて位置する利用者の右眼又は左眼が所定のサブピクセルを視認することができる視認領域70が示される。1つの視認領域70では水平方向に連続する6個のサブピクセルが視認される。図6では、利用者の左眼及び右眼が透光領域62aを介して視認できる範囲は、破線によって表されている。例えば、利用者の左眼が視認領域70Aに含まれる位置L1にあるとき、利用者の左眼は、透光領域62aを介して、サブピクセルP1~P6を視認する。利用者の左眼が位置L1の位置から水平方向に移動して、視認領域70Aから視認領域70Bに移ると、利用者の左眼によって視認されるサブピクセルも変化する。利用者の左眼が視認領域70Bに含まれる位置L2にあるとき、利用者の左眼は、透光領域62aを介して、サブピクセルP2~P7を視認する。隣り合う視認領域70において利用者の眼から視認されるサブピクセルには、互いに1サブピクセル分の差異が生じる。 FIG. 6 is a diagram showing the sub-pixels visually recognized, via image light that has passed through one translucent region 62a, by the eyes of a user located at the proper viewing distance d. In the following description, the sub-pixel group Pg is assumed to include 12 sub-pixels P1 to P12 arranged consecutively in the horizontal direction. FIG. 6 shows visual recognition regions 70 from which the right eye or the left eye of a user, located at the proper viewing distance d from the barrier 6, can visually recognize predetermined sub-pixels. From one visual recognition region 70, six horizontally consecutive sub-pixels are visible. In FIG. 6, the ranges that the left and right eyes of the user can visually recognize through the translucent region 62a are represented by broken lines. For example, when the left eye of the user is at the position L1 included in the visual recognition region 70A, the left eye visually recognizes the sub-pixels P1 to P6 through the translucent region 62a. When the left eye of the user moves horizontally from the position L1 and passes from the visual recognition region 70A to the visual recognition region 70B, the sub-pixels visually recognized by the left eye also change. When the left eye of the user is at the position L2 included in the visual recognition region 70B, the left eye visually recognizes the sub-pixels P2 to P7 through the translucent region 62a. The sub-pixels visually recognized from adjacent visual recognition regions 70 thus differ from each other by one sub-pixel.
 上述のとおり、3次元表示システム100では、サブピクセル群Pgの水平方向に並ぶ2×n個のサブピクセルのうち、異なるn個ずつのサブピクセルが適視距離dにある利用者の左眼及び右眼で視認されるように、バリアピッチBp、及びギャップgが規定される。即ち、3次元表示システム100は、適視距離dに位置する利用者の左眼及び右眼で視認されるサブピクセルの領域に、nサブピクセル分だけ差異が生じるように構成されている。したがって、図6において、左眼が、透光領域62aを介してサブピクセルP1~P6を視認する視認領域70Aに含まれる位置L1にある場合、右眼は、サブピクセルP7~P12を視認する視認領域70Cに含まれる位置R1にある。このとき、コントローラ7は、サブピクセルP1~P6を左サブピクセルとし、利用者の左眼で視認される左眼画像を表示させる。コントローラ7は、サブピクセルP7~P12を右サブピクセルとし、利用者の右眼で視認される右眼画像を表示させる。これによって、透光領域62aを通過した画像光が利用者の左眼及び右眼に到達すると、利用者は、クロストークが低減された状態で3次元画像を視認することができる。利用者の左眼と右眼との間の距離である眼間距離Eは、n個の視認領域70の距離に相当する。即ち、1つの視認領域70の幅はE/nである。 As described above, in the three-dimensional display system 100, the barrier pitch Bp and the gap g are defined so that, of the 2×n sub-pixels arranged in the horizontal direction in the sub-pixel group Pg, a different set of n sub-pixels is visually recognized by each of the left eye and the right eye of a user at the proper viewing distance d. That is, the three-dimensional display system 100 is configured so that the regions of sub-pixels visually recognized by the left eye and the right eye of a user located at the proper viewing distance d differ by n sub-pixels. Therefore, in FIG. 6, when the left eye is at the position L1 included in the visual recognition region 70A, from which the sub-pixels P1 to P6 are visible through the translucent region 62a, the right eye is at the position R1 included in the visual recognition region 70C, from which the sub-pixels P7 to P12 are visible. At this time, the controller 7 sets the sub-pixels P1 to P6 as left sub-pixels and causes them to display the left-eye image visually recognized by the left eye of the user. The controller 7 sets the sub-pixels P7 to P12 as right sub-pixels and causes them to display the right-eye image visually recognized by the right eye of the user. As a result, when the image light that has passed through the translucent region 62a reaches the left and right eyes of the user, the user can visually recognize a three-dimensional image with crosstalk reduced. The inter-eye distance E, which is the distance between the left eye and the right eye of the user, corresponds to the span of n visual recognition regions 70. That is, the width of one visual recognition region 70 is E/n.
 コントローラ7は、取得部3によって取得した利用者の眼の位置に応じて、右眼画像又は左眼画像を表示させるサブピクセルを変更してよい。例えば、利用者が適視距離dにおいて水平方向に移動し、利用者の左眼が位置L1から位置L2に移動したとする。このとき、コントローラ7は、例えば、利用者の左眼の位置から、利用者の左眼がサブピクセルP2~P7を視認する視認領域70Bに位置すると判定する。コントローラ7は、利用者の左眼の位置から、左眼から眼間距離Eだけ離れた右眼がサブピクセルP8~P12及びP1を視認する視認領域70Dに位置すると判定する。コントローラ7は、サブピクセルP2~P7を左サブピクセルとして、サブピクセルP8~P12及びP1を右サブピクセルとする。これによって、利用者は、クロストークが低減された状態で3次元画像を視認することができる。 The controller 7 may change the sub-pixels on which the right-eye image or the left-eye image is displayed according to the position of the user's eyes acquired by the acquisition unit 3. For example, suppose that the user moves horizontally at the proper viewing distance d and the left eye of the user moves from the position L1 to the position L2. At this time, the controller 7 determines, from the position of the left eye of the user, that the left eye is located in the visual recognition region 70B, from which the sub-pixels P2 to P7 are visible. The controller 7 also determines, from the position of the left eye, that the right eye, separated from the left eye by the inter-eye distance E, is located in the visual recognition region 70D, from which the sub-pixels P8 to P12 and P1 are visible. The controller 7 then sets the sub-pixels P2 to P7 as left sub-pixels and the sub-pixels P8 to P12 and P1 as right sub-pixels. As a result, the user can visually recognize a three-dimensional image with crosstalk reduced.
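The index arithmetic implied by FIG. 6 — each viewing region shifts the visible sub-pixel set by one, and the two eyes sit n viewing regions apart — can be sketched as follows. This is a hypothetical model; the region numbering and helper names are assumptions, while n = 6 and the 12 sub-pixels per group Pg follow the text.

```python
# Hypothetical index model of FIG. 6: region 0 corresponds to position
# L1 (sub-pixels P1..P6); each adjacent viewing region shifts the
# visible set by exactly one sub-pixel, wrapping around the group Pg.

N_SUB = 12    # sub-pixels per group Pg (2 x n)
PER_EYE = 6   # sub-pixels seen by one eye (n)

def visible_subpixels(region_index):
    """Sub-pixels (1-based, P1..P12) visible from a viewing region."""
    return [((region_index + k) % N_SUB) + 1 for k in range(PER_EYE)]

def right_eye_region(left_region):
    # The eyes are E apart, i.e. n viewing regions of width E/n each.
    return left_region + PER_EYE
```

With this model, `visible_subpixels(1)` reproduces the L2 case (P2 to P7), and the paired right eye, six regions over, sees P8 to P12 and P1, matching the region-70D example above.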
(観察距離が適視距離より短い場合)
 コントローラ7は、利用者の左眼及び右眼の少なくとも一方の位置とバリア6との間の観察距離が適視距離より短い場合、観察距離に基づいて、左サブピクセル及び右サブピクセルの各々の幅を設定する。
(When the viewing distance is shorter than the proper viewing distance)
When the observation distance between the barrier 6 and the position of at least one of the left eye and the right eye of the user is shorter than the proper viewing distance, the controller 7 sets the width of each of the left sub-pixels and the right sub-pixels based on the observation distance.
 図7に示されるように、利用者の左眼及び右眼がバリア6から観察距離Y1だけ離れた、位置L7-2及び位置R7-2にあると仮定する。位置L7-2及び位置R7-2は、バリア6から適視距離dだけ離れた、位置L7-1及び位置R7-1よりも、奥行き方向に沿ってバリア6に近づいた位置である。この場合、表示パネル5から射出されて透光領域62aを通過した画像光は、光路71A及び光路71Bに沿って進行し、位置L7-2にある左眼及び位置R7-2にある右眼に各々到達する。光路71Aは、適視距離dにおいて視認領域70Eに含まれる位置L7-1’を通過している。視認領域70Eは、バリア6から適視距離dだけ離れた平面内において、サブピクセルP12及びP1~P5を視認できる領域に対応している。左眼が位置L7-2にある場合に光路71Aが位置L7-1’にて視認領域70Eと交差することは、左眼がサブピクセルP12及びP1~P5を視認できることを表す。つまり、コントローラ7は、左眼が位置L7-2にある場合であっても、光路71Aが交差する視認領域70を算出することによって、左眼が視認できるサブピクセルを特定できる。位置L7-2にある左眼は、サブピクセルP12及びP1~P5を視認できる。光路71Bは、適視距離dにおいて視認領域70Cに含まれる位置R7-1を通過している。視認領域70Cは、バリア6から適視距離dだけ離れた平面内において、サブピクセルP7~P12を視認できる領域に対応している。位置R7-2にある右眼は、サブピクセルP7~P12を視認できる。位置L7-2にある左眼及び位置R7-2にある右眼は、サブピクセルP12を共に視認する。図7において、サブピクセルP12は、網掛けして表される。コントローラ7は、サブピクセルP1~P5及びP12を左サブピクセルとして、サブピクセルP7~P12を右サブピクセルとする。コントローラ7は、サブピクセルP12のように、右サブピクセルでもあって、かつ左サブピクセルでもあるサブピクセルを、第3サブピクセルとしてよい。 As shown in FIG. 7, assume that the left and right eyes of the user are at the position L7-2 and the position R7-2, separated from the barrier 6 by the observation distance Y1. The position L7-2 and the position R7-2 are positions closer to the barrier 6 along the depth direction than the positions L7-1 and R7-1, which are separated from the barrier 6 by the proper viewing distance d. In this case, the image light emitted from the display panel 5 and passing through the translucent region 62a travels along the optical paths 71A and 71B and reaches the left eye at the position L7-2 and the right eye at the position R7-2, respectively. The optical path 71A passes through the position L7-1', included in the visual recognition region 70E at the proper viewing distance d. The visual recognition region 70E corresponds to a region from which the sub-pixels P12 and P1 to P5 can be visually recognized, within a plane separated from the barrier 6 by the proper viewing distance d. That the optical path 71A intersects the visual recognition region 70E at the position L7-1' when the left eye is at the position L7-2 means that the left eye can visually recognize the sub-pixels P12 and P1 to P5. That is, even when the left eye is at the position L7-2, the controller 7 can specify the sub-pixels visible to the left eye by calculating the visual recognition region 70 that the optical path 71A intersects. The left eye at the position L7-2 can visually recognize the sub-pixels P12 and P1 to P5. The optical path 71B passes through the position R7-1, included in the visual recognition region 70C at the proper viewing distance d. The visual recognition region 70C corresponds to a region from which the sub-pixels P7 to P12 can be visually recognized, within a plane separated from the barrier 6 by the proper viewing distance d. The right eye at the position R7-2 can visually recognize the sub-pixels P7 to P12. The left eye at the position L7-2 and the right eye at the position R7-2 both visually recognize the sub-pixel P12. In FIG. 7, the sub-pixel P12 is shown hatched. The controller 7 sets the sub-pixels P1 to P5 and P12 as left sub-pixels, and the sub-pixels P7 to P12 as right sub-pixels. The controller 7 may treat a sub-pixel that is both a left sub-pixel and a right sub-pixel, such as the sub-pixel P12, as a third sub-pixel.
 図7において、バリア6からの観察距離Y1と適視距離dとの比は、位置L7-2及び位置R7-2の距離と位置L7-1’及び位置R7-1の距離の比に相当する。観察距離Y1における、眼間距離Eだけ離れた利用者の左眼及び右眼が視認する画像は、適視距離dにおいて、眼間距離Eから1つの視認領域70の幅、即ちE/nだけ長くなった利用者の左眼及び右眼が視認する画像に相当すると考えてよい。利用者の左眼及び右眼が共に視認しうる画像が1サブピクセルとなる観察距離Y1は、片眼画像を構成するサブピクセルの数n、適視距離d、及び眼間距離Eを用いた次の式(3)にて、規定される。
 d:(E+E/n)=Y1:E   式(3)
In FIG. 7, the ratio of the observation distance Y1 from the barrier 6 to the proper viewing distance d corresponds to the ratio of the distance between the position L7-2 and the position R7-2 to the distance between the position L7-1' and the position R7-1. The image visually recognized at the observation distance Y1 by the left and right eyes of a user separated by the inter-eye distance E can be regarded as equivalent to the image visually recognized at the proper viewing distance d by left and right eyes separated by the inter-eye distance plus the width of one visual recognition region 70, that is, E + E/n. The observation distance Y1 at which the image visually recognized by both the left eye and the right eye of the user amounts to one sub-pixel is defined, using the number n of sub-pixels forming a single-eye image, the proper viewing distance d, and the inter-eye distance E, by the following equation (3).
d:(E+E/n)=Y1:E Formula (3)
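As a numerical check of equation (3): solving the proportion d : (E + E/n) = Y1 : E for Y1 gives Y1 = d·E/(E + E/n), which simplifies to d·n/(n + 1) independently of E. The values used below are purely illustrative.

```python
def y1_from_eq3(d, E, n):
    # Equation (3): d : (E + E/n) = Y1 : E  =>  Y1 = d*E / (E + E/n).
    # The E terms cancel, leaving Y1 = d * n / (n + 1).
    return d * E / (E + E / n)
```

For example, with n = 6 the observation distance at which both eyes share exactly one sub-pixel is 6d/7, whatever the inter-eye distance.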
 式(4)に基づいて、利用者の右眼及び左眼がバリア6から適視距離dより短い観察距離Yにあるとき、眼間距離Eだけ離れた利用者の左眼及び右眼が視認する画像の領域は、適視距離dにおいて、眼間距離E+αだけ離れた利用者の左眼及び右眼が視認する画像の領域に相当する。
 d:(E+α)=Y:E   式(4)
Based on equation (4), when the right eye and the left eye of the user are at an observation distance Y shorter than the proper viewing distance d from the barrier 6, the regions of the image visually recognized by the left and right eyes separated by the inter-eye distance E correspond to the regions of the image visually recognized, at the proper viewing distance d, by left and right eyes separated by the inter-eye distance E + α.
d:(E+α)=Y:E Formula (4)
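Equation (4) can likewise be solved for the widening term α: from d : (E + α) = Y : E it follows that E + α = d·E/Y, so α = E·(d/Y − 1). A minimal sketch, with illustrative numbers only:

```python
def alpha_from_eq4(d, E, Y):
    # Equation (4): d : (E + alpha) = Y : E  =>  E + alpha = d*E/Y,
    # hence alpha = E * (d/Y - 1); alpha vanishes at Y = d.
    return E * d / Y - E
```

At Y = d/2 this yields α = E, i.e., an effective inter-eye distance of 2E, consistent with the d/2 case discussed with FIG. 8.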
 図8には、利用者の左眼及び右眼がバリア6から観察距離d/2だけ離れた、位置L8-2及び位置R8-2にある場合に、利用者の眼から視認されるサブピクセルが示されている。位置L8-2及び位置R8-2は、適視距離dにある位置L8-1及び位置R8-1から、奥行き方向に沿ってバリア6に近づいた位置である。観察距離d/2は適視距離dの半分の距離である。図8において、位置L8-2にある利用者の左眼は、表示パネル5から射出され、光路71Aに沿って進行した画像光を視認する。光路71Aは、適視距離dにおいて視認領域70Fに含まれる位置L8-1’を通過している。視認領域70Fは、バリア6から適視距離dだけ離れた平面内において、サブピクセルP7~P12を視認できる領域に対応している。位置L8-2にある左眼は、サブピクセルP7~P12を視認できる。位置R8-2にある利用者の右眼は、表示パネル5から射出され、光路71Bに沿って進行した画像光を視認する。光路71Bは、適視距離dにおいて視認領域70Cに含まれる位置R8-1を通過している。視認領域70Cは、バリア6から適視距離dだけ離れた平面内において、サブピクセルP7~P12を視認できる領域に対応している。位置R8-2にある右眼は、サブピクセルP7~P12を視認できる。位置L8-2にある左眼及び位置R8-2にある右眼は、サブピクセルP7~P12を共に視認する。 FIG. 8 shows the sub-pixels visually recognized by the eyes of the user when the left and right eyes of the user are at the position L8-2 and the position R8-2, separated from the barrier 6 by the observation distance d/2. The position L8-2 and the position R8-2 are positions closer to the barrier 6 along the depth direction than the positions L8-1 and R8-1 at the proper viewing distance d. The observation distance d/2 is half the proper viewing distance d. In FIG. 8, the left eye of the user at the position L8-2 visually recognizes the image light emitted from the display panel 5 and traveling along the optical path 71A. The optical path 71A passes through the position L8-1', included in the visual recognition region 70F at the proper viewing distance d. The visual recognition region 70F corresponds to a region from which the sub-pixels P7 to P12 can be visually recognized, within a plane separated from the barrier 6 by the proper viewing distance d. The left eye at the position L8-2 can thus visually recognize the sub-pixels P7 to P12. The right eye of the user at the position R8-2 visually recognizes the image light emitted from the display panel 5 and traveling along the optical path 71B. The optical path 71B passes through the position R8-1, included in the visual recognition region 70C at the proper viewing distance d. The visual recognition region 70C corresponds to a region from which the sub-pixels P7 to P12 can be visually recognized, within a plane separated from the barrier 6 by the proper viewing distance d. The right eye at the position R8-2 can thus visually recognize the sub-pixels P7 to P12. The left eye at the position L8-2 and the right eye at the position R8-2 both visually recognize the sub-pixels P7 to P12.
 上述の式(4)のとおり、バリア6からの観察距離がd/2であって、眼間距離Eだけ離れた利用者の左眼及び右眼が視認する画像の領域は、適視距離dにおいて、眼間距離2Eだけ離れた利用者の左眼及び右眼が視認する画像の領域に相当する。このとき、右サブピクセルと左サブピクセルの幅を2倍に広げることで、適視距離dにおいて、利用者の左眼及び右眼が眼間距離Eだけ離れている場合と同一又は類似の制御を行うことができる。 As equation (4) above indicates, when the observation distance from the barrier 6 is d/2, the regions of the image visually recognized by the left and right eyes of a user separated by the inter-eye distance E correspond to the regions of the image visually recognized, at the proper viewing distance d, by left and right eyes separated by the inter-eye distance 2E. In this case, by doubling the widths of the right sub-pixels and the left sub-pixels, control identical or similar to that for a user whose left and right eyes are separated by the inter-eye distance E at the proper viewing distance d can be performed.
 具体的には、図9に示すように、コントローラ7は、片眼画像を構成するサブピクセルの数を、適視距離dにおける片眼画像を構成するサブピクセルの数の2倍としてよい。例えば、コントローラ7は、サブピクセル群Pgに含まれる水平方向に連続して配置されたサブピクセルの数を、適視距離dにおける12個から24個に変更する。変更されたサブピクセル群Pgには、サブピクセルP1~P24の24個のサブピクセルが含まれる。これによって、図8と同じく、バリア6から観察距離d/2だけ離れた位置L8-2にある利用者の左眼は、透光領域62aを介してサブピクセルP7~P12を視認する。位置R8-2にある利用者の右眼は、透光領域62aを介してサブピクセルP19~P24を視認する。このとき、コントローラ7は、サブピクセルP1~P12を左サブピクセルとし、利用者の左眼で視認される左眼画像を表示させる。コントローラ7は、サブピクセルP13~P24を右サブピクセルとし、利用者の右眼で視認される右眼画像を表示させる。これによって、利用者は、クロストークが低減された状態で3次元画像を視認することができる。 Specifically, as shown in FIG. 9, the controller 7 may set the number of sub-pixels forming a single-eye image to twice the number forming a single-eye image at the proper viewing distance d. For example, the controller 7 changes the number of horizontally consecutive sub-pixels included in the sub-pixel group Pg from the 12 used at the proper viewing distance d to 24. The changed sub-pixel group Pg includes 24 sub-pixels P1 to P24. As a result, as in FIG. 8, the left eye of the user at the position L8-2, separated from the barrier 6 by the observation distance d/2, visually recognizes the sub-pixels P7 to P12 through the translucent region 62a. The right eye of the user at the position R8-2 visually recognizes the sub-pixels P19 to P24 through the translucent region 62a. At this time, the controller 7 sets the sub-pixels P1 to P12 as left sub-pixels and causes them to display the left-eye image visually recognized by the left eye of the user. The controller 7 sets the sub-pixels P13 to P24 as right sub-pixels and causes them to display the right-eye image visually recognized by the right eye of the user. As a result, the user can visually recognize a three-dimensional image with crosstalk reduced.
 バリア6からの観察距離がd/3であって、眼間距離Eだけ離れた利用者の左眼及び右眼が視認する画像の領域は、適視距離dにおいて、眼間距離3Eだけ離れた利用者の左眼及び右眼が視認する画像の領域に相当する。コントローラ7は、利用者の観察距離がd/3となるとき、右サブピクセルと左サブピクセルの幅を3倍に広げる。これによって、コントローラ7は、利用者の左眼及び右眼が適視距離dに位置する場合と同様の制御を行うことで、利用者に、クロストークが低減された状態で3次元画像を視認させることができる。 When the observation distance from the barrier 6 is d/3, the regions of the image visually recognized by the left and right eyes of a user separated by the inter-eye distance E correspond to the regions of the image visually recognized, at the proper viewing distance d, by left and right eyes separated by the inter-eye distance 3E. The controller 7 triples the widths of the right sub-pixels and the left sub-pixels when the observation distance of the user is d/3. Thus, by performing the same control as when the left and right eyes of the user are located at the proper viewing distance d, the controller 7 allows the user to visually recognize a three-dimensional image with crosstalk reduced.
 上述のとおり、コントローラ7は、利用者の観察距離が適視距離dより短い場合、観察距離が所定の距離より短くなるとき、右サブピクセル及び左サブピクセルの各々の幅を広げてよい。具体的には、コントローラ7は、利用者の観察距離が適視距離の1/nより短くなるとき、右サブピクセル及び左サブピクセルの各々の幅を、適視距離dにおける右サブピクセル及び左サブピクセルの各々の幅のn倍に広げてよい。コントローラ7は、利用者の観察距離Yと適視距離dに基づいて以下の式(5)を満たすnを算出し、右サブピクセル及び左サブピクセルの各々の幅をn倍に広げてよい。nは、2以上の自然数であってよい。
 d/n≧Y>d/(n+1)   式(5)
As described above, when the observation distance of the user is shorter than the proper viewing distance d, the controller 7 may widen each of the right sub-pixels and the left sub-pixels when the observation distance falls below a predetermined distance. Specifically, when the observation distance of the user becomes shorter than 1/n of the proper viewing distance, the controller 7 may widen each of the right sub-pixels and the left sub-pixels to n times their width at the proper viewing distance d. The controller 7 may calculate n satisfying the following equation (5) based on the observation distance Y of the user and the proper viewing distance d, and widen each of the right sub-pixels and the left sub-pixels by a factor of n. Here n may be a natural number of 2 or more.
d/n≧Y>d/(n+1) Formula (5)
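The condition of equation (5), d/n ≥ Y > d/(n+1), is equivalent to n = ⌊d/Y⌋ for observation distances not exceeding d. A minimal sketch under that assumption:

```python
import math

def width_factor(Y, d):
    # Equation (5): d/n >= Y > d/(n+1) holds exactly when n = floor(d/Y).
    # For Y > d/2 this yields n = 1, i.e., the widths are left unchanged.
    return max(1, math.floor(d / Y))
```

The clamp to 1 reflects the n = 1 case described next, where the observation distance is greater than d/2 and no widening is applied.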
 コントローラ7は、nが1の場合、即ち観察距離Yが適視距離d/2より大きい場合、右サブピクセルの幅及び左サブピクセルの幅を適視距離dにおける幅から変更しない。コントローラ7は、nが2の場合、即ち観察距離が適視距離d/2以下で適視距離d/3より大きい場合、右サブピクセルの幅及び左サブピクセルの幅を適視距離dにおける幅の2倍とする。コントローラ7は、nが3の場合、即ち観察距離が適視距離d/3以下で適視距離d/4より大きい場合、右サブピクセルの幅及び左サブピクセルの幅を適視距離dにおける幅の3倍とする。このように、コントローラ7は、観察距離Yに応じて、右サブピクセルの幅及び左サブピクセルの幅を変更する。コントローラ7は、右サブピクセル及び左サブピクセルの幅をn倍に広げる場合、全ての右サブピクセル及び左サブピクセルの幅をn倍に広げてよく、或いは一部の右サブピクセル及び左サブピクセルの幅をn倍に広げてよい。コントローラ7は、幅をn倍に広げられた右サブピクセル及び左サブピクセルの一部の幅を、他の右サブピクセル及び左サブピクセルの幅と異ならせてよい。すなわち、コントローラ7は、利用者の観察距離が適視距離の1/nより短くなるとき、右サブピクセル及び左サブピクセルの各々の幅のうち一部を、適視距離dにおける右サブピクセル及び左サブピクセルの各々の幅のn倍に広げ、右サブピクセル及び左サブピクセルの各々の幅のうち他の一部を、適視距離dにおける右サブピクセル及び左サブピクセルの各々の幅のm倍に広げてよい。mは、nとは異なる任意の数であってよい。 When n is 1, that is, when the observation distance Y is greater than d/2, half the proper viewing distance, the controller 7 does not change the widths of the right sub-pixels and the left sub-pixels from their widths at the proper viewing distance d. When n is 2, that is, when the observation distance is at most d/2 and greater than d/3, the controller 7 sets the widths of the right sub-pixels and the left sub-pixels to twice their widths at the proper viewing distance d. When n is 3, that is, when the observation distance is at most d/3 and greater than d/4, the controller 7 sets the widths of the right sub-pixels and the left sub-pixels to three times their widths at the proper viewing distance d. In this way, the controller 7 changes the widths of the right sub-pixels and the left sub-pixels according to the observation distance Y. When widening the right sub-pixels and the left sub-pixels by a factor of n, the controller 7 may widen all of the right sub-pixels and the left sub-pixels by a factor of n, or may widen only some of them by a factor of n. The controller 7 may make the widths of some of the right sub-pixels and left sub-pixels widened by a factor of n different from the widths of the other right sub-pixels and left sub-pixels. That is, when the observation distance of the user becomes shorter than 1/n of the proper viewing distance, the controller 7 may widen some of the right sub-pixels and the left sub-pixels to n times their widths at the proper viewing distance d, and widen the others to m times their widths at the proper viewing distance d. Here m may be any number different from n.
 コントローラ7は、アクティブエリア51に混合画像を表示させるように構成されてよい。具体的には、コントローラ7は、左サブピクセルであって右サブピクセルではないサブピクセルに左眼画像を表示させる。コントローラ7は、右サブピクセルであって左サブピクセルではないサブピクセルに右眼画像を表示させる。コントローラ7は、左サブピクセルであって、かつ右サブピクセルである第3サブピクセルがある場合、第3サブピクセルに第3画像を表示させてよい。 The controller 7 may be configured to display the mixed image in the active area 51. Specifically, the controller 7 displays the left-eye image on the sub-pixel that is the left sub-pixel and not the right sub-pixel. The controller 7 displays the right-eye image on the sub-pixel that is the right sub-pixel and not the left sub-pixel. The controller 7 may display the third image on the third subpixel when the third subpixel that is the left subpixel and the right subpixel is present.
 コントローラ7は、第3サブピクセルに表示する第3画像の輝度値を所定値以下とするように構成されてよい。例えば、コントローラ7は、第3画像として、黒画像を表示してよい。黒画像は、例えば、黒色のような、所定輝度を有する画像である。所定輝度は、サブピクセルの表示可能な階調レベルのうち、最も低い階調の輝度又はこれに準じる階調の輝度に対応する値とすることができる。 The controller 7 may be configured to set the luminance value of the third image displayed on the third sub-pixels to a predetermined value or less. For example, the controller 7 may display a black image as the third image. The black image is an image having a predetermined luminance, for example a black-like luminance. The predetermined luminance can be a value corresponding to the luminance of the lowest gray level among the displayable gray levels of the sub-pixel, or to the luminance of a gray level close to it.
 コントローラ7は、第3サブピクセルに、第3画像として、左眼画像及び右眼画像の輝度値の平均値となる輝度値を有する平均画像を表示してよい。 The controller 7 may display an average image having a luminance value that is an average value of the luminance values of the left-eye image and the right-eye image as the third image in the third subpixel.
 コントローラ7は、第3サブピクセルに、第3画像として、利用者の特性に基づいて、左眼画像及び右眼画像のいずれかの画像を表示してよい。利用者の特性は、例えば、利用者の利き目に関する特性である。具体的には、コントローラ7は、予め設定された、或いは外部から入力された利用者の利き目を示す情報に基づいて、利き目に対応する、左眼画像又は右眼画像を表示してよい。コントローラ7は、利用者の利き目が左眼である場合、第3画像として左眼画像を表示し、利用者の利き目が右眼である場合、第3画像として右眼画像を表示してよい。 The controller 7 may display, on the third sub-pixels, either the left-eye image or the right-eye image as the third image based on a characteristic of the user. The characteristic of the user is, for example, a characteristic relating to the dominant eye of the user. Specifically, the controller 7 may display the left-eye image or the right-eye image corresponding to the dominant eye based on information indicating the dominant eye of the user that is set in advance or input from outside. The controller 7 may display the left-eye image as the third image when the dominant eye of the user is the left eye, and display the right-eye image as the third image when the dominant eye of the user is the right eye.
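The three alternatives for the third image described above (a black image, an average image, or the image for the dominant eye) can be summarized in a small sketch. The mode names and the plain luminance-value model of an image are illustrative assumptions, not the patent's interfaces.

```python
# Sketch of the third-image options for sub-pixels visible to both eyes.
# Images are modeled as single luminance values for brevity.

BLACK_LEVEL = 0  # lowest displayable gray level (or one close to it)

def third_image(left_lum, right_lum, mode, dominant_eye="left"):
    if mode == "black":
        return BLACK_LEVEL                 # luminance at or below a set value
    if mode == "average":
        return (left_lum + right_lum) / 2  # mean of the two eye images
    if mode == "dominant":
        return left_lum if dominant_eye == "left" else right_lum
    raise ValueError(mode)
```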
(Example of processing by the three-dimensional display system)
 With reference to FIG. 10, the flow of processing performed by the three-dimensional display system 100 according to an embodiment of the present disclosure will be described.
 Step S101: The controller 7 acquires, from the detection device 1, the position of at least one of the user's first eye (for example, the left eye) and second eye (for example, the right eye).
 Step S102: The controller 7 calculates the observation distance between the position of the user's eyes and the barrier 6 from the acquired eye-position information.
 Step S103: The controller 7 determines the width of each of the left subpixels and the right subpixels based on the user's observation distance. When the observation distance is shorter than the proper viewing distance, the controller 7 sets the width of each of the left subpixels and the right subpixels according to the observation distance.
 Step S104: The controller 7 determines, based on the position of the user's eyes, which subpixels are left subpixels and which are right subpixels. The controller 7 may determine a subpixel that is both a left subpixel and a right subpixel to be a third subpixel.
 Step S105: The controller 7 displays the left-eye image on the subpixels that are left subpixels.
 Step S106: The controller 7 displays the right-eye image on the subpixels that are right subpixels.
 Step S107: When there is a third subpixel that is both a left subpixel and a right subpixel, the controller 7 displays the third image on the third subpixel.
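 One cycle of steps S101 through S107 can be summarized as follows. This is an illustrative sketch, not the disclosed implementation: `FrameState`, the `classify` callback, and the proportional width rule are placeholders invented for the example, and the caller is assumed to have already performed S101 (eye-position acquisition) and S102 (observation-distance calculation).

```python
from dataclasses import dataclass, field

@dataclass
class FrameState:
    """Minimal stand-in for the display state that one cycle updates."""
    subpixel_width: float = 1.0
    drawn: dict = field(default_factory=dict)

def display_cycle(state, eye_pos, observation_distance,
                  proper_viewing_distance, classify):
    # S103: when closer than the proper viewing distance, the subpixel
    # widths are adjusted; widening proportionally is a stand-in rule.
    if observation_distance < proper_viewing_distance:
        state.subpixel_width *= proper_viewing_distance / observation_distance
    # S104: classify subpixels; one appearing in both sets is a third subpixel.
    left_ids, right_ids = classify(eye_pos)
    third = set(left_ids) & set(right_ids)
    # S105/S106: left-eye and right-eye images on the exclusive subpixels.
    for i in set(left_ids) - third:
        state.drawn[i] = "left_eye"
    for i in set(right_ids) - third:
        state.drawn[i] = "right_eye"
    # S107: the third image on subpixels seen by both eyes.
    for i in third:
        state.drawn[i] = "third"
    return state
```

Running one cycle with an observation distance of half the proper viewing distance doubles the stand-in subpixel width and assigns the three image kinds according to the classification.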
 As described above, in the present embodiment, the three-dimensional display device 2 includes the display panel 5, an optical element such as the barrier 6, the acquisition unit 3, and the controller 7. The display panel 5 is configured to display a mixed image including a first image and a second image having a parallax with respect to the first image. The optical element is configured to define the ray direction of the image light emitted from the display panel 5. The acquisition unit 3 is configured to acquire the position of at least one of the user's first eye and second eye. The display panel 5 includes a first display region configured to display the first image to be viewed by the user's first eye, and a second display region configured to display the second image to be viewed by the user's second eye. The first display regions and the second display regions are arranged alternately on the display panel 5. The optical element includes a first light-transmitting region configured to transmit the image light at a first transmittance and a second light-transmitting region configured to transmit the image light at a second transmittance. The first light-transmitting regions and the second light-transmitting regions are arranged alternately in the optical element. When the observation distance between the optical element and the position of at least one of the user's first and second eyes is shorter than the proper viewing distance, the controller 7 is configured to set the width of each of the first display regions and the second display regions based on the observation distance. With this configuration, the three-dimensional display device 2 can adjust the image displayed on the display panel 5 according to the user's observation distance. The three-dimensional display device 2 can thereby control the images viewed by the user's left and right eyes so that crosstalk is reduced when the observation distance is shorter than the proper viewing distance. Therefore, the three-dimensional display device 2 allows the user to properly view a three-dimensional image regardless of changes in the distance to the user.
 In the present embodiment, the controller 7 can widen the width of each of the first display regions and the second display regions when the observation distance becomes shorter than a first distance. With this configuration, when the user's observation distance is shorter than the proper viewing distance, the controller 7 can reduce crosstalk in the images viewed by the user's left and right eyes by widening the regions viewed by those eyes.
 In the present embodiment, when the observation distance becomes shorter than 1/n of the proper viewing distance (n being a natural number of 2 or more), the controller 7 can widen the width of each of the first display regions and the second display regions to n times their width at the proper viewing distance. With this configuration, the controller 7 determines the user's observation distance and, when that distance is shorter than the proper viewing distance, can perform the same control as when the user is at the proper viewing distance. This reduces the increase in the amount of computation and data handled by the controller 7 as the user's observation distance changes.
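 Under one reading of this rule — choosing the largest natural number n of 2 or more for which the observation distance is still shorter than 1/n of the proper viewing distance — the widening factor can be computed as below. This is an illustrative sketch: the function name and the choice of the largest qualifying n are assumptions introduced here, as the disclosure does not specify how n is selected.

```python
import math

def width_scale_factor(observation_distance, proper_viewing_distance):
    """Return the n-fold widening factor for the display-region widths.

    Picks the largest n with observation_distance < proper_viewing_distance / n;
    returns 1 when no n >= 2 qualifies (observer at half the proper viewing
    distance or farther).
    """
    if observation_distance <= 0:
        raise ValueError("observation distance must be positive")
    # Largest n strictly below proper_viewing_distance / observation_distance.
    n = math.ceil(proper_viewing_distance / observation_distance) - 1
    return max(n, 1)
```

For a proper viewing distance of 1000 mm, an observer at 400 mm is closer than 1000/2 mm, so the widths are widened 2-fold; at exactly 500 mm the strict inequality fails and the factor stays 1.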
 In the present embodiment, the controller 7 can make the width of some of the first display regions and second display regions that have been widened n-fold differ from the width of the others of the widened first display regions and second display regions. With this configuration, when widening the display regions n-fold, the controller 7 can, for example, limit the widening of those display regions that could cause crosstalk in the images viewed by the user's left and right eyes.
 Although the above embodiments have been described as representative examples, it will be apparent to those skilled in the art that many modifications and substitutions can be made within the spirit and scope of the present invention. Therefore, the present invention should not be construed as being limited by the above embodiments, and various variations and changes are possible without departing from the scope of the claims. For example, a plurality of the structural blocks described in the embodiments and examples may be combined into one, or one structural block may be divided.
 In the above embodiments, the optical element is the barrier 6, but the optical element is not limited to this. For example, as shown in FIG. 11, the optical element included in the three-dimensional display device 2 may be a lenticular lens 91. The lenticular lens 91 is formed by arranging, in the horizontal direction, a plurality of cylindrical lenses 92 each extending in the vertical direction. Like the barrier 6, the lenticular lens 91 can propagate the image light emitted from the subpixels in the left visible region 51aL so that it reaches the position of the user's left eye, and can propagate the image light emitted from the subpixels in the right visible region 51aR so that it reaches the position of the user's right eye.
 In the above embodiments, the three-dimensional display device 2 and the detection device 1 of the three-dimensional display system 100 have been described as separate bodies, but this is not restrictive. For example, the three-dimensional display device 2 may incorporate the functions provided by the detection device 1. In that case, the three-dimensional display device 2 can itself detect the position of at least one of the user's left eye and right eye.
 As shown in FIG. 12, the three-dimensional display system 100 can be mounted in a head-up display system 400. The head-up display system 400 is also referred to as the HUD (Head Up Display) 400. The HUD 400 includes the three-dimensional display system 100, an optical member 410, and a projection target member 420 having a projection surface 430. The HUD 400 is configured to cause the image light emitted from the three-dimensional display system 100 to reach the projection target member 420 via the optical member 410, and to cause the image light reflected by the projection target member 420 to reach the user's left and right eyes. That is, the HUD 400 can cause the image light to travel from the three-dimensional display system 100 to the user's left and right eyes along the optical path 440 indicated by the broken line. The user can perceive the image light arriving along the optical path 440 as a virtual image 450.
 As shown in FIG. 13, the HUD 400 including the three-dimensional display system 100 may be mounted in a mobile body 10. The HUD 400 may share part of its configuration with other devices or components of the mobile body 10. For example, the mobile body 10 may use its windshield as the projection target member 420. When part of the configuration is shared with other devices or components of the mobile body 10, the remaining configuration may be referred to as a HUD module or a three-dimensional display component. The HUD 400 and the three-dimensional display system 100 may be mounted in the mobile body 10. The term "mobile body" in the present disclosure includes vehicles, ships, and aircraft. The term "vehicle" in the present disclosure includes, but is not limited to, automobiles and industrial vehicles, and may include railway vehicles, utility vehicles, and fixed-wing aircraft that travel on runways. Automobiles include, but are not limited to, passenger cars, trucks, buses, motorcycles, and trolleybuses, and may include other vehicles that travel on roads. Industrial vehicles include industrial vehicles for agriculture and for construction. Industrial vehicles include, but are not limited to, forklifts and golf carts. Industrial vehicles for agriculture include, but are not limited to, tractors, cultivators, transplanters, binders, combine harvesters, and lawn mowers. Industrial vehicles for construction include, but are not limited to, bulldozers, scrapers, excavators, crane trucks, dump trucks, and road rollers. Vehicles include those driven by human power. The classification of vehicles is not limited to the above; for example, automobiles may include industrial vehicles that can travel on roads, and the same vehicle may belong to multiple classifications. Ships in the present disclosure include personal watercraft, boats, and tankers. Aircraft in the present disclosure include fixed-wing aircraft and rotary-wing aircraft.
1          Detection device
2          Three-dimensional display device
3          Acquisition unit
4          Illuminator
5          Display panel
6          Barrier
7          Controller
10         Mobile body
51         Active area
51aL       Left visible region
51aR       Right visible region
51bL       Left invisible region
51bR       Right invisible region
51aLR      Binocular visible region
61         Light-shielding surface
62         Light-transmitting region
70         Viewing region
71A        Optical path
71B        Optical path
91         Lenticular lens
92         Cylindrical lens
100        Three-dimensional display system
400        Head-up display system
410        Optical member
420        Projection target member
430        Projection surface
440        Optical path
450        Virtual image

Claims (6)

  1.  A three-dimensional display device comprising:
     a display panel configured to display a mixed image including a first image and a second image having a parallax with respect to the first image;
     an optical element configured to define a ray direction of image light emitted from the display panel;
     an acquisition unit configured to acquire a position of at least one of a first eye and a second eye of a user; and
     a controller,
     wherein the display panel includes
      a first display region configured to display the first image to be viewed by the first eye of the user, and
      a second display region configured to display the second image to be viewed by the second eye of the user,
      the first display regions and the second display regions being arranged alternately on a display surface of the display panel,
     the optical element includes
      a first light-transmitting region configured to transmit the image light at a first transmittance, and
      a second light-transmitting region configured to transmit the image light at a second transmittance,
      the first light-transmitting regions and the second light-transmitting regions being arranged alternately in a plane of the optical element along the display surface of the display panel, and
     the controller is configured to, when an observation distance between the optical element and the position of at least one of the first eye and the second eye of the user is shorter than a proper viewing distance, set a width of each of the first display regions and the second display regions based on the observation distance.
  2.  The three-dimensional display device according to claim 1, wherein the controller is configured to widen the width of each of the first display regions and the second display regions when the observation distance becomes shorter than a first distance.
  3.  The three-dimensional display device according to claim 1 or 2, wherein the controller is configured to, when the observation distance becomes shorter than 1/n of the proper viewing distance (n being a natural number of 2 or more), widen the width of each of the first display regions and the second display regions to n times their width at the proper viewing distance.
  4.  The three-dimensional display device according to claim 3, wherein the controller is configured to make the width of some of the first display regions and the second display regions widened n-fold differ from the width of the others of the first display regions and the second display regions widened n-fold.
  5.  A head-up display system comprising:
     a display panel configured to display a mixed image including a first image and a second image having a parallax with respect to the first image;
     an optical element configured to define a ray direction of image light emitted from the display panel;
     an acquisition unit configured to acquire a position of at least one of a first eye and a second eye of a user;
     an optical member configured to cause the user to view the image light emitted from the display panel as a virtual image; and
     a controller,
     wherein the display panel includes
      a first display region configured to display the first image to be viewed by the first eye of the user, and
      a second display region configured to display the second image to be viewed by the second eye of the user,
      the first display regions and the second display regions being arranged alternately on a display surface of the display panel,
     the optical element includes
      a first light-transmitting region configured to transmit the image light at a first transmittance, and
      a second light-transmitting region configured to transmit the image light at a second transmittance,
      the first light-transmitting regions and the second light-transmitting regions being arranged alternately in a plane of the optical element along the display surface of the display panel, and
     the controller is configured to, when an observation distance between the optical element and the position of at least one of the first eye and the second eye of the user is shorter than a proper viewing distance, set a width of each of the first display regions and the second display regions based on the observation distance.
  6.  A mobile body comprising a head-up display system, the head-up display system including:
     a display panel configured to display a mixed image including a first image and a second image having a parallax with respect to the first image;
     an optical element configured to define a ray direction of image light emitted from the display panel;
     an acquisition unit configured to acquire a position of at least one of a first eye and a second eye of a user;
     an optical member configured to cause the user to view the image light emitted from the display panel as a virtual image; and
     a controller,
     wherein the display panel includes
      a first display region configured to display the first image to be viewed by the first eye of the user, and
      a second display region configured to display the second image to be viewed by the second eye of the user,
      the first display regions and the second display regions being arranged alternately on a display surface of the display panel,
     the optical element includes
      a first light-transmitting region configured to transmit the image light at a first transmittance, and
      a second light-transmitting region configured to transmit the image light at a second transmittance,
      the first light-transmitting regions and the second light-transmitting regions being arranged alternately in a plane of the optical element along the display surface of the display panel, and
     the controller is configured to, when an observation distance between the optical element and the position of at least one of the first eye and the second eye of the user is shorter than a proper viewing distance, set a width of each of the first display regions and the second display regions based on the observation distance.
PCT/JP2019/049677 2018-12-21 2019-12-18 Three-dimensional display device, head-up display system, and mobile body WO2020130049A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018240072A JP2020102772A (en) 2018-12-21 2018-12-21 Three-dimensional display device, head-up display system, and moving body
JP2018-240072 2018-12-21

Publications (1)

Publication Number Publication Date
WO2020130049A1 true WO2020130049A1 (en) 2020-06-25

Family

ID=71101847


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023199765A1 (en) * 2022-04-12 2023-10-19 公立大学法人大阪 Stereoscopic display device
CN116074486A (en) * 2023-03-21 2023-05-05 北京光谱印宝科技有限责任公司 Naked eye 3D display device
CN116074486B (en) * 2023-03-21 2023-07-25 北京光谱印宝科技有限责任公司 Naked eye 3D display device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012131887A1 (en) * 2011-03-29 2012-10-04 株式会社 東芝 Three-dimensional image display device
WO2015132828A1 (en) * 2014-03-06 2015-09-11 パナソニックIpマネジメント株式会社 Image display method and image display apparatus

Also Published As

Publication number Publication date
JP2020102772A (en) 2020-07-02


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19898797; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19898797; Country of ref document: EP; Kind code of ref document: A1)