WO2020130048A1 - Three-dimensional display device, head-up display system and moving body - Google Patents

Three-dimensional display device, head-up display system and moving body

Info

Publication number
WO2020130048A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
eye
sub
user
pixel
Application number
PCT/JP2019/049676
Other languages
English (en)
Japanese (ja)
Inventor
Kaoru Kusafuka
Hideya Takahashi
Goro Hamagishi
Original Assignee
Kyocera Corporation
Public University Corporation Osaka
Application filed by Kyocera Corporation and Public University Corporation Osaka
Publication of WO2020130048A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/122Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H04N13/125Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues for crosstalk reduction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/317Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using slanted parallax optics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/346Image reproducers using prisms or semi-transparent mirrors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/363Image reproducers using image projection screens
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking

Definitions

  • the present disclosure relates to a three-dimensional display device, a head-up display system, and a moving body.
  • the three-dimensional display device of the present disclosure includes a display panel, an optical element, an acquisition unit, and a controller.
  • the display panel includes an active area configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the active area includes a plurality of sub-pixels.
  • the optical element is configured to define a light ray direction of image light emitted from the active area.
  • the acquisition unit is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the controller is configured to display the mixed image in the active area.
  • the controller is configured to determine, based on the position of at least one of the first eye and the second eye of the user, a first visible region in the active area that emits image light propagating to the position of the first eye, and a second visible region in the active area that emits image light propagating to the position of the second eye.
  • the controller is configured to determine, from the sub-pixels, a first sub-pixel having a predetermined ratio or more in the first visible region.
  • the controller is configured to determine, from the sub-pixels, a second sub-pixel having a predetermined ratio or more in the second visible region.
  • the controller is configured to display a third image on the third sub-pixel, which is both the first sub-pixel and the second sub-pixel.
  • the head-up display system of the present disclosure includes a display panel, an optical element, an acquisition unit, an optical member, and a controller.
  • the display panel includes an active area configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the active area includes a plurality of sub-pixels.
  • the optical element is configured to define a light ray direction of image light emitted from the active area.
  • the acquisition unit is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the optical member is configured to allow the user to visually recognize the image light emitted from the active area as a virtual image.
  • the controller is configured to display the mixed image in the active area.
  • the controller is configured to determine, based on the position of at least one of the first eye and the second eye of the user, a first visible region in the active area that emits image light propagating to the position of the first eye, and a second visible region in the active area that emits image light propagating to the position of the second eye.
  • the controller is configured to determine, from the sub-pixels, a first sub-pixel having a predetermined ratio or more in the first visible region.
  • the controller is configured to determine, from the sub-pixels, a second sub-pixel having a predetermined ratio or more in the second visible region.
  • the controller is configured to display a third image on the third sub-pixel, which is both the first sub-pixel and the second sub-pixel.
  • the mobile object of the present disclosure includes a head-up display system.
  • the head-up display system includes a display panel, an optical element, an acquisition unit, an optical member, and a controller.
  • the display panel includes an active area configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the active area includes a plurality of sub-pixels.
  • the optical element is configured to define a light ray direction of image light emitted from the active area.
  • the acquisition unit is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the optical member is configured to allow the user to visually recognize the image light emitted from the active area as a virtual image.
  • the controller is configured to display the mixed image in the active area.
  • the controller is configured to determine, based on an observation distance between at least one of the first eye and the second eye of the user and the optical element, a first visible region in the active area that emits image light propagating to the position of the first eye, and a second visible region in the active area that emits image light propagating to the position of the second eye.
  • the controller is configured to determine, from the sub-pixels, a first sub-pixel having a predetermined ratio or more in the first visible region.
  • the controller is configured to determine, from the sub-pixels, a second sub-pixel having a predetermined ratio or more in the second visible region.
  • the controller is configured to display a third image on the third sub-pixel, which is both the first sub-pixel and the second sub-pixel.
  • FIG. 1 is a diagram showing an example of a three-dimensional display system according to an embodiment viewed from a vertical direction.
  • FIG. 2 is a diagram showing an example of the display panel shown in FIG. 1 viewed from the depth direction.
  • FIG. 3 is a diagram showing an example of the barrier shown in FIG. 1 viewed from the depth direction.
  • FIG. 4 is a diagram for explaining the left visible region in the display panel shown in FIG.
  • FIG. 5 is a diagram for explaining the right visible region in the display panel shown in FIG.
  • FIG. 6 is a schematic diagram showing the sub-pixels visually recognized by the left and right eyes of the user located at the proper viewing distance d.
  • FIG. 7 is a diagram for explaining a binocular visible area in the three-dimensional display system according to the embodiment.
  • FIG. 8 is a schematic diagram showing an example of the sub-pixels that the left and right eyes of the user visually recognize when the observation distance Y is longer than the proper viewing distance d.
  • FIG. 9 is a schematic diagram showing another example of the sub-pixels that the left eye and the right eye of the user visually recognize when the observation distance Y is longer than the proper viewing distance d.
  • FIG. 10 is a schematic diagram showing an example of the sub-pixels visually recognized by the left eye and the right eye of the user when the observation distance Y is shorter than the proper viewing distance d.
  • FIG. 11 is a schematic diagram showing another example of the sub-pixels visually recognized by the left eye and the right eye of the user when the observation distance Y is shorter than the suitable viewing distance d.
  • FIG. 12 is a list of the observation distance Y and the number of third sub-pixels based on the observation distance Y.
  • FIG. 13 is a flowchart showing the processing of the three-dimensional display system according to the embodiment.
  • FIG. 14 is a schematic configuration diagram of a three-dimensional display device when the optical element is a lenticular lens.
  • FIG. 15 is a diagram showing an example of a head-up display system equipped with the three-dimensional display system according to this embodiment.
  • FIG. 16 is a diagram showing an example of a moving body equipped with the head-up display system shown in FIG.
  • An object of the present disclosure is to provide a three-dimensional display device, a head-up display system, and a moving body that allow a user to appropriately visually recognize a three-dimensional image.
  • a three-dimensional display system 100 is configured to include a detection device 1 and a three-dimensional display device 2 as shown in FIG.
  • the three-dimensional display system 100 displays an image on the display panel 5 of the three-dimensional display device 2. Part of the image light emitted from the display panel 5 is blocked by the barrier 6, so that different image lights reach the left eye and the right eye of the user.
  • the user can view the image stereoscopically because there is a parallax between the image viewed by the left eye and the image viewed by the right eye.
  • the three-dimensional display device 2 adjusts the image displayed on the display panel 5 according to the distance between the user's eyes and the barrier 6 detected by the detection device 1. Thereby, the three-dimensional display system 100 can allow the user to appropriately visually recognize the three-dimensional image regardless of the change in the distance to the user.
  • the detection device 1 is configured to detect the position of the user's eyes.
  • the detection device 1 may detect the position of at least one of the left eye and the right eye of the user.
  • one eye of the user is also referred to as a first eye.
  • the other eye of the user is also referred to as the second eye.
  • the left eye is the first eye and the right eye is the second eye, but they may be reversed.
  • the position of the user's eyes is represented by, for example, coordinates in a three-dimensional space, but is not limited to this.
  • the detection device 1 may include, for example, a camera.
  • the detection device 1 may capture a user's face with a camera.
  • the detection device 1 may detect the position of the eyes of the user from the captured image of the face of the user.
  • the detection device 1 may detect the position of the user's eyes as the coordinates of the three-dimensional space from the image captured by one camera.
  • the detection device 1 may detect the position of the user's eyes as coordinates in a three-dimensional space from images captured by two or more cameras.
  • the detection device 1 outputs the position of at least one of the left eye and the right eye of the user to the three-dimensional display device 2.
  • the detection device 1 need not include a camera and may instead be connected to a camera outside the device.
  • the detection device 1 may include an input terminal for inputting an image pickup signal from a camera outside the device.
  • the camera outside the device may be directly connected to the input terminal.
  • a camera outside the device may be indirectly connected to the input terminal via a shared network.
  • the detection device 1 may detect the position of the eye of the user from the video signal input to the input terminal.
  • the detection device 1 may include a sensor, for example.
  • the sensor may be an ultrasonic sensor or an optical sensor.
  • the detection device 1 may detect the position of the user's head with a sensor, and may detect the position of the user's eye based on the position of the head.
  • the detection device 1 may detect the position of the user's eyes as coordinates in the three-dimensional space by using one or more sensors.
  • the three-dimensional display system 100 does not have to include the detection device 1.
  • the three-dimensional display device 2 may include an input terminal for inputting a signal from the detection device outside the system.
  • the detection device outside the system may be directly connected to the input terminal.
  • the detection device outside the system may be indirectly connected to the input terminal via a shared network.
  • the three-dimensional display device 2 may acquire the position of the user's eyes from a detection device outside the system.
  • the three-dimensional display device 2 includes an acquisition unit 3, an illuminator 4, a display panel 5, a barrier 6, and a controller 7.
  • the acquisition unit 3 may be configured to acquire the position of at least one of the left eye and the right eye of the user detected by the detection device 1.
  • the acquisition unit 3 may include, for example, a communication module or the like.
  • the acquisition unit 3 may determine the distance between the user's eye and the barrier 6 from the acquired position of the user's eye.
  • the distance between the user's eyes and the barrier 6 may be the distance between the barrier 6 and at least one of the user's left and right eyes.
  • the distance between the user's eyes and the barrier 6 is also referred to as the user's observation distance.
  • the illuminator 4 can illuminate the display panel 5 in a plane.
  • the illuminator 4 may include a light source, a light guide plate, a diffusion plate, a diffusion sheet, and the like.
  • the illuminator 4 emits irradiation light from a light source, and uniformizes the irradiation light in the surface direction of the display panel 5 using a light guide plate, a diffusion plate, a diffusion sheet, or the like.
  • the illuminator 4 can emit uniformized light to the display panel 5.
  • the display panel 5 is, for example, a display panel such as a transmissive liquid crystal display panel, but is not limited to this. As shown in FIG. 2, the display panel 5 has a plurality of partitioned areas on a planar active area 51.
  • the active area 51 displays a mixed image.
  • the active area 51 is also referred to as a display surface.
  • the mixed image includes a left-eye image and a right-eye image having a parallax with respect to the left-eye image.
  • the left-eye image is also referred to as the first image.
  • the right eye image is also referred to as the second image.
  • the mixed image which will be described in detail later, may further include a third image.
  • the partitioned area is an area partitioned by the grid-like black matrix 52 in the first direction and the second direction orthogonal to the first direction.
  • the direction orthogonal to the first direction and the second direction is referred to as the third direction.
  • the first direction may be referred to as the horizontal direction.
  • the second direction may be referred to as the vertical direction.
  • the third direction may be referred to as the depth direction.
  • the first direction, the second direction, and the third direction are not limited to these.
  • the first direction is represented as the x-axis direction
  • the second direction is represented as the y-axis direction
  • the third direction is represented as the z-axis direction.
  • the active area 51 includes a plurality of sub-pixels arranged in a grid along the horizontal and vertical directions.
  • Each sub-pixel corresponds to one of the colors R (Red), G (Green), and B (Blue), and one set of three sub-pixels R, G, and B can constitute one pixel. One pixel may also be referred to as one picture element.
  • the horizontal direction is, for example, a direction in which a plurality of subpixels forming one pixel are arranged.
  • the vertical direction is, for example, a direction in which subpixels of the same color are arranged.
  • the display panel 5 is not limited to a transmissive liquid crystal panel, and may be another display panel such as an organic EL (Electro Luminescence). When the display panel 5 is a self-luminous display panel, the three-dimensional display device 2 does not need to include the illuminator 4.
  • a plurality of sub-pixels arranged in the active area 51 can form a sub-pixel group Pg.
  • the sub-pixel group Pg is a minimum unit in which the controller 7, which will be described later, performs control for displaying an image in the active area 51.
  • the controller 7 causes the plurality of sub-pixels included in one sub-pixel group Pg to display the left-eye image or the right-eye image.
  • the number of subpixels displaying the left-eye image and the number of subpixels displaying the right-eye image may be the same.
  • the sub-pixel group Pg may be repeatedly arranged in the horizontal direction.
  • the sub-pixel group Pg may be repeatedly arranged adjacent to a position vertically displaced by one sub-pixel in the vertical direction.
  • a sub-pixel group Pg including twelve sub-pixels P1 to P12 arranged continuously, one in the vertical direction and twelve in the horizontal direction, is arranged.
  • the subpixels P(1) to P(2 × n × b) included in all the subpixel groups Pg may be collectively controlled by the controller 7. For example, when switching the image displayed on the sub-pixel P1 from the left-eye image to the right-eye image, the controller 7 may simultaneously switch the images displayed on the sub-pixels P1 included in all the sub-pixel groups Pg from the left-eye image to the right-eye image, as in the sketch below.
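  • As an illustration (not part of the disclosure), the following Python sketch models one sub-pixel group Pg with n = 6 and 2 × n = 12 sub-pixels and the collective switching of a sub-pixel across all groups; the list representation and the function names are assumptions.

    N = 6                 # sub-pixels per one-eye image
    GROUP = 2 * N         # sub-pixels P1..P12 in one group Pg

    def default_assignment():
        # P1..P6 display the left-eye image, P7..P12 the right-eye image.
        return ["L" if i < N else "R" for i in range(GROUP)]

    def switch_subpixel(assignment, p):
        # Switching sub-pixel Pp applies to every group Pg at once,
        # so one group models the whole active area.
        assignment[p - 1] = "R" if assignment[p - 1] == "L" else "L"
        return assignment

    print(switch_subpixel(default_assignment(), 1))
    # ['R', 'L', 'L', 'L', 'L', 'L', 'R', 'R', 'R', 'R', 'R', 'R']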
  • the barrier 6 is formed by a flat surface along the active area 51, and is arranged apart from the active area 51 by a predetermined distance (gap) g.
  • the barrier 6 is, for example, a parallax barrier, but is not limited to this and may be any optical element.
  • the barrier 6 may be located on the opposite side of the illuminator 4 with respect to the display panel 5.
  • the barrier 6 may be configured to define the light ray direction of the image light emitted from the display panel 5. As shown in FIG. 3, the barrier 6 has a plurality of light blocking surfaces 61 that block image light.
  • the plurality of light shielding surfaces 61 define a light transmitting area 62 between the light shielding surfaces 61 adjacent to each other.
  • the light transmitting region 62 has a higher light transmittance than the light shielding surface 61.
  • the light shielding surface 61 has a lower light transmittance than the light transmitting area 62.
  • the light transmitting area 62 is also referred to as a first light transmitting area.
  • the light shielding surface 61 is also referred to as a second light transmitting area.
  • the light-transmitting area 62 is a portion for transmitting light incident on the barrier 6.
  • the translucent region 62 may transmit light at the first transmittance.
  • the first transmittance is, for example, about 100%, but is not limited to this, and may be a value in a range in which the image light emitted from the display panel 5 can be visually recognized well.
  • the first transmittance may be, for example, 80% or more, or 50% or more.
  • the light-shielding surface 61 is a portion that blocks light that enters the barrier 6 and hardly transmits it. That is, the light blocking surface 61 blocks the image displayed in the active area 51 of the display panel 5 from reaching the eyes of the user.
  • the light shielding surface 61 may transmit light at the second transmittance.
  • the second transmittance is, for example, approximately 0%, but is not limited to this, and may be a value greater than 0% and a value close to 0% such as 0.5%, 1%, or 3%.
  • the first transmittance can be set to a value that is several times or more, for example, 10 times or more larger than the second transmittance.
  • the translucent region 62 may be a plurality of strip-shaped regions extending in a predetermined direction in the plane.
  • the translucent area 62 defines the light ray direction, which is the direction in which the image light emitted from the sub-pixels propagates.
  • the predetermined direction is a direction that forms a predetermined angle that is not 0 degree or 90 degrees with the vertical direction.
  • the light-transmitting regions 62 and the light-shielding surfaces 61 may extend in a predetermined direction along the active area 51 and may be repeatedly and alternately arranged in a direction orthogonal to the predetermined direction.
  • the barrier 6 may be configured to define the light ray direction of the image light emitted from the display panel 5 by the light shielding surfaces 61 and the translucent regions 62. As shown in FIG. 1, the barrier 6 defines the image light emitted from the sub-pixels arranged in the active area 51, thereby defining the region of the active area 51 that the user's eyes can visually recognize.
  • the area in the active area 51 that emits the image light propagating to the position of the user's eyes is referred to as a visible area 51a.
  • a region in the active area 51 that emits image light propagating to the position of the left eye of the user is referred to as a left visible region 51aL.
  • the left visible area 51aL is also referred to as a first visible area.
  • a region in the active area 51 that emits image light propagating to the position of the right eye of the user is referred to as a right visible region 51aR.
  • the right visible region 51aR is also referred to as a second visible region.
  • the suitable viewing distance d is also referred to as the OVD (Optimum Viewing Distance).
  • the barrier pitch Bp, which is the arrangement interval of the translucent regions 62 in the horizontal direction, and the gap g between the active area 51 and the barrier 6 are defined so that the following equations (1) and (2), using the horizontal length Hp of a sub-pixel, the number n of sub-pixels of a one-eye image, the suitable viewing distance d, and the inter-eye distance E, hold:
  • E : d = (n × Hp) : g   Formula (1)
  • d : Bp = (d + g) : (2 × n × Hp)   Formula (2)
  • the suitable viewing distance d is the distance between the barrier 6 and at least one of the left eye and the right eye of the user in which the horizontal length of the visible region 51a is n subpixels.
  • the inter-eye distance E is the distance between the user's left eye and right eye.
  • the inter-eye distance E may be a value calculated from the position of the user's eyes or may be a preset value. When set in advance, the inter-eye distance E may be set, for example, to a value in the range of 61.1 mm to 64.4 mm calculated in research by the National Institute of Advanced Industrial Science and Technology.
  • Hp is the horizontal length of the sub-pixel as shown in FIG. 2.
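  • Solving equations (1) and (2) gives g = d × n × Hp / E and Bp = 2 × n × Hp × d / (d + g). The following Python sketch checks this numerically; the parameter values are illustrative assumptions, not values from the disclosure.

    def barrier_geometry(E, d, n, Hp):
        g = d * n * Hp / E               # from E : d = (n * Hp) : g
        Bp = 2 * n * Hp * d / (d + g)    # from d : Bp = (d + g) : (2 * n * Hp)
        return g, Bp

    # e.g. E = 62.5 mm, d = 750 mm, n = 6, Hp = 0.05 mm (assumed values)
    g, Bp = barrier_geometry(E=62.5, d=750.0, n=6, Hp=0.05)
    print(f"gap g = {g:.3f} mm, barrier pitch Bp = {Bp:.4f} mm")
    # gap g = 3.600 mm, barrier pitch Bp = 0.5971 mm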
  • the barrier 6 may be composed of a member having the second transmittance.
  • the barrier 6 may be composed of, for example, a film or a plate-shaped member.
  • the light shielding surface 61 is made of a film or a plate member.
  • the translucent area 62 is composed of an opening provided in the film or the plate member.
  • the film is made of, for example, resin, but is not limited to this.
  • the plate member is made of, for example, resin or metal, but is not limited to this.
  • the barrier 6 may be made of a light-shielding base material, or may be made of a base material containing a light-shielding additive.
  • the barrier 6 may be composed of a liquid crystal shutter.
  • the liquid crystal shutter can control the light transmittance according to the applied voltage.
  • the liquid crystal shutter may include a plurality of pixels and may control the light transmittance of each pixel.
  • the liquid crystal shutter can form a region having a high light transmittance or a region having a low light transmittance in an arbitrary shape.
  • the light transmitting region 62 may be a region having the first transmittance.
  • the light shielding surface 61 may be a region having the second transmittance.
  • the barrier 6 having the above-described configuration allows the image light emitted from a part of the sub-pixels of the active area 51 to pass through the transparent region 62 and be propagated to the right eye of the user.
  • the barrier 6 can transmit the image light emitted from some other sub-pixels through the translucent region 62 and propagate to the left eye of the user.
  • An image visually recognized by the user's eye by propagating the image light to each of the left eye and the right eye of the user will be described in detail with reference to FIGS. 4 and 5.
  • the left visible region 51aL shown in FIG. 4 is, as described above, the region of the active area 51 visually recognized by the left eye of the user when image light transmitted through the translucent region 62 of the barrier 6 reaches the left eye of the user.
  • the left invisible area 51bL is an area that the left eye of the user cannot visually recognize because the image light is blocked by the light blocking surface 61 of the barrier 6.
  • the left visible region 51aL includes half of the sub-pixel P1, all of the sub-pixels P2 to P6, and half of the sub-pixel P7.
  • the right visible region 51aR shown in FIG. 5 is the region of the active area 51 visually recognized by the right eye of the user when image light from some other sub-pixels, transmitted through the translucent region 62 of the barrier 6, reaches the right eye of the user.
  • the right invisible region 51bR is a region that the right eye of the user cannot visually recognize because the image light is blocked by the light blocking surface 61 of the barrier 6.
  • the right visible region 51aR includes half of the sub-pixel P7, all of the sub-pixels P8 to P12, and half of the sub-pixel P1.
  • the left eye and the right eye each visually recognize the image.
  • the left-eye image and the right-eye image are parallax images having a parallax with each other.
  • the left eye visually recognizes half of the left-eye image displayed in the sub-pixel P1, the whole left-eye images displayed in the sub-pixels P2 to P6, and half of the right-eye image displayed in the sub-pixel P7.
  • the right eye visually recognizes half of the right-eye image displayed in the sub-pixel P7, the whole right-eye images displayed in the sub-pixels P8 to P12, and half of the left-eye image displayed in the sub-pixel P1.
  • the sub-pixel displaying the left-eye image is labeled with “L”
  • the sub-pixel displaying the right-eye image is labeled with “R”.
  • the area of the left eye image that the user's left eye visually recognizes is the maximum, and the area of the right eye image is the minimum.
  • the area of the right-eye image visually recognized by the right eye of the user is maximum, and the area of the left-eye image is minimum.
  • the fact that the left eye of the user visually recognizes the right eye image or the right eye of the user visually recognizes the left eye image is also referred to as crosstalk. The user can visually recognize the three-dimensional image with the crosstalk reduced.
  • the user located at the suitable viewing distance d can visually recognize the image displayed on the display panel 5 as a three-dimensional image.
  • the left-eye image is displayed in the sub-pixels that are more than half visible by the left eye
  • the right-eye image is displayed in the sub-pixels that are more than half visible by the right eye.
  • the sub-pixels for displaying the left-eye image and the right-eye image are not limited to this, and may be appropriately determined based on the left visible region 51aL and the right visible region 51aR so as to reduce crosstalk according to the design of the active area 51, the barrier 6, and the like. For example, depending on the aperture ratio of the barrier 6, the left-eye image may be displayed on the sub-pixels viewed by the left eye at a predetermined ratio or higher, and the right-eye image may be displayed on the sub-pixels viewed by the right eye at a predetermined ratio or higher.
  • the controller 7 is connected to each component of the three-dimensional display system 100 and can control each component.
  • the controller 7 is configured as a processor, for example.
  • the controller 7 may include one or more processors.
  • the processor may include a general-purpose processor that loads a specific program and executes a specific function, and a dedicated processor that is specialized for a specific process.
  • the dedicated processor may include an application-specific integrated circuit (ASIC: Application Specific Integrated Circuit).
  • the processor may include a programmable logic device (PLD: Programmable Logic Device).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the controller 7 may be either an SoC (System-on-a-Chip) or a SiP (System-In-a-Package) in which one or a plurality of processors cooperate.
  • the controller 7 may include a storage unit, and the storage unit may store various kinds of information, a program for operating each component of the three-dimensional display system 100, or the like.
  • the storage unit may be composed of, for example, a semiconductor memory.
  • the storage unit may function as a work memory of the controller 7.
  • The control of each component of the three-dimensional display system 100 by the controller 7 will be described below.
  • the controller 7 determines the left visible region 51aL and the right visible region 51aR in the active area 51 of the display panel 5 based on the position of at least one of the left eye and the right eye of the user. For example, the controller 7 may determine, based on the position of the left eye of the user, that the right eye of the user is located at a position horizontally shifted from the left-eye position by the inter-eye distance E. The controller 7 may determine the left visible region 51aL and the right visible region 51aR so that the image light that has passed through the respective translucent regions 62 reaches the left eye and the right eye of the user.
  • the left visible region 51aL and the right visible region 51aR in one translucent region 62 have horizontal lengths of n subpixels. Therefore, as shown in FIG. 1, the left visible region 51aL and the right visible region 51aR do not overlap each other and are arranged alternately in the horizontal direction on the display surface of the display panel 5.
  • the controller 7 determines that the sub-pixel included in the left visible region 51aL is the left sub-pixel.
  • the left sub-pixel is, for example, a sub-pixel that includes a predetermined proportion or more in the left visible region 51aL.
  • the left subpixel is also referred to as a first subpixel or a first display area.
  • the controller 7 determines that the sub-pixel included in the right visible region 51aR is the right sub-pixel.
  • the right sub-pixel is, for example, a sub-pixel including a predetermined ratio or more in the right visible region 51aR.
  • the right subpixel is also referred to as a second subpixel or a second display area. As shown in FIG. 1, the left sub-pixels and the right sub-pixels do not overlap each other and are alternately arranged in the horizontal direction on the display surface of the display panel 5, as modeled in the sketch below.
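  • A minimal Python sketch of the predetermined-ratio test described above: a sub-pixel is treated as a left (or right) sub-pixel when at least a threshold fraction of its horizontal extent lies inside the left (or right) visible region. The interval model and the 0.5 threshold are assumptions for illustration.

    def overlap(a0, a1, b0, b1):
        # length of the intersection of intervals [a0, a1] and [b0, b1]
        return max(0.0, min(a1, b1) - max(a0, b0))

    def classify(subpixel_edges, visible_region, ratio=0.5):
        picked = []
        for i, (x0, x1) in enumerate(subpixel_edges):
            if overlap(x0, x1, *visible_region) >= ratio * (x1 - x0):
                picked.append(i + 1)   # 1-based: P1, P2, ...
        return picked

    edges = [(i, i + 1) for i in range(12)]   # P1..P12, unit width
    left_visible = (0.5, 6.5)   # half of P1, all of P2..P6, half of P7
    print(classify(edges, left_visible))      # [1, 2, 3, 4, 5, 6, 7]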
  • FIG. 6 is a diagram showing sub-pixels visually recognized by the eyes of the user located at the proper viewing distance d by the image light that has passed through one light-transmitting area 62a.
  • the sub-pixel group Pg will be described as including 12 sub-pixels P1 to P12 arranged continuously in the horizontal direction.
  • FIG. 6 shows a visual recognition region 70 in which the right eye or the left eye of the user, who is located away from the barrier 6 by the appropriate viewing distance d, can visually recognize a predetermined subpixel.
  • in each visual recognition region 70, six sub-pixels that are continuous in the horizontal direction are visually recognized.
  • the range that the left eye and the right eye of the user can visually recognize through the translucent region 62a is represented by broken lines.
  • when the left eye of the user is at the position L1 included in the visual recognition region 70A, the left eye visually recognizes the sub-pixels P1 to P6 through the translucent region 62a.
  • when the position of the left eye of the user changes, the sub-pixels visually recognized by the left eye also change.
  • when the left eye of the user is at the position L2 included in the visual recognition region 70B, the left eye visually recognizes the sub-pixels P2 to P7 via the translucent region 62a.
  • Subpixels visually recognized by the user's eyes in adjacent visual recognition areas 70 have a difference of one subpixel.
  • in the three-dimensional display system 100, the barrier pitch Bp and the gap g are defined so that, among the 2 × n sub-pixels arranged in the horizontal direction of the sub-pixel group Pg, n different sub-pixels are visually recognized by each of the left eye and the right eye of the user at the suitable viewing distance d. That is, the three-dimensional display system 100 is configured such that a difference of n sub-pixels is generated between the sub-pixel regions visually recognized by the left eye and the right eye of the user located at the suitable viewing distance d.
  • therefore, in FIG. 6, the right eye located at the position R1 included in the visual recognition region 70C visually recognizes the sub-pixels P7 to P12.
  • the controller 7 sets the sub-pixels P1 to P6 as left sub-pixels and displays a left-eye image visually recognized by the left eye of the user.
  • the controller 7 sets the sub-pixels P7 to P12 as right sub-pixels and displays a right-eye image visually recognized by the right eye of the user.
  • the inter-eye distance E, which is the distance between the left eye and the right eye of the user, corresponds to the width of n visual recognition regions 70. That is, the width of one visual recognition region 70 is E/n.
  • the controller 7 may change the sub-pixels displaying the right-eye image or the left-eye image according to the position of the user's eyes acquired by the acquisition unit 3. For example, it is assumed that the user moves in the horizontal direction at the suitable viewing distance d and the left eye of the user moves from the position L1 to the position L2. At this time, the controller 7 determines, from the position of the left eye of the user, that the left eye is located in the visual recognition region 70B for visually recognizing the sub-pixels P2 to P7. The controller 7 then determines that the right eye, which is away from the left eye by the inter-eye distance E, is located in the visual recognition region 70D for visually recognizing the sub-pixels P8 to P12 and P1.
  • the controller 7 sets the sub-pixels P2 to P7 as left sub-pixels and the sub-pixels P8 to P12 and P1 as right sub-pixels. As a result, the user can visually recognize the three-dimensional image in the state where the crosstalk is reduced.
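  • The region-based determination above can be sketched as follows: the visual recognition regions 70 have width E/n, and moving the eye by one region shifts the visible sub-pixels by one. The coordinate origin, the wrap-around indexing, and the numeric values are assumptions for illustration.

    N = 6          # sub-pixels visible through one translucent region
    GROUP = 2 * N  # sub-pixels per group Pg
    E = 62.5       # inter-eye distance in mm (assumed value)

    def visible_subpixels(x):
        # x: horizontal eye position in mm; the region index advances
        # every E / N mm, shifting the visible sub-pixels by one.
        k = int(x // (E / N))
        return [(k + i) % GROUP + 1 for i in range(N)]

    left = 0.0                              # position L1 in region 70A
    print(visible_subpixels(left))          # [1, 2, 3, 4, 5, 6]
    print(visible_subpixels(left + E))      # right eye, n regions away: [7..12]
    print(visible_subpixels(left + E / N))  # left eye moved to L2: [2..7]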
  • the control of each component of the three-dimensional display system 100 by the controller 7 when the observation distance of the user is not the proper viewing distance d will be described.
  • when the observation distance of the user differs from the suitable viewing distance d, a part of the left visible region 51aL in one translucent region 62a overlaps with a part of the right visible region 51aR. That is, a binocular visible region 51aLR may exist.
  • in this case, there may be a sub-pixel that is a left sub-pixel included at the predetermined ratio or more in the left visible region 51aL and determined to display the left-eye image, and is also a right sub-pixel included at the predetermined ratio or more in the right visible region 51aR and determined to display the right-eye image. Such a sub-pixel is a third sub-pixel.
  • the third subpixel is also referred to as a third display area.
  • the controller 7 performs control for reducing the crosstalk that occurs when the user located at the observation distance d1 visually recognizes the three-dimensional display image.
  • the controller 7 may be configured to determine the third subpixel, which is the left subpixel and the right subpixel, based on the viewing distance of the user.
  • the position L8-2 and the position R8-2 are positions moved away from the barrier 6 along the depth direction from the positions L8-1 and R8-1, which are apart from the barrier 6 by the suitable viewing distance d.
  • the image light emitted from the display panel 5 and passing through the translucent region 62a travels along the optical path 71A and the optical path 71B, and reaches the left eye at the position L8-2 and the right eye at the position R8-2, respectively.
  • the optical path 71A passes through the position L8-1′ included in the visual recognition area 70B at the proper viewing distance d.
  • the visible region 70B corresponds to a region in which the sub-pixels P2 to P7 can be visually recognized in a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the controller 7 can specify the sub-pixel that the left eye can visually recognize by calculating the visual recognition area 70 where the optical path 71A intersects even when the left eye is at the position L8-2.
  • the left eye at the position L8-2 can see the sub-pixels P2 to P7.
  • the optical path 71B passes through the position R8-1 included in the visual recognition area 70C at the proper viewing distance d.
  • the visual recognition region 70C corresponds to a region in which the sub-pixels P7 to P12 can be visually recognized in a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the right eye located at the position R8-2 can visually recognize the sub-pixels P7 to P12.
  • the left eye at the position L8-2 and the right eye at the position R8-2 both see the sub-pixel P7.
  • the sub-pixel P7 is shaded.
  • the sub-pixel P7 is the sub-pixel located, among the boundaries between the left sub-pixels and the right sub-pixels, at the center-side boundary of the left sub-pixels and the right sub-pixels visually recognized through one translucent region 62.
  • the ratio between the observation distance Y1 from the barrier 6 and the suitable viewing distance d corresponds to the ratio between the distance from the position L8-2 to the position R8-2 and the distance from the position L8-1′ to the position R8-1.
  • the images visually recognized by the left and right eyes of the user at the observation distance Y1, separated by the inter-eye distance E, can be considered to correspond to the images visually recognized by the left and right eyes of a user at the suitable viewing distance d whose inter-eye distance is shortened from E by the width of one visual recognition region 70, that is, by E/n.
  • the position L9-2 and the position R9-2 are positions moved away from the barrier 6 along the depth direction from the position L9-1 and the position R9-1, which are apart from the barrier 6 by the suitable viewing distance d.
  • the observation distance Y2 is a distance larger than the observation distance Y1 described above.
  • the left eye of the user at the position L9-2 visually recognizes the image light emitted from the display panel 5 and traveling along the optical path 71A.
  • the optical path 71A passes through the position L9-1' included in the visual recognition region 70D at the proper viewing distance d.
  • the visible region 70D corresponds to a region in which the sub-pixels P3 to P8 can be visually recognized within a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the left eye at the position L9-2 can see the sub-pixels P3 to P8.
  • the right eye of the user at the position R9-2 visually recognizes the image light emitted from the display panel 5 and traveling along the optical path 71B.
  • the optical path 71B passes through the position R9-1 included in the visual recognition region 70C at the proper viewing distance d.
  • the visual recognition region 70C corresponds to a region in which the sub-pixels P7 to P12 can be visually recognized within a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the right eye located at the position R9-2 can see the sub-pixels P7 to P12.
  • the left eye at position L9-2 and the right eye at position R9-2 both see the sub-pixels P7 and P8.
  • the images visually recognized by the left eye at the position L9-2 and the right eye at the position R9-2 at the observation distance Y2 may be considered to correspond to the images visually recognized by the left eye at the position L9-1′ and the right eye at the position R9-1, whose distance is shortened by (E/n) × 2 from the inter-eye distance E at the suitable viewing distance d.
  • an observation distance Y2 at which the region that can be visually recognized by both the left eye and the right eye of the user spans two sub-pixels is defined by the following expression (5):
  • Y2 = (n × d) / (n − 2)   Formula (5)
  • when the observation distance Y is longer than the suitable viewing distance d, the controller 7 calculates the number X of third sub-pixels using the observation distance Y of the user, the number n of sub-pixels forming a one-eye image, and the suitable viewing distance d, by the following expression (6):
  • X = n × (Y − d) / Y   Formula (6)
  • in this case, the controller 7 may determine that the third sub-pixels occur at the center-side boundary, among the boundaries between the left sub-pixels and the right sub-pixels, of the left sub-pixels and the right sub-pixels visually recognized through one translucent region 62.
  • the left and right eyes of the user are at the position L10-2 and the position R10-2, which are separated from the barrier 6 by the observation distance Y3.
  • the position L10-2 and the position R10-2 are positions closer to the barrier 6 along the depth direction than the position L10-1 and the position R10-1, which are apart from the barrier 6 by the suitable viewing distance d.
  • the image light emitted from the display panel 5 travels along the optical paths 71A and 71B, passes through one translucent region 62a, and reaches the left eye at the position L10-2 and the right eye at the position R10-2, respectively.
  • the optical path 71A passes through the position L10-1′ included in the visual recognition area 70E at the proper viewing distance d.
  • the visible region 70E corresponds to a region in which the sub-pixels P12 and P1 to P5 can be visually recognized within a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the optical path 71A intersects the visual recognition region 70E at the position L10-1', which means that the left eye can visually recognize the sub-pixels P12 and P1 to P5. That is, the controller 7 can specify the sub-pixel that the left eye can visually recognize by calculating the visual recognition area 70 where the optical path 71A intersects even when the left eye is at the position L10-2.
  • the left eye at the position L10-2 can see the sub-pixels P12 and P1 to P5.
  • the optical path 71B passes through the position R10-1 included in the visual recognition region 70C at the suitable viewing distance d.
  • the visual recognition region 70C corresponds to a region in which the sub-pixels P7 to P12 can be visually recognized in a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the right eye at the position R10-2 can see the sub-pixels P7 to P12.
  • the left eye at the position L10-2 and the right eye at the position R10-2 both see the sub-pixel P12.
  • in FIG. 10, the sub-pixel P12 is shaded.
  • the sub-pixel P12 is the sub-pixel located, among the boundaries between the left sub-pixels and the right sub-pixels, at the outer boundary of the left sub-pixels and the right sub-pixels visually recognized through one translucent region 62.
  • the ratio between the observation distance Y3 from the barrier 6 and the suitable viewing distance d corresponds to the ratio between the distance from the position L10-2 to the position R10-2 and the distance from the position L10-1′ to the position R10-1.
  • the images visually recognized by the left and right eyes of the user at the observation distance Y3, separated by the inter-eye distance E, can be considered to correspond to the images visually recognized by the left and right eyes of a user at the suitable viewing distance d whose inter-eye distance is lengthened from E by the width of one visual recognition region 70, that is, by E/n.
  • the left and right eyes of the user are at positions L11-2 and R11-2, which are separated from the barrier 6 by the observation distance Y4.
  • the position L11-2 and the position R11-2 are positions that are closer to the barrier 6 along the depth direction than the positions L11-1 and R11-1 that are apart from the barrier 6 by the appropriate viewing distance d.
  • the observation distance Y4 is shorter than the observation distance Y3 described above.
  • the left eye of the user at the position L11-2 visually recognizes the image light emitted from the display panel 5 and traveling along the optical path 71A.
  • the optical path 71A passes through the position L11-1' included in the visual recognition area 70F at the proper viewing distance d.
  • the visible region 70F corresponds to a region in which the sub-pixels P11 to P12 and P1 to P4 can be visually recognized in a plane separated from the barrier 6 by the appropriate viewing distance d.
  • the left eye at the position L11-2 can see the sub-pixels P11 to P12 and P1 to P4.
  • the right eye of the user at the position R11-2 visually recognizes the image light emitted from the display panel 5 and traveling along the optical path 71B.
  • the optical path 71B passes through the position R11-1 included in the visual recognition region 70C at the proper viewing distance d.
  • the visual recognition region 70C corresponds to a region in which the sub-pixels P7 to P12 can be visually recognized within a plane that is apart from the barrier 6 by the appropriate viewing distance d.
  • the right eye at the position R11-2 can see the sub-pixels P7 to P12.
  • the left eye at the position L11-2 and the right eye at the position R11-2 both see the sub-pixels P11 and P12.
  • the distance between the left eye at the position L11-2 and the right eye at the position R11-2 at the observation distance Y4 may be considered to correspond to the distance between the left eye at the position L11-1′ and the right eye at the position R11-1, which is longer than the inter-eye distance E by (E/n) × 2 at the suitable viewing distance d.
  • when the observation distance Y is shorter than the suitable viewing distance d, the controller 7 calculates the number X of third sub-pixels using the observation distance Y of the user, the number n of sub-pixels forming a one-eye image, and the suitable viewing distance d, by the following expression (10):
  • X = n × (d − Y) / Y   Formula (10)
  • in this case, the controller 7 may determine that the third sub-pixels occur at the outer boundary, among the boundaries between the left sub-pixels and the right sub-pixels, of the left sub-pixels and the right sub-pixels visually recognized through one translucent region 62.
  • the controller 7 may correct the number of third sub-pixels to an even number so that the number of left sub-pixels visually recognized by the left eye of the user equals the number of right sub-pixels visually recognized by the right eye of the user.
  • FIG. 12 shows a list of the observation distance Y and the number of corrected third sub-pixels.
  • for example, the controller 7 may set, as the number of third sub-pixels, the smallest even number equal to or greater than X calculated by Formula (6) or Formula (10).
  • alternatively, the controller 7 may set, as the number of third sub-pixels, the largest even number, or 0, that does not exceed X calculated by Formula (6) or Formula (10). For example, as shown in the correction 3 of FIG. 12, the controller 7 may truncate X calculated by Formula (6) or Formula (10) to an integer and set the largest even number, or 0, not exceeding that integer as the number of third sub-pixels. Both corrections are illustrated in the sketch below.
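  • As a sketch, the following Python fragment combines Formula (6), Formula (10), and the two even-number corrections described above; the function names are illustrative, and the formulas are the reconstructions given above.

    import math

    def third_subpixel_count(Y, d, n):
        # Formula (6) for Y > d, Formula (10) for Y < d: X grows as the
        # observation distance Y departs from the suitable viewing distance d.
        return n * (Y - d) / Y if Y > d else n * (d - Y) / Y

    def round_up_to_even(X):
        k = math.ceil(X)
        return k if k % 2 == 0 else k + 1          # smallest even number >= X

    def round_down_to_even(X):
        k = math.floor(X)
        return max(0, k if k % 2 == 0 else k - 1)  # largest even (or 0) <= X

    n, d = 6, 750.0
    for Y in (900.0, 1125.0, 642.9):   # ~nd/(n-1), nd/(n-2), ~nd/(n+1)
        X = third_subpixel_count(Y, d, n)
        print(f"Y={Y}: X={X:.2f} -> up {round_up_to_even(X)}, "
              f"down {round_down_to_even(X)}")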
  • the controller 7 may be configured to determine the fourth subpixel that is not the left subpixel and is not the right subpixel based on the viewing distance Y of the user. For example, the controller 7 may determine the left subpixel and the right subpixel, respectively, as described above in the determination of the third subpixel. The controller 7 may determine a subpixel that is neither the left subpixel nor the right subpixel as the fourth subpixel.
  • the controller 7 may be configured to display the mixed image in the active area 51. Specifically, the controller 7 displays the left-eye image on the sub-pixel that is the left sub-pixel and not the right sub-pixel. The controller 7 causes the right-eye image to be displayed on the sub-pixel that is the right sub-pixel and is not the left sub-pixel. The controller 7 may display the third image on the third subpixel when the third subpixel that is the left subpixel and the right subpixel is present.
  • the controller 7 may be configured to set the brightness value of the third image displayed in the third sub-pixel to a predetermined value or less.
  • the controller 7 may display a black image as the third image.
  • the black image is an image having a predetermined brightness, such as black.
  • the predetermined brightness can be a value corresponding to the brightness of the lowest gradation among the displayable gradation levels of the sub-pixels or the brightness of the gradation corresponding thereto.
  • the controller 7 may display an average image having a luminance value that is an average value of the luminance values of the left-eye image and the right-eye image as the third image in the third subpixel.
  • the controller 7 may display, in the third subpixel, a left-eye image or a right-eye image as the third image based on the characteristics of the user.
  • the characteristic of the user is, for example, a characteristic regarding the dominant eye of the user.
  • the controller 7 may display the left-eye image or the right-eye image corresponding to the dominant eye based on information indicating the dominant eye of the user that is set in advance or input from the outside.
  • the controller 7 may display the left-eye image as the third image when the dominant eye of the user is the left eye, and display the right-eye image as the third image when the dominant eye of the user is the right eye. These options are summarized in the sketch below.
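  • The three options for the third image described above (black, average, dominant eye) can be summarized in a short Python sketch; the strategy names and the per-sub-pixel averaging of luminance values are assumptions for illustration.

    def third_image_value(left_px, right_px, strategy="black", dominant="left"):
        if strategy == "black":
            return 0                          # lowest displayable gradation
        if strategy == "average":
            return (left_px + right_px) // 2  # mean of the two luminance values
        if strategy == "dominant":
            return left_px if dominant == "left" else right_px
        raise ValueError(strategy)

    print(third_image_value(200, 100, "average"))            # 150
    print(third_image_value(200, 100, "dominant", "right"))  # 100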
  • the controller 7 may be configured to set the brightness value of the fourth image displayed in the fourth subpixel to a predetermined value or less.
  • the controller 7 may display a black image as the fourth image.
  • the black image is an image having a predetermined brightness, such as black.
  • the predetermined brightness can be a value corresponding to the brightness of the lowest gradation among the displayable gradation levels of the sub-pixels or the brightness of the gradation corresponding thereto.
  • Step S101 The controller 7 acquires the position of at least one of the first eye (for example, the left eye) and the second eye (for example, the right eye) of the user from the detection device 1.
  • Step S102 The controller 7 calculates the observation distance between the user's eye position and the barrier 6 from the acquired information on the user's eye position.
  • Step S103 The controller 7 determines the left visible region 51aL based on the position of the user's eyes, and determines the left subpixel (first subpixel) based on the left visible region 51aL.
  • Step S104 The controller 7 determines the right visible region 51aR based on the position of the user's eyes, and determines the right subpixel (second subpixel) based on the right visible region 51aR.
  • Step S105 The controller 7 determines the third sub-pixel which is the right sub-pixel and the left sub-pixel based on the position of the user's eye.
  • the controller 7 may set the number of the third sub-pixels to 0 or an even number.
  • Step S106 The controller 7 determines the fourth subpixel which is neither the right subpixel nor the left subpixel based on the position of the user's eye.
  • Step S107 The controller 7 displays the left-eye image (first image) on the subpixel that is the left subpixel and not the right subpixel.
  • Step S108 The controller 7 displays the right-eye image (second image) on the subpixel which is the right subpixel and not the left subpixel.
  • Step S109 The controller 7 displays the third image on the third subpixel.
  • Step S110 The controller 7 displays a black image on the fourth subpixel.
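  • A compact Python sketch of the flow of steps S101 to S110, reusing the region model from the earlier sketches. The fixed geometry, the set-based panel model, and the simplification that the eyes sit at the suitable viewing distance (so the third and fourth sets come out empty) are illustrative assumptions.

    N, GROUP, E = 6, 12, 62.5   # assumed geometry, as in the sketches above

    def visible(k):
        # sub-pixels visible from visual recognition region index k
        return {(k + i) % GROUP + 1 for i in range(N)}

    def render_plan(left_x):                 # S101: acquired left-eye position
        k_left = int(left_x // (E / N))      # S102/S103: left visible region
        k_right = k_left + N                 # S104: right eye is n regions away
        left_sub, right_sub = visible(k_left), visible(k_right)
        third = left_sub & right_sub                               # S105
        fourth = set(range(1, GROUP + 1)) - left_sub - right_sub   # S106
        plan = {p: "first image" for p in left_sub - third}        # S107
        plan.update({p: "second image" for p in right_sub - third})  # S108
        plan.update({p: "third image" for p in third})             # S109
        plan.update({p: "black image" for p in fourth})            # S110
        return plan

    # At the suitable viewing distance the sets are disjoint:
    # P1..P6 -> first image, P7..P12 -> second image.
    print(render_plan(0.0))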
  • the three-dimensional display device 2 includes the display panel 5, optical elements such as the barrier 6, the acquisition unit 3, and the controller 7.
  • the display panel 5 includes an active area 51 configured to display a mixed image including a first image and a second image having a parallax with respect to the first image.
  • the active area 51 includes a plurality of subpixels.
  • Optical elements such as the barrier 6 are configured to define the light ray direction of the image light emitted from the active area 51.
  • the acquisition unit 3 is configured to acquire the position of at least one of the first eye and the second eye of the user.
  • the controller 7 is configured to display the mixed image in the active area 51.
  • the controller 7 is configured to determine, based on the position of at least one of the first eye and the second eye of the user, the first visible region in the active area 51 that emits image light propagating to the position of the first eye, and the second visible region in the active area 51 that emits image light propagating to the position of the second eye.
  • the controller 7 is configured to determine, from among the subpixels, the first subpixel of which the first visible region occupies a predetermined ratio or more.
  • the controller 7 is configured to determine, from among the subpixels, the second subpixel of which the second visible region occupies a predetermined ratio or more.
  • the controller 7 is configured to display the third image on the third subpixel, which is both the first subpixel and the second subpixel.
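The "predetermined ratio" test in the preceding bullets can be read as a simple threshold classification over per-subpixel coverage. The sketch below is an assumption-laden example: the 0.5 threshold, the coverage dictionaries, and the function name are invented for illustration; the patent fixes none of them.

```python
# Hypothetical threshold classification of subpixels by the fraction of
# their area that falls inside each visible region.

RATIO_THRESHOLD = 0.5  # the "predetermined ratio"; value assumed here

def classify_subpixels(coverage_left, coverage_right, threshold=RATIO_THRESHOLD):
    """coverage_*: dict mapping subpixel index -> fraction (0.0-1.0) of the
    subpixel lying inside the left/right visible region."""
    first = {i for i, c in coverage_left.items() if c >= threshold}
    second = {i for i, c in coverage_right.items() if c >= threshold}
    third = first & second  # both first and second: gets the third image
    neither = (set(coverage_left) | set(coverage_right)) - (first | second)
    return first, second, third, neither

# Example: subpixel 1 is sufficiently visible to both eyes, so it is a
# third subpixel; subpixel 3 is visible to neither eye.
first, second, third, neither = classify_subpixels(
    {0: 0.9, 1: 0.6, 2: 0.2, 3: 0.0},
    {0: 0.1, 1: 0.6, 2: 0.8, 3: 0.0},
)
# first == {0, 1}, second == {1, 2}, third == {1}, neither == {3}
```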
  • the three-dimensional display system 100 can adjust the image displayed on the display panel 5 according to the viewing distance of the user. Accordingly, when the viewing distance of the user is not the proper viewing distance, the three-dimensional display device 2 can control the images viewed by the left and right eyes of the user so that crosstalk is reduced. The three-dimensional display device 2 can therefore let the user appropriately view a three-dimensional image even as the user's viewing distance changes.
  • the controller 7 can further determine, based on the position of at least one of the first eye and the second eye of the user, the distance between at least one of those eyes and an optical element such as the barrier 6. With this configuration, the controller 7 determines the observation distance of the user and, when that distance is not the proper viewing distance, can control the images viewed by the left and right eyes of the user so that crosstalk is reduced.
  • the controller 7 can set the luminance value of the third image to a predetermined value or less.
  • the third image may be a black image whose luminance value is equal to or less than the predetermined value. This prevents the left eye and the right eye of the user from viewing the left-eye image or the right-eye image intended for the other eye, and therefore reduces crosstalk.
  • the controller 7 can set the luminance value of the third image to the average of the luminance values of the first image and the second image.
  • in that case, the left eye of the user views an image whose luminance is closer to that of the left-eye image than to that of the right-eye image.
  • likewise, the right eye of the user views an image whose luminance is closer to that of the right-eye image than to that of the left-eye image. Compared with the case where the left eye views the right-eye image, or the right eye views the left-eye image, the user therefore perceives an image with less discomfort.
  • the controller 7 is configured to display an image having a luminance value equal to or lower than a predetermined value on the fourth subpixel, which is neither the first subpixel nor the second subpixel. As a result, image light is not emitted from the fourth subpixel. This prevents stray light, produced when image light emitted from the fourth subpixel is secondarily reflected by members such as those forming the barrier 6, from reaching the eyes of the user. The left eye and the right eye of the user can therefore clearly view the left-eye image and the right-eye image, respectively, without interference from stray light.
  • the optical element is the barrier 6, but the optical element is not limited to this.
  • the optical element included in the three-dimensional display device 2 may be a lenticular lens 91.
  • the lenticular lens 91 is configured by arranging a plurality of vertically extending cylindrical lenses 92 in the horizontal direction.
  • the lenticular lens 91 can propagate the image light emitted from the sub-pixel of the left visible region 51aL so as to reach the position of the left eye of the user.
  • the lenticular lens 91 can propagate the image light emitted from the sub-pixel of the right visible region 51aR so as to reach the position of the right eye of the user.
  • the three-dimensional display system 100 has been described with the three-dimensional display device 2 and the detection device 1 as separate bodies, but the present disclosure is not limited to this configuration.
  • the three-dimensional display device 2 may include the function provided by the detection device 1.
  • the three-dimensional display device 2 can detect the position of at least one of the left eye and the right eye of the user.
  • the three-dimensional display system 100 can be mounted on the head-up display system 400.
  • the head-up display system 400 is also referred to as a HUD (Head Up Display) 400.
  • the HUD 400 includes the three-dimensional display system 100, an optical member 410, and a projection target member 420 having a projection surface 430.
  • the HUD 400 is configured to cause the image light emitted from the three-dimensional display system 100 to reach the projection target member 420 via the optical member 410.
  • the HUD 400 is configured to cause the image light reflected by the projection target member 420 to reach the left and right eyes of the user.
  • the HUD 400 can cause image light to travel from the three-dimensional display system 100 to the left and right eyes of the user along the optical path 440 indicated by the broken line.
  • the user can view the image light arriving along the optical path 440 as a virtual image 450.
  • the HUD 400 including the three-dimensional display system 100 may be mounted on the moving body 10.
  • part of the configuration of the HUD 400 may be combined with other devices or components included in the moving body 10.
  • for example, the moving body 10 may use its windshield as the projection target member 420.
  • in that case, the remaining configuration may be referred to as a HUD module or a three-dimensional display component.
  • the HUD 400 and the three-dimensional display system 100 may be mounted on the moving body 10.
  • the “moving body” in the present disclosure includes a vehicle, a ship, and an aircraft.
  • the “vehicle” in the present disclosure includes, but is not limited to, automobiles and industrial vehicles, and may include railroad vehicles, vehicles for daily living, and fixed-wing aircraft that travel on runways.
  • Vehicles include, but are not limited to, passenger cars, trucks, buses, motorcycles, and trolleybuses, and may include other vehicles traveling on roads.
  • Industrial vehicles include industrial vehicles for agriculture and construction.
  • Industrial vehicles include, but are not limited to, forklifts and golf carts.
  • Industrial vehicles for agriculture include, but are not limited to, tractors, tillers, transplanters, binders, combines, and lawn mowers.
  • Industrial vehicles for construction include, but are not limited to, bulldozers, scrapers, excavators, mobile cranes, dump trucks, and road rollers.
  • Vehicles include those that travel by human power.
  • the vehicle classification is not limited to the above.
  • an automobile may include an industrial vehicle that can travel on a road, and the same vehicle may be included in multiple classifications.
  • Ships in the present disclosure include marine jets, boats, and tankers.
  • the aircraft in the present disclosure includes a fixed-wing aircraft and a rotary-wing aircraft.

Abstract

This three-dimensional display device (2) comprises a display panel (5), an optical element, an acquisition unit (3), and a controller (7). The display panel (5) includes an active area (51) configured to display a mixed image including a first image and a second image. The active area (51) includes a plurality of subpixels. The optical element is configured to define the light-ray direction of image light emitted from the active area (51). The acquisition unit (3) is configured to acquire the position of at least one of a first eye and a second eye of a user. The controller (7) is configured to display the mixed image in the active area (51). The controller (7) is configured to determine a first subpixel and a second subpixel based on the position of the user's eyes. The controller (7) is configured to display a third image on a third subpixel that is both the first subpixel and the second subpixel.
PCT/JP2019/049676 2018-12-21 2019-12-18 Three-dimensional display device, head-up display system and mobile object WO2020130048A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-240070 2018-12-21
JP2018240070A JP2020101694A (ja) 2018-12-21 2018-12-21 Three-dimensional display device, head-up display system, and moving body

Publications (1)

Publication Number Publication Date
WO2020130048A1 true WO2020130048A1 (fr) 2020-06-25

Family

ID=71101842

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/049676 2018-12-21 2019-12-18 Three-dimensional display device, head-up display system and mobile object WO2020130048A1 (fr)

Country Status (2)

Country Link
JP (1) JP2020101694A (fr)
WO (1) WO2020130048A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012176445A1 (fr) * 2011-06-20 2012-12-27 パナソニック株式会社 Dispositif d'affichage d'images
WO2015145934A1 (fr) * 2014-03-27 2015-10-01 パナソニックIpマネジメント株式会社 Appareil d'affichage d'image virtuelle, système d'affichage tête haute, et véhicule
JP2016177281A (ja) * 2015-03-20 2016-10-06 任天堂株式会社 動的自動立体3d画面の較正方法及び装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022019154A1 (fr) * 2020-07-20 2022-01-27 京セラ株式会社 Dispositif d'affichage tridimensionnel
JP7475231B2 (ja) 2020-07-20 2024-04-26 京セラ株式会社 3次元表示装置

Also Published As

Publication number Publication date
JP2020101694A (ja) 2020-07-02

Similar Documents

Publication Publication Date Title
JP6924637B2 (ja) Three-dimensional display device, three-dimensional display system, moving body, and three-dimensional display method
JP7129789B2 (ja) Head-up display, head-up display system, and moving body
US20200053352A1 (en) Three-dimensional display apparatus, three-dimensional display system, head-up display system, and mobile body
JP7188981B2 (ja) Three-dimensional display device, three-dimensional display system, head-up display, and moving body
WO2020130049A1 (fr) Three-dimensional display device, head-up display system, and moving body
JP7145214B2 (ja) Three-dimensional display device, controller, three-dimensional display method, three-dimensional display system, and moving body
CN114503556A (zh) Three-dimensional display device, controller, three-dimensional display method, three-dimensional display system, and moving body
WO2020130048A1 (fr) Three-dimensional display device, head-up display system and mobile object
JP7188888B2 (ja) Image display device, head-up display system, and moving body
WO2019225400A1 (fr) Image display device, image display system, head-up display, and mobile object
US11874464B2 (en) Head-up display, head-up display system, moving object, and method of designing head-up display
JP7336782B2 (ja) Three-dimensional display device, three-dimensional display system, head-up display, and moving body
WO2023228887A1 (fr) Three-dimensional display device, head-up display system, and moving body
WO2022149599A1 (fr) Three-dimensional display device
JP7475231B2 (ja) Three-dimensional display device
US11961429B2 (en) Head-up display, head-up display system, and movable body
JP7483604B2 (ja) Three-dimensional display system, optical element, installation method, control method, and moving body
EP4040787A1 (fr) Three-dimensional display device, three-dimensional display system, and mobile object
WO2022163728A1 (fr) Three-dimensional display device
US20240064282A1 (en) Three-dimensional display device, three-dimensional display method, three-dimensional display system, and movable body

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19898931

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19898931

Country of ref document: EP

Kind code of ref document: A1