WO2020090629A1 - Image display device, image display system, and moving body - Google Patents

Image display device, image display system, and moving body

Info

Publication number
WO2020090629A1
Authority
WO
WIPO (PCT)
Prior art keywords
temperature
eye
image
controller
user
Application number
PCT/JP2019/041777
Other languages
English (en)
Japanese (ja)
Inventor
薫 草深 (Kaoru Kusafuka)
Original Assignee
京セラ株式会社 (Kyocera Corporation)
Application filed by 京セラ株式会社 (Kyocera Corporation)
Publication of WO2020090629A1

Classifications

    • G PHYSICS
      • G02 OPTICS
        • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
          • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
            • G02B27/01 Head-up displays
          • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
        • G02F OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
          • G02F1/00 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
            • G02F1/01 ... for the control of the intensity, phase, polarisation or colour
              • G02F1/13 ... based on liquid crystals, e.g. single liquid crystal display cells
                • G02F1/133 Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
                  • G02F1/1333 Constructional arrangements; Manufacturing methods
                    • G02F1/1335 Structural association of cells with optical devices, e.g. polarisers or reflectors
    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
            • H04N13/30 Image reproducers
              • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
                • H04N13/31 ... using parallax barriers
              • H04N13/346 Image reproducers using prisms or semi-transparent mirrors
              • H04N13/363 Image reproducers using image projection screens
              • H04N13/366 Image reproducers using viewer tracking
                • H04N13/376 ... for tracking left-right translational head movements, i.e. lateral movements

Definitions

  • the present disclosure relates to an image display device, an image display system, and a moving body.
  • An image display device includes a display panel, a barrier panel, and a controller.
  • the display panel is configured to be able to display a frame including a plurality of right-eye images visually recognized by the user's right eye and a plurality of left-eye images visually recognized by the user's left eye.
  • the barrier panel is positioned to overlap the display panel.
  • the barrier panel is configured to be able to form a plurality of light-transmitting parts and a plurality of light-reducing parts so that the image light of each of the plurality of right-eye images and the plurality of left-eye images reaches the right eye and the left eye of the user, respectively.
  • the controller is configured to be able to acquire temperature information.
  • the controller is configured to control the display panel and the barrier panel based on the temperature information.
  • An image display system includes an image display device and a reflecting member.
  • the image display device includes a display panel, a barrier panel, and a controller.
  • the display panel is configured to be able to display a frame including a plurality of right-eye images visually recognized by the user's right eye and a plurality of left-eye images visually recognized by the user's left eye.
  • the barrier panel is positioned to overlap the display panel.
  • the barrier panel is configured to be able to form a plurality of light-transmitting parts and a plurality of light-reducing parts so that the image light of each of the plurality of right-eye images and the plurality of left-eye images reaches the right eye and the left eye of the user, respectively.
  • the controller is configured to be able to acquire temperature information.
  • the controller is configured to control the display panel and the barrier panel based on the temperature information.
  • the reflecting member reflects the image light so that it reaches the user's left and right eyes.
  • a moving body is equipped with the image display system.
  • the image display system includes an image display device and a reflecting member.
  • the image display device includes a display panel, a barrier panel, and a controller.
  • the display panel is configured to be able to display a frame including a plurality of right-eye images visually recognized by the user's right eye and a plurality of left-eye images visually recognized by the user's left eye.
  • the barrier panel is positioned to overlap the display panel.
  • the barrier panel is configured to be able to form a plurality of light-transmitting parts and a plurality of light-reducing parts so that the image light of each of the plurality of right-eye images and the plurality of left-eye images reaches the right eye and the left eye of the user, respectively.
  • the controller is configured to be able to acquire temperature information.
  • the controller is configured to control the display panel and the barrier panel based on the temperature information.
  • the reflecting member reflects the image light so that it reaches the user's left and right eyes.
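The temperature-based control summarized above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the function names (`barrier_drive_voltage`, `control_step`), the linear compensation coefficient, and the refresh-rate fallback are all assumptions; the text only states that the controller acquires temperature information and controls the display panel and the barrier panel based on it.

```python
# Hypothetical sketch of a controller that adjusts both panels from
# acquired temperature information. All constants are illustrative.

def barrier_drive_voltage(temp_c: float,
                          nominal_v: float = 5.0,
                          coeff_v_per_c: float = -0.01,
                          ref_temp_c: float = 25.0) -> float:
    """Linearly compensate the liquid-crystal drive voltage for temperature.

    Liquid-crystal response varies with temperature, so a controller may
    derive the drive level from the acquired temperature information.
    """
    return nominal_v + coeff_v_per_c * (temp_c - ref_temp_c)

def control_step(temp_c: float) -> dict:
    """One control iteration: derive panel settings from the temperature."""
    return {
        # liquid-crystal response slows when cold, so halve the refresh rate
        "display_refresh_hz": 60 if temp_c > 0 else 30,
        "barrier_voltage_v": barrier_drive_voltage(temp_c),
    }
```

In a real device this step would run periodically, with the temperature read from a sensor such as the temperature element 40 described below.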
  • the image display device 10 includes a display panel 20, a barrier panel 30, and a controller 50.
  • the image display device 10 may further include a temperature element 40.
  • the temperature element 40 may include a temperature sensor or the like.
  • the image display device 10 may further include a detection device 42.
  • the detection device 42 may include a camera or the like.
  • the image display device 10 may further include a light source 44.
  • the light source 44 may include a light-emitting element such as an LED (Light-Emitting Diode).
  • the display panel 20 is configured to be able to display an image that the user can visually recognize.
  • the barrier panel 30 is configured to cause a part of the image light emitted from the display panel 20 to reach one of the user's left eye 5L and right eye 5R, and to cause another part of the image light to reach the other eye. That is, the barrier panel 30 is configured to divide the traveling direction of at least part of the image light between the user's left eye 5L and right eye 5R.
  • the barrier panel 30 may be located closer to or farther from the user than the display panel 20. The image light whose traveling direction is limited by the barrier panel 30 can reach the user's left eye 5L and right eye 5R as different image light.
  • the image display device 10 may be configured to project the parallax image on both eyes of the user.
  • the parallax image is an image projected on each of the left eye 5L and the right eye 5R of the user, and is an image that gives parallax to both eyes of the user.
  • the user can stereoscopically view the image by viewing the parallax image with the left eye 5L and the right eye 5R.
  • the direction in which parallax is given to both eyes of the user is also called the parallax direction.
  • the parallax direction corresponds to the direction in which the left eye 5L and the right eye 5R of the user are lined up.
  • the user's left eye 5L and right eye 5R are also referred to as the user's first eye and second eye, respectively.
  • the controller 50 is connected to each component of the image display device 10 and configured to control each component.
  • the controller 50 may include a first controller 51 and a second controller 52.
  • the first controller 51 may be configured to be able to control the display panel 20.
  • the second controller 52 may be configured to be able to control the barrier panel 30.
  • the first controller 51 and the second controller 52 may be configured to be able to synchronize.
  • the first controller 51 and the second controller 52 may be configured to be controllable in a master-slave relationship.
  • one of the first controller 51 and the second controller 52 may be controllable as the master, and the other as the slave.
  • the controller 50 may further include a host controller located above the first controller 51 and the second controller 52 and controlling the first controller 51 and the second controller 52.
  • the controller 50 may be configured as a processor, for example. Controller 50 may include one or more processors.
  • the processor may include a general-purpose processor that loads a specific program and executes a specific function, and a dedicated processor that is specialized for a specific process.
  • the dedicated processor may include an application-specific integrated circuit (ASIC).
  • the processor may include a programmable logic device (PLD: Programmable Logic Device).
  • the PLD may include an FPGA (Field-Programmable Gate Array).
  • the controller 50 may be an SoC (System on a Chip) or a SiP (System in a Package) in which one or a plurality of processors cooperate.
  • the controller 50 may include a storage unit, and the storage unit may store various kinds of information, a program for operating each component of the image display device 10, and the like.
  • the storage unit may be composed of, for example, a semiconductor memory or the like.
  • the storage unit may be configured to function as a work memory of the controller 50.
  • the display panel 20 displays a left-eye image 23L (see FIG. 8) visually recognized by the user's left eye 5L, a right-eye image 23R (see FIG. 9) visually recognized by the user's right eye 5R, and a planar image 24 (see FIG. 10) visually recognized by both eyes of the user.
  • the display panel 20 is assumed to be a liquid crystal device such as an LCD (Liquid Crystal Display).
  • the display panel 20 is configured to be able to form a first display area 21 and a second display area 22.
  • the first display region 21 includes a plurality of left-eye visual recognition regions 21L visually recognized by the user's left eye 5L and a plurality of right-eye visual recognition regions 21R visually recognized by the user's right eye 5R.
  • the display panel 20 is configured to be able to display the left-eye image 23L in each left-eye visual recognition area 21L.
  • the display panel 20 is configured to be able to display the right-eye image 23R in each right-eye visual recognition region 21R.
  • the display panel 20 is configured to be able to display parallax images in the plurality of left-eye visual recognition areas 21L and the plurality of right-eye visual recognition areas 21R.
  • the left-eye visual recognition area 21L and the right-eye visual recognition area 21R are aligned in the X-axis direction.
  • the parallax direction is associated with the X-axis direction.
  • the X-axis direction is also referred to as the horizontal direction or the first direction.
  • the Y-axis direction is also called the vertical direction or the second direction.
  • the left-eye visual recognition area 21L and the right-eye visual recognition area 21R may be positioned with a space therebetween as illustrated in FIG. 3, or may be adjacent to each other.
  • the display panel 20 is configured to be able to display the planar image 24 in the second display area 22.
  • the plurality of left-eye visual recognition areas 21L and the plurality of right-eye visual recognition areas 21R may extend along the Y-axis direction as shown in FIG. 3, or may extend along a direction inclined at a predetermined angle with respect to the Y-axis direction. In other words, the plurality of left-eye visual recognition areas 21L and the plurality of right-eye visual recognition areas 21R may extend along a direction intersecting the parallax direction. The plurality of left-eye visual recognition regions 21L and the plurality of right-eye visual recognition regions 21R may be alternately arranged along a predetermined direction including a parallax direction component.
  • the pitch in which the left-eye visual recognition area 21L and the right-eye visual recognition area 21R are alternately arranged is also referred to as a parallax image pitch.
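The alternating arrangement of left-eye and right-eye regions along the X axis can be sketched as follows. This is an illustrative example, not the patent's method: `pitch` here is the width of one eye's run of columns (half the parallax image pitch), and the function name is an assumption.

```python
# Illustrative sketch: build one display row by alternating runs of
# `pitch` columns from the left-eye image and the right-eye image,
# mirroring the alternating viewing regions described above.

def interleave_columns(left_cols, right_cols, pitch=1):
    """Alternate runs of `pitch` columns: even runs from the left-eye
    image, odd runs from the right-eye image."""
    assert len(left_cols) == len(right_cols)
    out = []
    for start in range(0, len(left_cols), pitch):
        block = slice(start, start + pitch)
        src = left_cols if (start // pitch) % 2 == 0 else right_cols
        out.extend(src[block])
    return out
```

With `pitch=1` every other column switches eyes; a larger pitch produces wider alternating bands, matching a coarser parallax image pitch.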
  • the barrier panel 30 is configured to make the image light of the plurality of left eye images 23L reach the left eye 5L of the user.
  • the barrier panel 30 is configured to cause the image light of the plurality of right eye images 23R to reach the right eye 5R of the user.
  • the barrier panel 30 is configured to function as an active barrier.
  • the barrier panel 30 is configured to be able to form a first barrier region 31 and a second barrier region 32.
  • the barrier panel 30 is configured to be able to control the transmittance of image light emitted from the display panel 20.
  • the first barrier region 31 corresponds to the first display region 21 and is configured to control the transmittance of image light emitted from the first display region 21.
  • the first barrier region 31 is configured to be able to form a plurality of light transmitting portions 31T and a plurality of light reducing portions 31S.
  • the plurality of light transmissive portions 31T are configured to transmit light that enters the barrier panel 30 from the display panel 20.
  • the plurality of light transmissive portions 31T may be configured to transmit light with a transmittance equal to or higher than the first transmittance.
  • the first transmittance may be, for example, 100%, or may be a value close to 100%.
  • the plurality of light reducing units 31S are configured to reduce the light that enters the barrier panel 30 from the display panel 20.
  • the plurality of light-reducing units 31S may be configured to transmit light with a transmittance equal to or lower than the second transmittance.
  • the second transmittance may be 0% or may be a value close to 0%, for example.
  • the first transmittance is higher than the second transmittance.
  • the first transmittance may be a value smaller than 50%, for example 10%, as long as sufficient contrast can be secured relative to the light transmitted through the plurality of light-reducing portions 31S.
  • the second transmittance may be a value larger than approximately 0%, for example 10%, as long as sufficient contrast can be secured relative to the light transmitted through the plurality of light-transmitting portions 31T.
  • a sufficient contrast ratio may be, for example, 100:1.
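The contrast condition stated above can be expressed as a simple check. This is a minimal sketch under the assumption that "sufficient contrast" means the ratio of the first transmittance (t1) to the second transmittance (t2) meets a required value such as 100:1; the function names are illustrative.

```python
# Minimal sketch of the contrast condition: the ratio between the
# transmittance of the light-transmitting portions (t1) and that of the
# light-reducing portions (t2) should reach the required contrast.

def contrast_ratio(t1: float, t2: float) -> float:
    """Contrast between light passed by transmitting vs. dimming portions."""
    if t2 <= 0.0:
        return float("inf")  # an ideal, fully opaque dimming portion
    return t1 / t2

def contrast_is_sufficient(t1: float, t2: float, required: float = 100.0) -> bool:
    return contrast_ratio(t1, t2) >= required
```

For instance, t1 = 100% with t2 = 1% gives 100:1, while t1 = 10% with t2 = 1% gives only 10:1 and would fail the example threshold.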
  • when the barrier panel 30 is located farther than the display panel 20 from the user's perspective, the barrier panel 30 is configured to be able to control the transmittance of light incident on the display panel 20.
  • the plurality of translucent portions 31T are configured to transmit the light incident on the display panel 20.
  • the plurality of light reducing units 31S are configured to reduce the light that enters the display panel 20.
  • the first barrier region 31 is configured to be able to control the transmittance of light incident on the first display region 21.
  • the intensity of the image light emitted from the display panel 20 is controlled based on the intensity of the incident light.
  • the traveling direction of the image light emitted from the display panel 20 is controlled based on the traveling direction of the incident light.
  • the plurality of translucent portions 31T are configured to cause the image light of the left-eye image 23L to reach the user's left eye 5L, and the image light of the right-eye image 23R to reach the user's right eye 5R.
  • the plurality of dimming units 31S are configured to prevent, or make it difficult for, the image light of the left-eye image 23L to reach the user's right eye 5R, and to prevent, or make it difficult for, the image light of the right-eye image 23R to reach the user's left eye 5L.
  • the barrier panel 30 may be configured such that the user visually recognizes the right eye image 23R with the right eye 5R, but does not visually recognize or is difficult to visually recognize the right eye image 23R with the left eye 5L.
  • the barrier panel 30 may be configured such that the user visually recognizes the left eye image 23L with the left eye 5L, but does not visually recognize or is difficult to visually recognize the left eye image 23L with the right eye 5R.
  • the plurality of light transmitting units 31T and the plurality of light reducing units 31S are configured to define the direction of image light relating to the parallax image including the left eye image 23L and the right eye image 23R.
  • the plurality of light transmitting portions 31T and the plurality of light reducing portions 31S are alternately arranged in the X-axis direction.
  • the boundary between each light-transmitting portion 31T and each light-reducing portion 31S may be along the Y-axis direction as illustrated in FIG. 4, or may be along a direction inclined at a predetermined angle with respect to the Y-axis direction.
  • the boundary between each light transmitting portion 31T and each light reducing portion 31S may be along a direction intersecting the parallax direction. In other words, the plurality of light-transmitting portions 31T and the plurality of light-reducing portions 31S may be alternately arranged along a predetermined direction that includes a parallax direction component.
  • the shapes of the plurality of light transmitting portions 31T and the plurality of light reducing portions 31S may be determined based on the shapes of the plurality of left-eye visual recognition areas 21L and the plurality of right-eye visual recognition areas 21R. Conversely, the shapes of the plurality of left-eye visual recognition areas 21L and the plurality of right-eye visual recognition areas 21R may be determined based on the shapes of the plurality of light transmitting portions 31T and the plurality of light reducing portions 31S.
  • the second barrier region 32 corresponds to the second display region 22, and is configured to control the transmittance of image light emitted from the second display region 22.
  • the barrier panel 30 is assumed to be composed of a liquid crystal shutter.
  • the liquid crystal shutter can be configured to control the light transmittance based on the applied voltage.
  • the liquid crystal shutter may include a plurality of pixels, and the light transmittance of each pixel may be controllable.
  • the liquid crystal shutter may be configured so that a region having a high light transmittance or a region having a low light transmittance can be formed in an arbitrary shape.
  • the plurality of translucent parts 31T may have a transmissivity equal to or higher than the first transmissivity.
  • the plurality of light reducing units 31S may have a transmittance equal to or lower than the second transmittance.
  • the display panel 20 and the barrier panel 30 each have a plurality of pixels.
  • the arrangement pitch of the plurality of pixels of the display panel 20 and the arrangement pitch of the plurality of pixels of the barrier panel 30 may be the same or different. In the present embodiment, it is assumed that the arrangement pitch of the plurality of pixels of the display panel 20 and the arrangement pitch of the plurality of pixels of the barrier panel 30 are the same. In this case, each pixel of the display panel 20 is associated with each pixel of the barrier panel 30.
  • Each pixel of the barrier panel 30 may be configured to be controllable so as to function as one of the plurality of light transmitting portions 31T and the plurality of light reducing portions 31S.
  • the controller 50 may be configured so that control of each pixel of the display panel 20 and control of each pixel of the barrier panel 30 associated therewith can be synchronized.
  • synchronizing the control of the mutually associated pixels of the display panel 20 and the barrier panel 30 can improve image quality.
  • At least a part of the plurality of pixels included in the second display area 22 may be configured to be capable of displaying black.
  • An area formed by a plurality of pixels displaying black is also referred to as a black display area.
  • the controller 50 may be configured to be able to form a plurality of dimming parts 31S in the region of the barrier panel 30 corresponding to the black display region included in the second display region 22. By doing so, the transmittance of image light in the black display region is further reduced. As a result, the black display area looks even blacker to the user.
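The black-region handling above can be sketched as a per-pixel mapping. This is a hypothetical illustration assuming the one-to-one pixel association (equal arrangement pitch) described earlier; the function name and state labels are not from the patent.

```python
# Hypothetical sketch: for display pixels showing black, drive the
# associated barrier pixels as light-reducing portions so the black
# region transmits even less light. Assumes a one-to-one association
# between display pixels and barrier pixels (equal arrangement pitch).

BLACK = (0, 0, 0)  # RGB value treated as "displaying black"

def barrier_states_for_row(display_row):
    """Return a barrier state ('dim' or 'transmit') per associated pixel."""
    return ["dim" if px == BLACK else "transmit" for px in display_row]
```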
  • the barrier panel 30 is located between the left eye 5L and the right eye 5R of the user and the display panel 20.
  • the barrier panel 30 may instead be located farther from the user than the display panel 20.
  • the barrier panel 30 is located along the display panel 20. It can be said that the barrier panel 30 is positioned so as to overlap the display panel 20.
  • the distance between the user's left eye 5L and right eye 5R and the barrier panel 30 is also referred to as the observation distance, and is represented as P.
  • the pitch at which the plurality of light transmitting portions 31T and the plurality of light reducing portions 31S are alternately arranged in the X-axis direction is also referred to as a barrier pitch.
  • the distance between the left eye 5L and the right eye 5R is also called the interocular distance, and is represented by E.
  • the distance between the barrier panel 30 and the display panel 20 is also called a gap and is represented as g.
  • the display panel 20 is configured to be capable of forming a plurality of left-eye viewing areas 21L and a plurality of left-eye non-viewing areas 22L.
  • the plurality of left-eye visual recognition regions 21L are configured to be visible from the left eye 5L of the user via the plurality of translucent portions 31T.
  • the plurality of left-eye non-visible areas 22L are configured to be invisible or difficult to be seen by the user's left eye 5L by the plurality of light reducing units 31S.
  • the plurality of left-eye visible regions 21L and the plurality of left-eye non-visible regions 22L are alternately arranged in the X-axis direction.
  • the positions of the boundaries between the left-eye visible regions 21L and the left-eye non-visible regions 22L are determined based on the positions of the boundaries between the light-transmitting portions 31T and the light-reducing portions 31S, the distance (P) from the barrier panel 30 to the user's eyes, and the gap (g).
  • the display panel 20 is configured to be able to form a plurality of right-eye viewing regions 21R and a plurality of right-eye non-viewing regions 22R.
  • the plurality of right-eye visual recognition regions 21R are configured to be visible from the right eye 5R of the user via the plurality of translucent portions 31T.
  • the plurality of right-eye non-visible regions 22R are configured to be invisible or difficult to be seen by the user's right eye 5R by the plurality of light reducing units 31S.
  • the plurality of right-eye viewing regions 21R and the plurality of right-eye non-viewing regions 22R are alternately arranged in the X-axis direction.
  • the positions of the boundaries between the right-eye visible regions 21R and the right-eye non-visible regions 22R are determined based on the positions of the boundaries between the translucent parts 31T and the dimming parts 31S, the distance (P) from the barrier panel 30 to the user's eyes, and the gap (g).
  • the display panel 20 may display the parallax image based on the position of the boundary between each light-transmitting portion 31T and each light-reducing portion 31S, the distance (P) from the barrier panel 30 to both eyes of the user, and the gap (g).
  • the left eye 5L can visually recognize only the plurality of left-eye images 23L, and the right eye 5R can visually recognize only the plurality of right-eye images 23R.
  • crosstalk can be reduced.
  • the state in which the left eye 5L and the right eye 5R can visually recognize only the plurality of left-eye images 23L and the plurality of right-eye images 23R, respectively, can be realized when the observation distance (P) equals the optimum viewing distance (OVD: Optimal Viewing Distance).
  • the optimum viewing distance is determined based on the interocular distance (E), the gap (g), the barrier pitch, and the parallax image pitch.
  • the controller 50 may be configured to be able to control the display panel 20 and the barrier panel 30 so that the observation distance (P) becomes OVD.
  • the controller 50 may be configured to control the shapes and positions of the plurality of light-transmitting portions 31T and the plurality of light-reducing portions 31S of the barrier panel 30, and the shapes and positions of the plurality of right-eye images 23R and the plurality of left-eye images 23L displayed on the display panel 20, so that the observation distance (P) becomes the OVD.
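The dependence of the optimum viewing distance on the interocular distance (E), the gap (g), and the image pitch follows the standard two-view parallax-barrier geometry. The formulas below are an illustrative reconstruction under that standard model, not taken from this text; `hp` denotes the horizontal pitch of one eye's image columns (half the parallax image pitch).

```python
# Standard two-view parallax-barrier geometry (illustrative reconstruction).
# e:  interocular distance
# g:  gap between the barrier panel and the display panel
# hp: horizontal pitch of one eye's image columns

def optimum_viewing_distance(e: float, g: float, hp: float) -> float:
    """OVD d such that rays from both eyes through one slit land on
    adjacent left/right columns: hp / g = e / d  =>  d = g * e / hp."""
    return g * e / hp

def barrier_pitch(hp: float, d: float, g: float) -> float:
    """Slit repeat so one eye sees every other column through adjacent
    slits: bp / d = 2 * hp / (d + g)  =>  bp slightly under 2 * hp."""
    return 2.0 * hp * d / (d + g)
```

For example, with e = 65 mm, g = 1 mm, and hp = 0.1 mm, the OVD comes out to 650 mm and the barrier pitch to just under 0.2 mm, consistent with the barrier pitch being slightly finer than the parallax image pitch.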
  • the detection device 42 is configured to be able to acquire the position of the user's eyes.
  • the detection device 42 may be configured to be able to detect the position of at least one of the left eye 5L and the right eye 5R of the user.
  • the detection device 42 may include, for example, a camera.
  • the detection device 42 may be configured to be able to photograph the face of the user with a camera.
  • the detection device 42 may be configured to be able to detect the position of at least one of the left eye 5L and the right eye 5R from the image captured by the camera.
  • the detection device 42 may be configured to be able to detect the position of at least one of the left eye 5L and the right eye 5R as the coordinate of the three-dimensional space from the captured image of one camera.
  • the detection device 42 may be configured to be able to detect the position of at least one of the left eye 5L and the right eye 5R as coordinates in a three-dimensional space from images captured by two or more cameras.
  • the detection device 42 may not be provided with a camera and may be configured to be connectable to a camera outside the device.
  • the detection device 42 may include an input terminal configured to be able to input a signal from a camera outside the device.
  • the camera outside the device may be directly connectable to the input terminal.
  • the camera outside the device may be configured to be indirectly connectable to the input terminal via a shared network.
  • the detection device 42 that does not include a camera may include an input terminal configured to allow the camera to input a video signal.
  • the detection device 42 that does not include a camera may be configured to be able to detect the position of at least one of the left eye 5L and the right eye 5R from the video signal input to the input terminal.
  • the detection device 42 may include, for example, a sensor.
  • the sensor may be an ultrasonic sensor, an optical sensor, or the like.
  • the detection device 42 may be configured to be able to detect the position of the user's head with a sensor.
  • the detection device 42 may be configured to be able to detect the position of at least one of the left eye 5L and the right eye 5R based on the position of the head.
  • the detection device 42 may be configured to be able to detect the position of at least one of the left eye 5L and the right eye 5R as a coordinate in a three-dimensional space by one or two or more sensors.
  • the image display device 10 does not have to include the detection device 42.
  • the controller 50 may include an input terminal configured to be able to input a signal from an external device of the image display device 10.
  • the external device may be configured to be connectable to the input terminal.
  • The external device may be configured to be able to use an electric signal or an optical signal as a transmission signal to the input terminal.
  • the external device may be configured to be indirectly connectable to the input terminal via a shared network.
  • the controller 50 is configured to be able to acquire information regarding the position of the user's eyes from the detection device 42 or an external device.
  • the controller 50 may be configured to be able to control the display panel 20 and the barrier panel 30 based on the position of the user's eyes.
  • The control of the display panel 20 and the barrier panel 30 based on the position of the user's eyes is also referred to as eye tracking.
  • the display panel 20 is assumed to have a first pixel 20a, a second pixel 20b, a third pixel 20c, and a fourth pixel 20d arranged in the X-axis direction.
  • The left-eye visual recognition region 21L is assumed to include the first pixel 20a, the second pixel 20b, and the third pixel 20c. In this case, the controller 50 can make the left eye 5L visually recognize the left-eye image 23L by displaying the left-eye image 23L on the first pixel 20a, the second pixel 20b, and the third pixel 20c.
  • When the position of the left eye 5L moves, the controller 50 can make the left eye 5L visually recognize the left-eye image 23L by displaying the left-eye image 23L on the second pixel 20b, the third pixel 20c, and the fourth pixel 20d.
  • the controller 50 can change the position of the left-eye image 23L displayed on the display panel 20 based on the position of the left eye 5L of the user to reduce crosstalk.
  • By the same or similar control as the control based on the position of the left eye 5L, the controller 50 can change the position of the right-eye image 23R displayed on the display panel 20 based on the position of the right eye 5R of the user to reduce crosstalk.
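The pixel reassignment described in the bullets above can be pictured with a toy model. This is a sketch only: the pixel indexing, the three-pixel group, and the one-pixel-shift-per-`shift_step_mm`-of-eye-movement rule are assumptions, not the disclosed implementation.

```python
# Toy sketch of eye tracking on the display panel: choose which subpixel
# columns show the left-eye image based on the left eye's x-position.
# Assumption (not from the document): the assignment shifts by one pixel
# for every `shift_step_mm` of horizontal eye movement.
def left_eye_pixels(eye_x_mm: float, n_pixels: int = 8,
                    group: int = 3, shift_step_mm: float = 10.0) -> list[int]:
    """Return indices of the pixels that should display the left-eye image."""
    shift = int(eye_x_mm // shift_step_mm)
    return [(i + shift) % n_pixels for i in range(group)]

print(left_eye_pixels(0.0))    # eye at reference position -> pixels [0, 1, 2]
print(left_eye_pixels(10.0))   # eye moved 10 mm -> pixels [1, 2, 3]
```

The two printed cases mirror the 20a/20b/20c versus 20b/20c/20d example above: as the eye moves, the same image content slides to the neighboring pixel group.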
  • the display panel 20 is assumed to have a fifth pixel 20e, a sixth pixel 20f, and a seventh pixel 20g arranged in the X-axis direction.
  • The controller 50 is configured to be able to set the left-eye visual recognition region 21L on the display panel 20 and to display the left-eye image 23L on the fifth pixel 20e, the sixth pixel 20f, and the seventh pixel 20g located in the left-eye visual recognition region 21L.
  • The controller 50 is configured to be able to form the plurality of light-transmitting portions 31T and the plurality of light-reducing portions 31S in the barrier panel 30 so that the left-eye visual recognition region 21L can be visually recognized from the position of the user's left eye 5L represented by B.
  • The controller 50 is configured to be able to form a plurality of light-transmitting portions 31T' and a plurality of light-reducing portions 31S' in the barrier panel 30 so that the user's left eye 5L can visually recognize the left-eye visual recognition region 21L from a position represented by B'. By doing so, the controller 50 can cause the left eye 5L to visually recognize the left-eye visual recognition region 21L regardless of whether the user's left eye 5L is located at B or B'.
  • the controller 50 may form the plurality of light transmitting portions 31T and the plurality of light reducing portions 31S of the barrier panel 30 based on the position of the left eye 5L of the user to reduce crosstalk.
  • the control based on the position of the left eye 5L has been described with reference to FIGS. 6 and 7.
  • The left eye 5L in FIGS. 6 and 7 may be replaced with the right eye 5R. That is, by the same or similar control as the control based on the left eye 5L, the controller 50 can form the plurality of light-transmitting portions 31T and the plurality of light-reducing portions 31S of the barrier panel 30 based on the position of the right eye 5R of the user, so that crosstalk can be reduced.
  • Crosstalk can be further reduced by configuring the controller 50 to be able to control at least one of the display panel 20 and the barrier panel 30 based on the position of the user's eyes.
  • the display panel 20 is configured so that the image to be displayed can be sequentially updated. When the display panel 20 updates the display image, it can be considered that the display panel 20 is displaying a moving image.
  • The display panel 20 can be configured to display a moving image composed of a plurality of sequential frames. In the present embodiment, the entire display area of the display panel 20 is regarded as one frame. When the entire display area is one frame, the display panel 20 is configured to sequentially display the frames by sequentially updating the image displayed on the entire display surface.
  • the number of frames displayed on the display panel 20 in a unit time is also referred to as a frame rate.
  • the frame rate may be represented as the number of frames that the display panel 20 displays per second.
  • The display panel 20 can display, as one frame, a combination of a parallax image including a plurality of right-eye images 23R and a plurality of left-eye images 23L displayed in the first display area 21 and a plane image 24 displayed in the second display area 22.
  • the parallax image includes at least a part of the left eye image 23L illustrated in FIG. 8 and at least a part of the right eye image 23R illustrated in FIG. 9.
  • One left-eye image 23L illustrated in FIG. 8 includes a plurality of first sub-left images 231L and a plurality of second sub-left images 232L.
  • the plurality of first sub left-eye images 231L and the plurality of second sub left-eye images 232L do not overlap each other.
  • The plurality of first sub left-eye images 231L and the plurality of second sub left-eye images 232L extend in the Y-axis direction and line up alternately in the X-axis direction within one left-eye image 23L.
  • The total number of pixels in the X-axis direction of the plurality of first sub left-eye images 231L is 1/2 or less of the total number of pixels in the X-axis direction of one left-eye image 23L.
  • the total number of pixels in the X-axis direction of the plurality of second sub left-eye images 232L is 1/2 or less of the total number of pixels in the X-axis direction of one left-eye image 23L.
  • One full-pixel left-eye image 23L is divided into a plurality of first sub left-eye images 231L and a plurality of second sub left-eye images 232L, each of which has half or less of the full pixel count.
  • the number of pixels of each first sub-left image 231L may be different from the number of pixels of each second sub-left image 232L.
  • One right-eye image 23R illustrated in FIG. 9 includes a plurality of first sub-right-eye images 231R and a plurality of second sub-right-eye images 232R.
  • the plurality of first sub-right image 231R and the plurality of second sub-right image 232R do not overlap each other.
  • The plurality of first sub right-eye images 231R and the plurality of second sub right-eye images 232R extend in the Y-axis direction and line up alternately in the X-axis direction within one right-eye image 23R.
  • The total number of pixels in the X-axis direction of the plurality of first sub right-eye images 231R is 1/2 or less of the total number of pixels in the X-axis direction of one right-eye image 23R.
  • the total number of pixels in the X-axis direction of the plurality of second sub-right-eye images 232R is less than or equal to 1/2 of the total number of pixels in the X-axis direction of one right-eye image 23R.
  • One full-pixel right-eye image 23R is divided into a plurality of first sub right-eye images 231R and a plurality of second sub right-eye images 232R, each of which has half or less of the full pixel count.
  • the number of pixels of each first sub-right image 231R may be different from the number of pixels of each second sub-right image 232R.
  • the plurality of first sub-left images 231L and the plurality of first sub-right images 231R do not overlap each other.
  • the display panel 20 can be configured to be able to simultaneously display the plurality of first sub-left images 231L and the plurality of first sub-right images 231R.
  • the plurality of second sub left-eye images 232L and the plurality of second sub right-eye images 232R do not overlap each other.
  • the display panel 20 can be configured to be able to simultaneously display the plurality of second sub left-eye images 232L and the plurality of second sub right-eye images 232R.
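One way to picture the division described above of a full-pixel image into two non-overlapping sets of column strips is the following sketch. The one-column strip width and the even/odd split are assumptions; the bullets only require that the two sets not overlap and that each holds 1/2 or less of the full pixel count.

```python
# Sketch: split a full-pixel image (modeled as a list of columns) into
# first and second sub-images made of alternating column strips.
def split_into_sub_images(columns: list[str]) -> tuple[list[str], list[str]]:
    first = columns[0::2]    # e.g. columns of the first sub left-eye image
    second = columns[1::2]   # e.g. columns of the second sub left-eye image
    return first, second

cols = ["c0", "c1", "c2", "c3", "c4", "c5"]
first, second = split_into_sub_images(cols)
print(first, second)   # ['c0', 'c2', 'c4'] ['c1', 'c3', 'c5']
```

Note that the two sets are disjoint and each contains exactly half of the columns, matching the "1/2 or less" condition stated above.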
  • The controller 50 is configured to cause the display panel 20 to display a frame containing the plurality of first sub left-eye images 231L and the plurality of first sub right-eye images 231R as one parallax image in the first display area 21.
  • The controller 50 is configured to be able to control the first barrier region 31 of the barrier panel 30 so that the display area of each first sub left-eye image 231L includes the left-eye visual recognition region 21L and the display area of each first sub right-eye image 231R includes the right-eye visual recognition region 21R.
  • The controller 50 may be configured to be able to control the barrier panel 30 so that the display areas of the plurality of first sub left-eye images 231L and the plurality of first sub right-eye images 231R match the plurality of left-eye visual recognition regions 21L and the plurality of right-eye visual recognition regions 21R.
  • The controller 50 may be configured to be able to control the barrier panel 30 so that the plurality of left-eye visual recognition regions 21L and the plurality of right-eye visual recognition regions 21R include the respective display areas of the plurality of first sub left-eye images 231L and the plurality of first sub right-eye images 231R.
  • The controller 50 is configured to cause the display panel 20 to display a frame containing the plurality of second sub left-eye images 232L and the plurality of second sub right-eye images 232R as one parallax image in the first display area 21.
  • The controller 50 is configured to be able to control the first barrier region 31 of the barrier panel 30 so that the display area of each second sub left-eye image 232L includes the left-eye visual recognition region 21L and the display area of each second sub right-eye image 232R includes the right-eye visual recognition region 21R.
  • The controller 50 may be configured to be able to control the barrier panel 30 so that the respective display areas of the plurality of second sub left-eye images 232L and the plurality of second sub right-eye images 232R match the plurality of left-eye visual recognition regions 21L and the plurality of right-eye visual recognition regions 21R.
  • The controller 50 may be configured to be able to control the barrier panel 30 so that the plurality of left-eye visual recognition regions 21L and the plurality of right-eye visual recognition regions 21R include the respective display areas of the plurality of second sub left-eye images 232L and the plurality of second sub right-eye images 232R.
  • In each sub-frame, the user visually recognizes one of the plurality of first sub left-eye images 231L and the plurality of second sub left-eye images 232L as the plurality of left-eye images 23L.
  • Likewise, the user visually recognizes one of the plurality of first sub right-eye images 231R and the plurality of second sub right-eye images 232R as the plurality of right-eye images 23R.
  • The total number of pixels in the X-axis direction of each of the plurality of left-eye images 23L and the plurality of right-eye images 23R visually recognized by the user in one sub-frame is 1/2 or less of the full pixel count.
  • By combining the plurality of first sub left-eye images 231L and the plurality of second sub left-eye images 232L, one left-eye image 23L can be visually recognized.
  • One left-eye image 23L visually recognized by the user may have more pixels than each of the plurality of first sub left-eye images 231L and the plurality of second sub left-eye images 232L.
  • One left-eye image 23L visually recognized by the user may be a full-pixel image.
  • the user can visually recognize one right eye image 23R that is a combination of the plurality of first sub right eye images 231R and the plurality of second sub right eye images 232R.
  • One right-eye image 23R visually recognized by the user may have more pixels than each of the plurality of first sub right-eye images 231R and the plurality of second sub right-eye images 232R.
  • One right-eye image 23R visually recognized by the user may be a full-pixel image.
  • the controller 50 is configured to control the display panel 20 and the barrier panel 30 so that the user can visually recognize the frame of FIG. 10 and the frame of FIG. 11 as two continuous frames. By doing so, the user can visually recognize the afterimage of the first frame and the image of the second frame as a single image. As a result, in one embodiment, the user can recognize that one left-eye image 23L with full pixels and one right-eye image 23R with full pixels are both displayed.
  • the image synthesis using the afterimage in the user's eyes is also called human synthesis.
  • the control of synthesizing the parallax images by the afterimage and allowing the user to visually recognize the parallax images is also referred to as parallax image synthesis display.
  • The controller 50 displays the portion of the frame located in the first display area 21 as two sub-frames.
  • the sub-frame includes a parallax image.
  • The portion of the frame shown in FIG. 10 located in the first display area 21 is also referred to as a first sub-frame.
  • a portion of the frame shown in FIG. 11 located in the first display area 21 is also referred to as a second subframe.
  • the controller 50 is configured to display frames so that one frame includes two subframes. That is, the controller 50 is configured to be able to display two frames continuously displayed in the first display area 21 as one parallax image frame including the first subframe and the second subframe.
  • the controller 50 may be configured to be able to control the first barrier region 31 in accordance with the parallax image displayed in the first display region 21.
  • the controller 50 may be configured to be able to control the parallax image displayed in the first display region 21 in accordance with the right-eye visual recognition region 21R and the left-eye visual recognition region 21L formed by the first barrier region 31.
  • The controller 50 may be configured to display the plurality of second sub left-eye images 232L, at the timing of changing the display from the first sub-frame to the second sub-frame, in the region where the plurality of first sub right-eye images 231R were displayed.
  • the controller 50 may be configured to be able to display the plurality of second sub-right eye images 232R in the area where the plurality of first sub-left eye images 231L were being displayed at the same timing. By doing so, the display positions of the plurality of left-eye images 23L and the plurality of right-eye images 23R can be exchanged between the first subframe and the second subframe.
  • the display positions of the plurality of left eye images 23L are included in the display attributes of the plurality of left eye images 23L.
  • the display positions of the plurality of right eye images 23R are included in the display attributes of the plurality of right eye images 23R.
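The exchange of display positions between the first and second sub-frames described above can be sketched as a swap of the two column sets. This is a toy model; the single-character "L"/"R" cell encoding is an assumption used only for illustration.

```python
# Sketch: between the first and second sub-frame, left-eye (L) and
# right-eye (R) sub-images swap columns, so over the two sub-frames each
# eye is shown every column (combined by afterimage synthesis).
def second_subframe(first_subframe: list[str]) -> list[str]:
    """Swap the L and R column assignments for the next sub-frame."""
    return ["R" if cell == "L" else "L" for cell in first_subframe]

sub1 = ["L", "R", "L", "R", "L", "R"]   # first sub-frame column layout
sub2 = second_subframe(sub1)
print(sub2)   # ['R', 'L', 'R', 'L', 'R', 'L']
```

Each column carries "L" in exactly one of the two sub-frames, which is the non-overlap property the bullets above rely on.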
  • the controller 50 is configured to be able to display a plane image frame including the plane image 24 in a portion of the displayed frame located in the second display area 22.
  • the controller 50 is configured to be able to control the second barrier area 32 of the barrier panel 30 in accordance with the planar image 24 displayed in the second display area 22.
  • The controller 50 is configured to be able to control the portion located in the second display area 22 as two continuous plane image frames.
  • One frame is displayed on the first display area 21 and the second display area 22 at the same time. That is, the frame rate in the portion displayed in the first display area 21 and the frame rate in the portion displayed in the second display area 22 are the same.
  • The frame rate of the sub-frames of the parallax image frames displayed in the first display area 21 is the same as the frame rate of the plane image frames displayed in the second display area 22. In other words, the frame rate of the parallax image frames displayed in the first display area 21 is 1/2 of the frame rate of the plane image frames displayed in the second display area 22.
  • The image display device 10 allows each eye of the user to visually recognize, in the first display area 21, a parallax image with little pixel degradation from the full pixel count, and allows the user to visually recognize, in the second display area 22, the plane image 24 having a high frame rate. As a result, the image quality of the image display device 10 is improved.
  • The controller 50 is configured to be able to control the plurality of light-transmitting portions 31T and the plurality of light-reducing portions 31S of the barrier panel 30 so that the plurality of right-eye visual recognition regions 21R are formed in the first display region 21 in each of the first sub-frame and the second sub-frame.
  • The controller 50 may be configured to be able to control the barrier panel 30 so that the plurality of right-eye visual recognition regions 21R formed in the first sub-frame and the plurality of right-eye visual recognition regions 21R formed in the second sub-frame do not overlap each other.
  • The controller 50 is configured to be able to control the plurality of light-transmitting portions 31T and the plurality of light-reducing portions 31S of the barrier panel 30 so that the plurality of left-eye visual recognition regions 21L are formed in the first display region 21 in each of the first sub-frame and the second sub-frame.
  • The controller 50 may be configured to be able to control the barrier panel 30 so that the plurality of left-eye visual recognition regions 21L formed in the first sub-frame and the plurality of left-eye visual recognition regions 21L formed in the second sub-frame do not overlap each other.
  • The controller 50 may be configured to display the plurality of right-eye images 23R and the plurality of left-eye images 23L on the display panel 20 in accordance with the formed plurality of right-eye visual recognition regions 21R and left-eye visual recognition regions 21L.
  • While displaying one parallax image frame, the controller 50 may be configured to control the display panel 20 and the barrier panel 30 on the assumption that the position of the user's eyes remains the same until a new parallax image frame is displayed. That is, even when the positions of the eyes of the user differ between the first sub-frame and the second sub-frame, the right-eye visual recognition regions 21R may be formed assuming that the eye positions in the respective sub-frames are the same.
  • When displaying the second sub-frame included in the same parallax image frame subsequent to the first sub-frame, the controller 50 may determine the positions of the plurality of right-eye visual recognition regions 21R in the second sub-frame based on their positions in the first sub-frame.
  • In the second sub-frame, the controller 50 is configured to be able to determine the positions of the plurality of right-eye visual recognition regions 21R without depending on the current positions of the eyes of the user.
  • When displaying the first sub-frame of a new parallax image frame subsequent to the second sub-frame, the controller 50 may determine the positions of the plurality of right-eye visual recognition regions 21R without being based on their positions in the second sub-frame. In this case, the controller 50 can determine the positions of the plurality of right-eye visual recognition regions 21R based on the positions of the eyes of the user, rather than on their positions in the second sub-frame displayed immediately before.
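The sampling policy just described, refreshing the eye position only at the first sub-frame of each parallax image frame and reusing it for the second sub-frame, might be sketched as follows. The class and method names are assumptions, and `measured_pos` stands in for a reading from the detection device 42.

```python
# Sketch: within one parallax image frame, reuse the eye position sampled
# at its first sub-frame; take a fresh measurement only when a new
# parallax image frame starts (sub-frame index 0).
class EyeTrackPolicy:
    def __init__(self) -> None:
        self._frame_eye_pos: float | None = None

    def eye_position_for(self, subframe_index: int, measured_pos: float) -> float:
        if subframe_index == 0 or self._frame_eye_pos is None:
            self._frame_eye_pos = measured_pos   # first sub-frame: refresh
        return self._frame_eye_pos               # second sub-frame: reuse

policy = EyeTrackPolicy()
print(policy.eye_position_for(0, 10.0))  # new frame: uses measured 10.0
print(policy.eye_position_for(1, 12.0))  # same frame: still 10.0
print(policy.eye_position_for(0, 12.0))  # next frame: refreshes to 12.0
```

Holding the position fixed across the two sub-frames is what keeps the first and second sub right-eye images aligned to the same visual recognition regions, avoiding the shift and crosstalk discussed next.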
  • Suppose instead that the plurality of right-eye visual recognition regions 21R were formed based on the eye position in each sub-frame.
  • In that case, at least one of the plurality of first sub right-eye images 231R displayed in the first sub-frame and the plurality of second sub right-eye images 232R displayed in the second sub-frame may be displayed shifted with respect to the plurality of right-eye visual recognition regions 21R. This phenomenon results from the plurality of first sub right-eye images 231R and the plurality of second sub right-eye images 232R not overlapping each other. When this phenomenon occurs, crosstalk occurs.
  • In this case, the plurality of first sub right-eye images 231R and the plurality of second sub right-eye images 232R overlap at least in part. As a result, the image quality of one right-eye image 23R seen by the user is degraded.
  • The display mode of the plurality of right-eye images 23R in the plurality of right-eye visual recognition regions 21R has been described with respect to the right eye 5R; the display mode of the plurality of left-eye images 23L in the plurality of left-eye visual recognition regions 21L can be described in the same or a similar manner.
  • Within one parallax image frame, the control is performed assuming that the position of the user's eyes is the same. By doing so, crosstalk is less likely to occur, and the quality of the image seen by the user is less likely to deteriorate.
  • The image display device 10 may operate in a room-temperature environment of about 20°C to 25°C, or at a temperature lower or higher than room temperature.
  • the temperature of the image display device 10 itself may change.
  • the response speed of the liquid crystal changes based on the operating temperature of the liquid crystal device. For example, the lower the operating temperature of the liquid crystal device, the slower the response of the liquid crystal. The higher the operating temperature of the liquid crystal device, the faster the response of the liquid crystal.
  • the image display device 10 may be configured to be able to change the operation mode of the liquid crystal device based on the operation temperature of the liquid crystal device.
  • the controller 50 may be configured to be able to acquire temperature information from the temperature element 40.
  • the temperature element 40 may include, for example, an element such as a temperature sensor configured to measure temperature.
  • the temperature element 40 may include an element such as a thermocouple, a thermistor, or a crystal oscillator, whose characteristics change according to temperature changes.
  • the controller 50 may be configured to be able to acquire the thermoelectromotive force generated based on the temperature difference between the reference contact and the temperature measurement contact of the thermocouple as the temperature information.
  • the thermistor transitions between an electrically conductive state and an electrically insulating state with a predetermined temperature as a threshold value.
  • the controller 50 may be configured to be able to acquire the state of the thermistor as temperature information.
  • the controller 50 may be configured to be able to acquire the natural frequency of the crystal oscillator as temperature information.
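As a rough illustration of turning a raw temperature-element reading into a temperature value, the thermocouple case above might be approximated as below. The linear type-K Seebeck coefficient of about 41 µV/°C near room temperature is a textbook approximation, not something stated in this document, and the reference-junction handling is simplified.

```python
# Hedged sketch: convert a thermocouple EMF reading into a temperature.
# Assumption (not from this document): a type-K thermocouple with a
# roughly linear sensitivity of 41 uV/degC near room temperature, and a
# reference (cold) junction held at a known temperature.
def thermocouple_temp_c(emf_uV: float, cold_junction_c: float = 25.0) -> float:
    """Linear approximation: delta-T = EMF / (41 uV/degC)."""
    return cold_junction_c + emf_uV / 41.0

print(round(thermocouple_temp_c(410.0), 1))  # 410 uV ~ 10 degC above 25 -> 35.0
```

A real controller would use standard reference tables rather than a single linear coefficient, but the sketch shows how a thermoelectromotive force can serve as the "temperature information" the bullets describe.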
  • the controller 50 may be configured without the temperature element 40 and capable of acquiring temperature information from an external device such as a sensor.
  • the image display device 10 may further include a temperature information acquisition unit configured to be able to acquire temperature information.
  • the controller 50 may be configured to be able to acquire temperature information from the temperature element 40 or an external device via the temperature information acquisition unit.
  • the external device may be directly connectable to the temperature information acquisition unit, or may be indirectly connectable via a shared network or the like.
  • the external device may be configured to be able to use an electric signal and an optical signal as a transmission signal to the temperature information acquisition unit.
  • the temperature information may include information on the temperature of the atmosphere around the image display device 10.
  • the temperature information may include information about the temperature of at least one of the display panel 20 and the barrier panel 30.
  • the temperature information may include information about the temperature of at least a part of the image display device 10.
  • the temperature element 40 may be positioned so as to contact at least a part of the image display device 10.
  • the temperature element 40 may be positioned so as to contact at least one of the display panel 20 and the barrier panel 30.
  • the temperature element 40 may be located around the image display device 10.
  • the temperature element 40 may be located inside the image display device 10.
  • the external device that detects the temperature may be located so as to be in contact with at least a part of the image display device 10, or may be located around the image display device 10.
  • the controller 50 may be configured to be able to change the operation mode of the liquid crystal device based on the temperature information.
  • the controller 50 may be configured to be able to estimate the operating temperature of the liquid crystal device based on the temperature information, and to be able to change the operating mode of the liquid crystal device based on the estimation result.
  • When the controller 50 is configured to be able to execute composite display of parallax images, the controller 50 is configured to be able to change the plurality of light-transmitting portions 31T and the plurality of light-reducing portions 31S in the barrier panel 30, and thus a fast response of the liquid crystal device is required. Since the controller 50 is configured to be able to change the display positions of the plurality of right-eye images 23R and the plurality of left-eye images 23L on the display panel 20, a fast response of the liquid crystal device is likewise required. When the operating temperature of the liquid crystal device is lower than a predetermined temperature, the response of each pixel of the display panel 20 or the barrier panel 30 may not be able to follow the composite display of parallax images.
  • The controller 50 may be configured to be capable of transitioning, based on the temperature information, to either a mode in which parallax image composite display is executed or a mode in which it is not executed.
  • the predetermined temperature referred to by the controller 50 to determine whether to execute the composite display is also referred to as a composite display determination temperature.
  • the combined display determination temperature is also referred to as the first temperature.
  • the controller 50 may be configured to be capable of transitioning to a mode in which parallax image composite display is executed when the temperature corresponding to the temperature information is equal to or higher than the first temperature.
  • the controller 50 may be configured to be able to transition to a mode in which parallax image composite display is not executed when the temperature corresponding to the temperature information is lower than the first temperature.
  • If composite display is executed when the response of the liquid crystal device cannot follow it, crosstalk may occur. Since the controller 50 is configured to be able to determine whether to execute composite display of parallax images based on the temperature information, crosstalk is less likely to occur.
  • When the controller 50 is configured to execute eye tracking, the faster the movement of the user's eye position, the faster the required response of the liquid crystal device. When the operating temperature of the liquid crystal device is lower than a predetermined temperature, the response of each pixel of the display panel 20 or the barrier panel 30 may not be able to follow the movement of the position of the user's eyes.
  • the controller 50 may be configured to be capable of transitioning to either a mode in which eye tracking is performed or a mode in which eye tracking is not performed based on the temperature information.
  • The predetermined temperature referred to by the controller 50 to determine whether to execute eye tracking is also referred to as the eye tracking determination temperature.
  • The eye tracking determination temperature is also referred to as the second temperature.
  • the controller 50 may be configured to be able to transition to a mode for executing eye tracking when the temperature corresponding to the temperature information is equal to or higher than the second temperature.
  • the controller 50 may be configured to be able to transition to a mode in which eye tracking is not executed when the temperature corresponding to the temperature information is lower than the second temperature.
  • Since the controller 50 determines whether or not to execute eye tracking based on the temperature information, crosstalk is less likely to occur.
  • The controller 50 may be configured to be able to transition to any one of first to fourth control modes.
  • The first mode corresponds to a control mode in which neither parallax image composite display nor eye tracking is executed.
  • The second mode corresponds to a control mode in which eye tracking is executed but parallax image composite display is not executed.
  • The third mode corresponds to a control mode in which parallax image composite display is executed but eye tracking is not executed.
  • The fourth mode corresponds to a control mode in which both parallax image composite display and eye tracking are executed.
  • the controller 50 may be configured to be able to transition between the first mode and the third mode, and between the second mode and the fourth mode, based on a comparison between the temperature corresponding to the temperature information and the first temperature.
  • the controller 50 may be configured to be able to transition between the first mode and the second mode, and between the third mode and the fourth mode, based on a comparison between the temperature corresponding to the temperature information and the second temperature. That is, when the controller 50 is configured to be able to transition between the first mode and the fourth mode, it may be configured to transition via either the second mode or the third mode.
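The four-mode selection described in the bullets above could be sketched as follows. This is an illustrative assumption, not part of the disclosure: the function name, the parameter names, and the use of simple threshold comparisons are all hypothetical.

```python
# Hypothetical sketch of the four-mode selection described above.
# The controller enables parallax image composite display at or above the
# first temperature, and eye tracking at or above the second temperature.

def select_control_mode(temp, first_temp, second_temp):
    """Return the control mode (1-4) for the measured temperature."""
    composite = temp >= first_temp    # parallax image composite display allowed
    eye_track = temp >= second_temp   # eye tracking allowed
    if composite and eye_track:
        return 4  # both composite display and eye tracking
    if composite:
        return 3  # composite display only
    if eye_track:
        return 2  # eye tracking only
    return 1      # neither
```

Note that in this sketch a transition between mode 1 and mode 4 necessarily passes through mode 2 or mode 3, since the two thresholds are generally crossed at different temperatures, consistent with the bullet above.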
  • the controller 50 may be configured to be able to control either one of the display panel 20 and the barrier panel 30, or both of them, in order to execute eye tracking in the second mode and the fourth mode.
  • the configurations in which the controller 50 executes eye tracking may include a case where both the display panel 20 and the barrier panel 30 are controllable, a case where only the display panel 20 is controllable, and a case where only the barrier panel 30 is controllable.
  • the light source 44 may be located close to the display panel 20 and the barrier panel 30.
  • the heat generated by the light source 44 may change the operating temperature of the display panel 20 and the barrier panel 30.
  • the light source 44 is located on the far side of the display panel 20 and the barrier panel 30 as viewed from the user.
  • when the display panel 20 is located closer to the user than the barrier panel 30, the barrier panel 30 is located closer to the light source 44 than the display panel 20. In this case, the temperature of the barrier panel 30 tends to be higher than that of the display panel 20. When the temperature of the barrier panel 30 is higher than that of the display panel 20, the response speed of each pixel of the barrier panel 30 may be faster than that of each pixel of the display panel 20.
  • in this case, the controller 50 may be configured to control the barrier panel 30 to execute eye tracking in the second mode and the fourth mode. In this way, the eye tracking executed by the controller 50 can more easily follow the movement of the position of the user's eyes.
  • when the barrier panel 30 is located closer to the user than the display panel 20, the display panel 20 is located closer to the light source 44 than the barrier panel 30. In this case, the temperature of the display panel 20 tends to be higher than that of the barrier panel 30. When the temperature of the display panel 20 is higher than that of the barrier panel 30, the response speed of each pixel of the display panel 20 may be faster than that of each pixel of the barrier panel 30.
  • in this case, the controller 50 may be configured to control the display panel 20 to execute eye tracking in the second mode and the fourth mode. In this way, the eye tracking executed by the controller 50 can more easily follow the movement of the position of the user's eyes.
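The choice described in the preceding bullets amounts to driving, for eye tracking, whichever panel sits nearer the light source 44 and therefore runs warmer and responds faster. A minimal sketch, with hypothetical names and under the assumption that panel temperature tracks proximity to the light source:

```python
# Hypothetical sketch: pick the panel to drive for eye tracking.
# Assumption: the panel nearer the light source 44 runs warmer, so its
# liquid crystal pixels respond faster.

def panel_for_eye_tracking(display_panel_nearer_user: bool) -> str:
    # If the display panel is nearer the user, the barrier panel is nearer
    # the light source, so drive the barrier panel, and vice versa.
    return "barrier_panel" if display_panel_nearer_user else "display_panel"
```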
  • when the controller 50 is configured to display an image on the display panel 20, the controller 50 may be configured to be able to control each pixel of the display panel 20 with intermediate gradations.
  • the controller 50 may be configured to transition each pixel of the barrier panel 30 to one of two states, a white display state having the highest transmittance and a black display state having the lowest transmittance.
  • the controller 50 may be configured to be able to control each pixel of the barrier panel 30 with two gradations, a maximum gradation corresponding to a white display state and a minimum gradation corresponding to a black display state.
  • when each pixel of the barrier panel 30 is controlled with only these two gradations, the pixel responds faster than when it is controlled with intermediate gradations.
  • as a result, the response speed of each pixel of the barrier panel 30 may become faster than that of each pixel of the display panel 20. Even when the temperature of the barrier panel 30 is lower than that of the display panel 20, the response speed of each pixel of the barrier panel 30 can approach that of each pixel of the display panel 20.
  • the controller 50 may be configured to be able to control the barrier panel 30 to execute eye tracking, so that the eye tracking can easily follow the movement of the position of the user's eyes.
  • the controller 50 may be configured to be able to control the gradation of each pixel of the barrier panel 30 by the magnitude of the voltage applied to the pixel. In the present embodiment, it is assumed that a pixel transitions to the black display state when no voltage is applied to it, and transitions to the white display state when a predetermined voltage is applied to it.
  • the predetermined voltage may be determined by the specifications of the barrier panel 30.
  • the controller 50 may be configured to be able to apply a voltage higher than a predetermined voltage to each pixel of the barrier panel 30, so that each pixel can be quickly transitioned to the white display state. That is, the controller 50 can be configured so that the response speed of each pixel can be controlled by the voltage applied to each pixel of the barrier panel 30.
  • the controller 50 may be configured to make the voltage applied to each pixel of the barrier panel 30 larger than the predetermined voltage when the temperature corresponding to the temperature information is lower than the predetermined temperature.
  • when the controller 50 is configured to make the voltage applied to each pixel of the barrier panel 30 larger than the predetermined voltage, the response speed of each pixel of the barrier panel 30 is less likely to decrease even when the temperature of the barrier panel 30 decreases.
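The overdrive behavior described above can be sketched as follows. The function name, the threshold, and the voltage values are placeholders; in practice the predetermined voltage is set by the specifications of the barrier panel 30, as noted above.

```python
# Hypothetical sketch of temperature-compensated overdrive for the barrier
# panel. Below the threshold temperature, the controller applies a voltage
# larger than the nominal (predetermined) voltage so that the transition to
# the white display state stays fast despite the colder, slower liquid crystal.

def white_state_drive_voltage(temp, threshold_temp, nominal_v, overdrive_v):
    """Voltage to apply to a pixel transitioning to the white display state."""
    if temp < threshold_temp:
        return overdrive_v  # overdrive: larger than the predetermined voltage
    return nominal_v        # normal drive: the predetermined voltage
```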
  • because a parallax image frame is displayed as two sub-frames in parallax image composite display, the frame rate of the parallax image frames is 1/2 of the frame rate of the sub-frames.
  • when parallax image composite display is not executed, the controller 50 can display the parallax images at the same frame rate as the sub-frames. That is, the frame rate of the parallax image when parallax image composite display is executed is 1/2 of the frame rate of the parallax image when it is not executed.
  • when the frame rate of the parallax image is reduced, the image quality of a moving image provided as the parallax image is degraded.
  • the controller 50 may be configured to increase the frame rate of each of the sub-frames, the parallax image frames, and the plane image frames when parallax image composite display is executed and the temperature corresponding to the temperature information is equal to or higher than a predetermined temperature.
  • the controller 50 may be configured to double the frame rate, for example. In this way, the frame rate when parallax image composite display is executed can be brought closer to the frame rate when it is not executed. As a result, when a moving image is provided as a parallax image, its image quality is less likely to deteriorate.
  • the predetermined temperature referred to by the controller 50 to determine whether to double the frame rate is also referred to as the double-speed display determination temperature.
  • the double-speed display determination temperature is higher than the first temperature.
  • the controller 50 may be configured to increase the frame rate by a predetermined scaling factor based on the temperature information.
  • the controller 50 may be configured to increase the frame rate by a factor of two or more.
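The relationship between sub-frame rate, composite display, and double-speed display described above could be sketched as follows; the function and parameter names, and the Hz figures in the comments, are illustrative assumptions.

```python
# Hypothetical sketch of the effective parallax-image frame rate.
# Composite display shows each parallax image frame as two sub-frames,
# halving the frame rate; at or above the double-speed display
# determination temperature, the controller scales the sub-frame rate
# (by a factor of 2 in this sketch) to compensate.

def parallax_frame_rate(subframe_hz, composite, temp, dsd_temp, factor=2):
    """Return the frame rate at which parallax image frames are shown."""
    if composite and temp >= dsd_temp:
        subframe_hz *= factor  # double-speed display
    return subframe_hz / 2 if composite else subframe_hz
```

For example, with 60 Hz sub-frames, composite display alone would halve the parallax frame rate to 30 Hz, and double-speed display would restore it to 60 Hz.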
  • the controller 50 may be configured to display all pixels of the barrier panel 30 in white when the temperature corresponding to the temperature information is lower than the predetermined temperature.
  • in this case, the controller 50 is configured to display only the plane image 24 on the display panel 20, allowing the user to visually recognize only the plane image 24. In this way, degradation of the image quality of the parallax image due to the slow response speed of each pixel of the barrier panel 30 can be avoided.
  • the image display system 1 includes an image display device 10 and a reflection member 60.
  • the image display system 1 causes the image display device 10 to display an image and emit the image light.
  • the image light is reflected by the reflecting member 60 along the path indicated by the broken line and reaches the left eye 5L and the right eye 5R of the user.
  • the user can visually recognize the image displayed on the image display device 10.
  • by viewing the image light reflected by the reflecting member 60, the user visually recognizes the image displayed on the image display device 10 as a virtual image 10Q.
  • the virtual image 10Q is located on the path indicated by the alternate long and short dash line, obtained by extending the broken-line path connecting the user's left eye 5L and right eye 5R to the reflecting member 60 to the opposite side of the reflecting member 60.
  • the image display system 1 may be a head-up display (HUD).
  • the image display system 1 and the image display device 10 may provide stereoscopic viewing to a user who views the image display device 10 directly.
  • the image display system 1 may be mounted on a mobile body.
  • the user of the image display system 1 may be a driver, an operator, or a passenger of the moving body.
  • a part of the configuration of the image display system 1 may also serve as other devices or parts included in the moving body.
  • the windshield of the moving body may also be used as a part of the configuration of the image display system 1.
  • the reflecting member 60 shown in FIG. 1 may be replaced by the windshield of the moving body.
  • the temperature element 40 or an external device configured to detect temperature may be located inside the moving body.
  • the temperature element 40 or the external device configured to detect temperature may be located inside the vehicle cabin.
  • the temperature element 40 or the external device may be configured to be able to output temperature information about the temperature inside the moving body or inside the vehicle cabin.
  • The moving body in the present disclosure includes vehicles, ships, and aircraft.
  • The vehicle in the present disclosure includes, but is not limited to, automobiles and industrial vehicles, and may include railroad vehicles, vehicles for daily life, and fixed-wing aircraft that travel on a runway.
  • Vehicles include, but are not limited to, passenger cars, trucks, buses, motorcycles, trolleybuses, and the like, and may include other vehicles traveling on roads.
  • Industrial vehicles include industrial vehicles for agriculture and construction.
  • Industrial vehicles include, but are not limited to, forklifts and golf carts.
  • Industrial vehicles for agriculture include, but are not limited to, tractors, tillers, transplanters, binders, combines, and lawnmowers.
  • Industrial vehicles for construction include, but are not limited to, bulldozers, scrapers, excavators, mobile cranes, dump trucks, and road rollers. Vehicles also include those driven by human power.
  • The classification of vehicles is not limited to the above.
  • an automobile may include an industrial vehicle that can travel on a road, and the same vehicle may be included in multiple classifications.
  • the vessels in the present disclosure include marine jets, boats, and tankers.
  • the aircraft in the present disclosure includes a fixed-wing aircraft and a rotary-wing aircraft.
  • the above description has been given on the assumption that the display panel 20 and the barrier panel 30 are liquid crystal devices.
  • however, at least one of the display panel 20 and the barrier panel 30 may be any device whose operating speed changes according to its operating temperature.
  • descriptions such as "first" and "second" in the present disclosure are identifiers for distinguishing configurations.
  • configurations distinguished by descriptions such as "first" and "second" in the present disclosure can have their numbers exchanged.
  • for example, the first mode can exchange the identifiers "first" and "second" with the second mode.
  • the exchange of identifiers is performed simultaneously, and the configurations remain distinguished after the exchange.
  • identifiers may be deleted.
  • a configuration from which an identifier is deleted is distinguished by its reference sign. The mere description of identifiers such as "first" and "second" in the present disclosure shall not be used as a basis for interpreting the order of the configurations or as grounds for the existence of an identifier with a smaller number.
  • the X axis, the Y axis, and the Z axis are provided for convenience of description, and may be interchanged with each other.
  • the configuration according to the present disclosure has been described using the orthogonal coordinate system configured by the X axis, the Y axis, and the Z axis.
  • the positional relationship between the components according to the present disclosure is not limited to the orthogonal relationship.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Liquid Crystal (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The present invention relates to an image display device comprising a display panel, a barrier panel, and a controller. The display panel is configured to be able to display a frame including a plurality of right-eye images viewed by a user's right eye and a plurality of left-eye images viewed by the user's left eye. The barrier panel is positioned so as to overlap the display panel. The barrier panel is configured to be able to form a plurality of light-transmitting portions and a plurality of dimming portions such that the image light associated with the plurality of right-eye images and the plurality of left-eye images reaches the user's right eye and left eye, respectively. The controller is configured to be able to acquire temperature information. The controller is configured to be able to control the display panel and the barrier panel based on the temperature information.
PCT/JP2019/041777 2018-10-31 2019-10-24 Image display device, image display system, and moving body WO2020090629A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-205975 2018-10-31
JP2018205975A JP2020072405A (ja) 2018-10-31 2018-10-31 Image display device, image display system, and moving body

Publications (1)

Publication Number Publication Date
WO2020090629A1 true WO2020090629A1 (fr) 2020-05-07

Family

ID=70462362

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/041777 WO2020090629A1 (fr) 2018-10-31 2019-10-24 Image display device, image display system, and moving body

Country Status (2)

Country Link
JP (1) JP2020072405A (fr)
WO (1) WO2020090629A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7317517B2 (ja) * 2019-02-12 2023-07-31 Japan Display Inc. Display device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012155021A (ja) * 2011-01-24 2012-08-16 Sony Corp Display device, barrier device, and method of driving display device
JP2013175805A (ja) * 2012-02-23 2013-09-05 Nikon Corp Display device and imaging device
JP2014045474A (ja) * 2012-07-31 2014-03-13 Nlt Technologies Ltd Stereoscopic image display device, image processing device, and stereoscopic image processing method
JP2014150304A (ja) * 2013-01-31 2014-08-21 Nippon Seiki Co Ltd Display device and display method thereof
JP2015215510A (ja) * 2014-05-12 2015-12-03 Panasonic Intellectual Property Management Co., Ltd. Display device and display method
JP2018120189A (ja) * 2017-01-27 2018-08-02 Osaka City University Three-dimensional display device, three-dimensional display system, head-up display system, and moving body


Also Published As

Publication number Publication date
JP2020072405A (ja) 2020-05-07

Similar Documents

Publication Publication Date Title
JP7100523B2 (ja) Display device, display system, and moving body
CN112888990B (zh) Image display device, image display system, and moving body
WO2021065825A1 (fr) Three-dimensional display device, three-dimensional display system, head-up display, and moving body
US20230004002A1 Head-up display, head-up display system, and movable body
WO2020090629A1 (fr) Image display device, image display system, and moving body
US11881130B2 Head-up display system and moving body
JP7274392B2 (ja) Camera, head-up display system, and moving body
JP7227977B2 (ja) Image display device, image display system, and moving body
CN113016178A (zh) Head-up display, head-up display system, moving body, and head-up display design method
JP7189228B2 (ja) Image display device, image display system, and moving body
JP7178413B2 (ja) Image display device, image display system, and moving body
US11961429B2 Head-up display, head-up display system, and movable body
US20220402361A1 Head-up display module, head-up display system, and movable body
WO2021090956A1 (fr) Head-up display, head-up display system, and moving body
WO2020256154A1 (fr) Three-dimensional display device and system, and moving object
WO2022019154A1 (fr) Three-dimensional display device
WO2020031978A1 (fr) Image display device, image display system, and moving body

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19880331

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19880331

Country of ref document: EP

Kind code of ref document: A1