WO2023074138A1 - Image display apparatus, electronic appliance, and integrated type device - Google Patents

Image display apparatus, electronic appliance, and integrated type device

Info

Publication number
WO2023074138A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
eye
image
sensing
sensing light
Prior art date
Application number
PCT/JP2022/033700
Other languages
French (fr)
Japanese (ja)
Inventor
利文 安井
Original Assignee
ソニーグループ株式会社
Priority date
Filing date
Publication date
Application filed by ソニーグループ株式会社
Publication of WO2023074138A1 publication Critical patent/WO2023074138A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/02 Viewing or reading apparatus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/64 Constructional details of receivers, e.g. cabinets or dust covers

Definitions

  • The present disclosure relates to an image display device that can be mounted on a head-mounted display or an EVF (Electronic View Finder), an electronic device such as a head-mounted display having this image display device, and an integrated device included in this image display device.
  • Head-mounted displays may use eye-sensing technology to detect the line of sight of the viewer.
  • the image monitor device incorporates an image generation device that displays an image of a subject, and an optical system and an eyepiece are arranged in front of it.
  • a pair of infrared diodes that emit infrared rays toward the eyes of the observer are arranged near the bottom of the eyepiece.
  • A dichroic mirror is positioned between the eyepiece lenses, at an inclination angle greater than 45 degrees with respect to the optical axis of the eyepiece, and reflects each corneal reflection image of the observer.
  • Each corneal reflection image reflected by the dichroic mirror is condensed by the imaging lens and formed on the image sensor (abstract).
  • The technique of Patent Document 1 can be implemented in a large image monitor device that has space for a dichroic mirror, but it cannot be implemented in a compact device.
  • An object of the present disclosure is to provide a compact image display device, mountable on a head-mounted display or an EVF, whose compact eye-sensing optical system reduces the impact on the form factor.
  • An image display device includes: an optical system; an image light projection unit that emits image light that is imaged on the retina through the optical system; an eye-sensing light projection unit that emits eye-sensing light that is reflected by the pupil; an eye-sensing light receiving unit that receives, via the optical system, the eye-sensing light reflected by the pupil; and a light-receiving microlens for forming an image, on the eye-sensing light receiving unit, of the eye-sensing light reflected by the pupil and incident through the optical system, these last four elements being provided as an integrated device. The light-receiving microlens is not included in the optical path of the image light and is not included in the optical path along which the eye sensing light reaches the pupil.
  • the entrance pupils of the image light projecting section and the light receiving microlens are arranged at the same position in the optical axis direction of the optical system.
  • The degree of freedom in the form factor is increased, the size is reduced, and, because the eye-sensing light receiving part views the pupil from the front, the eye-sensing accuracy can be improved.
  • The eye sensing light receiving unit may have a cut filter that cuts the image light; alternatively, the light-receiving microlens itself may cut the image light.
  • Light projected from the eye-sensing light projecting unit may pass through the optical system and be reflected by the pupil, and the eye-sensing light projecting unit and the entrance pupil of the light-receiving microlens may be arranged at the same position in the optical axis direction of the optical system.
  • The degree of freedom in the form factor is increased, the size is reduced, and, because the eye-sensing light receiving part views the pupil from the front, the eye-sensing accuracy can be improved.
  • the eye-sensing light projector may have a light-projecting microlens for increasing the amount of the eye-sensing light.
  • the projection microlens of the eye-sensing light projection unit may be decentered so that the direction of the eye-sensing light coincides with the exit angle direction of the image light.
  • a radial direction of the effective image circle from the center of the image light projecting unit is defined as an α direction,
  • the direction orthogonal to the α direction is defined as the β direction,
  • D is the distance from the center of the image light projecting unit to the center of the eye sensing light projecting unit,
  • B is the pupil diameter of the image light projecting unit,
  • the emergence angles from D with respect to B are θα+_D, θα−_D, θβ+_D, θβ−_D, and θα_Center_D, where θα_Center_D is the chief ray angle,
  • the projection range of the eye sensing light projecting unit is Cα_PRJ in the α direction and Cβ_PRJ in the β direction, and the projection range angles of the eye sensing light with respect to the range C are θα+_PRJ, θα−_PRJ, θβ+_PRJ, and θβ−_PRJ.
  • the light-receiving microlens of the eye-sensing light receiving section may be decentered so that the eye-sensing light reflected by the pupil is imaged at the center of the eye-sensing light-receiving section.
  • a radial direction of the effective image circle from the center of the image light projecting unit is defined as an α direction,
  • the direction orthogonal to the α direction is defined as the β direction,
  • A is the distance from the center of the image light projecting unit to the center of the eye sensing light receiving unit,
  • B is the pupil diameter of the image light projecting unit,
  • the emergence angles from A with respect to B are θα+_A, θα−_A, θβ+_A, θβ−_A, and θα_Center_A, where θα_Center_A is the chief ray angle,
  • the light receiving range of the eye sensing light receiving unit is Cα_DET in the α direction and Cβ_DET in the β direction, and the light receiving range angles of the eye sensing light with respect to the range C are θα+_DET, θα−_DET, θβ+_DET, and θβ−_DET.
  • In a plane perpendicular to the optical axis of the optical system, the image light projecting section may be provided between the optical axis and each of the eye sensing light projecting section and the eye sensing light receiving section.
  • the image light projection unit has a rectangular shape on a plane perpendicular to the optical axis of the optical system
  • the eye sensing light receiving section may be positioned within the effective image circle of the optical system on a straight line connecting the center of the long side of the image light projecting section and the optical axis of the optical system.
  • the imaging resolution due to the projection of the image light from the image light projection unit does not deteriorate, and high resolution can be achieved.
  • the eye-sensing light receiving part may include a plurality of light-receiving unit sub-pixels separated from each other, the light-receiving microlenses may be plural, and each micro-lens may form an image of the eye-sensing light on each light-receiving unit sub-pixel.
  • Each light receiving unit pixel has a plurality of light receiving unit sub-pixels separated from each other, so that each light receiving unit sub-pixel can image only the eye-sensing light coming from a specific object point on the pupil. Furthermore, although the Z-direction positions of the image light projection unit, which emits the image light imaged on the retina, and the eye sensing light projection unit, which emits the eye sensing light reflected by the pupil, are substantially the same, the light-receiving microlens allows the eye-sensing light to form an image on each light-receiving unit pixel of the eye-sensing light receiving section. As a result, the integrated device achieves both miniaturization and improved form factor flexibility, as well as improved eye sensing accuracy.
  • the plurality of light-receiving unit sub-pixels may be separated from each other.
  • By partially shielding each light-receiving unit pixel, the resolution is improved; by obtaining a sufficient amount of eye-sensing light, calculation is facilitated, and the device can also be used in a triple-pass system.
  • the plurality of light-receiving unit sub-pixels included in some of the light-receiving unit pixels and the plurality of light-receiving unit sub-pixels included in the other portion of the light-receiving unit pixels may be offset.
  • By offsetting each light-receiving unit pixel, it is possible to increase the pseudo-resolution while minimizing the loss of eye-sensing light. By obtaining a sufficient amount of eye-sensing light, calculation is facilitated and the device can also be used in a triple-pass system.
  • The image light projection unit has an inner region and an outer region, and the pixel size of the outer region is larger than the pixel size of the inner region.
  • the eye-sensing light receiver may be provided as a pixel within the outer region of the image light projector.
  • By providing the eye-sensing light receiving part as a pixel in the outer region of the image light projecting part, the eye-sensing light receiving part can be arranged further inward, so that it is less affected by the aberration of the VR optical system and the image quality is improved.
  • the eye-sensing light projector may be provided as a pixel within the outer region of the image light projector.
  • By providing the eye sensing light projection part as a pixel in the outer region of the image light projection part, it is possible to perform eye sensing with high accuracy.
  • An electronic device includes an image display device according to any one of the above.
  • An integrated device includes: an image light projection unit that emits image light that is imaged on the retina through an optical system; an eye-sensing light projection unit that emits eye-sensing light that is reflected by the pupil; an eye-sensing light receiving unit that receives, via the optical system, the eye-sensing light reflected by the pupil; and a light-receiving microlens for forming an image, on the eye-sensing light receiving unit, of the eye-sensing light reflected by the pupil and incident through the optical system. The light-receiving microlens is not included in the optical path of the image light and is not included in the optical path along which the eye sensing light reaches the pupil.
  • the entrance pupils of the image light projecting section and the light receiving microlens are arranged at the same position in the optical axis direction of the optical system.
  • FIG. 1 shows a typical image display device.
  • FIG. 2 schematically shows light projection and light reception.
  • FIG. 3 schematically shows a cross-sectional view of an eyeball.
  • FIG. 4 is a side view schematically showing an image display device according to a first embodiment.
  • FIG. 5 is a side view showing an optical system.
  • FIG. 6 is a front view schematically showing an integrated device.
  • FIG. 7 is a side view schematically showing an image light projecting section.
  • FIG. 8 schematically shows an eye sensing light receiving section.
  • FIG. 9 is a side view schematically showing an eye sensing light receiving section.
  • FIG. 10 is a graph showing the relationship between defocusing position and contrast.
  • FIG. 11 is a graph showing the emission angles of the image light projecting section.
  • FIG. 12 shows the distribution of eye-sensing light emitted by the eye-sensing light projecting section.
  • FIG. 13 shows an imaging range of an eye sensing light receiving section and a projection range of an eye sensing light projecting section.
  • FIG. 14 is a front view schematically showing an integrated device according to a second embodiment.
  • FIG. 15 is a graph showing the relationship between spatial frequency and contrast.
  • FIG. 16 shows an example of image correction.
  • FIG. 17 is a diagram for explaining crosstalk.
  • FIG. 18 is a side view and a partially enlarged view schematically showing an optical system.
  • FIG. 19 is a front view schematically showing an integrated device according to a third embodiment.
  • FIG. 20 is a graph showing the relationship between spatial frequency and contrast.
  • FIG. 21 is a front view schematically showing an integrated device according to a fourth embodiment.
  • FIG. 22 is a front view schematically showing an eye sensing light receiving section.
  • FIG. 23 is a side view schematically showing an eye sensing light receiving section.
  • FIG. 24 is a front view schematically showing light-receiving unit pixels included in an eye-sensing light receiving section according to a fifth embodiment.
  • FIG. 25 is a front view schematically showing an integrated device according to a sixth embodiment.
  • FIG. 26 is a front view schematically showing an inner region and an outer region of an image light projecting section.
  • FIG. 27 is a front view schematically showing an integrated device according to a seventh embodiment.
  • FIG. 28 is a front view schematically showing an inner region and an outer region of an image light projecting section.
  • FIG. 29 is a diagram for explaining a case where the viewer wears glasses.
  • FIG. 30 is a front view schematically showing an integrated device according to a ninth embodiment.
  • FIG. 31 is a front view schematically showing an integrated device according to a tenth embodiment.
  • FIG. 32 is a front view schematically showing an integrated device according to an eleventh embodiment.
  • FIG. 33 shows an image display device according to the third embodiment.
  • Foveated Rendering is effective in securing the transmission band.
  • Foveated rendering exploits the fact that the human eye generally has high resolution only at the fixation point, with resolution dropping sharply away from it; it secures transmission bandwidth by displaying a high-resolution image only around the gaze point and a lower-resolution image elsewhere. Eye sensing techniques such as the following are typically used to detect the position of the gaze point.
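  • Before turning to the device itself, the sketch below is a rough, non-authoritative illustration of how a gaze point reported by eye sensing could drive foveated rendering by choosing a downsampling factor from angular eccentricity; the thresholds and factors are hypothetical and are not taken from this disclosure.

```python
# Hypothetical foveated-rendering helper: the eccentricity thresholds and
# downsampling factors below are illustrative assumptions, not values from
# the disclosure.
def downsample_factor(eccentricity_deg: float) -> int:
    """How many display pixels share one rendered pixel at this eccentricity."""
    if eccentricity_deg < 10.0:      # foveal region: full resolution
        return 1
    elif eccentricity_deg < 25.0:    # near periphery: moderate reduction
        return 3
    else:                            # far periphery: strong reduction
        return 6

# Example: pixels 5, 20 and 40 degrees away from the detected gaze point.
print([downsample_factor(e) for e in (5.0, 20.0, 40.0)])   # -> [1, 3, 6]
```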
  • Fig. 1 shows a typical image display device.
  • a typical image display device 10 mounted on a VR head-mounted display has an image light projector 11, an optical system 12, an eye sensing light projector 15, and an eye sensing light receiver 14. .
  • the image light projection unit 11 emits image light (RGB light) that forms an image on the retina.
  • the optical system 12 causes parallel image light (broken lines in the drawing) to enter the pupil and form an image of the image light on the retina of the eye 16 .
  • the eye-sensing light projector 15 is an infrared projector that emits eye-sensing light (infrared rays) (solid line in the drawing) that is reflected by the pupil (cornea or lens).
  • the eye-sensing light receiving unit 14 is an infrared camera, receives eye-sensing light (infrared rays) reflected by the pupil, and forms a Purkinje image (object information on the pupil), which is an object point on the pupil.
  • the image light projection unit 11 and the optical system 12 require angle information on the pupil because the image light is incident on the pupil in parallel.
  • the eye-sensing optical path of the eye-sensing light projecting unit 15 forms an image of an object point on the pupil on the eye-sensing light receiving unit 14, so object information on the pupil is required.
  • The image optical path and the eye sensing optical path require different information at the pupil. Therefore, the eye sensing light projecting unit 15 and the eye sensing light receiving unit 14 are typically installed at positions separated from the image light projecting unit 11 in the optical axis direction of the optical system (hereinafter also referred to as the Z direction). For this reason, there are problems such as form factor restrictions, an increase in the size of the VR lens especially when the angle of view is large, and deterioration of eye sensing accuracy due to oblique viewing.
  • In the present embodiment, the positions of the eye sensing light projecting section and the eye sensing light receiving section in the optical axis direction (Z direction) of the optical system are set approximately equal to that of the image light projecting section. More specifically, the image light projecting section, the eye sensing light projecting section, and the eye sensing light receiving section are configured as an integrated device arranged on a plane perpendicular to the Z direction.
  • The image display device of the present embodiment increases the degree of freedom of the form factor by using an integrated device, is miniaturized, and, because the eye-sensing light receiving part views the pupil from the front, achieves improved eye-sensing accuracy.
  • FIG. 2 schematically shows light projection and light reception.
  • Light projection from the integrated device can be regarded as converting position information at the Z position of the light projecting part of the integrated device into angle information at the Z position of the pupil, and further into position information at the Z position of the retina.
  • Light reception by the integrated device can be regarded as converting angle information at the Z position of the retina and position information at the Z position of the pupil into angle information at the Z position of the light receiving unit of the integrated device.
  • a light field is a vector function that represents the amount of light flowing in multiple directions through space.
  • angle information on the XY plane orthogonal to the Z direction can be reconstructed from the light field.
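  • For orientation only, the block below restates this idea in a common two-plane light-field notation; the symbols L, (x, y), (u, v) and f are generic conventions and are not the notation of this disclosure.

```latex
% A light field L(x, y, u, v) gives the radiance of the ray through point (x, y)
% on the display/sensor plane and point (u, v) on the pupil plane. A
% light-receiving microlens of focal length f at (x, y) maps an incidence angle
% \theta onto a sub-pixel displacement d, so the sub-pixels under one microlens
% sample the angular (pupil) coordinate:
\[
  L = L(x, y, u, v), \qquad d \approx f \tan\theta .
\]
```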
  • FIG. 3 schematically shows a cross-sectional view of an eyeball.
  • both image light and eye sensing light are emitted from the integrated device.
  • the image light forms an image on the retina
  • the Purkinje image of the eye sensing light forms a virtual image on the pupil (cornea, lens).
  • the imaging position of the image light and the imaging position of the eye sensing light are different in the Z direction. For this reason, when light is received by an integrated device, it is necessary to deal with the problem that this difference in the Z direction causes defocus and blurring. Therefore, the image display device according to this embodiment has the following configuration.
  • FIG. 4 is a side view schematically showing the image display device according to the first embodiment.
  • An XY plane view of the image display device viewed from the optical axis direction (Z direction) of the optical system is hereinafter referred to as a front view.
  • a view of the image display device seen from a direction orthogonal to the Z direction is called a side view.
  • the image display device 100 is mounted on, for example, a head-mounted display (not shown). Specifically, two image display devices 100 are mounted on one head-mounted display so as to correspond to both eyes of the viewer.
  • the image display device 100 has an integrated device 110 and an optical system 120 .
  • FIG. 5 is a side view showing the optical system.
  • the optical system 120 causes parallel image light to enter the pupil and forms an image of the image light on the retina of the eye 16 .
  • the optical system 120 includes, for example, a polarizing plate 121, a quarter-wave plate 122, a lens 123, a half mirror 124, a quarter-wave plate 125, a polarizing plate 126, and a lens 127 in order from the light emitting surface of the integrated device 110. including.
  • the illustrated configuration of the optical system 120 is an example of the VR optical system, and is not limited to the illustrated configuration. However, if the configuration of the VR optical system is different, the angle of the peripheral ray will change, so it is necessary to design the light field capture angle accordingly.
  • FIG. 6 is a front view schematically showing the integrated device.
  • The integrated device 110 has an image light projecting section 140, an eye sensing light projecting section 150, and an eye sensing light receiving section 130. Although it is desirable that the eye sensing light receiving part 130 is formed on the same wafer as these sections, it is not necessarily formed on the same wafer. The image light projecting section 140, the eye sensing light projecting section 150, and the eye sensing light receiving section 130 are arranged at the same position in the optical axis direction (Z direction) of the optical system 120. In other words, they are arranged on a common XY plane orthogonal to the Z direction.
  • FIG. 7 is a side view schematically showing the image light projecting section.
  • the image light projection unit 140 emits image light that passes through the optical system 120 and forms an image on the retina.
  • the image light projection unit 140 is a μOLED (Micro Organic Light Emitting Diode) panel and emits RGB light.
  • the image light projector 140 has a plurality of light emitting cells 141 and a lens 142 on each light emitting cell 141 .
  • Light-emitting cell 141 includes color filter 144 and light-emitting layer 145 .
  • the lens 142 is a lens for more efficiently forming an image of the image light emitted by the light emitting cell 141 on the retina.
  • Lens 142 is covered with cover glass 143 .
  • the image light emitted by the image light projecting section 140 forms an image on the retina of the viewer by the optical system 120 and the lens 142 on each light emitting cell 141 provided in the image light projecting section 140 .
  • The image light projection unit 140 may also use light-emitting devices such as a μOLED (micro organic light emitting diode), an LCD (liquid crystal display), an LCOS (liquid crystal on silicon), an LED (light emitting diode), an LD (laser diode), a VCSEL (vertical cavity surface emitting laser), or a QD (quantum dot) OLED.
  • the image light projection unit 140 has, for example, a rectangular shape on the XY plane perpendicular to the optical axis (Z direction) of the optical system 120 .
  • the diameter of the effective image circle (that is, the effective diameter of the image light projection section 140) is φ32.5 mm, which is equal to the diagonal distance (32.5 mm).
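  • As a hedged numerical check (the panel side lengths below are an assumption, not stated in the text): a hypothetical 4:3 panel of 26 mm × 19.5 mm has exactly the stated diagonal.

```latex
% Assumed panel dimensions 26 mm x 19.5 mm (4:3); only the 32.5 mm diagonal is
% stated in the text.
\[
  \sqrt{26^2 + 19.5^2}\ \mathrm{mm} = \sqrt{1056.25}\ \mathrm{mm} = 32.5\ \mathrm{mm}
\]
```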
  • a center 21 of the image light projection unit 140 coincides with the optical axis of the optical system 120 .
  • the radiation direction from the center 21 of the image light projection unit 140 (the radial direction of the effective image circle 22) is defined as the α direction,
  • the direction orthogonal to the α direction is defined as the β direction.
  • a plurality of eye-sensing light projection units 150 are provided around the image light projection unit 140 .
  • a distance D from the center 21 of the image light projecting section 140 to each eye sensing light projecting section 150 is, for example, 13 mm.
  • the size of each eye sensing light projection unit 150 is, for example, a rectangle of 1.3 mm × 0.98 mm.
  • Each eye-sensing light projector 150 emits eye-sensing light that passes through the optical system 120 and is reflected by the pupil.
  • the eye sensing light is, for example, infrared.
  • Each eye-sensing light projector 150 has an LED 151 and a light-projecting microlens 152 for increasing the amount of eye-sensing light.
  • a projection microlens 152 is provided on the LED 151 .
  • a triple-pass system can also be used by increasing the light intensity of the eye-sensing light with the projection microlens 152 .
  • the projection microlens 152 may be decentered so that the direction of the eye-sensing light coincides with the exit angle direction of the image light. As a result, the projection microlens 152 increases the light amount of the eye sensing light, so that it can be used even in a triple-pass system.
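  • A minimal sketch of why decentering works, under a small-angle thin-lens assumption; the relation below is a standard optics approximation, not a formula from this disclosure.

```latex
% Assumed thin-lens relation: an emitter displaced by \delta from the axis of a
% projection microlens with focal length f_{ml} produces a bundle tilted by
\[
  \theta_{\mathrm{steer}} \approx \arctan\!\left(\frac{\delta}{f_{\mathrm{ml}}}\right),
\]
% so the decenter \delta can be chosen per position to match the exit-angle
% direction of the image light.
```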
  • FIG. 9 is a side view schematically showing the eye sensing light receiving part.
  • the eye sensing light receiving section 130 has a light receiving unit pixel 133, a light receiving microlens 132, and a cut filter 134 (bandpass filter) for cutting image light.
  • the eye sensing light receiving section 130 is, for example, an infrared light receiving area of 1.3 mm wide × 0.98 mm long.
  • the distance A from the center 21 of the image light projecting section 140 to the center of the eye sensing light receiving section 130 is preferably as short (close) as possible, for example, 10.6 mm. Power consumption can be reduced by using an event-based vision sensor for the eye sensing light receiving unit 130 .
  • the eye sensing light receiving section 130 is positioned outside the image light projecting section 140 with respect to the center 21 of the image light projecting section 140 corresponding to the optical axis of the optical system 120 .
  • In a plane perpendicular to the optical axis, the image light projecting section 140 is provided between the optical axis and each of the eye sensing light projecting section 150 and the eye sensing light receiving section 130. For this reason, the imaging resolution of the image light projected from the image light projection unit 140 does not deteriorate, so high resolution and improved productivity can be achieved.
  • The eye sensing light receiving unit 130 lies within the effective image circle 22 of the optical system 120, on a straight line connecting the center of the long side of the rectangular image light projecting unit 140 and the optical axis of the optical system 120 (the center 21 of the image light projecting unit 140). For this reason, the imaging resolution of the image light projected from the image light projection section 140 does not deteriorate, and high resolution can be achieved.
  • the eye-sensing light receiving unit 130 consists of light receiving unit pixels 131 .
  • the pixel size is 3.25 μm
  • the resolution is 156 LP/mm
  • the number of pixels is 400 × 300 pixels
  • the F-number is 1.29.
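  • As a sanity check (the Nyquist relation is an assumption of this note, not stated in the text), the quoted resolution is consistent with the pixel pitch.

```latex
% Nyquist-limited resolution for a 3.25 um pixel pitch:
\[
  \frac{1}{2 \times 3.25\ \mu\mathrm{m}} \approx 154\ \mathrm{LP/mm},
\]
% which is roughly consistent with the stated 156 LP/mm.
```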
  • the light-receiving microlenses 132 form images on the light-receiving unit pixels 131 of the eye-sensing light reflected by the pupil and incident through the optical system 120 .
  • the light-receiving microlens 132 is not included in the optical path of the image light, and is not included in the optical path along which the eye-sensing light emitted by the eye-sensing light projector 150 passes through the optical system 120 and reaches the pupil.
  • For the optical path of the image light and the optical path along which the eye sensing light is projected, a suitable optical path is obtained by the optical system 120 alone.
  • only the optical path for receiving the eye-sensing light can be obtained by adding the light-receiving microlens 132 in addition to the optical system 120 .
  • an appropriate optical path can be obtained by adding a minimum number of members, so that a compact image display device 100 can be realized.
  • the image light projection unit 140 that emits the image light to be imaged on the retina and the eye sensing light projection unit 150 that emits the eye sensing light to be reflected by the pupil are at substantially the same position in the Z direction.
  • the eye-sensing light can be imaged on each light-receiving unit pixel 131 of the eye-sensing light receiving unit 130 by the light-receiving microlens 132 arranged at an appropriate position.
  • On the image sensor surface of the eye-sensing light receiving unit 130, the center of the eye-sensing light receiving unit 130 is shifted in the Y direction with respect to the light-receiving microlens 132, because eye-sensing light incident at 20 degrees from above corresponds to the center of the pupil.
  • the eye sensing light receiver 130 may be shifted -0.49 mm in the Y direction.
  • the entrance pupil Z position of the light receiving microlens 132 coincides with the OLED emission Z position.
  • n in the figure means the refractive index.
  • The focal length of the light receiving microlens 132 is 1.29 mm, which is substantially equal to its back focus of 1.37 mm. In other words, this is a configuration for forming an image with parallel light, thereby converting angle information into position information to obtain an eye-sensing image.
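  • The angle-to-position conversion can be illustrated as follows; the 5 degree offset is an arbitrary example value, not taken from the text.

```latex
% With the microlens focused at infinity (parallel-light imaging), a ray bundle
% arriving at angular offset \Delta\theta from the chief ray lands at a distance
\[
  x = f \tan(\Delta\theta), \qquad
  x = 1.29\ \mathrm{mm} \times \tan 5^{\circ} \approx 0.11\ \mathrm{mm}
\]
% from the on-axis image point, so each incidence angle maps to a distinct
% position on the eye-sensing image sensor.
```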
  • FIG. 10 is a graph showing the relationship between the defocusing position of the light receiving microlens 132 and the contrast.
  • FIG. 11 is a graph showing the captureable angles of the optical system 120.
  • the exit angle is, for example, 15 deg to 25 deg (20 deg at the center).
  • the image height is 13 mm (that is, the distance D from the center 21 of the image light projection unit 140 to the eye sensing light projection unit 150 is 13 mm)
  • the emergence angle is 22 degrees to 33 degrees (27.5 degrees at the center).
  • FIG. 12 shows the distribution of eye-sensing light emitted by the eye-sensing light projector.
  • the light distribution of a typical eye-sensing light projection unit is, for example, 20 degrees.
  • the distribution area of the eye-sensing light projecting section 150 (including the LED 151 and the light projecting microlens 152) is widened and the orientation directions are aligned. For example, it is 5.5 deg to 49.5 deg in the α direction and 42 deg in the β direction.
  • the calculation method for the conditions for the transmission and reception of infrared rays is as follows.
  • D be the distance from the center 21 of the image light projection unit 140 to the center of the eye sensing light projection unit 150 (see FIG. 6).
  • the radiation direction from the center 21 of the image light projection unit 140 (the radial direction of the effective image circle 22) is defined as the α direction, and the direction orthogonal to the α direction is defined as the β direction.
  • the angles that can be captured by the optical system 120 at the image height D are θα+_D, θα−_D, θβ+_D, θβ−_D, and θα_Center_D.
  • θα_Center_D is the chief ray angle at the image height D.
  • The projection microlens 152 of the eye-sensing light projection unit 150 may be decentered so that the direction of the eye-sensing light matches the exit angle direction of the image light. As a result, the projection microlens 152 increases the light amount of the eye sensing light, so that it can be used even in a triple-pass system.
  • Each light-receiving microlens 132 of the eye-sensing light receiving unit 130 may be decentered so that the eye-sensing light reflected by the pupil is imaged at the center of the light receiving unit pixels 131. This enables accurate eye sensing.
  • θβ+_D and θβ−_D are approximately equal to the upper ray and lower ray at an image height of 0 mm.
  • the eye-sensing light projection range angles of the eye-sensing light projection unit 150 are θα+_PRJ, θα−_PRJ, θβ+_PRJ, and θβ−_PRJ.
  • the following formula holds.
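  • The formula itself is not reproduced in this text; the block below gives one plausible form (an assumption), namely that the projection range of the eye-sensing light covers the angles capturable by the optical system 120 at the image height D.

```latex
% Assumed form of the projection condition (not reproduced in the text):
\[
  \theta_{\alpha+\_\mathrm{PRJ}} \ge \theta_{\alpha+\_D}, \quad
  \theta_{\alpha-\_\mathrm{PRJ}} \le \theta_{\alpha-\_D}, \quad
  \theta_{\beta+\_\mathrm{PRJ}} \ge \theta_{\beta+\_D}, \quad
  \theta_{\beta-\_\mathrm{PRJ}} \le \theta_{\beta-\_D}
\]
```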
  • FIG. 13 shows the imaging range of the eye sensing light receiving section and the projection range of the eye sensing light projecting section.
  • the eye sensing light receiving range angles are θα+_DET, θα−_DET, θβ+_DET, and θβ−_DET.
  • the following formula holds.
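  • Likewise the formula is not reproduced here; a plausible analogous condition (an assumption) is that the light receiving range covers the angles capturable by the optical system 120 at the image height A.

```latex
% Assumed form of the light-receiving condition (not reproduced in the text):
\[
  \theta_{\alpha+\_\mathrm{DET}} \ge \theta_{\alpha+\_A}, \quad
  \theta_{\alpha-\_\mathrm{DET}} \le \theta_{\alpha-\_A}, \quad
  \theta_{\beta+\_\mathrm{DET}} \ge \theta_{\beta+\_A}, \quad
  \theta_{\beta-\_\mathrm{DET}} \le \theta_{\beta-\_A}
\]
```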
  • the imaging range of the eye sensing light receiving section 130 is rectangular.
  • the projection range of the eye-sensing light projecting section 150 is a circle (the projection coordinate system in FIG. 13) surrounding the rectangular imaging range of the eye-sensing light receiving section 130 .
  • the image display apparatus 100 of the present embodiment is compact because the image light projecting section 140, the eye sensing light projecting section 150, and the eye sensing light receiving section 130 are provided in the compact integrated device 110. Form factor constraints can be reduced.
  • FIG. 29 is a diagram for explaining a case where the viewer wears glasses.
  • In a typical image display device 10, when the viewer wears spectacles 31, there is a flare optical path 32 in which the frontal reflection of light from an eye-sensing light projector 15 (infrared projector) off the spectacles is captured by the eye-sensing light receiver 14 (infrared camera). Since the plurality of eye-sensing light projectors 15 are evenly arranged, flare from the eyeglasses 31 is strong and enters the eye-sensing light receiver 14 along some flare light path 32. To prevent this, it is preferable to arrange a plurality of eye-sensing light receiving units 14, but in that case the form factor is further restricted. In addition, the restricted position of the eye sensing light receiving section 14 makes the oblique viewing angle severe, and the camera size of the eye sensing light receiving section 14 becomes extremely small, which tends to degrade the eye sensing accuracy.
  • In the present embodiment, a flare light path 32 can likewise enter an eye-sensing light receiving section 130, but a plurality of eye-sensing light receiving sections 130 are arranged; since the form factor imposes minimal restrictions, the required number of eye-sensing light receiving units 130 can be provided. Also, since the oblique viewing angle can be kept relatively gentle, it is easy to maintain eye sensing accuracy.
  • the cut filter 134 that cuts image light may be replaced by a light-receiving microlens 132 made of a material that cuts image light.
  • the microlens 132 may be composed of a plurality of compound lenses instead of a single lens.
  • FIG. 14 is a front view schematically showing an integrated device according to the second embodiment.
  • The integrated device 210 according to the second embodiment differs from the integrated device 110 according to the first embodiment in the number and positions of the eye-sensing light projecting units 250 and the eye-sensing light receiving units 230, and in that the eye sensing light receiving part has a plurality of light receiving unit sub-pixels separated from each other.
  • the specs of the image light projecting section 240 according to the second embodiment are the same as the specs of the image light projecting section 140 according to the first embodiment.
  • The eye sensing light receiving part 230 is composed of light receiving unit pixels 133, each of which has a plurality of light-receiving unit sub-pixels 131 separated from each other.
  • the plurality of light-receiving unit sub-pixels 131 are separated from each other.
  • the light-receiving unit sub-pixels 131 are configured by dividing each light-receiving unit pixel 133 into 2 × 2 pixels (4 pixels) or more.
  • the light-receiving microlens 132 forms an image on the light-receiving unit pixel 131 of the eye-sensing light reflected by the pupil and incident through the optical system 120 .
  • Six eye-sensing light projection units 250 are provided around the image light projection unit 240 .
  • the six eye-sensing light projectors 250 are arranged in a substantially circular shape to obtain a circularly arranged Purkinje image.
  • the specifications of the eye-sensing light projection unit 250 are the same as those of the eye-sensing light projection unit 150 according to the first embodiment. Since six Purkinje images can be acquired by providing six eye sensing light projecting units 250, eye sensing can be performed more accurately.
  • Eight eye sensing light receiving units 230 are provided around the image light projecting unit 240 .
  • the arrangement of the eight eye-sensing light receiving units 230 is not limited to the illustrated example. By arranging the eight eye-sensing light receiving units 230 with more variation, it is possible to adjust wafer efficiency and acquisition signal variation.
  • the size of the eye sensing light receiving section 230 is, for example, 1.6 mm × 1 mm.
  • Each eye sensing light receiver 230 includes a plurality of (e.g., 8 × 5) light receiving unit pixels 133.
  • the size of each light receiving unit pixel 133 (see FIG. 8) is, for example, 200 μm.
  • Each light-receiving unit pixel 133 has a plurality of (for example, 80-division) light-receiving unit sub-pixels 131 (see FIG. 8) separated from each other.
  • the size of the light-receiving unit sub-pixel 131 is, for example, 2.5 μm.
  • the light-receiving microlens 132 images a maximum angle of 13.5 degrees to 100 μm.
  • the viewer's eyebox (the range in which the pupil moves) is φ8 mm.
  • the size of the pupil position is φ12 mm.
  • the light receiving resolution of the eye sensing light receiving section 230 is sufficient with about 80 × 80 pixels.
  • FIG. 15 is a graph showing the relationship between the spatial frequency of the light receiving microlens 132 and the contrast.
  • With the integrated device 210 having the above configuration, compared for example to a case with four 1 mm eye-sensing light projection units and one 3 mm eye-sensing light-receiving unit, the light amount is +0.1 dB brighter and eye sensing can be performed more accurately. By increasing the amount of eye-sensing light, it can also be used in a triple-pass system.
  • FIG. 17 is a diagram for explaining crosstalk.
  • Crosstalk is a phenomenon in which an electrical signal leaks to an adjacent element due to leakage of an optical signal incident on a certain element.
  • To suppress crosstalk, it is preferable either to provide a wall 34 extending in the Z direction or to space the eye-sensing light receiving portions 230 sufficiently far apart from each other.
  • FIG. 18 is a side view and a partially enlarged view schematically showing the optical system.
  • the positional relationship of the multi-lens array in the optical system 220 will be explained.
  • When the Z position of the object plane of the image light projecting unit 240, that is, the display surface (approximately the outermost surface), coincides with the Z position of the entrance pupil of the light receiving microlens 132 of the eye sensing light receiving unit 230, a Purkinje image without blurring can be obtained. On the other hand, if these Z positions are shifted, a blurred Purkinje image results.
  • FIG. 19 is a front view schematically showing an integrated device according to the third embodiment.
  • FIG. 33 shows an image display device according to the third embodiment.
  • The integrated device 310 according to the third embodiment differs from the integrated device 110 according to the first embodiment in the arrangement of the eye-sensing light projecting units 15 (FIG. 33), in the number and positions of the eye-sensing light receiving units 330, and in that the eye-sensing light-receiving part has a plurality of light-receiving unit sub-pixels separated from each other.
  • the specs of the image light projecting section 340 according to the third embodiment are the same as the specs of the image light projecting section 140 according to the first embodiment.
  • Six eye-sensing light projection units 15 are provided around the optical system 12 .
  • the six eye-sensing light projectors 15 are arranged in a substantially circular shape in order to obtain a circularly arranged Purkinje image.
  • Each eye-sensing light projection unit 15 (FIG. 33) emits Lambertian light. Since six Purkinje images can be acquired by providing six eye sensing light projection units (not shown), eye sensing can be performed more accurately.
  • Two eye sensing light receiving units 330 are provided around the image light projecting unit 340 .
  • the arrangement of the two eye sensing light receiving units 330 is not limited to the illustrated example.
  • the size of the eye sensing light receiving section 330 is, for example, 18 mm × 3 mm.
  • Each eye sensing light receiver 330 includes a plurality of (e.g., 90 × 15) light receiving unit pixels 133 (see FIG. 8).
  • the size of each light receiving unit pixel 133 is, for example, 200 μm.
  • Each light-receiving unit pixel 133 has a plurality of (for example, 80-division) light-receiving unit sub-pixels 131 (see FIG. 8) separated from each other.
  • the size of the light-receiving unit sub-pixel 131 is, for example, 2.5 μm.
  • the light-receiving microlens 132 images a maximum angle of 13.5 degrees to 100 μm.
  • the viewer's eyebox (the range in which the pupil moves) is φ8 mm.
  • the size of the pupil position is φ12 mm.
  • the light receiving resolution of the eye sensing light receiving section 330 is sufficient with about 80 × 80 pixels.
  • FIG. 16 shows an example of image correction.
  • Since the eye-sensing light receiving units 330 acquire an image at positions close to the maximum angle, the obtainable image shifts little by little depending on the image height position of each eye-sensing light receiving unit 330.
  • the lens shift amount should be changed according to the distance from each central image height.
  • the two types of images shown in the figure may be synthesized by adding a shift in the signal processing stage.
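  • A minimal sketch of such signal-processing synthesis, assuming per-unit shifts calibrated from each unit's image height; the function name and shift values below are hypothetical, not from this disclosure.

```python
# Hypothetical sketch: combine sub-images from eye-sensing light receiving units
# at different image heights by applying each unit's calibrated shift and
# averaging. Shift values here are made up for illustration.
import numpy as np

def combine_shifted(images, shifts):
    """Shift each sub-image by its calibrated (dy, dx) offset and average them."""
    acc = np.zeros_like(images[0], dtype=np.float64)
    for img, (dy, dx) in zip(images, shifts):
        acc += np.roll(img.astype(np.float64), shift=(dy, dx), axis=(0, 1))
    return acc / len(images)

# Example: two 80 x 80 Purkinje images with small, unit-specific offsets.
img_a = np.random.rand(80, 80)
img_b = np.random.rand(80, 80)
combined = combine_shifted([img_a, img_b], shifts=[(0, 0), (1, -2)])
print(combined.shape)   # -> (80, 80)
```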
  • FIG. 20 is a graph showing the relationship between the spatial frequency of the light receiving microlens 132 and the contrast.
  • With the integrated device 310 having the above configuration, compared for example to a case with four 1 mm eye-sensing light projection units and one 3 mm eye-sensing light-receiving unit, the light amount is +9.6 dB brighter and eye sensing can be performed more accurately.
  • the light-receiving area can be widened, and by increasing the amount of eye-sensing light, a triple-pass system can also be used.
  • FIG. 21 is a front view schematically showing an integrated device according to the fourth embodiment.
  • The integrated device 410 according to the fourth embodiment differs from the integrated device 110 according to the first embodiment in the number and positions of the eye-sensing light projecting units 450 and the eye-sensing light receiving units 430, and in that the eye sensing light receiving part has a plurality of light receiving unit sub-pixels separated from each other.
  • the specs of the image light projecting section 440 according to the fourth embodiment are the same as the specs of the image light projecting section 140 according to the first embodiment.
  • Six eye-sensing light projection units 450 are provided around the image light projection unit 440 .
  • the six eye-sensing light projectors 450 are arranged in a substantially circular shape to obtain a circularly arranged Purkinje image.
  • the specifications of the eye-sensing light projection unit 450 are the same as those of the eye-sensing light projection unit 150 according to the first embodiment. Since six Purkinje images can be acquired by providing six eye sensing light projecting units 450, eye sensing can be performed more accurately.
  • Two eye sensing light receiving units 430 are provided around the image light projecting unit 440 .
  • the arrangement of the two eye sensing light receiving units 430 is not limited to the illustrated example.
  • the size of the eye sensing light receiving section 430 is, for example, 12 mm × 3 mm.
  • Each eye sensing light receiver 430 includes a plurality of (e.g., 60 × 15) light receiving unit pixels 433.
  • the size of each light receiving unit pixel 433 is, for example, 200 μm.
  • Each light-receiving unit pixel 433 has a plurality of light-receiving unit sub-pixels 431 (for example, 80 divisions) separated from each other.
  • the size of the light-receiving unit sub-pixel 431 is, for example, 2.5 μm.
  • the light-receiving microlens 432 images a maximum angle of 13.5 degrees to 100 μm.
  • the viewer's eyebox (the range in which the pupil moves) is φ8 mm.
  • the size of the pupil position is φ12 mm.
  • the light receiving resolution of the eye sensing light receiving section 430 is sufficient with about 80 × 80 pixels.
  • FIG. 22 is a front view schematically showing the eye sensing light receiving part.
  • FIG. 23 is a side view schematically showing the eye sensing light receiving section.
  • Each light-receiving unit pixel 433 has a plurality of light-receiving unit sub-pixels 431 separated from each other.
  • each light-receiving unit pixel 433 is divided into 2 × 2 pixels (4 pixels) or more, and 3 out of 4 sub-pixels are shielded by the shielding plate 435.
  • each light-receiving unit pixel 433 is randomly shielded down to a quarter of its area (which sub-pixel remains open is chosen randomly).
  • the pseudo-resolution in this configuration is 160 × 160 pixels.
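  • A small sketch of generating such a random shielding pattern, assuming one open sub-pixel per 2 × 2 light-receiving unit pixel; the array size and random seed below are arbitrary.

```python
# Hypothetical mask generator: each 2x2 unit pixel keeps exactly one randomly
# chosen sub-pixel open (1) and shields the other three (0).
import numpy as np

rng = np.random.default_rng(0)

def random_shield_mask(pixels_y: int, pixels_x: int) -> np.ndarray:
    """Return a (2*pixels_y, 2*pixels_x) array: 1 = open sub-pixel, 0 = shielded."""
    mask = np.zeros((pixels_y * 2, pixels_x * 2), dtype=np.uint8)
    choice = rng.integers(0, 4, size=(pixels_y, pixels_x))   # which sub-pixel stays open
    ys, xs = np.divmod(choice, 2)                            # row/column inside the 2x2 block
    rows = np.arange(pixels_y)[:, None] * 2 + ys
    cols = np.arange(pixels_x)[None, :] * 2 + xs
    mask[rows, cols] = 1
    return mask

mask = random_shield_mask(60, 15)    # e.g. a 60 x 15 grid of unit pixels
assert mask.sum() == 60 * 15         # exactly one open sub-pixel per unit pixel
```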
  • With the integrated device 410 having the above configuration, compared for example to a case with four 1 mm eye-sensing light projection units and one 3 mm eye-sensing light-receiving unit, the light amount is +1.8 dB brighter and eye sensing can be performed more accurately.
  • the resolution can be improved and a sufficient amount of eye sensing light can be obtained, thereby facilitating calculation and enabling use in a triple-pass system.
  • FIG. 24 is a front view schematically showing light-receiving unit pixels included in an eye-sensing light-receiving portion according to the fifth embodiment.
  • the difference between the integrated device according to the fifth embodiment and the integrated device 410 according to the fourth embodiment is the configuration of the light-receiving unit pixel 533 .
  • the front view of the integrated device according to the fifth embodiment is the same as the front view (FIG. 21) of the integrated device 410 according to the fourth embodiment, so illustration is omitted.
  • Each light-receiving unit pixel 533 has a plurality of light-receiving unit sub-pixels 531 separated from each other. For example, a plurality of light receiving unit sub-pixels 531 are separated from each other by providing a partition 535 extending in the Z direction between adjacent light receiving unit sub-pixels 531 .
  • the light-receiving unit sub-pixels 531 are configured by dividing each light-receiving unit pixel 533 into 2 × 2 pixels (4 pixels) or more.
  • the plurality of light-receiving unit sub-pixels 531A included in some of the light-receiving unit pixels 533A and the plurality of light-receiving unit sub-pixels 531B included in the other portion of the light-receiving unit pixels 533B are offset.
  • a plurality of light-receiving unit sub-pixels 531A included in some light-receiving unit pixels 533A and a plurality of light-receiving unit sub-pixels 531B included in other light-receiving unit pixels 533B are randomly offset from each other by half a pixel.
  • the pseudo-resolution in this configuration is 160 × 160 pixels.
  • With the integrated device having the above configuration, compared for example to a case with four 1 mm eye-sensing light projection units and one 3 mm eye-sensing light-receiving unit, the light amount is +7.9 dB brighter and eye sensing can be performed more accurately.
  • By randomly offsetting each light-receiving unit pixel 533, it is possible to increase the pseudo-resolution while minimizing the loss of eye-sensing light. By obtaining a sufficient amount of eye-sensing light, calculation is facilitated and the device can also be used in a triple-pass system.
  • FIG. 25 is a front view schematically showing an integrated device according to the sixth embodiment.
  • The integrated device 610 according to the sixth embodiment differs from the integrated device 110 according to the first embodiment in the number and positions of the eye-sensing light projecting units 650, in that the eye-sensing light receiving section has a plurality of light-receiving unit sub-pixels separated from each other, in the configuration of the image light projecting section 640, and in that the eye sensing light receiving section 630 is arranged inside the image light projecting section 640.
  • Six eye-sensing light projection units 650 are provided around the image light projection unit 640 .
  • the six eye-sensing light projectors 650 are arranged in a substantially circular shape in order to obtain a circularly arranged Purkinje image.
  • the specifications of the eye-sensing light projection unit 650 are the same as those of the eye-sensing light projection unit 150 according to the first embodiment. Since six Purkinje images can be acquired by providing six eye sensing light projecting units 650, eye sensing can be performed more accurately.
  • FIG. 26 is a front view schematically showing the inner area and the outer area of the image light projecting section.
  • the size of the image light projection section 640 is the same as the size of the image light projection section 140 of the first embodiment.
  • the image light projector 640 has an inner region 640A and an outer region 640B.
  • the inner region 640A of the image light projection unit 640 is configured in an RGB Bayer array.
  • the pixel size is 6.7 μm.
  • the pixel size of the outer region 640B of the image light projection unit 640 is larger than the pixel size of the inner region 640A.
  • the outer region 640B has a lower resolution than the inner region 640A.
  • In the outer region 640B, the resolution may be reduced to 1/3 per side at a half angle of 20 degrees, 1/6 per side at a half angle of 30 degrees, and 1/12 per side at a half angle of 40 degrees.
  • the pixel size may be 80 μm.
  • the eye-sensing light receiving section 630 is provided as a pixel within the outer region 640B of the image light projecting section 640. Specifically, a plurality of light-receiving unit pixels 633 are scattered in the outer region 640B, and the plurality of light-receiving unit pixels 633 constitute the eye-sensing light receiving section 630.
  • Each light-receiving unit pixel 633 is divided into a plurality of light-receiving unit sub-pixels 131 (2.5 μm pitch). For example, each light-receiving unit pixel 633 includes 32 × 32 light-receiving unit sub-pixels 131 (see FIG. 8).
  • the plurality of light-receiving unit sub-pixels 131 included in some of the light-receiving unit pixels 633 and the plurality of light-receiving unit sub-pixels 131 included in the other light-receiving unit pixels 633 may be offset from each other.
  • for example, the plurality of light-receiving unit sub-pixels 131 included in some of the light-receiving unit pixels 633 and the plurality of light-receiving unit sub-pixels 131 included in the other light-receiving unit pixels 633 may be randomly offset from each other by a quarter pixel.
  • the pseudo-resolution in this configuration is 128 × 128 pixels.
  • the offset value may be 1/2.
  • By providing the eye-sensing light receiving unit 630 as pixels in the outer region 640B of the image light projecting unit 640, the eye-sensing light receiving unit 630 can be arranged further inward, so that it is less affected by the aberration of the VR optical system and the image quality is improved.
  • By randomly offsetting each light-receiving unit pixel 633, it is possible to increase the pseudo-resolution while minimizing the loss of eye-sensing light. By obtaining a sufficient amount of eye-sensing light, calculation is facilitated and the device can also be used in a triple-pass system.
  • FIG. 27 is a front view schematically showing an integrated device according to the seventh embodiment.
  • The integrated device 710 according to the seventh embodiment differs from the integrated device 610 according to the sixth embodiment in that, in addition to the eye sensing light receiving section 730, an eye sensing light projecting section 750 is arranged inside the image light projecting section 740.
  • FIG. 28 is a front view schematically showing the inner area and the outer area of the image light projecting section.
  • the size of the image light projection section 740 is the same as the size of the image light projection section 140 of the first embodiment.
  • the image light projector 740 has an inner region 740A and an outer region 740B.
  • the configuration of the inner region 740A of the image light projecting section 740 is the same as that of the sixth embodiment.
  • the pixel sizes of the inner area 740A and the outer area 740B are the same as in the sixth embodiment.
  • the configuration of the eye sensing light receiving section 730 is the same as that of the sixth embodiment.
  • the eye-sensing light projectors 750 are provided as pixels within the outer region 740B of the image light projectors 740 . Specifically, a plurality of eye-sensing light projecting sections 750 are scattered in the outer region 740B.
  • IR-OLEDs may be mounted instead of IR-LEDs by using different coatings for the IR emission of the OLEDs.
  • By providing the eye sensing light projection section 750 as pixels in the outer region 740B of the image light projection section 740, more accurate eye sensing is possible by switching individual pixels on and off.
  • By providing the eye-sensing light receiving section 730 as pixels in the outer region 740B of the image light projecting section 740, the eye-sensing light receiving section 730 can be arranged further inward, so that it is less affected by the aberration of the VR optical system and the image quality is improved.
  • By randomly offsetting each light-receiving unit pixel 733, it is possible to increase the pseudo-resolution while minimizing the loss of eye-sensing light. By obtaining a sufficient amount of eye-sensing light, calculation is facilitated and the device can also be used in a triple-pass system.
  • the difference between the integrated device according to the eighth embodiment and the integrated device 410 according to the fourth embodiment is the configuration of the eye sensing light receiving section.
  • the front view of the integrated device according to the eighth embodiment is the same as the front view (FIG. 21) of the integrated device 410 according to the fourth embodiment, so illustration is omitted.
  • the same reference numerals as in the fourth embodiment are used for the following description.
  • the size of the eye sensing light receiving section 430 is, for example, 12 mm x 3 mm.
  • Each eye sensing light receiver 430 includes a plurality of (e.g., 2400 × 600) light receiving unit pixels 433.
  • the size of each light receiving unit pixel 433 is, for example, 5 μm.
  • Each light-receiving unit pixel 433 has a plurality of (e.g., 5-division) light-receiving unit sub-pixels 431 separated from each other.
  • the size of the light-receiving unit sub-pixel 431 is, for example, 1 μm.
  • the light-receiving microlens 132 images the maximum angle of 13.5 degrees to 2.5 μm.
  • the integrated device 410 having the above configuration, for example, compared to the case of four eye-sensing light projection units of 1 mm and one eye-sensing light-receiving unit of 3 mm, the light amount is +7.9 dB brighter, and the eye sensing becomes more effective. can be done accurately.
  • the microlens array of the light receiving microlenses 132 and the image sensor of the eye sensing light receiving section 430 can be manufactured by a wafer process.
  • FIG. 30 is a front view schematically showing an integrated device according to the ninth embodiment.
  • the difference between the integrated device 910 according to the ninth embodiment and the integrated device 410 according to the fourth embodiment is that a plurality of eye sensing light receiving sections 930 of different sizes are provided.
  • MTF: Modulation Transfer Function
  • An image light projection unit 940 and an eye sensing light projection unit 950 are the same as those in the fourth embodiment.
  • FIG. 31 is a front view schematically showing an integrated device according to the tenth embodiment.
  • the difference between the integrated device 1010 according to the tenth embodiment and the integrated device 410 according to the fourth embodiment is that the number and positions of the eye sensing light projecting units 1050 are different.
  • the interval between eye-sensing light projecting units 1050 is changed to improve accuracy.
  • the eye sensing performance is improved by making the shape of the eye sensing light projecting portion 1050 (LED or OLED) not rectangular but asymmetrical.
  • the image light projecting section 1040 is the same as in the fourth embodiment.
  • FIG. 32 is a front view schematically showing an integrated device according to the eleventh embodiment.
  • the difference between the integrated device 1110 according to the eleventh embodiment and the integrated device 1010 according to the tenth embodiment is the eye sensing light projecting section 1150 .
  • In the eye-sensing light projecting section 1150, an IR-OLED (or LED) array is arranged so that its pixels can be driven individually, and an arbitrary pattern is displayed according to the situation, thereby improving eye-sensing accuracy.
  • By controlling the emission timing of the IR-OLED and reading the signal of an event-based vision sensor, eye sensing can be performed with lower power consumption and higher accuracy. This scheme has a particularly high affinity with this optical system.
  • the image light projector 1140 is the same as in the fourth embodiment.
  • a polarization sensor may also be used as the eye sensing light receiving section. By using a polarization sensor, unnecessary noise can be reduced and accuracy can be improved.
  • the present disclosure may have the following configurations.
  • An image display device including: an optical system; and an integrated device having an image light projecting section that emits image light that is imaged on the retina through the optical system, an eye-sensing light projecting section that emits eye-sensing light that is reflected by the pupil, an eye-sensing light receiving section that receives the eye-sensing light reflected by the pupil via the optical system, and a light-receiving microlens that forms an image, on the eye-sensing light receiving section, of the eye-sensing light reflected by the pupil and incident through the optical system; wherein the light-receiving microlens is included neither in the optical path of the image light nor in the optical path along which the eye-sensing light reaches the pupil, and the image light projecting section and the entrance pupil of the light-receiving microlens are arranged at the same position in the optical axis direction of the optical system.
  • The image display device, wherein the eye-sensing light receiving section has a cut filter that cuts the image light, or has the light-receiving microlens cut the image light.
  • The image display device according to (1) or (2) above, wherein the light projected from the eye-sensing light projecting section passes through the optical system and is reflected by the pupil, and the eye-sensing light projecting section and the entrance pupil of the light-receiving microlens are at the same position in the optical axis direction of the optical system.
  • a radial direction of the effective image circle from the center of the image light projecting section is defined as an α direction,
  • the direction orthogonal to the α direction is defined as a β direction,
  • D is the distance from the center of the image light projecting section to the center of the eye-sensing light projecting section,
  • B is the pupil diameter of the image light projecting section,
  • the exit angles from D with respect to B are θα+_D, θα−_D, θβ+_D, θβ−_D and θα_Center_D, where θα_Center_D is the chief ray angle,
  • the projection range of the eye-sensing light projecting section is CαPRJ in the α direction and CβPRJ in the β direction, and the range angles of the eye-sensing light with respect to the range C are ξα+_PRJ, ξα−_PRJ, ξβ+_PRJ and ξβ−_PRJ.
  • a radial direction of the effective image circle from the center of the image light projecting section is defined as an α direction,
  • the direction orthogonal to the α direction is defined as a β direction,
  • A is the distance from the center of the image light projecting section to the center of the eye-sensing light receiving section,
  • B is the pupil diameter of the image light projecting section,
  • the exit angles from A with respect to B are θα+_A, θα−_A, θβ+_A, θβ−_A and θα_Center_A, where θα_Center_A is the chief ray angle,
  • the light-receiving range of the eye-sensing light receiving section is CαDET in the α direction and CβDET in the β direction, and the light-receiving range angles of the eye-sensing light with respect to the range C are ξα+_DET, ξα−_DET, ξβ+_DET and ξβ−_DET.
  • the image display device according to any one of (1) to (8) above, wherein the image light projecting section is provided between the optical axis and the eye-sensing light projecting section and the eye-sensing light receiving section, in a plane perpendicular to the optical axis of the optical system.
  • the image display device, wherein the image light projecting section has a rectangular shape in a plane perpendicular to the optical axis of the optical system, and the eye-sensing light receiving section is positioned within the effective image circle of the optical system on a straight line connecting the center of a long side of the image light projecting section and the optical axis of the optical system.
  • the image display device according to any one of (1) to (10) above, wherein the eye-sensing light receiving section includes a plurality of light-receiving unit sub-pixels separated from each other, there are a plurality of light-receiving microlenses, and each microlens forms an image of the eye-sensing light on each light-receiving unit sub-pixel.
  • the image display device, wherein the image light projecting section has an inner region and an outer region, and the pixel size of the outer region is larger than the pixel size of the inner region;
  • the image display device, wherein the eye-sensing light receiving section is provided as a pixel in the outer region of the image light projecting section.
  • the image display device, wherein the eye-sensing light projecting section is provided as a pixel in the outer region of the image light projecting section.
  • An electronic device comprising the image display device according to any one of (1) to (15) above.
  • An integrated device including: an image light projecting section that emits image light that is imaged on the retina through an optical system; an eye-sensing light projecting section that emits eye-sensing light that is reflected by the pupil; an eye-sensing light receiving section that receives the eye-sensing light reflected by the pupil via the optical system; and a light-receiving microlens that forms an image, on the eye-sensing light receiving section, of the eye-sensing light reflected by the pupil and incident through the optical system; wherein the light-receiving microlens is included neither in the optical path of the image light nor in the optical path along which the eye-sensing light reaches the pupil, and the image light projecting section and the entrance pupil of the light-receiving microlens are arranged at the same position in the optical axis direction of the optical system.
  • 100 image display device, 110 integrated device, 120 optical system, 130 eye-sensing light receiving section, 131 light-receiving unit sub-pixel, 132 light-receiving microlens, 133 light-receiving unit pixel, 134 cut filter, 135 partition, 140 image light projecting section, 141 light-emitting cell, 142 lens, 143 cover glass, 144 color filter, 145 light-emitting layer, 150 eye-sensing light projecting section, 151 LED, 152 light-projecting microlens

Abstract

The purpose of the present invention is to provide a small-sized image display apparatus that can be mounted on a head-mounted display or an EVF with a compact eye-sensing optical system, and to reduce the influence on the form factor. An image display apparatus according to the present invention is provided with an optical system and an integrated device (110) having: an image light projecting section (140) that emits image light that passes through the optical system and forms an image on the retina; eye-sensing light projecting sections (150) that emit eye-sensing light that passes through the optical system and is reflected by the pupil; and an eye-sensing light receiving section (130) having light-receiving unit pixels, each of which includes a plurality of light-receiving unit sub-pixels separated from each other, and having light-receiving microlenses that image, onto the plurality of light-receiving unit sub-pixels, the eye-sensing light that is reflected by the pupil, passes through the optical system, and enters the microlenses.

Description

Image display apparatus, electronic appliance, and integrated device
The present disclosure relates to an image display device that can be mounted on a head-mounted display or an EVF (Electronic View Finder), to an electronic appliance, such as a head-mounted display, having this image display device, and to an integrated device included in this image display device.

Head-mounted displays (VR (Virtual Reality), AR (Augmented Reality)) may use eye-sensing technology to detect the line of sight of the viewer.

According to Patent Document 1, the image monitor device incorporates an image generation device that displays an image of a subject, and an optical system and an eyepiece are arranged in front of it. A pair of infrared diodes that emit infrared rays toward the eyes of the observer are arranged near the bottom of the eyepiece. Between the eyepieces, a dichroic mirror is positioned at an inclination angle greater than 45 degrees with respect to the optical axis of the eyepieces, and the dichroic mirror reflects each of the corneal reflection images of the observer. Each corneal reflection image reflected by the dichroic mirror is condensed by an imaging lens and formed on an image sensor (abstract).

JP-A-8-9205

Because the optical system for eye sensing constrains the form factor, performance may deteriorate, especially when the overall size is reduced. For example, the technique of Patent Document 1 can be implemented in a large image monitor device that has space for a dichroic mirror, but in a small image monitor device the focal length of the entire system becomes short, so a dichroic mirror cannot be inserted.

In view of the circumstances described above, the purpose of the present disclosure is to provide a small image display device that can be mounted on a head-mounted display or an EVF with a compact eye-sensing optical system, thereby reducing the influence on the form factor.
An image display device according to one aspect of the present disclosure includes:
an optical system; and
an integrated device having:
an image light projecting section that emits image light that is imaged on the retina through the optical system;
an eye-sensing light projecting section that emits eye-sensing light that is reflected by the pupil;
an eye-sensing light receiving section that receives the eye-sensing light reflected by the pupil via the optical system; and
a light-receiving microlens that forms an image, on the eye-sensing light receiving section, of the eye-sensing light reflected by the pupil and incident through the optical system,
wherein the light-receiving microlens is included neither in the optical path of the image light nor in the optical path along which the eye-sensing light reaches the pupil, and
the image light projecting section and the entrance pupil of the light-receiving microlens are arranged at the same position in the optical axis direction of the optical system.
By using an integrated device, the degree of freedom of the form factor is increased, the device can be miniaturized, and, because the eye-sensing light receiving section also captures a frontal view, the accuracy of eye sensing can be increased.

The eye-sensing light receiving section may have a cut filter that cuts the image light, or may have the light-receiving microlens cut the image light.

As a result, the accuracy of eye sensing can be increased.

The light projected from the eye-sensing light projecting section may pass through the optical system and be reflected by the pupil, and the eye-sensing light projecting section and the entrance pupil of the light-receiving microlens may be arranged at the same position in the optical axis direction of the optical system.

This increases the degree of freedom of the form factor, allows miniaturization, and raises the accuracy of eye sensing because the eye-sensing light receiving section also captures a frontal view.

The eye-sensing light projecting section may have a light-projecting microlens for increasing the amount of the eye-sensing light.

Because the light-projecting microlens increases the amount of eye-sensing light, the arrangement can also be used in a triple-pass system.

The light-projecting microlens of the eye-sensing light projecting section may be decentered so that the direction of the eye-sensing light coincides with the exit angle direction of the image light.

Here too, because the light-projecting microlens increases the amount of eye-sensing light, the arrangement can also be used in a triple-pass system.
Where a radial direction of the effective image circle from the center of the image light projecting section is defined as an α direction,
the direction orthogonal to the α direction is defined as a β direction,
D is the distance from the center of the image light projecting section to the center of the eye-sensing light projecting section,
B is the pupil diameter of the image light projecting section,
the exit angles from D with respect to B are θα+_D, θα−_D, θβ+_D, θβ−_D and θα_Center_D, with θα_Center_D being the chief ray angle,
the projection range of the eye-sensing light projecting section is CαPRJ in the α direction and CβPRJ in the β direction, and
the range angles of the eye-sensing light with respect to the range C are ξα+_PRJ, ξα−_PRJ, ξβ+_PRJ and ξβ−_PRJ,
the microlens of the eye-sensing light projecting section may be decentered so as to satisfy expressions (1-1'), (1-2'), (1-3') and (1-4'):
 ξα+_PRJ − θα_Center_D ≥ (CαPRJ/B)·(θα+_D − θα_Center_D)   (1-1')
 ξα−_PRJ − θα_Center_D ≤ (CαPRJ/B)·(θα−_D − θα_Center_D)   (1-2')
 ξβ+_PRJ ≥ (CβPRJ/B)·θβ_D   (1-3')
 ξβ−_PRJ ≤ (CβPRJ/B)·θβ_D   (1-4')
The light-receiving microlens of the eye-sensing light receiving section may be decentered so that the eye-sensing light reflected by the pupil is imaged at the center of the eye-sensing light receiving section.

This makes accurate eye sensing possible.
Where a radial direction of the effective image circle from the center of the image light projecting section is defined as an α direction,
the direction orthogonal to the α direction is defined as a β direction,
A is the distance from the center of the image light projecting section to the center of the eye-sensing light receiving section,
B is the pupil diameter of the image light projecting section,
the exit angles from A with respect to B are θα+_A, θα−_A, θβ+_A, θβ−_A and θα_Center_A, with θα_Center_A being the chief ray angle,
the light-receiving range of the eye-sensing light receiving section is CαDET in the α direction and CβDET in the β direction, and
the light-receiving range angles of the eye-sensing light with respect to the range C are ξα+_DET, ξα−_DET, ξβ+_DET and ξβ−_DET,
the light-receiving microlens of the eye-sensing light receiving section may be decentered so as to satisfy expressions (1-1), (1-2), (1-3) and (1-4):
 ξα+_DET − θα_Center_A ≥ (CαDET/B)·(θα+_A − θα_Center_A)   (1-1)
 ξα−_DET − θα_Center_A ≤ (CαDET/B)·(θα−_A − θα_Center_A)   (1-2)
 ξβ+_DET ≥ (CβDET/B)·θβ_A   (1-3)
 ξβ−_DET ≤ (CβDET/B)·θβ_A   (1-4)
In a plane orthogonal to the optical axis of the optical system, the image light projecting section may be provided between the optical axis and the eye-sensing light projecting section and the eye-sensing light receiving section.

As a result, an appropriate optical path can be obtained with only a minimum of additional components, so a compact image display device can be realized.

The image light projecting section may have a rectangular shape in a plane orthogonal to the optical axis of the optical system, and the eye-sensing light receiving section may be positioned within the effective image circle of the optical system on a straight line connecting the center of a long side of the image light projecting section and the optical axis of the optical system.

The imaging resolution of the image light projected from the image light projecting section then does not deteriorate, and a high resolution can be achieved.

The eye-sensing light receiving section may include a plurality of light-receiving unit sub-pixels separated from each other; there may be a plurality of light-receiving microlenses, each of which forms an image of the eye-sensing light on a corresponding light-receiving unit sub-pixel.

In this way, because each light-receiving unit pixel of the eye-sensing light receiving section has a plurality of light-receiving unit sub-pixels separated from each other, each light-receiving unit sub-pixel can image only the eye-sensing light from a specific object point on the pupil. Furthermore, although the Z-direction positions of the image light projecting section, which emits the image light to be imaged on the retina, and of the eye-sensing light projecting section are substantially equal, the light-receiving microlenses allow the eye-sensing light to be imaged on each light-receiving unit pixel of the eye-sensing light receiving section. As a result, the miniaturization and the increased form-factor flexibility obtained by using an integrated device can be achieved together with improved eye-sensing accuracy.
The plurality of light-receiving unit sub-pixels may be separated from each other by providing a partition extending in the optical axis direction of the optical system between adjacent light-receiving unit sub-pixels, or by shielding the periphery of each light-receiving unit sub-pixel.

By shielding each light-receiving unit pixel, the resolution is improved; by also obtaining a sufficient amount of eye-sensing light, the calculation is kept simple and the arrangement can be used even in a triple-pass system.

The plurality of light-receiving unit sub-pixels included in some light-receiving unit pixels may be offset from the plurality of light-receiving unit sub-pixels included in other light-receiving unit pixels.

By offsetting each light-receiving unit pixel, the pseudo-resolution can be increased while minimizing the loss of eye-sensing light; by also obtaining a sufficient amount of light, the calculation is kept simple and the arrangement can be used even in a triple-pass system.
The image light projecting section may have an inner region and an outer region, the pixel size of the outer region may be larger than the pixel size of the inner region, and the eye-sensing light receiving section may be provided as a pixel within the outer region of the image light projecting section.

By providing the eye-sensing light receiving section as a pixel in the outer region of the image light projecting section, the eye-sensing light receiving section can be arranged further inward, so it is less affected by aberrations of the VR optical system and the image quality improves.

The eye-sensing light projecting section may also be provided as a pixel within the outer region of the image light projecting section.

By providing the eye-sensing light projecting section as a pixel in the outer region of the image light projecting section, highly accurate eye sensing becomes possible.
An electronic appliance according to one aspect of the present disclosure includes any one of the image display devices described above.

An integrated device according to one aspect of the present disclosure includes:
an image light projecting section that emits image light that is imaged on the retina through an optical system;
an eye-sensing light projecting section that emits eye-sensing light that is reflected by the pupil;
an eye-sensing light receiving section that receives the eye-sensing light reflected by the pupil via the optical system; and
a light-receiving microlens that forms an image, on the eye-sensing light receiving section, of the eye-sensing light reflected by the pupil and incident through the optical system,
wherein the light-receiving microlens is included neither in the optical path of the image light nor in the optical path along which the eye-sensing light reaches the pupil, and
the image light projecting section and the entrance pupil of the light-receiving microlens are arranged at the same position in the optical axis direction of the optical system.
FIG. 1 shows a typical image display device.
FIG. 2 schematically shows light projection and light reception.
FIG. 3 schematically shows a cross-sectional view of an eyeball.
FIG. 4 is a side view schematically showing an image display device according to the first embodiment.
FIG. 5 is a side view showing an optical system.
FIG. 6 is a front view schematically showing an integrated device.
FIG. 7 is a side view schematically showing an image light projecting section.
FIG. 8 is a side view schematically showing an eye-sensing light receiving section.
FIG. 9 is a side view schematically showing an eye-sensing light receiving section.
FIG. 10 is a graph showing the relationship between defocusing position and contrast.
FIG. 11 is a graph showing the exit angles of the image light projecting section.
FIG. 12 shows the distribution of eye-sensing light emitted by the eye-sensing light projecting section.
FIG. 13 shows the imaging range of the eye-sensing light receiving section and the projection range of the eye-sensing light projecting section.
FIG. 14 is a front view schematically showing an integrated device according to the second embodiment.
FIG. 15 is a graph showing the relationship between spatial frequency and contrast.
FIG. 16 shows an example of image correction.
FIG. 17 is a diagram for explaining crosstalk.
FIG. 18 is a side view and a partially enlarged view schematically showing an optical system.
FIG. 19 is a front view schematically showing an integrated device according to the third embodiment.
FIG. 20 is a graph showing the relationship between spatial frequency and contrast.
FIG. 21 is a front view schematically showing an integrated device according to the fourth embodiment.
FIG. 22 is a front view schematically showing an eye-sensing light receiving section.
FIG. 23 is a side view schematically showing an eye-sensing light receiving section.
FIG. 24 is a front view schematically showing light-receiving unit pixels included in an eye-sensing light receiving section according to the fifth embodiment.
FIG. 25 is a front view schematically showing an integrated device according to the sixth embodiment.
FIG. 26 is a front view schematically showing the inner region and the outer region of an image light projecting section.
FIG. 27 is a front view schematically showing an integrated device according to the seventh embodiment.
FIG. 28 is a front view schematically showing the inner region and the outer region of an image light projecting section.
FIG. 29 is a diagram for explaining the case where the viewer wears glasses.
FIG. 30 is a front view schematically showing an integrated device according to the ninth embodiment.
FIG. 31 is a front view schematically showing an integrated device according to the tenth embodiment.
FIG. 32 is a front view schematically showing an integrated device according to the eleventh embodiment.
FIG. 33 shows an image display device according to the third embodiment.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.

I. First embodiment

1. Concept of this embodiment

Image display devices mounted on VR head-mounted displays are required to offer high resolutions such as 8K or 16K. On the other hand, when high-resolution data such as 8K or 16K is displayed over the entire display area, the transmission bandwidth to the image display panel may be insufficient. Foveated rendering is effective for securing the transmission bandwidth. Foveated rendering exploits the fact that the human eye has high resolution essentially only at the fixation point, with the resolution falling off sharply in the periphery: a high-resolution image is displayed only at the fixation point and a low-resolution image is displayed in the peripheral region, thereby securing the transmission bandwidth. To detect the position of the fixation point, eye-sensing techniques such as the following are typically used.
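As a rough, purely illustrative sketch of the saving that foveated rendering provides (not part of the disclosure; the window size, downscale factor, and use of NumPy are assumptions), the following keeps a full-resolution window only around the detected gaze point and coarsens the rest of the frame:

```python
# Minimal foveated-rendering sketch (illustration only).
# Only a window around the gaze point stays at full resolution; the rest of
# the frame is represented by a coarsely subsampled version, which is what
# reduces the data that must be transmitted to the display panel.
import numpy as np

def foveate(frame: np.ndarray, gaze_xy: tuple[int, int],
            fovea_radius: int = 128, downscale: int = 4) -> np.ndarray:
    """Return a frame that is sharp near gaze_xy and coarse elsewhere."""
    h, w = frame.shape[:2]
    # Coarse background: subsample, then nearest-neighbour upscale back to full size.
    coarse = frame[::downscale, ::downscale]
    coarse = np.repeat(np.repeat(coarse, downscale, axis=0), downscale, axis=1)[:h, :w]
    out = coarse.copy()
    # Paste the full-resolution fovea window back in around the gaze point.
    x, y = gaze_xy
    x0, x1 = max(0, x - fovea_radius), min(w, x + fovea_radius)
    y0, y1 = max(0, y - fovea_radius), min(h, y + fovea_radius)
    out[y0:y1, x0:x1] = frame[y0:y1, x0:x1]
    return out

frame = np.random.randint(0, 255, (1080, 1920, 3), dtype=np.uint8)
foveated = foveate(frame, gaze_xy=(960, 540))
```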
FIG. 1 shows a typical image display device.

A typical image display device 10 mounted on a VR head-mounted display has an image light projecting section 11, an optical system 12, an eye-sensing light projecting section 15, and an eye-sensing light receiving section 14. The image light projecting section 11 emits image light (RGB light) that is to form an image on the retina. The optical system 12 makes the parallel image light (broken lines in the figure) enter the pupil and forms an image of the image light on the retina of the eye 16. The eye-sensing light projecting section 15 is an infrared projector that emits eye-sensing light (infrared light, solid lines in the figure) that is reflected by the pupil (cornea or crystalline lens). The eye-sensing light receiving section 14 is an infrared camera that receives the eye-sensing light (infrared light) reflected by the pupil and forms a Purkinje image (object information on the pupil), which is an object point on the pupil.

Because the image light projecting section 11 and the optical system 12 make the image light enter the pupil as parallel light, they require angle information at the pupil. The eye-sensing optical path of the eye-sensing light projecting section 15, on the other hand, forms an image of an object point on the pupil on the eye-sensing light receiving section 14, so it requires object information at the pupil. The image optical path and the eye-sensing optical path thus require different information at the pupil. For this reason, the eye-sensing light projecting section 15 and the eye-sensing light receiving section 14 are typically installed independently of the image light projecting section 11, separated from it in the optical axis direction of the optical system (hereinafter also referred to as the Z direction). This leads to form-factor constraints, to an enlarged VR lens, especially when the angle of view is large, and to degraded eye-sensing accuracy in oblique viewing.

In view of the circumstances described above, in the image display device of the present embodiment the eye-sensing light projecting section and the eye-sensing light receiving section are installed at substantially the same position as the image light projecting section in the optical axis direction (Z direction) of the optical system. More specifically, the image light projecting section, the eye-sensing light projecting section, and the eye-sensing light receiving section are configured as an integrated device formed on a plane orthogonal to the Z direction. By using an integrated device, the image display device of the present embodiment aims to increase the degree of freedom of the form factor, to reduce the size, and to raise the accuracy of eye sensing because the eye-sensing light receiving section also captures a frontal view.

FIG. 2 schematically shows light projection and light reception.

As shown in (A), light projection from the integrated device can be regarded as converting position information (Z position) at the light projecting section of the integrated device into angle information (Z position) at the pupil and position information (Z position) at the retina. As shown in (B), light reception at the integrated device can be regarded as converting angle information (Z position) at the retina and position information (Z position) at the pupil into angle information (Z position) at the light receiving section of the integrated device. The angle information at the light receiving section of the integrated device cannot be acquired directly, but it can be obtained by acquiring the light field. A light field is a vector function that represents the amount of light flowing in multiple directions through a point in space. As shown in (C), angle information in the XY plane orthogonal to the Z direction can be reconstructed from the light field.

FIG. 3 schematically shows a cross-sectional view of an eyeball.

As described above, in the present embodiment both the image light and the eye-sensing light are emitted from the integrated device. The image light forms an image on the retina, whereas the Purkinje image of the eye-sensing light forms a virtual image at the pupil (cornea, crystalline lens). The imaging position of the image light and the imaging position of the eye-sensing light therefore differ in the Z direction. When light is received by the integrated device, this Z-direction difference results in defocus and blurring, and this problem must be addressed. The image display device according to the present embodiment therefore has the following configuration.
2. Configuration of the image display device

FIG. 4 is a side view schematically showing the image display device according to the first embodiment.

Hereinafter, an XY plane view of the image display device viewed from the optical axis direction (Z direction) of the optical system is referred to as a front view, and a view of the image display device viewed from a direction orthogonal to the Z direction is referred to as a side view.

The image display device 100 is mounted on, for example, a head-mounted display (not shown). Specifically, two image display devices 100 are mounted on one head-mounted display, corresponding to both eyes of the viewer.

The image display device 100 has an integrated device 110 and an optical system 120.
FIG. 5 is a side view showing the optical system.

The optical system 120 makes parallel image light enter the pupil and forms an image of the image light on the retina of the eye 16. The optical system 120 includes, for example, in order from the light-emitting surface of the integrated device 110, a polarizing plate 121, a quarter-wave plate 122, a lens 123, a half mirror 124, a quarter-wave plate 125, a polarizing plate 126, and a lens 127. The illustrated configuration of the optical system 120 is an example of a VR optical system and is not limited to the illustrated configuration. However, if the configuration of the VR optical system differs, the angles of the peripheral rays change, so the light-field capture angle must be designed accordingly.

FIG. 6 is a front view schematically showing the integrated device.

The integrated device 110 has an image light projecting section 140, an eye-sensing light projecting section 150, and an eye-sensing light receiving section 130. The eye-sensing light receiving section 130 is desirably formed on the same wafer, but it does not necessarily have to be formed on the same wafer. The image light projecting section 140, the eye-sensing light projecting section 150, and the eye-sensing light receiving section 130 are arranged at the same position in the optical axis direction (Z direction) of the optical system 120. In other words, they are arranged on a common XY plane orthogonal to the Z direction.
FIG. 7 is a side view schematically showing the image light projecting section.

The image light projecting section 140 emits image light that passes through the optical system 120 and forms an image on the retina. Specifically, the image light projecting section 140 is a μOLED (micro organic light-emitting diode) panel and emits RGB light. The image light projecting section 140 has a plurality of light-emitting cells 141 and a lens 142 on each light-emitting cell 141. Each light-emitting cell 141 includes a color filter 144 and a light-emitting layer 145. The lens 142 serves to image the image light emitted by the light-emitting cell 141 onto the retina more efficiently. The lenses 142 are covered with a cover glass 143. In other words, the image light emitted by the image light projecting section 140 forms an image on the retina of the viewer through the optical system 120 and the lenses 142 provided on the respective light-emitting cells 141 of the image light projecting section 140.

Besides a μOLED, the image light projecting section 140 may be a light-emitting device such as an LCD (Liquid Crystal Display), LCOS (Liquid Crystal on Silicon), LED (light-emitting diode), LD (Laser Diode), VCSEL (Vertical Cavity Surface Emitting Laser), or QD (Quantum Dot) OLED.

The image light projecting section 140 has, for example, a rectangular shape in the XY plane orthogonal to the optical axis (Z direction) of the optical system 120. The size of the RGB light-emitting area of the image light projecting section 140 is, for example, 1.28 inches (= 32.5 mm) diagonally, 26 mm wide × 19.5 mm high. The diameter of the effective image circle (that is, the effective diameter of the image light projecting section 140) is φ32.5 mm, equal to the diagonal distance (32.5 mm). The center 21 of the image light projecting section 140 coincides with the optical axis of the optical system 120. Hereinafter, the radial direction from the center 21 of the image light projecting section 140 (the radial direction of the effective image circle 22) is referred to as the α direction, and the direction orthogonal to the α direction as the β direction.
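As a quick, hedged consistency check of the example dimensions above (illustrative only; the helper names are hypothetical), the panel diagonal equals the stated effective-image-circle diameter, and the local α (radial) and β (orthogonal) directions can be computed at any point on the panel:

```python
# Sketch: panel diagonal and local alpha/beta directions (assumed convention:
# alpha = unit radial direction from the panel center, beta = 90 deg rotation).
import math

PANEL_W_MM, PANEL_H_MM = 26.0, 19.5
diag_mm = math.hypot(PANEL_W_MM, PANEL_H_MM)   # 32.5 mm = effective image circle
print(f"diagonal / image-circle diameter: {diag_mm:.1f} mm")

def alpha_beta_dirs(x_mm: float, y_mm: float):
    """Unit alpha (radial) and beta (tangential) directions at point (x, y)."""
    r = math.hypot(x_mm, y_mm)
    ax, ay = (x_mm / r, y_mm / r) if r else (1.0, 0.0)
    return (ax, ay), (-ay, ax)

# e.g. at the eye-sensing light receiving section position (image height 10.6 mm)
print(alpha_beta_dirs(0.0, 10.6))
```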
A plurality of eye-sensing light projecting sections 150 (four in this example) are provided around the image light projecting section 140. The distance D from the center 21 of the image light projecting section 140 to each eye-sensing light projecting section 150 is, for example, 13 mm. The size of each eye-sensing light projecting section 150 is, for example, a 1.3 mm × 0.98 mm rectangle. Each eye-sensing light projecting section 150 emits eye-sensing light that passes through the optical system 120 and is reflected by the pupil. The eye-sensing light is, for example, infrared light. Each eye-sensing light projecting section 150 has an LED 151 and a light-projecting microlens 152 for increasing the amount of eye-sensing light. The light-projecting microlens 152 is provided on the LED 151. Because the light-projecting microlens 152 increases the amount of eye-sensing light, the arrangement can also be used in a triple-pass system. The light-projecting microlens 152 may be decentered so that the direction of the eye-sensing light coincides with the exit angle direction of the image light; here too, because the light-projecting microlens 152 increases the amount of eye-sensing light, the arrangement can be used in a triple-pass system.

FIG. 9 is a side view schematically showing the eye-sensing light receiving section.

The eye-sensing light receiving section 130 has light-receiving unit pixels 133, light-receiving microlenses 132, and a cut filter 134 (bandpass filter) that cuts the image light. The eye-sensing light receiving section 130 is, for example, an infrared light-receiving area of 1.3 mm wide × 0.98 mm high. The distance A from the center 21 of the image light projecting section 140 to the center of the eye-sensing light receiving section 130 is preferably as short as possible and is, for example, 10.6 mm. If an event-based vision sensor is used for the eye-sensing light receiving section 130, power consumption can be reduced.

The eye-sensing light receiving section 130 is positioned outside the image light projecting section 140 with respect to the center 21 of the image light projecting section 140, which corresponds to the optical axis of the optical system 120. In other words, in the XY plane orthogonal to the optical axis (Z direction) of the optical system 120, the image light projecting section 140 is located between the optical axis and the eye-sensing light projecting sections 150 and the eye-sensing light receiving section 130. Accordingly, the imaging resolution of the image light projected from the image light projecting section 140 does not deteriorate, and higher resolution and better manufacturability can be achieved.

The eye-sensing light receiving section 130 is located on a straight line connecting the center of a long side of the rectangular image light projecting section 140 and the optical axis of the optical system 120 (the center 21 of the image light projecting section 140), within the effective image circle 22 of the optical system 120. Accordingly, the imaging resolution of the image light projected from the image light projecting section 140 does not deteriorate, and a high resolution can be achieved.
The eye-sensing light receiving section 130 consists of light-receiving unit pixels 131, for example of size 3.25 μm, with a resolution of 156 LP/mm, 400 × 300 pixels, and F = 1.29.

The light-receiving microlenses 132 image the eye-sensing light, which is reflected by the pupil and enters through the optical system 120, onto the respective light-receiving unit pixels 131.

The light-receiving microlens 132 is included neither in the optical path of the image light nor in the optical path along which the eye-sensing light emitted by the eye-sensing light projecting section 150 passes through the optical system 120 and reaches the pupil. In other words, of the three optical paths, namely the image-light path, the eye-sensing light projection path, and the eye-sensing light reception path, the image-light path and the eye-sensing light projection path obtain an appropriate optical path with the optical system 120 alone. Only for the eye-sensing light reception path is the light-receiving microlens 132 added to the optical system 120 to obtain an appropriate optical path. An appropriate optical path can therefore be obtained with a minimum of additional components, so a compact image display device 100 can be realized.

In this way, although the image light projecting section 140, which emits the image light to be imaged on the retina, and the eye-sensing light projecting section 150 are located at substantially the same Z-direction position, the eye-sensing light can be imaged on each light-receiving unit pixel 131 of the eye-sensing light receiving section 130 by the light-receiving microlens 132 arranged at an appropriate position. As a result, the miniaturization and the increased form-factor flexibility obtained by using the integrated device 110 can be achieved together with improved eye-sensing accuracy.

On the image sensor surface of the eye-sensing light receiving section 130, eye-sensing light arriving at 20 degrees from above corresponds to the pupil center, so the center of the eye-sensing light receiving section 130 may be placed at a position shifted in the Y− direction with respect to the light-receiving microlens 132, although this is not essential. For example, the eye-sensing light receiving section 130 may be shifted by −0.49 mm in the Y direction. At the position indicated by the arrow in the figure, the entrance-pupil Z position of the light-receiving microlens 132 coincides with the OLED emission Z position. In the figure, n denotes the refractive index. The focal length of the light-receiving microlens 132 is 1.29 mm, which is substantially equal to its back focus of 1.37 mm. In other words, the configuration images parallel light, thereby converting angle information into position information to obtain an eye-sensing image.
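A hedged numerical check of the parallel-light imaging described above (assuming a simple f·tan(θ) angle-to-position mapping, which is my assumption rather than a statement from the disclosure): a 20-degree chief ray imaged by the 1.29 mm focal-length light-receiving microlens lands about 0.47 mm off axis, consistent with the roughly 0.49 mm sensor shift mentioned above.

```python
# Angle-to-position mapping for a microlens imaging parallel bundles
# (paraxial f*tan(theta) assumption; f taken from the example in the text).
import math

f_mm = 1.29  # focal length of the light-receiving microlens 132

def angle_to_position_mm(theta_deg: float) -> float:
    return f_mm * math.tan(math.radians(theta_deg))

print(angle_to_position_mm(20.0))   # ~0.47 mm, close to the ~0.49 mm shift
```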
FIG. 10 is a graph showing the relationship between the defocusing position of the light-receiving microlens 132 and the contrast.

When the center of the eye-sensing light receiving section 130 is shifted with respect to the light-receiving microlens 132, roughly two pixels are resolved.

FIG. 11 is a graph showing the angles that the optical system 120 can capture.

The capturable angle of the optical system 120 is an exit angle of 15 deg to 25 deg (center 20 deg) at an image height of 10.6 mm (that is, the distance A from the center 21 of the image light projecting section 140 to the center of the eye-sensing light receiving section 130 = 10.6 mm), and an exit angle of 22 deg to 33 deg (center 27.5 deg) at an image height of 13 mm (that is, the distance D from the center 21 of the image light projecting section 140 to the eye-sensing light projecting section 150 = 13 mm).
FIG. 12 shows the distribution of the eye-sensing light emitted by the eye-sensing light projecting section.

As shown in (A), the light distribution of a typical eye-sensing light projecting section (including an LED and a light-projecting microlens) is, for example, 20 deg.

In contrast, in the present embodiment, as shown in (B), the distribution of the eye-sensing light projecting section 150 (including the LED 151 and the light-projecting microlens 152) is widened and its orientation is aligned, for example 5.5 deg to 49.5 deg in the α direction and 42 deg in the β direction. The conditions for the projection and reception of the infrared light are calculated as follows.

As described above, let D be the distance from the center 21 of the image light projecting section 140 to the center of the eye-sensing light projecting section 150 (see FIG. 6). The radial direction from the center 21 of the image light projecting section 140 (the radial direction of the effective image circle 22) is the α direction, and the direction orthogonal to the α direction is the β direction.

First, with respect to the pupil diameter φB of the image light projecting section 140 (B is, for example, 4 mm; see FIG. 5), let the angles that the optical system 120 can capture at the image height D (see FIG. 11) be θα+_D, θα−_D, θβ+_D, θβ−_D and θα_Center_D, where θα_Center_D is the chief ray angle at the image height D.

Next, for the range C near the pupil that the eye-sensing light projecting section 150 should illuminate (for example Cα_PRJ = 16 mm, Cβ_PRJ = 16 mm), let the required eye-sensing light projection range angles be ξα+_PRJ, ξα−_PRJ, ξβ+_PRJ and ξβ−_PRJ.
 ξα+_PRJ − θα_Center_D ≥ (Cα_PRJ/B)·(θα+_D − θα_Center_D)   (1-1)
 ξα−_PRJ − θα_Center_D ≤ (Cα_PRJ/B)·(θα−_D − θα_Center_D)   (1-2)
 ξβ+_PRJ ≥ (Cβ_PRJ/B)·θβ_D   (1-3)
 ξβ−_PRJ ≤ (Cβ_PRJ/B)·θβ_D   (1-4)
 式(1-1)、(1-2)、(1-3)及び(1-4)を満たす様に、アイセンシング光投光部150の投光マイクロレンズ152は、アイセンシング光の方向が画像光の射出角方向に一致するように偏心してもよい。これにより、投光マイクロレンズ152がアイセンシング光の光量を増加することで、トリプルパス系でも使用可能である。 The projection microlenses 152 of the eye-sensing light projection unit 150 are arranged so that the direction of the eye-sensing light is It may be decentered so as to match the exit angle direction of the image light. As a result, the projection microlens 152 increases the light amount of the eye sensing light, so that it can be used even in a triple-pass system.
 式(1-1)、(1-2)、(1-3)及び(1-4)を満たす様に、アイセンシング光受光部130の受光マイクロレンズ132は、それぞれ、瞳で反射したアイセンシング光を複数の受光単位サブピクセル131の中心に結像させるように偏心してもよい。これにより、正確なアイセンシングが可能となる。 The light-receiving microlenses 132 of the eye-sensing light-receiving unit 130 each receive the eye-sensing It may be decentered so as to image the light to the center of the plurality of light receiving unit sub-pixels 131 . This enables accurate eye sensing.
The exit angles from the image height D (see FIGS. 6 and 11) are as follows.
θα+_D = 33 deg
θα−_D = 22 deg
θα_Center_D = 27.5 deg
θβ+_D = 5.2 deg
θβ−_D = −5.2 deg
Note that θβ+_D and θβ−_D are approximately equal to the upper and lower rays at an image height of 0 mm.
At this time, let the eye-sensing light projection range angles of the eye-sensing light projection unit 150 be ξα+_PRJ, ξα−_PRJ, ξβ+_PRJ and ξβ−_PRJ. The following relations hold.
ξα+_PRJ ≥ 49.5 deg
ξα−_PRJ ≤ 5.5 deg
ξβ+_PRJ ≥ 20.8 deg
ξβ−_PRJ ≤ −20.8 deg
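To make the relationship between the exit angles at the image height D and the required projection range angles concrete, the following is a minimal Python sketch that evaluates expressions (1-1) to (1-4) with the example values given above (B = 4 mm, Cα_PRJ = Cβ_PRJ = 16 mm). The function and variable names are illustrative only and are not part of the disclosure.

```python
# Minimal sketch (Python): required eye-sensing projection range angles from
# expressions (1-1) to (1-4). Names are illustrative; values are the examples
# given in the text (angles in degrees, lengths in millimetres).

def projection_range_angles(theta_plus, theta_minus, theta_center,
                            theta_beta, c_alpha, c_beta, b):
    """Return the bounds required by expressions (1-1) to (1-4)."""
    xi_alpha_plus = theta_center + (c_alpha / b) * (theta_plus - theta_center)    # (1-1)
    xi_alpha_minus = theta_center + (c_alpha / b) * (theta_minus - theta_center)  # (1-2)
    xi_beta_plus = (c_beta / b) * theta_beta            # (1-3), with θβ+_D = theta_beta
    xi_beta_minus = (c_beta / b) * (-theta_beta)        # (1-4), with θβ−_D = −theta_beta
    return xi_alpha_plus, xi_alpha_minus, xi_beta_plus, xi_beta_minus

# Example values for image height D (first embodiment).
print(projection_range_angles(theta_plus=33.0, theta_minus=22.0,
                              theta_center=27.5, theta_beta=5.2,
                              c_alpha=16.0, c_beta=16.0, b=4.0))
# -> (49.5, 5.5, 20.8, -20.8), matching the projection range angles above.
```

The resulting α range of 5.5 deg to 49.5 deg and β spread of about 41.6 deg correspond to the distribution described for FIG. 12(B).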
Similarly, for the eye-sensing light receiving range angles of the eye-sensing light receiving unit 130, the following exit angles hold for the image height A (see FIGS. 6 and 11).
θα+_A = 25 deg
θα−_A = 15 deg
θα_Center_A = 20 deg
θβ+_A = 5.2 deg
θβ−_A = −5.2 deg
Note that θβ+_A and θβ−_A are approximately equal to the upper and lower rays at an image height of 0 mm.
 図13は、アイセンシング光受光部の撮像範囲と、アイセンシング光投光部の投光範囲とを示す。 FIG. 13 shows the imaging range of the eye sensing light receiving section and the projection range of the eye sensing light projecting section.
For the eye-sensing light receiving unit 130 of 1.3 mm × 0.98 mm size, let ξα+_DET, ξα−_DET, ξβ+_DET and ξβ−_DET be the eye-sensing light receiving range angles for the range C near the pupil that is to be captured (for example, Cα_DET = 12 mm and Cβ_DET = 16 mm; the receiving coordinate system in FIG. 13). The following relations hold.
ξα+_DET − θα_Center_A ≥ (Cα_DET/B)·(θα+_A − θα_Center_A)   (1-1')
ξα−_DET − θα_Center_A ≤ (Cα_DET/B)·(θα−_A − θα_Center_A)   (1-2')
ξβ+_DET ≥ (Cβ_DET/B)·θβ_A   (1-3')
ξβ−_DET ≤ (Cβ_DET/B)·θβ_A   (1-4')

ξα+_DET = 35 deg
ξα−_DET = 5 deg
ξβ+_DET = 20.8 deg
ξβ−_DET = −20.8 deg
These values satisfy the above expressions. In this case, the imaging range of the eye-sensing light receiving unit 130 is rectangular. The eye-sensing light projection unit 150, on the other hand, uses Cα_PRJ = 16 mm and therefore illuminates a somewhat larger range; for example, its projection range is a circle (the projection coordinate system in FIG. 13) that encloses the rectangular imaging range of the eye-sensing light receiving unit 130.
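As a numerical check of expressions (1-1') to (1-4'), the following short Python sketch plugs in the exit angles at the image height A together with Cα_DET = 12 mm, Cβ_DET = 16 mm and B = 4 mm; the names are illustrative only.

```python
# Minimal sketch (Python): receiving range angles from expressions (1-1')-(1-4')
# using the example values for image height A (angles in deg, lengths in mm).

B = 4.0                                  # pupil diameter of the image light projection unit
C_ALPHA_DET, C_BETA_DET = 12.0, 16.0     # range near the pupil to be captured
THETA_PLUS_A, THETA_MINUS_A, THETA_CENTER_A, THETA_BETA_A = 25.0, 15.0, 20.0, 5.2

xi_alpha_plus_det = THETA_CENTER_A + (C_ALPHA_DET / B) * (THETA_PLUS_A - THETA_CENTER_A)    # (1-1')
xi_alpha_minus_det = THETA_CENTER_A + (C_ALPHA_DET / B) * (THETA_MINUS_A - THETA_CENTER_A)  # (1-2')
xi_beta_plus_det = (C_BETA_DET / B) * THETA_BETA_A                     # (1-3'), θβ+_A
xi_beta_minus_det = (C_BETA_DET / B) * (-THETA_BETA_A)                 # (1-4'), θβ−_A = −θβ+_A

print(xi_alpha_plus_det, xi_alpha_minus_det, xi_beta_plus_det, xi_beta_minus_det)
# -> 35.0 5.0 20.8 -20.8, matching the receiving range angles listed above.
```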
 以上の様に、本実施形態の画像表示装置100は、小型の一体型デバイス110に画像光投光部140、アイセンシング光投光部150及びアイセンシング光受光部130を持たせ、コンパクトでありフォームファクタの制約を減らすことができる。 As described above, the image display apparatus 100 of the present embodiment is compact because the image light projecting section 140, the eye sensing light projecting section 150, and the eye sensing light receiving section 130 are provided in the compact integrated device 110. Form factor constraints can be reduced.
 図29は、視者がメガネを装着している場合を説明するための図である。 FIG. 29 is a diagram for explaining a case where the viewer wears glasses.
As shown in (A), in a typical image display device 10, when the viewer wears glasses 31 there are flare light paths 32 in which the front reflection of light from an eye-sensing light projection unit 15 (infrared projector) is captured by the eye-sensing light receiving unit 14 (infrared camera). Because the plurality of eye-sensing light projection units 15 are arranged evenly, some flare light path 32 inevitably produces strong flare at the glasses 31 and enters the eye-sensing light receiving unit 14. This can be prevented by arranging a plurality of eye-sensing light receiving units 14, but doing so further constrains the form factor. In addition, the positional constraints on the eye-sensing light receiving unit 14 force steep oblique viewing angles, and the camera size of the eye-sensing light receiving unit 14 becomes extremely small, so the eye-sensing accuracy tends to deteriorate.
 一方、(B)に示す様に、本実施形態の画像表示装置100では、フレア光路32がアイセンシング光受光部130に入ることはもちろんあるが、複数のアイセンシング光受光部130を配置してもフォームファクタへの制約は最低限であるので、必要な数のアイセンシング光受光部130を配置可能である。また斜め視の角度も比較的緩くできるため、アイセンシング精度を保ちやすい。 On the other hand, as shown in (B), in the image display device 100 of the present embodiment, the flare light path 32 naturally enters the eye-sensing light receiving section 130, but a plurality of eye-sensing light receiving sections 130 are arranged. Since there are minimal restrictions on the form factor, the required number of eye-sensing light receiving units 130 can be arranged. Also, since the oblique viewing angle can be made relatively loose, it is easy to maintain eye sensing accuracy.
The cut filter 134 that blocks the image light may be replaced by a light-receiving microlens 132 made of a material that blocks the image light. The microlens 132 may also be a compound lens composed of a plurality of elements rather than a single lens.
 II.第2の実施形態 II. Second embodiment
 以下、既に説明した構成等と同様の構成等は説明及び図示を省略し、異なる点を主に説明及び図示する。 In the following, explanations and illustrations of configurations that are the same as those already explained will be omitted, and different points will mainly be explained and illustrated.
 図14は、第2の実施形態に係る一体型デバイスを模式的に示す正面図である。 FIG. 14 is a front view schematically showing an integrated device according to the second embodiment.
The integrated device 210 according to the second embodiment differs from the integrated device 110 according to the first embodiment in the number and positions of the eye-sensing light projection units 250 and the eye-sensing light receiving units 230, and in that the eye-sensing light receiving unit has a plurality of light-receiving unit sub-pixels separated from each other. The specifications of the image light projection unit 240 according to the second embodiment are the same as those of the image light projection unit 140 according to the first embodiment. The eye-sensing light receiving unit 230 is composed of light-receiving unit pixels 133 and has a plurality of light-receiving unit sub-pixels 131 separated from each other. For example, by providing a partition 135 extending in the optical axis direction (Z direction) of the optical system 120 between adjacent light-receiving unit sub-pixels 131, the plurality of light-receiving unit sub-pixels 131 are separated from each other. The light-receiving unit sub-pixels 131 are formed by dividing each light-receiving unit pixel 133 into 2×2 pixels (4 pixels) or more. The light-receiving microlenses 132 image the eye-sensing light, reflected by the pupil and incident through the optical system 120, onto the respective light-receiving unit pixels 133.
 アイセンシング光投光部250は、画像光投光部240の周囲に6個設けられる。6個のアイセンシング光投光部250は、円形に配置されるプルキンエ像を取得するため、略円周状に配置される。アイセンシング光投光部250のスペックは、第1の実施形態に係るアイセンシング光投光部150のスペックと同様である。アイセンシング光投光部250を6個設けることで、6個のプルキンエ像を取得できるため、アイセンシングをより正確に行うことができる。 Six eye-sensing light projection units 250 are provided around the image light projection unit 240 . The six eye-sensing light projectors 250 are arranged in a substantially circular shape to obtain a circularly arranged Purkinje image. The specifications of the eye-sensing light projection unit 250 are the same as those of the eye-sensing light projection unit 150 according to the first embodiment. Since six Purkinje images can be acquired by providing six eye sensing light projecting units 250, eye sensing can be performed more accurately.
 アイセンシング光受光部230は、画像光投光部240の周囲に8個設けられる。8個アイセンシング光受光部230の配置は図示の例に限定されない。8個のアイセンシング光受光部230をよりばらつかせて配置することで、ウェハ効率や取得信号ばらつきを調整することができる。アイセンシング光受光部230のサイズは、例えば、1.6mm×1mmである。各アイセンシング光受光部230は、複数(例えば、8×5個)の受光単位ピクセル133を含む。各受光単位ピクセル133(図8参照)のサイズは、例えば、200μmである。各受光単位ピクセル133は、互いに分離された複数(例えば、80分割)の受光単位サブピクセル131(図8参照)を有する。受光単位サブピクセル131のサイズは、例えば、2.5μmである。 Eight eye sensing light receiving units 230 are provided around the image light projecting unit 240 . The arrangement of the eight eye-sensing light receiving units 230 is not limited to the illustrated example. By arranging the eight eye-sensing light receiving units 230 with more variation, it is possible to adjust wafer efficiency and acquisition signal variation. The size of the eye sensing light receiving section 230 is, for example, 1.6 mm×1 mm. Each eye sensing light receiver 230 includes a plurality of (eg, 8×5) light receiving unit pixels 133 . The size of each light receiving unit pixel 133 (see FIG. 8) is, for example, 200 μm. Each light-receiving unit pixel 133 has a plurality of (for example, 80-division) light-receiving unit sub-pixels 131 (see FIG. 8) separated from each other. The size of the light-receiving unit sub-pixel 131 is, for example, 2.5 μm.
 受光単位ピクセル133を覆う受光マイクロレンズ132(図8参照)は、例えば、半径0.2157mm、非球面、レンズ全長0.64mm、f=0.64mmである。受光マイクロレンズ132は、最大角13.5degを100μmへ結像する。受光マイクロレンズ132は、回折限界2.5μm、MTF(Modulation Transfer Function)=200LP/mmである。受光マイクロレンズ132の焦点距離が短いため、この構成は厚みを薄くすることが可能であり、投光部と受光部のウェハ一体化に有利となる。 The light-receiving microlens 132 (see FIG. 8) covering the light-receiving unit pixel 133 has, for example, a radius of 0.2157 mm, an aspherical surface, a total lens length of 0.64 mm, and f=0.64 mm. The light-receiving microlens 132 images a maximum angle of 13.5 degrees to 100 μm. The light receiving microlens 132 has a diffraction limit of 2.5 μm and MTF (Modulation Transfer Function)=200 LP/mm. Since the focal length of the light-receiving microlens 132 is short, this configuration can be made thin, which is advantageous for wafer integration of the light-projecting section and the light-receiving section.
As a premise, the viewer's eyebox (the range over which the pupil moves) is φ8 mm. If the pupil size is φ4 mm, the extent of possible pupil positions is φ12 mm. When the focal length of the eyepiece of the optical system 220 is f = 18.6, the maximum on-axis angle θ of the image light projection unit 240, corresponding to a 1.28-inch panel, is θ = 18.5 deg. At the periphery the maximum angle is about 73% of this, θ = 13.5 deg. Given the φ12 mm extent of pupil positions, a receiving resolution of about 80 × 80 pixels is sufficient for the eye-sensing light receiving unit 230.
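The numbers in the preceding paragraph can be reproduced with simple arithmetic; the following sketch (illustrative names only) computes the pupil-position extent, the peripheral angle, and the sampling pitch implied by an 80 × 80 pixel receiver.

```python
# Minimal sketch (Python): eyebox arithmetic behind the 80 x 80 pixel estimate.
eyebox_mm = 8.0          # range over which the pupil centre moves
pupil_mm = 4.0           # assumed pupil diameter
pupil_position_mm = eyebox_mm + pupil_mm        # extent to be covered -> 12 mm

axial_max_deg = 18.5     # on-axis maximum angle for the 1.28-inch panel, f = 18.6
peripheral_deg = 0.73 * axial_max_deg           # about 13.5 deg at the periphery

pixels_per_side = 80
pitch_mm = pupil_position_mm / pixels_per_side  # sampling pitch at the pupil plane
print(pupil_position_mm, round(peripheral_deg, 1), pitch_mm)  # 12.0 13.5 0.15
```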
 図15は、受光マイクロレンズ132の空間周波数とコントラストとの関係を示すグラフである。 FIG. 15 is a graph showing the relationship between the spatial frequency of the light receiving microlens 132 and the contrast.
With the integrated device 210 configured as described above, the amount of light is, for example, +0.1 dB brighter than with four 1 mm eye-sensing light projection units and one 3 mm eye-sensing light receiving unit, so eye sensing can be performed more accurately. By increasing the amount of eye-sensing light, the device can also be used in a triple-pass system.
 図17は、クロストークを説明するための図である。 FIG. 17 is a diagram for explaining crosstalk.
Crosstalk is a phenomenon in which an electrical signal leaks into an adjacent element because of leakage of an optical signal incident on a given element. To prevent lateral light leakage between adjacent eye-sensing light receiving units 230 (that is, to prevent crosstalk), it is preferable to provide walls 34 in the Z direction or to keep the eye-sensing light receiving units 230 sufficiently far apart.
 図18は、光学系を模式的に示す側面図及び部分拡大図である。 FIG. 18 is a side view and a partially enlarged view schematically showing the optical system.
The positional relationship of the multi-lens array in the optical system 220 is as follows. When the Z position of the object plane of the image light projection unit 240, that is, its display surface (approximately the outermost surface), coincides with the Z position of the entrance pupil of the light-receiving microlens 132 of the eye-sensing light receiving unit 230, a Purkinje image without blur can be obtained. If these Z positions are misaligned, the Purkinje image becomes blurred.
 III.第3の実施形態 III. Third embodiment
 図19は、第3の実施形態に係る一体型デバイスを模式的に示す正面図である。図33は、第3の実施形態に係る画像表示装置を示す。 FIG. 19 is a front view schematically showing an integrated device according to the third embodiment. FIG. 33 shows an image display device according to the third embodiment.
The integrated device 310 according to the third embodiment differs from the integrated device 110 according to the first embodiment in the arrangement of the eye-sensing light projection units 15 (FIG. 33) and in the number and positions of the eye-sensing light receiving units 330, and in that the eye-sensing light receiving unit has a plurality of light-receiving unit sub-pixels separated from each other. The specifications of the image light projection unit 340 according to the third embodiment are the same as those of the image light projection unit 140 according to the first embodiment.
Six eye-sensing light projection units 15 (FIG. 33) are provided around the optical system 12. The six eye-sensing light projection units 15 (FIG. 33) are arranged substantially on a circle so as to obtain Purkinje images arranged in a circle. Each eye-sensing light projection unit 15 (FIG. 33) emits light with a Lambertian distribution. By providing six eye-sensing light projection units (not shown), six Purkinje images can be obtained, so eye sensing can be performed more accurately.
 アイセンシング光受光部330は、画像光投光部340の周囲に2個設けられる。2個アイセンシング光受光部330の配置は図示の例に限定されない。アイセンシング光受光部330のサイズは、例えば、18mm×3mmである。各アイセンシング光受光部330は、複数(例えば、90×15個)の受光単位ピクセル133(図8参照)を含む。各受光単位ピクセル133のサイズは、例えば、200μmである。各受光単位ピクセル133は、互いに分離された複数(例えば、80分割)の受光単位サブピクセル131(図8参照)を有する。受光単位サブピクセル131のサイズは、例えば、2.5μmである。 Two eye sensing light receiving units 330 are provided around the image light projecting unit 340 . The arrangement of the two eye sensing light receiving units 330 is not limited to the illustrated example. The size of the eye sensing light receiving section 330 is, for example, 18 mm×3 mm. Each eye sensing light receiver 330 includes a plurality of (eg, 90×15) light receiving unit pixels 133 (see FIG. 8). The size of each light receiving unit pixel 133 is, for example, 200 μm. Each light-receiving unit pixel 133 has a plurality of (for example, 80-division) light-receiving unit sub-pixels 131 (see FIG. 8) separated from each other. The size of the light-receiving unit sub-pixel 131 is, for example, 2.5 μm.
 受光単位ピクセル133を覆う受光マイクロレンズ132(図8参照)は、例えば、半径0.2157mm、非球面、レンズ全長0.64mm、f=0.64mmである。受光マイクロレンズ132は、最大角13.5degを100μmへ結像する。受光マイクロレンズ132は、回折限界2.5μm、MTF(Modulation Transfer Function)=200LP/mmである。 The light-receiving microlens 132 (see FIG. 8) covering the light-receiving unit pixel 133 has, for example, a radius of 0.2157 mm, an aspherical surface, a total lens length of 0.64 mm, and f=0.64 mm. The light-receiving microlens 132 images a maximum angle of 13.5 degrees to 100 μm. The light receiving microlens 132 has a diffraction limit of 2.5 μm and MTF (Modulation Transfer Function)=200 LP/mm.
 前提として、視者のアイボックス(瞳が動く範囲)をφ8mmとする。このとき、瞳サイズをφ4mmとすれば、瞳位置の大きさはφ12mmである。光学系320の接眼レンズのf=18.6のとき、画像光投光部340の1.28インチパネル相当の軸上最大角θは、θ=18.5degである。周辺の軸上最大角θは、約73%であり、θ=13.5degである。瞳位置の大きさがφ12mmであるとすれば、アイセンシング光受光部330の受光解像度は80×80ピクセル程度で十分である。 As a premise, the viewer's eyebox (the range in which the pupil moves) is φ8 mm. At this time, if the pupil size is φ4 mm, the size of the pupil position is φ12 mm. When the eyepiece lens f of the optical system 320 is f=18.6, the maximum axial angle θ of the image light projection unit 340 corresponding to a 1.28-inch panel is θ=18.5 deg. The peripheral maximum on-axis angle θ is about 73% and θ=13.5 deg. Assuming that the size of the pupil position is φ12 mm, the light receiving resolution of the eye sensing light receiving section 330 is sufficient with about 80×80 pixels.
 図16は、画像の補正例を示す。 FIG. 16 shows an example of image correction.
 アイセンシング光受光部330が最大角に近い位置で画像を取得する場合、各アイセンシング光受光部330の像高位置によって、取得可能な像が少しずつシフトする。これを避けるにはレンズシフト量を各中心像高からの距離に応じて変えればよい。あるいは、信号処理の段階でシフトを加味して図示の2種の画像を合成すればよい。 When the eye-sensing light receiving units 330 acquire an image at a position close to the maximum angle, the obtainable image shifts little by little depending on the image height position of each eye-sensing light receiving unit 330 . To avoid this, the lens shift amount should be changed according to the distance from each central image height. Alternatively, the two types of images shown in the figure may be synthesized by adding a shift in the signal processing stage.
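One way to realize the signal-processing alternative is to shift the image from each eye-sensing light receiving unit by its known offset before combining. The sketch below is a minimal illustration with NumPy, assuming integer pixel shifts that would in practice be derived from each receiver's image-height position; it is not a description of the actual processing pipeline.

```python
# Minimal sketch (Python/NumPy): combine two receiver images whose content is
# shifted by a known amount that depends on each receiver's image height.
import numpy as np

def combine_shifted(images, shifts):
    """images: list of equally sized 2-D arrays; shifts: list of (dy, dx)
    integer offsets bringing each image into a common frame."""
    aligned = [np.roll(img, shift=(dy, dx), axis=(0, 1))
               for img, (dy, dx) in zip(images, shifts)]
    return np.mean(aligned, axis=0)   # simple average after alignment

# Illustrative use: two 80 x 80 receiver images, the second shifted by 3 pixels.
img_a = np.random.rand(80, 80)
img_b = np.roll(img_a, shift=(0, 3), axis=(0, 1))
combined = combine_shifted([img_a, img_b], shifts=[(0, 0), (0, -3)])
print(np.allclose(combined, img_a))   # True: the shift has been compensated
```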
 図20は、受光マイクロレンズ132の空間周波数とコントラストとの関係を示すグラフである。 FIG. 20 is a graph showing the relationship between the spatial frequency of the light receiving microlens 132 and the contrast.
With the integrated device 310 configured as described above, the amount of light is, for example, +9.6 dB brighter than with four 1 mm eye-sensing light projection units and one 3 mm eye-sensing light receiving unit, so eye sensing can be performed more accurately. The light-receiving area can be enlarged, and by increasing the amount of eye-sensing light the device can also be used in a triple-pass system.
 IV.第4の実施形態 IV. Fourth embodiment
 図21は、第4の実施形態に係る一体型デバイスを模式的に示す正面図である。 FIG. 21 is a front view schematically showing an integrated device according to the fourth embodiment.
The integrated device 410 according to the fourth embodiment differs from the integrated device 110 according to the first embodiment in the number and positions of the eye-sensing light projection units 450 and the eye-sensing light receiving units 430, and in that the eye-sensing light receiving unit has a plurality of light-receiving unit sub-pixels separated from each other. The specifications of the image light projection unit 440 according to the fourth embodiment are the same as those of the image light projection unit 140 according to the first embodiment.
 アイセンシング光投光部450は、画像光投光部440の周囲に6個設けられる。6個のアイセンシング光投光部450は、円形に配置されるプルキンエ像を取得するため、略円周状に配置される。アイセンシング光投光部450のスペックは、第1の実施形態に係るアイセンシング光投光部150のスペックと同様である。アイセンシング光投光部450を6個設けることで、6個のプルキンエ像を取得できるため、アイセンシングをより正確に行うことができる。 Six eye-sensing light projection units 450 are provided around the image light projection unit 440 . The six eye-sensing light projectors 450 are arranged in a substantially circular shape to obtain a circularly arranged Purkinje image. The specifications of the eye-sensing light projection unit 450 are the same as those of the eye-sensing light projection unit 150 according to the first embodiment. Since six Purkinje images can be acquired by providing six eye sensing light projecting units 450, eye sensing can be performed more accurately.
 アイセンシング光受光部430は、画像光投光部440の周囲に2個設けられる。2個アイセンシング光受光部430の配置は図示の例に限定されない。アイセンシング光受光部430のサイズは、例えば、12mm×3mmである。各アイセンシング光受光部430は、複数(例えば、60×15個)の受光単位ピクセル433を含む。各受光単位ピクセル433のサイズは、例えば、200μmである。各受光単位ピクセル433は、互いに分離された複数(例えば、80分割)の受光単位サブピクセル431を有する。受光単位サブピクセル431のサイズは、例えば、2.5μmである。 Two eye sensing light receiving units 430 are provided around the image light projecting unit 440 . The arrangement of the two eye sensing light receiving units 430 is not limited to the illustrated example. The size of the eye sensing light receiving section 430 is, for example, 12 mm×3 mm. Each eye sensing light receiver 430 includes a plurality of (eg, 60×15) light receiving unit pixels 433 . The size of each light receiving unit pixel 433 is, for example, 200 μm. Each light-receiving unit pixel 433 has a plurality of light-receiving unit sub-pixels 431 (for example, 80 divisions) separated from each other. The size of the light-receiving unit sub-pixel 431 is, for example, 2.5 μm.
 受光単位ピクセル433を覆う受光マイクロレンズ432は、例えば、半径0.2157mm、非球面、レンズ全長0.64mm、f=0.64mmである。受光マイクロレンズ432は、最大角13.5degを100μmへ結像する。 The light-receiving microlens 432 covering the light-receiving unit pixel 433 has, for example, a radius of 0.2157 mm, an aspherical surface, a total lens length of 0.64 mm, and f=0.64 mm. The light-receiving microlens 432 images a maximum angle of 13.5 degrees to 100 μm.
 前提として、視者のアイボックス(瞳が動く範囲)をφ8mmとする。このとき、瞳サイズをφ4mmとすれば、瞳位置の大きさはφ12mmである。光学系320の接眼レンズのf=18.6のとき、画像光投光部340の1.28インチパネル相当の軸上最大角θは、θ=18.5degである。周辺の軸上最大角θは、約73%であり、θ=13.5degである。瞳位置の大きさがφ12mmであるとすれば、アイセンシング光受光部330の受光解像度は80×80ピクセル程度で十分である。 As a premise, the viewer's eyebox (the range in which the pupil moves) is φ8 mm. At this time, if the pupil size is φ4 mm, the size of the pupil position is φ12 mm. When the eyepiece lens f of the optical system 320 is f=18.6, the maximum axial angle θ of the image light projection unit 340 corresponding to a 1.28-inch panel is θ=18.5 deg. The peripheral maximum on-axis angle θ is about 73% and θ=13.5 deg. Assuming that the size of the pupil position is φ12 mm, the light receiving resolution of the eye sensing light receiving section 330 is sufficient with about 80×80 pixels.
 図22は、アイセンシング光受光部を模式的に示す正面図である。図23は、アイセンシング光受光部を模式的に示す側面図である。 FIG. 22 is a front view schematically showing the eye sensing light receiving part. FIG. 23 is a side view schematically showing the eye sensing light receiving section.
 各受光単位ピクセル433は、互いに分離された複数の受光単位サブピクセル431を有する。例えば、各受光単位ピクセル433が2×2ピクセル(4ピクセル)以上に分割され、4個のサブピクセルのうち3個が遮蔽板435により遮蔽される。言い換えれば、各受光単位ピクセル433は、ランダムに1/4サイズに遮蔽される。この様に、受光単位サブピクセル431の周囲が遮蔽されることにより、複数の受光単位サブピクセル431が互いに分離される。この構成において疑似解像度は160×160ピクセルである。 Each light-receiving unit pixel 433 has a plurality of light-receiving unit sub-pixels 431 separated from each other. For example, each light-receiving unit pixel 433 is divided into 2×2 pixels (4 pixels) or more, and 3 out of 4 sub-pixels are shielded by the shielding plate 435 . In other words, each light-receiving unit pixel 433 is randomly shielded to a quarter size. By shielding the periphery of the light-receiving unit sub-pixel 431 in this manner, the plurality of light-receiving unit sub-pixels 431 are separated from each other. The pseudo-resolution in this configuration is 160×160 pixels.
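The random 1/4-size shielding can be pictured as a binary mask in which each 2 × 2 group of sub-pixels keeps exactly one randomly chosen opening. The sketch below builds such a mask with NumPy; it is a simplified illustration, not the actual sensor layout, and the function name is hypothetical.

```python
# Minimal sketch (Python/NumPy): random quarter shielding of 2 x 2 unit pixels.
# Each unit pixel keeps exactly one open sub-pixel at a random position, so an
# H x W array of unit pixels samples a 2H x 2W sub-pixel grid sparsely.
import numpy as np

def quarter_shield_mask(h_units, w_units, seed=None):
    rng = np.random.default_rng(seed)
    mask = np.zeros((2 * h_units, 2 * w_units), dtype=bool)
    for i in range(h_units):
        for j in range(w_units):
            k = rng.integers(4)          # which of the 4 sub-pixels stays open
            dy, dx = divmod(k, 2)
            mask[2 * i + dy, 2 * j + dx] = True
    return mask

mask = quarter_shield_mask(80, 80, seed=0)
print(mask.shape, mask.sum())   # (160, 160) 6400: one opening per unit pixel
```

The 160 × 160 mask grid corresponds to the pseudo-resolution stated above, while only one quarter of the sub-pixel positions actually collect light.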
With the integrated device 410 configured as described above, the amount of light is, for example, +1.8 dB brighter than with four 1 mm eye-sensing light projection units and one 3 mm eye-sensing light receiving unit, so eye sensing can be performed more accurately. By randomly shielding each light-receiving unit pixel 433, the resolution is improved while a sufficient amount of eye-sensing light is still obtained, which simplifies the computation and also allows use in a triple-pass system.
 V.第5の実施形態  V. Fifth embodiment
 図24は、第5の実施形態に係るアイセンシング光受光部に含まれる受光単位ピクセルを模式的に示す正面図である。 FIG. 24 is a front view schematically showing light-receiving unit pixels included in an eye-sensing light-receiving portion according to the fifth embodiment.
 第5の実施形態に係る一体型デバイスと第4の実施形態に係る一体型デバイス410との差異は、受光単位ピクセル533の構成である。第5の実施形態に係る一体型デバイスの正面図は、第4の実施形態に係る一体型デバイス410の正面図(図21)と同様であるため、図示を省略する。 The difference between the integrated device according to the fifth embodiment and the integrated device 410 according to the fourth embodiment is the configuration of the light-receiving unit pixel 533 . The front view of the integrated device according to the fifth embodiment is the same as the front view (FIG. 21) of the integrated device 410 according to the fourth embodiment, so illustration is omitted.
 各受光単位ピクセル533は、互いに分離された複数の受光単位サブピクセル531を有する。例えば、隣接する受光単位サブピクセル531の間に、Z方向に延びる仕切り535を設けることにより、複数の受光単位サブピクセル531が互いに分離される。受光単位サブピクセル531は、各受光単位ピクセル533が2×2ピクセル(4ピクセル)以上に分割されることにより構成される。 Each light-receiving unit pixel 533 has a plurality of light-receiving unit sub-pixels 531 separated from each other. For example, a plurality of light receiving unit sub-pixels 531 are separated from each other by providing a partition 535 extending in the Z direction between adjacent light receiving unit sub-pixels 531 . The light-receiving unit sub-pixel 531 is configured by dividing each light-receiving unit pixel 533 into 2×2 pixels (4 pixels) or more.
The plurality of light-receiving unit sub-pixels 531A included in some light-receiving unit pixels 533A and the plurality of light-receiving unit sub-pixels 531B included in other light-receiving unit pixels 533B are offset from each other. For example, they are randomly offset by 1/2 pixel. In this configuration the pseudo-resolution is 160 × 160 pixels.
With the integrated device configured as described above, the amount of light is, for example, +7.9 dB brighter than with four 1 mm eye-sensing light projection units and one 3 mm eye-sensing light receiving unit, so eye sensing can be performed more accurately. By randomly offsetting each light-receiving unit pixel 533, the pseudo-resolution can be increased while the loss of eye-sensing light is kept to a minimum. Since a sufficient amount of light is obtained, the computation remains simple and the device can also be used in a triple-pass system.
 VI.第6の実施形態  VI. Sixth embodiment
 図25は、第6の実施形態に係る一体型デバイスを模式的に示す正面図である。 FIG. 25 is a front view schematically showing an integrated device according to the sixth embodiment.
The integrated device 610 according to the sixth embodiment differs from the integrated device 110 according to the first embodiment in the number and positions of the eye-sensing light projection units 650, in that the eye-sensing light receiving unit has a plurality of light-receiving unit sub-pixels separated from each other, in the configuration of the image light projection unit 640, and in that the eye-sensing light receiving unit 630 is arranged within the area of the image light projection unit 640.
 アイセンシング光投光部650は、画像光投光部640の周囲に6個設けられる。6個のアイセンシング光投光部650は、円形に配置されるプルキンエ像を取得するため、略円周状に配置される。アイセンシング光投光部650のスペックは、第1の実施形態に係るアイセンシング光投光部150のスペックと同様である。アイセンシング光投光部650を6個設けることで、6個のプルキンエ像を取得できるため、アイセンシングをより正確に行うことができる。 Six eye-sensing light projection units 650 are provided around the image light projection unit 640 . The six eye-sensing light projectors 650 are arranged in a substantially circular shape in order to obtain a circularly arranged Purkinje image. The specifications of the eye-sensing light projection unit 650 are the same as those of the eye-sensing light projection unit 150 according to the first embodiment. Since six Purkinje images can be acquired by providing six eye sensing light projecting units 650, eye sensing can be performed more accurately.
 図26は、画像光投光部の内側領域及び外側領域を模式的に示す正面図である。 FIG. 26 is a front view schematically showing the inner area and the outer area of the image light projecting section.
 画像光投光部640のサイズは、第1の実施形態の画像光投光部140のサイズと同様である。画像光投光部640は、内側領域640A及び外側領域640Bを有する。 The size of the image light projection section 640 is the same as the size of the image light projection section 140 of the first embodiment. The image light projector 640 has an inner region 640A and an outer region 640B.
 (A)に示す様に、画像光投光部640の内側領域640Aは、RGBのベイヤー配列で構成される。4K実現のため、例えば、ピクセルサイズは6.7μmである。 As shown in (A), the inner region 640A of the image light projection unit 640 is configured in an RGB Bayer array. For 4K implementation, for example, the pixel size is 6.7 μm.
As shown in (B), the pixel size in the outer region 640B of the image light projection unit 640 is larger than the pixel size in the inner region 640A. In other words, the outer region 640B has a lower resolution than the inner region 640A. For example, the per-side pixel count may be reduced to about 1/3 at a half angle of 20 deg, about 1/6 at a half angle of 30 deg, and about 1/12 at a half angle of 40 deg. At the outermost periphery of the outer region 640B, the pixel size may be, for example, 80 μm.
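As an illustration of how the per-side pixel density could fall off with the field half angle, the following tiny sketch encodes the example factors from the text; the thresholds and factors are examples only, not a prescribed design.

```python
# Minimal sketch (Python): example per-side pixel-density factor in the outer
# region as a function of the field half angle (values taken from the text).
def density_factor(half_angle_deg):
    if half_angle_deg < 20:
        return 1.0        # inner region: full resolution (Bayer, ~6.7 um pixels)
    if half_angle_deg < 30:
        return 1 / 3
    if half_angle_deg < 40:
        return 1 / 6
    return 1 / 12         # outermost periphery (~80 um pixels)

for angle in (10, 25, 35, 45):
    print(angle, density_factor(angle))
```

The factor 1/12 at the outermost periphery is consistent with the stated pixel sizes, since 6.7 μm × 12 ≈ 80 μm.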
The eye-sensing light receiving unit 630 is provided as pixels within the outer region 640B of the image light projection unit 640. Specifically, a plurality of light-receiving unit pixels 633 are scattered in the outer region 640B, and these light-receiving unit pixels 633 constitute the eye-sensing light receiving unit 630. Each light-receiving unit pixel 633 is divided into a plurality of light-receiving unit sub-pixels 131 (at a 2.5 μm pitch). For example, each light-receiving unit pixel 633 includes 32 × 32 light-receiving unit sub-pixels 131 (see FIG. 8).
As in FIG. 24, the plurality of light-receiving unit sub-pixels 131 included in some light-receiving unit pixels 633 and the plurality of light-receiving unit sub-pixels 131 included in other light-receiving unit pixels 633 may be offset from each other. For example, they may be randomly offset by 1/4 pixel. In this configuration the pseudo-resolution is 128 × 128 pixels. As in FIG. 24, the offset may also be 1/2 pixel.
According to this embodiment, providing the eye-sensing light receiving unit 630 as pixels within the outer region 640B of the image light projection unit 640 allows the eye-sensing light receiving unit 630 to be placed further inward, so it is less affected by the aberrations of the VR optical system and the image quality improves. In addition, by randomly offsetting each light-receiving unit pixel 633, the pseudo-resolution can be increased while the loss of eye-sensing light is kept to a minimum. Since a sufficient amount of light is obtained, the computation remains simple and the device can also be used in a triple-pass system.
 VII.第7の実施形態 VII. Seventh embodiment
 図27は、第7の実施形態に係る一体型デバイスを模式的に示す正面図である。 FIG. 27 is a front view schematically showing an integrated device according to the seventh embodiment.
The integrated device 710 according to the seventh embodiment differs from the integrated device 610 according to the sixth embodiment in that eye-sensing light projection units 750 are arranged, in addition to the eye-sensing light receiving units 730, within the area of the image light projection unit 740.
 図28は、画像光投光部の内側領域及び外側領域を模式的に示す正面図である。 FIG. 28 is a front view schematically showing the inner area and the outer area of the image light projecting section.
 画像光投光部740のサイズは、第1の実施形態の画像光投光部140のサイズと同様である。画像光投光部740は、内側領域740A及び外側領域740Bを有する。 The size of the image light projection section 740 is the same as the size of the image light projection section 140 of the first embodiment. The image light projector 740 has an inner region 740A and an outer region 740B.
 (A)に示す様に、画像光投光部740の内側領域740Aの構成は、第6の実施形態と同様である。内側領域740A及び外側領域740Bのピクセルサイズは、第6の実施形態と同様である。 As shown in (A), the configuration of the inner region 740A of the image light projecting section 740 is the same as that of the sixth embodiment. The pixel sizes of the inner area 740A and the outer area 740B are the same as in the sixth embodiment.
As shown in (B), the configuration of the eye-sensing light receiving unit 730 is the same as in the sixth embodiment. The eye-sensing light projection units 750 are provided as pixels within the outer region 740B of the image light projection unit 740. Specifically, a plurality of eye-sensing light projection units 750 are scattered in the outer region 740B. For the eye-sensing light projection units 750, IR-OLEDs may be implemented in place of IR-LEDs by selectively coating the OLED emitter materials for IR.
According to this embodiment, providing the eye-sensing light projection units 750 as pixels within the outer region 740B of the image light projection unit 740 enables even more accurate eye sensing by switching them on and off. In addition, providing the eye-sensing light receiving unit 730 as pixels within the outer region 740B of the image light projection unit 740 allows the eye-sensing light receiving unit 730 to be placed further inward, so it is less affected by the aberrations of the VR optical system and the image quality improves. Furthermore, by randomly offsetting each light-receiving unit pixel 733, the pseudo-resolution can be increased while the loss of eye-sensing light is kept to a minimum. Since a sufficient amount of light is obtained, the computation remains simple and the device can also be used in a triple-pass system.
 VIII.第8の実施形態  VIII. Eighth embodiment
 第8の実施形態に係る一体型デバイスと第4の実施形態に係る一体型デバイス410との差異は、アイセンシング光受光部の構成である。第8の実施形態に係る一体型デバイスの正面図は、第4の実施形態に係る一体型デバイス410の正面図(図21)と同様であるため、図示を省略する。第4の実施形態と同じ参照符号を用いて以下説明する。 The difference between the integrated device according to the eighth embodiment and the integrated device 410 according to the fourth embodiment is the configuration of the eye sensing light receiving section. The front view of the integrated device according to the eighth embodiment is the same as the front view (FIG. 21) of the integrated device 410 according to the fourth embodiment, so illustration is omitted. The same reference numerals as in the fourth embodiment are used for the following description.
 アイセンシング光受光部430のサイズは、例えば、12mm×3mmである。各アイセンシング光受光部430は、複数(例えば、2400×600個)の受光単位ピクセル433を含む。各受光単位ピクセル433のサイズは、例えば、5μmである。各受光単位ピクセル433は、互いに分離された複数(例えば、5分割)の受光単位サブピクセル431を有する。受光単位サブピクセル431のサイズは、例えば、1μmである。 The size of the eye sensing light receiving section 430 is, for example, 12 mm x 3 mm. Each eye sensing light receiver 430 includes a plurality of (eg, 2400×600) light receiving unit pixels 433 . The size of each light receiving unit pixel 433 is, for example, 5 μm. Each light-receiving unit pixel 433 has a plurality of (eg, 5-divided) light-receiving unit sub-pixels 431 separated from each other. The size of the light-receiving unit sub-pixel 431 is, for example, 1 μm.
 受光単位ピクセル133を覆う受光マイクロレンズ132は、例えば、f=16um、t=16umである。受光マイクロレンズ132は、最大角13.5degを2.5μmへ結像する。 The light-receiving microlens 132 covering the light-receiving unit pixel 133 has, for example, f=16 um and t=16 um. The light-receiving microlens 132 images the maximum angle of 13.5 degrees to 2.5 μm.
With the integrated device 410 configured as described above, the amount of light is, for example, +7.9 dB brighter than with four 1 mm eye-sensing light projection units and one 3 mm eye-sensing light receiving unit, so eye sensing can be performed more accurately. By randomly offsetting each light-receiving unit pixel, the pseudo-resolution can be increased while the loss of eye-sensing light is kept to a minimum. Obtaining a sufficient amount of light keeps the computation simple and also allows use in a triple-pass system. In terms of manufacturing, the microlens array of the light-receiving microlenses 132 can be fabricated in a wafer process integrated with the image sensor of the eye-sensing light receiving unit 430.
 IX.第9の実施形態 IX. Ninth embodiment
 図30は、第9の実施形態に係る一体型デバイスを模式的に示す正面図である。 FIG. 30 is a front view schematically showing an integrated device according to the ninth embodiment.
 第9の実施形態に係る一体型デバイス910と第4の実施形態に係る一体型デバイス410との差異は、異なるサイズの複数のアイセンシング光受光部930が設けられる点である。一般に像高が低いほうがMTF(Modulation Transfer Function)が良いため、画像光投光部の中心になるべく近くアイセンシング光受光部を配置できるように工夫するのが望ましい。またアイセンシング光受光部を数多く配置することで、直接の反射光を避けて精度を上げる効果が見込める。画像光投光部940及びアイセンシング光投光部950は第4の実施形態と同様である。 The difference between the integrated device 910 according to the ninth embodiment and the integrated device 410 according to the fourth embodiment is that a plurality of eye sensing light receiving sections 930 of different sizes are provided. Generally, the lower the image height, the better the MTF (Modulation Transfer Function), so it is desirable to arrange the eye sensing light receiving section as close to the center of the image light projecting section as possible. In addition, by arranging a large number of eye-sensing light receiving units, it is possible to avoid direct reflected light and improve accuracy. An image light projection unit 940 and an eye sensing light projection unit 950 are the same as those in the fourth embodiment.
 X.第10の実施形態  X. Tenth embodiment
 図31は、第10の実施形態に係る一体型デバイスを模式的に示す正面図である。 FIG. 31 is a front view schematically showing an integrated device according to the tenth embodiment.
 第10の実施形態に係る一体型デバイス1010と第4の実施形態に係る一体型デバイス410との差異は、アイセンシング光投光部1050の個数及び位置が異なる点である。直接の反射光を避けるため、アイセンシング光受光部1030を数多く配置する場合において、アイセンシング光投光部1050の間隔を変えることで精度を上げる。さらにアイセンシング光投光部1050(LED又はOLED)の形状を矩形ではなく非対称な図形とすることでアイセンシング性能を向上させる。画像光投光部1040は第4の実施形態と同様である。 The difference between the integrated device 1010 according to the tenth embodiment and the integrated device 410 according to the fourth embodiment is that the number and positions of the eye sensing light projecting units 1050 are different. In order to avoid direct reflected light, when a large number of eye-sensing light receiving units 1030 are arranged, the interval between eye-sensing light projecting units 1050 is changed to improve accuracy. Further, the eye sensing performance is improved by making the shape of the eye sensing light projecting portion 1050 (LED or OLED) not rectangular but asymmetrical. The image light projecting section 1040 is the same as in the fourth embodiment.
XI.第11の実施形態 XI. Eleventh embodiment
 図32は、第11の実施形態に係る一体型デバイスを模式的に示す正面図である。 FIG. 32 is a front view schematically showing an integrated device according to the eleventh embodiment.
 第11の実施形態に係る一体型デバイス1110と第10の実施形態に係る一体型デバイス1010との差異は、アイセンシング光投光部1150である。アイセンシング光投光部1150として、IR-OLED(LED)を画素駆動できるように配置しておき、状況に応じて任意のパターンを表示することでアイセンシング精度を上げる。さらにアイセンシング光受光部1130にイベントベースビジョンセンサを用いることで、IR-OLEDの発光タイミング制御によってイベントベースビジョンセンサの信号を得ることで、より低消費電力、より高精度なアイセンシングを行う。この方式は特にこの光学系と親和性が高い。VR光学系外にOLEDを置いたとき、プルキンエ像にパターンを出すには「方向」を制御しなければならないが、こうした制御は難しい。VR光学系内にOLEDを置いた場合には、位置情報から方向情報に焼き直される。画像光投光部1140は第4の実施形態と同様である。 The difference between the integrated device 1110 according to the eleventh embodiment and the integrated device 1010 according to the tenth embodiment is the eye sensing light projecting section 1150 . As the eye-sensing light projection unit 1150, an IR-OLED (LED) is arranged so as to be able to drive the pixels, and an arbitrary pattern is displayed according to the situation, thereby improving eye-sensing accuracy. Furthermore, by using an event-based vision sensor for the eye sensing light receiving unit 1130, the signal of the event-based vision sensor is obtained by controlling the light emission timing of the IR-OLED, thereby performing eye sensing with lower power consumption and higher accuracy. This system has a particularly high affinity with this optical system. When the OLED is placed outside the VR optical system, it is necessary to control the "direction" in order to produce a pattern in the Purkinje image, but such control is difficult. When the OLED is placed in the VR optical system, the positional information is rewritten into the directional information. The image light projector 1140 is the same as in the fourth embodiment.
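A rough sketch of how the IR-OLED emission timing could be used to gate an event-based vision sensor is shown below; the event format and the gating scheme are assumptions for illustration, not a description of a specific sensor API.

```python
# Minimal sketch (Python): gate event-camera output by the IR-OLED "on" windows
# so that only events synchronous with the eye-sensing pattern are accumulated.
# The event format (t_us, x, y, polarity) and the windowing scheme are assumptions.

def gate_events(events, on_windows):
    """events: iterable of (t_us, x, y, polarity); on_windows: list of
    (start_us, end_us) intervals during which the IR-OLED pattern is lit."""
    gated = []
    for t, x, y, pol in events:
        if any(start <= t < end for start, end in on_windows):
            gated.append((t, x, y, pol))
    return gated

# Illustrative use: 1 kHz pattern, 200 us on-time, gating a few synthetic events.
on_windows = [(k * 1000, k * 1000 + 200) for k in range(5)]
events = [(50, 10, 12, 1), (600, 11, 12, -1), (2100, 40, 8, 1)]
print(gate_events(events, on_windows))   # keeps the events at t = 50 and t = 2100
```

Rejecting events that fall outside the emission windows is one simple way in which synchronizing the sensor with the IR-OLED timing could suppress ambient light and reduce power consumption, as described above.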
 またアイセンシング光受光部として偏光センサを用いてもよい。偏光センサを用いることで、不要なノイズを低減し精度を上げることができる。 A polarization sensor may also be used as the eye sensing light receiving section. By using a polarization sensor, unnecessary noise can be reduced and accuracy can be improved.
 本開示は、以下の各構成を有してもよい。 The present disclosure may have the following configurations.
(1)
an optical system;
an image light projection unit that emits image light that is imaged on the retina through the optical system;
an eye-sensing light projector that emits eye-sensing light that is reflected by the pupil;
an eye-sensing light receiving unit that receives the eye-sensing light reflected by the pupil via the optical system;
a light-receiving microlens for forming an image of the eye-sensing light reflected by the pupil and incident through the optical system on the eye-sensing light-receiving unit;
an integrated device having
and
the light-receiving microlens is not included in the optical path of the image light and is not included in the optical path along which the eye sensing light reaches the pupil;
An image display device, wherein the image light projecting section and the entrance pupil of the light receiving microlens are arranged at the same position in the optical axis direction of the optical system.
(2)
The image display device according to (1) above,
The eye sensing light receiving unit has a cut filter that cuts the image light,
Alternatively, the image display device has the light-receiving microlens for cutting image light.
(3)
The image display device according to (1) or (2) above,
Light projected from the eye-sensing light projecting unit passes through the optical system and is reflected by a pupil, and the eye-sensing light projecting unit and the entrance pupils of the light-receiving microlenses are at the same position in the optical axis direction of the optical system. image display device.
(4)
The image display device according to any one of (1) to (3) above,
The image display device, wherein the eye-sensing light projector has a light-projecting microlens for increasing the amount of the eye-sensing light.
(5)
The image display device according to (4) above,
The image display device, wherein the projection microlens of the eye-sensing light projection unit is decentered so that the direction of the eye-sensing light coincides with the exit angle direction of the image light.
(6)
The image display device according to any one of (1) to (5) above,
A radial direction of the effective image circle from the center of the image light projecting unit is defined as an α direction,
The direction orthogonal to the α direction is the β direction,
Let D be the distance from the center of the image light projection unit to the center of the eye sensing light projection unit,
B is the pupil diameter of the image light projection unit,
Let the exit angles from D with respect to B be θα+_D, θα−_D, θβ+_D, θβ−_D and θα_Center_D, where θα_Center_D is the chief ray angle,
let the projection range of the eye-sensing light projection unit be CαPRJ in the α direction and CβPRJ in the β direction, and
let the light-receiving range angles of the eye-sensing light with respect to the light-receiving range C be ξα+_PRJ, ξα−_PRJ, ξβ+_PRJ and ξβ−_PRJ;
then the light-receiving microlens of the eye-sensing light projection unit is decentered so as to satisfy expressions (1-1'), (1-2'), (1-3') and (1-4'):
ξα+_PRJ − θα_Center_D ≥ (CαPRJ/B)·(θα+_D − θα_Center_D)   (1-1')
ξα−_PRJ − θα_Center_D ≤ (CαPRJ/B)·(θα−_D − θα_Center_D)   (1-2')
ξβ+_PRJ ≥ (Cβ_D/B)·θβ_D   (1-3')
ξβ−_PRJ ≤ (Cβ_D/B)·θβ_D   (1-4')
Image display device.
(7)
The image display device according to any one of (1) to (6) above,
The image display device, wherein the light-receiving microlens of the eye-sensing light receiving section is decentered so as to form an image of the eye-sensing light reflected by the pupil at the center of the eye-sensing light-receiving section.
(8)
The image display device according to any one of (1) to (7) above,
A radial direction of the effective image circle from the center of the image light projecting unit is defined as an α direction,
The direction orthogonal to the α direction is the β direction,
Let A be the distance from the center of the image light projecting unit to the center of the eye sensing light receiving unit,
B is the pupil diameter of the image light projection unit,
Let the exit angles from A with respect to B be θα+_A, θα−_A, θβ+_A, θβ−_A and θα_Center_A, where θα_Center_A is the chief ray angle,
let the light-receiving range of the eye-sensing light receiving unit be CαDET in the α direction and CβDET in the β direction, and
let the light-receiving range angles of the eye-sensing light with respect to the light-receiving range C be ξα+_DET, ξα−_DET, ξβ+_DET and ξβ−_DET;
then the light-receiving microlens of the eye-sensing light receiving unit is decentered so as to satisfy expressions (1-1), (1-2), (1-3) and (1-4):
ξα+_DET − θα_Center_A ≥ (CαDET/B)·(θα+_A − θα_Center_A)   (1-1)
ξα−_DET − θα_Center_A ≤ (CαDET/B)·(θα−_A − θα_Center_A)   (1-2)
ξβ+_DET ≥ (Cβ_A/B)·θβ_A   (1-3)
ξβ−_DET ≤ (Cβ_A/B)·θβ_A   (1-4)
Image display device.
(9)
The image display device according to any one of (1) to (8) above,
The image display device, wherein the image light projecting section is provided between the eye sensing light projecting section and the eye sensing light receiving section and the optical axis in a plane perpendicular to the optical axis of the optical system.
(10)
The image display device according to any one of (1) to (9) above,
the image light projection unit has a rectangular shape on a plane perpendicular to the optical axis of the optical system,
The image display device, wherein the eye sensing light receiving section is positioned within an effective image circle of the optical system on a straight line connecting the center of the long side of the image light projecting section and the optical axis of the optical system.
(11)
The image display device according to any one of (1) to (10) above,
The image display device, wherein the eye-sensing light receiving part includes a plurality of light-receiving unit sub-pixels separated from each other, the light-receiving microlenses are plural, and each microlens forms an image of the eye-sensing light on each light-receiving unit sub-pixel.
(12)
The image display device according to (11) above,
By providing a partition extending in the optical axis direction of the optical system between adjacent light-receiving unit sub-pixels, or by shielding the periphery of each light-receiving unit sub-pixel,
An image display device, wherein the plurality of light-receiving unit sub-pixels are separated from each other.
(13)
The image display device according to (11) or (12) above,
The plurality of light-receiving unit sub-pixels included in some of the light-receiving unit pixels and the plurality of light-receiving unit sub-pixels included in the other portion of the light-receiving unit pixels are offset from each other.
(14)
The image display device according to any one of (1) to (13) above,
The image light projection unit has an inner area and an outer area,
the pixel size of the outer region is larger than the pixel size of the inner region;
The image display device, wherein the eye sensing light receiving section is provided as a pixel in the outer region of the image light projecting section.
(15)
The image display device according to (14) above,
The image display device, wherein the eye-sensing light projecting section is provided as a pixel in the outer region of the image light projecting section.
(16)
An electronic device comprising the image display device according to any one of (1) to (15) above.
(17)
an image light projection unit that emits image light that is imaged on the retina through an optical system;
an eye-sensing light projector that emits eye-sensing light that is reflected by the pupil;
an eye-sensing light receiving unit that receives the eye-sensing light reflected by the pupil via the optical system;
a light-receiving microlens for forming an image of the eye-sensing light reflected by the pupil and incident through the optical system on the eye-sensing light-receiving unit;
and
the light-receiving microlens is not included in the optical path of the image light and is not included in the optical path along which the eye sensing light reaches the pupil;
An integrated device in which the image light projecting section and the entrance pupil of the light receiving microlens are arranged at the same position in the optical axis direction of the optical system.
 本技術の各実施形態及び各変形例について上に説明したが、本技術は上述の実施形態にのみ限定されるものではなく、本技術の要旨を逸脱しない範囲内において種々変更を加え得ることは勿論である。 Although the embodiments and modifications of the present technology have been described above, the present technology is not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present technology. Of course.
REFERENCE SIGNS LIST
100 image display device
110 integrated device
120 optical system
130 eye-sensing light receiving unit
131 light-receiving unit sub-pixel
132 light-receiving microlens
133 light-receiving unit pixel
134 cut filter
135 partition
140 image light projection unit
141 light-emitting cell
142 lens
143 cover glass
144 color filter
145 light-emitting layer
150 eye-sensing light projection unit
151 LED
152 projection microlens

Claims (17)

  1.  An image display device comprising:
     an optical system; and
     an integrated device including
       an image light projecting section that emits image light to be imaged on a retina through the optical system,
       an eye-sensing light projecting section that emits eye-sensing light to be reflected by a pupil,
       an eye-sensing light receiving section that receives, via the optical system, the eye-sensing light reflected by the pupil, and
       a light-receiving microlens that images, on the eye-sensing light receiving section, the eye-sensing light reflected by the pupil and incident through the optical system,
     wherein the light-receiving microlens is included neither in an optical path of the image light nor in an optical path along which the eye-sensing light reaches the pupil, and
     the image light projecting section and an entrance pupil of the light-receiving microlens are arranged at the same position in an optical axis direction of the optical system.
  2.  The image display device according to claim 1, wherein
     the eye-sensing light receiving section has a cut filter that cuts the image light, or has the light-receiving microlens configured to cut the image light.
  3.  The image display device according to claim 1, wherein
     light projected from the eye-sensing light projecting section passes through the optical system and is reflected by the pupil, and the eye-sensing light projecting section and the entrance pupil of the light-receiving microlens are arranged at the same position in the optical axis direction of the optical system.
  4.  The image display device according to claim 1, wherein
     the eye-sensing light projecting section has a light-projecting microlens for increasing the amount of the eye-sensing light.
  5.  The image display device according to claim 4, wherein
     the light-projecting microlens of the eye-sensing light projecting section is decentered so that the direction of the eye-sensing light coincides with the exit angle direction of the image light.
  6.  The image display device according to claim 1, wherein,
     where a radial direction of an effective image circle from the center of the image light projecting section is defined as an α direction,
     a direction orthogonal to the α direction is defined as a β direction,
     a distance from the center of the image light projecting section to the center of the eye-sensing light projecting section is D,
     a pupil diameter of the image light projecting section is B,
     exit angles from D with respect to B are θα+_D, θα-_D, θβ+_D, θβ-_D, and θα_Center_D, θα_Center_D being a chief ray angle,
     a projection range of the eye-sensing light projecting section is CαPRJ in the α direction and CβPRJ in the β direction, and
     light receiving range angles of the eye-sensing light with respect to a light receiving range C are ξα+_PRJ, ξα-_PRJ, ξβ+_PRJ, and ξβ-_PRJ,
     the light-receiving microlens of the eye-sensing light projecting section is decentered so as to satisfy expressions (1-1'), (1-2'), (1-3'), and (1-4'):
     ξα+_PRJ - θα_Center_D ≧ CαPRJ/B・(θα+_D - θα_Center_D)   (1-1')
     ξα-_PRJ - θα_Center_D ≦ CαPRJ/B・(θα-_D - θα_Center_D)   (1-2')
     ξβ+_PRJ ≧ Cβ_D/B・θβ_D   (1-3')
     ξβ-_PRJ ≦ Cβ_D/B・θβ_D   (1-4')
  7.  The image display device according to claim 1, wherein
     the light-receiving microlens of the eye-sensing light receiving section is decentered so as to image the eye-sensing light reflected by the pupil at the center of the eye-sensing light receiving section.
  8.  The image display device according to claim 1, wherein,
     where a radial direction of an effective image circle from the center of the image light projecting section is defined as an α direction,
     a direction orthogonal to the α direction is defined as a β direction,
     a distance from the center of the image light projecting section to the center of the eye-sensing light receiving section is A,
     a pupil diameter of the image light projecting section is B,
     exit angles from A with respect to B are θα+_A, θα-_A, θβ+_A, θβ-_A, and θα_Center_A, θα_Center_A being a chief ray angle,
     a light receiving range of the eye-sensing light receiving section is CαDET in the α direction and CβDET in the β direction, and
     light receiving range angles of the eye-sensing light with respect to the light receiving range C are ξα+_DET, ξα-_DET, ξβ+_DET, and ξβ-_DET,
     the light-receiving microlens of the eye-sensing light receiving section is decentered so as to satisfy expressions (1-1), (1-2), (1-3), and (1-4):
     ξα+_DET - θα_Center_A ≧ CαDET/B・(θα+_A - θα_Center_A)   (1-1)
     ξα-_DET - θα_Center_A ≦ CαDET/B・(θα-_A - θα_Center_A)   (1-2)
     ξβ+_DET ≧ Cβ_A/B・θβ_A   (1-3)
     ξβ-_DET ≦ Cβ_A/B・θβ_A   (1-4)
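The following sketch is an editor's illustration, not part of the claims: it evaluates the four decentering conditions of claim 8 for the eye-sensing light receiving section; substituting the projection-side quantities of claim 6 (D, CαPRJ, and the ξ..._PRJ angles) yields the primed conditions (1-1') to (1-4') in the same way. The parameter names and the numeric values in the example are assumptions chosen only to exercise the check.

```python
def decentering_ok(xi_a_plus, xi_a_minus, xi_b_plus, xi_b_minus,
                   theta_a_plus, theta_a_minus, theta_center, theta_b,
                   c_alpha, c_beta, b):
    """Evaluate expressions (1-1) to (1-4) of claim 8.

    Illustrative mapping: xi_* -> ξ..._DET, theta_a_* -> θα±_A,
    theta_center -> θα_Center_A, theta_b -> θβ_A, c_alpha -> CαDET,
    c_beta -> Cβ_A, b -> B (pupil diameter of the image light
    projecting section)."""
    ok1 = xi_a_plus  - theta_center >= c_alpha / b * (theta_a_plus  - theta_center)  # (1-1)
    ok2 = xi_a_minus - theta_center <= c_alpha / b * (theta_a_minus - theta_center)  # (1-2)
    ok3 = xi_b_plus  >= c_beta / b * theta_b                                         # (1-3)
    ok4 = xi_b_minus <= c_beta / b * theta_b                                         # (1-4)
    return ok1 and ok2 and ok3 and ok4

# Hypothetical angles (degrees) and lengths (mm), for illustration only.
print(decentering_ok(xi_a_plus=12.0, xi_a_minus=-2.0, xi_b_plus=6.0, xi_b_minus=-6.0,
                     theta_a_plus=20.0, theta_a_minus=-10.0, theta_center=5.0,
                     theta_b=15.0, c_alpha=1.0, c_beta=1.0, b=3.0))  # True
```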
  9.  The image display device according to claim 1, wherein,
     in a plane perpendicular to the optical axis of the optical system, the image light projecting section is provided between the optical axis and each of the eye-sensing light projecting section and the eye-sensing light receiving section.
  10.  The image display device according to claim 1, wherein
     the image light projecting section has a rectangular shape in a plane perpendicular to the optical axis of the optical system, and
     the eye-sensing light receiving section is located within an effective image circle of the optical system, on a straight line connecting the center of a long side of the image light projecting section and the optical axis of the optical system.
  11.  The image display device according to claim 1, wherein
     the eye-sensing light receiving section includes a plurality of light-receiving unit sub-pixels separated from each other, a plurality of the light-receiving microlenses are provided, and each light-receiving microlens images the eye-sensing light on a corresponding light-receiving unit sub-pixel.
  12.  The image display device according to claim 11, wherein
     the plurality of light-receiving unit sub-pixels are separated from each other by providing a partition extending in the optical axis direction of the optical system between adjacent light-receiving unit sub-pixels, or by shielding the periphery of each light-receiving unit sub-pixel.
  13.  The image display device according to claim 11, wherein
     the plurality of light-receiving unit sub-pixels included in some of the light-receiving unit pixels and the plurality of light-receiving unit sub-pixels included in other light-receiving unit pixels are offset from each other.
  14.  The image display device according to claim 1, wherein
     the image light projecting section has an inner region and an outer region,
     a pixel size of the outer region is larger than a pixel size of the inner region, and
     the eye-sensing light receiving section is provided as a pixel in the outer region of the image light projecting section.
  15.  The image display device according to claim 14, wherein
     the eye-sensing light projecting section is provided as a pixel in the outer region of the image light projecting section.
  16.  An electronic device comprising the image display device according to claim 1.
  17.  An integrated device comprising:
     an image light projecting section that emits image light to be imaged on a retina through an optical system;
     an eye-sensing light projecting section that emits eye-sensing light to be reflected by a pupil;
     an eye-sensing light receiving section that receives, via the optical system, the eye-sensing light reflected by the pupil; and
     a light-receiving microlens that images, on the eye-sensing light receiving section, the eye-sensing light reflected by the pupil and incident through the optical system,
     wherein the light-receiving microlens is included neither in an optical path of the image light nor in an optical path along which the eye-sensing light reaches the pupil, and
     the image light projecting section and an entrance pupil of the light-receiving microlens are arranged at the same position in the optical axis direction of the optical system.
PCT/JP2022/033700 2021-10-29 2022-09-08 Image display apparatus, electronic appliance, and integrated type device WO2023074138A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-177051 2021-10-29
JP2021177051 2021-10-29

Publications (1)

Publication Number Publication Date
WO2023074138A1 true WO2023074138A1 (en) 2023-05-04

Family

ID=86157694

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/033700 WO2023074138A1 (en) 2021-10-29 2022-09-08 Image display apparatus, electronic appliance, and integrated type device

Country Status (1)

Country Link
WO (1) WO2023074138A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010072188A (en) * 2008-09-17 2010-04-02 Pioneer Electronic Corp Display device
WO2018181144A1 (en) * 2017-03-31 2018-10-04 シャープ株式会社 Head-mounted display
US20200103655A1 (en) * 2018-09-24 2020-04-02 Commissariat A L'energie Atomique Et Aux Energies Alternatives System for virtual reality or augmented reality having an eye sensor and method thereof


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22886472

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023556165

Country of ref document: JP

Kind code of ref document: A