WO2019173997A1 - Near eye display and associated method - Google Patents

Near eye display and associated method Download PDF

Info

Publication number
WO2019173997A1
Authority
WO
WIPO (PCT)
Prior art keywords
spatial light
light modulator
lens
focus
display
Prior art date
Application number
PCT/CN2018/079098
Other languages
French (fr)
Inventor
Zhen Xiao
Original Assignee
Nokia Technologies Oy
Nokia Technologies (Beijing) Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Technologies Oy, Nokia Technologies (Beijing) Co., Ltd. filed Critical Nokia Technologies Oy
Priority to PCT/CN2018/079098 (WO2019173997A1)
Publication of WO2019173997A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00 Simple or compound lenses
    • G02B3/10 Bifocal lenses; Multifocal lenses
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0185 Displaying image at variable distance
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00 Simple or compound lenses
    • G02B3/02 Simple or compound lenses with non-spherical faces
    • G02B3/08 Simple or compound lenses with non-spherical faces with discontinuous faces, e.g. Fresnel lens

Definitions

  • Embodiments of the disclosure generally relate to display technologies, and more particularly, to a near eye display and an associated method, apparatus and computer program product.
  • a virtual reality, or “VR”, scenario typically involves presentation of digital or virtual image information without transparency to other actual real-world visual input.
  • at present, virtual reality has been developed as a technology that is now feasible at low cost via inexpensive devices such as mobile phone components.
  • advances in high-resolution micro displays and modern GPUs facilitate extremely immersive experiences.
  • however, the convergence-accommodation conflict inherent to stereoscopic displays, especially VR displays, will have to be solved.
  • a near eye display comprises: a lens having more than one focal length, each focal length corresponding to at least one focus area; a display located at one of the focal lengths of the lens; a spatial light modulator located adjacent to the lens and configured to modulate light coming from the display to allow the light to pass through a first portion of the spatial light modulator, and block the light from passing through the remaining portion of the spatial light modulator; an eye determining unit configured to determine a viewpoint position of the user’s eyes; and a controller configured to control the spatial light modulator to modulate the light such that the first portion corresponds to at least one of the focus areas associated with the viewpoint position.
  • the spatial light modulator comprises a plurality of pixels. Each focus area of the lens is overlaid by at least one pixel. The first portion comprises the pixels overlaying the at least one of the focus areas. The controller is configured to control the pixels overlaying the at least one of the focus areas to be transparent, and control the pixels overlaying the remaining focus areas to be opaque.
  • the eye determining unit comprises an eye tracking unit, and the viewpoint position of user’s eyes is determined by means of the eye tracking unit.
  • the eye tracking unit comprises an imaging device.
  • the eye determining unit is integrated into the controller, and the viewpoint position of the user’s eyes is pre-estimated based on the content of an image to be displayed.
  • the spatial light modulator is located between the display and the lens, or located on a side of the lens away from the display.
  • the spatial light modulator comprises one of a liquid crystal display, a MEMS (Micro-Electro-Mechanical System) shutter, and a DLP (Digital Light Processing) DMD (Digital Micromirror Device) array.
  • the lens comprises a refractive multifocal lens or a diffractive multifocal lens.
  • the focus areas corresponding to different focal lengths have different curvatures or are made of materials with different refractive indices.
  • each focus area has a shape of a circle, a ring, a sector or a polygon.
  • the display is one of a mobile phone, a laptop, and a desktop computer.
  • a method implemented by a near eye display comprises: determining a viewpoint position of the user’s eyes; and controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of the focus areas of a lens associated with the viewpoint position, and blocks the light from passing through the remaining portion of the spatial light modulator.
  • controlling the spatial light modulator to modulate the light comprises: determining a focal length of the lens based on the viewpoint position of the user’s eyes; selecting the at least one of the focus areas of the lens based on the determined focal length; and controlling the spatial light modulator to modulate the light based on the at least one of the focus areas.
  • the spatial light modulator modulates the light by controlling pixels of the spatial light modulator overlaying the at least one of the focus areas to be transparent, and controlling the pixels of the spatial light modulator overlaying the remaining focus areas to be opaque.
  • the viewpoint position of user’s eyes is determined by means of an eye tracking unit.
  • the eye tracking unit comprises an imaging device.
  • the viewpoint position of the user’s eyes is pre-estimated based on the content of an image to be displayed.
  • the spatial light modulator is located between the display and the lens, or located on a side of the lens away from the display.
  • the spatial light modulator comprises one of a liquid crystal display, a MEMS shutter, and a DLP DMD array.
  • the lens comprises a refractive lens or a diffractive lens.
  • the focus areas corresponding to different focal lengths have different curvatures or are made of materials with different refractive indices.
  • each focus area has a shape of a circle, a ring, a sector or a polygon.
  • the display is one of a mobile phone, a laptop, and a desktop computer.
  • an apparatus comprising at least one processor; and at least one memory including computer-executable code.
  • the at least one memory and the computer-executable code are configured to, with the at least one processor, cause the apparatus to perform: determining a viewpoint position of both eyes; and controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of the focus areas of a lens associated with the viewpoint position of the user’s eyes, and blocks the light from passing through the remaining portion of the spatial light modulator.
  • an apparatus comprising means for determining a viewpoint position of both eyes; and means for controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of the focus areas of a lens associated with the viewpoint position of the user’s eyes, and blocks the light from passing through the remaining portion of the spatial light modulator.
  • the means for controlling a spatial light modulator to modulate light comprises means for determining a focal length of the lens based on the viewpoint position of the user’s eyes; means for selecting the at least one of the focus areas of the lens based on the determined focal length; and means for controlling the spatial light modulator to modulate the light based on the at least one of the focus areas.
  • a computer program product comprising at least one non-transitory computer-readable storage medium having computer-executable program instructions stored therein.
  • the computer-executable instructions are configured to, when executed, cause an apparatus to perform: determining a viewpoint position of both eyes; and controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of the focus areas of a lens associated with the viewpoint position of the user’s eyes, and blocks the light from passing through the remaining portion of the spatial light modulator.
  • Fig. 1 is a schematic block diagram of a near eye display according to an example embodiment of the present disclosure
  • Fig. 2 is a simplified schematic view of a near eye display according to an example embodiment of the present disclosure
  • Figs. 3A to 3B are a front view (left) and a side view (right) of a refractive multifocal lens in one or more example embodiments of the present disclosure, respectively;
  • Figs. 4A to 4B are a front view (left) and a side view (right) of a diffractive multifocal lens in one or more example embodiments of the present disclosure, respectively;
  • Fig. 5 is a schematic view of the modulation of light by the spatial light modulator according to an example embodiment of the present disclosure;
  • Fig. 6 is a schematic view depicting the distance from the viewpoint to the eyes in one or more example embodiments of the present disclosure;
  • Fig. 7 is an exploded schematic view of an example near eye display according to an embodiment of the present disclosure.
  • Fig. 8 is a flow chart of a method implemented by a near eye display according to an embodiment of the present disclosure;
  • Fig. 9 is a flow chart of an example method implemented by a near eye display according to an example embodiment of the present disclosure.
  • Fig. 10 is a simplified block diagram of an apparatus suitable for use in some embodiments of the present disclosure.
  • stereoscopic displays, one kind of VR technology, provide binocular disparity that supports convergence on any point but only allows the viewer to accommodate on the display surface, and thus suffer from the accommodation-convergence conflict. These displays often decouple the accommodation cue from the convergence cue and tie the accommodation cue to a fixed distance, which easily causes visual fatigue, dizziness, and other discomforts.
  • one solution for the convergence-accommodation conflict is a near eye light field display.
  • in the near eye light field display, two stacked transparent liquid crystal displays modulate the uniform backlight in a multiplicative fashion. In this way, when observed through a pair of lenses, the display provides focus cues in addition to the binocular disparity afforded by VR displays.
  • the computational complexity of such a near eye light field display is very high, reaching O(N²) to O(N³).
  • huge computing power is therefore needed.
  • Another solution is imaging with multifocal lenses. However, the focal points of the multifocal lenses interfere with each other. For example, when seeing in the near field, the light from the far focus is out of focus, resulting in a blurred image.
  • Fig. 1 is a schematic block diagram of a near eye display according to an example embodiment of the present disclosure.
  • Fig. 2 is a simplified schematic view of a near eye display according to an example embodiment of the present disclosure.
  • the near eye display may include a lens 102, a display 104, a spatial light modulator 106, an eye determining unit 108, and a controller 110 (not shown in Fig. 2) .
  • the near eye display may be implemented in the front of both eyes.
  • the structure arrangement of the near eye display is described in detail with respect to only one eye in the description and/or drawings for the purpose of simplicity. It shall be understood that the same structure arrangement is also suitable for the other eye.
  • the lens 102 may have more than one focal length (e.g., two, three, or even more focal lengths), each focal length corresponding to at least one focus area.
  • the lens 102 may also be called a multifocal lens.
  • the term focus area refers to an area on the lens which corresponds to a particular focal length.
  • the position and size of each focus area may be fixed, or may vary with a signal (for example, a voltage signal, a current signal or even a mechanical signal) applied to the lens.
  • the focus areas corresponding to different focal lengths may have different curvatures or may be made of materials with different refractive indices.
  • the focus areas may have any shape suitable for the present disclosure, such as a circle, a ring, a sector, a polygon, or a variety of other shapes.
  • the lens 102 may be any kind of multifocal lens having more than one focal length, for example, a refractive multifocal lens or a diffractive multifocal lens.
  • Figs. 3A to 3B are a front view (left) and a side view (right) of a refractive multifocal lens in one or more example embodiments of the present disclosure, respectively.
  • the refractive multifocal lens may be, for example, a multifocal mosaic lens which has a plurality of hexagonal focus areas with at least three different focal lengths. At least one of the focus areas to be used may be selected by shielding the other focus areas.
  • the focus areas to be used may be selected according to the viewpoint or view axis of a user, and may be switched as the viewpoint or view axis of a user changes.
  • Figs. 4A to 4B are a front view (left) and a side view (right) of a diffractive multifocal lens in one or more example embodiments of the present disclosure, respectively.
  • the diffractive multifocal lens may comprise many little wedges on its surface. Light diffracted by each wedge may be focused on at least two focus points, thus forming at least two different focal lengths, i.e., a near focus and a far focus.
  • the near focus or the far focus to be used may be selected by controlling the amount of light focused on the near focus or the far focus. For example, if the amount of light focused on the near focus is controlled to be much larger than that focused on the far focus, the near focus is selected, and vice versa.
  • the lens 102 may be a variable focus lens.
  • the variable focus lens may be a refractive element, such as a liquid crystal lens, an electro-active lens, a conventional refractive lens with moving elements, a mechanical-deformation-based lens (such as a fluid-filled membrane lens, or a lens akin to the human crystalline lens wherein a flexible element is flexed and relaxed by actuators), an electro-wetting lens, or a plurality of fluids with different refractive indices.
  • the variable focus lens may also comprise a switchable diffractive optical element (such as one featuring a polymer-dispersed liquid crystal approach wherein a host medium, such as a polymeric material, has microdroplets of liquid crystal dispersed within the material; when a voltage is applied, the molecules reorient so that their refractive indices no longer match that of the host medium, thereby creating a high-diffraction pattern).
  • the display 104 may be located at one of the focal lengths of the lens.
  • the display may be positioned at the nearest, the middle or the farthest focus point of the lens.
  • the display 104 may be provided for responding to at least one image/video signal, for presentation of information for visual reception.
  • the at least one image/video signal herein may be divided into a left eye image/video signal entering the left eye of a user and a right eye image/video signal entering the right eye of the user, and the left eye image/video signal and the right eye image/video signal may be synthesized by the human brain to generate a stereoscopic image, which is perceived by the user.
  • the display 104 may be any device capable of presenting visual information for visual reception, including, but not limited to, a mobile phone, a laptop, or a desktop computer.
  • the display may present visual information by emitting light (e.g., one or more of liquid crystal display (LCD) , electroluminescence, photoluminescence, incandescence, and cathodoluminescence) .
  • the spatial light modulator 106 may be located adjacent to the lens and may be configured to modulate light coming from the display 104 to allow the light to pass through a first portion of the spatial light modulator 106, and block the light from passing through the remaining portion of the spatial light modulator 106.
  • the spatial light modulator 106 may be arranged between the display 104 and the lens 102.
  • the spatial light modulator 106 may be arranged on a side of the lens 102 away from the display 104. It should be understood that the spatial light modulator 106 may be arranged at other positions with respect to the lens. For example, in the case that there are two lenses arranged in parallel for each eye, the spatial light modulator may be arranged between the two lenses.
  • the spatial light modulator 106 may comprise a plurality of pixels. Each focus area of the lens is overlaid by at least one pixel.
  • the spatial light modulator 106 may modulate the light coming from the display 104 by controlling whether the light passes through pixels overlaying the focus areas. Thereby, at least one of the focus areas of the lens may be selected to be used in the near eye display, and the selected focus areas may also be switched in time as the viewpoint of the user’s eye changes. As an example, by switching pixels of the spatial light modulator on or off, pixels may be controlled to be transparent or opaque such that the focus areas overlaid by the transparent pixels are selected to be used.
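The on/off pixel switching described above amounts to computing a transparent/opaque mask over the modulator. The following is a minimal sketch of that idea, assuming a hypothetical layout in which each SLM pixel is tagged with the focal length of the focus area it overlays; the function and data names are illustrative, not part of the disclosure.

```python
# Sketch: compute a transparent/opaque (on/off) mask for the spatial light
# modulator. Each entry of pixel_focal_lengths gives the focal length of the
# lens focus area that the corresponding SLM pixel overlays.
# This flat-list layout is a hypothetical illustration, not the patented design.

def slm_mask(pixel_focal_lengths, selected_focal_length):
    """Return True (transparent/open) for pixels overlaying focus areas of the
    selected focal length, and False (opaque/closed) for all other pixels."""
    return [f == selected_focal_length for f in pixel_focal_lengths]

# Example: a 6-pixel SLM overlaying focus areas with hypothetical focal lengths
# f1 = 35, f2 = 50, f3 = 70 (mm); selecting f1 opens only its pixels.
mask = slm_mask([35, 50, 70, 35, 50, 70], 35)
print(mask)  # [True, False, False, True, False, False]
```

Switching the viewpoint then reduces to recomputing this mask with a new selected focal length, which is cheap even for a high pixel count.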
  • Fig. 5 is a schematic view of the modulation of light by the spatial light modulator according to an example embodiment of the present disclosure.
  • the lens 102 has three focal lengths (indicated by f1, f2, and f3, each corresponding to multiple focus areas indicated by 1, 2 and 3), and each focus area of the lens 102 is overlaid by one or more pixels of the spatial light modulator 106.
  • the pixels overlaying the first focus areas 1 (corresponding to the focal length f1) are controlled to be transparent (opened), so the first focus areas corresponding to the opened pixels are selected to be used in the near eye display at present.
  • the pixels overlaying the other focus areas 2, 3 (corresponding to the focal lengths f2 and f3) are controlled to be opaque (closed), so the other focus areas corresponding to the closed pixels are not selected at present.
  • the spatial light modulator 106 for example may be a liquid crystal display, a MEMS shutter, a DLP DMD array or any other kind of modulating device which may be independently controlled to block or transmit the light.
  • the eye determining unit 108 may be configured to determine a viewpoint position of user’s eyes.
  • the viewpoint position in embodiments of the disclosure may be an intersection of the visual axes of the two eyes. It shall be appreciated that the eye determining unit 108 described herein may be implemented by hardware, software, or a combination of hardware and software to implement the functionality of the eye determining unit 108, i.e., the determination of the viewpoint position.
  • the eye determining unit 108 may comprise an eye tracking unit.
  • the eye tracking unit may comprise an imaging device (for example, a camera such as a CCD camera or a CMOS camera) oriented toward an eye of the user for imaging the eye directly.
  • the eye tracking unit may comprise an imaging device (for example, a camera such as a CCD camera or a CMOS camera) and a mirror for reflecting light coming from the eye to the imaging device such that the imaging device may image the eye indirectly.
  • the viewpoint position of user’s eyes may be determined based on the images of the eyes acquired by the imaging device, such as by extracting eye features from the images of the eyes.
  • the eye determining unit 108 (specifically, the eye tracking unit) may be placed before or after the lens 102 and the spatial light modulator 106, or near them.
  • this is not intended to limit the scope of the present disclosure to these specific positions. Those skilled in the art may rather easily recognize how to adapt the related arrangements or conditions when employing a different position of the eye determining unit 108.
  • the eye determining unit 108 may be integrated into a controller, for example, in the form of a hardware circuit or in the form of a software module.
  • the controller may comprise a processor and a memory, wherein the processor executes a program stored in the memory to implement the functionality of the eye determining unit 108.
  • the viewpoint position of the user’s eyes may be pre-estimated based on the content of an image to be displayed.
  • the program stored in the memory may analyze the image content to be displayed, determine from the image content some objects most likely to attract the user’s eyes, and then pre-estimate some possible points on those objects at which the user’s eyes are most likely to gaze. These extracted possible points may act as possible viewpoints, the positions of which may be linked with some objects in the image to be displayed and stored in the memory, for determining the focus areas associated with the viewpoint of the user’s eyes.
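A minimal sketch of the lookup step described above, assuming the memory already stores pre-estimated viewpoint positions keyed by salient objects of the image to be displayed. The object names and coordinates below are hypothetical placeholders, not values from the disclosure.

```python
# Sketch: pre-estimated viewpoints stored per salient object of the image to
# be displayed. Object names and 3D positions are hypothetical placeholders.
PREESTIMATED_VIEWPOINTS = {
    "face":     (0.0, 0.1, 0.5),   # (x, y, z) in metres; z = distance to eyes
    "signpost": (0.3, 0.0, 2.0),
}

def preestimate_viewpoint(salient_object, default=(0.0, 0.0, 1.0)):
    """Return the stored viewpoint position linked to the most salient object
    of the image content, or a default viewpoint if the object is unknown."""
    return PREESTIMATED_VIEWPOINTS.get(salient_object, default)
```

How the salient objects themselves are extracted is left open by the disclosure ("any existing method may be used"), so the table here stands in for whatever analysis produces them.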
  • any existing method may be used for this purpose, and it is not a limitation to the present disclosure.
  • the focal length of the lens 102 closest to f may be determined as the focal length associated with the viewpoint of the user’s eyes.
  • with the spatial light modulator, only the focus areas corresponding to the determined focal length are opened, as described above. As the viewpoint of the user’s eyes changes, the opened focus areas may be switched in time.
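The choice of the focal length closest to the required value f can be sketched as follows. The particular focal lengths are made-up example values, and the disclosure's formula (1) for deriving f from the viewpoint distance is not reproduced here.

```python
def closest_focal_length(f, available_focal_lengths):
    """Return the focal length of the multifocal lens closest to the required
    value f derived from the viewpoint position (via the disclosure's
    formula (1), not reproduced here)."""
    return min(available_focal_lengths, key=lambda fl: abs(fl - f))

# Example with three hypothetical focal lengths (in mm):
print(closest_focal_length(42.0, [35.0, 50.0, 70.0]))  # 35.0
```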
  • the controller 110 may be configured to control the spatial light modulator to modulate the light such that the first portion of the spatial light modulator through which the light is allowed to pass corresponds to the at least one of the focus areas associated with the viewpoint position of user’s eyes.
  • the controller 110 may be connected to the eye determining unit 108, for receiving information about the viewpoint of the user’s eyes, and connected to the spatial light modulator 106, for sending to the spatial light modulator 106 modulation information based on the viewpoint of the user’s eyes such that the spatial light modulator 106 modulates light coming from the display 104 to allow the light to pass through the first portion corresponding to the at least one of the focus areas of the lens 102 associated with the viewpoint position of the user’s eyes, and block the light from passing through the remaining portion of the spatial light modulator 106.
  • the connection in the embodiments of the present disclosure may be wired or wireless, and the connected components may even be integrated into other components.
  • the first portion of the spatial light modulator 106 comprises the pixels overlaying the at least one of the focus areas
  • the controller 110 may be configured to control the pixels overlaying the at least one of the focus areas to be transparent (for example, by switching ON (opening) the pixels overlaying the at least one of the focus areas) such that the light passes through the first portion, and control the pixels overlaying the remaining focus areas to be opaque (for example, by switching OFF (closing) the pixels overlaying the remaining focus areas) .
  • the controller 110 may be implemented as: (a) a hardware-only circuit, such as one implemented in only analog and/or digital circuitry; or (b) a combination of circuits and software (and/or firmware), such as a combination of digital signal processor(s), software, and memory(ies) that work together to cause various functions to be performed.
  • the controller 110 may be a separate component or may be integrated into other components (such as the display 104 and the spatial light modulator 106) as a part thereof.
  • Fig. 7 is an exploded schematic view of an example near eye display according to an embodiment of the present disclosure. It should be noted that the embodiment as illustrated in Fig. 7 will be described with respect to two eyes of a user, that is, the near eye display may be implemented in the front of both eyes.
  • the near eye display may comprise two lenses 102 each with more than one focal length (for example, two, three or more focal lengths), each focal length corresponding to a plurality of focus areas; a display 104 located at one of the focal lengths of the lenses (i.e., on one of the focal planes of the lenses); a spatial light modulator 106 located between the lenses 102 and the display 104; an eye determining unit 108 for determining a viewpoint position of the user’s eyes; an optional housing 112 for holding the two lenses 102; and a controller (not shown in Fig. 7) which may be connected to or integrated into one or more of the display 104, the eye determining unit 108 and the spatial light modulator 106.
  • the eye determining unit 108 may comprise two cameras 1081 located on the housing 112 for acquiring images of the user’s eyes; and two optional mirrors 1082 for reflecting light coming from the user’s eyes to the two cameras 1081.
  • the viewpoint of the user’s eyes may be determined based on the images of the user’s eyes.
  • the viewpoint of the user’s eyes may also be pre-estimated based on the content of an image to be displayed, as mentioned above.
  • One of the focal lengths to be used may first be determined, for example by the controller 110 with the process described above with reference to Fig. 6 and formula (1), based on the obtained or pre-estimated viewpoint of the user’s eyes, and then at least one of the focus areas to be used may be determined based on the determined focal length.
  • the spatial light modulator 106 in the example embodiment may be a transparent liquid crystal display (LCD).
  • the transparent LCD may include a plurality of pixels which overlay the focus areas of the lenses and may be independently controlled to block or transmit the light.
  • the spatial light modulator 106 may be controlled by the controller 110, based on the determined at least one of the focus areas, to modulate the light coming from the display 104.
  • the spatial light modulator 106 may modulate the light by: controlling the pixels overlaying the determined focus areas to be transparent, such that the light passes through them; and controlling the pixels of the spatial light modulator 106 which overlay the remaining focus areas of the lens (those not presenting the determined focal length) to be opaque, such that the content of the image (light) from the display 104 is blocked from passing through the remaining (non-selected) focus areas and the user’s eyes do not see the content of the image through them.
  • any available technology may be used for this purpose, for example, reorienting liquid crystal molecules of an LCD.
  • the spatial light modulator 106 may have a low resolution; for example, one focus area of the lens may be overlaid by only one pixel of the spatial light modulator 106, so as to achieve a higher refresh rate to switch quickly between multiple focal lengths.
  • a multifocal lens, a spatial light modulator and an eye determining unit are used in a near eye display, which may select suitable (or correct) focus areas (corresponding to a suitable focal length) of a lens by switching between multiple focal lengths of the lens in time, based on the viewpoint of the user’s eyes, thereby solving the accommodation-convergence conflict and providing fully natural visual cues and comfortable viewing experiences.
  • the near eye display according to the present disclosure only needs a single multifocal lens for each eye, and the display may be any means in daily life which can display visual information (for example a mobile phone) without any change, so the structural complexity of the near eye display may be reduced.
  • the computational complexity and required computing power are very low, so it can quickly produce acceptable visual effects.
  • the method may make use of the near eye display according to the present disclosure, such as the near eye display according to one or more of the embodiments disclosed above and/or below in further detail.
  • for details of the embodiments of the method, reference might be made to the embodiments of the near eye display.
  • the method comprises the following steps, which may be performed in the given order or in a different order. Further, additional method steps might be provided which are not listed. Further, two or more or even all of the method steps might be performed at least partially simultaneously. Further, a method step might be performed twice or even more than twice, repeatedly.
  • Fig. 8 is a flow chart depicting a method implemented by a near eye display according to an embodiment of the present disclosure. As shown in Fig. 8, the method may comprise:
  • step 802: determining a viewpoint position of the user’s eyes.
  • step 804: controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of the focus areas of a lens associated with the viewpoint position of the user’s eyes, and blocks the light from passing through the remaining portion of the spatial light modulator.
  • the viewpoint position of the user’s eyes may be determined by an eye tracking unit which can acquire images of the user’s eyes. Specifically, the viewpoint position of the user’s eyes may be determined by processing the acquired images. In an alternative embodiment, the viewpoint of the user’s eyes may also be pre-estimated based on the content of an image to be displayed, as mentioned above in one or more embodiments relating to the near eye display.
  • the spatial light modulator may be controlled by a controller to modulate light coming from a display.
  • the controller may operate: 1) determining a focus length of a lens based on the viewpoint position of user’s eyes; 2) selecting the at least one of the focus areas of the lens based on the determined focus length; and 3) controlling the spatial light modulator to modulate the light based on the at least one of the focus areas such that the spatial light modulator allows the light to pass through the first portion of the spatial light modulator corresponding to the at least one of the focus areas, and blocks the light from passing through the rest portion of the spatial light modulator.
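The controller's three-step operation described above can be sketched as follows. This is a minimal illustrative sketch, not the disclosed implementation: the use of the thin-lens relation, the focal-length table for the lens, and the flat list mapping each modulator pixel to a focus-area identifier are all assumptions made for illustration.

```python
# Hypothetical sketch of the controller's three-step operation. The
# thin-lens relation, the focal-length table, and the pixel-to-area
# map are illustrative assumptions, not taken from the disclosure.

def determine_focus_length(d_obj, d_t2):
    """Step 1: required focal length for a virtual image at the viewpoint
    distance d_obj, with the display at distance d_t2 from the lens
    (both in metres), via the thin-lens relation 1/f = 1/d_t2 - 1/d_obj."""
    return (d_obj * d_t2) / (d_obj - d_t2)

def select_focus_areas(required_f, area_focal_lengths):
    """Step 2: select the focus areas whose focal length is closest to
    the required one. area_focal_lengths maps area id -> focal length."""
    closest = min(area_focal_lengths.values(), key=lambda f: abs(f - required_f))
    return {a for a, f in area_focal_lengths.items() if f == closest}

def modulator_mask(pixel_areas, selected):
    """Step 3: per-pixel transparency mask for the spatial light
    modulator (True = transparent/open, False = opaque/closed)."""
    return [area in selected for area in pixel_areas]
```

For example, with hypothetical focus areas {1: 0.050, 2: 0.060, 3: 0.070} (focal lengths in metres), a viewpoint 1 m away and a display 45 mm from the lens give a required focal length of about 47 mm, so the areas with f = 50 mm would be opened and the rest blocked.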
  • the spatial light modulator may be a transparent LCD or other device allowing light to pass through a portion thereof, and blocking light from passing through the rest portion thereof.
  • the spatial light modulator may include a plurality of pixels which overlay the focus areas of the lenses and may be independently controlled to block or transmit the light.
  • the spatial light modulator may modulate the light from the display by controlling pixels of a spatial light modulator overlaying the selected at least one of the focus areas to be transparent; and controlling the pixels overlaying the remaining focus areas to be opaque.
  • Fig. 9 is an exemplary flow chart depicting a method implemented by the near eye display in an example embodiment. As shown in Fig. 9, the method may comprise steps 902, 904, 906, and 908.
  • At step 902, a viewpoint position of the user’s eyes is determined.
  • the viewpoint position of user’s eyes may be determined by an eye tracking unit (such as a camera) or preestimated by a controller based on the content of an image to be displayed.
  • At step 904, a focus length of a lens is determined based on the viewpoint position of the user’s eyes, so that the determined focus length may be used to select focus areas matching the viewpoint position of the user’s eyes.
  • a controller may perform the determination of the focus length using the process mentioned in one or more embodiments relating to the near eye display, for example the method described with respect to Fig. 6 and formula (1).
  • At step 906, at least one of focus areas of the lens is selected based on the focus length determined at step 904. Similar to step 904, this step 906 may also be performed by the controller.
  • the focus areas corresponding to the determined focus length may be determined as the focus areas most suitable for the user’s current viewpoint.
  • At step 908, a spatial light modulator is controlled (for example by a controller) to modulate the light based on the selected focus areas such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to the selected focus areas of a lens, and blocks the light from passing through the rest portion of the spatial light modulator.
  • the spatial light modulator may modulate the light from the display by controlling pixels of a spatial light modulator overlaying the selected focus areas to be transparent; and controlling the pixels overlaying the remaining focus areas to be opaque.
  • suitable (or correct) focus areas (corresponding to a suitable focus length) of a lens of the near eye display may be selected in time by switching between multiple focus lengths of a lens, as the viewpoint of user’s eyes changes, thereby solving the accommodation-convergence conflict and providing fully natural visual cues and comfortable viewing experiences.
  • the computational complexity in the method of above embodiments is very low, and it thus can quickly produce acceptable visual effects of a near eye display.
  • FIG. 10 is a simplified block diagram depicting an apparatus suitable for use in some embodiments of the present disclosure.
  • the apparatus 1000 may include a data processor 1001, a memory 1002 that stores a program 1003, and a communication interface 1004 for communicating data with other external devices through wired and/or wireless communication.
  • the program 1003 is assumed to include program instructions that, when executed by the data processor 1001, enable the apparatus 1000 to operate in accordance with the embodiments of this disclosure, as discussed above. That is, the embodiments of this disclosure may be implemented at least in part by computer software executable by the data processor 1001, or by hardware, or by a combination of software and hardware.
  • the memory 1002 may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor based memory devices, flash memory, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory.
  • the data processor 1001 may be of any type suitable to the local technical environment, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on multi-core processor architectures, as non-limiting examples.
  • the various exemplary embodiments may be implemented in hardware or special purpose circuits, software, logic or any combination thereof.
  • some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the disclosure is not limited thereto.
  • While various aspects of the exemplary embodiments of this disclosure may be illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
  • the exemplary embodiments of the disclosure may be practiced in various components such as integrated circuit chips and modules. It should thus be appreciated that the exemplary embodiments of this disclosure may be realized in an apparatus that is embodied as an integrated circuit, where the integrated circuit may comprise circuitry (as well as possibly firmware) for embodying at least one or more of a data processor, a digital signal processor, baseband circuitry and radio frequency circuitry that are configurable so as to operate in accordance with the exemplary embodiments of this disclosure.
  • exemplary embodiments of the disclosure may be embodied in computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other device.
  • the computer executable instructions may be stored on a computer readable medium such as a hard disk, optical disk, removable storage media, solid state memory, RAM, etc.
  • the function of the program modules may be combined or distributed as desired in various embodiments.
  • the function may be embodied in whole or in part in firmware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGA) , and the like.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Liquid Crystal (AREA)

Abstract

A near eye display and associated method are disclosed. According to an embodiment, the near eye display comprises a lens having more than one focus lengths, each focus length corresponding to at least one focus area; a display located at one of the focus lengths of the lens; a spatial light modulator located adjacent to the lens and configured to modulate light coming from the display to allow the light to pass through a first portion of the spatial light modulator, and block the light from passing through the rest portion of the spatial light modulator; an eye determining unit configured to determine a viewpoint position of user's eyes; and a controller configured to control the spatial light modulator to modulate the light such that the first portion corresponds to at least one of the focus areas associated with the viewpoint position.

Description

NEAR EYE DISPLAY AND ASSOCIATED METHOD
Field of the Invention
Embodiments of the disclosure generally relate to displaying technologies, and more particularly, to a near eye display and associated method, an apparatus and a computer program product.
Background
Modern computing and display technologies have facilitated the development of systems for the so called “virtual reality” experience, wherein digitally reproduced images or portions thereof are presented to a user in a manner wherein they seem to be, or may be perceived as, real. A virtual reality, or “VR”, scenario typically involves presentation of digital or virtual image information without transparency to other actual real-world visual input.
At present, virtual reality has been developed as a technology that is now feasible at low cost via inexpensive devices such as mobile phone components. In particular, advances in high-resolution micro displays and modern GPUs (Graphics Processing Units) facilitate extremely immersive experiences. To facilitate comfortable long-term experiences and wide-spread user acceptance, the convergence-accommodation conflict inherent to stereoscopic displays, especially to the VR display, will have to be solved.
Summary
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
According to one aspect of the disclosure, there is provided a near eye display. The near eye display comprises: a lens having more than one focus lengths, each  focus length corresponding to at least one focus area; a display located at one of the focus lengths of the lens; a spatial light modulator located adjacent to the lens and configured to modulate light coming from the display to allow the light to pass through a first portion of the spatial light modulator, and block the light from passing through the rest portion of the spatial light modulator; an eye determining unit configured to determine a viewpoint position of user’s eyes; and a controller configured to control the spatial light modulator to modulate the light such that the first portion corresponds to at least one of the focus areas associated with the viewpoint position.
In an embodiment relating to the near eye display, the spatial light modulator comprises a plurality of pixels. Each focus area of the lens is overlaid by at least one pixel. The first portion comprises the pixels overlaying the at least one of the focus areas. The controller is configured to control the pixels overlaying the at least one of the focus areas to be transparent, and control the pixels overlaying the remaining focus areas to be opaque.
In an embodiment relating to the near eye display, the eye determining unit comprises an eye tracking unit, and the viewpoint position of user’s eyes is determined by means of the eye tracking unit.
In an embodiment relating to the near eye display, the eye tracking unit comprises an imaging device.
In an embodiment relating to the near eye display, the eye determining unit is integrated into the controller, and the viewpoint position of the user’s eyes is preestimated based on a content of an image to be displayed.
In an embodiment relating to the near eye display, the spatial light modulator is located between the display and the lens, or located on a side of the lens away from the display.
In an embodiment relating to the near eye display, the spatial light modulator comprises one of a liquid crystal display, a MEMS (Micro-Electro-Mechanical System) shutter, and a DLP (Digital Light Processing) DMD (Digital Micromirror Device) array.
In an embodiment relating to the near eye display, the lens comprises a refractive multifocal lens or a diffractive multifocal lens.
In an embodiment relating to the near eye display, the focus areas corresponding to different focus lengths have different curvatures or refractive index materials.
In an embodiment relating to the near eye display, each focus area has a shape of a circle, a ring, a sector or a polygon.
In an embodiment relating to the near eye display, the display is one of a mobile phone, a laptop, and a desktop.
According to another aspect of the disclosure, there is provided a method implemented by a near eye display. The method comprises: determining a viewpoint position of user’s eyes; and controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of focus areas of a lens associated with the viewpoint position of user’s eyes, and blocks the light from passing through the rest portion of the spatial light modulator.
In an embodiment relating to the method, controlling the spatial light modulator to modulate the light comprises: determining a focus length of the lens based on the viewpoint position of user’s eyes; selecting the at least one of the focus areas of the lens based on the determined focus length; and controlling the spatial light modulator to modulate the light based on the at least one of the focus areas.
In an embodiment relating to the method, the spatial light modulator modulates the light by controlling pixels of the spatial light modulator overlaying the at least one of the focus areas to be transparent; and controlling the pixels of the spatial light modulator overlaying the remaining focus areas to be opaque.
In an embodiment relating to the method, the viewpoint position of user’s eyes is determined by means of an eye tracking unit.
In an embodiment relating to the method, the eye tracking unit comprises an imaging device.
In an embodiment relating to the method, the viewpoint position of user’s eyes is preestimated based on a content of an image to be displayed.
In an embodiment relating to the method, the spatial light modulator is located between the display and the lens, or located on a side of the lens away from the display.
In an embodiment relating to the method, the spatial light modulator comprises one of a liquid crystal display, a MEMS shutter, and a DLP DMD array.
In an embodiment relating to the method, the lens comprises a refractive lens or a diffractive lens.
In an embodiment relating to the method, the focus areas corresponding to different focus lengths have different curvatures or refractive index materials.
In an embodiment relating to the method, each focus area has a shape of a circle, a ring, a sector or a polygon.
In an embodiment relating to the method, the display is one of a mobile phone, a laptop, and a desktop.
According to another aspect of the disclosure, there is provided an apparatus. The apparatus comprises at least one processor; and at least one memory including computer-executable code. The at least one memory and the computer-executable code are configured to, with the at least one processor, cause the apparatus to operate: determining a viewpoint position of both eyes; and controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of focus areas of a lens associated with the viewpoint position of user’s eyes, and blocks the light from passing through the rest portion of the spatial light modulator.
According to another aspect of the disclosure, there is provided an apparatus. The apparatus comprises means for determining a viewpoint position of both eyes; and means for controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of focus areas of a lens associated with the viewpoint position of user’s eyes, and blocks the light from passing through the rest portion of the spatial light modulator.
In an embodiment relating to the apparatus, the means for controlling a spatial light modulator to modulate light comprises means for determining a focus length of the lens based on the viewpoint position of user’s eyes; means for selecting the at least one of the focus areas of the lens based on the determined focus length; and means for controlling the spatial light modulator to modulate the light based on the at least one of the focus areas.
According to another aspect of the disclosure, there is provided a computer program product comprising at least one non-transitory computer-readable storage medium having computer-executable program instructions stored therein. The computer-executable instructions are configured to, when executed, cause an apparatus to operate: determining a viewpoint position of both eyes; and controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of focus areas of a lens associated with the viewpoint position of user’s eyes, and blocks the light from passing through the rest portion of the spatial light modulator.
These and other objects, features and advantages of the disclosure will become apparent from the following detailed description of illustrative embodiments thereof, which are to be read in connection with the accompanying drawings.
Brief Description of the Drawings
Fig. 1 is a schematic block diagram of a near eye display according to an example embodiment of the present disclosure;
Fig. 2 is a simplified schematic view of a near eye display according to an example embodiment of the present disclosure;
Figs. 3A to 3B are a front view (left) and a side view (right) of a refractive multifocal lens in one or more example embodiments of the present disclosure, respectively;
Figs. 4A to 4B are a front view (left) and a side view (right) of a diffractive multifocal lens in one or more example embodiments of the present disclosure, respectively;
Fig. 5 is a schematic view of a modulation of the spatial light modulator to light according to an example embodiment of the present disclosure;
Fig. 6 is a schematic view depicting distance of viewpoint to eyes in one or more example embodiments of the present disclosure;
Fig. 7 is an exploded schematic view of an example near eye display according to an embodiment of the present disclosure;
Fig. 8 is a flow chart of a method implemented by a near eye display according to an embodiment of the present disclosure;
Fig. 9 is a flow chart of an example method implemented by a near eye display according to an example embodiment of the present disclosure; and
Fig. 10 is a simplified block diagram of an apparatus suitable for use in some embodiments of the present disclosure.
Detailed Description
For the purpose of explanation, details are set forth in the following description in order to provide a thorough understanding of the embodiments disclosed. It is apparent, however, to those skilled in the art that the embodiments may be implemented without these specific details or with an equivalent arrangement.
As used herein and in the appended claims, the singular form of a word includes the plural, and vice versa, unless the context clearly dictates otherwise. Thus, the references “a”, “an”, and “the” are generally inclusive of the plurals of the respective terms. Similarly, the words “comprise”, “include” and grammatical variations are to be interpreted inclusively rather than exclusively, unless such a construction is clearly prohibited from the context. Where used herein the term “examples”, particularly when followed by a listing of terms, is merely exemplary and illustrative, and should not be deemed to be exclusive or comprehensive.
As mentioned above, the convergence-accommodation conflict inherent to stereoscopic displays, especially to the VR display, is a big problem which will have to be solved. For example, stereoscopic displays in one kind of VR technology provide binocular disparity that supports convergence on any point but only allows the viewer to accommodate on the display surface, and thus suffer from the accommodation-convergence conflict. These displays often decouple the accommodation cue from the convergence cue and tie the accommodation cue to a fixed distance, which easily causes visual fatigue, dizziness, and other discomforts.
In VR technology, one solution for the convergence-accommodation conflict is a near eye light field display. As an example of the near eye light field display, two stacked transparent liquid crystal displays modulate the uniform backlight in a multiplicative fashion. In this way, when observed through a pair of lenses, the display provides focus cues in addition to the binocular disparity afforded by VR displays. However, the computational complexity of such a near eye light field display is very high, and can reach O(N²) to O(N³) or above. For high resolution and a high refresh rate, huge computing power is needed. Another solution is imaging with multifocal lenses. However, the focal points of the multifocal lenses interfere with each other. For example, when seeing in the near field, the light from the far focus is out of focus, resulting in a blurred image.
An aspect of the present disclosure proposes a near eye display, which may solve the convergence-accommodation conflict with low computing power and high image quality. Fig. 1 is a schematic block diagram of a near eye display according to an example embodiment of the present disclosure. Fig. 2 is a simplified schematic view of a near eye display according to an example embodiment of the present disclosure. As shown in Figs. 1 and 2, the near eye display may include a lens 102, a display 104, a spatial light modulator 106, an eye determining unit 108, and a controller 110 (not shown in Fig. 2) .
Generally, the near eye display may be implemented in the front of both eyes. However, in some example embodiments, the structure arrangement of the near eye display is described in detail with respect to only one eye in the description and/or drawings for the purpose of simplicity. It shall be understood that the same structure arrangement is also suitable for the other eye.
The lens 102 may have more than one focus length (e.g., two, three, or even more focal lengths), each focus length corresponding to at least one focus area. Herein, the lens 102 may also be called a multifocal lens. As used herein, the term focus area refers to an area on the lens which corresponds to a particular focus length. In an embodiment, the position and size of each focus area may be fixed, or may vary with a signal (for example, a voltage signal, a current signal or even a mechanical signal) applied to the lens.
In an embodiment, the focus areas corresponding to different focus lengths may have different curvatures or refractive index materials. The focus areas may have any shape suitable for the present disclosure, such as a circle, a ring, a sector, a polygon, or a variety of other shapes.
The lens 102 may be any kind of multifocal lens having more than one focus length, for example, a refractive multifocal lens or a diffractive multifocal lens. Figs. 3A to 3B are a front view (left) and a side view (right) of a refractive multifocal lens in one or more example embodiments of the present disclosure, respectively. As shown in Figs. 3A and 3B, the refractive multifocal lens may be for example a multifocal mosaic lens which has a plurality of hexagonal focus areas with at least three different focus lengths. At least one of the focus areas to be used may be selected by shielding the other focus areas. In some embodiments, the focus areas to be used may be selected according to the viewpoint or view axis of a user, and may be switched as the viewpoint or view axis of the user changes. Figs. 4A to 4B are a front view (left) and a side view (right) of a diffractive multifocal lens in one or more example embodiments of the present disclosure, respectively. As shown in Figs. 4A and 4B, the diffractive multifocal lens may comprise many little wedges on its surface. Light diffracted by each wedge may be focused on at least two focus points, thus forming at least two different focus lengths, i.e., a near focus and a far focus. The near focus or the far focus to be used may be selected by controlling the amount of light focused on the near focus or the far focus. For example, if the amount of light focused on the near focus is controlled to be much larger than that focused on the far focus, the near focus is selected, and vice versa.
In an alternative embodiment, the lens 102 may be a variable focus lens. The variable focus lens may be a refractive element, such as a liquid crystal lens, an electro-active lens, a conventional refractive lens with moving elements, a mechanical-deformation-based lens (such as a fluid-filled membrane lens, or a lens akin to the human crystalline lens wherein a flexible element is flexed and relaxed by actuators), an electro-wetting lens, or a plurality of fluids with different refractive indices. The variable focus lens may also comprise a switchable diffractive optical element (such as one featuring a polymer dispersed liquid crystal approach wherein a host medium, such as a polymeric material, has microdroplets of liquid crystal dispersed within the material; when a voltage is applied, the molecules reorient so that their refractive indices no longer match that of the host medium, thereby creating a high diffraction pattern).
The display 104 may be located at one of the focus lengths of the lens. For example, the display may be positioned at the nearest, the middle or the farthest focus point of the lens. However, it should be understood that embodiments of the present disclosure are not limited thereto. The display 104 may be provided for responding to at least one image/video signal, for presentation of information for visual reception. The at least one image/video signal herein may be divided into a left eye image/video signal entering into the left eye of a user and a right eye image/video signal entering into the right eye of the user, and the left eye image/video signal and the right eye image/video signal may be synthesized by the human brain to generate a stereoscopic image, which is perceived by the user.
In an example embodiment, the display 104 may be any device capable of presenting visual information for visual reception, including, but not limited to, a mobile phone, a laptop, or a desktop. The display may present visual information by emitting light (e.g., one or more of liquid crystal display (LCD), electroluminescence, photoluminescence, incandescence, and cathodoluminescence).
The spatial light modulator 106 may be located adjacent to the lens and may be configured to modulate light coming from the display 104 to allow the light to pass through a first portion of the spatial light modulator 106, and block the light from passing through the rest portion of the spatial light modulator 106. As an example, the spatial light modulator 106 may be arranged between the display 104 and the lens 102. As another example, the spatial light modulator 106 may be arranged on a side of the lens 102 away from the display 104. It should be understood that the spatial light modulator 106 may be arranged at other positions with respect to the lens. For example, in the case that there are two lenses arranged in parallel for each eye, the spatial light modulator may be arranged between the two lenses.
The spatial light modulator 106 may comprise a plurality of pixels. Each focus area of the lens is overlaid by at least one pixel. The spatial light modulator 106 may modulate the light coming from the display 104 by controlling whether the light passes through pixels overlaying the focus areas. Thereby, at least one of the focus areas of the lens may be selected to be used in the near eye display, and the selected focus areas may also be switched in time as the viewpoint of the user’s eye changes. As an example, by switching pixels of the spatial light modulator on or off, pixels may be controlled to be transparent or opaque such that the focus areas overlaid by the transparent pixels are selected to be used.
Fig. 5 is a schematic view of a modulation of the spatial light modulator to light according to an example embodiment of the present disclosure. As shown in Fig. 5, the lens 102 has three focus lengths (indicated by f1, f2, and f3, each corresponding to multiple focus areas indicated by 1, 2 and 3), and each focus area of the lens 102 is overlaid by one or more pixels of the spatial light modulator 106. By controlling the pixels overlaying the first focus areas 1 (corresponding to the focus length f1) to be transparent (opened), the first focus areas corresponding to the opened pixels are selected to be used in the near eye display at present. Similarly, by controlling the pixels overlaying the other focus areas 2, 3 (corresponding to the focus lengths f2 and f3) to be opaque (closed), the other focus areas corresponding to the closed pixels are not selected at present.
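The modulation of Fig. 5 can be sketched as a boolean mask over the modulator pixels. The 3×3 pixel layout and area labels below are illustrative assumptions only; they do not reproduce the figure's actual geometry.

```python
# Illustrative sketch of the Fig. 5 modulation: pixels overlaying the
# selected focus areas are opened (transparent), all others closed
# (opaque). The 3x3 grid and area labels are assumptions.

def open_pixels(pixel_area_labels, selected_label):
    """Return True for pixels overlaying focus areas of the selected
    focal length, False for all other pixels."""
    return [[label == selected_label for label in row]
            for row in pixel_area_labels]

# Each pixel carries the label of the focus area it overlays
# (1 -> f1, 2 -> f2, 3 -> f3).
pixel_area_labels = [
    [1, 2, 3],
    [3, 1, 2],
    [2, 3, 1],
]

mask = open_pixels(pixel_area_labels, selected_label=1)  # select focus length f1
```

Switching the focus length in use then amounts to recomputing the mask with a different `selected_label` as the viewpoint changes.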
In embodiments of the disclosure, the spatial light modulator 106 for example may be a liquid crystal display, a MEMS shutter, a DLP DMD array or any other kind of modulating device which may be independently controlled to block or transmit the light.
The eye determining unit 108 may be configured to determine a viewpoint position of user’s eyes. The viewpoint position in embodiments of the disclosure may be an intersection of the visual axes of the two eyes. It shall be appreciated that the eye determining unit 108 described herein may be implemented by hardware, software, or a combination of hardware and software to implement the functionality of the eye determining unit 108, i.e., the determination of the viewpoint position.
In an example embodiment, the eye determining unit 108 may comprise an eye tracking unit. In one example, the eye tracking unit may comprise an imaging device (for example, a camera such as a CCD camera or a CMOS camera) oriented toward an eye of the user for imaging the eye directly. In another example, the eye tracking unit may comprise an imaging device (for example, a camera such as a CCD camera or a CMOS camera) and a mirror for reflecting light coming from the eye to the imaging device such that the imaging device may image the eye indirectly. In both cases, the viewpoint position of user’s eyes may be determined based on the images of the eyes acquired by the imaging device, such as by extracting eye features from the images of the eyes. In this embodiment, the eye determining unit 108 (specifically, the eye tracking unit) may be placed before or after the lens 102 and the spatial light modulator 106, or near them. However, this is not intended to limit the scope of the present disclosure to these specific positions. Those skilled in the art may rather easily recognize how to adapt the related arrangements or conditions when employing a different position of the eye determining unit 108.
In another example embodiment, the eye determining unit 108 may be integrated into a controller, for example, in the form of a hardware circuit or in the form of a software module. By way of example, the controller may comprise a processor and a memory, wherein the processor executes a program stored in the memory to implement the functionality of the eye determining unit 108. In this case, the viewpoint position of user’s eyes may be preestimated based on a content of an image to be displayed. Specifically, the program stored in the memory may analyze the image content to be displayed, determine from the image content some objects most likely to attract user’s eyes, and then preestimate some possible points on the objects at which the user’s eyes are most likely to gaze. These extracted possible points may act as possible viewpoints, the positions of which may be linked with some objects in the image to be displayed and stored in the memory, for determining the focus areas associated with the viewpoint of user’s eyes.
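A minimal sketch of such content-based pre-estimation follows. The object list, depths and saliency scores are hypothetical placeholders for whatever content analysis the program actually performs:

```python
# Sketch: pre-estimating the viewpoint distance from image content.
# Each object in the image to be displayed carries a (virtual) depth and
# a hypothetical saliency score rating how likely it is to attract gaze.
scene_objects = [
    {"name": "background", "depth_m": 10.0, "saliency": 0.2},
    {"name": "face",       "depth_m": 1.5,  "saliency": 0.9},
    {"name": "caption",    "depth_m": 0.8,  "saliency": 0.6},
]

def preestimate_viewpoint_distance(objects):
    """Use the depth of the most salient object as the likely viewpoint."""
    return max(objects, key=lambda o: o["saliency"])["depth_m"]

d_obj = preestimate_viewpoint_distance(scene_objects)
```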
With respect to how to determine the at least one of the focus areas associated with the viewpoint of user’s eyes based on the viewpoint position of user’s eyes, any existing method may be used for this purpose, and this is not a limitation of the present disclosure. By way of example, with the eye tracking technology, the visual axes of both eyes may be tracked and the intersection point of the visual axes may be calculated as the viewpoint of user’s eyes; then, a distance of the viewpoint to user’s eyes (indicated by d_obj) may be obtained, i.e., d_obj = distance of viewpoint to eyes (as shown in Fig. 6); after obtaining d_obj, the required focus length (indicated by f) may be calculated for a given display position (d_t2) by the following formula:
1/f = 1/d_t2 - 1/d_obj     (1)
The focal length of the lens 102 closest to f may be determined as the focal length associated with the viewpoint of user’s eyes. With the spatial light modulator, only the focus areas corresponding to the determined focal length are opened, as described above. As the viewpoint of user’s eyes changes, the opened focus areas may be switched in time.
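The selection just described can be sketched as follows, using the thin-lens form of formula (1) (1/f = 1/d_t2 - 1/d_obj, i.e., a virtual image at the viewpoint distance d_obj for a display at d_t2); all numeric values are illustrative assumptions only:

```python
# Sketch: required focal length from formula (1), then pick the lens
# focal length closest to it. Distances in metres; values hypothetical.

def required_focal_length(d_obj, d_t2):
    # 1/f = 1/d_t2 - 1/d_obj  (display at d_t2, virtual image at d_obj)
    return 1.0 / (1.0 / d_t2 - 1.0 / d_obj)

def select_focal_length(f_required, lens_focal_lengths):
    """Focal length of the multifocal lens closest to the required one."""
    return min(lens_focal_lengths, key=lambda f: abs(f - f_required))

f_req = required_focal_length(d_obj=2.0, d_t2=0.05)   # about 0.0513 m
f_sel = select_focal_length(f_req, [0.045, 0.05, 0.055])
```

Note that as d_obj grows toward infinity the required f approaches d_t2, consistent with the display being located at one of the focus lengths of the lens.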
The controller 110 may be configured to control the spatial light modulator to modulate the light such that the first portion of the spatial light modulator through which the light is allowed to pass corresponds to the at least one of the focus areas associated with the viewpoint position of user’s eyes. By way of example, the controller 110 may be connected to the eye determining unit 108, for receiving information about the viewpoint of the user’s eyes, and connected to the spatial light modulator 106, for sending to the spatial light modulator 106 modulation information based on the viewpoint of the user’s eyes, such that the spatial light modulator 106 modulates light coming from the display 104 to allow the light to pass through the first portion corresponding to the at least one of the focus areas of the lens 102 associated with the viewpoint position of user’s eyes, and to block the light from passing through the rest portion of the spatial light modulator 106. It should be noted that the connections in the embodiments of the present disclosure may be wired or wireless, and the connected components may even be integrated into other components.
In an example embodiment, the first portion of the spatial light modulator 106 comprises the pixels overlaying the at least one of the focus areas, and the controller 110 may be configured to control the pixels overlaying the at least one of the focus areas to be transparent (for example, by switching ON (opening) the pixels overlaying the at least one of the focus areas) such that the light passes through the first portion, and control the pixels overlaying the remaining focus areas to be opaque (for example, by switching OFF (closing) the pixels overlaying the remaining focus areas) .
The controller 110 may be implemented as: (a) a hardware-only circuit, such as one implemented using only analog and/or digital circuitry; or (b) a combination of circuits and software (and/or firmware), such as a combination of digital signal processor(s), software and memory(ies) that work together to cause various functions to be performed. The controller 110 may be a separate component or may be integrated into other components (such as the display 104 and the spatial light modulator 106) as a part thereof.
Hereinafter, an example near eye display will be described with reference to Fig. 7. Fig. 7 is an exploded schematic view of an example near eye display according to an embodiment of the present disclosure. It should be noted that the embodiment illustrated in Fig. 7 will be described with respect to two eyes of a user; that is, the near eye display may be implemented in front of both eyes.
As illustrated in Fig. 7, the near eye display may comprise: two lenses 102, each having more than one focus length (for example, two, three or more focus lengths), each focus length corresponding to a plurality of focus areas; a display 104 located at one of the focus lengths of the lenses (i.e., on one of the focus planes of the lenses); a spatial light modulator 106 located between the lenses 102 and the display 104; an eye determining unit 108 for determining a viewpoint position of user’s eyes; an optional housing 112 for holding the two lenses 102; and a controller (not shown in Fig. 7) which may be connected to or integrated into one or more of the display 104, the eye determining unit 108 and the spatial light modulator 106.
In the embodiment illustrated in Fig. 7, the eye determining unit 108 may comprise two cameras 1081 located on the housing 112 for acquiring images of user’s eyes, and two optional mirrors 1082 for reflecting light coming from user’s eyes to the two cameras 1081. In this case, the viewpoint of user’s eyes may be determined based on the images of user’s eyes. In an alternative embodiment, the viewpoint of user’s eyes may also be preestimated based on the content of an image to be displayed, as mentioned above. One of the focus lengths to be used may first be determined, for example by the controller 110 with the process described above with reference to Fig. 6 and formula (1), based on the obtained or preestimated viewpoint of user’s eyes, and then at least one of the focus areas to be used may be determined based on the determined focus length.
The spatial light modulator 106 in the example embodiment may be a transparent liquid crystal display (LCD). The transparent LCD may include a plurality of pixels which overlay the focus areas of the lenses and may be independently controlled to block or transmit the light. The spatial light modulator 106 may be controlled by the controller 110, based on the determined at least one of the focus areas, to modulate the light coming from the display 104, such that the spatial light modulator:
- allows the light to pass through a first portion of the spatial light modulator corresponding to the determined at least one of the focus areas associated with the viewpoint position of user’s eyes, and
- blocks the light from passing through the rest portion of the spatial light modulator.
In a specific embodiment, the spatial light modulator 106 (e.g. an LCD) may modulate the light by:
- controlling pixels of the spatial light modulator 106, which overlay the determined at least one of the focus areas of the lens presenting the determined focus length, to be transparent, such that the content of the image (light) from the display 104 may pass through the determined (selected) at least one of the focus areas and user’s eyes may see the content of the image;
- controlling pixels of the spatial light modulator 106, which overlay the remaining focus areas of the lens not presenting the determined focus length, to be opaque, such that the content of the image (light) from the display 104 is blocked from passing through the remaining (non-selected) focus areas and user’s eyes may not see the content of the image through them.
As for how to control the pixels of the spatial light modulator 106 to be transparent or opaque, any available technology may be used, for example, reorienting the liquid crystal molecules of an LCD.
Alternatively, the spatial light modulator 106 may have a low resolution; for example, one focus area of the lens may be overlaid by only one pixel of the spatial light modulator 106, so as to achieve a higher refresh rate and switch quickly between multiple focus lengths.
According to one or more of the embodiments disclosed above and/or below in further detail, a multifocal lens, a spatial light modulator and an eye determining unit are used in a near eye display, which may select suitable (or correct) focus areas (corresponding to a suitable focus length) of a lens by switching between multiple focus lengths of the lens in time, based on the viewpoint of user’s eyes, thereby solving the accommodation-convergence conflict and providing fully natural visual cues and comfortable viewing experiences. In addition, the near eye display according to the present disclosure only needs a single multifocal lens for each eye, and the display may be any device in daily life which can display visual information (for example, a mobile phone) without any change, so the structural complexity of the near eye display may be reduced. Furthermore, with the near eye display provided in example embodiments of the present disclosure, the computing complexity and required computing power are very low, so it can quickly produce acceptable visual effects.
Another aspect of the present disclosure proposes a method implemented by a near eye display. Optionally, the method may make use of the near eye display according to the present disclosure, such as the near eye display according to one or more of the embodiments disclosed above and/or below in further detail. Thus, for optional embodiments of the method, reference might be made to the embodiments of the near eye display. The method comprises the following steps, which may be performed in the given order or in a different order. Further, additional method steps might be provided which are not listed. Further, two or more or even all of the method steps might be performed at least partially simultaneously. Further, a method step might be performed twice or even more than twice, repeatedly.
Fig. 8 is a flow chart depicting a method implemented by a near eye display according to an embodiment of the present disclosure. As shown in Fig. 8, the method may comprise:
step 802, determining a viewpoint position of user’s eyes; and
step 804, controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of focus areas of a lens associated with the viewpoint position of user’s eyes, and blocks the light from passing through the rest portion of the spatial light modulator.
At step 802, the viewpoint position of user’s eyes may be determined by an eye tracking unit which can acquire images of user’s eyes. Specifically, the viewpoint position of user’s eyes may be determined by processing the acquired images. In an alternative embodiment, the viewpoint of user’s eyes may also be preestimated based on the content of an image to be displayed, as mentioned above in one or more embodiments relating to the near eye display.
At step 804, the spatial light modulator may be controlled by a controller to modulate light coming from a display. As an example, the controller may operate by: 1) determining a focus length of a lens based on the viewpoint position of user’s eyes; 2) selecting the at least one of the focus areas of the lens based on the determined focus length; and 3) controlling the spatial light modulator to modulate the light based on the at least one of the focus areas, such that the spatial light modulator allows the light to pass through the first portion of the spatial light modulator corresponding to the at least one of the focus areas, and blocks the light from passing through the rest portion of the spatial light modulator.
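The three operations can be sketched as a single control step; the focal lengths, distances and the mapping of pixels to focal-length groups below are hypothetical values chosen for illustration:

```python
# Sketch of the controller's three operations in step 804:
# 1) required focal length from the viewpoint distance (thin-lens form),
# 2) selection of the focus areas via the closest lens focal length,
# 3) per-pixel open (True) / closed (False) modulation of the SLM.

def control_step(d_obj, d_t2, lens_focal_lengths, pixel_to_group):
    f_req = 1.0 / (1.0 / d_t2 - 1.0 / d_obj)              # step 1
    group = min(range(len(lens_focal_lengths)),           # step 2
                key=lambda i: abs(lens_focal_lengths[i] - f_req))
    return [g == group for g in pixel_to_group]           # step 3

mask = control_step(d_obj=2.0, d_t2=0.05,
                    lens_focal_lengths=[0.045, 0.05, 0.055],
                    pixel_to_group=[0, 1, 2, 0, 1, 2])
```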
The spatial light modulator may be a transparent LCD or another device allowing light to pass through a portion thereof while blocking light from passing through the rest portion thereof. As described above, the spatial light modulator may include a plurality of pixels which overlay the focus areas of the lens and may be independently controlled to block or transmit the light. As one example, the spatial light modulator may modulate the light from the display by controlling the pixels of the spatial light modulator overlaying the selected at least one of the focus areas to be transparent, and controlling the pixels overlaying the remaining focus areas to be opaque.
Fig. 9 is an exemplary flow chart depicting a method implemented by the near eye display in an example embodiment. As shown in Fig. 9, the method may comprise  steps  902, 904, 906, and 908.
At step 902, a viewpoint position of user’s eyes is determined. In this step, as already described above, the viewpoint position of user’s eyes may be determined by an eye tracking unit (such as a camera) or preestimated by a controller based on the content of an image to be displayed.
At step 904, a focus length of a lens is determined based on the viewpoint position of user’s eyes, so that the determined focus length may be used to select focus areas matching the viewpoint position of user’s eyes. In this step, for example, a controller may perform the determination of the focus length using the process mentioned in one or more embodiments relating to the near eye display, for example the method described with respect to Fig. 6 and formula (1).
At step 906, at least one of the focus areas of the lens is selected based on the focus length determined at step 904. Similar to step 904, this step 906 may also be performed by the controller. The focus areas corresponding to the determined focus length may be determined as the focus areas most suitable for the user’s current viewpoint.
At step 908, a spatial light modulator is controlled (for example, by a controller) to modulate the light based on the selected focus areas, such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to the selected focus areas of a lens, and blocks the light from passing through the rest portion of the spatial light modulator. As mentioned above, the spatial light modulator may modulate the light from the display by controlling the pixels of the spatial light modulator overlaying the selected focus areas to be transparent, and controlling the pixels overlaying the remaining focus areas to be opaque.
With the method described in embodiments of the present disclosure, suitable (or correct) focus areas (corresponding to a suitable focus length) of a lens of the near eye display may be selected in time by switching between multiple focus lengths of the lens as the viewpoint of user’s eyes changes, thereby solving the accommodation-convergence conflict and providing fully natural visual cues and comfortable viewing experiences. Furthermore, the computational complexity of the method in the above embodiments is very low, and it thus can quickly produce acceptable visual effects of a near eye display.
Another aspect of the present disclosure proposes an apparatus suitable for use in some embodiments of the present disclosure. Fig. 10 is a simplified block diagram depicting an apparatus suitable for use in some embodiments of the present disclosure. As shown in Fig. 10, the apparatus 1000 may include a data processor 1001, a memory 1002 that stores a program 1003, and a communication interface 1004 for communicating data with other external devices through wired and/or wireless communication.
The program 1003 is assumed to include program instructions that, when executed by the data processor 1001, enable the apparatus 1000 to operate in accordance with the embodiments of this disclosure, as discussed above. That is, the embodiments of this disclosure may be implemented at least in part by computer software executable by the data processor 1001, or by hardware, or by a combination of software and hardware.
The memory 1002 may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, such as semiconductor based memory devices, flash memory, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory. The data processor 1001 may be of any type suitable to the local  technical environment, and may include one or more of general purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs) and processors based on multi-core processor architectures, as non-limiting examples.
In general, the various exemplary embodiments may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. For example, some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device, although the disclosure is not limited thereto. While various aspects of the exemplary embodiments of this disclosure may be illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that these blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
As such, it should be appreciated that at least some aspects of the exemplary embodiments of the disclosure may be practiced in various components such as integrated circuit chips and modules. It should thus be appreciated that the exemplary embodiments of this disclosure may be realized in an apparatus that is embodied as an integrated circuit, where the integrated circuit may comprise circuitry (as well as possibly firmware) for embodying at least one or more of a data processor, a digital signal processor, baseband circuitry and radio frequency circuitry that are configurable so as to operate in accordance with the exemplary embodiments of this disclosure.
It should be appreciated that at least some aspects of the exemplary embodiments of the disclosure may be embodied in computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data  types when executed by a processor in a computer or other device. The computer executable instructions may be stored on a computer readable medium such as a hard disk, optical disk, removable storage media, solid state memory, RAM, etc. As will be appreciated by one of skill in the art, the function of the program modules may be combined or distributed as desired in various embodiments. In addition, the function may be embodied in whole or in part in firmware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGA) , and the like.
The present disclosure includes any novel feature or combination of features disclosed herein either explicitly or any generalization thereof. Various modifications and adaptations to the foregoing exemplary embodiments of this disclosure may become apparent to those skilled in the relevant arts in view of the foregoing description, when read in conjunction with the accompanying drawings. However, any and all modifications will still fall within the scope of the non-limiting and exemplary embodiments of this disclosure.

Claims (25)

  1. A near eye display, comprising:
    a lens having more than one focus length, each focus length corresponding to at least one focus area;
    a display located at one of the focus lengths of the lens;
    a spatial light modulator located adjacent to the lens and configured to modulate light coming from the display to allow the light to pass through a first portion of the spatial light modulator, and block the light from passing through the rest portion of the spatial light modulator;
    an eye determining unit configured to determine a viewpoint position of user’s eyes; and
    a controller configured to control the spatial light modulator to modulate the light such that the first portion corresponds to at least one of the focus areas associated with the viewpoint position.
  2. The near eye display according to claim 1, wherein the spatial light modulator comprises a plurality of pixels, each focus area of the lens is overlaid by at least one pixel, and the first portion comprises the pixels overlaying the at least one of the focus areas, and
    wherein the controller is configured to control the pixels overlaying the at least one of the focus areas to be transparent, and control the pixels overlaying the remaining focus areas to be opaque.
  3. The near eye display according to claim 1 or 2, wherein the eye determining unit comprises an eye tracking unit, and the viewpoint position of user’s eyes is determined by means of the eye tracking unit.
  4. The near eye display according to claim 3, wherein the eye tracking unit comprises an imaging device.
  5. The near eye display according to claim 1 or 2, wherein the eye determining unit is integrated into the controller, and the viewpoint position of the user’s eyes is preestimated based on a content of an image to be displayed.
  6. The near eye display according to any of claims 1 to 5, wherein the spatial light modulator is located between the display and the lens, or located on a side of the lens away from the display.
  7. The near eye display according to any of claims 1 to 6, wherein the spatial light modulator comprises one of a liquid crystal display, a MEMS shutter, and a DLP DMD array.
  8. The near eye display according to any of claims 1 to 7, wherein the lens comprises a refractive multifocal lens or a diffractive multifocal lens.
  9. The near eye display according to any of claims 1 to 8, wherein the focus areas corresponding to different focus lengths have different curvatures or refractive index materials.
  10. The near eye display according to any of claims 1 to 9, wherein each focus area has a shape of a circle, a ring, a sector or a polygon.
  11. The near eye display according to any of claims 1 to 10, wherein the display is one of a mobile phone, a laptop, and a desktop.
  12. A method implemented by a near eye display, comprising:
    determining a viewpoint position of user’s eyes; and
    controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of  the spatial light modulator corresponding to at least one of focus areas of a lens associated with the viewpoint position of user’s eyes, and blocks the light from passing through the rest portion of the spatial light modulator.
  13. The method according to claim 12, wherein controlling the spatial light modulator to modulate the light comprises:
    determining a focus length of the lens based on the viewpoint position of user’s eyes;
    selecting the at least one of the focus areas of the lens based on the determined focus length; and
    controlling the spatial light modulator to modulate the light based on the at least one of the focus areas.
  14. The method according to claim 13, wherein the spatial light modulator modulates the light by:
    controlling pixels of the spatial light modulator overlaying the at least one of the focus areas to be transparent; and
    controlling the pixels of the spatial light modulator overlaying the remaining focus areas to be opaque.
  15. The method according to any of claims 12 to 14, wherein the viewpoint position of user’s eyes is determined by means of an eye tracking unit.
  16. The method according to claim 15, wherein the eye tracking unit comprises an imaging device.
  17. The method according to any of claims 12 to 14, wherein the viewpoint position of user’s eyes is preestimated based on a content of an image to be displayed.
  18. The method according to any of claims 12 to 17, wherein the spatial light modulator is located between the display and the lens, or located on a side of the lens away from the display.
  19. The method according to any of claims 12 to 18, wherein the spatial light modulator comprises one of a liquid crystal display, a MEMS shutter, and a DLP DMD array.
  20. The method according to any of claims 12 to 19, wherein the lens comprises a refractive lens or a diffractive lens.
  21. The method according to any of claims 12 to 20, wherein the focus areas corresponding to different focus lengths have different curvatures or refractive index materials.
  22. The method according to any of claims 12 to 21, wherein each focus area has a shape of a circle, a ring, a sector or a polygon.
  23. The method according to any of claims 12 to 22, wherein the display is one of a mobile phone, a laptop, and a desktop.
  24. An apparatus, comprising:
    at least one processor; and
    at least one memory including computer-executable code,
    wherein the at least one memory and the computer-executable code are configured to, with the at least one processor, cause the apparatus to operate:
    determining a viewpoint position of both eyes; and
    controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of focus areas of a lens  associated with the viewpoint position of user’s eyes, and blocks the light from passing through the rest portion of the spatial light modulator.
  25. A computer program product comprising at least one non-transitory computer-readable storage medium having computer-executable program instructions stored therein, the computer-executable program instructions being configured to, when being executed, cause an apparatus to operate:
    determining a viewpoint position of both eyes; and
    controlling a spatial light modulator to modulate light coming from a display such that the spatial light modulator allows the light to pass through a first portion of the spatial light modulator corresponding to at least one of focus areas of a lens associated with the viewpoint position of user’s eyes, and blocks the light from passing through the rest portion of the spatial light modulator.
PCT/CN2018/079098 2018-03-15 2018-03-15 Near eye display and associated method WO2019173997A1