WO2019184896A1 - Refractive adjustment method and apparatus, and augmented reality device - Google Patents

Refractive adjustment method and apparatus, and augmented reality device

Info

Publication number
WO2019184896A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
refractive
light
eyes
diopter
Prior art date
Application number
PCT/CN2019/079618
Other languages
English (en)
French (fr)
Inventor
李佃蒙
Original Assignee
京东方科技集团股份有限公司 (BOE Technology Group Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司 (BOE Technology Group Co., Ltd.)
Priority to US16/610,559 (US11579448B2)
Publication of WO2019184896A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B27/30 Collimators
    • G02B2027/0123 Head-up displays comprising devices increasing the field of view
    • G02B2027/0125 Field-of-view increase by wavefront division
    • G02B2027/0132 Head-up displays comprising binocular systems
    • G02B2027/0138 Head-up displays comprising image capture systems, e.g. camera
    • G02B2027/014 Head-up displays comprising information/image processing systems
    • G02B2027/0185 Displaying image at variable distance
    • G02B2027/0187 Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G02B26/00 Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/06 Optical devices or arrangements for controlling the phase of light
    • G02F OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/29 Devices or arrangements for the control of the position or the direction of light beams, i.e. deflection
    • G02F1/294 Variable focal length devices

Definitions

  • the present disclosure relates to the field of display technologies, and in particular, to a refractive adjustment method of an augmented reality device and an apparatus thereof, and an augmented reality device.
  • Augmented Reality is a new technology that combines real-world information with virtual information. It is characterized by applying virtual information to the real environment, and can integrate the physical and virtual information in the real environment into the same picture or space, thereby achieving a sensory experience that transcends reality.
  • a refractive adjustment method of an augmented reality device comprising: receiving light reflected from two eyes of a user wearing the augmented reality device; determining the user's interpupillary distance based on the reflected light; and generating a refractive correction signal based on the user's interpupillary distance and desired diopter to correct the two-eye diopter of the user by the refractive adjustment element in the augmented reality device.
  • the generating a refractive correction signal according to the user's interpupillary distance and the desired diopter to correct the two-eye diopter by the refractive adjustment element comprises: generating a focus adjustment signal according to the desired diopter to adjust the focal length of the refractive adjustment element for a preliminary correction of the diopter of the two eyes; and generating an optical axis adjustment signal according to the interpupillary distance of the user to adjust the optical axis position of the refractive adjustment element to further correct the preliminarily corrected two-eye diopter.
  • the determining, according to the reflected light, the user's interpupillary distance comprises: respectively acquiring a left eye image and a right eye image of the user according to the reflected light; performing image recognition on the left eye image and the right eye image respectively to determine a left eye pupil position and a right eye pupil position; and determining the user's interpupillary distance according to the left eye pupil position and the right eye pupil position.
  • the method further includes: acquiring a muscle feature image around the two eyes; determining, according to the muscle feature image, whether the correction of the two-eye diopter is appropriate; and, if not, generating a refractive adjustment signal to adjust the diopter of both eyes by the refractive adjustment element.
  • the method further includes: storing, according to the identity information of the user, a refractive correction signal generated by the user.
  • the method further includes: acquiring iris feature information of the two eyes according to the reflected light; determining identity information of the user according to the iris feature information of the two eyes; acquiring the stored corresponding refractive correction signal based on the determined identity information; and correcting the diopter of both eyes by the refractive adjustment element according to the acquired refractive correction signal.
  • a refractive adjustment control apparatus for an augmented reality device, comprising: a reflected light receiving unit, a pupil distance determining unit, and a refractive correction signal generating unit.
  • the reflected light receiving unit is configured to receive light reflected from the eyes of the user wearing the augmented reality device.
  • the pupil distance determining unit is configured to determine a pupil distance of the user according to the reflected light;
  • the refractive correction signal generating unit is configured to generate a refractive correction signal according to the user's interpupillary distance and the desired diopter, so as to correct the two-eye diopter of the user by the refractive adjustment element in the augmented reality device.
  • an augmented reality device comprising: a refractive adjustment control device and a refractive adjustment element as described above and in other portions herein.
  • the refractive adjustment element is disposed on a side close to both eyes of the user for correcting the diopter of both eyes in accordance with the refractive correction signal generated by the refractive adjustment control means.
  • the augmented reality device further comprises a light source and an optical coupling element.
  • the light source is used to emit light.
  • the light coupling element is configured to couple the light emitted by the light source to the refractive adjustment element so that it is incident into the visual field of the user, and to reversely couple the light reflected from the eyes of the user to the refractive adjustment control device.
  • the optical coupling element comprises a light splitting element and an optical waveguide element.
  • the light splitting element is located between the light source and the optical waveguide element.
  • the light emitted by the light source is reflected by the light splitting element into the optical waveguide element, and is coupled into the refractive adjustment element via the optical waveguide element and then incident into the field of view of the user.
  • the light reflected from the eyes of the user passes through the refractive adjustment element, is reversely coupled to the light splitting element through the optical waveguide element, and is reflected by the light splitting element into the refractive adjustment control device.
  • the light splitting element comprises a beam splitting prism and a dichroic mirror.
  • the beam splitting prism is located between the light source and the dichroic mirror, and the dichroic mirror is located between the beam splitting prism and the optical waveguide element.
  • Light emitted by the light source is incident on the dichroic mirror through the beam splitting prism and is reflected into the optical waveguide element by the dichroic mirror.
  • Light reflected from the two eyes of the user is reversely coupled to the dichroic mirror by the optical waveguide element, reflected by the dichroic mirror to the beam splitting prism, and then reflected by the beam splitting prism into the refractive adjustment control device.
  • the augmented reality device further includes: a micro display element and a collimating element.
  • the microdisplay element is used to display an image, the wavelength of the light of the displayed image being different from the wavelength of the light emitted by the light source.
  • a collimating element is located between the microdisplay element and the dichroic mirror.
  • the dichroic mirror is located between the collimating element and the optical waveguide element. The light of the displayed image is collimated by the collimating element, passes through the dichroic mirror, enters the optical waveguide element, is output through the optical waveguide element to the refractive adjustment element, and is then incident into the field of view of the user.
  • FIG. 1 is a flowchart of an operation of a refractive adjustment method of an augmented reality device according to an embodiment of the present disclosure
  • FIG. 2 is a flowchart showing the operation of a refractive adjustment method of an augmented reality device according to another embodiment of the present disclosure
  • FIG. 3 is a flowchart showing the operation of a refractive adjustment method of an augmented reality device according to still another embodiment of the present disclosure
  • FIG. 4 is a flowchart showing the operation of a refractive adjustment method of an augmented reality device according to another embodiment of the present disclosure
  • FIG. 5 is a flowchart of an operation of a refractive adjustment method of an augmented reality device according to still another embodiment of the present disclosure
  • FIG. 6 is a block diagram of a refraction adjustment control apparatus of an augmented reality device according to still another embodiment of the present disclosure.
  • FIG. 7 is a schematic structural diagram of an augmented reality device according to an embodiment of the present disclosure.
  • FIG. 8 is a schematic structural diagram of an augmented reality device according to another embodiment of the present disclosure.
  • augmented reality devices are typically designed for users with normal vision and are not suitable for users with abnormal vision.
  • when a near-sighted or far-sighted user uses such an augmented reality device, in order to view an image incident on the eye clearly, he or she must wear myopia glasses or hyperopia glasses before putting on the augmented reality device. This affects the wearing comfort of the augmented reality device.
  • the embodiment of the present disclosure provides a method for adjusting the refractive power of the augmented reality device. As shown in FIG. 1 , the method includes the following steps.
  • Step S10 Receive light reflected from two eyes of a user wearing the augmented reality device
  • Step S20 determining an interpupillary distance of the user according to the reflected light
  • Step S30 generating a refractive correction signal according to the user's interpupillary distance and desired dioptric power to correct the two-eye diopter of the user by the refractive adjustment element of the augmented reality device.
  • the user can observe the light incident into the field of view of the user's eyes after wearing the augmented reality device.
  • the light may be from a natural light source or a light source included in the augmented reality device. This light can be reflected by the eye. Light reflected by both eyes (including the left and right eyes) can be received by a camera or some specialized optical sensor.
  • the user's interpupillary distance refers to the distance between the left eye pupil and the right eye pupil.
  • the user's interpupillary distance may be the distance from the center of the left eye pupil to the center of the right eye pupil; it may also be the distance from the outer edge of the left eye pupil (near the temple side of the face) to the inner edge of the right eye pupil (near the nose side of the face), or the distance from the inner edge of the left eye pupil to the outer edge of the right eye pupil.
  • the effect of the eye refracting light is called refraction.
  • the ability of refraction is expressed in terms of power, called diopter.
  • Diopter is the unit of the refractive power, expressed as D.
  • Glasses can be worn when the eyes have myopia or hyperopia problems.
  • the diopter of the eye itself is corrected by the lenses of the eyeglasses so that the eye can see a clear image.
  • the diopter of an eyeglass lens is generally expressed in degrees; the degree equals the absolute value of the diopter D multiplied by 100.
  • a myopic lens is a concave lens, and a hyperopic lens is a convex lens.
  • the diopter of a myopic lens is negative (-D), and the diopter of a hyperopic lens is positive (+D).
  • for example, a 200-degree myopic lens has a diopter of -2D, and a 150-degree hyperopic lens has a diopter of +1.5D.
  • the diopter of a lens is related to its focal length: the greater the absolute value of the diopter, the shorter the focal length (f = 1/D, with f in meters).
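The degree/diopter/focal-length relationships above can be sketched as follows; this is an illustrative aid, not part of the patent:

```python
# Illustrative sketch: relationship between an eyeglass "degree",
# its diopter D, and its focal length f = 1/D (f in meters).

def degree_to_diopter(degree: float, myopic: bool) -> float:
    """A 200-degree myopic lens is -2 D; a 150-degree hyperopic lens is +1.5 D."""
    d = degree / 100.0
    return -d if myopic else d

def focal_length_m(diopter: float) -> float:
    """Focal length in meters; a larger |D| means a shorter focal length."""
    return 1.0 / diopter
```

For example, `degree_to_diopter(200, myopic=True)` gives -2.0 D, whose focal length is -0.5 m (a concave lens).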
  • the diopter adjustment element is used to correct the diopter of both eyes and can be equivalent to two lenses located in front of the two eyes.
  • the lens can be a convex lens or a concave lens.
  • the distance between the center of the left ophthalmic lens and the center of the right ophthalmic lens, i.e., the optical center distance, should match the user's interpupillary distance.
  • the desired diopter may refer to the diopter required to correct the diopter of the user's eyes to a normal condition using a refractive adjustment element.
  • the desired diopter can be set according to user needs. For different users, the diopter is expected to be different because the diopter of the eye itself may be different.
  • the user can set the desired diopter by manually entering or voice-inputting its value in the augmented reality device.
  • the user may also select a suitable desired diopter from a plurality of preset desired diopters in the augmented reality device.
  • the selection of the desired diopter can be made by a button or a knob.
  • correction of the user's two-eye diopter by the refractive adjustment element is not only related to the desired diopter, but also to the user's interpupillary distance. Accordingly, in the present embodiment, the refractive correction signal is generated based on the user's interpupillary distance and desired dioptric power. The refractive correction signal is used as an input signal of the refractive adjustment element, and the diopter of both eyes is corrected by the adjustment action of the refractive adjustment element.
  • the refractive adjustment element can adjust the focal length of the "equivalent lens" and the position of its optical axis according to the refractive correction signal, thereby functioning as a myopic lens or a hyperopic lens to correct the myopic or hyperopic refraction of both eyes.
  • in the refractive adjustment method, correcting the diopter of both eyes by the refractive adjustment element takes into account not only the desired diopter but also the interpupillary distance of the user.
  • the diopter can thus be corrected adaptively for the two-eye diopter of different users wearing the augmented reality device, correcting both myopia and hyperopia.
  • the effect of correcting the diopter of both eyes is good.
  • the augmented reality device using the refractive adjustment method does not require additional wearing of myopia glasses or farsight glasses, thereby improving the comfort of use of the augmented reality device.
  • generating a refractive correction signal based on the user's interpupillary distance and desired dioptric power to correct the two-eye diopter by the refractive adjustment element includes:
  • Step S31 generating a focus adjustment signal according to the desired diopter, to perform preliminary correction on the diopter of the two eyes by adjusting the focal length of the refractive adjustment element;
  • Step S32 generating an optical axis adjustment signal according to the user's pupil distance to correct the preliminary corrected two-eye diopter by adjusting the optical axis position of the refractive adjustment element.
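A minimal sketch of the two-step signal generation in steps S31 and S32; the signal structure, units, and the midline-symmetric placement of the optical axes are assumptions for illustration, not the patent's implementation:

```python
# Hypothetical sketch of the two-step correction (steps S31/S32).
# Field names and the midline-symmetric axis placement are assumptions.
from dataclasses import dataclass

@dataclass
class RefractiveCorrectionSignal:
    focal_length_m: float  # S31: focal length of the "equivalent lens" (f = 1/D)
    left_axis_mm: float    # S32: optical axis position of the left-eye lens
    right_axis_mm: float   # S32: optical axis position of the right-eye lens

def generate_correction_signal(desired_diopter: float,
                               ipd_mm: float) -> RefractiveCorrectionSignal:
    # Step S31: focus adjustment signal from the desired diopter.
    f = 1.0 / desired_diopter
    # Step S32: place each optical axis half the interpupillary distance
    # from the device midline so the lens centers align with the pupils.
    half = ipd_mm / 2.0
    return RefractiveCorrectionSignal(f, -half, +half)
```

For a -2 D myope with a 64 mm interpupillary distance, this yields a -0.5 m focal length and optical axes at ±32 mm from the midline.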
  • the diopter adjustment element can be equivalent to two "equivalent lenses" located in front of the two eyes.
  • the "equivalent lens” may be a convex lens or a concave lens.
  • the diopter of the equivalent lens changes as its focal length changes. Correction of the diopter of both eyes can be achieved by adjusting the focal length of the "equivalent lens". This allows the diopter of the refractive adjustment element to reach or approach the desired diopter.
  • the correction of the diopter of both eyes is finally realized by two steps.
  • a focus adjustment signal is generated based on the desired diopter, and the refractive adjustment element performs a preliminary correction of the diopter of both eyes by adjusting its focal length according to the focus adjustment signal. Since the adjustment of the diopter of both eyes is related not only to the focal length of the "equivalent lens" but also to the distance between the center of the left-eye "equivalent lens" and the center of the right-eye "equivalent lens", this distance needs to be matched with the user's interpupillary distance to achieve a good correction.
  • the center of the "equivalent lens” of the left eye that is, the optical axis of the "equivalent lens” of the left eye passes through the position of the "equivalent lens”.
  • the center of the "equivalent lens” of the right eye that is, the optical axis of the right eye “equivalent lens” passes through the position of the "equivalent lens”. Therefore, further, the optical axis adjustment signal is generated according to the user's pupil distance.
  • the refractive adjustment element adjusts its optical axis position according to the optical axis adjustment signal, including adjusting the optical axis position of the "equivalent lens” of the left eye and the optical axis position of the "equivalent lens” of the right eye.
  • in this way, the correction of the refractive power of the two eyes is matched with the user's interpupillary distance, the correction effect on the diopter of the two eyes is better, and the left eye and the right eye of the user wearing the augmented reality device achieve a better visual effect when viewing the image.
  • the refractive adjustment element may be a liquid crystal lens, a transmissive spatial light modulator (SLM), a liquid lens, or an ultrasonic grating.
  • the method in which the refractive adjustment element corrects the diopter according to the refractive correction signal can be referred to the related art.
  • the refractive adjustment element can be a liquid crystal lens.
  • the liquid crystal lens includes a liquid crystal layer and an electrode.
  • the liquid crystal molecules in the liquid crystal layer have birefringence characteristics and characteristics that vary with electric field distribution. Electrodes may be provided at a plurality of locations of the liquid crystal layer for forming an electric field applied in the liquid crystal layer.
  • the refractive correction signals can be implemented as corresponding voltage signals applied to electrodes at different locations.
  • the liquid crystal lens is equivalent to a convex lens or a concave lens by the application of the voltage signal.
  • the focal length and optical axis position of the "equivalent lens" are determined by the voltage signal. Accordingly, the liquid crystal lens can correct the diopter of both eyes.
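As an illustration of how a voltage signal could realize the "equivalent lens", the hypothetical sketch below drives a row of electrodes with a quadratic voltage profile (a lens phase is quadratic in the radial coordinate). The electrode layout, gain, and voltage mapping are assumptions, not the patent's drive scheme:

```python
# Hypothetical liquid crystal lens drive: a symmetric, quadratic
# voltage profile across the aperture approximates a lens phase.
def electrode_voltages(diopter: float, n_electrodes: int = 9,
                       aperture_mm: float = 4.0, gain: float = 1.0) -> list[float]:
    """Return one voltage per electrode across the lens aperture.

    The voltage varies quadratically with the radial distance r from
    the optical axis; shifting the zero of r would move the axis.
    """
    half = aperture_mm / 2.0
    voltages = []
    for i in range(n_electrodes):
        # radial position of electrode i (mm), centered on the optical axis
        r = -half + i * aperture_mm / (n_electrodes - 1)
        voltages.append(gain * diopter * r * r)
    return voltages
```

The sign of the diopter flips the profile, mirroring the concave/convex distinction between myopic and hyperopic correction.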
  • determining the user's interpupillary distance according to the reflected light includes:
  • Step S21 acquiring a left eye image and a right eye image of the user according to the reflected light rays respectively;
  • Step S22 performing image recognition on the left eye image and the right eye image respectively to determine a left eye pupil center position and a right eye pupil center position;
  • Step S23 determining the user's pupil distance according to the center position of the pupil of the left eye and the center position of the pupil of the right eye.
  • the left eye image and the right eye image are acquired based on the reflected light.
  • Image recognition can then be performed by an image processing algorithm.
  • the size and position of the pupil, etc. are obtained by acquiring the image feature values of the pupil.
  • the center position of the left eye pupil is determined by the left eye image
  • the center position of the right eye pupil is determined by the right eye image.
  • the above-mentioned left eye pupil center position and right eye pupil center position may be relative positions of the pupil center with respect to the calibration point.
  • the calibration point can be a predetermined reference point. Further, the distance between the center of the left eye pupil and the center of the right eye pupil, that is, the user's pupil distance, is determined according to the center position of the pupil.
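Once image recognition has located the two pupil centers, the interpupillary distance follows directly. A minimal sketch, assuming pixel coordinates relative to the calibration point and a known camera scale factor:

```python
# Sketch of steps S21-S23: compute the interpupillary distance from the
# two pupil-center positions. The mm-per-pixel factor is an assumed
# camera calibration value, not specified in the patent.
import math

def interpupillary_distance_mm(left_center_px: tuple[float, float],
                               right_center_px: tuple[float, float],
                               mm_per_pixel: float) -> float:
    dx = right_center_px[0] - left_center_px[0]
    dy = right_center_px[1] - left_center_px[1]
    # Euclidean distance between pupil centers, scaled to millimeters
    return math.hypot(dx, dy) * mm_per_pixel
```

The same function works whether the positions are pupil centers or, as the alternative above notes, ipsilateral pupil edges.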
  • the user's interpupillary distance can also be determined by other means.
  • the user's interpupillary distance can also be determined by determining the position of the ipsilateral edge of the left eye pupil and the right eye pupil.
  • the method according to an embodiment of the present disclosure may further include:
  • Step S41 acquiring a muscle feature image around the two eyes
  • Step S42 determining whether the correction of the diopter of the two eyes is appropriate according to the muscle feature image
  • Step S43 if not, generating a refractive adjustment signal to adjust the diopter of both eyes by the refractive adjustment element. If appropriate, the correction is ended.
  • the expected correction effect can usually be achieved. However, in some cases there may be some error such that the diopter of the refractive adjustment element does not exactly match the actually desired diopter. In that case, when the user observes the image through the augmented reality device, the muscles around the eyes behave differently from the case of a correct diopter correction; there may be a large contraction or expansion of the muscles around the eyes. Accordingly, in the present embodiment, the muscle feature image around the eyes can be further acquired, and whether the correction of the diopter of both eyes is appropriate is determined based on the muscle feature image.
  • the muscle feature image around the eyes of a user viewing normally can be used as a reference image, and whether the muscles around the eyes have a large contraction or expansion, that is, whether the diopter correction effect perceived by the user is appropriate, can be judged by comparing the currently acquired muscle feature image with the reference image. If it is determined that the muscles have a large contraction or expansion, the diopter correction effect perceived by the user is not appropriate.
  • the refractive adjustment signal can be further generated based on, for example, the degree of muscle contraction or expansion. For example, the refractive adjustment signal can indicate an increase or decrease in desired diopter at a certain step size.
  • the voltage applied to the liquid crystal lens can be adjusted in accordance with a refractive adjustment signal to fine tune, for example, the focal length of the liquid crystal lens. Correction of the diopter of both eyes can be made more suitable by further fine-tuning the diopter of the two eyes according to the refractive adjustment signal.
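The feedback fine-tuning described above can be sketched as a simple loop; the `muscle_strain` measure, its sign convention, and the threshold and step values are illustrative assumptions rather than the patent's method:

```python
# Hypothetical feedback loop: compare the current muscle-feature image
# against a relaxed-viewing reference (abstracted here as a signed
# `muscle_strain` score) and nudge the diopter by a fixed step until
# the correction is judged appropriate.
def fine_tune_diopter(desired_diopter: float, muscle_strain, *,
                      threshold: float = 0.1, step: float = 0.25,
                      max_iters: int = 20) -> float:
    for _ in range(max_iters):
        strain = muscle_strain(desired_diopter)  # signed strain estimate
        if abs(strain) <= threshold:             # correction is appropriate
            break
        # refractive adjustment signal: increase or decrease by one step
        desired_diopter += -step if strain > 0 else step
    return desired_diopter
```

With a strain model that vanishes at -2.0 D, starting from -1.0 D the loop steps down in 0.25 D increments until it settles at -2.0 D.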
  • the refractive adjustment method may further include:
  • Step S51 Acquire iris characteristic information of the user according to the reflected light
  • Step S52 Determine user identity information according to iris feature information.
  • Step S53 after determining the user identity information, correcting the diopter of both eyes by the refractive adjustment element according to the stored corresponding refractive correction signal.
  • the iris is located in the middle layer of the eyeball, at the forefront of the vascular membrane, in front of the ciliary body.
  • the iris adjusts the size of the pupil and regulates how much light enters the eye.
  • the center of the iris is the pupil.
  • the iris is highly unique, that is, each person's iris is different. Therefore, the iris can be used to identify the user. Accordingly, in this embodiment, the iris feature information of the user may be further acquired according to the reflected light.
  • the iris feature information may be information indicating characteristics such as the size, color, texture, and fiber tissue distribution of the iris.
  • the user identity information may be determined according to the feature information, and the iris feature information and the user identity information may be stored in advance, for example, the correspondence relationship between the iris feature information and the user identity information is stored. Further, the user identity information can be determined according to the acquired iris feature information. Alternatively or additionally, the iris feature information may be identified as it is acquired. This identifier represents user identity information.
  • the generated refractive correction signal may be stored corresponding to the user's identification information after the refractive correction signal is generated according to the user's interpupillary distance and the desired dioptric power.
  • the generated refractive correction signal may be stored in the form of a lookup table in which the user identity information has a one-to-one correspondence with the refractive correction signal. In this way, after the next time the user identity information is determined, the diopter correction can be directly performed according to the stored refractive correction signal corresponding to the user identity information, without re-generating the refractive correction signal. In this way, the refractive adjustment element can be automatically corrected for the diopter of both eyes to realize the function of memory storage.
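The memory-storage function can be sketched as a lookup table keyed by user identity; the class and method names below are assumptions for illustration:

```python
# Sketch of the lookup-table memory storage described above: user
# identity (e.g. derived from iris features) maps one-to-one to a
# previously generated refractive correction signal, so a returning
# user is corrected without regenerating the signal.
class CorrectionStore:
    def __init__(self) -> None:
        self._table: dict[str, object] = {}  # identity -> correction signal

    def save(self, user_id: str, correction_signal) -> None:
        self._table[user_id] = correction_signal

    def lookup(self, user_id: str):
        """Return the stored signal, or None for a new user."""
        return self._table.get(user_id)
```

On a cache miss (`None`), the device would fall back to the full flow: determine the interpupillary distance, generate a new refractive correction signal, and save it under the user's identity.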
  • the embodiment of the present disclosure further provides a refractive adjustment control device for the augmented reality device.
  • the refractive adjustment control device 06 includes:
  • a reflected light receiving unit 61 configured to receive light reflected from two eyes of the user wearing the augmented reality device
  • an interpupillary distance determining unit 62 configured to determine the interpupillary distance of the user according to the reflected light
  • the refractive correction signal generating unit 63 is configured to generate a refractive correction signal according to the user's interpupillary distance and the desired dioptric power to correct the two-eye diopter by the refractive adjustment element.
  • the refractive adjustment control device can meet the calibration requirements of different users for diopter, and the correction effect on the diopter of both eyes is good. Moreover, the augmented reality device using the refractive adjustment control device does not require additional wearing of myopia glasses or hyperopic glasses, and the comfort of use of the augmented reality device can be improved.
  • the device in this embodiment may be implemented by software, by software plus necessary general-purpose hardware, or by hardware.
  • a computer program or computer program element is provided that is adapted to execute, on a suitable system (computing device or processor), the method steps of the method according to one of the preceding embodiments.
  • one or more operations of the described method steps can constitute computer-readable instructions stored on one or more computer-readable media which, when executed by a computing device, cause the computing device to perform the described operations.
  • Embodiments of the present disclosure also provide an augmented reality device including a refractive adjustment control device and a refractive adjustment element in accordance with an embodiment of the present disclosure.
  • the refractive adjustment element is disposed on the side close to the user's two eyes and corrects the diopter of both eyes according to the refractive correction signal generated by the refractive adjustment control device.
  • the augmented reality device further includes a light source for emitting light, and a light coupling element for coupling the light emitted by the light source out to the refractive adjustment element and then into the user's field of view, and for coupling the light reflected by the user's eyes back out to the refractive adjustment control device.
  • the light source can be a plurality of types of light sources, such as a Light Emitting Diode (LED) or an Organic Light-Emitting Diode (OLED).
  • the optical coupling element couples the light emitted by the light source: the light is coupled out through the optical coupling element to the refractive adjustment element and then incident into the user's field of view.
  • the light can be reflected by both eyes; after passing through the refractive adjustment element, the reflected light is coupled back through the optical coupling element to the refractive adjustment control device.
  • the refractive adjustment control device generates a refractive correction signal based on the user's interpupillary distance and desired diopter.
  • the refractive correction signal can be transmitted to the refractive adjustment element, which corrects the diopter of both eyes based on it.
  • the augmented reality device provided by the embodiment of the present disclosure is provided with a refractive adjustment element, and the refractive adjustment element can correct the diopter of both eyes according to the user's interpupillary distance and desired diopter.
  • FIG. 7 is a schematic structural diagram of an augmented reality device according to an exemplary embodiment of the present disclosure. The working principle of the augmented reality device will be described below with reference to FIG. 7 .
  • the augmented reality device includes a light source 100, a light coupling element 110, a refractive adjustment control device 120, and a refractive adjustment element 130.
  • the optical coupling element 110 includes a beam splitting element 111 and an optical waveguide element 112.
  • the light splitting element 111 is located between the light source 100 and the optical waveguide element 112.
  • the light emitted by the light source 100 is reflected by the light splitting element 111 into the optical waveguide element 112, coupled out via the optical waveguide element 112 to the refractive adjustment element 130, and then incident into the field of view of both eyes (including the left eye 201 and the right eye 202).
  • Light reflected from both eyes of the user (including the left eye 201 and the right eye 202) is reversely coupled to the beam splitting element 111 through the optical waveguide element 112, and is reflected by the beam splitting element 111 into the refractive adjustment control device 120.
  • the light splitting element 111 may be an optical prism that reflects light emitted from the light source into the optical waveguide element 112.
  • the optical waveguide component 112 can be a substrate 1121 having two oppositely disposed reflective surfaces.
  • a coupling input prism 1122 and a coupling output prism 1123 are disposed in the substrate 1121.
  • the coupling input prism 1122 receives the light from the beam splitting element 111 and reflects it into the substrate 1121. The light undergoes multiple total internal reflections between the two oppositely disposed reflective surfaces of the substrate 1121 and is then incident on the coupling output prism 1123.
  • the coupling output prism 1123 reflects the light out of the substrate. The light is then incident on the refractive adjustment element 130 and, after passing through it, enters the user's field of view.
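Light stays confined inside the substrate only while it strikes the two reflective surfaces beyond the critical angle. A short check of that total-internal-reflection condition (the refractive indices are assumed example values, e.g. glass against air, not figures from the patent):

```python
import math

def critical_angle_deg(n_core, n_clad):
    """Critical angle for total internal reflection at a core/cladding
    boundary: sin(theta_c) = n_clad / n_core."""
    return math.degrees(math.asin(n_clad / n_core))

def is_totally_reflected(incidence_deg, n_core=1.5, n_clad=1.0):
    """True if a ray hitting the substrate surface at this angle (measured
    from the surface normal) undergoes total internal reflection rather
    than escaping the waveguide."""
    return incidence_deg > critical_angle_deg(n_core, n_clad)
```

For glass (n ≈ 1.5) against air, the critical angle is about 41.8°, so rays bouncing at steeper grazing angles than that are trapped between the two surfaces until the coupling output prism extracts them.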
  • the light reflected from the eyes of the user passes through the refractive adjustment element 130 and can be coupled back out through the optical waveguide element 112 to the refractive adjustment control device 120.
  • light is incident from the refractive adjustment element 130 on the coupling output prism 1123; after being reflected there, it undergoes multiple total internal reflections between the two oppositely disposed reflective surfaces in the substrate 1121 and is then incident on the coupling input prism 1122.
  • the coupling input prism 1122 reflects the light out of the substrate, and the light is then incident on the refractive adjustment control device 120.
  • the optical coupling element is not limited to the structure shown in the figures, but may be other types of optical elements.
  • the refractive adjustment control device may perform, for example, the determination of the interpupillary distance and/or the generation of the refractive correction signal by means of a microprocessor having data calculation or processing capabilities.
  • the light splitting element 111 includes a beam splitting prism 1111 and a dichroic mirror 1112.
  • the beam splitting prism 1111 is located between the light source 100 and the dichroic mirror 1112, and the dichroic mirror 1112 is located between the beam splitting prism 1111 and the optical waveguide element 112.
  • the light emitted from the light source 100 passes through the beam splitting prism 1111, is incident on the dichroic mirror 1112, and is reflected by the dichroic mirror 1112 into the optical waveguide element 112.
  • the light splitting element in this embodiment includes a beam splitting prism and a dichroic mirror.
  • the light from the light source can be incident directly on the dichroic mirror through the beam splitting prism.
  • the light reflected from the user's two eyes exits the optical waveguide element, is reflected by the dichroic mirror to the beam splitting prism, and, after reflection by the beam splitting prism, enters the refractive adjustment control device. The light splitting element thereby separates the light incident from the light source from the light reflected from the user's eyes, so that the refractive adjustment control device can receive the reflected light and generate a refractive correction signal from it to correct the diopter of both eyes through the refractive adjustment element.
  • the beam splitting prism can be two oppositely disposed triangular prisms as shown. The opposing faces of the two triangular prisms form a splitting and reflecting surface that achieves the splitting effect.
  • the beam splitting prism may also be composed of other lenses or prisms, and is not limited to the structure shown in the drawings.
  • the augmented reality device may further include a micro display element 140 and a collimating element 150.
  • Microdisplay element 140 can be used to display an image.
  • the wavelength of the light of the displayed image is different from the wavelength of the light emitted by the light source 100.
  • the collimating element 150 can be positioned between the microdisplay element 140 and the dichroic mirror 1112.
  • Dichroic mirror 1112 is located between collimating element 150 and optical waveguide element 112.
  • the light of the displayed image is collimated by the collimating element 150, passes directly through the dichroic mirror 1112 into the optical waveguide component 112, is coupled out through the optical waveguide component 112 to the refractive adjustment component 130, and is then incident into the user's field of view.
  • the micro display element 140 may be a liquid crystal display panel or an organic light emitting diode display panel for displaying an image.
  • the wavelength of the light of the displayed image is different from the wavelength of the light emitted by the light source.
  • the image light is visible light, with a wavelength range usually between 380 and 780 nm.
  • the light source can be an infrared source that emits infrared light. Infrared light has a longer wavelength than visible light and typically has a wavelength range between 770 nm and 1 mm.
  • the dichroic mirror can function as a light splitter.
  • the light of the displayed image can pass directly through the dichroic mirror, while the light of the light source reflected from the user's eyes is reflected by the dichroic mirror into the beam splitting prism and then enters the refractive adjustment control device through the beam splitting prism.
  • the dichroic mirror separates the image light from the light emitted by the light source; the two kinds of light do not interfere with each other but are each incident on the corresponding components.
  • the image light is collimated into parallel light by the collimating element, is incident on the dichroic mirror, passes directly through it into the optical waveguide component, is coupled out through the optical waveguide component to the refractive adjustment component, and is incident into the user's field of view.
  • the image can then be observed by both eyes of the user.
  • the image is fused as a virtual image with the objects in the real environment actually observed by the two eyes to achieve the augmented reality effect.
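Because the image light (visible, roughly 380–780 nm) and the probe light (infrared, longer than 780 nm) occupy disjoint wavelength bands, the dichroic mirror's routing can be modelled as a simple wavelength threshold. A sketch under the assumption of an ideal split at 780 nm (the function name and the hard cutoff are illustrative assumptions):

```python
def route_at_dichroic(wavelength_nm):
    """Idealized dichroic mirror: visible image light is transmitted toward
    the optical waveguide; longer-wavelength infrared light (from the light
    source, reflected off the eyes) is reflected toward the beam splitting
    prism and on to the refractive adjustment control device."""
    if 380 <= wavelength_nm <= 780:
        return "transmitted"   # displayed image light
    if wavelength_nm > 780:
        return "reflected"     # infrared probe light
    raise ValueError("wavelength below the modelled visible range")
```

A real dichroic coating has a transition band rather than a hard cutoff; the point of the sketch is only that the two light paths are separated by wavelength, so they do not interfere with each other.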
  • the coupling output element may employ a stacked prism comprising a plurality of reflective prisms.
  • exit pupil expansion can be achieved by having the light reflected or refracted by the stacked prism before it exits into both eyes.
  • the collimating element can be a convex lens or another optical element composed of a combined lens group.
  • the dichroic mirror may be a reflective prism having a reflecting surface or the like.
  • "exemplary" is used herein to mean serving as an example, instance, or illustration, and not necessarily as advantageous.
  • "or" is intended to mean an exclusive "or" rather than an inclusive "or".
  • unless specified otherwise or clear from context, "a" or "an" is generally to be construed to mean "one or more", and "at least one of A and B" generally means A or B or both A and B.
  • to the extent that the terms "including", "has", "having", "with", and/or variants thereof are used in the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprising".

Abstract

A refractive adjustment method for an augmented reality device, a corresponding control device, and an augmented reality device. The refractive adjustment method includes: receiving light reflected from the two eyes of a user wearing the augmented reality device (10); determining the user's interpupillary distance from the reflected light (20); and generating a refractive correction signal according to the interpupillary distance and a desired diopter, so that a refractive adjustment element corrects the diopter of both eyes (30). The refractive adjustment method can improve the comfort of using the augmented reality device.

Description

Refractive Adjustment Method and Device, and Augmented Reality Apparatus
Related Application
This application claims priority to Chinese Patent Application No. 201810252227.3, filed on March 26, 2018; the entire disclosure of that Chinese patent application is incorporated herein by reference as part of this application.
Technical Field
The present disclosure relates to the field of display technology, and in particular to a refractive adjustment method for an augmented reality device, a corresponding control device, and an augmented reality device.
Background
Augmented Reality (AR) is a technology that fuses real-world information with virtual information. It applies virtual information to a real environment, merging physical objects in the real environment and virtual information into a single picture or space, thereby providing a sensory experience that goes beyond reality.
However, current augmented reality devices are usually designed for users with normal vision and are not well suited to users whose vision is impaired.
Summary
According to a first aspect of embodiments of the present disclosure, a refractive adjustment method for an augmented reality device is provided, including: receiving light reflected from the two eyes of a user wearing the augmented reality device; determining the user's interpupillary distance from the reflected light; and generating a refractive correction signal according to the user's interpupillary distance and a desired diopter, so as to correct the diopter of the user's two eyes through a refractive adjustment element in the augmented reality device.
Optionally, generating the refractive correction signal according to the user's interpupillary distance and the desired diopter to correct the diopter of both eyes through the refractive adjustment element includes: generating a focal length adjustment signal according to the desired diopter, so that the refractive adjustment element performs a preliminary correction of the diopter of both eyes by adjusting its focal length; and generating an optical axis adjustment signal according to the user's interpupillary distance, so that the refractive adjustment element re-corrects the preliminarily corrected diopter of both eyes by adjusting its optical axis position.
Optionally, determining the user's interpupillary distance from the reflected light includes: obtaining a left-eye image and a right-eye image of the user from the reflected light; performing image recognition on the left-eye image and the right-eye image respectively to determine the left pupil position and the right pupil position; and determining the user's interpupillary distance from the left pupil position and the right pupil position.
Optionally, after the diopter of the user's two eyes is corrected through the refractive adjustment element, the method further includes: obtaining muscle feature images of the area around the two eyes; and, when it is determined from the muscle feature images that the correction of the diopter of both eyes is unsuitable, generating a refractive adjustment signal so that the refractive adjustment element adjusts the diopter of both eyes.
Optionally, the method further includes: storing the refractive correction signal generated for the user in correspondence with the user's identity information.
Optionally, the method further includes: obtaining iris feature information of both eyes from the reflected light; determining the user's identity information from the iris feature information; retrieving the stored corresponding refractive correction signal based on the determined identity information; and correcting the diopter of both eyes through the refractive adjustment element according to the retrieved corresponding refractive correction signal.
According to a second aspect of embodiments of the present disclosure, a refractive adjustment control device for an augmented reality device is provided, including a reflected light receiving unit, an interpupillary distance determining unit, and a refractive correction signal generating unit. The reflected light receiving unit is configured to receive light reflected from the two eyes of a user wearing the augmented reality device. The interpupillary distance determining unit is configured to determine the user's interpupillary distance from the reflected light. The refractive correction signal generating unit is configured to generate a refractive correction signal according to the user's interpupillary distance and the diopter of the two eyes, so as to correct the diopter of the user's two eyes through a refractive adjustment element of the augmented reality device.
According to a third aspect of embodiments of the present disclosure, an augmented reality device is provided, including the refractive adjustment control device described above and elsewhere herein, and a refractive adjustment element. The refractive adjustment element is disposed on the side close to the user's two eyes and is configured to correct the diopter of both eyes according to the refractive correction signal generated by the refractive adjustment control device.
Optionally, the augmented reality device further includes a light source and a light coupling element. The light source is configured to emit light. The light coupling element is configured to couple the light emitted by the light source out to the refractive adjustment element and then into the user's field of view, and to couple the light reflected from the user's two eyes back out to the refractive adjustment control device.
Optionally, the light coupling element includes a light splitting element and an optical waveguide element. The light splitting element is located between the light source and the optical waveguide element. Light emitted by the light source is reflected by the light splitting element into the optical waveguide element, coupled out by the optical waveguide element to the refractive adjustment element, and then incident into the user's field of view. Light reflected from the user's two eyes passes through the refractive adjustment element, is coupled back through the optical waveguide element to the light splitting element, and is reflected by the light splitting element into the refractive adjustment control device.
Optionally, the light splitting element includes a beam splitting prism and a dichroic mirror. The beam splitting prism is located between the light source and the dichroic mirror, and the dichroic mirror is located between the beam splitting prism and the optical waveguide element. Light emitted by the light source passes through the beam splitting prism, is incident on the dichroic mirror, and is reflected by the dichroic mirror into the optical waveguide element. Light reflected from the user's two eyes is coupled back through the optical waveguide element to the dichroic mirror, reflected by the dichroic mirror to the beam splitting prism, and then reflected by the beam splitting prism into the refractive adjustment control device.
Optionally, the augmented reality device further includes a micro display element and a collimating element. The micro display element is configured to display an image, and the wavelength of the light of the displayed image differs from the wavelength of the light emitted by the light source. The collimating element is located between the micro display element and the dichroic mirror, and the dichroic mirror is located between the collimating element and the optical waveguide element. After being collimated by the collimating element, the light of the displayed image passes through the dichroic mirror into the optical waveguide element, is coupled out by the optical waveguide element to the refractive adjustment element, and is then incident into the user's field of view.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
FIG. 1 is a flowchart of a refractive adjustment method for an augmented reality device according to an embodiment of the present disclosure;
FIG. 2 is a flowchart of a refractive adjustment method for an augmented reality device according to another embodiment of the present disclosure;
FIG. 3 is a flowchart of a refractive adjustment method for an augmented reality device according to yet another embodiment of the present disclosure;
FIG. 4 is a flowchart of a refractive adjustment method for an augmented reality device according to another embodiment of the present disclosure;
FIG. 5 is a flowchart of a refractive adjustment method for an augmented reality device according to yet another embodiment of the present disclosure;
FIG. 6 is a block diagram of a refractive adjustment control device for an augmented reality device according to yet another embodiment of the present disclosure;
FIG. 7 is a schematic structural diagram of an augmented reality device according to an embodiment of the present disclosure;
FIG. 8 is a schematic structural diagram of an augmented reality device according to yet another embodiment of the present disclosure.
Detailed Description
Exemplary embodiments will now be described in detail, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of devices and methods consistent with some aspects of the present disclosure as recited in the appended claims.
Current augmented reality devices are usually designed for users with normal vision and are not well suited to users with impaired vision. When a nearsighted or farsighted user uses such an augmented reality device, the user must first put on nearsighted or farsighted glasses and then wear the augmented reality device in order to see the image entering the eyes clearly. This impairs the comfort of using the augmented reality device.
To address the discomfort of wearing existing augmented reality devices, embodiments of the present disclosure provide a refractive adjustment method for an augmented reality device. As shown in FIG. 1, the method includes the following steps.
Step S10: receiving light reflected from the two eyes of a user wearing the augmented reality device;
Step S20: determining the user's interpupillary distance from the reflected light;
Step S30: generating a refractive correction signal according to the user's interpupillary distance and a desired diopter, so as to correct the diopter of the user's two eyes through a refractive adjustment element of the augmented reality device.
After putting on the augmented reality device, the user can observe light incident into the field of view of the two eyes. This light may come from a natural light source or from a light source included in the augmented reality device. The light can be reflected by the eyes, and the light reflected by the two eyes (including the left eye and the right eye) can be received by a camera or by dedicated optical sensors.
By analyzing the light reflected from the two eyes, the positions of the two pupils, and hence the user's interpupillary distance, can be determined. The user's interpupillary distance is the distance between the left pupil and the right pupil. It may be the distance from the center of the left pupil to the center of the right pupil, or the distance from the outer edge of the left pupil (on the temple side of the face) to the inner edge of the right pupil (on the nose side), or the distance from the inner edge of the left pupil to the outer edge of the right pupil.
The eye's action of refracting light is called refraction. Refractive power is expressed in optical power and is called diopter; the diopter, denoted D, is the unit of refractive power. When the eye has a vision problem such as myopia or hyperopia, it cannot form a clear image; that is, the diopter of the two eyes is abnormal.
When the eyes are nearsighted or farsighted, glasses can be worn. The lenses of the glasses correct the eye's own refraction so that the eye can see a clear image.
The refractive power of an eyeglass lens is usually expressed as a "degree", which is the diopter value D multiplied by 100. A myopia lens is a concave lens and a hyperopia lens is a convex lens; the diopter of a myopia lens is −D and that of a hyperopia lens is +D. For example, a 200-degree myopia lens has a diopter of −2D, and a 150-degree hyperopia lens has a diopter of +1.5D. A lens's diopter is related to its focal length: the larger the diopter, the shorter the focal length.
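The arithmetic just described — degree = diopter × 100, minus for myopia and plus for hyperopia, and focal length inversely related to diopter — can be put down as a small worked conversion. This is a sketch of the stated relations only, using the standard thin-lens relation f = 1/D (in metres); the function names are assumptions:

```python
def degree_from_diopter(diopter):
    """Eyeglass 'degree' as used in the text: the diopter value times 100.
    A -2.0 D myopia lens is a 200-degree lens; +1.5 D is 150 degrees."""
    return diopter * 100

def focal_length_m(diopter):
    """Thin-lens focal length in metres: f = 1/D. A larger |diopter| gives
    a shorter focal length; the sign is negative for concave (myopia)
    lenses and positive for convex (hyperopia) lenses."""
    if diopter == 0:
        raise ValueError("a zero-diopter lens has no finite focal length")
    return 1.0 / diopter
```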
The refractive adjustment element is used to correct the diopter of the two eyes and can be regarded as equivalent to two lenses placed in front of the eyes. Each lens may be a convex lens or a concave lens. The distance between the center of the left lens and the center of the right lens (that is, the optical center distance) should match the user's interpupillary distance. Otherwise, even if the diopter of the refractive adjustment element is correct, the user will feel discomfort and the eyes' vision will be affected.
The desired diopter may be the diopter required for the refractive adjustment element to correct the user's two-eye refraction to a normal state. It can be set according to the user's needs. Because the eyes' own refraction may differ from user to user, the desired diopter also differs among users. In one embodiment, the user may set the desired diopter by manually entering its value in the augmented reality device or by voice input. Alternatively or additionally, the user may select a suitable desired diopter from several desired diopters preset in the augmented reality device, for example via a button or a knob.
The correction of the user's two-eye diopter by the refractive adjustment element depends not only on the desired diopter but also on the user's interpupillary distance. Accordingly, in this embodiment, the refractive correction signal is generated according to the user's interpupillary distance and the desired diopter. The refractive correction signal serves as the input signal of the refractive adjustment element, and the diopter of both eyes is corrected through the adjusting action of the refractive adjustment element. For example, the refractive adjustment element may adjust the focal length and the optical axis position of its "equivalent lenses" according to the refractive correction signal, thereby acting as a myopia or hyperopia lens and correcting myopic or hyperopic refraction of the two eyes.
As described above, in the refractive adjustment method according to embodiments of the present disclosure, the correction of the two-eye diopter through the refractive adjustment element refers not only to the desired diopter but also to the user's interpupillary distance. The diopter can thus be corrected adaptively for the two-eye refraction of different users wearing the augmented reality device, correcting both myopia and hyperopia. At the same time, because the individual correction requirements of different users can be met, the correction of the two-eye diopter is effective. An augmented reality device using this refractive adjustment method does not require the user to additionally wear nearsighted or farsighted glasses, which improves the comfort of using the device.
In some examples, as shown in FIG. 2, generating the refractive correction signal according to the user's interpupillary distance and the desired diopter to correct the diopter of both eyes through the refractive adjustment element includes:
Step S31: generating a focal length adjustment signal according to the desired diopter, so that the refractive adjustment element performs a preliminary correction of the two-eye diopter by adjusting its focal length;
Step S32: generating an optical axis adjustment signal according to the user's interpupillary distance, so that the refractive adjustment element re-corrects the preliminarily corrected two-eye diopter by adjusting its optical axis position.
The refractive adjustment element can be regarded as two "equivalent lenses" in front of the two eyes, each of which may be a convex or concave lens. When the two-eye diopter is corrected, the diopter of an equivalent lens changes with its focal length, so the correction can be realized by adjusting the focal length of the "equivalent lens". In this way the diopter of the refractive adjustment element can reach or approach the desired diopter.
In this embodiment, the correction of the two-eye diopter is achieved in two steps. First, a focal length adjustment signal is generated according to the desired diopter, and the refractive adjustment element performs a preliminary correction by adjusting its focal length according to that signal. The correction depends not only on the focal length of the "equivalent lenses" but also on the distance between the center of the left "equivalent lens" and the center of the right "equivalent lens", which must match the user's interpupillary distance for a good correction result. The center of the left "equivalent lens" is the position where its optical axis passes through that lens, and likewise for the right "equivalent lens". Therefore, an optical axis adjustment signal is further generated according to the user's interpupillary distance, and the refractive adjustment element adjusts its optical axis positions accordingly, including the optical axis positions of both the left and the right "equivalent lens". Through this further adjustment of the optical axis positions, the correction performed by the refractive adjustment element matches the user's interpupillary distance, the two-eye diopter is corrected more effectively, and the user's left and right eyes obtain a better visual experience when viewing images through the augmented reality device.
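The two-step signal generation described above can be sketched as two small functions whose outputs are combined into one correction signal. This is a hypothetical illustration, not the patent's implementation; the symmetric placement of the two optical centers about the device midline, and all names, are assumptions:

```python
def focal_adjust_signal(desired_diopter):
    """Step 1 (preliminary correction): set the equivalent-lens focal
    length from the desired diopter via the thin-lens relation f = 1/D."""
    return {"focal_length_m": 1.0 / desired_diopter}

def axis_adjust_signal(ipd_mm):
    """Step 2 (re-correction): position the left and right equivalent-lens
    optical axes so their separation matches the user's interpupillary
    distance, assumed symmetric about the device midline."""
    half = ipd_mm / 2.0
    return {"left_axis_mm": -half, "right_axis_mm": +half}

def refractive_correction_signal(ipd_mm, desired_diopter):
    """Combine both steps into the refractive correction signal fed to the
    refractive adjustment element."""
    signal = focal_adjust_signal(desired_diopter)
    signal.update(axis_adjust_signal(ipd_mm))
    return signal
```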
The refractive adjustment element may be a liquid crystal lens, a transmissive Spatial Light Modulator (SLM), a liquid lens, an ultrasonic grating, or the like. For how the refractive adjustment element corrects the diopter according to the refractive correction signal, reference may be made to the related art. For example, the refractive adjustment element may be a liquid crystal lens, which includes a liquid crystal layer and electrodes. The liquid crystal molecules in the liquid crystal layer are birefringent, and their orientation varies with the electric field distribution. Electrodes may be placed at multiple positions of the liquid crystal layer to form the electric field applied in the layer. By applying corresponding voltages to the electrodes at different positions, corresponding electric fields act on the liquid crystal molecules at each position, so that the refractive index of the liquid crystal layer takes on a corresponding distribution, producing the effect of a convex lens that focuses light or a concave lens that diverges light. Accordingly, the refractive correction signal (including, for example, the focal length adjustment signal and the optical axis adjustment signal) can be implemented as corresponding voltage signals applied to the electrodes at different positions. Through these voltage signals the liquid crystal lens becomes equivalent to a convex or concave lens, and the focal length and optical axis position of the "equivalent lens" are determined by the voltage signals. The liquid crystal lens can thereby correct the diopter of the two eyes.
In an optional implementation, as shown in FIG. 3, determining the user's interpupillary distance from the reflected light includes:
Step S21: obtaining a left-eye image and a right-eye image of the user from the reflected light;
Step S22: performing image recognition on the left-eye image and the right-eye image respectively to determine the left pupil center position and the right pupil center position;
Step S23: determining the user's interpupillary distance from the left pupil center position and the right pupil center position.
In this embodiment, the left-eye image and the right-eye image are obtained from the reflected light, and image recognition can then be performed with image processing algorithms — for example, obtaining the size and position of the pupil from its image feature values. The left pupil center position is determined from the left-eye image, and the right pupil center position is determined from the right-eye image.
The left and right pupil center positions described above may be the positions of the pupil centers relative to a calibration point, which may be a predetermined reference point. The distance between the left pupil center and the right pupil center — that is, the user's interpupillary distance — is then determined from the pupil center positions.
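Steps S21–S23 can be sketched with a crude intensity-threshold pupil locator: the pupil is the darkest region of an eye image, so its center can be approximated by the centroid of the dark pixels. A real device would use a proper image-recognition pipeline; the threshold, the centroid method, and the coordinate mapping are all illustrative assumptions:

```python
def pupil_center(gray_image, threshold=50):
    """Centroid of the dark (pupil) pixels in a grayscale eye image,
    given as a list of rows of 0-255 intensities."""
    xs, ys, n = 0.0, 0.0, 0
    for y, row in enumerate(gray_image):
        for x, v in enumerate(row):
            if v < threshold:  # pupil pixels are the darkest region
                xs += x; ys += y; n += 1
    if n == 0:
        raise ValueError("no pupil-dark pixels found")
    return (xs / n, ys / n)

def interpupillary_distance(left_img, right_img, eye_offset_x):
    """IPD from the two eye images; eye_offset_x maps right-eye image
    coordinates into the shared calibrated frame (the 'calibration point'
    in the text), so the two centers are comparable."""
    lx, _ = pupil_center(left_img)
    rx, _ = pupil_center(right_img)
    return (rx + eye_offset_x) - lx
```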
It will be appreciated that the user's interpupillary distance can also be determined in other ways, for example by locating the same-side edges of the left pupil and the right pupil.
In an optional implementation, as shown in FIG. 4, after the two-eye diopter is corrected through the refractive adjustment element, the method according to embodiments of the present disclosure may further include:
Step S41: obtaining muscle feature images of the area around the two eyes;
Step S42: judging from the muscle feature images whether the correction of the two-eye diopter is suitable;
Step S43: if not, generating a refractive adjustment signal so that the refractive adjustment element adjusts the two-eye diopter; if suitable, ending the correction.
After the refractive adjustment element corrects the two-eye diopter, the expected correction effect is usually achieved. In some situations, however, a certain error may remain, so that the diopter of the refractive adjustment element does not exactly match the actually required desired diopter. When the user then observes an image through the augmented reality device, the muscles around the two eyes differ from their state under a correct diopter correction and may contract or expand noticeably. Accordingly, in this embodiment, muscle feature images of the area around the eyes may further be obtained, and whether the correction of the two-eye diopter is suitable may be judged from them. For example, the muscle feature image of the area around the eyes during normal viewing (that is, in a natural or relaxed state) may be used as a reference image, and the currently obtained muscle feature image may be compared with the reference image to judge whether the muscles around the eyes show substantial contraction or expansion — that is, whether the correction effect the user perceives is suitable. If substantial contraction or expansion is found, the perceived correction effect is unsuitable. In that case a refractive adjustment signal may be further generated based on, for example, the degree of muscle contraction or expansion. The refractive adjustment signal may, for instance, instruct increasing or decreasing the desired diopter by a certain step. In a scenario where the refractive adjustment element is implemented as a liquid crystal lens, the voltages applied to the liquid crystal lens can be adjusted according to the refractive adjustment signal to fine-tune, for example, its focal length. By having the refractive adjustment element further fine-tune the two-eye diopter according to this refractive adjustment signal, the correction of the two-eye diopter can be made more suitable.
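The check-and-adjust loop of steps S41–S43 amounts to step-wise refinement driven by the muscle-feature comparison. A hedged sketch, in which the image comparison against the relaxed-state baseline is abstracted into a callback and the step size and round limit are assumed parameters:

```python
def refine_diopter(diopter, correction_check, step=0.25, max_rounds=8):
    """Nudge the applied diopter until the muscle-feature check reports a
    suitable correction. correction_check(d) stands in for comparing the
    current muscle-feature image with the relaxed-state reference image:
    it returns 0 if the correction is suitable, otherwise the signed
    direction (+1 or -1) in which the diopter should be adjusted."""
    for _ in range(max_rounds):
        verdict = correction_check(diopter)
        if verdict == 0:
            return diopter            # correction judged suitable; stop
        diopter += verdict * step     # refractive adjustment signal: one step
    return diopter                    # give up after max_rounds adjustments
```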
In some examples, as shown in FIG. 5, the refractive adjustment method may further include:
Step S51: obtaining the user's iris feature information from the reflected light;
Step S52: determining the user's identity information from the iris feature information;
Step S53: after the user's identity information is determined, correcting the two-eye diopter through the refractive adjustment element according to the stored corresponding refractive correction signal.
The iris is located in the middle layer of the eyeball, at the very front of the vascular membrane and in front of the ciliary body. It adjusts the size of the pupil and thus the amount of light entering the eye. The pupil is at the center of the iris.
The iris is highly unique — no two people's irises are the same — so it can be used to identify a user. Accordingly, in this embodiment, the user's iris feature information may further be obtained from the reflected light. The iris feature information may describe features such as the size, color, texture, and fiber distribution of the iris. The user's identity information can be determined from this feature information: the iris feature information and the user identity information may be stored in correspondence in advance, for example by storing their mapping, so that the identity information can be determined from the obtained iris features. Alternatively or additionally, the iris feature information may be labeled once obtained, and the label then represents the user's identity information.
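Identifying the user from iris features can be sketched as nearest-match lookup against enrolled feature codes. The bit-vector encoding, the Hamming distance, and the match threshold are illustrative assumptions borrowed from common iris-recognition practice, not the patent's stated method:

```python
def hamming(a, b):
    """Fraction of differing positions between two equal-length iris codes."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

def identify_user(iris_code, enrolled, threshold=0.25):
    """Return the enrolled identity whose stored code is closest to the
    observed code, or None if no enrolled code is close enough (an
    unknown user, for whom no correction signal is stored yet)."""
    best_id, best_d = None, 1.0
    for user_id, stored in enrolled.items():
        d = hamming(iris_code, stored)
        if d < best_d:
            best_id, best_d = user_id, d
    return best_id if best_d <= threshold else None
```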
Optionally, after the refractive correction signal is generated according to the user's interpupillary distance and the desired diopter, the generated refractive correction signal may be stored in correspondence with the user's identity information — for example, in the form of a lookup table in which user identity information and refractive correction signals are in one-to-one correspondence. The next time the user's identity information is determined, the diopter can be corrected directly with the stored refractive correction signal corresponding to that identity information, without generating the signal again. The refractive adjustment element can thus correct the two-eye diopter automatically, realizing a memory-storage function.
Embodiments of the present disclosure also provide a refractive adjustment control device for an augmented reality device. As shown in FIG. 6, the refractive adjustment control device 06 includes:
a reflected light receiving unit 61 configured to receive light reflected from the two eyes of a user wearing the augmented reality device;
an interpupillary distance determining unit 62 configured to determine the user's interpupillary distance from the reflected light;
a refractive correction signal generating unit 63 configured to generate a refractive correction signal according to the user's interpupillary distance and the desired diopter, so as to correct the two-eye diopter through the refractive adjustment element.
Corresponding to the foregoing embodiments of the refractive adjustment method, the refractive adjustment control device provided by the present disclosure can meet the diopter correction requirements of different users and corrects the two-eye diopter effectively. Moreover, an augmented reality device using this refractive adjustment control device does not require the user to additionally wear nearsighted or farsighted glasses, improving the comfort of use.
For the device embodiments, the implementation of the functions and roles of the individual units is described in detail in the corresponding steps of the method above and is not repeated here.
The device embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate; the units may be combined into one unit or further split into multiple sub-units.
From the description of the above implementations, the device of this embodiment may be implemented by software, by software plus necessary general-purpose hardware, or of course by hardware.
In another exemplary embodiment of the present invention, a computer program or computer program unit is provided that is adapted to execute, on a suitable system (a computing device or processor), the method steps of the method according to one of the preceding embodiments.
In one embodiment, one or more operations of the described method steps may constitute computer-readable instructions stored on one or more computer-readable media which, when executed by a computing device, cause the computing device to perform the described operations.
Embodiments of the present disclosure also provide an augmented reality device, including the refractive adjustment control device according to embodiments of the present disclosure and a refractive adjustment element. The refractive adjustment element is disposed on the side close to the user's two eyes and corrects the two-eye diopter according to the refractive correction signal generated by the refractive adjustment control device.
In some embodiments, the augmented reality device further includes a light source for emitting light, and a light coupling element for coupling the light emitted by the light source out to the refractive adjustment element and then into the user's field of view, and for coupling the light reflected from the user's two eyes back out to the refractive adjustment control device.
The light source may be any of several types, for example a Light Emitting Diode (LED) or an Organic Light-Emitting Diode (OLED).
The light coupling element couples the light emitted by the light source. The light emitted by the light source is coupled out through the light coupling element to the refractive adjustment element and then incident into the user's field of view. The two eyes can reflect this light; after passing through the refractive adjustment element, the reflected light is coupled back out through the light coupling element to the refractive adjustment control device. The refractive adjustment control device can generate a refractive correction signal according to the user's interpupillary distance and the desired diopter. This signal can be transmitted to the refractive adjustment element, which corrects the two-eye diopter according to it.
The augmented reality device provided by embodiments of the present disclosure is equipped with a refractive adjustment element, and this element can correct the two-eye diopter according to the user's interpupillary distance and the desired diopter. By correcting the diopter adaptively for the two-eye refraction of different users, both myopia and hyperopia can be corrected, the correction requirements of different users can be met, and the correction of the two-eye diopter is effective. The augmented reality device does not require the user to additionally wear nearsighted or farsighted glasses, improving the comfort of use.
It will be appreciated that all the possibilities discussed for FIG. 1 to FIG. 5 are also valid for FIG. 6.
FIG. 7 is a schematic structural diagram of an augmented reality device according to an exemplary embodiment of the present disclosure. The working principle of the augmented reality device is described below with reference to FIG. 7.
As shown in FIG. 7, the augmented reality device includes a light source 100, a light coupling element 110, a refractive adjustment control device 120, and a refractive adjustment element 130.
The light coupling element 110 includes a light splitting element 111 and an optical waveguide element 112. The light splitting element 111 is located between the light source 100 and the optical waveguide element 112.
Light emitted by the light source 100 is reflected by the light splitting element 111 into the optical waveguide element 112, coupled out by the optical waveguide element 112 to the refractive adjustment element 130, and then incident into the field of view of the two eyes (including the left eye 201 and the right eye 202).
Light reflected from the user's two eyes (including the left eye 201 and the right eye 202) is coupled back through the optical waveguide element 112 to the light splitting element 111 and reflected by the light splitting element 111 into the refractive adjustment control device 120.
The light splitting element 111 may be an optical prism that reflects the light emitted by the light source into the optical waveguide element 112.
The optical waveguide element 112 may be a substrate 1121 having two oppositely disposed reflective surfaces. A coupling input prism 1122 and a coupling output prism 1123 are disposed in the substrate 1121. The coupling input prism 1122 receives the light from the light splitting element 111 and reflects it into the substrate 1121. The light undergoes multiple total internal reflections between the two oppositely disposed reflective surfaces of the substrate 1121 and is then incident on the coupling output prism 1123, which reflects the light out of the substrate. The light is then incident on the refractive adjustment element 130 and, after passing through it, enters the user's field of view.
After passing through the refractive adjustment element 130, the light reflected from the user's two eyes can be coupled back out through the optical waveguide element 112 to the refractive adjustment control device 120. For example, the light is incident from the refractive adjustment element 130 on the coupling output prism 1123; after being reflected there, it undergoes multiple total internal reflections between the two oppositely disposed reflective surfaces in the substrate 1121 and is then incident on the coupling input prism 1122, which reflects the light out of the substrate and into the refractive adjustment control device 120.
It will be appreciated that the light coupling element is not limited to the structure shown in the figures and may be another type of optical element. In some embodiments, the refractive adjustment control device may use a microprocessor with data calculation or processing capability to perform, for example, the determination of the interpupillary distance and/or the generation of the refractive correction signal.
In some examples, as shown in FIG. 7, the light splitting element 111 includes a beam splitting prism 1111 and a dichroic mirror 1112. The beam splitting prism 1111 is located between the light source 100 and the dichroic mirror 1112, and the dichroic mirror 1112 is located between the beam splitting prism 1111 and the optical waveguide element 112.
Light emitted by the light source 100 passes through the beam splitting prism 1111, is incident on the dichroic mirror 1112, and is reflected by the dichroic mirror 1112 into the optical waveguide element 112.
Light reflected from the user's two eyes is coupled back through the optical waveguide element 112 to the dichroic mirror 1112. After being reflected by the dichroic mirror 1112 to the beam splitting prism 1111, the light can be reflected by the beam splitting prism 1111 into the refractive adjustment control device 120.
The light splitting element in this embodiment includes a beam splitting prism and a dichroic mirror. Light from the light source can pass directly through the beam splitting prism to the dichroic mirror, while light reflected from the user's two eyes, after exiting the optical waveguide element, is reflected by the dichroic mirror to the beam splitting prism and, after reflection by the beam splitting prism, enters the refractive adjustment control device. The light splitting element thus separates the light incident from the light source from the light reflected from the user's two eyes, so that the refractive adjustment control device can receive the reflected light and generate a refractive correction signal from it to correct the two-eye diopter through the refractive adjustment element.
The beam splitting prism may be two oppositely disposed triangular prisms as shown; their opposing faces form a splitting and reflecting surface that achieves the splitting effect. Of course, the beam splitting prism may also be composed of other lenses or prisms and is not limited to the structure shown in the figures.
In an optional implementation, as shown in FIG. 8, the augmented reality device may further include a micro display element 140 and a collimating element 150.
The micro display element 140 may be used to display an image. The wavelength of the light of the displayed image differs from the wavelength of the light emitted by the light source 100.
The collimating element 150 may be located between the micro display element 140 and the dichroic mirror 1112, and the dichroic mirror 1112 is located between the collimating element 150 and the optical waveguide element 112.
After being collimated by the collimating element 150, the light of the displayed image passes directly through the dichroic mirror 1112 into the optical waveguide element 112, is coupled out through the optical waveguide element 112 to the refractive adjustment element 130, and is then incident into the user's field of view.
The micro display element 140 may be a liquid crystal display panel or an organic light emitting diode display panel for displaying an image. The wavelength of the light of the displayed image differs from that of the light emitted by the light source: the image light is visible light, with a wavelength range usually between 380 and 780 nm, while the light source may be an infrared source emitting infrared light, whose wavelength is longer than that of visible light, usually between 770 nm and 1 mm.
Because the image light and the light emitted by the light source have different wavelengths, the dichroic mirror can act as a light splitter. The displayed image light passes directly through the dichroic mirror, while the light of the light source reflected from the user's two eyes is reflected by the dichroic mirror into the beam splitting prism and then enters the refractive adjustment control device through the beam splitting prism. The dichroic mirror thus separates the image light from the light emitted by the light source; the two kinds of light do not interfere with each other but are each incident on the corresponding elements.
After being collimated into parallel light by the collimating element, the image light is incident on the dichroic mirror, passes directly through it into the optical waveguide element, is coupled out through the optical waveguide element to the refractive adjustment element, and enters the user's field of view. The user's two eyes can thus observe the image, which is fused as a virtual image with the objects in the real environment actually observed by the two eyes, achieving the augmented reality effect.
So that the two eyes observe the complete image displayed by the micro display element and have some freedom of movement, the coupling output element may be a stacked prism comprising multiple reflective prisms. By having the light reflected or refracted by the stacked prism before it exits into the two eyes, exit pupil expansion can be achieved.
The collimating element may be a convex lens or another optical element composed of a combined lens group. The dichroic mirror may be a reflective prism having a reflective surface, or the like.
Various aspects of the present disclosure may be used alone, in combination, or in arrangements not specifically discussed in the embodiments described above; its application is therefore not limited to the details and arrangement of components set forth in the above description or illustrated in the drawings. For example, certain aspects described in one embodiment may be combined in any manner with certain aspects described in other embodiments.
In addition, "exemplary" is used herein to mean serving as an example, instance, or illustration, and not necessarily as advantageous. As used herein, "or" is intended to mean an exclusive "or" rather than an inclusive "or". Moreover, unless specified otherwise or clear from context to be directed to a singular form, "a" and "an" as used in this application are generally to be construed to mean "one or more". Furthermore, "at least one of A and B" and/or similar expressions generally mean A or B or both A and B. In addition, to the extent that "includes", "has", "having", "with", and/or variants thereof are used in the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term "comprising".
Other embodiments of the present disclosure will readily occur to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. The present disclosure is intended to cover any variations, uses, or adaptations that follow the general principles of the present disclosure and include common general knowledge or customary technical means in the art not disclosed herein. The specification and examples are to be regarded as exemplary only, with the true scope and spirit of the present disclosure being indicated by the claims.

Claims (12)

  1. A refractive adjustment method for an augmented reality device, comprising:
    receiving light reflected from the two eyes of a user wearing the augmented reality device;
    determining the user's interpupillary distance from the reflected light;
    generating a refractive correction signal according to the user's interpupillary distance and a desired diopter, so as to correct the diopter of the user's two eyes through a refractive adjustment element in the augmented reality device.
  2. The method according to claim 1, wherein generating the refractive correction signal according to the user's interpupillary distance and the desired diopter to correct the diopter of both eyes through the refractive adjustment element comprises:
    generating a focal length adjustment signal according to the desired diopter, so that the refractive adjustment element performs a preliminary correction of the diopter of both eyes by adjusting its focal length;
    generating an optical axis adjustment signal according to the user's interpupillary distance, so that the refractive adjustment element re-corrects the preliminarily corrected diopter of both eyes by adjusting its optical axis position.
  3. The method according to claim 1, wherein determining the user's interpupillary distance from the reflected light comprises:
    obtaining a left-eye image and a right-eye image of the user from the reflected light;
    performing image recognition on the left-eye image and the right-eye image respectively to determine a left pupil position and a right pupil position;
    determining the user's interpupillary distance from the left pupil position and the right pupil position.
  4. The method according to claim 1, wherein after the diopter of the user's two eyes is corrected through the refractive adjustment element, the method further comprises:
    obtaining muscle feature images of the area around the two eyes;
    when it is determined from the muscle feature images that the correction of the diopter of both eyes is unsuitable, generating a refractive adjustment signal so that the refractive adjustment element adjusts the diopter of both eyes.
  5. The method according to claim 1, further comprising: storing the refractive correction signal generated for the user in correspondence with the user's identity information.
  6. The method according to claim 5, further comprising:
    obtaining iris feature information of both eyes from the reflected light;
    determining the user's identity information from the iris feature information;
    retrieving the stored corresponding refractive correction signal based on the determined identity information; and
    correcting the diopter of both eyes through the refractive adjustment element according to the retrieved corresponding refractive correction signal.
  7. A refractive adjustment control device for an augmented reality device, comprising:
    a reflected light receiving unit configured to receive light reflected from the two eyes of a user wearing the augmented reality device;
    an interpupillary distance determining unit configured to determine the user's interpupillary distance from the reflected light;
    a refractive correction signal generating unit configured to generate a refractive correction signal according to the user's interpupillary distance and the diopter of the two eyes, so as to correct the diopter of the user's two eyes through a refractive adjustment element of the augmented reality device.
  8. An augmented reality device, comprising:
    the refractive adjustment control device according to claim 7;
    a refractive adjustment element disposed on the side close to the user's two eyes and configured to correct the diopter of the user's two eyes according to the refractive correction signal generated by the refractive adjustment control device.
  9. The augmented reality device according to claim 8, further comprising:
    a light source configured to emit light;
    a light coupling element configured to couple the light emitted by the light source out to the refractive adjustment element and then into the user's field of view, and to couple the light reflected from the user's two eyes back out to the refractive adjustment control device.
  10. The augmented reality device according to claim 9, wherein
    the light coupling element comprises a light splitting element and an optical waveguide element, the light splitting element being located between the light source and the optical waveguide element;
    light emitted by the light source is reflected by the light splitting element into the optical waveguide element, coupled out by the optical waveguide element to the refractive adjustment element, and then incident into the user's field of view;
    light reflected from the user's two eyes passes through the refractive adjustment element, is coupled back through the optical waveguide element to the light splitting element, and is reflected by the light splitting element into the refractive adjustment control device.
  11. The augmented reality device according to claim 10, wherein the light splitting element comprises a beam splitting prism and a dichroic mirror;
    the beam splitting prism is located between the light source and the dichroic mirror, and the dichroic mirror is located between the beam splitting prism and the optical waveguide element;
    light emitted by the light source passes through the beam splitting prism, is incident on the dichroic mirror, and is reflected by the dichroic mirror into the optical waveguide element;
    light reflected from the user's two eyes is coupled back through the optical waveguide element to the dichroic mirror, reflected by the dichroic mirror to the beam splitting prism, and then reflected by the beam splitting prism into the refractive adjustment control device.
  12. The augmented reality device according to claim 11, further comprising:
    a micro display element configured to display an image to the user, the wavelength of the light of the displayed image differing from the wavelength of the light emitted by the light source;
    a collimating element located between the micro display element and the dichroic mirror, the dichroic mirror being located between the collimating element and the optical waveguide element;
    wherein, after being collimated by the collimating element, the light of the displayed image passes through the dichroic mirror into the optical waveguide element, is coupled out by the optical waveguide element to the refractive adjustment element, and is then incident into the user's field of view.
PCT/CN2019/079618 2018-03-26 2019-03-26 Refractive adjustment method and device, and augmented reality apparatus WO2019184896A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/610,559 US11579448B2 (en) 2018-03-26 2019-03-26 Method and device for refraction adjustment, and augmented reality apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810252227.3A CN108490611B (zh) 2018-03-26 2018-03-26 增强现实设备的屈光调节方法及其装置、增强现实设备
CN201810252227.3 2018-03-26

Publications (1)

Publication Number Publication Date
WO2019184896A1 true WO2019184896A1 (zh) 2019-10-03

Family

ID=63337716

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/079618 WO2019184896A1 (zh) 2018-03-26 2019-03-26 屈光调节方法及装置以及增强现实设备

Country Status (3)

Country Link
US (1) US11579448B2 (zh)
CN (1) CN108490611B (zh)
WO (1) WO2019184896A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112147786A (zh) * 2020-10-28 2020-12-29 Nanjing iQIYI Intelligent Technology Co., Ltd. Augmented reality display system

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108490611B (zh) * 2018-03-26 2020-12-08 京东方科技集团股份有限公司 增强现实设备的屈光调节方法及其装置、增强现实设备
US10852619B1 (en) * 2018-11-01 2020-12-01 Facebook Technologies, Llc Multifocal system using adaptive lenses
CN109491107B (zh) * 2019-01-04 2021-04-02 京东方科技集团股份有限公司 一种液晶眼镜、眼镜度数的调节方法及显示装置
CN111856749A (zh) * 2019-04-28 2020-10-30 云谷(固安)科技有限公司 显示装置及方法
CN110187507A (zh) * 2019-05-28 2019-08-30 深圳市思坦科技有限公司 一种近眼光学显示装置
CN110187508A (zh) * 2019-06-10 2019-08-30 Oppo广东移动通信有限公司 头戴式显示设备、控制方法和存储介质
CN110286538B (zh) * 2019-06-28 2022-05-24 Oppo广东移动通信有限公司 显示方法、显示装置、头戴式显示装置和存储介质
CN113960800B (zh) * 2021-11-08 2023-09-29 歌尔光学科技有限公司 增强现实设备及其屈光度调节方法、存储介质
CN114488534A (zh) * 2022-01-26 2022-05-13 深圳市光舟半导体技术有限公司 Ar眼镜及其相关装置和方法
CN116271551A (zh) * 2023-02-15 2023-06-23 光朗(海南)生物科技有限责任公司 一种眼镜式哺光仪

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103309035A (zh) * 2012-03-14 2013-09-18 Sony Corporation Image display device and image generating device
CN105068249A (zh) * 2015-08-03 2015-11-18 Zhongjing Shijie (Beijing) Technology Co., Ltd. Holographic smart glasses
CN106461939A (zh) * 2015-05-29 2017-02-22 Shenzhen Royole Technologies Co., Ltd. Self-adaptive display adjustment method and head-mounted display device
CN108490611A (zh) * 2018-03-26 2018-09-04 BOE Technology Group Co., Ltd. Refractive adjustment method for augmented reality device, device therefor, and augmented reality device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7486988B2 (en) * 2004-12-03 2009-02-03 Searete Llc Method and system for adaptive vision modification
IL219907A (en) 2012-05-21 2017-08-31 Lumus Ltd Integrated head display system with eye tracking
CN103439801B (zh) 2013-08-22 2016-10-26 Beijing Zhigu Ruituo Tech Co., Ltd. Vision-protection imaging device and method
IL235642B (en) * 2014-11-11 2021-08-31 Lumus Ltd A compact head-up display system is protected by an element with a super-thin structure
CN105068248A (zh) * 2015-08-03 2015-11-18 Zhongjing Shijie (Beijing) Technology Co., Ltd. Head-mounted holographic smart glasses
US10901205B1 (en) * 2016-08-09 2021-01-26 Facebook Technologies, Llc Focus adjusting liquid crystal lenses in a head-mounted display
CN106293100A (zh) * 2016-08-24 2017-01-04 Shanghai Yude Communication Technology Co., Ltd. Method for determining gaze focus in a virtual reality device, and virtual reality device
CN206671681U (zh) * 2017-02-16 2017-11-24 Wang Tianlong VR viewing device with conveniently adjustable lens power, adapted to various vision levels
CN106950694A (zh) * 2017-03-28 2017-07-14 Harbin Medical University External vision-improving head-mounted VR device
CN107291233B (zh) * 2017-06-21 2020-09-15 Changzhou Kuailai Information Technology Co., Ltd. Vision optimization system for head-mounted 3D display device, smart terminal, and head-mounted device
US10521661B2 (en) * 2017-09-01 2019-12-31 Magic Leap, Inc. Detailed eye shape model for robust biometric applications

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103309035A (zh) * 2012-03-14 2013-09-18 Sony Corporation Image display device and image generating device
CN106461939A (zh) * 2015-05-29 2017-02-22 Shenzhen Royole Technologies Co., Ltd. Self-adaptive display adjustment method and head-mounted display device
CN105068249A (zh) * 2015-08-03 2015-11-18 Zhongjing Shijie (Beijing) Technology Co., Ltd. Holographic smart glasses
CN108490611A (zh) * 2018-03-26 2018-09-04 BOE Technology Group Co., Ltd. Refractive adjustment method for augmented reality device, device therefor, and augmented reality device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112147786A (zh) * 2020-10-28 2020-12-29 Nanjing iQIYI Intelligent Technology Co., Ltd. Augmented reality display system
CN112147786B (zh) * 2020-10-28 2024-04-12 Nanjing iQIYI Intelligent Technology Co., Ltd. Augmented reality display system

Also Published As

Publication number Publication date
US11579448B2 (en) 2023-02-14
CN108490611A (zh) 2018-09-04
US20210157141A1 (en) 2021-05-27
CN108490611B (zh) 2020-12-08

Similar Documents

Publication Publication Date Title
WO2019184896A1 (zh) Refractive adjustment method and device, and augmented reality apparatus
US11693247B2 (en) Augmented reality display having multi-element adaptive lens for changing depth planes
US11669161B2 (en) Enhancing the performance of near-to-eye vision systems
US20230032100A1 (en) Virtual and augmented reality systems and methods
US11828946B2 (en) Systems and methods for retinal imaging and tracking
US10319154B1 (en) Methods, systems, and computer readable media for dynamic vision correction for in-focus viewing of real and virtual objects
US8939579B2 (en) Autofocusing eyewear, especially for presbyopia correction
US20230099062A1 (en) Display systems and methods for clipping content to increase viewing comfort
JP2009539130A (ja) Method for optimizing and/or manufacturing spectacle lenses
US11675197B1 (en) System and method for automatic vision correction in near-to-eye displays
KR102511897B1 (ko) Method for determining a corrective optical function for a virtual image
KR101632156B1 (ko) Corrective lens for viewing at very close range
TWM629871U (zh) 擴增實境光學系統及頭戴式顯示器
KR101490778B1 (ko) Corrective lens for viewing at very close range, and device therefor
US20230404739A1 (en) Method and apparatus for correcting vision based on focus adjustable lens
US11914149B2 (en) System for generating a virtual image for a wearer
Padmanaban Enabling Gaze-Contingent Accommodation in Presbyopia Correction and Near-Eye Displays
Refai et al. Diffraction-based glasses for visually impaired people
KR101632140B1 (ko) Corrective lens assembly for viewing at very close range
KR20220093549A (ko) Augmented reality optical system for universally correcting the wearer's vision

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19775269

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29/01/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19775269

Country of ref document: EP

Kind code of ref document: A1