WO2014106330A1 - Contact lens for measuring eyeball focus - Google Patents

Contact lens for measuring eyeball focus

Info

Publication number
WO2014106330A1
WO2014106330A1 (PCT/CN2013/070062, CN2013070062W)
Authority
WO
WIPO (PCT)
Prior art keywords
eye
focal distance
contact lens
infrared light
image
Prior art date
Application number
PCT/CN2013/070062
Other languages
English (en)
Inventor
Zhen Xiao
Original Assignee
Empire Technology Development Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Empire Technology Development Llc filed Critical Empire Technology Development Llc
Priority to PCT/CN2013/070062 priority Critical patent/WO2014106330A1/fr
Publication of WO2014106330A1 publication Critical patent/WO2014106330A1/fr

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C7/00 Optical parts
    • G02C7/02 Lenses; Lens systems; Methods of designing lenses
    • G02C7/04 Contact lenses for the eyes

Definitions

  • Example apparatus may include a wearable contact lens, an IR emitter disposed on the contact lens to emit IR light, an IR sensor disposed on the contact lens to detect reflections of the IR light, and a communication device disposed on the contact lens communicatively coupled to the IR sensor and configured to provide focal distance data associated with the eye.
  • the present disclosure describes various illustrative methods for detecting eye focal distance from a contact lens. Such methods may include emitting IR light from an emitter disposed on a wearable contact lens, detecting reflections at an IR sensor on the contact lens, and providing focal distance data associated with the eye from a communication device on the contact lens communicatively coupled to the IR sensor.
  • the present disclosure describes various illustrative machine- readable instructions for detecting eye focal distance from a contact lens.
  • Such machine-readable instructions may include emitting IR light from an emitter disposed on a wearable contact lens, detecting reflections at an IR sensor on the contact lens, and providing focal distance data associated with the eye from a communication device on the contact lens communicatively coupled to the IR sensor.
  • the foregoing summary may be illustrative only and may not be intended to be in any way limiting.
  • further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
  • Fig. 1 illustrates a system showing a cross section of an eye and a light source shining into the eye to project a set of Purkinje-Sanson reflections
  • Fig. 2 illustrates cross sections of an eye including a lens and an example contact lens configured to measure eyeball focus of the eye;
  • Fig. 3 illustrates an example contact lens configured to measure eyeball focus
  • Fig. 4 illustrates a cutaway view of an example contact lens including a groove
  • Fig. 5 illustrates a flow diagram of an example method for measuring eyeball focus
  • Fig. 6 illustrates a flow diagram of an example method for calibrating a determined focal distance with the perception of the individual wearing the contact lens
  • Fig. 7 illustrates a flow diagram of an example method for calibrating a known object distance to a measured focal distance
  • Fig. 8 illustrates a flow diagram of an example method for associating a particular focal distance to an adjusted object distance
  • Fig. 9 illustrates an example computer program product
  • Fig. 10 illustrates an example computing device, all arranged in accordance with at least some embodiments of the present disclosure.
  • This disclosure is drawn, inter alia, to apparatus, methods and computer-readable media related to measuring eyeball focus using a contact lens including an emitter, a receiver, a communication device, and/or a control circuit.
  • a sensor may be disposed on a contact lens and may be configured to detect focal distance data associated with an eye to determine the focal distance of the eye, as will be described in greater detail throughout.
  • a light source disposed on the contact lens may shine into the eye, reflect off one or more surfaces of the eye, and the reflections may project one or more Purkinje-Sanson images at particular locations of the eye.
  • the locations of the Purkinje-Sanson images may be detected at the sensor disposed on the contact lens.
  • the locations of the detected Purkinje-Sanson images may be associated with the physiological state of a natural lens within the eye as part of a process called accommodation, as will be described in greater detail throughout.
  • the locations of the detected Purkinje-Sanson images may be processed to determine an accommodation state of the eye, which may in turn be processed to determine a focal distance of the eye.
  • the optical power of an eye for a given accommodation state may be particular to an individual; accordingly, one or more calibration processes may associate detected focal distance data with a determination of the focal distance of the particular eye wearing the contact lens, as will be described in greater detail throughout.
  • calibration may incorporate the use of an Alternate Reality (AR) image device that may provide one or more AR images at selected focal distances.
  • a focal distance data of the eye may be determined when focused on an AR image.
  • a perceived distance of the AR image may be manipulated to coincide with a known physical distance.
  • Fig. 1 illustrates a system 100 showing a cross section of an eye 101 and a light source 110 shining into eye 101 to project a set of Purkinje-Sanson reflections, arranged in accordance with at least some embodiments of the present disclosure.
  • light source 110 may produce light 111 that travels toward a cornea 104 of eye 101 and may continue to a lens 102 of eye 101.
  • a portion of light 111 may reflect off an outer surface of cornea 104.
  • a focal distance of an eye may be determined based at least in part on the locations of one or more of the Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and/or PS4 118.
  • images PS1 112 and PS2 114 may be reflected from the outer and inner surfaces of cornea 104, and therefore may not be directly influenced by lens 102, so that the locations of Purkinje-Sanson images PS1 112 and PS2 114 may not be best suited to provide information related to the state of lens 102.
  • PS2 114 is depicted without a corresponding label and candle image in part because PS2 114 may have less consequence to the subject matter disclosed herein.
  • the locations of Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and/or PS4 118 may correspond to the location of light source 110 and/or the accommodation of lens 102.
  • image PS1 112 and/or image PS2 114 may appear relatively close to light source 110.
  • image PS3 116 and image PS4 118 may appear relatively far from light source 110, and in some examples image PS3 116 and image PS4 118 may appear on the substantially opposite side of cornea 104.
  • the relative locations of images PS1 112, PS2 114, PS3 116, and PS4 118 may be increasingly more distant from light source 110.
  • the relative distance of images PS1 112, PS2 114, PS3 116, and/or PS4 118 from light source 110 may be used to distinguish some or all of the Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and PS4 118.
  • since PS3 116 is generally relatively closer to light source 110 than PS4 118, given two detected images thought to be PS3 116 and PS4 118, the relative distances of the images from the light source may be determined and the closer image may be labeled PS3 116 and the further image may be labeled PS4 118.
  • Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and/or PS4 118 may be detected as upright images or inverted images.
  • light source 110 may depict a candle with a flame on top.
  • the depiction of the candle may be for illustrative purposes to demonstrate the orientation of Purkinje-Sanson images PS1 112, PS3 116, and/or PS4 118.
  • image PS1 112 and image PS3 116 may appear to have a similar orientation as light source 110, depicted here as upright, with the candle flame appearing on top.
  • image PS4 118 may appear substantially inverted, depicted here with the candle flame appearing on the bottom.
  • the orientation of an image may be used to distinguish some or all of the Purkinje-Sanson images PS1 112, PS2 114, PS3 116, and PS4 118. For example, given two detected images thought to be PS3 116 and PS4 118, the image appearing substantially upright may be labeled PS3 116 and the image appearing substantially inverted may be labeled PS4 118.
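  • The distance and orientation cues above suggest a simple labeling rule. The following is a minimal, hypothetical sketch (the Reflection fields and the fallback ordering are assumptions for illustration, not taken from the disclosure) of how two detected reflections might be labeled PS3 and PS4:

```python
from dataclasses import dataclass

@dataclass
class Reflection:
    distance_from_source: float  # e.g. millimetres from the emitter position (assumed unit)
    upright: bool                # True if the image orientation matches the source

def label_ps3_ps4(a, b):
    """Return (ps3, ps4) for two detected reflections.

    Cue 1: PS3 generally lies closer to the light source than PS4.
    Cue 2: PS3 generally appears upright while PS4 appears inverted;
           orientation is used only when the distances are indistinguishable.
    """
    if abs(a.distance_from_source - b.distance_from_source) > 1e-6:
        return (a, b) if a.distance_from_source < b.distance_from_source else (b, a)
    return (a, b) if a.upright and not b.upright else (b, a)

# Example: the reflection 1.2 mm from the source, upright, is labeled PS3.
ps3, ps4 = label_ps3_ps4(Reflection(1.2, True), Reflection(3.4, False))
print(ps3, ps4)
```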
  • lens 102 may change focus by changing the optical power of lens 102 via the accommodation process natural to the human eye.
  • muscles controlling lens 102 may tense and/or relax causing deformation of lens 102, thereby changing the optical power and focus of lens 102.
  • the outer surface of lens 102 and/or inner surface of lens 102 may change in curvature and lens 102 may change in thickness; thus image PS3 116 and/or image PS4 118 may correspondingly change location(s).
  • the locations of PS3 116 and PS4 118 may provide an indication of the current accommodation state of eye 101, which in turn may be processed to determine a current focal distance of eye 101.
  • Fig. 2 illustrates cross sections of eye 101 including lens 102 and an example contact lens 201 configured to measure eyeball focus of eye 101, arranged in accordance with at least some embodiments of the present disclosure.
  • Fig. 2 may depict a cross section of eye 101 including lens 102. Eye 101, lens 102, PS3 116, and/or PS4 118 are further described with respect to Fig. 1 and elsewhere herein.
  • In some examples, an emitter 210, a sensor 220, and/or a communication device 230 may be disposed on contact lens 201.
  • contact lens 201 may include emitter 210 and sensor 220 to determine the focus of lens 102 in eye 101.
  • lens 102 may correspondingly change shape, so that image PS3 116 and image PS4 118 may change locations.
  • emitter 210 may be used to project images PS3 116 and/or PS4 118, and sensor 220 may be used to detect the locations of images PS3 116 and/or PS4 118. The detected locations of images PS3 116 and PS4 118 may then be used to determine an accommodation of lens 102, which in turn may be used to determine a focal distance of eye 101.
  • emitter 210 may be configured to emit light to create PS3 116 and/or PS4 118 images.
  • emitter 210 may include a device for power/control 212, an LED 214, and/or a microlens 216.
  • LED 214 may be activated by power/control 212 to emit light; the emitted light may pass through microlens 216 toward eye 101, creating images PS3 116 and PS4 118.
  • emitter 210 may be configured to emit any appropriate wavelength(s) of light, including, but not limited to, IR light outside of the visible spectrum of a typical human eye.
  • the wearer of contact lens 201 may not be aware of and/or disturbed by the emissions of emitter 210.
  • sensor 220 may be configured to detect PS3 116 and/or PS4 118 images.
  • sensor 220 may include a CCD/CMOS 222 and a microlens array 224.
  • Microlens array 224 may be configured to focus PS3 116 and/or PS4 118 images onto CCD/CMOS 222.
  • CCD/CMOS 222 and/or microlens array 224 may be arranged in any appropriate shape, including a linear distribution according to the direction of warp.
  • CCD/CMOS 222 may describe a line so that received images may be associated with a linear displacement on the line to provide location data as an offset coordinate.
  • microlens array 224 may be linearly aligned with CCD/CMOS 222 to associate one or more individual microlenses of microlens array 224 with one or more corresponding sensitive elements of CCD/CMOS 222.
  • CCD/CMOS 222 may be any appropriate type of image sensor, including Charge Coupled Device (CCD) and/or Complementary Metal Oxide Semiconductor (CMOS), or the like.
  • CCD/CMOS 222 may incorporate one or more filters to attenuate signals not originating from emitter 210, for example one or more IR filters configured to pass primarily IR light.
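  • As a rough illustration of such a linear readout, the sketch below converts a one-dimensional CCD/CMOS trace into offset coordinates for the two strongest peaks; the element pitch, the simple local-maximum rule, and the example values are assumptions for illustration only, not details from the disclosure:

```python
def peak_offsets(readout, pitch_um=10.0, min_separation=5):
    """Return offset coordinates (micrometres) of the two strongest peaks in a
    1-D sensor readout, assuming elements are spaced `pitch_um` apart."""
    # Local maxima: elements at least as bright as both neighbours.
    peaks = [i for i in range(1, len(readout) - 1)
             if readout[i] > readout[i - 1] and readout[i] >= readout[i + 1]]
    peaks.sort(key=lambda i: readout[i], reverse=True)
    chosen = []
    for i in peaks:
        if all(abs(i - j) >= min_separation for j in chosen):
            chosen.append(i)
        if len(chosen) == 2:
            break
    return sorted(p * pitch_um for p in chosen)

# Example readout with two bright spots near elements 12 and 40.
readout = [0] * 64
readout[12], readout[40] = 180, 140
print(peak_offsets(readout))  # -> [120.0, 400.0]
```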
  • sensor 220 may be configured to detect focal distance data associated with eye 101.
  • focal distance data may be determined based at least in part on a distance separating images PS3 116 and PS4 118, which may indicate an accommodation of lens 102, which in turn may indicate a focal distance of eye 101.
  • Focal distance data may be any appropriate focal distance data, including, for example, an indication of the focal distance and/or indications of the locations of images PS3 116 and/or PS4 118.
  • focal distance data may include substantially unprocessed data from sensor 220, which may be processed to determine at least the locations of images PS3 116 and/or PS4 118.
  • the focal distance of eye 101 may be determined based at least in part on focal distance data.
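  • One possible way to turn such focal distance data into a focal distance estimate is sketched below: measure the PS3-PS4 separation and interpolate over a previously calibrated table. The separation-to-diopter table and all numeric values are invented for the example; in practice the table would come from a per-wearer calibration such as the methods described later in this disclosure.

```python
import bisect

# Hypothetical calibration table: measured PS3-PS4 separation (micrometres)
# versus accommodation in diopters. Values are illustrative only.
CAL_SEPARATION_UM = [180.0, 220.0, 260.0, 300.0, 340.0]
CAL_DIOPTERS      = [0.2,   0.5,   1.0,   3.0,   4.0]

def focal_distance_m(separation_um):
    """Estimate the focal distance (metres) from a PS3-PS4 separation by
    piecewise-linear interpolation over the calibration table."""
    s = min(max(separation_um, CAL_SEPARATION_UM[0]), CAL_SEPARATION_UM[-1])
    i = max(1, bisect.bisect_left(CAL_SEPARATION_UM, s))
    x0, x1 = CAL_SEPARATION_UM[i - 1], CAL_SEPARATION_UM[i]
    d0, d1 = CAL_DIOPTERS[i - 1], CAL_DIOPTERS[i]
    diopters = d0 + (d1 - d0) * (s - x0) / (x1 - x0)
    return 1.0 / diopters  # focal distance in metres

print(round(focal_distance_m(280.0), 2))  # ~0.5 m for an interpolated 2 D
```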
  • the determined focal distance of eye 101 may be used, for example, to determine where a wearer of contact lens 201 may be looking, what the wearer may be looking at, and/or in cooperation with any appropriate augmented reality (AR) display system to display information at the focal distance of the person wearing contact lens 201.
  • communication device 230 may be used to communicate focal distance data associated with eye 101.
  • communication device 230 may be located with sensor 220, although communication device 230 may be positioned at any appropriate location. In some examples, communication device 230 may be communicatively coupled to sensor 220 and/or may be communicatively coupled to CCD/CMOS 222.
  • Communication device 230 may be any appropriate communication device, for example one or more antennas, wired connectivity, or the like. Communication device 230 may use any appropriate protocol, such as Bluetooth, WiFi, cellular, Ethernet, or the like.
  • contact lens 201 may measure eyeball focus based at least in part on detecting the locations of image PS3 116 and/or image PS4 118.
  • a distance separating PS3 116 and PS4 118 may generally indicate an accommodation of lens 102. That is, as lens 102 changes focus, the accommodation of lens 102 may change so that the locations of and/or distance separating images PS3 116 and/or PS4 118 may also change correspondingly.
  • the determination of locations of and/or distances separating images PS3 116 and PS4 118 may require distinguishing between the locations of image PS3 116 and image PS4 118.
  • image PS3 116 may be located closer to the center of contact lens 201 and image PS4 118 may be located further from the center of contact lens 201.
  • other distinguishing characteristics may be used to distinguish PS3 116 and/or PS4 118 images.
  • Images PS3 116 and/or PS4 118 may be distinguished, in some examples, by configuring LED 214 to emit two or more distinct wavelengths of IR light in a particular orientation and/or pattern so that CCD/CMOS 222 may determine whether an image appears in a substantially upright orientation, such as may be observed at image PS3 116, or instead whether an image appears in a substantially inverted orientation, such as may be observed at image PS4 118.
  • LED 214 may comprise one or more LED devices.
  • sensor 220 may be capable of distinguishing two or more wavelengths of IR light.
  • individual photosensitive sites of sensor 220 may be configured to improve sensitivity to one or more particular wavelengths.
  • individual photosensitive sites of sensor 220 may include filters to attenuate sensitivity to undesired wavelengths.
  • filtered photosensitive sites of sensor 220 may be patterned, for example ABABAB, wherein A indicates sensitivity to a first wavelength and B indicates sensitivity to a second wavelength. Any suitable pattern of photosensitive sites may be employed.
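  • As a toy illustration of the ABABAB arrangement, the sketch below splits an interleaved linear readout into its two wavelength channels and finds each channel's brightest element; the pattern phase (channel A on even elements) and the sample values are assumptions for illustration, not specified by the disclosure:

```python
def split_ab_channels(readout):
    """Split an ABABAB-patterned linear readout into its two wavelength channels:
    channel A holds even-indexed elements, channel B holds odd-indexed elements.
    Element indices are kept so peak positions remain comparable across channels."""
    channel_a = [(i, v) for i, v in enumerate(readout) if i % 2 == 0]
    channel_b = [(i, v) for i, v in enumerate(readout) if i % 2 == 1]
    return channel_a, channel_b

def brightest_position(channel):
    """Return the element index of the brightest sample in a channel."""
    return max(channel, key=lambda iv: iv[1])[0]

# If the emitter places wavelength A "above" wavelength B, an upright image
# (e.g. PS3) keeps that order while an inverted image (e.g. PS4) swaps it,
# so comparing peak positions per channel hints at the image orientation.
a, b = split_ab_channels([0, 0, 9, 1, 0, 0, 2, 7, 0, 0])
print(brightest_position(a), brightest_position(b))  # -> 2 7
```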
  • emitter 210 may be arranged substantially opposite to sensor 220. In some examples, emitter 210 may be arranged on a first sector of the contact lens and sensor 220 may be arranged on a second sector of the contact lens, substantially opposite to the first sector. In some examples, emitter 210 may be arranged substantially linearly according to the direction of warp.
  • power/control 212 may be configured to power and/or control LED 214. In some examples, depending in part on available power (or any other suitable factors), power/control 212 may shine LED 214 continuously and/or for a long duration. In some examples, power/control 212 may pulse LED 214 in any appropriate duty cycle or cycles. Power/control 212 may be configured to provide steady illumination, to conserve power, and/or to adapt output according to environmental needs. In some examples, such as in the presence of bright and/or dark ambient light, power/control 212 may increase and/or decrease the intensity of LED 214 to provide brighter PS3 116 and/or PS4 118 images, as appropriate.
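  • A toy sketch of such power/control behaviour appears below; the ambient-light thresholds, duty cycles, and intensity levels are invented for the example and are not specified by the disclosure:

```python
def led_drive_settings(ambient_level):
    """Pick an LED duty cycle and relative intensity from a normalized
    ambient-light estimate (0.0 = dark, 1.0 = very bright). Brighter ambient
    light calls for a brighter pulse so PS3/PS4 stay detectable; darker
    conditions allow a dimmer, shorter pulse to conserve power."""
    if ambient_level > 0.7:      # bright surroundings
        return {"duty_cycle": 0.20, "intensity": 1.0}
    if ambient_level > 0.3:      # typical indoor light
        return {"duty_cycle": 0.10, "intensity": 0.6}
    return {"duty_cycle": 0.05, "intensity": 0.3}  # dim surroundings

print(led_drive_settings(0.8))  # -> {'duty_cycle': 0.2, 'intensity': 1.0}
```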
  • emitter 210, sensor 220, and/or communication device 230 may be disposed on an inside surface of contact lens 201, on an outside surface of contact lens 201, substantially through contact lens 201, and/or substantially enclosed within contact lens 201.
  • components included in contact lens 201 may operate at the speed of modern electronics.
  • emitter 210 may emit light, sensor 220 may detect light, and/or communication device 230 may communicate in real time.
  • Fig. 3 illustrates contact lens 201 configured to measure eyeball focus, arranged in accordance with at least some embodiments of the present disclosure.
  • Contact lens 201 may include emitter 210, sensor 220, power supply 310, and/or leads 330.
  • Contact lens 201, emitter 210, and/or sensor 220 may be further described as discussed with respect to Fig. 2 and elsewhere herein.
  • Power may be used for any appropriate purpose. In some embodiments, emitter 210 may use power to emit light at the LED.
  • sensor 220 may use power to operate CCD/CMOS 222 and/or communication device 230.
  • a power supply 310 may be configured as a substantially circular and/or semicircular circuit, circumferentially located toward the extents of contact lens 201. In some examples, power supply 310 may not be visible to the wearer of contact lens 201. In some examples, power supply 310 may be visible at the extents of vision and/or at one or more leads 330 coupling power supply 310 to emitter 210 and/or sensor 220. In some examples, power supply 310 may include substantially transparent materials.
  • Power supply 310 may be configured to harvest power to operate electronic components of contact lens 201 , for example emitter 210 and/or sensor 220. Power supply 310 may use more conventional techniques either in combination with harvesting power or alone, such as battery power, which may provide uninterrupted and/or conditioned power.
  • power supply 310 may be operable to harvest power from any suitable sources, for example solar power, kinetic energy, thermoelectric power, rectifying antenna, or the like.
  • power supply 310 may include a radio frequency (RF) antenna configured to harvest RF energy.
  • power supply 310 may incorporate an antenna or may be used as an antenna as appropriate.
  • Fig. 4 illustrates a cutaway view of an example contact lens 401 including a groove 410, arranged in accordance with at least some embodiments of the present disclosure.
  • Fig. 3, previously discussed above, describes contact lens 201 arranged to include emitter 210 and sensor 220 on substantially opposite sectors of contact lens 201.
  • Fig. 4 depicts contact lens 401 having an annular groove 410, wherein groove 410 may be arranged to emit light and receive reflections as described herein and throughout.
  • the light emitted from groove 410 may function as described above, particularly with respect to Fig. 1, so that the emitted light may project reflections associated with an accommodation state of a natural lens of an eye wearing contact lens 401, and the received reflections may be processed to determine the focal distance of that eye.
  • Groove 410 may be an annular groove formed on the contact lens and may permit determination of the focal distance of the eye wearing contact lens 401.
  • an IR emitter including an infrared laser diode may be configured to provide IR light to groove 410.
  • the IR laser diode may use very low power.
  • Groove 410 may act as a semi-reflex lens, and may be configured to emit a portion of light and/or conduct a portion of light.
  • a first portion of the provided IR light may be emitted from groove 410 and may reflect off one or more surfaces of an eye wearing contact lens 401.
  • the reflected IR light may be received at groove 410 to provide a first reflected signal.
  • a second portion of the provided IR light may conduct a total reflection along groove 410 to provide a second reflected signal.
  • An IR sensor coupled to groove 410 may be configured to detect the first and second reflected signal.
  • the first and second reflected signals may produce interference, which may be detected at the IR sensor.
  • the first and second reflected signals, including interference between the signals, may comprise focal distance data, which may in turn be processed to determine a focal distance of the eye based in part on the interference between the first and second reflected signals.
  • the interference between the first and second reflected signals may depend at least in part on the distance traveled by the first emitted portion of light as reflected by the lens of the eye wearing contact lens 401.
  • the reflected light may travel a shorter or further distance, thereby correspondingly affecting the interference detected by the sensor.
  • measuring the interference may permit determination of the focal distance of the eye wearing contact lens 401.
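  • As a rough sketch of how such interference might be related to a change in optical path, the example below counts fringes in a sampled intensity trace; the 850 nm wavelength, the fringe-counting approach, and the simulated trace are assumptions for illustration only (a practical system would also need the direction of change and sub-fringe phase):

```python
import math

WAVELENGTH_NM = 850.0  # assumed IR wavelength; illustrative only

def path_length_change_nm(intensity_samples):
    """Estimate the change in optical path difference from a time series of
    detected interference intensity. Each full bright-dark-bright fringe
    corresponds to roughly one wavelength of path-length change, so the
    crossings of the mean level are counted and converted to nanometres."""
    mean = sum(intensity_samples) / len(intensity_samples)
    crossings = sum(
        1 for a, b in zip(intensity_samples, intensity_samples[1:])
        if (a - mean) * (b - mean) < 0
    )
    fringes = crossings / 2.0  # two mean-crossings per fringe
    return fringes * WAVELENGTH_NM

# Simulated trace: path difference sweeping through three full fringes.
trace = [1 + math.cos(2 * math.pi * 3 * t / 100) for t in range(101)]
print(path_length_change_nm(trace))  # -> 2550.0 (three fringes * 850 nm)
```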
  • groove 410 may be formed on either a top or a bottom surface of contact lens 401 as appropriate.
  • groove 410 may behave similarly to an optical fiber, acting, for example, as a waveguide for at least a portion of light transmitted via groove 410.
  • groove cross-sectional shape may be any suitable shape, including semi-circular, rectangular, diamond, or the like.
  • emitted light may originate from groove 410 and/or any suitable surface of contact lens 401.
  • Fig. 5 illustrates a flow diagram of an example method 500 for measuring eyeball focus, arranged in accordance with at least some embodiments of the present disclosure.
  • method 500 may be performed by any suitable device, devices, or systems such as those discussed herein. More specifically, method 500 may be performed by a contact lens to measure eyeball focus.
  • Method 500 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 5 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter.
  • Method 500 may include one or more of functional operations as indicated by one or more of blocks 502, 504, 506, and/or 508. The process of method 500 may begin at block 502.
  • an emitter disposed on the contact lens may emit IR light.
  • any suitable type or types of light may be emitted by any suitable technique or techniques.
  • IR light may be emitted as may be further discussed with respect to Figs. 1-4 and elsewhere herein, particularly with respect to emitter 210 and LED 214.
  • the emitted light may be comprised of wavelengths falling outside of human vision, for example IR light.
  • a sensor on the contact lens may detect reflections of the IR light.
  • the sensor may operate as may be further discussed with respect to Figs. 2-4 and elsewhere herein, particularly with respect to sensor 220 and/or CCD/CMOS 222.
  • the detected IR light may be associated with one or more locations on the sensor and may correspond to locations of images PS3 116 and/or PS4 118. Accordingly, in some examples, the locations of the detected IR light may be used to determine a focal distance of an eye wearing the contact lens.
  • detected data may be provided in a variety of formats to allow the determination of the focal distance, including formats such as unprocessed focal distance data, identified locations of images PS3 116 and/or PS4 118, an indication of focal distance, and/or any suitable focal distance data.
  • Process of method 500 may continue from block 504 to block 506.
  • focal distance data associated with the eye wearing the contact lens may be provided via a communication device.
  • the communication device may operate as may be further discussed with respect to Figs. 2-4 and elsewhere herein, particularly with respect to communication device 230.
  • Focal distance data may be provided by communication techniques including radio communication, wired communication, optical communication, and/or any suitable communication technique or techniques.
  • Process of method 500 may optionally continue from block 506 to block 508 or stop after block 506.
  • focal distance data may be processed at a control circuit.
  • block 508 may be optional. Processing the focal distance data may include determining a focal distance based in part on one or more detected reflections of IR light.
  • the control circuit may determine the focal distance based at least in part on the distance separating images PS3 and PS4, for example as may be detected at block 504.
  • the focal distance data may include the focal distance, and accordingly determining the focal distance may not require further processing and/or interpretation of focal distance data.
  • control circuit may be disposed on the contact lens. In some examples, the control circuit may be located somewhere other than the contact lens. In some examples, the control circuit may be included in a mobile electronic device, a PDA, a computer, a cloud-based computing device, or the like. In general, the control circuit may be configured to process the focal distance data.
  • process 500 including blocks 502, 504, 506, and/or 508 may occur in real time.
  • processing focal distance data may include calibrating the determined focal distance to the perception of the contact lens wearer.
  • calibration may be optional.
  • calibration may occur as infrequently as once for a wearer of the contact lens, in some examples upon the wearer's first use of the contact lens.
  • calibration may be repeated, for example repeated to calibrate different focal distances or for example repeated at different times.
  • Process of method 500 may stop after block 508. In some embodiments,
  • process of method 500 may be repeated, beginning again at block 502, or the like. In some examples, process of method 500 may repeat without performing optional block 508, and may continue from block 506 to block 502, or the like.
  • Fig. 6 illustrates a flow diagram of an example method 600 for calibrating a determined focal distance with the perception of the individual wearing a contact lens, arranged in accordance with at least some embodiments of the present disclosure.
  • method 600 may be performed by any suitable device, devices, or systems such as those discussed herein.
  • method 600 may process focal distance data that may be provided via, for example, the communication device of Fig. 5 at block 506, though method 600 is not limited to this purpose.
  • method 600 may be similar to, encompass, and/or represent a portion of Fig. 5 at block 508, "Process focal distance data at a control circuit".
  • Method 600 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 6 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter.
  • Method 600 may include one or more of functional operations as indicated by one or more of blocks 602, and/or 604. The process of method 600 may begin at block 602.
  • a focal distance associated with an eye of the wearer of the contact lens may be determined at a control circuit. As described herein, particularly with respect to Fig. 5 and block 508, the focal distance may be determined based in part on processing focal distance data. Process of method 600 may continue from block 602 to block 604.
  • an association between accommodation of the eye and the focal distance may be calibrated.
  • human eyes may be unique and the diopter of an eye may differ from that of another eye.
  • two different people exhibiting similar measured accommodation of the eye, for example when a distance between PS3 116 and PS4 118 images may be similar, may perceive focus at different focal distances.
  • different people focused on the same object at the same distance may exhibit different measured accommodations of the eye. These differences may be measured, such as at block 602, and calibrated for the individual wearing the contact lens.
  • calibration may include a best fit curve to associate the accommodation of the eye at one or more distances with one or more focal distances perceived at those distances.
  • the calibrated distances may include positions such as 25 cm, 33 cm, 1 m, 2 m, 5 m, etc., which correspond to diopters of 4D, 3D, 1D, 0.5D, and 0.2D, respectively; when associated with corresponding detected accommodations of the eye, these may provide data to develop a best fit forming a relationship between a measured indication of accommodation and a corresponding perceived focal distance of the subject.
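  • A minimal sketch of such a best fit appears below, using an ordinary least-squares line between a measured indication of accommodation (here a hypothetical PS3-PS4 separation) and diopters at the calibration distances listed above; the measured separations are invented example values, not data from the disclosure:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit: ys ~= slope * xs + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Calibration targets at 25 cm, 33 cm, 1 m, 2 m, and 5 m correspond to
# 4 D, 3 D, 1 D, 0.5 D, and 0.2 D. The separations measured while the
# wearer fixates each target are hypothetical example values (micrometres).
diopters     = [4.0, 3.0, 1.0, 0.5, 0.2]
measured_sep = [352.0, 331.0, 289.0, 278.0, 272.0]

slope, intercept = fit_line(measured_sep, diopters)

def perceived_focal_distance_m(separation):
    """Map a new measured separation to the wearer's perceived focal distance."""
    d = slope * separation + intercept
    return float("inf") if d <= 0 else 1.0 / d

print(round(perceived_focal_distance_m(300.0), 2))  # e.g. ~0.65 m
```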
  • Calibration may be done when appropriate, for example when the contact lens wearer wears the contact lens for the first time, occasionally, and/or periodically.
  • the AR image device may include any appropriate AR image device.
  • the AR image device may be disposed on the contact lens.
  • the AR image device may include a head mounted display.
  • the AR image device may include one or more fixed display units, including, for example, projectors and/or displays.
  • Process of method 600 may end at block 604. In some examples, process of method 600 may be repeated, beginning again at block 602.
  • Fig. 7 illustrates a flow diagram of an example method 700 for calibrating a known object distance to a measured focal distance, arranged in accordance with at least some embodiments of the present disclosure.
  • method 700 may be performed by any suitable device, devices, or systems such as those discussed herein.
  • Method 700 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 7 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter.
  • Method 700 may include one or more of functional operations as indicated by one or more of blocks 702, 704, and/or 706. The process of method 700 may begin at block 702.
  • an AR image focused at a known focal distance may be provided.
  • the AR image may be provided by an alternate reality imaging device or any suitable alternative.
  • the alternate reality imaging device may project, overlay, and/or display computer generated data onto a view of the surroundings, or the like, which may permit viewing data without requiring a user to look away from a scene.
  • the provided AR image may have the appearance of a target, or the like.
  • the provided AR image may be displayed so as to be perceived at a known distance.
  • an AR imaging device may provide an image to be perceived at any suitable distance, including at infinity. Process of method 700 may continue from block 702 to block 704.
  • an accommodation of an eye wearing a contact lens may be determined when focused on the AR image.
  • the accommodation of the eye may correspond to a distance between images PS3 and PS4.
  • the contact lens may determine the accommodation of the eye as discussed herein, particularly with respect to Figs. 1-5 and elsewhere herein.
  • contact lens 201 may emit IR light, detect reflections of the IR light, and/or provide focal distance data that may be processed to determine the accommodation of the eye.
  • a user may focus on the target and the accommodation of the eye may be determined.
  • Process of method 700 may continue from block 704 to block 706.
  • the accommodation of the eye when focused on the target may be associated with the known focal distance of the provided AR image.
  • the association may be stored for later use and/or processed immediately.
  • the AR image device may include any appropriate AR image device.
  • process of method 700 may be repeated as desired at different focal distances to perform best fit analysis over a range of focal distances.
  • process of method 700 may repeat at block 702.
  • Process of method 700 may stop after 706.
  • Fig. 8 illustrates a flow diagram of an example method 800 for associating a particular focal distance to an adjusted object distance, arranged in accordance with at least some embodiments of the present disclosure.
  • the associated data may permit the detection of a current focal distance of an eye to allow an AR image device to display information to be perceived at the same focal distance.
  • Method 800 sets forth various functional blocks or actions that may be described as processing steps, functional operations, events and/or acts, etc., which may be performed by hardware, software, and/or firmware. Numerous alternatives to the functional blocks shown in Fig. 8 may be practiced in various implementations. For example, intervening actions not shown and/or additional actions not shown may be employed and/or some of the actions shown may be eliminated, without departing from the scope of claimed subject matter.
  • Method 800 may include one or more of functional operations as indicated by one or more of blocks 802, 804, 806, and/or 808. The process of method 800 may begin at block 802.
  • a user may focus the eye at a first focal distance and an accommodation of the eye may be determined when focused at the first focal distance.
  • the first distance may be any suitable distance.
  • the user may focus on an object in a viewport, for example a wall or the like.
  • determining the accommodation of the eye may proceed as may be further discussed with respect to Fig. 7 and elsewhere herein, particularly with respect to block 704.
  • the user may not substantially change their gaze and/or change focal distance for the duration of method 800; in some examples, block 802 may be performed at any time prior to block 808. Process of method 800 may continue from block 802 to block 804.
  • an AR image may be provided. As described herein, and with respect to Fig. 7 above, particularly block 702, the AR image may be provided at any suitable focal distance. In some examples, the provided AR image may not initially coincide with the first focal distance. Process of method 800 may continue from block 804 to block 806.
  • a user interface may be provided so that the user may adjust the focal distance of the displayed AR image. In general, the user will adjust the provided AR image until the AR image coincides with the first focal distance.
  • Process of method 800 may continue from block 806 to block 808.
  • "Associate the accommodation of the eye to the adjusted focal distance of the AR image” the accommodation of the eye, as determined at block 802, may be associated to the adjusted focal distance of the AR image. In some examples, this association may be used as described herein to calibrate an AR imaging device with the perception of an individual person. In some examples, as described herein, this association may be repeated as necessary at different focal distances to perform best fit analysis over a range of focal distances.
  • the AR image device may include any appropriate AR image device.
  • process of method 800 may be repeated as desired at different focal distances to perform best fit analysis over a range of focal distances.
  • process of method 800 may repeat at block 802.
  • process of method 800 may stop after 808.
  • Fig. 9 illustrates an example computer program product 900, arranged in accordance with at least some embodiments of the present disclosure.
  • Computer program product 900 may include machine readable non- transitory medium having stored therein instructions that, when executed, may operatively enable a computing device to measure eyeball focus and/or calibrate measured focus data according to the processes and methods discussed herein.
  • Computer program product 900 may include a signal bearing medium 902.
  • Signal bearing medium 902 may include one or more machine-readable instructions 904, which, when executed by one or more processors, may operatively enable a computing device to provide the functionality described herein. In various examples, some or all of the machine-readable instructions may be used by the devices discussed herein.
  • signal bearing medium 902 may encompass a computer-readable medium 906, such as, but not limited to, a hard disk drive, a Compact Disc (CD), a Digital Versatile Disk (DVD), a digital tape, memory, etc.
  • signal bearing medium 902 may encompass a recordable medium 908, such as, but not limited to, memory, read/write (R/W) CDs, R/W DVDs, etc.
  • signal bearing medium 902 may encompass a communications medium 910, such as, but not limited to, a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communication link, a wireless communication link, etc.).
  • signal bearing medium 902 may encompass a machine readable non-transitory medium.
  • Fig. 10 is a block diagram illustrating an example computing device 1000, arranged in accordance with at least some embodiments of the present disclosure.
  • computing device 1000 may be configured to measure eyeball focus and/or calibrate measured focus data as discussed herein.
  • computing device 1000 may include one or more processors 1010 and system memory 1020.
  • a memory bus 1030 can be used for communicating between the processor 1010 and the system memory 1020.
  • processor 1010 may be of any type including but not limited to a microprocessor (μP), a microcontroller (μC), a digital signal processor (DSP), or any combination thereof.
  • Processor 1010 can include one or more levels of caching, such as a level one cache 1011 and a level two cache 1012, a processor core 1013, and registers 1014.
  • the processor core 1013 can include an arithmetic logic unit (ALU), a floating point unit (FPU), a digital signal processing core (DSP Core), or any combination thereof.
  • a memory controller 1015 can also be used with the processor 1010, or in some implementations the memory controller 1015 can be an internal part of the processor 1010.
  • system memory 1020 may be of any type including but not limited to volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.) or any combination thereof.
  • System memory 1020 may include an operating system 1021, one or more applications 1022, and program data 1024.
  • Application 1022 may include eyeball focus measuring application 1023 that can be arranged to perform the functions, actions, and/or operations as described herein including the functional blocks, actions, and/or operations described herein.
  • Program Data 1024 may include eyeball focus measuring data 1025 for use with eyeball focus measuring application 1023.
  • application 1022 may be arranged to operate with program data 1024 on an operating system 1021. This described basic configuration is illustrated in Fig. 10 by those components within dashed line 1001.
  • Computing device 1000 may have additional features or functionality, and additional interfaces to facilitate communications between basic configuration 1001 and any required devices and interfaces.
  • a bus/interface controller 1040 may be used to facilitate communications between the basic configuration 1001 and one or more data storage devices 1050 via a storage interface bus 1041.
  • the data storage devices 1050 may be removable storage devices 1051, non-removable storage devices 1052, or a combination thereof.
  • removable storage and non-removable storage devices include magnetic disk devices such as flexible disk drives and hard-disk drives (HDD), optical disk drives such as compact disk (CD) drives or digital versatile disk (DVD) drives, solid state drives (SSD), and tape drives to name a few.
  • Example computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
  • System memory 1020, removable storage 1051 and non-removable storage 1052 are all examples of computer storage media.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information and which may be accessed by computing device 1000. Any such computer storage media may be part of device 1000.
  • Computing device 1000 may also include an interface bus 1042 for facilitating communication from various interface devices (e.g., output interfaces, peripheral interfaces, and communication interfaces) to the basic configuration 1001 via the bus/interface controller 1040.
  • Example output interfaces 1060 may include a graphics processing unit 1061 and an audio processing unit 1062, which may be configured to communicate to various external devices such as a display or speakers via one or more A/V ports 1063.
  • Example peripheral interfaces 1070 may include a serial interface controller 1071 or a parallel interface controller 1072, which may be configured to communicate with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device, etc.) or other peripheral devices (e.g., printer, scanner, etc.) via one or more I/O ports 1073.
  • An example communication interface 1080 includes a network controller 1081, which may be arranged to facilitate communications with one or more other computing devices 1083 over a network communication via one or more communication ports 1082.
  • a communication connection is one example of communication media.
  • Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media.
  • a "modulated data signal" may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared (IR) and other wireless media.
  • the term computer readable media as used herein may include both storage media and communication media.
  • Computing device 1000 may be implemented as a portion of a small-form factor portable (or mobile) electronic device such as a cell phone, a mobile phone, a tablet device, a laptop computer, a personal data assistant (PDA), a personal media player device, a wireless web-watch device, a personal headset device, an application specific device, or a hybrid device that includes any of the above functions.
  • Computing device 1000 may also be implemented as a personal computer including both laptop computer and non-laptop computer configurations.
  • computing device 1000 may be implemented as part of a wireless base station or other wireless system or device.
  • Insofar as block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof.
  • In some embodiments, portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats.
  • Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a flexible disk, a hard disk drive (HDD), a Compact Disc (CD), a Digital Versatile Disk (DVD), a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communication link, a wireless communication link, etc.).
  • any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components.
  • any two components so associated can also be viewed as being “operably connected”, or “operably coupled”, to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable”, to each other to achieve the desired functionality.
  • examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.

Landscapes

  • Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention concerns a contact lens (201) for measuring the focus of an eyeball. Said contact lens (201) comprises: a contact lens (201) configured to be worn on an eye (101); an infrared (IR) emitter (210) disposed on the contact lens (201), the IR emitter (210) being configured to emit infrared light; an IR sensor (220) disposed on the contact lens (201), the IR sensor (220) being configured to detect one or more reflections of the infrared light; and a communication device (230) disposed on the contact lens (201), the communication device (230) being communicatively coupled to the IR sensor (220) and configured to transmit focal distance data associated with the eye (101).
PCT/CN2013/070062 2013-01-05 2013-01-05 Lentille de contact pour mesure de la mise au point d'un globe oculaire WO2014106330A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2013/070062 WO2014106330A1 (fr) 2013-01-05 2013-01-05 Lentille de contact pour mesure de la mise au point d'un globe oculaire

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2013/070062 WO2014106330A1 (fr) 2013-01-05 2013-01-05 Lentille de contact pour mesure de la mise au point d'un globe oculaire

Publications (1)

Publication Number Publication Date
WO2014106330A1 true WO2014106330A1 (fr) 2014-07-10

Family

ID=51062134

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/070062 WO2014106330A1 (fr) 2013-01-05 2013-01-05 Lentille de contact pour mesure de la mise au point d'un globe oculaire

Country Status (1)

Country Link
WO (1) WO2014106330A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5535743A (en) * 1992-12-19 1996-07-16 Boehringer Mannheim Gmbh Device for the in vivo determination of an optical property of the aqueous humour of the eye
US20030202155A1 (en) * 2002-04-25 2003-10-30 Andre Berube Multi-focal contact lens
US20090204207A1 (en) * 2007-02-23 2009-08-13 Pixeloptics, Inc. Advanced Electro-Active Optic Device
WO2011067391A1 (fr) * 2009-12-04 2011-06-09 Varioptic Dispositif ophtalmique de focalisation à commande électronique

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ROSALES, PATRICIA ET AL.: "Crystalline lens radii of curvature from Purkinje and Scheimpflug imaging", JOURNAL OF VISION, vol. 6, 19 September 2006 (2006-09-19), pages 1057-1067 *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190353894A1 (en) * 2013-01-24 2019-11-21 Yuchen Zhou Method of utilizing defocus in virtual reality and augmented reality
US11006102B2 (en) * 2013-01-24 2021-05-11 Yuchen Zhou Method of utilizing defocus in virtual reality and augmented reality
US9854437B1 (en) 2014-06-13 2017-12-26 Verily Life Sciences Llc Apparatus, system and method for exchanging encrypted communications with an eye-mountable device
US9880401B2 (en) 2014-06-13 2018-01-30 Verily Life Sciences Llc Method, device and system for accessing an eye-mountable device with a user interface
US9678361B2 (en) 2014-06-13 2017-06-13 Verily Life Sciences Llc Power delivery for accommodation by an eye-mountable device
US10353463B2 (en) * 2016-03-16 2019-07-16 RaayonNova LLC Smart contact lens with eye driven control system and method
WO2019033334A1 (fr) * 2017-08-17 2019-02-21 Xinova, LLC Lentilles de contact ayant des caractéristiques bifocales
US11793674B2 (en) 2017-08-17 2023-10-24 Lutronic Vision Inc. Contact lenses with bifocal characteristics
US20190179165A1 (en) * 2017-12-12 2019-06-13 RaayonNova, LLC Smart Contact Lens with Embedded Display and Image Focusing System
US11333902B2 (en) * 2017-12-12 2022-05-17 RaayonNova LLC Smart contact lens with embedded display and image focusing system
CN112400134A (zh) * 2018-07-13 2021-02-23 德遁公司 用于眼部安装式成像系统的高级光学设计
CN112400134B (zh) * 2018-07-13 2022-05-06 德遁公司 用于眼部安装式成像系统的高级光学设计
CN114779471A (zh) * 2022-03-24 2022-07-22 闽都创新实验室 基于纳米像元阵列的眼机接口及其工作方法
CN114779471B (zh) * 2022-03-24 2024-05-28 闽都创新实验室 基于纳米像元阵列的眼机接口及其工作方法

Similar Documents

Publication Publication Date Title
WO2014106330A1 (fr) Lentille de contact pour mesure de la mise au point d'un globe oculaire
US9213406B2 (en) Head-mount eye tracking system with improved determination of gazing position
KR101890542B1 (ko) 디스플레이 향상을 위한 시스템 및 방법
KR102366110B1 (ko) 번쩍임의 광원에의 매핑
ES2957329T3 (es) Sistemas y métodos para el seguimiento ocular en aplicaciones de realidad virtual y de realidad aumentada
CN107209551B (zh) 用于注视跟踪的系统和方法
US8619065B2 (en) Universal stylus device
US20230359274A1 (en) Eye tracking system for use in head-mounted display units and method of operating same
EP3797376A1 (fr) Éclairage en champ et imagerie de suivi oculaire
TW201512633A (zh) 感測照度及距離之光學感測器
US11243607B2 (en) Method and system for glint/reflection identification
US11455031B1 (en) In-field illumination for eye tracking
US9668319B2 (en) Lighting system and control method thereof
US10908425B2 (en) Transmission-type head mounted display apparatus, display control method, and computer program
CN111630478A (zh) 高速交错双眼跟踪系统
US11216066B2 (en) Display device, learning device, and control method of display device
KR102629149B1 (ko) 외부 광에 따라 디스플레이의 특성을 변경하는 전자 장치 및 방법
TW201427418A (zh) 感測裝置以及感測方法
US20240144533A1 (en) Multi-modal tracking of an input device
JP6607254B2 (ja) ウェアラブル電子機器、ウェアラブル電子機器のジェスチャー検知方法およびウェアラブル電子機器のジェスチャー検知プログラム
JP6555707B2 (ja) 瞳孔検出装置、瞳孔検出方法及び瞳孔検出プログラム
KR20150091724A (ko) 착용형 안경장치
KR20160066451A (ko) Hmd 디바이스 및 그 제어 방법
US11899209B2 (en) Eye tracking system for VR/AR HMD units including pancake lens modules
US20180260068A1 (en) Input device, input control method, and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13870087

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13870087

Country of ref document: EP

Kind code of ref document: A1