WO2014179857A1 - Autofocus eyeglass system - Google Patents

Autofocus eyeglass system

Info

Publication number
WO2014179857A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
pixels
imager
feature
predetermined
Prior art date
Application number
PCT/CA2014/000377
Other languages
English (en)
Inventor
Ichiro Shinkoda
Original Assignee
Ravenni Technology Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ravenni Technology Inc.
Publication of WO2014179857A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 7/00 Optical parts
    • G02C 7/02 Lenses; Lens systems; Methods of designing lenses
    • G02C 7/08 Auxiliary lenses; Arrangements for varying focal length
    • G02C 7/081 Ophthalmic lenses with variable focal length
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 7/00 Optical parts
    • G02C 7/02 Lenses; Lens systems; Methods of designing lenses
    • G02C 7/08 Auxiliary lenses; Arrangements for varying focal length
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/19 Sensors therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals

Definitions

  • the invention relates to the automatic adjustment of the focal length of eyeglasses to adapt to a varying object distance.
  • Presbyopia is the loss of accommodation in the lenses of the eye due to the aging process. In humans the symptoms typically appear near the age of 40 and generally, by the age of 55, a pair of bifocal glasses is required to read a book and to correct vision for a different distance. Absolute presbyopia is the condition where the depth of field of the eye is predominantly determined by the size of the iris. This pertains when the lenses of the eyes have lost almost all of their ability to change their power. At that point a pair of progressive glasses may be required, with optical compensation to enable reading a book, reading a monitor, seeing distant objects and for regions between those distances.
  • Progressive glasses are eyeglasses in which the optical power compensation of a particular lens is varied across that lens with very little demarcation between the regions of differing optical power.
  • the optical power compensation of the lens may be considered a continuous function or near-continuous function of position.
  • Another method by which to compensate for the effect of presbyopia is by employing devices that can change the optical power of a substantial area of the lens.
  • Such devices are commercially available, examples being the Superfocus™ and Empower™ products.
  • the first product varies the optical power of a liquid lens contained within a flexible membrane structure. This is achieved by varying the amount of liquid in the lens to change the surface optical shape of the membrane structure.
  • a voltage applied to an index changing electro-active material is varied to thereby vary the compensation.
  • the optical compensation can be adjusted accurately for the viewing distance.
  • the devices developed to date for finding the distance to the object being viewed include eye-safe laser range finders, infrared gaze monitoring devices and electrooculography detection systems. Each has its strengths and weaknesses. Generally, these devices are prominent and not suitable, given the constraints of the form factor of ordinary eyeglasses, much of which is dictated by aesthetics.
  • a method for determining the gaze direction of an eye of a user comprising: (a) directing an image of the eye onto a first plurality of imaging pixels of an imager; and (b) analyzing image information from a predetermined second plurality of pixels of the imager.
  • the second plurality of pixels may be a smaller number of pixels than the first plurality of pixels and/or a sparse subset of all the pixels of the imager.
  • the analyzing may comprise comparing an intercept of a feature of the eye with the sparse subset of pixels with predetermined intercepts of the feature with the sparse subset of pixels.
  • the predetermined intercepts may be based on image data from the first plurality of imaging pixels.
  • the method may further comprise selecting the second plurality of pixels from at least one scan line of pixels of the imager.
  • the selecting the second plurality of pixels may be based on a signal-to-noise ratio of the at least one scan line of pixels.
  • the method may further comprise selecting the second plurality of pixels to be an integer number of scan lines of the imager.
  • the selecting the second plurality of pixels may comprise selecting the second plurality of pixels to be at least one set of contiguously arrayed scan lines of the imager and the at least one set of scan lines may be oriented parallel to the horizon of the eye.
  • the analyzing may comprise comparing the intercept of a limbus of the eye with predetermined intercepts of the limbus with the second plurality of pixels.
  • the method may comprise selecting the second plurality of pixels to be located one of above and below the horizon of the eye.
  • the analyzing may comprise comparing the intercept of a feature on the sclera of the eye with predetermined intercepts of the feature on the sclera with the second plurality of pixels.
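As an illustration of the sparse-subset comparison described above (not part of the claims), the sketch below finds, for each predetermined scan line, the column where the image signal falls to 50% of its bright-to-dark range, and matches the resulting intercepts against intercepts stored during calibration. The helper names, the `gaze_map` structure, and the assumption of a 2-D grayscale `frame` array are illustrative.

```python
import numpy as np

def limbus_intercepts(frame, scan_rows):
    """For each selected scan line, return the column where the signal
    first drops below 50% of its bright/dark range, i.e. the approximate
    sclera-to-iris transition (the limbus)."""
    intercepts = []
    for r in scan_rows:
        line = frame[r].astype(float)
        half = (line.max() + line.min()) / 2.0
        below = np.nonzero(line < half)[0]
        intercepts.append(float(below[0]) if below.size else np.nan)
    return np.array(intercepts)

def match_gaze_direction(measured, gaze_map):
    """gaze_map: sequence of (stored_intercepts, gaze_direction) pairs
    recorded in calibration mode; return the direction whose stored
    intercepts are the nearest neighbour of the measured ones."""
    best = min(gaze_map,
               key=lambda entry: np.nansum((np.asarray(entry[0]) - measured) ** 2))
    return best[1]
```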
  • the invention provides a method for determining a gaze distance of first and second eyes of a user, the method comprising: (a) determining a first gaze direction of the first eye by directing an image of the first eye onto a first plurality of imaging pixels of a first imager and analyzing image information from a predetermined second plurality of pixels of the first imager; (b) determining a second gaze direction of the second eye by directing an image of the second eye onto a third plurality of imaging pixels of a second imager and analyzing image information from a predetermined fourth plurality of pixels of the second imager; and (c) determining the gaze distance from a mutual spatial intercept of the first and second gaze directions.
  • the first and second imagers may be the same imager.
  • the method for determining a gaze distance of first and second eyes of a user may in other embodiments comprise: (a) determining a first gaze direction of the first eye by directing an image of the first eye onto a first plurality of imaging pixels of a first imager and comparing the intercept of a first feature of the first eye with predetermined intercepts of the first feature with a first sparse subset of the first plurality of pixels; (b) determining a second gaze direction of the second eye by directing an image of the second eye onto a second plurality of imaging pixels of a second imager and comparing the intercept of a second feature of the second eye with predetermined intercepts of the second feature with a second sparse subset of the second plurality of pixels; and (c) determining the gaze distance from a mutual spatial intercept of the first and second gaze directions.
  • the first feature may be one of a limbus of the first eye and a feature of a sclera of the first eye and the second feature may be one of a limbus of the second eye and a feature of a sclera of the second eye.
  • the first and second imagers may be the same imager.
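For the simple symmetric case, the mutual spatial intercept of the two gaze directions reduces to a triangulation against the interocular baseline. A minimal sketch follows; the 63 mm default baseline and the assumption that both gaze rays lie in a single horizontal plane are illustrative, not from the patent.

```python
import math

def gaze_distance(theta_left, theta_right, interocular_m=0.063):
    """Estimate the gaze distance (metres) from the horizontal gaze
    angles of the two eyes (radians from straight ahead, convergence
    positive), i.e. the mutual spatial intercept of the two gaze rays."""
    vergence = theta_left + theta_right      # total convergence angle
    if vergence <= 0:
        return math.inf                      # parallel or diverging rays: far gaze
    return (interocular_m / 2.0) / math.tan(vergence / 2.0)
```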
  • an automatically focused eyeglass system comprising: (a) a frame configured to engage with a nose and ears of a user; (b) first and second adjustable focus lenses disposed in the frame to be located in front of respectively a first and a second eye of the user when the frame is engaged with the nose and ears of the user; (c) first and second imagers comprising first and second respective pluralities of imaging pixels disposed to image respectively the first and second eye; (d) a focus adjustment subsystem for changing the focus of the first and second adjustable focus lenses; (e) a memory; (f) a power source; and (g) a controller configured for, when the frame is engaged with the nose and ears of the user, directing the focus adjustment subsystem to change a first focal length of the first adjustable focus lens based on first information about the first eye obtained from a first sparse subset of the first plurality of imaging pixels and to change a second focal length of the second adjustable focus lens based on second information about the second eye obtained from a second sparse subset of the second plurality of imaging pixels.
  • At least one of the first information and the second information may be information about a limbus of the corresponding one of the first and second eyes. Alternatively, at least one of the first information and the second information is information about a feature on the sclera of the corresponding one of the first and second eyes. At least one of the first and second sparse subsets may comprise an integer number of contiguously arrayed scan lines of the corresponding imager.
  • the focus adjustment subsystem, the memory, the power source and the controller may be embedded in the frame.
  • the automatically focused eyeglass system may further comprise a computer in communication with the controller for receiving and sending data between the controller and the computer. At least one of the first and second sparse subsets may form a mathematically describable curve.
  • the first and second imagers may be the same imager.
  • the focus adjustment subsystem may be restricted to change the focus of the first and second adjustable focus lenses to an integer number of predetermined viewing distances.
  • the integer number of predetermined viewing distances may be three.
  • the three predetermined viewing distances may be 35 cm ± 5 cm, 65 cm ± 10 cm, and 250 cm ± 40 cm.
  • the successive predetermined viewing distances among the integer number of predetermined viewing distances may be separated from each other based on a constant incremental lens power.
  • the constant incremental lens power may be a constant factor of a user focal depth lens power variation.
  • the constant factor may be greater than zero and less than or equal to 2.
  • the constant difference in lens power may be a predetermined amount of lens power.
  • the predetermined amount of lens power may be ¼ diopter.
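For illustration, converting the three example distances to lens power P = 1/d shows that they are roughly evenly spaced in diopters rather than in meters:

$$
P=\frac{1}{d}:\qquad \frac{1}{0.35\ \text{m}}\approx 2.86\ \text{D},\qquad \frac{1}{0.65\ \text{m}}\approx 1.54\ \text{D},\qquad \frac{1}{2.50\ \text{m}}=0.40\ \text{D}
$$

The successive differences, about 1.32 D and 1.14 D, are close to constant, consistent with the constant-incremental-lens-power scheme described above.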
  • At least one of the first and second imagers may further be disposed to image at least one feature of a face of the user and the controller may be configured for determining from the image of the face whether one of the first and second adjustable focus lenses has substantially changed position relative to the corresponding eye; and for directing the focus adjustment subsystem to change a focal length of the corresponding adjustable focus lens based on the change in position.
  • Figure 1 is an auto focus eyeglass system in use with a user.
  • Figure 2a is a projection of three horizontal pixel sections onto an image of the eye of a user.
  • Figure 2b is a projection of the three horizontal pixel sections of Fig.2a onto an image of the eye of the user with the gaze of the eye directed upwards.
  • Figure 3 is a flowchart of a method for determining the gaze direction of an eye in an autofocus eyeglass system.
  • Figure 4 is a flowchart of a method for calibrating an autofocus eyeglass system.
  • Figure 5 is a flowchart of a method for automatically focusing an autofocus eyeglass system.
  • the autofocus eyeglass system 100 of the present invention is shown in Figure 1 and comprises adjustable focus lenses 10 and 20, a distance-finding subsystem 30 for determining the distance to an object being viewed, focus adjustment subsystem 40 for changing the focus of the adjustable focus lenses 10 and 20 to bring the object into focus for the user, a controller 50, at least one non-volatile memory 60, a power source 90, and a frame 25.
  • distance-finding subsystem 30, power source 90, and non-volatile memory 60 may be physically incorporated in frame 25.
  • the distance-finding subsystem 30 and focus adjustment subsystem 40 can be combined in some embodiments or distributed in various devices in other embodiments to conform to space and weight constraints.
  • Other embodiments of the invention may include various combinations of the components of the autofocus eyeglasses to conform to space and weight constraints.
  • Further embodiments of the apparatus may include a miniature connector, such as a micro USB port commonly found in consumer devices, to charge the battery or to provide a means to communicate with a device to program the controller 50 or write to non-volatile memory 60.
  • Still further embodiments may include a wireless means to charge or to provide a means to communicate with a device to program the controller or write to memory.
  • the distance-finding subsystem 30 comprises two detectors 70 and 80, which may be without limitation two cameras, attached to the frame 25. Camera 70 is directed at eye 12, and camera 80 is directed to eye 22.
  • the total field of view of each of the cameras is at least 125% of the diameter of the cornea of the human eye in the horizontal direction and at least 50% of the diameter of the cornea of the human eye in the vertical direction.
  • the cameras may each be without limitation a digital camera.
  • the digital camera may be a miniature digital camera and the lens of the digital camera may be, without limitation, a single aspherical lens.
  • the camera may comprise a multipixel detector array.
  • the distance-finding subsystem 30 comprises a detector, which may be without limitation a single camera, attached to the bridge of the frame.
  • the single camera is directed to capture images of both eyes.
  • the digital camera may be a miniature digital camera and the lens of the digital camera may be, without limitation, a fisheye lens.
  • the lens of the digital camera may include a curved mirror to direct the light from both eyes to be received by the digital camera.
  • the non-volatile memory 60 may contain predetermined information regarding the characteristics of a typical human eye.
  • the memory may also contain predetermined custom information regarding characteristics of the eyes 12 and 22 of the individual user.
  • the predetermined information may include, without limitation, the interocular distance, the position of the pupils of the eyes 12 and 22 relative to the lenses 10 and 20 and the distance of the eyeglasses from the vertices of the corneas. These three quantities place the center of the pupil of each eye 12 and 22 in three dimensions with respect to the lens 10 and 20 serving each respective eye. To this end a reference point may be selected on each of lenses 10 and 20 with respect to which to express position.
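A minimal sketch of how such predetermined quantities might be held in the non-volatile memory 60; the structure and field names are illustrative assumptions, not the patent's data layout.

```python
from dataclasses import dataclass

@dataclass
class EyeGeometry:
    """Per-user quantities placing the pupil centre in three dimensions
    relative to a reference point on the corresponding lens."""
    interocular_mm: float                  # distance between the pupils
    pupil_offset_mm: tuple[float, float]   # (x, y) of pupil centre vs. lens reference point
    vertex_distance_mm: float              # lens reference point to corneal vertex
```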
  • the autofocus eyeglass system 100 of the present invention is prepared for use by first calibrating the distance-finding subsystem 30 to characteristics of the eyes 12 and 22 of the user.
  • the calibration process may be conducted, for example, by putting the autofocus eyeglass system 100 into a calibration mode.
  • the user looks at a clearly definable object at some close distance, while input buttons 52 and 54 on the frame 25 are adjusted to vary the focus of the lenses 10 and 20 until the object of interest is in sharp focus as determined by the user.
  • the user then may enter the focus adjustment subsystem settings into the non-volatile memory 60 by, for example without limitation, pressing "Enter" button 56.
  • the user looks at a second object at a distance different from the first object, and the autofocus system adjusts the focus of the lenses 10 and 20 again.
  • the new settings of the adjustment subsystem 40 may be entered into the non-volatile memory 60 by, for example without limitation, pressing "Enter" button 56.
  • the number of times the steps must be repeated will depend on the number of parameters being adjusted while the autofocus eyeglass system 100 is in the calibration mode.
  • the three dimensional position and orientation of the cameras 70 and 80, expressed in the reference frames of the lenses 10 and 20 respectively, may be one group of such parameters.
  • Measured optical parameters of the lenses 10 and 20, and the interocular separation of the eyes, are other parameters that may be modified in calibration mode. In this case, an initial number may be used, but fine-tuning of the parameter will occur during the calibration mode. The cycle repeats until no further manual adjustments are required.
  • the device by these steps builds or refreshes a table of parameters characterizing the user's eye vergence-accommodation requirements and self-calibrates any changes to the operating characteristics of the autofocus eyeglass system 100. It is advantageous to vary both the gaze distance and the direction of the gaze of each eye during the calibration mode.
  • the gaze distances measured while the autofocus eyeglass system 100 is in the calibration mode may be varied over a substantial portion of the range from a nearest point of about 20 cm to a far point of about 7 meters.
  • the direction of the gaze of each eye may substantially cover a range of 50 degrees to either side of looking directly ahead.
  • the field of view of the cameras 70 and 80 may also include features of the face.
  • the positions and/or sizes of these features in the images taken by the cameras may be used to determine the orientation and position of the frame relative to the eyes.
  • the set of determined information may be stored in memory.
  • the orientation and position of the frame, as determined from subsequent images, may be compared with the stored information. If a substantial change is detected, then the autofocusing eyeglasses system can modify the setting of the lenses 10 and 20 to compensate for the shift in the lenses relative to the eyes.
  • Cameras 70 and 80 may be the same camera.
  • the controller 50 can compensate for a change in the relative position of the lenses 10,20 to the eyes 12,22 of the user by changing the focus setting of the lenses 10,20 from the focus setting when the lenses 10,20 are in a nominal position.
  • the controller 50 can compensate for the change in the distance between the cornea of the eye 12,22 and the lens 10,20 corresponding to each eye 12,22, a change in gaze direction of each eye due to any optical translation or deviation as a consequence of the movement of the lens 10,20 relative to the eye 12,22, and a change in a distance between the lens 10,20 and a viewed object.
  • the modified quantities are stored in memory and used in subsequent calculations and process inputs.
  • the size of the pupil may be determined from images of the eyes 12,22 by cameras 70,80 and the controller 50 can compensate for a change in the depth of focus of eyes 12,22 with the lighting conditions.
  • the computing processing power or storage capacity of the autofocusing eyeglass system 100 may be augmented by communicating with an external computer 285.
  • the external computer may also provide a means to input instructions and parameters during the calibration mode.
  • eye 12 is shown in a generally "straight ahead" viewing orientation 200 with three horizontal pixel sections comprising imaging pixels of camera 70 superimposed. Alternatively it may be viewed as the image of eye 12 projected onto the multi-pixel imaging array of camera 70. Three horizontal pixel sections 240, 242 and 244, each comprising three rows of pixels, are shown.
  • the present invention may be implemented using one or more horizontal pixel sections of the multi-pixel imaging array of camera 70.
  • Figure 2a shows an embodiment employing three horizontal pixel sections. As shown in Figure 2a, the horizontal pixel sections, and thereby the image frames of camera 70, may be aligned substantially horizontally with respect to the view perceived by eye 12.
  • Figure 2a shows three rows of pixels per horizontal pixel section.
  • the number of rows of pixels included in each horizontal pixel section may vary dynamically with the signal-to-noise ratio in the pixels within a row. The lower the signal-to-noise ratio in each of the pixels in a row, the greater the number of rows required in a horizontal pixel section, as will become clear below.
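Averaging n rows of independent, identically distributed noise improves the signal-to-noise ratio by roughly √n, so the number of rows needed grows with the square of the SNR shortfall. A sketch of that sizing rule, under the independence assumption (illustrative, not from the patent):

```python
import math

def rows_needed(row_snr, target_snr):
    """Rows to average so that row_snr * sqrt(n) reaches the target SNR;
    assumes independent noise from row to row."""
    if row_snr >= target_snr:
        return 1
    return math.ceil((target_snr / row_snr) ** 2)
```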
  • the image frame of camera 70 may be aligned at an angle to the horizontal with respect to the view perceived by the eye 12.
  • the alignment of image frames of camera 70 to eye 12 and the alignment of image frames of camera 80 to eye 22 may be at an angle with respect to each other.
  • With the camera 70 unchanged and the eye 12 reoriented to orientation 200' of Figure 2b for a different gaze direction, the eye 12 now maps differently onto horizontal pixel sections 240, 242 and 244. This difference in eye-to-horizontal pixel section mapping may be employed to determine the gaze direction of eye 12. While not shown in a similar figure, the same principle applies to eye 22 and camera 80 and to the determination of the gaze direction of eye 22.
  • a collection of data is used to determine the focus setting of the lenses for a measured gaze direction of each eye. The collection of data is a gaze map of the autofocus eyeglass system 100.
  • the focus setting of each lens 10 and 20 may be adjusted separately. It is advantageous to interface to the autofocusing eyeglass system 100 with the computer 285 during the calibration mode.
  • the inside diameter of the limbus 215 is approximately 11.7 mm and its width is approximately 1.5 mm. Since the iris 230 behind the cornea 220 is mostly circular, the number of rows of pixels to include in the horizontal pixel sections depends on what portion of the limbus 215 is measured.
  • the autofocusing eyeglass system 100 may be placed in a tracking mode.
  • the tracking mode may use a method for determining the transition point between the sclera and the cornea of eye 12, the method comprising choosing an initial trial number n₀ of adjacent rows of imaging pixels of the imaging array of camera 70 to include in horizontal pixel sections 240, 242 and 244.
  • the initial trial number n₀ may be different for each of the horizontal pixel sections.
  • the signal is then averaged over corresponding pixels adjacent to each other across the n₀ rows. In this process of averaging the signal-to-noise ratio will generally be improved.
  • the image of eye 12 is analyzed and a transition region with the general characteristics of the limbus 215 is identified.
  • the number of rows of pixels included may then be increased from n₀.
  • the noise within the signal at this transition region, and thereby the accuracy, is traded off against the sharpness of the transition representing the limbus 215.
  • the number of rows of pixels included may thus be optimized in order to obtain the minimum number of rows of pixels that need to be included to get a suitably consistent and reliable determination of the limbus 215.
  • the 50% image signal point of the transition is chosen as the transition point.
  • Methods for determining a value for the transition point of a one- dimensional curve defined by discrete data are well known in the art.
  • the transition point accurately describes an intercept of the limbus with a particular pixel section 240, 242 or 244.
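A compact sketch of the row-averaging and 50%-crossing steps just described, with sub-pixel linear interpolation at the crossing; the array layout and the bright-sclera/dark-iris polarity are assumptions, not from the patent.

```python
import numpy as np

def transition_point(section, n0=3):
    """Average the first n0 rows of a horizontal pixel section and return
    the sub-pixel column where the averaged signal crosses 50% of its
    bright/dark range (the limbus intercept), or None if no crossing."""
    profile = section[:n0].astype(float).mean(axis=0)   # average n0 adjacent rows
    half = (profile.max() + profile.min()) / 2.0
    idx = np.nonzero(profile < half)[0]
    if idx.size == 0:
        return None
    i = int(idx[0])
    if i == 0:
        return 0.0
    # linear interpolation between the two pixels straddling the 50% level
    frac = (profile[i - 1] - half) / (profile[i - 1] - profile[i])
    return (i - 1) + float(frac)
```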
  • Three horizontal pixel sections, comprising the optimal number of rows as determined above, are selected. They are chosen such that the three resulting horizontal pixel sections 240, 242 and 244 have a spacing that is more than 20% of the diameter of the cornea.
  • the middle one of the three horizontal pixel sections 240, 242 and 244 may be chosen to intercept the center of the cornea 220 of eye 12.
  • the horizontal positions of the transition points may be used to determine the gaze direction in the horizontal direction by reference to the gaze map created during the calibration mode.
  • the horizontal positions of the transition points may also be used to determine the vertical direction of the gaze direction by reference to the gaze map created during the calibration mode.
  • the power consumed by the imaging system can be reduced in proportion to the ratio between the number of pixels in the active horizontal sections and the total number of pixels in the sensor. Reducing the number of pixels to be processed also allows for the use of a microprocessor which consumes less power to process the information needed to determine the focus setting of the lenses.
  • In a human eye, when the eye changes gaze direction, the eye rotates around a center of rotation.
  • the cornea of the human eye is an optical lens and has a vertex.
  • the center of rotation is located about 13.5 mm behind the vertex of the cornea.
  • Using only two horizontal sections of the type described above will generally provide four positions of intercept on the limbus 215. From these four positions a center of the pupil 230 of eye 12 can be inferred.
  • This center may be determined to an accuracy of within approximately 0.08 degrees as measured relative to the rotation point of the eye.
  • the gaze directions of the two eyes 12 and 22 may then be determined with a suitable accuracy to allow the gaze distance of the two eyes 12 and 22 to be calculated to within an uncertainty equivalent of plus or minus 0.1 diopter.
  • Measured total depth of field for a typical aged human eye with absolute presbyopia is approximately equivalent to a defocus of 0.2 diopter.
  • a value of 0.1 diopter, as an acceptable tolerance for the maximum uncertainty in the calculated equivalent distance, is one possible choice for the required accuracy to which the gaze distance needs to be determined.
  • a feature that corresponds to the pupil 230 of the eye 12 may be present in image data of at least one of the horizontal pixel sections 240, 242 or 244.
  • the pupil may range in size from 2 to 8 mm.
  • the pupil size is smaller than the inner diameter of the limbus and the pupil is generally darker than the sclera in color.
  • the image of the pupil thus provides a signature distinct from the signature of the limbus 215 in the data from a horizontal pixel section which includes a portion of the pupil image.
  • the filter can also be implemented as a digital filter on the image data, which can be more readily enabled and disabled than an optical filter.
  • the right eye 12 may be tracked with the camera 70.
  • the gaze direction of the eye 12 can be determined by measuring the portion of the limbus 215 visible in the profile view.
  • a second method of determining the gaze direction can be used in conjunction with the first method.
  • At least one fixed feature, such as a prominent blood vessel structure or other feature on the sclera 210, may be used.
  • the calibration mode may include the determination of the structures to utilize for the gaze tracking and create a gaze map using the position of those structures in the image frame as indexing parameters.
  • the gaze map for the second method may use a different set of pixels from the first method for determining the gaze direction.
  • a total map that is a combination of the gaze map for the first and the second method may be used to track the gaze of the right eye 12.
  • the total map includes a region where the gaze maps of the first and second methods overlap.
  • the same method applies to eye 22 and camera 80 and to the determination of the gaze direction of eye 22.
  • the autofocus eyeglass system 100 may employ the limbus of both eyes, a feature of the sclera of both eyes, or the sclera of one eye and the limbus of the other while tracking both eyes. Since the occasions where the user is looking to the extreme left or right without turning their head are rare, higher power consumption during that time will not significantly affect the operating time of the autofocus eyeglass system 100.
  • horizontal pixel sections 240, 242 and 244 need not be horizontal and may, in fact, be vertical or be at some angle to the horizontal. Using a sparse set of measurements to determine the gaze direction, preferably with some redundancy so that outlier measurements can be discarded, the system is able to determine accurately the required focus setting to provide sharp focus to the user.
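One way to exploit the redundancy mentioned above is a median-absolute-deviation filter over the redundant intercept measurements; the sketch below is an illustrative choice, not the patent's method.

```python
import numpy as np

def discard_outliers(values, k=3.0):
    """Drop measurements more than k median-absolute-deviations from
    the median; the surviving redundant intercepts can then be averaged."""
    v = np.asarray(values, dtype=float)
    med = np.median(v)
    mad = np.median(np.abs(v - med)) or 1e-9   # guard against zero spread
    return v[np.abs(v - med) / mad <= k]
```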
  • when autofocusing eyeglass system 100 is unable to determine a focus position setting to the required tolerance, autofocusing eyeglass system 100 may set the focus setting to provide focus between about 1 and 3 meters or some other distance range. Once autofocusing eyeglass system 100 determines a focus position setting to the required tolerance, it may set the focus setting to the determined position.
  • In another embodiment of the invention, the controller 50 may vary the focus setting of the adjustable focus lenses 10 and 20 to one of a set of predetermined viewing distances. These positions may correspond to a standard reading distance in a range around 35 cm, a standard computer monitor viewing distance in a range around 65 cm, and a distance of about 2.5 meters. An estimate of the distance to an object of interest is determined by distance-finding subsystem 30.
  • the controller 50 determines which of the predetermined viewing distances optimizes the focus of the object to the user.
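Choosing among the predetermined viewing distances amounts to a nearest-neighbour selection, most naturally done in diopter space since defocus tolerance is roughly constant in lens power. A sketch under that assumption, with preset values taken from the distances named above:

```python
def select_preset(estimated_distance_m, presets_m=(0.35, 0.65, 2.50)):
    """Return the predetermined viewing distance whose lens power (1/d)
    is closest to that of the estimated gaze distance."""
    p_est = 1.0 / estimated_distance_m
    return min(presets_m, key=lambda d: abs(1.0 / d - p_est))
```

For example, an estimated distance of 0.5 m (2.0 D) selects the 0.65 m preset (about 1.54 D), the nearest in diopters.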
  • the set of predetermined viewing distances may include more than 3 viewing distances.
  • the user is able to manually adjust the focus setting of any one of the set of predetermined viewing distances to compensate for fluctuations in the optimal focus setting for the eyes.
  • the manually adjusted focus setting may be stored temporarily in memory 60 and used as the focusing setting for that predetermined viewing distance.
  • the controller 50 may store in memory more than one set of focus settings for a set of predetermined viewing distances. The user may cause the controller 50 to store a new set of focus settings. In other embodiments, there may be more than one set of predetermined viewing distances.
  • the depth of focus of most optical devices depends on the focus setting.
  • a convenient means for expressing the total depth of focus in a manner substantially independent of the position of focus is to describe the total depth of focus as an equivalent lens power.
  • when the focal length is expressed in units of meters, the equivalent lens power is expressed in diopters.
  • a method for determining the viewing distances to be included in the set of predetermined viewing distances comprises choosing a minimum viewing distance and a maximum viewing distance to include in the set of predetermined viewing distances.
  • the term "viewing range" is used in the present specification to describe the physical distance range between the two limits so determined. Associated with this viewing range is a corresponding lens power range.
  • the method further comprises dividing the lens power range into equal segments of incremental lens power. This implies that the corresponding selection of viewing distances is not separated by equal distance segments.
  • a preferred segment size or increment size for the lens power range may be based on the lens power variation corresponding to the depth of focus of the compound lens formed by the eye 12, 22 of the user and the corresponding lens 10, 20 of the autofocus eyeglass system 100.
  • the term "user focal depth lens power variation" is used in the present specification to describe the variation in lens power that is equivalent to the depth of focus of the compound lens formed by the eye 12,22 of the user and the corresponding lens 10,20 of the autofocus eyeglass system 100.
  • the incremental lens power may be chosen to be a constant factor of the user focal depth lens power variation.
  • the constant factor may be greater than zero and less than or equal to two. This approach has the benefit of creating the smallest number of set points to store in memory. This prevents the controller 50 from "hunting" for distance settings and thereby keeps the required processing power and time to a minimum.
  • the adjustment range is divided into predetermined lens power units.
  • One preferred predetermined lens power unit for use in segmenting the range of viewing distances is a quarter of a diopter, a unit commonly used in industry for adjustable optical equipment.
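A sketch of dividing a lens power range into quarter-diopter segments, using the roughly 20 cm to 7 m calibration range mentioned earlier as illustrative limits:

```python
def preset_distances(near_m=0.20, far_m=7.0, step_diopter=0.25):
    """Divide the lens power range between the near and far limits into
    equal quarter-diopter increments and return the corresponding viewing
    distances: equal segments in diopters, unequal segments in metres."""
    p = 1.0 / near_m        # 5.0 D at 20 cm
    p_far = 1.0 / far_m     # about 0.14 D at 7 m
    distances = []
    while p >= p_far:
        distances.append(1.0 / p)
        p -= step_diopter
    return distances
```

With these limits the set contains 20 distances, from 0.20 m out to 4.0 m; the 7 m far point itself falls beyond the last whole quarter-diopter step.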
  • a method for determining the gaze direction of the eye 12 comprises, as shown in the flow chart of Fig. 3, capturing [310] an image data set for the eye 12 of the user, and transferring [315] to the controller 50 only a sparse portion of the image data set corresponding to the regions of the horizontal sections 240, 242, and 244 determined in the calibration mode. The method proceeds further by analyzing [320] the sparse portion of the image data set to determine the transition points of the limbus of the eye 12, and determining [325] the gaze direction of the eye 12, using the parameters in a gaze map generated during a calibration mode.
  • the calibration mode may employ a method for determining the gaze map for autofocus eyeglass system 100 for use in the tracking mode, the method comprising determining [410] which of the device parameters and user specific parameters described with reference to Figure 1 are to be optimized; capturing [420] a full frame image of each of the eyes of the user while the user is looking at an object at a first distance; determining [430] a user preferred focus setting for optimal focus for the first object distance; changing [440] the object distance to a second object distance; repeating the steps [420] to [440] to gather a data pair comprising image data and preferred focus setting; collecting data pairs until, at query point [450], the number of data pairs is at least equal to the number of parameters to be optimized; optimizing [460] the parameters to be optimized until, at query point [470], determining the focus setting using the image data provides focus setting values that agree with the corresponding user preferred focus settings; determining [480] a gaze direction of the eye in the images in the data set using the optimized parameters;
  • determining a second set of parameters by varying them until, at query point [495], calculating the gaze direction of the eye using only the information in the set of sparse pixels of each image of the data set and the values of the second set of parameters provides gaze directions that agree with the corresponding gaze directions calculated using the full frame images; this set of parameters includes parameters for describing the eccentricity and asymmetry of the shape of the limbus, parameters to account for the non-uniform color of the iris and the patterns in the iris, as well as the device and user specific parameters; and thus generating a gaze map for the device for use in the tracking mode.
  • the controller may use the existing value of the parameters to adjust the focus of the lenses prior to accepting input from the user.
  • any new information that may affect a value of one of the parameters being optimized during the calibration mode may initiate the respective optimization method to determine new values for the parameters.
  • Choosing a sparse set of pixels for determining the transition points of the limbus described with reference to Figures 1, 2a and 2b may comprise choosing one or more horizontal sections, a mathematically describable curve, a set of piecewise contiguously arrayed scan lines, a set of pixels which is substantially smaller than the set of pixels in the entire image frame, or some combination thereof.
  • the term "contiguously arrayed" is used in the present specification to describe scan lines of the imagers 70 and 80 that are immediately adjoining one another along their long sides.
  • a set of pixels forming a mathematically describable curve may include those pixels whose area includes a point belonging to a mathematical function overlaid on the sensing surface of the multi-pixel array.
  • a method for automatically focusing the autofocusing eyeglass system 100 comprises, as shown in the flow chart of Fig. 5, selecting [510] a tracking mode, and then determining [520] a gaze direction for each of eyes 12 and 22, as described with reference to Fig. 3. The gaze distance is then determined from the mutual spatial intercept of the two gaze directions, and the focus adjustment subsystem 40 adjusts the lenses 10 and 20 to bring objects at that distance into focus for the user.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Signal Processing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

An autofocus eyeglass system comprises adjustable focus lenses, a distance-finding subsystem for determining the distance to a viewed object, a focus adjustment subsystem for changing the focus of the adjustable focus lenses to bring the object into focus for the user, a controller, at least one non-volatile memory, a power source, and a frame. The autofocus eyeglass system and the associated method use a sparse set of measurements of the eyes of a user to determine the gaze directions of both eyes, determine from the two gaze directions the gaze distance of the two eyes, and then focus the eyeglasses according to that gaze distance. Facial features may be used to detect a change in position of an eyeglass lens relative to the corresponding eye and to compensate for that change in position. Predetermined viewing distances may be separated from one another by a constant incremental optical power.
PCT/CA2014/000377 2013-05-07 2014-04-25 Autofocus eyeglass system WO2014179857A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361820415P 2013-05-07 2013-05-07
US61/820,415 2013-05-07
US201361912730P 2013-12-06 2013-12-06
US61/912,730 2013-12-06

Publications (1)

Publication Number Publication Date
WO2014179857A1 (fr) 2014-11-13

Family

ID=51866562

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2014/000377 WO2014179857A1 (fr) 2013-05-07 2014-04-25 Autofocus eyeglass system

Country Status (1)

Country Link
WO (1) WO2014179857A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105847504A (zh) * 2016-03-07 2016-08-10 乐视移动智能信息技术(北京)有限公司 Method for magnifying mobile phone screen information with adjustable-power glasses, and adjustable-power glasses
WO2017176898A1 (fr) * 2016-04-08 2017-10-12 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
EP3379325A1 (fr) * 2017-03-21 2018-09-26 Essilor International Optical device adapted to be worn by a user
CN109068972A (zh) * 2016-04-08 2018-12-21 维卫尔公司 Device and method for measuring viewing distances
US10371945B2 (en) 2015-03-16 2019-08-06 Magic Leap, Inc. Methods and systems for diagnosing and treating higher order refractive aberrations of an eye
US10962855B2 (en) 2017-02-23 2021-03-30 Magic Leap, Inc. Display system with variable power reflector
US20210132414A1 (en) * 2017-04-20 2021-05-06 Essilor International Optical device adapted to be worn by a wearer

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6478425B2 (en) * 2000-12-29 2002-11-12 Koninlijke Phillip Electronics N. V. System and method for automatically adjusting a lens power through gaze tracking
US20120133891A1 (en) * 2010-05-29 2012-05-31 Wenyu Jiang Systems, methods and apparatus for making and using eyeglasses with adaptive lens driven by gaze distance and low power gaze tracking
US8274578B2 (en) * 2008-05-15 2012-09-25 Sungkyunkwan University Foundation For Corporate Collaboration Gaze tracking apparatus and method using difference image entropy

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10466477B2 (en) 2015-03-16 2019-11-05 Magic Leap, Inc. Methods and systems for providing wavefront corrections for treating conditions including myopia, hyperopia, and/or astigmatism
US10545341B2 (en) 2015-03-16 2020-01-28 Magic Leap, Inc. Methods and systems for diagnosing eye conditions, including macular degeneration
US11474359B2 (en) 2015-03-16 2022-10-18 Magic Leap, Inc. Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US11256096B2 (en) 2015-03-16 2022-02-22 Magic Leap, Inc. Methods and systems for diagnosing and treating presbyopia
US11156835B2 (en) 2015-03-16 2021-10-26 Magic Leap, Inc. Methods and systems for diagnosing and treating health ailments
US10371945B2 (en) 2015-03-16 2019-08-06 Magic Leap, Inc. Methods and systems for diagnosing and treating higher order refractive aberrations of an eye
US10437062B2 (en) 2015-03-16 2019-10-08 Magic Leap, Inc. Augmented and virtual reality display platforms and methods for delivering health treatments to a user
US10788675B2 (en) 2015-03-16 2020-09-29 Magic Leap, Inc. Methods and systems for diagnosing and treating eyes using light therapy
US10775628B2 (en) 2015-03-16 2020-09-15 Magic Leap, Inc. Methods and systems for diagnosing and treating presbyopia
US11747627B2 (en) 2015-03-16 2023-09-05 Magic Leap, Inc. Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US10444504B2 (en) 2015-03-16 2019-10-15 Magic Leap, Inc. Methods and systems for performing optical coherence tomography
US10473934B2 (en) 2015-03-16 2019-11-12 Magic Leap, Inc. Methods and systems for performing slit lamp examination
US10983351B2 (en) 2015-03-16 2021-04-20 Magic Leap, Inc. Augmented and virtual reality display systems and methods for diagnosing health conditions based on visual fields
US10527850B2 (en) 2015-03-16 2020-01-07 Magic Leap, Inc. Augmented and virtual reality display systems and methods for determining optical prescriptions by imaging retina
US10969588B2 (en) 2015-03-16 2021-04-06 Magic Leap, Inc. Methods and systems for diagnosing contrast sensitivity
US10539794B2 (en) 2015-03-16 2020-01-21 Magic Leap, Inc. Methods and systems for detecting health conditions by imaging portions of the eye, including the fundus
US10539795B2 (en) 2015-03-16 2020-01-21 Magic Leap, Inc. Methods and systems for diagnosing and treating eyes using laser therapy
US10451877B2 (en) 2015-03-16 2019-10-22 Magic Leap, Inc. Methods and systems for diagnosing and treating presbyopia
US10564423B2 (en) 2015-03-16 2020-02-18 Magic Leap, Inc. Augmented and virtual reality display systems and methods for delivery of medication to eyes
CN105847504A (zh) * 2016-03-07 2016-08-10 乐视移动智能信息技术(北京)有限公司 Method for magnifying mobile phone screen information with adjustable-power glasses, and adjustable-power glasses
US10459231B2 (en) 2016-04-08 2019-10-29 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
US11614626B2 (en) 2016-04-08 2023-03-28 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
WO2017176898A1 (fr) * 2016-04-08 2017-10-12 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
CN109068972A (zh) * 2016-04-08 2018-12-21 维卫尔公司 Device and method for measuring viewing distances
US11106041B2 (en) 2016-04-08 2021-08-31 Magic Leap, Inc. Augmented reality systems and methods with variable focus lens elements
US11058294B2 (en) 2016-04-08 2021-07-13 Vivior Ag Device and method for measuring viewing distances
US11300844B2 (en) 2017-02-23 2022-04-12 Magic Leap, Inc. Display system with variable power reflector
US10962855B2 (en) 2017-02-23 2021-03-30 Magic Leap, Inc. Display system with variable power reflector
US11774823B2 (en) 2017-02-23 2023-10-03 Magic Leap, Inc. Display system with variable power reflector
CN110462494B (zh) * 2017-03-21 2021-09-17 依视路国际公司 Optical device adapted to be worn by a wearer
CN110462494A (zh) * 2017-03-21 2019-11-15 依视路国际公司 Optical device adapted to be worn by a wearer
WO2018172366A1 (fr) * 2017-03-21 2018-09-27 Essilor International Optical device adapted to be worn by a wearer
EP3379325A1 (fr) * 2017-03-21 2018-09-26 Essilor International Optical device adapted to be worn by a user
US11567349B2 (en) 2017-03-21 2023-01-31 Essilor International Optical device adapted to be worn by a wearer
US20200018991A1 (en) * 2017-03-21 2020-01-16 Essilor International Optical device adapted to be worn by a wearer
US20210132414A1 (en) * 2017-04-20 2021-05-06 Essilor International Optical device adapted to be worn by a wearer
US11880095B2 (en) * 2017-04-20 2024-01-23 Essilor International Optical device adapted to be worn by a wearer

Similar Documents

Publication Publication Date Title
WO2014179857A1 (fr) Autofocus eyeglass system
US9867532B2 (en) System for detecting optical parameter of eye, and method for detecting optical parameter of eye
CN103424891B (zh) Imaging apparatus and method
US10048513B2 (en) Continuous autofocusing eyewear
US9961335B2 (en) Pickup of objects in three-dimensional display
US10048750B2 (en) Content projection system and content projection method
CN103595912B (zh) Imaging method and apparatus with local zooming
KR20190015573A (ko) Image capture system, apparatus and method for autofocusing based on gaze tracking
US10002293B2 (en) Image collection with increased accuracy
US20160166145A1 (en) Method for determining ocular measurements using a consumer sensor
CN104090371B (zh) 3D glasses and 3D display system
US10795162B2 (en) Image displayable eyeglasses
JP2008537608A (ja) Adaptive-focus extraocular vision prosthesis
WO2014206011A1 (fr) Imaging device and method
JP6422954B2 (ja) Focal length adjustment
US20160247322A1 (en) Electronic apparatus, method and storage medium
CN103163663A (zh) Method and device for estimating the diopter of corrective lenses of a pair of eyeglasses worn by a viewer
KR101817436B1 (ko) Image display device and control method using an electrooculography sensor
CN110520788B (zh) Optical device adapted to be worn by a wearer
US11662574B2 (en) Determining gaze depth using eye tracking functions
KR102294822B1 (ko) Human body data measurement device for eyeglass manufacturing using stereo vision
US20230057524A1 (en) Eyeglass devices and related methods
US11892366B2 (en) Method and system for determining at least one optical parameter of an optical lens
WO2023023398A1 (fr) Eyeglass devices and related methods

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14794164

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11/02/2016)

122 Ep: pct application non-entry in european phase

Ref document number: 14794164

Country of ref document: EP

Kind code of ref document: A1