WO2023148372A1 - Computer-implemented systems and methods for interactively measuring either or both sides of the interval of clear vision of the eye


Info

Publication number
WO2023148372A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
target
electronic device
distance
computer
Application number
PCT/EP2023/052814
Other languages
French (fr)
Inventor
Norberto LÓPEZ-GIL
Mateusz JASKULSKI
Original Assignee
Visionapp Solutions S.L.
Priority date
Filing date
Publication date
Application filed by Visionapp Solutions S.L. filed Critical Visionapp Solutions S.L.
Publication of WO2023148372A1 publication Critical patent/WO2023148372A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/0016: Operational features thereof
    • A61B 3/0033: Operational features characterised by user input arrangements
    • A61B 3/0041: Operational features characterised by display arrangements
    • A61B 3/02: Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B 3/028: Subjective types for testing visual acuity; for determination of refraction, e.g. phoropters
    • A61B 3/032: Devices for presenting test symbols or characters, e.g. test chart projectors
    • A61B 3/09: Subjective types for testing accommodation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/178: Estimating age from a face image; using age information for improving recognition

Definitions

  • The present invention is related to the fields of optometry, visual optics, physiological optics, electronics and computers.
  • In particular, it is related to systems and methods of measuring the near and far points of a human eye and its focusing errors, which can result in myopia, hyperopia, astigmatism and presbyopia.
  • A perfect eye forms images of infinitely distant objects precisely on the retina; the far point (FP) of a perfect eye is thus located at infinity.
  • As an object approaches, the eye maintains it in focus by means of accommodation, a process in which, primarily, the curvature of the crystalline lens inside the eye changes.
  • NP: near point
  • D: diopters, the unit of ray vergence; vergence is the inverse of distance expressed in meters (m)
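The distance-vergence relationship above can be sketched in code (a minimal illustration; the function names are ours, not part of the claimed method):

```python
def vergence_diopters(distance_m: float) -> float:
    """Ray vergence in diopters (D) is the inverse of distance in meters."""
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    return 1.0 / distance_m

def distance_meters(vergence_d: float) -> float:
    """Inverse conversion: a vergence in D corresponds to a distance in m."""
    return 1.0 / vergence_d

# A near point (NP) measured at 0.25 m corresponds to a 4.00 D accommodation demand.
```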
  • Refractive errors which can be corrected by means of spectacles, contact lenses, intraocular lenses, or refractive surgery can be divided into spherical errors (myopia or hyperopia), cylindrical errors (astigmatism) and presbyopia.
  • Astigmatism is a condition where the optical power of the eye varies with meridian (orientation), causing the far point to split into two, one corresponding to, for example, horizontal, and the other to vertical components of the image.
  • This causes the visual quality of images of vertical objects (for example a fence) to be different from the visual quality of images of horizontal objects (for example a striped dress) and can give rise to nausea, seeing double images, and a general loss of vision quality.
  • Many authors have shown that astigmatism magnitude and axis do not change very much during accommodation [3][4].
  • Both the FP and the NP can thus each split into two points corresponding to two distances, depending on the orientation of the object: the distal far point (dFP) and proximal far point (pFP), and the distal near point (dNP) and proximal near point (pNP). The first two lie on the far side, and the latter two on the near side, of the interval of clear vision (SICV).
  • dFP: distal far point
  • pFP: proximal far point
  • dNP: distal near point
  • pNP: proximal near point
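For illustration only (this is our sketch, not the claimed method), the two far-point distances of a myopic astigmatic eye determine a spherocylindrical refraction, with the astigmatism magnitude equal to the vergence difference between pFP and dFP; a negative-cylinder convention is assumed:

```python
def spherocylinder_from_far_points(dfp_m: float, pfp_m: float):
    """Sketch: sphere and cylinder (negative-cylinder convention) from the
    distal (dFP) and proximal (pFP) far-point distances, in meters, of a
    myopic astigmatic eye."""
    sphere = -1.0 / dfp_m                     # corrects the distal far point
    cylinder = -(1.0 / pfp_m - 1.0 / dfp_m)  # astigmatism magnitude
    return sphere, cylinder

# dFP at 1.0 m and pFP at 0.5 m give sphere -1.00 D, cylinder -1.00 D.
```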
  • The positions of the FP and NP depend on the spectral composition (color) of the object [5] imaged by the optics of the eye onto the retina.
  • For example, the FP can be located at a distance of 0.5 m, 0.4 m, and 0.53 m for white, blue and red objects, respectively. That is, the FP and NP depend on the color of the object.
  • The chromatic dispersion of light in the eye is well known and similar between human subjects, so the FP and NP for any given wavelength (color) can be calculated [6].
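Using the example distances given above (FP at 0.5 m, 0.4 m and 0.53 m for white, blue and red objects), the chromatic difference of focus can be expressed as a vergence difference; the function below is a simple illustration:

```python
def chromatic_defocus_d(fp_white_m: float, fp_color_m: float) -> float:
    """Vergence difference (D) between the far point for a colored object
    and the far point for a white object."""
    return 1.0 / fp_color_m - 1.0 / fp_white_m

# Example values from the text: FP at 0.5 m (white), 0.4 m (blue), 0.53 m (red).
blue_shift = chromatic_defocus_d(0.5, 0.4)   # +0.50 D: blue focuses nearer
red_shift = chromatic_defocus_d(0.5, 0.53)   # roughly -0.11 D: red focuses farther
```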
  • Prior to its correction, the type and amount of refractive error must be determined by means of a procedure known as refraction, which consists of finding the combination of spherical and cylindrical lenses that corrects the focusing errors of the eye described above. Refraction is performed either using dedicated optical instruments, which can measure light exiting the eye (objective refraction), or by trained clinicians using a chart and a set of trial lenses (subjective refraction).
  • Optimal refractions can also vary with task and the object being viewed [10]. For instance, if the goal of the refraction is to read optotypes or letters, it will depend on letter size. Low myopes may be able to read large letters without correction but need a correction for small letters. Similarly, low presbyopes can read medium- or large-size fonts but are unable to read small print. Thus, in addition to the above-mentioned object color, the position of the FP and NP depends on object size [11] and the subject's refraction.
  • The present invention relates to computer-implemented systems and methods for interactively measuring either or both sides of the interval of clear vision of the eye. They are based on measurements of the distance between a subject's head and an electronic device corresponding to either or both sides of the interval of clear vision (SICV), which is a proxy for measurements of ray vergence.
  • The system proposed herein can include the following components of an electronic device:
    a. Distance measurement circuitry, which can include passive components such as one or more cameras, or active components such as emitters, detectors, or others, or any combination thereof.
    b. User interface, which can include an electronic screen with a tactile surface, or a keypad, or a microphone, or others, or any combination thereof.
    c. Control circuitry and processing circuitry, which can include processors, memory modules, and wired or wireless connections between system modules, components and remote networks, or others, or any combination thereof.
  • The method proposed herein can comprise the steps of: a. Acquiring information about a user, such as age (AGE), gender, geographic location, eye to be tested, or others, or any combination thereof.
  • i. According to some embodiments of the present invention, the acquiring information about a user comprises configuring a user interface module to prompt a user to input said information into a user interface. ii. According to some embodiments of the present invention, the acquiring information about a user comprises automatically detecting said information based on an image of a user's head from a camera included in an electronic device, or from other databases. iii. According to some embodiments of the present invention, the acquiring information about a user comprises configuring a connectivity and/or storage module to retrieve said information from a local or remote server or memory.
  • According to some embodiments of the present invention, the displaying a target on an electronic screen comprises changing the size, shape, rotation, color, background color or other characteristics, such as spatial frequency, of one or more components of the target or background, or any combination of any of the above, in accordance with a user interaction with an electronic device using a user interface, for instance in response to a certain user input or when a certain condition is met.
  • the displaying a target on an electronic screen comprises changing the size, shape, rotation, color, background color or other characteristics such as spatial frequency of one or more components of the target or background, or any combination of any of the above with a change of distance between a user’s head and an electronic device.
  • the displaying a target on an electronic screen comprises changing the size, shape, rotation, color, background color or other characteristics such as spatial frequency of one or more components of the target or background, or any combination of any of the above with a change in a parameter of user’s environment, such as day time, ambient light level, ambient light color temperature, or others, or any combination of any of the above.
  • a target may dynamically change in respect of one or more characteristics according to a time interval.
  • the time interval can be individual for a given characteristic or for group of characteristics.
  • the time interval can be preconfigured in the electronic device.
  • changing the distance between a user’s head and an electronic device comprises holding a device in a user’s hand and bringing it closer to the user’s head or further away from the head.
  • According to some embodiments, the changing the distance between a user's head and an electronic device comprises situating one or more reflective surfaces in front of the electronic device and changing a distance between the device and the surface, or a distance between the head and the surface, or any combination thereof.
  • the changing the distance between a user’s head and an electronic device comprises changing the distance by a third party, such as another person, another apparatus, or other, or any combination of any of the above.
  • a subjective visual criterion comprises said target or a part thereof becoming perceivable, detectable, recognizable, distinguishable, resolvable, legible or discriminable by a user.
  • the ceasing to satisfy a subjective visual criterion comprises said target or a part thereof becoming no longer perceivable, detectable, recognizable, distinguishable, resolvable, legible or discriminable by a user.
  • The measuring of the position of either or both SICV can comprise displaying on an electronic screen to a user a target with spatial features oriented at a certain angle α and measuring a corresponding distance between the user and the target.
  • The measuring of the position of either or both SICV can further comprise displaying to a user a target with spatial detail at a different angle β, which can be perpendicular to angle α, and measuring a corresponding distance between the user and the target.
  • According to some embodiments of the present invention, the measuring of the position of either or both SICV can comprise configuring distance measurement circuitry included in the electronic device to perform measurements. According to some embodiments, it can comprise using an external apparatus to perform measurements, such as a ruler, a rangefinder or other, or any combination of any of the above. The visual quality criterion can comprise a visual acuity criterion (e.g. resolving lines, letters, etc.), a contrast sensitivity criterion (e.g. distinguishing tones of gray), or a color discrimination criterion (e.g. distinguishing colors).
  • the satisfying or ceasing to satisfy a certain visual quality criterion with the target can comprise continuous evaluation of the target by the user when the distance between the user’s head and the device is changed. For example, the user can stop seeing a fine spatial pattern in the target at a certain distance as it is moved away from the user’s head and it becomes blurred.
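This continuous evaluation can be sketched as a loop that records the last distance at which the criterion was still met (an illustrative simulation; in practice the distances would come from the distance measurement circuitry and the responses from the user interface):

```python
def measure_sicv_boundary(distance_stream, criterion_still_met):
    """Return the distance (m) bounding the interval of clear vision.

    distance_stream: iterable of successive head-to-device distances (m).
    criterion_still_met: callable(distance) -> bool, standing in for the
    user's subjective report via the user interface."""
    last_ok = None
    for d in distance_stream:
        if criterion_still_met(d):
            last_ok = d          # target still satisfies the criterion here
        else:
            # Criterion just ceased to be met; the boundary is the last
            # distance at which it still held.
            return last_ok if last_ok is not None else d
    return last_ok

# Simulated example: a user stops resolving a fine pattern beyond 0.55 m,
# so the far boundary is recorded at 0.50 m.
distances = [0.30, 0.40, 0.50, 0.60, 0.70]
fp = measure_sicv_boundary(distances, lambda d: d <= 0.55)
```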
  • Measuring either or both sides of the interval of clear vision of the eye of a user by displaying a changing target makes it possible to control a user's subjective visual quality criterion by allowing target comparison. This technique may be suitable for obtaining more precise measurements.
  • FIG. 1 is a schematic of an illustrative system for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention.
  • FIG. 2 is a block diagram of an illustrative electronic device for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention where a camera is included in the distance measurement module and a screen is included in the user interface.
  • FIG. 3 is an example view of an illustrative screen of an electronic device for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention.
  • Several possible configurations of the target, including changes in time are presented at the bottom.
  • FIG. 4 is a flowchart of an illustrative sub-process for changing a target with distance between a head and electronic device in accordance with one embodiment of the invention.
  • FIG. 5 is a flowchart of an illustrative process for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention.
  • the present invention is directed to computer-implemented systems and methods for interactively measuring either or both sides of the interval of clear vision of the eye.
  • the method is based on interactive measurements of distance between a user’s head and an electronic device, specifically corresponding to either or both SICV.
  • the present invention in some embodiments thereof, can provide systems for allowing users to accurately measure refractive errors or amplitude of accommodation with or without wearing optical correction, or power or addition of their reading glasses, or other people’s eyes or other people’s reading glasses.
  • FIG. 1 is a schematic of an illustrative, computer-implemented system for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention.
  • The system 100 can include distance measurement circuitry 110, user interface 120, control circuitry 130, processing circuitry 140, storage 150, and communications circuitry 160. In some embodiments, one or more of the device's components can be combined or omitted. In some embodiments, system 100 can include additional components not included in FIG. 1, or a combination of any of the aforementioned components.
  • the system 100 can include any suitable type of electronic device with distance measurement circuitry used to measure the distance between the user’s head and the device.
  • the system 100 can include any of the following devices equipped with a camera: a mobile phone, a tablet, a “smart” television set, a personal digital assistant (PDA), a laptop or desktop computer, a stand-alone camera or video-recorder, and any other suitable device.
  • the electronic device included in system 100 is preferably, but not limited to, a portable device.
  • Distance measurement circuitry 110 can include any circuitry, emitters and detectors to measure the distance between the user’s head or part of it and the electronic device.
  • distance measurement circuitry 110 can include a passive system comprising one or more cameras for capturing images of the user’s head and circuitry to compute the distance between the user’s head or part of it and the device from said images.
  • distance measurement circuitry 110 can include an active system comprising one or more emitters and detectors for measuring said distance.
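As a sketch of one common passive technique (not prescribed by the invention), distance can be estimated from the apparent pixel size of a facial feature of known physical size using the pinhole camera model; all numeric values below are hypothetical:

```python
def distance_from_feature(feature_size_m: float,
                          feature_size_px: float,
                          focal_length_px: float) -> float:
    """Pinhole camera model: distance = focal_length_px * real_size / pixel_size.
    feature_size_m could be, e.g., an assumed interpupillary distance."""
    return focal_length_px * feature_size_m / feature_size_px

# Hypothetical values: a 63 mm interpupillary distance imaged across 200 px
# by a camera with a 1500 px focal length puts the head about 0.47 m away.
d = distance_from_feature(0.063, 200.0, 1500.0)
```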
  • User interface 120 can include any suitable mechanism for interaction with a user such as one or more screens, loudspeakers, tactile surfaces, keypads, microphones, or others, or any combination thereof.
  • user interface 120 can include a tactile electronic screen for displaying targets and receiving user input.
  • Control circuitry 130 can include any type of circuitry, such as processors, micro-controllers and connections to control the functions, operations and performance of an electronic device included in system 100. Furthermore, control circuitry 130 can be electronically coupled with other components of the system 100, or any combination thereof. For example, in some embodiments of the invention, control circuitry 130 can send a control signal to user interface 120 to configure it for receiving input from a user or giving instructions to a user.
  • Processing circuitry 140 can include any type of circuitry, such as processors, micro-controllers and connections designed to process the data from distance measurement circuitry 110, user interface 120, and other components of the system 100, or any combination thereof for interactively measuring either or both sides of the interval of clear vision of the eye. Furthermore, processing circuitry 140 can be electronically coupled with other components of the system 100, or any combination thereof. For example, in some embodiments of the invention, processing circuitry 140 can send a signal to control circuitry 130 to configure the user interface 120 or distance measurement circuitry 110.
  • Storage 150 can include one or more storage media, such as internal or external memory of any type, such as: HDD, SSD, RAM, ROM, EPROM, Flash EEPROM, a flash memory card such as an SD (Secure Digital) or CF (Compact Flash) card, or any other type of memory suitable for the electronic device included in system 100.
  • Communications circuitry 160 can include any circuitry suitable to connect the electronic device included in system 100 to a communications network and transmit data using any suitable protocol such as, for example, Wi-Fi (e.g., 802.11 protocol), Bluetooth®, cellular protocol (e.g., GSM, GPRS, CDMA, EDGE, LTE), or any other communications protocol or any combination thereof.
  • FIG. 2 is a block diagram of an illustrative electronic device 200 for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention.
  • Electronic device 200 can be very similar to the electronic device included in the system 100 shown in FIG. 1 and share descriptions of components of the latter.
  • electronic device 200 can also include storage 250 and communications circuitry 260, that can be substantially similar to respective components of electronic device in system 100; storage 150, and communications circuitry 160, or others, or any combination thereof.
  • Distance measurement circuitry 210 can be similar to distance measurement circuitry 110 and use any suitable technique or combination of techniques for measuring distance between a user’s head 270 and electronic device 200.
  • User interface 220 can be connected to control circuitry 230 and processing circuitry 240.
  • User interface (120; 220) can be configured to provide instructions to a user by means of a visual instruction message (see 304 in FIG. 3), or an audio message, or other user interface method, or any combination of the above methods.
  • user interface (120; 220) can be configured to receive input from the user by means of touching or swiping a tactile screen, or typing on a keypad or keyboard, issuing a voice command by speaking into a microphone, performing a gesture detected by a camera, performing a gesture detected by a gyroscope or others or any combination of any of the above.
  • Control circuitry 230 can be similar to control circuitry 130, and processing circuitry 240 can be similar to processing circuitry 140.
  • Processing circuitry 240 can use any suitable technique or combination of techniques for interactively measuring either or both sides of the interval of clear vision of the eye from measurements of distance between a user’s head or part of it 270 and electronic device 200 obtained from distance measurement circuitry 210, and user input obtained from user interface 220, both configured by signals from control circuitry 230.
  • Control circuitry 230 can configure the user interface 220 to instruct the user to slowly bring the electronic device 200 into proximity to the user’s head 270, until the screen 220 can only barely be read due to said proximity, corresponding to distance coincident with a near point NP situated on the near side of the interval of clear vision.
  • the user interface 220 can instruct the user to slowly bring the device 200 into proximity of the user’s head 270, until the screen 220 can no longer be read due to said proximity, corresponding to a distance beyond the interval of clear vision.
  • user interface 220 can instruct the user to slowly move the electronic device 200 further away from the user’s head 270, until the tactile screen 220 can only barely be read due to its distance from the head, corresponding to a distance coincident with a far point FP situated on the far side of the interval of clear vision.
  • According to some embodiments, the user interface 220 can instruct the user to move the device 200 further away from the user's head 270, until the screen 220 can no longer be read due to said separation, corresponding to a distance beyond the far side of the interval of clear vision.
  • Control circuitry 230 can configure the user interface 220 to instruct the user (or another person) to indicate said proximity to, or separation from, the electronic device by touching the tactile screen or issuing a voice command to the voice recognition module, or another input method, or a combination of any of the above.
  • The processing circuitry 240 can then use this user input and the corresponding measurement of distance between the user's head 270 and the electronic device 200 obtained from distance measurement circuitry 210 to measure pNP, dNP, pFP, or dFP.
  • processing circuitry 240 can use any suitable technique or combination of techniques for computing either or both SICV and additional information, such as the user’s age, gender, eye to be tested or other, or others, or any combination thereof.
  • processing circuitry 240 can automatically detect the user’s age, gender or eye to be tested, or others, or any combination of any of the above parameters from an image of the user’s head 270 from a camera included in distance measurement circuitry 210. In some embodiments, processing circuitry 240 can obtain said user’s age, gender, eye to be tested, previous refraction or power or addition of reading glasses, or any combination of the above by sending a signal to control circuitry 230 configuring the tactile screen included in the user interface 220 to prompt the user to input these parameters. In some embodiments processing circuitry 240 can obtain said parameters by sending a signal to control circuitry 230 to configure storage 250 or communications circuitry 260 to retrieve said parameters from local or remote memory.
  • FIG. 3 is an example view of an illustrative screen of an electronic device 300 for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention where a target is displayed on a screen included in the user interface.
  • Electronic device 300 can be substantially similar to device 100 shown in FIG. 1 and device 200 shown in FIG. 2 and share descriptions of components with either or both.
  • the electronic device 300 can include a camera in distance measurement circuitry 310 and a tactile screen in the user interface 320.
  • The tactile screen included in the user interface 320 can be configured to show a target 330 to the user, including but not limited to the following types of targets: an optotype 330a, text 330b, a geometrical pattern 330c, 330d, a grey discrimination test 330e, a color discrimination test 330f, a spatial geometry test 330g, or a picture or movie 330h, or any combination of any of the above.
  • The target 330, or a component of the target including foreground, background or a part of its shape or form, can be configured to change size, shape, rotation, color, background color or spatial frequency composition, or other characteristics, or any combination of the above, as a function of the measured distance between the user's head 270 and electronic device 300, or as a function of another dimension of the user's head measured by the distance measurement circuitry 210 which changes proportionally with said distance, such as the interpupillary distance, head width, iris diameter, or others, or any combination thereof.
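For example, scaling the displayed target linearly with the measured distance keeps its angular size at the eye constant; the sketch below (our illustration, with a hypothetical screen resolution) computes the on-screen target height for a fixed visual angle:

```python
import math

def target_height_px(distance_m: float,
                     angular_size_deg: float,
                     screen_px_per_m: float) -> float:
    """Physical height of a target that subtends a fixed visual angle at a
    given distance, converted to screen pixels."""
    height_m = 2.0 * distance_m * math.tan(math.radians(angular_size_deg) / 2.0)
    return height_m * screen_px_per_m

# For a fixed visual angle, doubling the viewing distance exactly doubles
# the required on-screen target size.
```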
  • a time interval can be defined to control the speed of change desired for certain characteristics, either individually or in groups. For instance, every 3 seconds, an optotype may change its color, and every 2 seconds it may change its size and shape, etc.
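Such per-characteristic intervals could be implemented with simple elapsed-time checks, as in this illustrative sketch using the intervals from the example above:

```python
def characteristics_due(elapsed_s: int, intervals_s: dict) -> list:
    """Return which target characteristics should change at a given elapsed
    time, given a per-characteristic change interval in seconds."""
    return [name for name, period in intervals_s.items()
            if elapsed_s > 0 and elapsed_s % period == 0]

# Example from the text: color changes every 3 s; size and shape every 2 s.
intervals = {"color": 3, "size": 2, "shape": 2}
# At t = 6 s all three characteristics are due to change at once.
```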
  • the target 330 or a component of the target including foreground, background, or a part of its shape or form can be configured to change size, shape, rotation, color, background color or spatial frequency composition or other characteristics or any combination of the above in function of a parameter of user’s environment, such as day time, ambient light level, ambient light color temperature, or others, or any combination of any of the above.
  • background color or background brightness can change with ambient light level, color temperature or other spectral or chromatic characteristics of the environment, e.g. as a result of a passing cloud, during a sunset or a sunrise.
  • the target 330 can be configured to change its characteristics depending on the user input from the user interface 320.
  • target 330 or a component of the target can be configured to change size, shape, rotation, color, background color or spatial frequency composition or other characteristics, or any combination of any of the above as a result of user interaction with the electronic device 300 using the tactile screen 320 such as a swipe, tap, click, or voice command or other gesture, or any combination of any of the above.
  • the user interaction with user interface 320 can be performed using a keypad, keyboard, mouse, microphone, or any other interface method, or any combination of any of the above.
  • FIG. 4 is a flowchart of an illustrative sub-process 400 for changing a target with distance between a head and electronic device in accordance with one embodiment of the invention.
  • Sub-process 400 can consist of several steps. In some embodiments the order of steps of sub-process 400 can be changed or some steps can be omitted or repeated. Furthermore, sub-process 400 can be included in another process (parent process) as a sub-process.
  • Sub-process 400 can be performed by an electronic device (100; 200; 300) with distance measurement circuitry (110; 210; 310) and user interface (120; 220; 320), and one or more other components of the electronic device (100; 200; 300).
  • the first step of sub-process 400 can continue from a parent process and begin with block 410 where the user interface (120; 220; 320) can be configured to display a target 330 on a screen 320.
  • said target can be an optotype 330a, or text 330b, or one or more parallel lines 330c, 330d, or one or more grey patches 330e or color patches 330f, or a geometrical pattern such as a grid 330g, or a picture 330h, or other type of target, or any combination of any of the above.
  • a user can change the distance between the user’s head 270 and electronic device (100; 200; 300). Furthermore, distance measurement circuitry (110; 210; 310) can send to processing circuitry (140; 240) a signal including a measurement of distance between a user’s head 270 and electronic device (100; 200; 300). As previously indicated, distance measurement circuitry (110; 210; 310) can use any suitable technique or combination of techniques for measuring distance between a user’s head 270 and electronic device. Furthermore, said distance between the user's head and the device can be measured using another method (such as a ruler or rangefinder) and input into the user interface (120; 220; 320).
  • the change of the distance between the user’s head 270 and electronic device can comprise holding the device in a user’s hand and bringing it closer to the face or further away from the face.
  • the change of distance between the user’s head 270 and electronic device can comprise situating a reflective surface, such as a mirror, in front of the electronic device (so that a reflection of a user’s head 270 is within the field of view of the electronic device), and changing the distance between the device and a mirror, or the distance between said user’s head or part of it 270 and a mirror, or any combination thereof.
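In such a mirror configuration, the optically relevant viewing distance is the full light path from the screen to the mirror and back to the eye; this follows from plane-mirror geometry (a sketch, not a formula prescribed by the invention):

```python
def optical_viewing_distance(device_to_mirror_m: float,
                             head_to_mirror_m: float) -> float:
    """Plane mirror: light from the screen travels to the mirror and then to
    the eye, so the effective viewing distance is the sum of both legs."""
    return device_to_mirror_m + head_to_mirror_m

# Device 0.4 m from the mirror and head 0.6 m from it give a 1.0 m
# effective viewing distance.
```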
  • the changing the distance between the user’s head 270 and electronic device can comprise changing the distance by a third party, such as another person, another apparatus, or other, or any combination of any of the above.
  • Block 430 can be a decision block where the user interface (120; 220; 320) can be configured to instruct the user to evaluate if a target 330 or a component of the target meets a certain visual quality criterion.
  • said visual quality criterion can be a visual acuity criterion (e.g. being able to read optotypes (330a) or text (330b), or resolve two or more parallel lines (330c, d), or other, or any combination of any of the above).
  • said visual quality criterion can be a contrast sensitivity criterion (e.g. distinguishing tones of gray).
  • the said evaluation of whether a target 330 or a component of the target meets or ceases to meet a certain visual quality criterion in block 430 can include the evaluation of continuation, cessation or satisfaction of said visual quality criterion.
  • when the displaying of a target 330 comprises changing the size, shape, rotation, color, background color or other characteristics such as spatial frequency of one or more components of the target or background, or any combination of any of the above, with a change in a parameter of the user’s environment, such as time of day, ambient light level, ambient light color temperature, or others, or any combination of any of the above, the said evaluation of whether a target 330 or a component of the target meets or ceases to meet a certain visual quality criterion in block 430 can depend on said parameter of the user’s environment.
  • the user interface can be configured to instruct the user to evaluate if a target 330 comprising an optotype 330a meets or ceases to meet a certain visual quality criterion at a certain point of time whilst said optotype shape changes from optotype E to optotype C or other combination of optotype shapes.
  • time intervals can be preset. For instance, an optotype may periodically change its color every 3 seconds and its size and shape every 2 seconds. Additionally, these time intervals may be activated only upon triggering a certain condition.
  • the user can be instructed to evaluate whether or not a target or a component of the target has ceased to meet said visual quality criterion, for example having become blurry, distorted, discolored, unclear, or other.
  • the sub-process 400 can proceed to block 440.
  • the user can stop seeing a fine spatial pattern in the target at a certain distance as it is moved away from the user’s head and it becomes blurred, in which case the sub-process 400 can proceed to block 440.
  • sub-process 400 can go to block 450, which can be a decision block.
  • at block 450, if the distance between a user’s head 270 and the electronic device cannot be further changed (for example, a user is not able to move the electronic device further away than arm’s length), sub-process 400 can proceed to block 440.
  • a distance between a user’s head 270 and the electronic device (100; 200; 300) can be stored in storage (150; 250) or transmitted by communications circuitry (160; 260) along with, but not limited to, user input data. Furthermore, at block 440 sub-process 400 can return to a parent process in which it can be included.
  • sub-process 400 can proceed to block 460.
  • the user interface can be configured to change characteristics of a target 330 or a component of the target.
  • the distance measurement circuitry (110; 210; 310) can send to processing circuitry (140; 240) a signal including a measurement of distance between a user’s head 270 and electronic device (100; 200; 300).
  • the processing circuitry can use any technique or combination of techniques to process said signal and send a signal to control circuitry (130; 230), which in turn can configure the user interface (120; 220; 320) to change characteristics of the target 330 or a component of the target, such as size, shape, rotation, color, background color, spatial frequency composition, or other characteristics, or any combination of any of the above, as a function of said distance between a user’s head 270 and electronic device, or as a function of another dimension measured in the user’s head by the distance measurement circuitry (210) which changes proportionally with said distance, such as the interpupillary distance, head width, iris diameter, or others, or any combination thereof.
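The distance-proportional target scaling described above can be sketched as follows. This is a minimal illustration only; the function name and the pixel and distance values are hypothetical, not part of the disclosed embodiments:

```python
def scaled_target_size(base_size_px, base_distance_m, current_distance_m):
    """Scale a target linearly with viewing distance so that it subtends an
    approximately constant visual angle as the user moves the device."""
    return base_size_px * (current_distance_m / base_distance_m)

# e.g. a target drawn at 100 px for a 0.40 m viewing distance
# is redrawn at 200 px when the device is moved to 0.80 m
size = scaled_target_size(100.0, 0.40, 0.80)  # 200.0
```

Scaling the target linearly with distance keeps its angular size on the retina roughly constant, so that a change in perceived sharpness can be attributed to defocus rather than to the target simply becoming smaller.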
  • FIG. 5 is a flowchart of an illustrative process 500 for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention.
  • Process 500 can consist of several steps. In some embodiments the order of steps of process 500 can be changed or some steps can be omitted or repeated.
  • Process 500 can be performed by an electronic device (100; 200; 300) with distance measurement circuitry (110; 210; 310) and user interface (120; 220; 320), and one or more other components of the electronic device (100; 200; 300).
  • Process 500 can begin at block 510 where a user interface (120; 220; 320) of an electronic device (100; 200; 300) can be configured to receive user input information, such as age, gender, value of the sphero-cylindrical power of the ophthalmic or contact lenses already worn by the subject, vertex distance, or others, or any combination thereof.
  • said information can be acquired by configuring the user interface (120; 220; 320) to prompt the user to input said information into the user interface using a tactile screen or voice recognition circuitry, or other, or any combination thereof.
  • said information can be acquired automatically by means of detecting it from an image of a user’s head 270 from a camera included in said user interface.
  • user interface can be configured to instruct a user to use one (left or right) eye or both eyes while interacting with an electronic device (100; 200; 300).
  • process 500 can include sub-process 400 (see FIG. 4).
  • a user can provide user input to user interface (120; 220; 320) indicating that visual quality of a target 330 or a component of the target satisfies or ceases to satisfy a certain visual quality criterion, corresponding to an electronic device (100; 200; 300) being situated on either side of the interval of clear vision SICV.
  • the distance between an electronic device (100; 200; 300) and a subject’s head 270 can be stored in storage (150; 250) or transmitted using communications circuitry (160; 260).
  • the user interface (120; 220; 320) can be configured to display a new target and instruct a user to select a preferred angle of target orientation while interacting with an electronic device (100; 200; 300).
  • the user interface (120; 220; 320) of an electronic device (100; 200; 300) situated in proximity of dFP or pNP can be configured to receive user input including a preferred angle of target orientation adFP or apNP.
  • the user interface can be configured to change a target 330 or a component of the target on a tactile screen 320 in response to user input such as touching or swiping a tactile screen, typing on a keypad or keyboard, speaking into a microphone, performing a gesture detected by a camera, performing a gesture detected by a gyroscope, or others, or any combination of any of the above.
  • a preferred angle of target orientation adFP selected by a user at block 540 can be stored in storage (150; 250) or transmitted using communications circuitry (160; 260).
  • process 500 can include sub-process 400 (see FIG. 4).
  • a user can provide user input to user interface (120; 220; 320) indicating that visual quality of said target 330 or a component of the target satisfies or ceases to satisfy a certain visual quality criterion corresponding to an electronic device (100; 200; 300) being situated on either side of the interval of clear vision, at a distance in the vicinity of either dFP or pNP.
  • said distance can be stored in storage (150; 250) or transmitted using communications circuitry (160; 260).
  • process 500 can include sub-process 400 (see FIG. 4).
  • a user can provide user input to user interface (120; 220; 320) indicating that visual quality of said target 330 or a component of the target satisfies or ceases to satisfy said visual quality criterion, corresponding to an electronic device (100; 200; 300) being situated on either side of the interval of clear vision, at a distance in the vicinity of either pFP or dNP (which are the inner two points).
  • said distance can be stored in storage (150; 250) or transmitted using communications circuitry (160; 260).
  • processing circuitry can use any technique or combination of techniques to compute ocular refractive errors such as, but not limited to, sphere (SPH), cylinder (CYL) and axis (AXS) from dFP, pFP, adFP, apFP, dNP, pNP, adNP, or apNP, or others, or any combination of any of the above.
  • AXS can be calculated from adFP and apFP using, for example, the following equations:
  • AXS = 90° - apFP when 0° ≤ adFP ≤ 90°; or
  • AXS can be calculated from adNP and apNP using, for example, the following equations:
  • AXS = 90° - adNP when adNP ≤ 90°;
  • SPH and CYL can be calculated from dFP, pFP using, for example, the following equations:
  • K depends on target and background color.
  • K = 0 D, K > 0 D and K < 0 D for white, blue and red targets, respectively.
  • the specific value of K depends on an emission spectrum of a physical target.
  • SPH and CYL can be also calculated from dNP, pNP using, for example, the following equations:
  • dFP, pFP, dNP, pNP can be expressed in meters and K in diopters.
  • AA value, expressed in diopters, can depend on AGE, expressed in years, as:
  • processing circuitry can use any technique or combination of techniques to compute power of reading glasses (P) from dNP or pNP, or other parameters, or any combination thereof.
  • the power of the reading glasses P can be calculated as:
  • the user interface (120; 220; 320) can be configured to instruct a user to use one (left or right) eye or both eyes while interacting with an electronic device (100; 200; 300).
  • user interface (120; 220; 320) can be configured to display a target 330 including, but not limited to a text (330b).
  • a near point distance NP can be stored in storage (150; 250) and the power P of the reading glasses can be calculated as:
  • addition of reading glasses can be calculated using the following equations:
  • Eq. 1-12 correspond to corneal-plane refraction.
  • processing circuitry (140; 240) included in an electronic device (100; 200; 300) can use any suitable technique or combination of techniques to compute spectacle-plane refraction or power of reading glasses from corneal-plane refraction from dFP, pFP, dNP, pNP, FP, NP, vertex distance (VD) or others, or any combination thereof.
  • VD depends on the type of correction (usually 0.0 m for contact lenses and 0.014 m for glasses).
  • parameters such as, but not limited to, SPH, CYL, AXS, FP, NP, P, ADD, dFP, pFP, adFP, apFP, dNP, pNP, adNP, apNP, VD, user input, or others, or any combination of any of the above can be saved in storage (150; 250).
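The refraction computations discussed above can be sketched in code. This is a hedged illustration only: Eq. 1-12 are not reproduced in this text, so the sphere/cylinder relations and sign conventions below are standard optometric assumptions made for this sketch, not the patent's own equations; only K = 0 D for a white target, the vertex-distance values, and the worked 2 D myope example (far point at 0.5 m) come from the document:

```python
def sph_cyl_from_far_points(dFP_m, pFP_m, K=0.0):
    """Corneal-plane sphere and cylinder from the distal (dFP) and proximal
    (pFP) far-point distances in meters, minus-cylinder convention.
    K (diopters) is 0 for a white target, >0 for blue, <0 for red.
    The sign conventions are an assumption of this sketch."""
    SPH = K - 1.0 / dFP_m            # vergence of the less myopic meridian
    CYL = 1.0 / dFP_m - 1.0 / pFP_m  # negative (or zero) cylinder
    return SPH, CYL

def corneal_to_spectacle(F_corneal, VD_m=0.014):
    """Standard vertex-distance conversion of a corneal-plane power to a
    spectacle plane VD meters in front of the cornea (VD ~ 0.014 m for
    glasses, 0.0 m for contact lenses)."""
    return F_corneal / (1.0 + VD_m * F_corneal)

# Worked example consistent with the description: a 2 D myope viewing a
# white target has dFP = 0.5 m, so SPH = -2.0 D; with pFP = 0.4 m the
# cylinder is 1/0.5 - 1/0.4 = -0.5 D.
SPH, CYL = sph_cyl_from_far_points(0.5, 0.4)
```

The axis (AXS) equations are omitted here because the adFP/apFP angle conventions are device-specific in the text above.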


Abstract

A method for interactively measuring the position of either or both sides of the interval of clear vision without the need for an optical component that would change the vergences of rays originating at a target. A system using said method and the user's age can estimate the sphero-cylindrical refraction, addition and power of reading glasses of the user. The system can use different sizes, directions and colors of targets, which can change with said distance or user interaction.

Description

A COMPUTER-IMPLEMENTED SYSTEMS AND METHODS FOR INTERACTIVELY MEASURING EITHER OR BOTH SIDES OF THE INTERVAL OF CLEAR VISION OF THE EYE
FIELD OF THE INVENTION
This is related to the fields of optometry, visual optics, physiological optics, electronics and computers. In particular, this is related to systems and methods of measuring the near point and far points of a human eye and its focusing errors, which can result in myopia, hyperopia, astigmatism and presbyopia.
BACKGROUND OF THE INVENTION
A perfect eye forms images of infinitely distant objects precisely on the retina. The far point (FP) of a perfect eye is thus located at infinity. As the distance between the eye and an object becomes shorter, the eye maintains the object in focus by means of accommodation, which is a process where, primarily, the curvature of a crystalline lens inside the eye changes. Once a minimum distance of accommodation is reached and the lens cannot become any more curved, an object is located at the eye’s near point (NP). In optometry, distance is often expressed in terms of ray vergence in units of diopters (D) defined as the inverse of distance expressed in meters (m). The interval in diopters between the vergence of FP and NP of an eye is called the interval of clear vision, which is often understood as synonymous with the amplitude of accommodation (AA) of the eye’s crystalline lens. Since the FP of a perfect eye is located at infinity, it corresponds to the far point vergence VFP = 0 D. A NP of a perfect eye can, for example, be located at a distance of 0.1 m, which corresponds to the near point vergence VNP = 10 D. In this case AA = VNP - VFP = 10 D.
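The vergence arithmetic in this paragraph can be made concrete with a short sketch (the function name is illustrative):

```python
def vergence(distance_m):
    """Vergence in diopters (D) is the inverse of distance in meters;
    an infinite distance gives 0 D."""
    return 1.0 / distance_m

# Perfect-eye example from the text: far point at infinity, near point at 0.1 m.
V_FP = vergence(float("inf"))  # 0.0 D
V_NP = vergence(0.1)           # 10.0 D
AA = V_NP - V_FP               # amplitude of accommodation: 10.0 D
```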
Real eyes suffer from aging, and people who are approximately 45 and older suffer from presbyopia - a condition where the crystalline lens of the eye loses the ability to change shape. The amplitude of accommodation of real eyes drops with age from approximately 20 D in infancy to 0 D in late adulthood when the eye loses the ability to form clear images of near objects on the retina. There are many reports documenting the relationship between age and maximum change in accommodation in human eyes [1][2].
Furthermore, real eyes suffer from focusing errors caused by optical imperfections in their refractive surfaces (cornea and crystalline lens) and/or from a mismatch between the refractive power and the axial length of the eye, which are called refractive errors. Such errors, which cause the far point to be located closer than at infinity (myopia) or further away than infinity (hyperopia) prevent the eye from forming images precisely on the retina and result in deterioration of visual quality and require optical correction.
Refractive errors which can be corrected by means of spectacles, contact lenses, intraocular lenses, or refractive surgery can be divided into spherical errors (myopia or hyperopia), cylindrical errors (astigmatism) and presbyopia. Astigmatism is a condition where the optical power of the eye varies with meridian (orientation), causing the far point to split into two (meridians): one corresponding to, for example, horizontal, and the other corresponding to vertical components of the image. This causes the visual quality of images of vertical objects (for example a fence) to be different from the visual quality of images of horizontal objects (for example a striped dress) and can give rise to nausea, seeing double images, and a general loss of vision quality. Many authors have shown that astigmatism magnitude and axis do not change very much during accommodation [3][4].
The fact that astigmatism can be present both in a relaxed and accommodated eyes means that both the FP and the NP can each be split. Each can correspond to two distances depending on the orientation of the object; distal far point (dFP) and proximal far point (pFP), and distal near point (dNP) and proximal near point (pNP). The first two can be found on the far side, and the latter two on the near side of the interval of clear vision (SICV).
Due to the chromatic dispersion of light in ocular media, which depends on wavelength, the positions of FP and NP depend on the spectral composition (color) of the object [5] imaged by the optics of the eye onto the retina. For example, in the case of a 2 D myope looking at an object on a black background the FP can be located at a distance of 0.5 m, 0.4 m, and 0.53 m for white, blue and red objects, respectively. That is, the FP and NP depend on the color of the object. The chromatic dispersion of light in the eye is well known and similar between human subjects, so FP and NP for any given wavelength (color) can be calculated [6].
Prior to its correction, the type and amount of refractive error must be determined by means of a procedure known as refraction, which consists of finding the combination of spherical and cylindrical lenses which correct the focusing errors of the eye described above. Refraction is performed either using dedicated optical instruments, which can measure light exiting the eye (objective refraction), or by trained clinicians using a chart and a set of trial lenses (subjective refraction).
Perfect focus is never achieved by the human eye even after sphero-cylindrical correction due to the presence of high-order monochromatic aberrations [7] and well documented errors in both objective and clinician-determined subjective refractions [8,9]. Moreover, optimal refractions can also vary with task and the object being viewed [10]. For instance, if the goal of the refraction is to read optotypes or letters, it will depend on letter size. Low myopes can be able to read large letters without correction but need a correction for small letters. Similarly, low presbyopes can read medium- or large-size fonts but are unable to read small print. Thus, in addition to the above-mentioned object color, the position of FP and NP depends on object size [11] and subject’s refraction.
REFERENCES
1. Duane A. Studies in Monocular and Binocular Accommodation, with Their Clinical Application. Transactions of the American Ophthalmological Society. 1922; 20:132-57.
2. Jackson E. Amplitude of Accommodation at Different Periods of Life. California state journal of medicine. 1907; 5(7):163-6.
3. Borish IM. Clinical refraction, 3rd ed. Chicago: Professional Press, 1970.
4. Bannon RE. A study of astigmatism at the near point with special reference to astigmatic accommodation. Am J Optom Arch Am Acad Optom. 1946; 23:53-75.
5. Sivak JG, Mandelman T. Chromatic dispersion of the ocular media. Vision Res 1982; 22:997-1003.
6. Thibos LN, Ye M, Zhang X, Bradley A. The chromatic eye: a new reduced-eye model of ocular chromatic aberration in humans. Appl Opt 1992; 31:3594-3600.
7. Charman WN. Wavefront aberration of the eye: a review. Optom Vis Sci 1991; 68:574-583.
8. Bullimore, M. A., Boyd, T., Mather, H. E., & Gilmartin, B. (1988). Near retinoscopy and refractive error. Clinical and Experimental Optometry, 71(4), 114-118.
9. Bullimore, M. A., Fusaro, R. E., & Adams, C. W. (1998). The repeatability of automated and clinician refraction. Optometry and vision science: official publication of the American Academy of Optometry, 75(8), 617-622.
10. Lopez-Gil, N., Peixoto-de-Matos, S. C., Thibos, L. N., & Gonzalez-Meijome, J. M. (2012). Shedding light on night myopia. Journal of Vision, 12 (5):4, 1-9.
11. Heath GG. (1956). The influence of visual acuity on accommodative responses of the eye. Am J Optom Arch Am Acad Optom. 33, 513-524.
SUMMARY OF THE INVENTION
The present invention refers to computer-implemented systems and methods for interactively measuring either or both sides of the interval of clear vision of the eye. They are based on measurements of distance between a subject’s head and an electronic device corresponding to either or both of the SICV, which is a proxy for measurements of ray vergence.
The system proposed herein can include the following components of an electronic device:
a. Distance measurement circuitry, which can include passive components such as one or more cameras, or active components such as emitters, detectors, or others, or any combination thereof.
b. User interface, which can include an electronic screen with a tactile surface or a keypad or a microphone, or others, or any combination thereof.
c. Control circuitry and processing circuitry, which can include processors, memory modules, wired or wireless connections between system modules and components and remote networks, or others, or any combination thereof.
The method proposed herein can comprise the steps of:
a. Acquiring information about a user, such as age (AGE), gender, geographic location, eye to be tested or others, or any combination thereof.
i. According to some embodiments of the present invention, the acquiring information about a user comprises configuring a user interface module to prompt a user to input said information into a user interface.
ii. According to some embodiments of the present invention, the acquiring information about a user comprises automatically detecting said information based on an image of a user’s head from a camera included in an electronic device or other databases.
iii. According to some embodiments of the present invention, the acquiring information about a user comprises configuring a connectivity and/or storage module to retrieve said information from a local or remote server or memory.
b. Displaying a target on an electronic screen such as one or more letters or optotypes, a geometrical pattern or a still or moving picture, or other, or any combination thereof.
i. According to some embodiments of the present invention, the displaying a target on an electronic screen comprises changing the size, shape, rotation, color, background color or other characteristics such as spatial frequency of one or more components of the target or background, or any combination of any of the above, in accordance with a user interaction with an electronic device using a user interface. For instance, in response to a certain user input or when a certain condition is met.
ii. According to some embodiments of the present invention, the displaying a target on an electronic screen comprises changing the size, shape, rotation, color, background color or other characteristics such as spatial frequency of one or more components of the target or background, or any combination of any of the above, with a change of distance between a user’s head and an electronic device.
iii. According to some embodiments of the present invention, the displaying a target on an electronic screen comprises changing the size, shape, rotation, color, background color or other characteristics such as spatial frequency of one or more components of the target or background, or any combination of any of the above, with a change in a parameter of the user’s environment, such as time of day, ambient light level, ambient light color temperature, or others, or any combination of any of the above. For instance, a target may dynamically change in respect of one or more characteristics according to a time interval. The time interval can be individual for a given characteristic or for a group of characteristics. The time interval can be preconfigured in the electronic device.
c. Changing the distance between a user’s head and an electronic device until a subjective visual quality criterion either is satisfied or ceases to be satisfied with a target displayed on said electronic screen.
i. According to some embodiments of the present invention, changing the distance between a user’s head and an electronic device comprises holding a device in a user’s hand and bringing it closer to the user’s head or further away from the head.
ii. According to some embodiments of the present invention, the changing the distance between a user’s head and an electronic device comprises situating one or more reflective surfaces in front of the electronic device, and changing a distance between the device and the surface or a distance between the head and the surface, or any combination thereof.
iii. According to some embodiments of the present invention, the changing the distance between a user’s head and an electronic device comprises changing the distance by a third party, such as another person, another apparatus, or other, or any combination of any of the above.
iv. According to some embodiments of the present invention, the satisfying a subjective visual criterion comprises said target or a part thereof becoming perceivable, detectable, recognizable, distinguishable, resolvable, legible or discriminable by a user.
v. According to some embodiments of the present invention, the ceasing to satisfy a subjective visual criterion comprises said target or a part thereof becoming no longer perceivable, detectable, recognizable, distinguishable, resolvable, legible or discriminable by a user.
d. Measuring the position of either or both SICV where a certain visual quality criterion regarding the target either is satisfied or ceases to be satisfied.
i. According to some embodiments of the present invention, the measuring of the position of either or both SICV can comprise displaying on an electronic screen to a user a target with spatial features oriented at a certain angle α and measuring a corresponding distance between the user and the target.
ii. According to some embodiments of the present invention, the measuring of the position of either or both SICV can further comprise displaying to a user a target with spatial detail at a different angle β, which can be perpendicular to angle α, and measuring a corresponding distance between the user and the target.
iii. According to some embodiments of the present invention, the measuring of the position of either or both SICV can comprise configuring a distance measurement circuitry included in the electronic device to perform measurements.
iv. According to some embodiments of the present invention, the measuring of the position of either or both SICV can comprise using an external apparatus to perform measurements, such as a ruler, a rangefinder or other, or any combination of any of the above.
v. According to some embodiments of the present invention, the visual quality criterion can comprise a visual acuity criterion (e.g. resolving lines, letters, etc.), a contrast sensitivity criterion (e.g. distinguishing tones of gray), a color discrimination criterion (e.g. distinguishing colors), subjective clarity, or other, or any combination of any of the above.
vi. According to some embodiments of the present invention, the satisfying or ceasing to satisfy a certain visual quality criterion with the target can comprise continuous evaluation of the target by the user when the distance between the user’s head and the device is changed. For example, the user can stop seeing a fine spatial pattern in the target at a certain distance as it is moved away from the user’s head and it becomes blurred.
Measuring either or both sides of the interval of clear vision of the eye of a user by displaying a changing target makes it possible to control a user's subjective visual quality criterion by allowing target comparison. This technique may be suitable for obtaining more precise measurements.
BRIEF DESCRIPTION OF DRAWINGS
The following figures accompanying the detailed description below serve to further illustrate the nature of the present invention and its advantages:
FIG. 1 is a schematic of an illustrative system for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention.
FIG. 2 is a block diagram of an illustrative electronic device for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention where a camera is included in the distance measurement module and a screen is included in the user interface.
FIG. 3 is an example view of an illustrative screen of an electronic device for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention. Several possible configurations of the target, including changes in time are presented at the bottom.
FIG. 4 is a flowchart of an illustrative sub-process for changing a target with distance between a head and electronic device in accordance with one embodiment of the invention.
FIG. 5 is a flowchart of an illustrative process for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention.
DETAILED DESCRIPTION
The present invention is directed to computer-implemented systems and methods for interactively measuring either or both sides of the interval of clear vision of the eye. The method is based on interactive measurements of distance between a user’s head and an electronic device, specifically corresponding to either or both SICV. The present invention, in some embodiments thereof, can provide systems for allowing users to accurately measure refractive errors or amplitude of accommodation with or without wearing optical correction, or power or addition of their reading glasses, or other people’s eyes or other people’s reading glasses.
FIG. 1 is a schematic of an illustrative, computer-implemented system for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention. The system 100 can include distance measurement circuitry 110, user interface 120, control circuitry 130, processing circuitry 140, storage 150, and communications circuitry 160. In some embodiments, one or more of the device's components can be combined or omitted. In some embodiments, system 100 can include additional components not included in FIG. 1, or a combination of any of the aforementioned components.
The system 100 can include any suitable type of electronic device with distance measurement circuitry used to measure the distance between the user’s head and the device. For example, the system 100 can include any of the following devices equipped with a camera: a mobile phone, a tablet, a “smart” television set, a personal digital assistant (PDA), a laptop or desktop computer, a stand-alone camera or video-recorder, and any other suitable device. The electronic device included in system 100 is preferably, but not limited to, a portable device.
Distance measurement circuitry 110 can include any circuitry, emitters and detectors to measure the distance between the user’s head or part of it and the electronic device. In some embodiments, distance measurement circuitry 110 can include a passive system comprising one or more cameras for capturing images of the user’s head and circuitry to compute the distance between the user’s head or part of it and the device from said images. In some embodiments, distance measurement circuitry 110 can include an active system comprising one or more emitters and detectors for measuring said distance.
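As a sketch of how such a passive, single-camera system might compute distance from images of the user's head, one common approach uses a pinhole-camera model and a facial feature of approximately known physical size, such as the interpupillary distance. The ~63 mm average and the function name below are assumptions of this illustration, not values given in the document:

```python
def distance_from_feature(feature_px, focal_length_px, feature_m=0.063):
    """Similar-triangles (pinhole-camera) estimate: a feature of physical
    size feature_m (meters) spanning feature_px pixels in an image taken
    by a camera with focal length focal_length_px (pixels) lies at this
    distance (meters) from the camera."""
    return feature_m * focal_length_px / feature_px

# e.g. pupils imaged 90 px apart with a 1000 px focal length -> ~0.7 m away
d = distance_from_feature(90.0, 1000.0)
```

In practice the camera's focal length in pixels would come from device calibration data, and the per-user feature size could be refined from user input, which is why the text also allows dimensions such as head width or iris diameter to serve as the reference.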
User interface 120 can include any suitable mechanism for interaction with a user such as one or more screens, loudspeakers, tactile surfaces, keypads, microphones, or others, or any combination thereof. For example, in some embodiments, user interface 120 can include a tactile electronic screen for displaying targets and receiving user input.
Control circuitry 130 can include any type of circuitry, such as processors, micro-controllers and connections to control the functions, operations and performance of an electronic device included in system 100. Furthermore, control circuitry 130 can be electronically coupled with other components of the system 100, or any combination thereof. For example, in some embodiments of the invention, control circuitry 130 can send a control signal to user interface 120 to configure it for receiving input from a user or giving instructions to a user.
Processing circuitry 140 can include any type of circuitry, such as processors, micro-controllers and connections designed to process the data from distance measurement circuitry 110, user interface 120, and other components of the system 100, or any combination thereof for interactively measuring either or both sides of the interval of clear vision of the eye. Furthermore, processing circuitry 140 can be electronically coupled with other components of the system 100, or any combination thereof. For example, in some embodiments of the invention, processing circuitry 140 can send a signal to control circuitry 130 to configure the user interface 120 or distance measurement circuitry 110. Storage 150 can include one or more storage media, such as internal or external memory of any type, such as: HDD, SSD, RAM, ROM, EPROM, Flash EEPROM, flash memory card such as an SD (i.e. Secure Digital) card or CF (i.e. Compact Flash) card, or any other type of memory suitable for the electronic device included in system 100.
Communications circuitry 160 can include any circuitry suitable to connect the electronic device included in system 100 to a communications network and transmit data using any suitable protocol such as, for example, Wi-Fi (e.g., 802.11 protocol), Bluetooth®, cellular protocol (e.g., GSM, GPRS, CDMA, EDGE, LTE), or any other communications protocol or any combination thereof.
FIG. 2 is a block diagram of an illustrative electronic device 200 for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention.
Electronic device 200 can be very similar to the electronic device included in the system 100 shown in FIG. 1 and share descriptions of components of the latter. For example, electronic device 200 can also include storage 250 and communications circuitry 260, that can be substantially similar to respective components of electronic device in system 100; storage 150, and communications circuitry 160, or others, or any combination thereof.
Distance measurement circuitry 210 can be similar to distance measurement circuitry 110 and use any suitable technique or combination of techniques for measuring distance between a user’s head 270 and electronic device 200.
User interface 220 can be connected to control circuitry 230 and processing circuitry 240. User interface (120; 220) can be configured to provide instructions to a user by means of a visual instruction message (see 304 in FIG. 3), or an audio message, or other user interface method, or any combination of the above methods. Furthermore, user interface (120; 220) can be configured to receive input from the user by means of touching or swiping a tactile screen, or typing on a keypad or keyboard, issuing a voice command by speaking into a microphone, performing a gesture detected by a camera, performing a gesture detected by a gyroscope or others or any combination of any of the above.
Control circuitry 230 can be similar to control circuitry 130, and processing circuitry 240 can be similar to processing circuitry 140. Processing circuitry 240 can use any suitable technique or combination of techniques for interactively measuring either or both sides of the interval of clear vision of the eye from measurements of distance between a user’s head or part of it 270 and electronic device 200 obtained from distance measurement circuitry 210, and user input obtained from user interface 220, both configured by signals from control circuitry 230.
Control circuitry 230 can configure the user interface 220 to instruct the user to slowly bring the electronic device 200 into proximity to the user’s head 270, until the screen 220 can only barely be read due to said proximity, corresponding to a distance coincident with a near point NP situated on the near side of the interval of clear vision. As another example, the user interface 220 can instruct the user to slowly bring the device 200 into proximity of the user’s head 270, until the screen 220 can no longer be read due to said proximity, corresponding to a distance beyond the interval of clear vision.
Similarly, user interface 220 can instruct the user to slowly move the electronic device 200 further away from the user’s head 270, until the tactile screen 220 can only barely be read due to its distance from the head, corresponding to a distance coincident with a far point FP situated on the far side of the interval of clear vision. As yet another example, the user interface 220 can instruct the user to separate the device 200 further away from the user’s head 270, until the screen 220 can no longer be read due to said separation, corresponding to a distance beyond the far side of the interval of clear vision.
Furthermore, control circuitry 230 can configure the user interface 220 to instruct the user (or another person) to indicate said proximity to, or separation from, the electronic device by touching the tactile screen or issuing a voice command to the voice recognition module, or another input method, or a combination of any of the above. The processing circuitry 240 can then use this user input and the corresponding measurement of distance between the user’s head 270 and the electronic device 200 obtained from distance measurement circuitry 210 to measure pNP, dNP, pFP, or dFP. Furthermore, processing circuitry 240 can use any suitable technique or combination of techniques for computing either or both SICV, using additional information such as the user’s age, gender, eye to be tested, or others, or any combination thereof.
In some embodiments processing circuitry 240 can automatically detect the user’s age, gender or eye to be tested, or others, or any combination of any of the above parameters from an image of the user’s head 270 from a camera included in distance measurement circuitry 210. In some embodiments, processing circuitry 240 can obtain said user’s age, gender, eye to be tested, previous refraction or power or addition of reading glasses, or any combination of the above by sending a signal to control circuitry 230 configuring the tactile screen included in the user interface 220 to prompt the user to input these parameters. In some embodiments processing circuitry 240 can obtain said parameters by sending a signal to control circuitry 230 to configure storage 250 or communications circuitry 260 to retrieve said parameters from local or remote memory.
FIG. 3 is an example view of an illustrative screen of an electronic device 300 for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention where a target is displayed on a screen included in the user interface.
Electronic device 300 can be substantially similar to device 100 shown in FIG. 1 and device 200 shown in FIG. 2 and share descriptions of components with either or both. For example, the electronic device 300 can include a camera in distance measurement circuitry 310 and a tactile screen in the user interface 320. In some embodiments the tactile screen included in the user interface 320 can be configured to show a target 330 to the user, including but not limited to the following types of targets: an optotype 330a, text 330b, a geometrical pattern 330c, 330d, a grey discrimination test 330e, a color discrimination test 330f, a spatial geometry test 330g, or a picture or movie 330h, or any combination of any of the above.
In some embodiments the target 330 or a component of the target, including foreground, background or a part of its shape or form, can be configured to change its size, shape, rotation, color, background color or spatial frequency composition or other characteristics, or any combination of the above, as a function of the measured distance between the user’s head 270 and electronic device 300, or as a function of another dimension measured in the user’s head by the distance measurement circuitry 310, which changes proportionally with said distance, such as the interpupillary distance, head width, iris diameter, or others, or any combination thereof. A time interval can be defined to control the speed of change desired for certain characteristics, either individually or in groups. For instance, every 3 seconds an optotype may change its color, and every 2 seconds it may change its size and shape, etc.
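One common reason to scale a target with distance is to keep its angular size on the retina constant. As an illustrative sketch (the function name, the chosen visual angle, and the pixels-per-meter screen density are assumptions, not details from the application):

```python
import math

def target_size_px(distance_m: float, angle_deg: float, px_per_m: float) -> float:
    """Screen size (in pixels) so that the target subtends a fixed visual
    angle at the given viewing distance."""
    # Physical size that subtends angle_deg at distance_m (isoceles geometry).
    size_m = 2.0 * distance_m * math.tan(math.radians(angle_deg) / 2.0)
    return size_m * px_per_m

# A 0.5 deg optotype at 0.4 m on a screen with ~4000 px per meter.
s = target_size_px(0.4, 0.5, 4000.0)
```

Because the size grows linearly with distance, doubling the measured distance doubles the rendered size, so the target looks the same angular size wherever the device is held.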
In some embodiments the target 330 or a component of the target, including foreground, background, or a part of its shape or form, can be configured to change size, shape, rotation, color, background color or spatial frequency composition or other characteristics, or any combination of the above, as a function of a parameter of the user’s environment, such as time of day, ambient light level, ambient light color temperature, or others, or any combination of any of the above. For example, an optotype 330a shape (e.g. letter E, letter C, etc.) can change over time. As another example, background color or background brightness can change with ambient light level, color temperature or other spectral or chromatic characteristics of the environment, e.g. as a result of a passing cloud, or during a sunset or a sunrise.
In some embodiments the target 330 can be configured to change its characteristics depending on the user input from the user interface 320. For example, target 330 or a component of the target can be configured to change size, shape, rotation, color, background color or spatial frequency composition or other characteristics, or any combination of any of the above as a result of user interaction with the electronic device 300 using the tactile screen 320 such as a swipe, tap, click, or voice command or other gesture, or any combination of any of the above. Furthermore, in some embodiments the user interaction with user interface 320 can be performed using a keypad, keyboard, mouse, microphone, or any other interface method, or any combination of any of the above.
FIG. 4 is a flowchart of an illustrative sub-process 400 for changing a target with distance between a head and electronic device in accordance with one embodiment of the invention. Sub-process 400 can consist of several steps. In some embodiments the order of steps of sub-process 400 can be changed or some steps can be omitted or repeated. Furthermore, sub-process 400 can be included in another process (parent process) as a sub-process.
Sub-process 400 can be performed by an electronic device (100; 200; 300) with distance measurement circuitry (110; 210; 310) and user interface (120; 220; 320), and one or more other components of the electronic device (100; 200; 300). The first step of sub-process 400 can continue from a parent process and begin with block 410 where the user interface (120; 220; 320) can be configured to display a target 330 on a screen 320. For example, in one embodiment of the invention, said target can be an optotype 330a, or text 330b, or one or more parallel lines 330c, 330d, or one or more grey patches 330e or color patches 330f, or a geometrical pattern such as a grid 330g, or a picture 330h, or other type of target, or any combination of any of the above.
At block 420 a user can change the distance between the user’s head 270 and electronic device (100; 200; 300). Furthermore, distance measurement circuitry (110; 210; 310) can send to processing circuitry (140; 240) a signal including a measurement of distance between a user’s head 270 and electronic device (100; 200; 300). As previously indicated, distance measurement circuitry (110; 210; 310) can use any suitable technique or combination of techniques for measuring distance between a user’s head 270 and electronic device. Furthermore, said distance between the user's head and the device can be measured using another method (such as a ruler or rangefinder) and input into the user interface (120; 220; 320).
In some embodiments of the present invention, the change of the distance between the user’s head 270 and electronic device can comprise holding the device in a user’s hand and bringing it closer to the face or further away from the face.
In some embodiments of the present invention, the change of distance between the user’s head 270 and electronic device can comprise situating a reflective surface, such as a mirror, in front of the electronic device (so that a reflection of a user’s head 270 is within the field of view of the electronic device), and changing the distance between the device and a mirror, or the distance between said user’s head or part of it 270 and a mirror, or any combination thereof.
In some embodiments of the present invention, changing the distance between the user’s head 270 and electronic device can comprise changing the distance by a third party, such as another person, another apparatus, or other, or any combination of any of the above.
Block 430 can be a decision block where the user interface (120; 220; 320) can be configured to instruct the user to evaluate if a target 330 or a component of the target meets a certain visual quality criterion. For example, in one embodiment of the invention, said visual quality criterion can be a visual acuity criterion (e.g. being able to read optotypes (330a) or text (330b), or resolve two or more parallel lines (330c, d), or other, or any combination of any of the above). As another example, in one embodiment of the invention, said visual quality criterion can be a contrast sensitivity criterion (e.g. being able to distinguish grey patches (330e) or match grey patches) or a color discrimination criterion (e.g. being able to distinguish colors (330f) or match colors), or a spatial geometry criterion (e.g. being able to detect deformations in geometrical patterns (330g) such as warping of a grid) or recognize pictures or details in pictures (330h), or other criterion, or any combination of any of the above. Furthermore, the said evaluation if a target 330 or a component of the target meets or ceases to meet a certain visual quality criterion in block 430 can include the evaluation of: continuation, cessation or satisfaction of said visual quality criterion.
Furthermore, in some embodiments of the present invention, displaying a target 330 comprises changing the size, shape, rotation, color, background color or other characteristics, such as spatial frequency, of one or more components of the target or background, or any combination of any of the above, with a change in a parameter of the user’s environment, such as time of day, ambient light level, ambient light color temperature, or others, or any combination of any of the above. In such embodiments, the evaluation at block 430 of whether a target 330 or a component of the target meets or ceases to meet a certain visual quality criterion can depend on said parameter of the user’s environment. For example, the user interface (120; 220; 320) can be configured to instruct the user to evaluate if a target 330 comprising an optotype 330a meets or ceases to meet a certain visual quality criterion at a certain point in time whilst said optotype shape changes from optotype E to optotype C or another combination of optotype shapes. In order to define the pace of change of certain characteristics, either individually or in groups, time intervals can be preset. For instance, an optotype may change its color every 3 seconds and its size and shape every 2 seconds, etc. Additionally, these time intervals may be activated only upon triggering a certain condition.
In one embodiment of the invention, at block 430 the user can be instructed to evaluate whether or not a target or a component of the target has ceased to meet said visual quality criterion, for example having become blurry, distorted, discolored, unclear, or other.
Furthermore, at the decision block 430 if user input to the user interface (120; 220; 320) indicates that a target 330 satisfies or ceases to satisfy a certain visual quality criterion, the sub-process 400 can proceed to block 440. For example the user can stop seeing a fine spatial pattern in the target at a certain distance as it is moved away from the user’s head and it becomes blurred, in which case the sub-process 400 can proceed to block 440.
On the other hand, at decision block 430, if the user interface (120; 220; 320) does not indicate user input regarding a certain visual quality criterion, sub-process 400 can go to block 450, which can be a decision block. At block 450, if the distance between a user’s head 270 and electronic device cannot be further changed (for example, a user is not able to move the electronic device further away than arm’s length), sub-process 400 can proceed to block 440.
At block 440 a distance between a user’s head 270 and the electronic device (100; 200; 300) can be stored in storage (150; 250) or transmitted by communications circuitry (160; 260) along with, but not limited to user input data. Furthermore, at block 440 sub-process 400 can return to a parent process in which it can be included.
On the other hand, at block 450, if said distance between a user’s head 270 and electronic device can be further changed, sub-process 400 can proceed to block 460. At block 460, the user interface can be configured to change characteristics of a target 330 or a component of the target. For example, in one embodiment of the invention the distance measurement circuitry (110; 210; 310) can send to processing circuitry (140; 240) a signal including a measurement of distance between a user’s head 270 and electronic device (100; 200; 300). The processing circuitry can use any technique or combination of techniques to process said signal and send a signal to control circuitry (130; 230), which in turn can configure the user interface (120; 220; 320) to change characteristics of the target 330 or a component of the target, such as size, shape, rotation, color, background color, spatial frequency composition or other characteristics, or any combination of any of the above, as a function of said distance between a user’s head 270 and electronic device, or as a function of another dimension measured in the user’s head by the distance measurement circuitry, which changes proportionally with said distance, such as the interpupillary distance, head width, iris diameter, or others, or any combination thereof.
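The loop formed by blocks 410-460 can be sketched as a small driver function. This is an illustrative reading of the flowchart, not an implementation from the application; the callback names are invented, and a real device would wire them to the distance measurement circuitry and user interface described above:

```python
def run_subprocess_400(measure_distance, user_confirms, update_target,
                       can_change_distance, max_steps=1000):
    """Sketch of sub-process 400: loop until the user reports that the
    visual quality criterion is satisfied (or ceases to be satisfied),
    or until the distance cannot be changed further (block 450);
    return the last measured distance (stored at block 440)."""
    distance = measure_distance()              # block 420
    for _ in range(max_steps):
        if user_confirms(distance):            # decision block 430
            return distance                    # block 440: store/transmit
        if not can_change_distance(distance):  # decision block 450
            return distance                    # block 440: store/transmit
        update_target(distance)                # block 460: rescale target
        distance = measure_distance()          # back to block 420
    return distance
```

For example, simulating a user who confirms blur once the device reaches 0.5 m reproduces the far-point measurement described in the text.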
FIG. 5 is a flowchart of an illustrative process 500 for interactively measuring either or both sides of the interval of clear vision of the eye in accordance with one embodiment of the invention. Process 500 can consist of several steps. In some embodiments the order of steps of process 500 can be changed or some steps can be omitted or repeated.
Process 500 can be performed by an electronic device (100; 200; 300) with distance measurement circuitry (110; 210; 310) and user interface (120; 220; 320), and one or more other components of the electronic device (100; 200; 300).
Process 500 can begin at block 510 where a user interface (120; 220; 320) of an electronic device (100; 200; 300) can be configured to receive user input information, such as age, gender, value of the sphero-cylindrical power of the ophthalmic or contact lenses already worn by the user, vertex distance, or others, or any combination thereof. For example, in one embodiment of the invention said information can be acquired by configuring the user interface (120; 220; 320) to prompt the user to input said information into the user interface using a tactile screen or voice recognition circuitry, or other, or any combination thereof. As another example, in one embodiment of the invention said information can be acquired automatically by detecting it from an image of a user’s head 270 from a camera included in said user interface.
At block 520, user interface (120; 220; 320) can be configured to instruct a user to use one (left or right) eye or both eyes while interacting with an electronic device (100; 200; 300).
At block 530 process 500 can include sub-process 400 (see FIG. 4). For example, in one embodiment of the invention, at the decision block 430 of sub-process 400 included at block 530 of process 500 a user can provide user input to user interface (120; 220; 320) indicating that visual quality of a target 330 or a component of the target satisfies or ceases to satisfy a certain visual quality criterion, corresponding to an electronic device (100; 200; 300) being situated on either side of the interval of clear vision SICV. At block 440 (see FIG. 4) of sub-process 400 included at block 530 of process 500, the distance between an electronic device (100; 200; 300) and a subject’s head 270 can be stored in storage (150; 250) or transmitted using communications circuitry (160; 260).
At block 540 the user interface (120; 220; 320) can be configured to display a new target and instruct a user to select a preferred angle of target orientation while interacting with an electronic device (100; 200; 300). As an example, in one embodiment of the invention, the user interface (120; 220; 320) of an electronic device (100; 200; 300) situated in proximity of dFP or pNP (which are the two outer points) can be configured to receive user input including a preferred angle of target orientation adFP or apNP. In one embodiment of the invention, the user interface (120; 220; 320) can be configured to change a target 330 or a component of the target on a tactile screen 320 in response to user input such as touching or swiping a tactile screen, or typing on a keypad or keyboard, speaking into a microphone, performing a gesture detected by a camera, performing a gesture detected by a gyroscope, or others, or any combination of any of the above. As a further example, in one embodiment of the invention, user interface (120; 220; 320) can be configured to display a target 330 including, but not limited to, a set of parallel or concentric lines on a tactile screen 320, and to receive user input from said tactile screen 320 and perform a change of orientation of said target 330 by angle adFP or apNP.
At block 550, a preferred angle of target orientation adFP selected by a user at block 540 can be stored in storage (150; 250) or transmitted using communications circuitry (160; 260).
At block 560 process 500 can include sub-process 400 (see FIG. 4). At block 410, user interface (120; 220; 320) can be configured to display a new target 330 including, but not limited to a set of parallel or concentric lines on a tactile screen 320 oriented at angle adFP or apNP.
As previously described, at decision block 430 of sub-process 400 included at block 560 of process 500 a user can provide user input to user interface (120; 220; 320) indicating that visual quality of said target 330 or a component of the target satisfies or ceases to satisfy a certain visual quality criterion corresponding to an electronic device (100; 200; 300) being situated on either side of the interval of clear vision, at a distance in the vicinity of either dFP or pNP. At block 440 of sub-process 400 said distance can be stored in storage (150; 250) or transmitted using communications circuitry (160; 260).
At block 570 process 500 can include sub-process 400 (see FIG. 4). At block 410, user interface (120; 220; 320) can be configured to display a new target 330 including, but not limited to a set of parallel lines on a tactile screen 320 oriented at angle apFP = adFP + 90° or adNP = apNP - 90°.
As previously described, at decision block 430 of sub-process 400 included at block 570 of process 500 a user can provide user input to user interface (120; 220; 320) indicating that visual quality of said target 330 or a component of the target satisfies or ceases to satisfy said visual quality criterion, corresponding to an electronic device (100; 200; 300) being situated on either side of the interval of clear vision, at a distance in the vicinity of either pFP or dNP (which are the inner two points). At block 440 of sub-process 400 said distance can be stored in storage (150; 250) or transmitted using communications circuitry (160; 260).
At block 580 processing circuitry (140; 240) can use any technique or combination of techniques to compute ocular refractive errors such as, but not limited to, sphere (SPH), cylinder (CYL) and axis (AXS) from dFP, pFP, adFP, apFP, dNP, pNP, adNP, or apNP, or others, or any combination of any of the above.
In one embodiment of the invention, AXS can be calculated from adFP and apFP using, for example, the following equations:

AXS = 90° - adFP when 0° < adFP ≤ 90°; or
AXS = 270° - adFP otherwise; Eq. 1

and adFP = apFP - 90°. Eq. 2

Furthermore, AXS can be calculated from adNP and apNP using, for example, the following equations:

AXS = 90° - adNP when adNP < 90°; or
AXS = 270° - adNP otherwise; Eq. 3

and adNP = apNP - 90°, Eq. 4

where adNP, apNP, adFP, and apFP are expressed in units of degrees from 1° to 180°.
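As an illustrative sketch (not part of the application text), the near-point axis relations Eq. 3-4 can be read directly as code; the function name is invented:

```python
def axis_from_near_angles(apNP_deg: float) -> float:
    """Cylinder axis AXS (degrees) from the preferred near-point target
    orientation apNP (degrees, 1-180), per Eq. 3-4."""
    adNP = apNP_deg - 90.0        # Eq. 4: perpendicular orientation
    if adNP < 90.0:
        return 90.0 - adNP        # Eq. 3, first branch
    return 270.0 - adNP           # Eq. 3, otherwise

# Example: a preferred orientation of 100 deg gives an axis of 80 deg.
axs = axis_from_near_angles(100.0)
```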
SPH and CYL can be calculated from dFP, pFP using, for example, the following equations:
SPH = - 1/dFP + K. Eq. 5
CYL = - (1/pFP - 1/dFP), Eq. 6

where parameter K depends on target and background color. In the case of a black background, K = 0 D for white targets, K > 0 D for blue targets, and K < 0 D for red targets. The specific value of K depends on the emission spectrum of a physical target.
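As an illustrative sketch (not part of the application text), Eq. 5-6 can be written as a small function; the name and default K are illustrative:

```python
def refraction_from_far_points(dFP_m: float, pFP_m: float, K_d: float = 0.0):
    """Sphere and cylinder (Eq. 5-6) from the two far-point distances,
    expressed in meters; K (diopters) depends on target/background color."""
    SPH = -1.0 / dFP_m + K_d              # Eq. 5
    CYL = -(1.0 / pFP_m - 1.0 / dFP_m)    # Eq. 6
    return SPH, CYL

# A myope with dFP = 0.5 m and pFP = 0.4 m (black background, K = 0)
# comes out as SPH = -2.00 D, CYL = -0.50 D.
sph, cyl = refraction_from_far_points(0.5, 0.4)
```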
In one embodiment of the invention, SPH and CYL can be also calculated from dNP, pNP using, for example, the following equations:
SPH = AA - 1/dNP + K. Eq. 7
CYL = - (1/pNP - 1/dNP), Eq. 8
Values of dFP, pFP, dNP, pNP can be expressed in meters and K in diopters. The AA value, expressed in diopters, can depend on AGE, expressed in years, as:

AA = 15.6 - 0.3 * AGE when AGE <= 52 years; or
AA = 0 D otherwise. Eq. 9

Furthermore, at block 580 processing circuitry (140; 240) can use any technique or combination of techniques to compute the power of reading glasses (P) from dNP or pNP, or other parameters, or any combination thereof. For example, in one embodiment of the invention, the power of the reading glasses P can be calculated as:
P = 3 D - E(1 / ((dNP + pNP)/2) + K), when E(1 / ((dNP + pNP)/2) + K) < 3 D; or
P = 0 D otherwise. Eq. 10

where P can be expressed in diopters and E can be a constant value between 0 and 1.
As described previously, at block 520 of process 500, the user interface (120; 220; 320) can be configured to instruct a user to use one (left or right) eye or both eyes while interacting with an electronic device (100; 200; 300). As an example, in one embodiment of the invention, at block 410 (see FIG. 4) of sub-process 400 included at block 530 of process 500, user interface (120; 220; 320) can be configured to display a target 330 including, but not limited to a text (330b). At block 440 of sub-process 400 a near point distance NP can be stored in storage (150; 250) and the power P of the reading glasses can be calculated as:
P = 3 D - E(1 / NP + K), when E(1 / NP + K) < 3 D
P = 0 D otherwise. Eq. 11 where NP can be expressed in meters.
In one embodiment of the invention, addition of reading glasses (ADD) can be calculated using the following equations:
ADD = P - (SPH + CYL/2) when P > (SPH + CYL/2); or
ADD = 0 D otherwise. Eq. 12
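The near-point relations (Eq. 7-9) and the reading-glasses relations (Eq. 10 and 12) can be sketched together in code. This is an illustrative reading of the equations, not an implementation from the application; the function names are invented:

```python
def amplitude_of_accommodation(age_years: float) -> float:
    """Eq. 9: age-dependent amplitude of accommodation AA (diopters)."""
    return 15.6 - 0.3 * age_years if age_years <= 52 else 0.0

def refraction_from_near_points(dNP_m: float, pNP_m: float,
                                age_years: float, K_d: float = 0.0):
    """Eq. 7-8: SPH and CYL from the two near-point distances (meters)."""
    AA = amplitude_of_accommodation(age_years)
    SPH = AA - 1.0 / dNP_m + K_d          # Eq. 7
    CYL = -(1.0 / pNP_m - 1.0 / dNP_m)    # Eq. 8
    return SPH, CYL

def reading_glasses_power(dNP_m: float, pNP_m: float,
                          E: float = 1.0, K_d: float = 0.0) -> float:
    """Eq. 10: power of reading glasses P (diopters); E is a constant
    between 0 and 1."""
    demand = E * (1.0 / ((dNP_m + pNP_m) / 2.0) + K_d)
    return 3.0 - demand if demand < 3.0 else 0.0

def reading_addition(P_d: float, SPH_d: float, CYL_d: float) -> float:
    """Eq. 12: addition ADD from reading-glasses power and the spherical
    equivalent (SPH + CYL/2) of the distance refraction."""
    equivalent = SPH_d + CYL_d / 2.0
    return P_d - equivalent if P_d > equivalent else 0.0
```

For example, with both near points at 0.5 m and E = 1, K = 0, Eq. 10 gives P = 1.00 D.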
Eq. 1-12 correspond to corneal-plane refraction.
Furthermore, at block 580 of process 500, processing circuitry (140; 240) included in an electronic device (100; 200; 300) can use any suitable technique or combination of techniques to compute spectacle-plane refraction or power of reading glasses from corneal-plane refraction, using dFP, pFP, dNP, pNP, FP, NP, vertex distance (VD), or others, or any combination thereof. VD depends on the type of correction (usually 0.0 m for contact lenses and 0.014 m for glasses).
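The corneal-plane to spectacle-plane conversion is commonly performed with the standard lens-effectivity equation. The sketch below assumes that formulation (the application does not spell out the conversion formula), with the 0.014 m vertex distance mentioned for glasses as the default:

```python
def corneal_to_spectacle(F_corneal_d: float, vd_m: float = 0.014) -> float:
    """Move a power (diopters) from the corneal plane to a spectacle plane
    situated vd_m meters in front of the cornea (lens effectivity)."""
    return F_corneal_d / (1.0 + vd_m * F_corneal_d)

def spectacle_to_corneal(F_spectacle_d: float, vd_m: float = 0.014) -> float:
    """Inverse conversion: spectacle-plane power back to the corneal plane."""
    return F_spectacle_d / (1.0 - vd_m * F_spectacle_d)

# A -10 D spectacle lens at 14 mm corresponds to about -8.77 D at the cornea,
# which is why myopes need less minus power in contact lenses than in glasses.
fc = spectacle_to_corneal(-10.0)
```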
At block 590 parameters such as, but not limited to, SPH, CYL, AXS, FP, NP, P, ADD, dFP, pFP, adFP, apFP, dNP, pNP, adNP, apNP, VD, user input, or others, or any combination of any of the above can be saved in storage (150; 250).


CLAIMS

1. A computer-implemented method for interactively measuring either or both sides of the interval of clear vision of the eye of a user of an electronic device, comprising:
• displaying at least one target on an electronic screen of the electronic device,
• changing the vergence of light rays originating at said target and arriving at the user’s head,
• receiving input from the user indicating that the electronic device is positioned at a distance from the user’s head or part of it where the visual quality of said target or a component of the target satisfies or ceases to satisfy a certain visual quality criterion,
• computing at least one position of at least one of the sides of the interval of clear vision based on at least said distance.

2. The computer-implemented method of claim 1, wherein displaying a target on an electronic screen of the electronic device comprises changing one or more displaying characteristics of said target or a component of the target such as, but not limited to: size, orientation, position, chromatic and spatial frequency components, wherein said characteristics change independently of each other with distance between said user and said electronic device.

3. The computer-implemented method according to any of claims 1 or 2, wherein said target displayed on an electronic screen of the electronic device comprises:
• a single letter, optotype or a group thereof,
• a text,
• a geometrical pattern,
• a color or grayscale pattern,
• a repetitive pattern such as a grid,
• a picture or movie, other spatial stimuli,
• or any combination of any of the above.

4. The computer-implemented method according to claim 1, wherein computing at least one position of at least one of the sides of the interval of clear vision comprises:
• interactively changing characteristics of said target or a component of the target on an electronic screen, wherein the displaying characteristic comprises: size, orientation or position, in order to satisfy or cease to satisfy said visual quality criterion according with the user’s preferred angle of target orientation;
• measuring a first distance between the electronic screen and the user’s head or part of it;
• changing a target so that it includes at least one line or curve oriented perpendicularly to said preferred angle of target orientation, and measuring a second distance between said electronic screen and the user’s head.

5. The computer-implemented method according to claim 4, wherein said preferred angle of target orientation is found by physically rotating the screen or the stimuli on the screen of the electronic device around the user’s line of sight.

6. The computer-implemented method according to claim 4, wherein said preferred angle of target orientation is computed from an image of the user’s head or part of it, which can be rotated with respect to the screen of the electronic device.

7. The computer-implemented method according to claim 4, wherein said first distance can be computed from a mathematical relation between the user’s age and said second distance.

8. The computer-implemented method according to any of claims 1 to 7, wherein a reflective surface can be placed between the user’s eye or a user’s head or part of it and said screen of the electronic device to change the vergence of rays travelling from said target to said user’s head.

9. The computer-implemented method according to any of claims 1 to 8, wherein at least one of said parameters of refraction, the parameters including SPH, CYL, AXS, P, or ADD, is calculated from a mathematical relation between said parameter and vertex distance from corneal-plane to spectacle-plane, or vice-versa.

10. The computer-implemented method according to claim 1, wherein said user’s age, AGE, can be obtained from one of the following methods:
• inputting AGE or date of birth into a user interface by the user,
• retrieving remotely from a database or server,
• detecting AGE from an image of the user’s head or part of it, using an age detection algorithm;
• or any combination thereof.

11. The computer-implemented method of any of previous claims 1 to 10, wherein displaying a target on an electronic screen of the electronic device comprises changing one or more displaying characteristics of said target or a component of the target, wherein the displaying characteristics include: size, orientation, position, chromatic and spatial frequency components.

12. The computer-implemented method according to claim 11, wherein said displaying characteristics of the target change independently of each other, and wherein the displaying characteristics of the target change depending on one or more parameters of the user’s environment, wherein the parameter comprises:
• day time,
• ambient light level,
• ambient light color temperature,
• ambient light spectral or chromatic characteristics of the environment.

The computer-implemented method of claims 11 or 12, wherein changing one or more displaying characteristics of said target or a component of the target is initiated by the user, or by the electronic device according to a time interval associated with one or more displaying characteristics.

A non-transitory storage medium comprising computer-executable instructions for performing the method for interactively measuring either or both sides of the interval of clear vision of the eye of a user of an electronic device according to any of previous claims 1 to 13.

A system for interactively measuring either or both sides of the interval of clear vision of the eye of a user of an electronic device, the system comprising:
• a distance measurement circuitry configured to detect a distance between the user’s head or part of it and the electronic device;
• a user interface configured to give instructions to the user and receive user input;
• an electronic screen for displaying and changing a target;
• a processing circuitry for computing the position of at least one of the sides of the interval of clear vision.

The system according to claim 15, wherein the processing circuitry of the electronic device can be further configured to:
• compute at least one parameter of refraction selected from sphere (SPH), cylinder (CYL), axis (AXS), power of the reading glasses (P), or addition (ADD) from the position of said either or both sides of the interval of clear vision, user’s age (AGE), chromatic or spatial displaying characteristics of said target, user-indicated angle of target orientation or other parameters, or any combination thereof.
• compute the amplitude of accommodation (AA) from the position of said either or both sides of the interval of clear vision, user’s age (AGE), chromatic or spatial displaying characteristics of said target, user-indicated angle of target orientation or other parameters, or any combination thereof.

The system according to claim 16, wherein the distance measurement circuitry and user interface of the electronic device can be further configured to:
• measure a rotation of the device with respect to an axis;
• measure a rotation or tilt of the user’s head or part of it with respect to said axis.

The system according to claim 17, wherein the axis is a line of sight between the user and said electronic device.

The system according to any of claims 15 to 18, wherein the user interface further comprises:
• a speaker,
• a microphone,
• voice recognition circuitry,
• gesture recognition circuitry,
• or any combination thereof.

An electronic device comprising the system according to any of claims 15 to 19, wherein the device is one of the following:
• a mobile phone;
• a tablet;
• a smart television;
• a personal digital assistant;
• a laptop computer;
• a desktop computer;
• a stand-alone camera;
• a game console;
• a video-recorder.

The electronic device according to claim 20, further comprising:
• a storage configured to save said parameters and other information in a memory of the electronic device;
• a communications circuitry configured to transmit said parameters and other information to and from a network.
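Claims 3, 4, and 16 together describe obtaining SPH, CYL, and AXS from distances measured with targets oriented along a preferred angle and perpendicular to it, i.e. from the powers of two principal meridians. The claims do not fix a notation, so the sketch below assumes the minus-cylinder convention; the function name is illustrative and not taken from the patent:

```python
def meridians_to_sph_cyl_axs(f1: float, f2: float, axis1_deg: float):
    """Convert two principal-meridian powers to sphero-cylinder form.

    f1: power (diopters) measured in the meridian at axis1_deg.
    f2: power (diopters) in the perpendicular meridian.
    Returns (SPH, CYL, AXS) in minus-cylinder convention: SPH is the
    more positive meridian power, CYL the (negative) difference, AXS
    the meridian carrying the SPH power.
    """
    if f1 >= f2:
        return f1, f2 - f1, axis1_deg % 180.0
    return f2, f1 - f2, (axis1_deg + 90.0) % 180.0
```

For example, powers of -1.00 D at 30° and -3.00 D at 120° yield the prescription -1.00 -2.00 × 30 under this convention.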
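Claim 9 invokes a mathematical relation between a refraction parameter and the vertex distance, from corneal-plane to spectacle-plane or vice-versa, without stating it. A standard paraxial conversion for a lens power moved through a vertex distance is sketched below; function names and the 12 mm default are illustrative assumptions, not text from the patent:

```python
def to_corneal_plane(f_spectacle: float, vertex_m: float = 0.012) -> float:
    """Effective power (D) at the cornea of a spectacle lens of power
    f_spectacle worn vertex_m meters in front of the eye."""
    return f_spectacle / (1.0 - vertex_m * f_spectacle)

def to_spectacle_plane(f_corneal: float, vertex_m: float = 0.012) -> float:
    """Inverse conversion: corneal-plane power back to the spectacle plane."""
    return f_corneal / (1.0 + vertex_m * f_corneal)
```

For instance, a -10.00 D spectacle lens at a 12 mm vertex distance is equivalent to about -8.93 D at the corneal plane, which is why high prescriptions differ between glasses and contact lenses.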
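Claims 7 and 16 refer to relations linking the measured sides of the interval of clear vision, the refraction, the amplitude of accommodation (AA), and the user’s age (AGE), again without specifying them. The sketch below uses the textbook definitions (the far point gives the spherical error; the dioptric width of the interval gives AA) and, as an assumed stand-in for the age relation, Hofstetter’s average formula, which the patent does not name:

```python
def interval_to_sph_and_aa(near_point_m: float, far_point_m: float):
    """Sphere (D) and amplitude of accommodation (D) from the two sides
    of the interval of clear vision, given as positive distances in
    meters. Use float('inf') for an emmetropic far point."""
    sph = -1.0 / far_point_m
    aa = 1.0 / near_point_m - 1.0 / far_point_m
    return sph, aa

def hofstetter_average_aa(age_years: float) -> float:
    """Expected average AA (D) by age: Hofstetter's 18.5 - 0.3 * age,
    clamped at zero (an assumed relation, not specified by the patent)."""
    return max(0.0, 18.5 - 0.3 * age_years)
```

A user whose interval of clear vision runs from 0.5 m (far point) to 0.125 m (near point) would thus measure as a -2.00 D myope with 6.00 D of accommodation.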
PCT/EP2023/052814 2022-02-06 2023-02-06 A computer-implemented systems and methods for interactively measuring either or both sides of the interval of clear vision of the eye WO2023148372A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ES202230087A ES2947436A1 (en) 2022-02-06 2022-02-06 Computer-implemented method for interactively measuring one or both edges of the interval of clear vision of the user's eye by means of an electronic device, and system and electronic device comprising the same (Machine-translation by Google Translate, not legally binding)
ESP202230087 2022-02-06

Publications (1)

Publication Number Publication Date
WO2023148372A1 true WO2023148372A1 (en) 2023-08-10

Family

ID=85199413

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/052814 WO2023148372A1 (en) 2022-02-06 2023-02-06 A computer-implemented systems and methods for interactively measuring either or both sides of the interval of clear vision of the eye

Country Status (2)

Country Link
ES (1) ES2947436A1 (en)
WO (1) WO2023148372A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5596379A (en) * 1995-10-24 1997-01-21 Kawesch; Gary M. Portable visual acuity testing system and method
WO2006029048A2 (en) * 2004-09-03 2006-03-16 Panaseca, Inc. Vision center kiosk
US20120212706A1 (en) * 2011-02-23 2012-08-23 Brian Chou Method and System for Self-Administering a Visual Examination Using a Mobile Computing Device
JP2015103991A (en) * 2013-11-26 2015-06-04 パナソニックIpマネジメント株式会社 Image processing apparatus, method and computer program
US9704216B1 (en) * 2016-08-04 2017-07-11 Le Technology Dynamic size adjustment of rendered information on a display screen
WO2019099952A1 (en) * 2017-11-17 2019-05-23 Oregon Health & Science University Smartphone-based measurements of the refractive error in an eye
EP3730038A1 (en) * 2019-04-25 2020-10-28 VisionApp Solutions S.L. A computer-implemented method and system for interactively measuring ocular refractive errors, addition and power of reading glasses

Non-Patent Citations (11)

* Cited by examiner, † Cited by third party
Title
BANNON RE: "A study of astigmatism at the near point with special reference to astigmatic accommodation", AM J OPTOM ARCH AM ACAD OPTOM, vol. 23, 1946, pages 53 - 75
BORISH IM: "Clinical refraction", 1970, PROFESSIONAL PRESS
BULLIMORE MA, BOYD T, MATHER HE, GILMARTIN B: "Near retinoscopy and refractive error", CLINICAL AND EXPERIMENTAL OPTOMETRY, vol. 71, no. 4, 1988, pages 114 - 118
BULLIMORE MA, FUSARO RE, ADAMS CW: "The repeatability of automated and clinician refraction", OPTOMETRY AND VISION SCIENCE: OFFICIAL PUBLICATION OF THE AMERICAN ACADEMY OF OPTOMETRY, vol. 75, no. 8, 1998, pages 617 - 622
CHARMAN WN: "Wavefront aberration of the eye: a review", OPTOM VIS SCI, vol. 68, 1991, pages 574 - 583, XP009034665
DUANE A: "Studies in Monocular and Binocular Accommodation, with Their Clinical Application", TRANSACTIONS OF THE AMERICAN OPHTHALMOLOGICAL SOCIETY, vol. 20, 1922, pages 132 - 57
HEATH GG: "The influence of visual acuity on accommodative responses of the eye", AM J OPTOM ARCH AM ACAD OPTOM, vol. 33, 1956, pages 513 - 524
JACKSON E: "Amplitude of Accommodation at Different Periods of Life", CALIFORNIA STATE JOURNAL OF MEDICINE, vol. 5, no. 7, 1907, pages 163 - 6
LOPEZ-GIL N, PEIXOTO-DE-MATOS SC, THIBOS LN, GONZALEZ-MEIJOME JM: "Shedding light on night myopia", JOURNAL OF VISION, vol. 12, no. 5, 2012, pages 1 - 9
SIVAK JG, MANDELMAN T: "Chromatic dispersion of the ocular media", VISION RES, vol. 22, 1982, pages 997 - 1003, XP025697100, DOI: 10.1016/0042-6989(82)90036-0
THIBOS LN, YE M, ZHANG X, BRADLEY A: "The chromatic eye: a new reduced-eye model of ocular chromatic aberration in humans", APPL OPT, vol. 31, 1992, pages 3594 - 3600, XP001206892, DOI: 10.1364/AO.31.003594

Also Published As

Publication number Publication date
ES2947436A1 (en) 2023-08-09

Similar Documents

Publication Publication Date Title
US11666211B2 (en) Computerized testing and determination of a visual field of a patient
US20230118575A1 (en) Computerized refraction and astigmatism determination
US20220151488A1 (en) Computer-implemented method and system for interactively measuring ocular refractive errors, addition and power of reading glasses
CN112689470B (en) Method for performing a test of the power of a scattered light using a computing device and corresponding computing device
CN112399817A (en) Measuring refraction of eye
CN111699432B (en) Method for determining the power of an eye using an immersive system and electronic device therefor
JP2023531694A (en) Subjective refraction system
WO2023148372A1 (en) A computer-implemented systems and methods for interactively measuring either or both sides of the interval of clear vision of the eye
CN116056623B (en) Method and device for determining at least one astigmatic effect of at least one eye

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23703754

Country of ref document: EP

Kind code of ref document: A1