WO2019099952A1 - Smartphone-based measurements of the refractive error in an eye - Google Patents


Info

Publication number
WO2019099952A1
Authority
WO
WIPO (PCT)
Prior art keywords
computing device
mobile computing
subject
duochrome
computer
Prior art date
Application number
PCT/US2018/061694
Other languages
French (fr)
Inventor
David Huang
David Brownell
Original Assignee
Oregon Health & Science University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oregon Health & Science University filed Critical Oregon Health & Science University
Publication of WO2019099952A1 publication Critical patent/WO2019099952A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/02: Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B3/028: Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing visual acuity; for determination of refraction, e.g. phoropters
    • A61B3/032: Devices for presenting test symbols or characters, e.g. test chart projectors
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality

Definitions

  • Embodiments herein relate to the field of digital health and, more specifically, to a smartphone-based system for determining the refraction of a subject's eye.
  • Measuring the subjective refraction (also called manifest refraction) of a person is usually performed with a phoropter and an eye chart; the refraction is used to prescribe eyeglasses.
  • While a basic eye chart is simple and inexpensive, the phoropter is a relatively complicated assembly of lenses that is only available in an eye clinic or optical shop.
  • RDR: Rapid Deductive Refraction
  • the method may include any and all of the following steps: initiating, using the mobile computing device, an eye exam for a subject; presenting to the subject, using a screen of the mobile computing device, one or more duochrome target images at defined distances that correspond to one or more diopter powers; querying, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome target images; entering and/or storing the results of the query in memory of the mobile computing device; and using the results of the query to determine a refractive value of an eye of the subject.
  • the one or more duochrome target images are a sequence of duochrome images selected from a decision tree based on the subjective responses from the subject.
  • the subjective responses from the subject are at least in part used to select one of a plurality of predefined decision trees for final refraction determination.
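As a hedged sketch of how subjective responses might walk such a decision tree: the `Node` layout, the 'red'/'green' response labels, and every distance and diopter value below are illustrative assumptions, not values taken from this disclosure.

```python
class Node:
    """One step in an illustrative RDR decision tree."""
    def __init__(self, distance_m, result=None, on_red=None, on_green=None):
        self.distance_m = distance_m  # where to present the duochrome target
        self.result = result          # refractive value (diopters) if a leaf
        self.on_red = on_red          # next node if the red field looks bolder
        self.on_green = on_green      # next node if the green field looks bolder

def run_rdr(root, ask):
    """Walk the tree; ask(distance_m) returns 'red' or 'green'."""
    node = root
    while node.result is None:
        response = ask(node.distance_m)
        node = node.on_red if response == 'red' else node.on_green
    return node.result

# Illustrative tree for a low-myopia starting point (distances in meters).
tree = Node(2.0,
            on_red=Node(4.0, on_red=Node(None, result=-0.25),
                             on_green=Node(None, result=-0.375)),
            on_green=Node(1.0, on_red=Node(None, result=-0.75),
                               on_green=Node(None, result=-1.0)))
```

Each subjective response simply selects the next branch, so the whole exam is a short sequence of darker/bolder judgments rather than chart reading.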
  • the duochrome target image comprises a duochrome chart with a range in optotypes or pattern size corresponding to a range in visual angles.
  • the range in optotypes is from logarithm of minimum angle of resolution (logMAR) 0.0 (20/20) to logMAR 0.50 (20/63).
  • apparatuses, systems, and non-transitory computer-readable storage media further include soliciting, from the subject, a response of 'darker,' 'bolder,' or 'sharper' in response to the query.
  • apparatuses, systems, and non-transitory computer-readable storage media further include: monitoring, using a distance-sensing module of the mobile computing device, a distance from the subject to the mobile computing device to set a correct distance for each of the duochrome target images at the defined distances.
  • apparatuses, systems, and non-transitory computer-readable storage media further include: monitoring, using a distance-sensing module of the mobile computing device, a distance to the subject, and continuously adjusting an optotype size for a correct measurement.
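The continuous optotype-size adjustment can be sketched as follows, assuming the standard convention that a logMAR 0.0 (20/20) optotype subtends 5 arcminutes of visual angle; the function name is our own.

```python
import math

def optotype_height_mm(distance_m, logmar=0.0):
    """Physical optotype height that keeps a constant visual angle.

    A logMAR 0.0 (20/20) optotype subtends 5 arcminutes; each +0.1 logMAR
    step scales the angle by 10**0.1.
    """
    angle_arcmin = 5.0 * (10 ** logmar)
    angle_rad = math.radians(angle_arcmin / 60.0)
    return 2.0 * distance_m * math.tan(angle_rad / 2.0) * 1000.0  # millimeters
```

Because the height is linear in distance, halving the measured subject distance simply halves the rendered letter size, which is what lets the distance sensor drive the display continuously.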
  • the duochrome target image comprises a first color field and a second, different color field where the first color field has a color with a shorter wavelength light than a color of the second color field.
  • the duochrome target image includes a third background color that has a shorter wavelength than the first color field and the second color field to help minimize the subject's accommodation.
  • the duochrome target image further includes contrasting lines to help minimize the subject's accommodation.
  • a fogging lens with positive dioptric power is placed in front of a fellow eye to minimize the accommodative response.
  • a photo refractor is used to expand a refractive range of the mobile computing device.
  • the method is used to determine one or more of an astigmatism axis, an astigmatism power, a spherical refraction value, and a cylindrical refraction value.
  • apparatuses, systems, and non-transitory computer-readable storage media further include: entering, using the mobile computing device, a preliminary refraction value in memory of the mobile computing device, wherein the preliminary refraction value comprises a manifest refraction or an eye glass prescription value.
  • the sequence of duochrome images is selected from the decision tree using a bracketing search algorithm.
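One plausible reading of a 'bracketing search' over dioptric distances is a bisection in diopter space that narrows the red/green bracket around the duochrome neutrality distance; the bounds, tolerance, response convention, and simulated subject below are assumptions for illustration only.

```python
def bracket_dnd(ask, lo_diopter=0.25, hi_diopter=3.0, tol=0.25):
    """Bisect in diopter space toward the duochrome neutrality distance.

    ask(distance_m) returns 'red' or 'green' (which field looks bolder).
    Assumed invariant: red is bolder at the low-diopter (far) end and
    green at the high-diopter (near) end, so DND lies inside the bracket.
    """
    while hi_diopter - lo_diopter > tol:
        mid = (lo_diopter + hi_diopter) / 2.0
        if ask(1.0 / mid) == 'red':   # red bolder -> target still too far
            lo_diopter = mid
        else:                         # green bolder -> target too close
            hi_diopter = mid
    return 2.0 / (lo_diopter + hi_diopter)  # bracket midpoint, in meters

# Simulated 1.0 D myope: red looks bolder whenever the target is beyond
# the 1 m far point (i.e. fewer diopters of distance than the refraction).
ask = lambda d: 'red' if (1.0 / d) < 1.0 else 'green'
dnd_m = bracket_dnd(ask)
```

With a 0.25 D bracket tolerance the estimated refraction (the reciprocal of the returned distance) lands within a quarter diopter of the simulated eye.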
  • computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media further include determining visual acuity using the mobile computing device, by: presenting to the subject, using a screen of the mobile computing device, a low-contrast visual acuity chart; querying the subject as to a smallest sized optotype the subject can read correctly at the visual acuity test distance; entering and/or storing a response to the query in memory of the mobile computing device; and storing the visual acuity based on the visual acuity test distance and the smallest readable optotype size.
  • apparatuses, systems, and non-transitory computer-readable storage media further include determining an astigmatism axis using the mobile computing device, by: displaying, using the mobile computing device, an astigmatism axis test chart on a display of the mobile computing device, wherein the astigmatism axis test chart comprises a collection of lines orientated in different radial directions; querying the subject as to which line(s) on the astigmatism axis test chart appear the sharpest; and entering and/or storing a response to the query in memory of the mobile computing device thereby determining the astigmatism axis.
  • computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media further include determining an astigmatism magnitude using the mobile computing device by:
  • the mobile computing device is in communication with a network.
  • the network is a telecommunications network.
  • apparatuses, systems, and non-transitory computer-readable storage media include initiating, by the mobile computing device, a telemedicine session.
  • Figure 1 is a schematic representation of an example mobile computing device for conducting an eye exam, in accordance with disclosed embodiments.
  • Figure 2 is a flow chart showing an example method for conducting an eye exam using a mobile computing device, in accordance with disclosed embodiments.
  • Figure 3 is a schematic representation of an example mobile computing device displaying a duochrome eye chart, in accordance with disclosed embodiments.
  • Figure 4 is a half-clock eye chart to test the astigmatism axis, in accordance with disclosed embodiments.
  • Figure 5 is a schematic representation of an example mobile computing device displaying a duochrome eye chart to test the plus-axis far-point, in accordance with disclosed embodiments.
  • Figure 6 is a schematic representation of an example mobile computing device displaying a duochrome eye chart to test the minus-axis far-point, in accordance with disclosed embodiments.
  • Figure 7 is a schematic diagram of a networked mobile computing device for conducting an eye exam, in accordance with embodiments herein.
  • Figure 8 is an example of a Rapid Deductive Refraction (RDR) flow chart used for patients over 45 years old, for measuring the refractive error in an eye with a starting-point refraction of low myopia, in accordance with disclosed embodiments.
  • Figure 9 is an example of a flow chart for younger patients who may be accommodating, and shows the revised Duochrome Neutral Distance logic, in accordance with disclosed embodiments.
  • Figure 10 is a table showing initial pilot study results for the RDR method and approaches to minimize accommodation in younger patients.
  • the RDR method took between 20 and 60 seconds and was able to measure 83% of patients within a range of +/- 0.5 D, including 71% of younger patients within +/- 0.5 D.
  • Figure 11 is a flow chart outlining a method of Rapid Deductive Refraction (RDR).
  • DND: duochrome neutrality distance; estimates spherical equivalent refraction.
  • Far LCVA: low-contrast visual acuity test at a far distance (e.g., 4 m, 6 m, or 3 m).
  • Near LCVA: low-contrast visual acuity test at a near distance (e.g., 0.67 m, 1 m, or 0.5 m).
  • LCVA*: low-contrast visual acuity test at the DND.
  • Figure 12 is an example of a 4 meter 25% low-contrast visual acuity chart.
  • Coupled may mean that two or more elements are in direct physical contact.
  • Coupled may also mean that two or more elements are not in direct contact with each other, but yet still cooperate or interact with each other.
  • a phrase in the form "A/B" or in the form "A and/or B" means (A), (B), or (A and B).
  • a phrase in the form "at least one of A, B, and C" means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B and C).
  • a phrase in the form "(A)B" means (B) or (AB); that is, A is an optional element.
  • Rapid Deductive Refraction is used in a stepwise fashion to determine and/or deduce the refractive error in the eye, for example by reaching a duochrome neutrality distance (DND) endpoint.
  • responses by the subject can be used to deduce the refractive error in the eye.
  • the RDR method places images at comfortable and natural distances and uses subjective responses from the subject, such as 'darker' and 'bolder', as decision criteria. As discussed in detail below, these subjective responses are easier to decide upon than, for example, reading the characters off of a traditional eye chart. In addition, as depicted in Figures 8 and 9, these subjective responses are used to guide the test down predetermined decision trees to a final determination of refractive error in the eye.
  • the disclosed RDR method uses a stepwise and predefined series of dioptric distances and presents a duochrome image for a subjective response to deduce the refractive error in the eye.
  • these subjective responses are not the blur limit at a maximum distance but are rather comparative differences in the darkness or boldness of the characters on different colored backgrounds (see, for example, Figures 3 and 10). These comparative differences are indicative of the refractive power in the eye at each step of the test, and the sequence of response determines the refractive power within a narrow range of diopters (see, for example, the flowcharts given in Figures 8 and 9).
  • the methods disclosed herein drive the test toward the Duochrome Neutrality Distance (DND) for a final result, which can reduce both eyestrain and discomfort for the subject undergoing the test.
  • MDBA: Maximum Distance of Best Acuity
  • the disclosed methods do not require that a subject get a 'right' or 'correct' answer on which small line/character they are looking at, but rather simplify the process of an eye exam by asking a subject to compare characters of multiple sizes on different backgrounds for darkness and boldness. Getting it 'right' between an 'E' and an 'F' (for example) at some blurred distance isn't the end point; rather, it's letting the refractive power of the eye (which is different for red and green colors, as discussed below) distinguish the darkness and boldness of the duochrome target image at each predefined distance to guide the results to the final refraction.
  • the result is that the disclosed methods provide for an eye exam that is both simpler and more rapid than other methods that rely on MDBA, for example as described in U.S. Patent No. 9,549,669.
  • the novel methods disclosed herein start with the general principle of refractive measurements of the eye in terms of diopters, and the associated distances from the eye that correspond to diopters of power.
  • a subject’s specific refraction can be deduced from this general principle using a sequence of predefined distances and a duochrome target image to solicit subjective responses from the subject.
  • the subject's specific responses are used in the method to guide the eye test down one of several predefined paths toward the final refraction determination (see, for example, Figures 8 and 9).
  • this approach is fundamentally different from the traditional and common method of pushing the images out to the blur limit of the subject’s vision to measure visual acuity.
  • the disclosed methods, devices and systems determine the refraction of a subject's eye(s) based on a Rapid Deductive Refraction (RDR) stepwise method, using a mobile computing device, such as an appropriately programmed smart phone.
  • an appropriately programmed mobile computing device can be used to determine the spherical equivalent far-point vision, visual acuity, an astigmatism axis, and the astigmatism magnitude for one or both eyes.
  • Figure 11 describes an exemplary method of determining a subject's refraction, for example without prior knowledge of their approximate refractive error.
  • the method can begin with using a photoscreener, such as the GoCheck device (see, for example, US patent No 9,380,938, which is hereby incorporated herein by reference) to make an initial measurement on the refractive error of an eye of the subject.
  • the test, methods, apparatus, etc. can be used on both eyes of the subject.
  • a photoscreener is able to measure moderate to severe myopia and hyperopia, but cannot detect low myopia and low hyperopia within a range generally referred to as the null zone.
  • the RDR method as disclosed herein is particularly useful when the refraction falls within this null zone.
  • the RDR method begins with one or more visual acuity tests to distinguish myopia, emmetropia, and/or hyperopia.
  • a low-contrast visual acuity chart is preferentially used for the best sensitivity (see, for example, Figure 12), but a regular high-contrast acuity chart can also be used, for example as displayed on a display of a mobile computing device. If the subject eye has normal acuity (equal to or better than 20/20, or logarithm of minimum angle of resolution [logMAR] 0.0) at the far distance (e.g., 4 m), then the eye is approximately emmetropic (neither hyperopic nor myopic). If, however, the subject eye is worse than logMAR 0.0, then the acuity test is repeated at a near distance. If the near acuity (i.e., at the near distance) is better than the far acuity (i.e., at the far distance), then the eye is myopic (nearsighted). Conversely, if the far acuity is better than the near acuity, then the eye is hyperopic (farsighted). As an alternative to the near acuity test, a plus lens (e.g., +3.0 D) may be placed over the eye to repeat the far acuity test.
  • if acuity improves with the plus lens, the eye is hyperopic. If the subject is nearsighted, then the Rapid Deductive Refraction (RDR) stepwise method is continued using the duochrome tests as disclosed herein to determine the spherical equivalent refraction. If the subject is hyperopic, then the RDR stepwise method may be performed using a +3 D lens in front of the test eye.
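The acuity-based branching described above can be sketched as follows (logMAR convention: lower is better; the function and return labels are our own):

```python
def classify_eye(far_logmar, near_logmar=None):
    """Classify an eye from far and (optional) near acuity, per the text.

    logMAR 0.0 corresponds to 20/20; larger logMAR means worse acuity.
    """
    if far_logmar <= 0.0:
        return 'emmetropic'         # normal acuity at the far distance
    if near_logmar is None:
        return 'retest at near'     # far acuity alone cannot classify
    if near_logmar < far_logmar:    # sees better up close
        return 'myopic'
    return 'hyperopic'              # sees better far away
```

The 'retest at near' branch corresponds to repeating the acuity test at the near distance (or with the +3.0 D lens) before classifying.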
  • the RDR stepwise method establishes the duochrome neutrality distance (DND), which is the distance at which the optotypes with red or green backgrounds appear equally bold or sharp.
  • the DND is the bracketed distance in between a distance at which the optotype with red background appears sharper/bolder and a distance at which the optotype with the green background appears sharper or bolder. Either way, an endpoint is reached.
  • the DND is an estimate of the far point of the eye.
  • the spherical equivalent refraction of the eye is the reciprocal of the far point as estimated by the DND. For example, if the DND is 2 meters, then the spherical equivalent refraction is one divided by two meters, or 0.5 diopters. After the DND is established using the duochrome test on the mobile computing device, the visual acuity is again tested at the DND.
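The reciprocal relationship in the example can be expressed directly; the minus-sign convention for a myopic far point is our addition, since the text quotes magnitudes only.

```python
def spherical_equivalent_d(dnd_m):
    """Spherical equivalent refraction (diopters) from the DND (meters).

    SE = 1 / DND. A real far point in front of the eye indicates myopia,
    so the corrective prescription carries a minus sign (our convention).
    """
    return -1.0 / dnd_m

# A DND of 2 m corresponds to 0.5 D of myopia.
```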
  • the astigmatism axis is first determined using the half-clock chart. Once the astigmatism axis is determined, then the DND is determined for line targets oriented at the plus and minus axis orientations (see, for example, Figures 5 and 6). The astigmatism magnitude is determined by the difference between the refraction at the plus and minus axes, which are calculated by taking the reciprocal of the DND for line targets oriented at the plus and minus axis orientations.
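A minimal sketch of the astigmatism-magnitude calculation just described, with illustrative DND values:

```python
def astigmatism_magnitude_d(dnd_plus_m, dnd_minus_m):
    """Cylinder magnitude (diopters) from the two meridional DNDs (meters).

    Per the text: the difference between the refractions at the plus and
    minus axes, each refraction being the reciprocal of its DND.
    """
    return abs(1.0 / dnd_plus_m - 1.0 / dnd_minus_m)

# DNDs of 1 m and 2 m along the two axes give 1.0 - 0.5 = 0.5 D of cylinder.
```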
  • the spherical equivalent far-point is obtained by the use of a Rapid Deductive Refraction method for presenting a sequence of duochrome target images at predetermined distances.
  • determining an initial or final refractive value of an eye using the RDR method includes: accurately determining one or more of the refractive power, spherical equivalent, astigmatism axis, and astigmatism power in the eye with a predefined stepwise method using images displayed on a mobile computing device, such as on a smart phone, and soliciting subjective responses about the image quality from the subject, for example whether the image or portions of the image are 'darker' and/or 'bolder'.
  • the disclosed method includes providing, with the mobile computing device, short sequences of duochrome target images at predetermined distances.
  • the predetermined distances are separated by dioptric increments.
  • after presentation of the duochrome target images, for example after each duochrome target image, the subject is queried as to whether the characters presented in the duochrome image are, for example, darker and/or bolder on one of the two colored backgrounds of the duochrome target image.
  • the resulting subjective responses from the subject direct the test to the next appropriate test distance, for example, presentation of an image at either a greater or lesser predetermined distance, separated by dioptric increments.
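Because a test distance is the reciprocal of its dioptric power, steps that are even in diopters are uneven in meters; the specific step values below are illustrative only.

```python
# Hypothetical dioptric steps and their corresponding test distances (d = 1/D).
steps_diopters = [0.50, 0.75, 1.00, 1.50, 2.00]
distances_m = [1.0 / d for d in steps_diopters]
# Yields approximately 2.0, 1.33, 1.0, 0.67, and 0.5 meters.
```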
  • the disclosed method can be a single step, for example where the correct refraction is deduced from a single response, or can comprise multiple steps (see, for example, Figures 8 and 9).
  • Various techniques can be used to minimize accommodation in the eye under test such as a different color background and a fogging lens on the fellow eye.
  • the accommodative response of eyes is minimized using various methods including a different colored background in the target image, a fogging lens in front of the fellow eye, and/or rapid fluctuations between a highly blurred image and a clear image.
  • a blue background and/or black contrasting lines tend to relax accommodation in younger patients, because the eye refracts blue light more strongly than longer-wavelength light.
  • duochrome target images, for example, having 20/20-sized letters on the lower rows and larger sizes in the upper rows, so the patient can find the best readable target for determining darker or bolder in case their vision is significantly worse than 20/20 due to astigmatism or spherical refractive error (see, for example, the duochrome image in Figure 3).
  • the disclosed method can be used in combination with a photo screener device such as a smart phone to provide a broad measurement range for refractive error with acceptable accuracy.
  • the astigmatism axis is measured by a chart of radial lines displayed on the mobile computing device, for example as a clock chart or fraction thereof.
  • the astigmatism magnitude is measured by measuring the far-point of a duochrome eye chart of lines oriented to the axis of plus cylinder and another duochrome eye chart of lines oriented to the axis of minus cylinder.
  • the distance measurements needed for calculations during the eye exam are measured by a distance sensor built into or integrated with the mobile computing device.
  • a duochrome target image using red and green may also include a blue background to minimize accommodation (see, for example, Figure 3).
  • presenting a blurred image to the subject that quickly changes to a clearly defined image and then quickly back to a blurred image can minimize the accommodation of the subject's eye. This can happen multiple times in succession.
  • a +2 to +3 D lens can be placed in front of the fellow eye during the test of the other eye to minimize accommodation.
  • a photo refractor device, for example the GoCheck Kids mobile photoscreener, can be used to capture part of the refractive error range (see, for example, US patent No 9,380,938, which is hereby incorporated herein by reference).
  • the RDR technique can be used to test in the null zone of the photo screener to enable measurement through the entire refractive range.
  • the duochrome test (see Figure 3), as implemented in the disclosed methods, devices, and systems, to determine spherical equivalent far-point represents a novel aspect of this disclosure. Additionally, the use of a duochrome test provides a more precise refraction endpoint than the alternative of relying on the smallest size of target that can be seen by the subject, for example as described in the patent application by Lee and Dallek (US Patent Publication 2014/0268060 A1).
  • the duochrome test, as implemented in the disclosed methods, devices, and systems, to determine astigmatism magnitude also represents a novel approach.
  • the oriented lines had not been used on duochrome charts before to measure separate far-points along the plus and minus axes.
  • This innovation allows more precise measurement of astigmatism than relying on measurement of distortion as described in Lee and Dallek.
  • the use of a mobile computing device distance sensor to determine the distance from the displayed eye charts represents a novel advance that is both faster and more convenient than using a tape measure, as is typically employed.
  • adjusting the physical size of the optotype and eye chart according to test distance to maintain the same visual-angle size is yet another innovation that allows the eye chart to be presented at a wide range of test distances.
  • FIG. 1 illustrates a simplified diagram of a mobile computing device 100 for conducting an eye exam, in accordance with embodiments herein.
  • the mobile computing device comprises a smart phone, such as a commercially available smart phone, for example an iPhone®, Samsung Galaxy®, Nokia Lumia®, Motorola Droid®, and the like.
  • the smartphone 100 is an iPhone, for example an iPhone X.
  • the mobile computing device 100 includes a touch-screen display 110 and a distance measuring module 120.
  • the distance measuring module 120 includes an infrared dot projector 121 and an infrared camera 122.
  • mobile computing device 100 includes a number of components, such as one or more processors 140 and at least one communication module 142.
  • the one or more processors 140 each include one or more processor cores.
  • the at least one communication module 142 is physically and electrically coupled to the one or more processors 140.
  • the communication module 142 is part of the one or more processors 140.
  • mobile computing device 100 includes printed circuit board (PCB) 155.
  • the one or more processors 140 and communication module 142 are disposed thereon.
  • depending on its applications, mobile computing device 100 includes other components that may or may not be physically and electrically coupled to the PCB. These other components include, but are not limited to, a memory controller (not shown), volatile memory (e.g., dynamic random access memory (DRAM), not shown), non-volatile memory (not shown) such as read only memory (ROM), flash memory (not shown), an I/O port (not shown), a digital signal processor (not shown), a crypto processor (not shown), a graphics processor (not shown), one or more antennas (not shown), a touch-screen display 110, a touch-screen display controller (not shown), a battery (not shown), an audio codec (not shown), a video codec (not shown), a global positioning system (GPS) device (not shown), a compass (not shown), a mass storage device (not shown) such as a hard disk drive, a solid state drive, compact disk (CD), or digital versatile disk (DVD), a microphone 146, and so forth.
  • the one or more processors 140 are operatively coupled to system memory through one or more links (e.g., interconnects, buses, etc.).
  • system memory is capable of storing information that the one or more processors 140 utilizes to operate and execute programs and operating systems.
  • system memory is any usable type of readable and writeable memory such as a form of dynamic random access memory (DRAM).
  • DRAM dynamic random access memory
  • the mobile computing device 100 includes a microphone 146 configured to capture audio. In embodiments, the mobile computing device 100 includes a speaker 141 configured to transmit audio. In embodiments, mobile computing device 100 includes or is otherwise associated with various input and output/feedback devices to enable user interaction with the mobile computing device 100 and/or peripheral components or devices associated with the mobile computing device 100 by way of one or more user interfaces or peripheral component interfaces.
  • the user interfaces include, but are not limited to a physical keyboard or keypad, a touchpad, a display device (touchscreen or non- touchscreen), speakers, microphones, image sensors, haptic feedback devices and/or one or more actuators, and the like.
  • the mobile computing device can comprise a memory element (not shown), which can exist within a removable smart chip, such as a Subscriber Identity Module (SIM), or a secure digital ("SD") card, or which can be embedded within a fixed chip on the device.
  • the memory element may allow a software application to be resident on the device.
  • an I/O link connecting a peripheral device to a mobile computing device is protocol-specific, with a protocol-specific connector port that allows a compatible peripheral device to be attached to the protocol-specific connector port (i.e., a USB keyboard device would be plugged into a USB port, a router device would be plugged into a LAN/Ethernet port, etc.) with a protocol-specific cable.
  • Any single connector port would be limited to peripheral devices with a compatible plug and compatible protocol. Once a compatible peripheral device is plugged into the connector port, a communication link would be established between the peripheral device and a protocol-specific controller.
  • a non-protocol-specific connector port is configured to couple the I/O interconnect with a connector port of the mobile computing device 100, allowing multiple device types to attach to the mobile computing device 100 through a single physical connector port. Moreover, the I/O link between the mobile computing device 100 and the I/O complex is configured to carry multiple I/O protocols (e.g., PCI Express®, USB, DisplayPort, HDMI, etc.) simultaneously.
  • the connector port is capable of providing the full bandwidth of the link in both directions with no sharing of bandwidth between ports or between upstream and downstream directions.
  • the connection between the I/O interconnect and the mobile computing device 100 supports electrical connections, optical connections, or both.
  • the one or more processors 140, flash memory, and/or a storage device includes associated firmware storing programming instructions.
  • the communication module 142 enables wired and/or wireless communications for the transfer of data to and from the mobile computing device 100.
  • the mobile computing device 100 also includes a network interface configured to connect the mobile computing device 100 to one or more networked computing devices wirelessly via a transmitter and a receiver (or optionally a transceiver) and/or via a wired connection using a communications port.
  • wireless transmitter/receiver and/or transceiver may be configured to operate in accordance with one or more wireless communications standards.
  • the term "wireless" and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that communicate data through the use of modulated electromagnetic radiation through a non-solid medium.
  • the mobile computing device 100 includes a wireless communication module 142 for transmitting and receiving data, for example for transmitting and receiving data from a network, such as a telecommunications network.
  • for example, the wireless communication module 142 may operate in accordance with one or more wireless standards, such as Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), cdmaOne, CDMA2000, Evolution-Data Optimized (EV-DO), Enhanced Data Rates for GSM Evolution (EDGE), and the like.
  • the mobile computing device 100 directly connects with one or more devices via a direct wireless connection by using, for example, Bluetooth and/or BLE protocols, WiFi protocols, Infrared Data Association (IrDA) protocols, ANT and/or ANT+ protocols, LTE ProSe standards, and the like.
  • the communications port is configured to operate in accordance with one or more known wired communications protocols, such as a serial communications protocol (e.g., the Universal Serial Bus (USB), FireWire, Serial Digital Interface (SDI), and/or other like serial communications protocols), a parallel communications protocol (e.g., IEEE 1284, Computer Automated Measurement and Control (CAMAC)), a network protocol (e.g., Fiber Distributed Data Interface (FDDI)), and/or other like protocols.
  • the mobile computing device 100 is configured to run, execute, or otherwise operate one or more applications.
  • the applications include native applications, web applications, and hybrid applications.
  • the native applications are used for operating the mobile computing device 100, such as using a camera or other like sensor of the mobile computing device 100, cellular phone functionality of the mobile computing device 100, and other like functions of the mobile computing device 100.
  • native applications are platform or operating system (OS) specific or non-specific.
  • native applications are developed for a specific platform using platform-specific development tools, programming languages, and the like. Such platform-specific development tools and/or programming languages are provided by a platform vendor.
  • native applications are pre-installed on mobile computing device 100 during manufacturing, or provided to the mobile computing device 100 by an application server via a network.
  • Web applications are applications that load into a web browser of the mobile computing device 100 in response to requesting the web application from a service provider.
  • the web applications are websites that are designed or customized to run on a mobile computing device by taking into account various mobile computing device characteristics.
  • Web applications may be any server-side application that is developed with any server-side development tools and/or programming languages, such as PHP, Node.js, ASP.NET, and/or any other like technology that renders HTML.
  • Hybrid applications may be a hybrid between native applications and web applications.
  • Hybrid applications may be standalone applications, skeletons, or other like application containers that may load a website within the application container.
  • Hybrid applications may be written using website development tools and/or programming languages, such as HTML5, CSS, JavaScript, and the like.
  • hybrid applications use a browser engine of the mobile computing device 100, without using a web browser of the mobile computing device 100, to render a website’s services locally.
  • hybrid applications also access mobile computing device capabilities that are not accessible in web applications, such as the accelerometer, camera, local storage, and the like.
  • the computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium.
  • the computer-readable medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a transmission media such as those supporting the Internet or an intranet, or a magnetic storage device.
  • the computer-usable or computer-readable medium can even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
  • a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave.
  • the computer usable program code may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc.
  • Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages.
  • the program code may execute entirely on the user's mobile computing device, partly on the user's mobile computing device, as a stand-alone software package, partly on the user's mobile computing device and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's mobile computing device through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external mobile computing device (for example, through the Internet using an Internet Service Provider), or a wireless network, such as described above.
  • example embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof.
  • the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium.
  • a code segment may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, program code, a software package, a class, or any combination of instructions, data structures, program statements, and the like.
  • an article of manufacture may be employed to implement one or more methods as disclosed herein.
  • the article of manufacture may include a computer-readable non-transitory storage medium and a storage medium.
  • the storage medium may include programming instructions configured to cause an apparatus to practice some or all aspects of a method of conducting an eye exam using a mobile computing device, in accordance with embodiments of the present disclosure.
  • the storage medium may represent a broad range of persistent storage medium known in the art, including but not limited to flash memory, optical disks or magnetic disks.
  • the programming instructions, in particular, may enable an apparatus to practice some or all aspects of a method of conducting an eye exam using a mobile computing device, in accordance with embodiments of the present disclosure.
  • Figure 2 depicts a computer-implemented method for conducting an eye exam using a mobile computing device, in accordance with example embodiments of the present disclosure.
  • the methods disclosed herein are preferably performed by a human operator or user in addition to the human subject, although in certain situations the operator and the subject can be the same individual; in that case, that individual serves as both the subject and the operator or user.
  • the method 200 is described with reference to the components, flowcharts, and eye charts illustrated in Figures 1 and 3-12.
  • an optional preliminary refraction is determined.
  • a preliminary refraction is obtained, for example using a smartphone-based photorefraction as described in US Patent 9,380,938, which is hereby incorporated herein by reference in its entirety.
  • a previous manifest refraction or an eyeglass prescription value is used as the preliminary refraction.
  • a preliminary refraction can be determined using low-contrast visual acuity testing with a smartphone programmed with low-contrast acuity images (see, for example, Figure 12).
  • the preliminary refraction is entered into the mobile computing device 100, and/or stored in memory of the mobile computing device 100, for example via the mobile computing device touch-screen display 110. If no preliminary refraction is available, then -0.25 diopters (D) can be entered into the memory of the mobile computing device 100 and/or defaulted as a starting value for further analysis. Other values for the default preliminary refraction can also be used without departing from the disclosure. Values for one or both eyes can be entered, defaulted, or otherwise determined. For each eye undergoing evaluation, the preliminary refraction may be converted, using the mobile computing device 100 processor 140, into the spherical equivalent power by the formula SE = Sph + Cyl/2, where SE is the spherical equivalent power in diopters, Sph is the spherical component of refraction in diopters, and Cyl is the magnitude of the cylinder component of the refraction in diopters.
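The spherical-equivalent conversion above can be sketched as a small helper. Python is used here purely for illustration (the disclosure does not specify an implementation language), and the function name is hypothetical:

```python
def spherical_equivalent(sph: float, cyl: float) -> float:
    """Spherical equivalent power in diopters: SE = Sph + Cyl/2."""
    return sph + cyl / 2.0

# Worked example consistent with the disclosure: Sph = -1.25 D, Cyl = -1.00 D
print(spherical_equivalent(-1.25, -1.00))  # -1.75
```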
  • the SE is determined using the RDR method of stepwise subjective evaluation of visual response to selected duochrome images at predefined test distances to determine the final DND distance, for example as outlined in the decision trees as shown in Figures 8 and 9.
  • the patient’s response to characters presented on different colored backgrounds as darker or bolder leads to a rapid determination of the spherical equivalent power in the test eye by following a predefined series of test steps and distances.
  • the SE is determined by calibrated photo refraction to objectively measure the refractive power in the eye.
  • the SE is further converted, using the mobile computing device 100 processor 140, to the preliminary far-point using the formula FP = -1/SE, where FP is the far-point distance in meters (or, in some embodiments, another unit of distance measure).
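The far-point conversion can likewise be sketched (a hypothetical helper; the conversion FP = -1/SE is inferred from the worked examples and Table 1, e.g. -2.50 D corresponding to 0.40 m):

```python
def far_point_m(se: float) -> float:
    """Far-point distance in meters from a myopic spherical equivalent (D)."""
    if se >= 0:
        raise ValueError("conversion applies to myopic (negative) SE values")
    return -1.0 / se

print(round(far_point_m(-2.50), 2))  # 0.4
```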
  • the refractive measurement can be further refined using the RDR method for astigmatism power and axis to separate the spherical power and astigmatic power from the initial SE measurement.
  • the astigmatism power can be quantified using two or more axes of photo refraction measurement to determine the plus and minus distances for astigmatism power calculation, and this can be used to break out the spherical power from SE.
  • the mobile computing device 100 display 110, such as a touch-screen display, is placed facing the subject.
  • the operator may hold the mobile computing device 100 display 110 ( Figure 1 ) facing the test subject.
  • the operator enters data using the mobile computing device 100.
  • the methods of the current disclosure can also be performed by the subject alone.
  • the mobile computing device 100 would be positioned facing the subject, for example on a stand, table, or other appropriate place.
  • the subject would enter responses into the mobile computing device by voice command and/or recognition, for example as received by microphone 146.
  • the operator could enter the responses by voice command, for example as received by microphone 146.
  • the left eye is closed or occluded with a hand, occluder, or patch.
  • the right eye is closed or occluded with a hand, occluder, or patch.
  • Directions as to which eye is being tested can be communicated to the subject and/or operator by visual or auditory signals, for example with display 110 and/or speaker 141.
  • the visual acuity is tested. After the SE far-point is established, the visual acuity is tested using the mobile computing device 100 at that distance. To test the visual acuity of the subject, the size of the optotype is decreased by the processor 140 of the mobile computing device 100 until the subject cannot read the letters anymore. The visual acuity is determined by the smallest optotype that the subject can read correctly, and is measured in 0.1 logMAR (logarithm of minimum angle of resolution) units. In embodiments, the size of the optotype 151 displayed on the touch-screen display 110 can be increased by tapping the up-arrow 152 or decreased by tapping the down arrow 153. In embodiments, this is done automatically by the processor 140, or by oral indication.
  • a test distance between 4 m and 0.33 m, as shown in Table 1, limits the range of refraction to between -0.25 and -3.00 D for testing eyes without the aid of glasses.
  • testing as above can be done using a photo refractor such as one on a mobile device or in an optometry clinic, or over existing eyeglasses to measure the residual refractive error.
  • the prescription of the existing eyeglasses and the residual refractive error can then be added. If the subject does not have existing glasses, then standard test glasses with spherical corrections can be used.
  • the +3.00 D test glasses should be put on the subject to bring the preliminary refraction to -1.00 D (1.00 m) as the starting point of testing.
  • the recommended test glasses power according to the preliminary refraction SE is given in Table 2, as can be stored in the memory of mobile computing device 100 and queried by processor 140.
  • test range as shown in Table 1 and 2 can be changed to 2.50 D instead of 3.00 D.
  • the smallest test increment can be changed to 1/4 D instead of 1/8 D, etc.
  • the duochrome chart 130 preferably utilizes optotype sizes that are near the limit of normal visual acuity, such as 20/20 (logMAR 0.0) or 20/25 (logMAR 0.1). Since the test distance is varied by the processor 140 according to methods of the present disclosure, the optotype size is adjusted by the processor to maintain the same size in terms of visual angle.
  • the standard 20/20 optotype is approximately 5 minutes of arc in visual angle and used at 20 feet (6 m) distance.
  • the physical optotype size in millimeters is calculated to maintain the same visual angle size over the range of test distances (Table 3 as can be stored in the memory of mobile computing device 100 and queried by processor 140).
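The optotype scaling just described follows directly from the 5-arcmin convention; a sketch of the computation (Python for illustration; the function name and logMAR scaling convention are assumptions consistent with the text):

```python
import math

def optotype_height_mm(distance_m: float, logmar: float = 0.0) -> float:
    """Physical optotype height (mm) subtending the standard visual angle.

    A 20/20 (logMAR 0.0) optotype subtends 5 minutes of arc; each 0.1
    logMAR step scales the angle by 10**0.1.
    """
    angle_rad = math.radians((5.0 / 60.0) * 10 ** logmar)
    return 2.0 * distance_m * math.tan(angle_rad / 2.0) * 1000.0

# Standard 20/20 letter at 20 feet (6 m): about 8.7 mm tall
print(round(optotype_height_mm(6.0), 1))
```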
  • the test distance is preferably monitored by the distance sensor 120 (see Figures 1 and 3). If the subject is too far or too close, instructions are provided by the mobile computing device 100 processor 140 to “move closer” or “move away” using speaker 141. However, the current disclosure can still be practiced even if the mobile computing device 100 does not have a distance sensor. In this case, a tape measure with distance markings that correspond to the dioptric steps shown in Table 1 can be used. In embodiments, the operator looks at the mobile computing device 100 touch-screen display 110 for the distance setting 150 (Figure 3) and then walks to the position so that the mobile computing device 100 touch-screen display 110 is at the correct distance from the subject’s eyes.
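The distance-feedback behavior can be sketched as a small decision helper (hypothetical logic and tolerance value, not specified by the disclosure):

```python
def distance_prompt(measured_m: float, target_m: float, tol_m: float = 0.02) -> str:
    """Turn a distance-sensor reading into a cue for the operator or subject."""
    if measured_m > target_m + tol_m:
        return "move closer"
    if measured_m < target_m - tol_m:
        return "move away"
    return "stop"  # correct distance reached; a chime or visual cue may signal this
```

For example, with a 0.57 m target, a reading of 0.70 m yields "move closer" and 0.50 m yields "move away".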
  • the mobile computing device 100 is used to determine a final far- point distance and the processor 140 converts the results of this test to the spherical equivalent refraction, for example using equations 1 and 2.
  • a final far-point distance is tested and/or determined using a duochrome eye chart 130, such as shown in Figure 3 and/or Figure 10, as displayed on the display 110 of the mobile computing device 100.
  • the duochrome test is based on the principle that the eye refracts shorter wavelength light (e.g. green) more strongly than longer wavelength light (e.g. red).
  • Myopia is caused by the refractive power of the eye being too strong. Therefore, an uncorrected or undercorrected myopic eye would see the green half of the chart as less dark, bold, or clear, and the red half of the chart as darker, bolder, or sharper. Given this principle, other color pairs can be used, for example red and blue. As shown in Figure 3, the chart halves 131 and 132 are labeled with large bold symbols, such as the numbers 1 and 2, so that the subject can report these numbers even if they are colorblind and cannot distinguish red from green in the red/green implementation. Note that the duochrome principle is based on the wavelength dependence of refractive index in the media of the eye and therefore is not affected by the photopigment deficit that causes color blindness.
  • a third color can be used for the background in the chart to help minimize the accommodative response and improve accuracy as described previously (see, for example, Figure 3).
  • a fogging lens can be used on the fellow eye to also minimize accommodative response, and these techniques can be used together. Fogging of the eye contralateral to the test is used to help relax accommodation and improve the accuracy of RDR. This is particularly helpful for non-presbyopic subjects (usually those younger than 45 years of age). Fogging is preferably done by placing a lens over the fellow eye that is more plus than the lens used in the test eye. For example, in the most common situation when the test eye has low myopia, then no lens is put over the test eye and a +3D lens is placed over the fellow eye.
  • a +3D lens is placed over the test eye and a +6 lens is placed over the fellow eye.
  • a -3D lens is placed over the test eye and no lens is placed over the fellow eye.
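The fogging-lens examples above all follow one rule: the fellow-eye lens is 3 D more plus than the test-eye lens. A minimal sketch (hypothetical helper; the 3 D offset is taken from the examples in the text):

```python
def fogging_lens_d(test_lens_d: float, offset_d: float = 3.0) -> float:
    """Fellow-eye fogging lens power: a fixed amount more plus than the test lens."""
    return test_lens_d + offset_d

# Examples from the disclosure:
print(fogging_lens_d(0.0))   # 3.0  (no test lens -> +3 D over the fellow eye)
print(fogging_lens_d(3.0))   # 6.0  (+3 D test lens -> +6 D fellow lens)
print(fogging_lens_d(-3.0))  # 0.0  (-3 D test lens -> no fellow lens)
```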
  • the duochrome chart 130 is first displayed on the mobile computing device 100 display 110 at the far-point distance FP matching the preliminary SE. Because the SE is typically specified in 1/8 D increments, the FP can be determined using the conversion chart in Table 1, which can be contained and/or stored within memory of the mobile computing device 100 and queried by processor 140. Table 1 starts from the distance of 4 meters because a longer distance is typically not practical due to consideration of room size and display size. This would also be the default distance if no preliminary refraction was available. However, this distance and the associated table can be modified or augmented for longer distances as needed. Table 1 ends at 0.32 m because a closer distance is difficult to maintain accurately and also due to display resolution considerations. However, this distance and the associated table can be modified or augmented for shorter distances as needed. [0089] Table 1. Conversion between spherical equivalent power and far-point distance
  • Table 1 can be interrogated by the processor 140 of the mobile computing device 100 as follows. In this example, assume as a starting point that the subject’s previous eyeglasses refraction was -1.25 D Sph, -1.00 D Cyl; the spherical equivalent is then SE = -1.25 + (-1.00)/2 = -1.75 D.
  • the duochrome chart 130 would then be displayed initially at a 0.57 m distance.
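Since Table 1 is simply FP = -1/SE tabulated in 1/8 D steps, its mapping can be regenerated programmatically. A sketch (the rounding convention is an assumption inferred from the worked examples, e.g. -1.75 D -> 0.57 m):

```python
def se_to_fp_table(start: float = -0.25, stop: float = -3.00, step: float = 0.125):
    """Map spherical equivalent (D) to far-point distance (m) via FP = -1/SE."""
    table = {}
    se = start
    while se >= stop - 1e-9:
        table[round(se, 3)] = round(-1.0 / se, 2)
        se -= step
    return table

table = se_to_fp_table()
print(table[-0.25], table[-1.75], table[-2.5])  # 4.0 0.57 0.4
```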
  • the mobile computing device 100 continually monitors the test distance between the mobile computing device 100 and the subject eyes using the distance sensing module 120.
  • the mobile computing device 100 is prompted by the processor 140, based on the distance measurement returned by the distance sensing module 120, to give either a visual or audio cue as to whether the subject is at the correct distance.
  • for example, a chime, or a voiced command such as “move closer” or “move away,” through its speaker 141 (see, for example, Figure 3).
  • a visual signal, such as colors or other graphics, can indicate to the operator or subject to “move closer” or “move away” until the distance between the mobile computing device 100 and the subject’s eye(s) is correct, at which point a chime (or other sound) or visual cue is used to signal the operator to stop.
  • the mobile computing device 100 is prompted by the processor 140 to query the subject as to which chart looks darker, bolder, or sharper, for example, “which looks darker, bolder, or sharper, the letters on the red chart or the green chart?” This can be done visually or audibly.
  • the operator can double-tap or otherwise indicate the chart 131 or 132 ( Figure 3) that appears darker, bolder, or sharper to the subject on the mobile computing device 100 display 110.
  • the operator can click, double-tap, or otherwise indicate the chart 131 or 132 ( Figure 3) that appears less dark, bold, or sharp to the subject on the mobile computing device 100 display 110.
  • the choice can be made by oral indication, for example by speaking such that voice recognition software on the mobile computing device 100 receives the indication, for example via microphone 146.
  • if the subject indicates he or she cannot see the letters at all, then the operator may tap the “cannot see” button 133 on the screen or make this oral indication as described above.
  • the duochrome chart 130 and optotype within can be made larger by tapping the up-arrow 152 or made smaller by tapping the down arrow 153. If the subject can see the optotype and still indicate that the red chart appears sharper at the closest distance, then test glasses with stronger minus or weaker plus (Table 2) should be used and the far-point test should be restarted at the furthest distance (i.e. 4 m).
  • the subject says characters on the red chart 131 are darker, bolder, or sharper at 0.57 m (-1.75 D), so the processor 140 of the mobile computing device 100 changes the test distance to 0.44 m (-2.25 D, an increment of -0.50 D from -1.75 D), using Table 1.
  • the subject says the characters on the red chart 131 appear darker, bolder, or sharper, so the processor 140 of the mobile computing device 100 changes the test distance to 0.36 m (-2.75 D), using Table 1.
  • the processor 140 of the mobile computing device 100 determines FP is 0.40 m and the SE is therefore -2.50 D.
  • the FP is defined as the furthest distance at which characters on the red and green halves of the duochrome chart 130 appear equally dark, bold, or sharp to the subject. This result can be an accurate measure of the patient’s spherical equivalent power in the eye being tested.
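The stepwise search in the example above can be sketched as follows. This is an illustrative halving search, not a reproduction of the exact decision trees in Figures 8 and 9, and the callback name is hypothetical:

```python
def rdr_far_point_se(initial_se: float, ask, step: float = 0.50,
                     min_step: float = 0.25) -> float:
    """Illustrative sketch of the RDR spherical-equivalent search.

    ask(se) presents the duochrome chart at distance -1/se meters and
    returns 'red', 'green', or 'equal' depending on which half the
    subject reports as darker, bolder, or sharper.
    """
    se = initial_se
    while step >= min_step:
        answer = ask(se)
        if answer == 'equal':
            break
        # red sharper -> eye more myopic than tested -> step more minus
        se += -step if answer == 'red' else step
        step /= 2.0
    return se

# Simulated subject whose true SE is -2.50 D
def ask(se, true_se=-2.50):
    if abs(se - true_se) < 1e-9:
        return 'equal'
    return 'red' if se > true_se else 'green'

print(rdr_far_point_se(-1.75, ask))  # -2.5
```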
  • the astigmatism axis is determined using the mobile computing device 100. Determination of the astigmatism axis comes after the SE far-point measurement and is typically tested at a distance further than the SE far-point to introduce myopic blur. The preferred additional distance is 1 D equivalent, but this is not critical; 1.50 D or 0.50 D can also be used. So, to continue the example above where the far-point was found to be 0.40 m corresponding to -2.50 D (Table 1), the axis should be tested at 0.67 m (-1.50 D, 1 D further than -2.50 D).
  • the astigmatism axis is tested using an astigmatism axis test chart on a touch-screen display of the mobile computing device, wherein the astigmatism axis test chart comprises a collection of lines orientated in different radial directions, for example the half-clock test chart 160 displayed on the mobile computing device 100 touch-screen display 110 ( Figure 4).
  • the astigmatism axis test chart can comprise a collection of lines orientated in different radial directions, examples of such charts include but are not limited to the clock type charts or any fraction thereof.
  • the chart 160 is displayed at full size if the test distance is 4 m.
  • the chart 160 is composed of lines oriented at angular increments of 1/3 clock hour (20 minutes) apart. The subject is asked which line or two lines appear the sharpest or boldest. If no line appears sharper than any other, then the subject’s eye has no astigmatism and the subjective refraction test is completed. In this case, the operator taps on the “No Astigmatism” button 162 or otherwise makes this indication, for example orally. If one line appears sharpest, then the orientation of that line is the axis of plus cylinder. In this case, the operator double-taps on the clock hour:minute numbers next to the line or otherwise makes this indication, for example orally.
  • the angle that lies in between those two lines is the axis of plus cylinder.
  • the operator taps on the numbers next to both of those lines or otherwise makes this indication, for example orally.
  • the operator can tap on the “Full Size/Test Size” toggle button 161 to make the half-clock display full sized for the purpose of entering the axis, or otherwise makes this indication, for example orally.
  • the subject says both the 3 o’clock and 3:20 lines appear sharp. The operator then taps the number 3 and the 3:20 circle button to enter the axis.
  • the clock hour position is then converted by the processor 140 to degrees for the specification of the astigmatism plus cylinder axis according to Table 4 as can be stored in the memory of mobile computing device 100 and queried by processor 140. Since the axis is between 3 and 3:20 in the example, the equivalent axis of plus cylinder is 175 degrees. The axis of minus cylinder is 90 degrees away, which is 85 degrees.
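The clock-hour-to-degrees conversion can be sketched as follows. This is a hypothetical reconstruction of Table 4, calibrated only to the worked example (3 o'clock treated as horizontal, each clock hour spanning 30 degrees clockwise):

```python
def clock_to_axis_degrees(hour: float, minute: float = 0.0) -> int:
    """Half-clock line position -> plus-cylinder axis in degrees (1-180)."""
    axis = (180.0 - (hour + minute / 60.0 - 3.0) * 30.0) % 180.0
    return int(round(axis)) or 180  # report 0 as 180 by convention

# Worked example: axis between the 3:00 and 3:20 lines -> 175 degrees,
# with the minus-cylinder axis 90 degrees away -> 85 degrees.
plus_axis = (clock_to_axis_degrees(3, 0) + clock_to_axis_degrees(3, 20)) // 2
print(plus_axis, (plus_axis + 90) % 180)  # 175 85
```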
  • This aspect regards making astigmatism measurements, which involve determining the axis and magnitude of the eye’s astigmatism.
  • the axis is determined using a radial chart on the mobile device and having the subject select the radial direction where the line is clearer, darker, or bolder.
  • the magnitude is determined starting with this axis and the final distance from the RDR spherical equivalent measurement. Then an RDR method decision tree is used to determine the plus and minus axis distances and resulting power values. The difference in these powers is the astigmatism magnitude.
  • the astigmatism magnitude is determined.
  • the far-points for both charts are determined using the RDR method steps already described for the SE far-point test.
  • the astigmatism magnitude is the dioptric difference between the plus and minus axis far-points. To continue the example, suppose the far-point for the plus axis is 0.50 m (-2.00 D) and the far-point of the minus axis is 0.33 m (-3.00 D), then the magnitude of astigmatism is the difference between the 2 dioptric values: 1.00 D.
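The magnitude computation in the example can be sketched as follows (a hypothetical helper; 1/3 m is used for the 0.33 m far-point so the dioptric values come out exactly):

```python
def astigmatism_magnitude_d(fp_plus_m: float, fp_minus_m: float) -> float:
    """Dioptric difference between the plus-axis and minus-axis far-points."""
    return abs((-1.0 / fp_plus_m) - (-1.0 / fp_minus_m))

# Disclosure example: plus axis at 0.50 m (-2.00 D), minus axis at ~0.33 m (-3.00 D)
print(round(astigmatism_magnitude_d(0.50, 1.0 / 3.0), 2))  # 1.0
```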
  • the method ends at block 260, at which time the results can be displayed and/or communicated.
  • the mobile computing device can prompt the operator to test the second eye.
  • FIG. 7 illustrates a networked telemedicine system 700, in accordance with embodiments herein.
  • the networked telemedicine system 700 includes the mobile computing device 100 in wireless communication therewith.
  • the networked telemedicine system 700 also includes other networked devices 710, which may be in wired or wireless communication therewith.
  • the mobile computing device 100 includes application software with executable instructions configured to transmit and receive information from the network 705.
  • the information can be transmitted to and/or received from another device, such as one or more networked devices 710, through the network 705.
  • the mobile computing device 100 is also capable of transmitting information about an eye exam of a subject to one or more of a doctor, such as an eye doctor, another medical practitioner, or an eyeglasses provider.
  • network 705 may be any network that allows computers to exchange data.
  • network 705 includes one or more network elements (not shown) capable of physically or logically connecting computers.
  • the network 705 may include any appropriate network, including an intranet, the Internet, a cellular network, a local area network (LAN), a wide area network (WAN), a personal network or any other such network or combination thereof. Components used for such a system can depend at least in part upon the type of network and/or environment selected.
  • Each network 705 includes a wired or wireless telecommunication means by which network systems (including the mobile computing device 100 and the networked devices 710) may communicate and exchange data.
  • each network 705 is implemented as, or may be a part of, a storage area network (SAN), personal area network (PAN), a metropolitan area network (MAN), a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a virtual private network (VPN), an intranet, an Internet, a mobile telephone network, such as Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), cdmaOne, CDMA2000, Evolution-Data Optimized (EV-DO), Enhanced Data Rates for GSM Evolution (EDGE), Universal Mobile Telecommunications System (UMTS), Digital Enhanced Cordless Telecommunications (DECT), Digital AMPS (IS-136/TDMA), Integrated Digital Enhanced Network (iDEN), Long-Term Evolution (LTE), a 3rd generation (3G), 4th generation (4G), or 5th generation (5G) mobile network, a card network, Bluetooth, a near field communication network (NFC), any form of standardized radio frequency, or any combination thereof, or any other appropriate architecture or system that facilitates the communication of signals, data, and/or messages (generally referred to as data).
  • each networked device 710 includes a device having a communication component capable of transmitting and/or receiving data over the network 705.
  • each networked device 710 may comprise a server, personal computer, mobile device (for example, notebook computer, tablet computer, netbook computer, personal digital assistant (PDA), video game device, GPS locator device, cellular telephone, smartphone, or other mobile device), a television with one or more processors embedded therein and/or coupled thereto, or other appropriate technology that includes or is coupled to a web browser or other application for communicating via the network 705.

Abstract

Described herein are methods, apparatuses, systems, and non-transitory media for conducting an eye exam with a mobile computing device.

Description

SMARTPHONE-BASED MEASUREMENTS OF THE REFRACTIVE ERROR IN AN EYE
Cross-Reference to Related Application
[0001] This application claims the priority benefit of the earlier filing date of U.S. Provisional Application No. 62/588,304, filed November 17, 2017, which is hereby incorporated herein by reference in its entirety.
Technical Field
[0002] Embodiments herein relate to the field of digital health and, more specifically to a smartphone-based system for determining the refraction of a subject’s eye.
Background
[0003] Measuring the subjective refraction (also called manifest refraction) of a person is usually performed with a phoropter and an eye chart. The refraction is used to prescribe eyeglasses. Although a basic eye chart is simple and inexpensive, the phoropter is a relatively complicated assembly of lenses that is only available in an eye clinic or optical shop. To prescribe eyeglasses for people without access to professional equipment, it would be desirable to measure subjective refraction using a mobile computing device.
Summary
[0004] Disclosed herein are computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media that enable a Rapid Deductive Refraction (RDR) stepwise eye exam to be conducted on a subject's eye with a mobile computing device, such as a smartphone, for example an iPhone. With respect to the disclosed methods, the method may include any and all of the following steps: initiating, using the mobile computing device, an eye exam for a subject; presenting to the subject, using a screen of the mobile computing device, one or more duochrome target images at defined distances that correspond to one or more diopter powers; querying, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome target images; entering and/or storing the results of the query in memory of the mobile computing device; and using the results of the query to determine a refractive value of an eye of the subject.
[0005] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, the one or more duochrome target images are a sequence of duochrome images selected from a decision tree based on the subjective responses from the subject.
[0006] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, the subjective responses from the subject are at least in part used to select one of a plurality of predefined decision trees for final refraction determination.
[0007] In various embodiments of computer-implemented methods,
apparatuses, systems, and non-transitory computer-readable storage media, the duochrome target image comprises a duochrome chart with a range in optotypes or pattern size corresponding to a range in visual angles.
[0008] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, the range in optotypes is from logarithm of minimum angle of resolution (logMAR) 0.0 (20/20) to logMAR 0.50 (20/63).
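The logMAR and Snellen notations referenced above are interconvertible. A minimal sketch of the arithmetic follows; the function names are illustrative, not part of this disclosure:

```python
# Convert between logMAR values and Snellen fractions.
# The Snellen denominator is 20 * 10**logMAR, so logMAR 0.0 -> 20/20
# and logMAR 0.50 -> 20/63 (63.2 rounded), matching the range above.
import math

def logmar_to_snellen_denominator(logmar: float) -> int:
    """Snellen denominator for a 20-foot chart: 20 * 10**logMAR."""
    return round(20 * 10 ** logmar)

def snellen_to_logmar(denominator: float) -> float:
    """Inverse conversion: logMAR = log10(denominator / 20)."""
    return math.log10(denominator / 20)

print(logmar_to_snellen_denominator(0.0))  # 20 -> 20/20
print(logmar_to_snellen_denominator(0.5))  # 63 -> 20/63
```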
[0009] In various embodiments, computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media further include soliciting, from the subject, a response of darker, bolder, or sharper in response to the query.
[0010] In various embodiments, computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media are used to measure a spherical equivalent refraction to establish a starting point refraction value.
[0011] In various embodiments, computer-implemented methods,
apparatuses, systems, and non-transitory computer-readable storage media further include: monitoring, using a distance-sensing module of the mobile computing device, a distance from the subject to the mobile computing device to set a correct distance for each of the duochrome target images at the defined distances.
[0012] In various embodiments, computer-implemented methods,
apparatuses, systems, and non-transitory computer-readable storage media further include: monitoring, using a distance-sensing module of the mobile computing device, a distance to the subject, and continuously adjusting an optotype size for a correct measurement.
[0013] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, the duochrome target image comprises a first color field and a second, different color field where the first color field has a color with a shorter wavelength light than a color of the second color field.
[0014] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, the duochrome target image includes a third background color that has a shorter wavelength than the first color field and the second color field to help minimize the subject's accommodation.
[0015] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, the duochrome target image further includes contrasting lines to help minimize the subject's accommodation.
[0016] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, a fogging lens with positive dioptric power is placed in front of a fellow eye to minimize the accommodative response.
[0017] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, a photo refractor is used to expand a refractive range of the mobile computing device.
[0018] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, the method is used to determine one or more of an astigmatism axis, an astigmatism power, a spherical refraction value, and a cylindrical refraction value.
[0019] In various embodiments, computer-implemented methods,
apparatuses, systems, and non-transitory computer-readable storage media further include: entering, using the mobile computing device, a preliminary refraction value in memory of the mobile computing device, wherein the preliminary refraction value comprises a manifest refraction or an eye glass prescription value.
[0020] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, the sequence of duochrome images is selected from the decision tree using a bracketing search algorithm.
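The bracketing search of paragraph [0020] can be sketched as a binary search over a predefined ladder of dioptric steps that narrows in on the duochrome neutrality distance. The `ask` callback and the mapping of 'red'/'green' answers to search direction below are illustrative assumptions, not the patent's exact decision logic:

```python
# Hedged sketch: narrow a bracket around the duochrome neutrality
# distance (DND) by binary search over predefined dioptric steps.
def bracket_dnd(diopter_steps, ask):
    """diopter_steps: ascending list of candidate powers (D).
    ask(power) -> 'red', 'green', or 'equal' for the duochrome chart
    shown at the distance corresponding to that power (1/power metres).
    Returns a (low, high) bracket in diopters; low == high when an
    exactly neutral step is found."""
    lo, hi = 0, len(diopter_steps) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        answer = ask(diopter_steps[mid])
        if answer == 'equal':        # neutrality reached: DND found
            return (diopter_steps[mid], diopter_steps[mid])
        if answer == 'red':          # assumption: step toward higher power
            lo = mid + 1
        else:                        # 'green': step toward lower power
            hi = mid - 1
    # No exactly neutral step: the DND lies between the bracketing powers
    hi_power = diopter_steps[min(lo, len(diopter_steps) - 1)]
    lo_power = diopter_steps[max(hi, 0)]
    return (lo_power, hi_power)
```

For a subject whose answers flip between two adjacent steps, the returned bracket is the pair of dioptric powers straddling the neutrality point, mirroring the "bracketed distance" endpoint described in the detailed description.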
[0021] In various embodiments, computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media further include determining visual acuity using the mobile computing device, by: presenting to the subject, using a screen of the mobile computing device, a low-contrast visual acuity chart; querying the subject as to a smallest sized optotype the subject can read correctly at the visual acuity test distance; entering and/or storing a response to the query in memory of the mobile computing device; and storing the visual acuity based on the visual acuity test distance and the smallest readable optotype size.
[0022] In various embodiments, computer-implemented methods,
apparatuses, systems, and non-transitory computer-readable storage media further include determining an astigmatism axis using the mobile computing device, by: displaying, using the mobile computing device, an astigmatism axis test chart on a display of the mobile computing device, wherein the astigmatism axis test chart comprises a collection of lines oriented in different radial directions; querying the subject as to which line(s) on the astigmatism axis test chart appear the sharpest; and entering and/or storing a response to the query in memory of the mobile computing device, thereby determining the astigmatism axis.
[0023] In various embodiments, computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media further include determining an astigmatism magnitude using the mobile computing device by:
displaying, using the mobile computing device, a duochrome chart of line targets oriented along a plus axis; querying, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome chart of line targets; storing the results of the query in memory of the mobile computing device; displaying, using the mobile computing device, a duochrome chart of line targets oriented along a minus axis; querying, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome chart of line targets; storing the results of the query in memory of the mobile computing device; and determining the dioptric difference between the plus-axis and the minus-axis, thereby determining the astigmatism magnitude.
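The astigmatism-magnitude arithmetic described above reduces to two reciprocals and a difference. A minimal sketch, with illustrative function names:

```python
# Each axis's refraction (in diopters) is the reciprocal of its
# duochrome neutrality distance (in metres); the astigmatism
# magnitude is the dioptric difference between the two axes.
def axis_refraction_diopters(dnd_meters: float) -> float:
    """Refraction along one axis (D) = 1 / DND (m)."""
    return 1.0 / dnd_meters

def astigmatism_magnitude(plus_axis_dnd_m: float, minus_axis_dnd_m: float) -> float:
    return abs(axis_refraction_diopters(plus_axis_dnd_m)
               - axis_refraction_diopters(minus_axis_dnd_m))

# e.g. a plus-axis DND of 1.0 m (1.00 D) and a minus-axis DND of
# 0.5 m (2.00 D) give an astigmatism magnitude of 1.00 D
print(astigmatism_magnitude(1.0, 0.5))  # 1.0
```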
[0024] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, the mobile computing device is in communication with a network.
[0025] In various embodiments of computer-implemented methods, apparatuses, systems, and non-transitory computer-readable storage media, the network is a telecommunications network.
[0026] In various embodiments, computer-implemented methods,
apparatuses, systems, and non-transitory computer-readable storage media include initiating, by the mobile computing device, a telemedicine session.
Brief Description of the Drawings
[0027] Embodiments will be readily understood by the following detailed description in conjunction with the accompanying drawings and the appended claims. Embodiments are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings.
[0028] Figure 1 is a schematic representation of an example mobile computing device for conducting an eye exam, in accordance with disclosed embodiments.
[0029] Figure 2 is a flow chart showing an example method for conducting an eye exam using a mobile computing device, in accordance with disclosed embodiments.
[0030] Figure 3 is a schematic representation of an example mobile computing device displaying a duochrome eye chart, in accordance with disclosed
embodiments.
[0031] Figure 4 is a half-clock eye chart to test the astigmatism axis, in accordance with disclosed embodiments.
[0032] Figure 5 is a schematic representation of an example mobile computing device displaying a duochrome eye chart to test the plus-axis far-point, in accordance with disclosed embodiments.
[0033] Figure 6 is a schematic representation of an example mobile computing device displaying a duochrome eye chart to test the minus-axis far-point, in accordance with disclosed embodiments.
[0034] Figure 7 is a schematic diagram of a networked mobile computing device for conducting an eye exam, in accordance with embodiments herein.
[0035] Figure 8 is an example of a Rapid Deductive Refraction (RDR) flow chart used for patients over 45 years old, for measuring the refractive error in an eye with a starting point refraction of low myopia, in accordance with disclosed
embodiments.
[0036] Figure 9 is an example of a flow chart for younger patients who may be accommodating, and shows the revised Duochrome Neutral Distance logic, in accordance with disclosed embodiments.
[0037] Figure 10 is a table showing initial pilot study results for the RDR method and approaches to minimize accommodation in younger patients. The RDR method took between 20 and 60 seconds and was able to measure 83% of patients within a range of +/- 0.5D, including 71% of younger patients within +/- 0.5D.
[0038] Figure 11 is a flow chart outlining a method of Rapid Deductive
Refraction using Duochrome Neutrality Distance, in accordance with disclosed embodiments. DND = duochrome neutrality distance; estimates spherical equivalent refraction. Far LCVA = low-contrast visual acuity test at a far distance (e.g. 4m, 6m, 3m). Near LCVA = low-contrast visual acuity test at a near distance (e.g. 0.67m, 1 m, 0.5m). LCVA* = low-contrast visual acuity test at DND.
[0039] Figure 12 is an example of a 4 meter 25% low-contrast visual acuity chart.
Detailed Description of Disclosed Embodiments
[0040] In the following detailed description, reference is made to the accompanying drawings which form a part hereof, and in which are shown by way of illustration embodiments that may be practiced. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope. Therefore, the following detailed description is not to be taken in a limiting sense, and the scope of embodiments is defined by the appended claims and their equivalents.
[0041] Various operations may be described as multiple discrete operations in turn, in a manner that may be helpful in understanding the embodiments; however, the order of description should not be construed to imply that these operations are order dependent.
[0042] The description may use perspective-based descriptions such as up/down, back/front, and top/bottom. Such descriptions are merely used to facilitate the discussion and are not intended to restrict the application of disclosed
embodiments.
[0043] The terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular embodiments, "connected" may be used to indicate that two or more elements are in direct physical contact with each other. "Coupled" may mean that two or more elements are in direct physical contact. However, "coupled" may also mean that two or more elements are not in direct contact with each other, but yet still cooperate or interact with each other.
[0044] For the purposes of the description, a phrase in the form "A/B" or in the form "A and/or B" means (A), (B), or (A and B). For the purposes of the description, a phrase in the form "at least one of A, B, and C" means (A), (B), (C), (A and B), (A and C), (B and C), or (A, B and C). For the purposes of the description, a phrase in the form "(A)B" means (B) or (AB), that is, A is an optional element.
[0045] The description may use the terms "embodiment" or "embodiments," which may each refer to one or more of the same or different embodiments. Furthermore, the terms "comprising," "including," "having," and the like, as used with respect to embodiments, are synonymous, and are generally intended as "open" terms (e.g., the term "including" should be interpreted as "including but not limited to," the term "having" should be interpreted as "having at least," the term "includes" should be interpreted as "includes but is not limited to," etc.).
[0046] With respect to the use of any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
[0047] Introduction
[0048] There are devices that use a smartphone display in conjunction with a scope (lens and tube assembly) to test refraction. However, the need for a hardware attachment in these devices increases both the cost and complexity. Thus, the need exists for a mobile computing device, such as a smart phone, that can measure refraction using only the hardware already integrated into a standard or typical mobile computing device, such as a smartphone, for example an iPhone. The current disclosure meets these needs.
[0049] As disclosed herein, Rapid Deductive Refraction (RDR) is used in a stepwise fashion to determine and/or deduce the refractive error in the eye, for example by reaching a duochrome neutrality distance (DND) endpoint. As disclosed herein, by presenting a subject with duochrome target images in a stepwise and predefined series of distances corresponding to dioptric distances, subjective responses by the subject (for example 'darker' and 'bolder') can be used to deduce the refractive error in the eye. As disclosed herein, the RDR method places images at comfortable and natural distances and uses subjective responses from the subject, such as 'darker' and 'bolder', as decision criteria for the subject. As discussed in detail below, these subjective responses are easier to decide upon than, for example, reading the characters off of a traditional eye chart. In addition, as depicted in Figures 8 and 9, these subjective responses are used to guide the test down predetermined decision trees to a final determination of refractive error in the eye.
The disclosed RDR method uses a stepwise and predefined series of dioptric distances and presents a duochrome image for a subjective response to deduce the refractive error in the eye. In contrast to other methods, these subjective responses are not the blur limit at a maximum distance but are rather comparative differences in the darkness or boldness of the characters on different colored backgrounds (see, for example, Figures 3 and 10). These comparative differences are indicative of the refractive power in the eye at each step of the test, and the sequence of responses determines the refractive power within a narrow range of diopters (see, for example, the flowcharts given in Figures 8 and 9). Thus, rather than finding the maximum distance at which the subject can see specific sized target images before they become blurry, the methods disclosed herein drive the test toward the Duochrome Neutrality Distance (DND) for a final result, which can reduce both eyestrain and discomfort for the subject undergoing the test.
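One way the predefined decision trees of Figures 8 and 9 could be represented in software is as linked nodes, where each node holds a test distance and the subject's answer selects the next node until a leaf carrying a refraction value is reached. This is a hedged sketch under that assumption; the toy two-level tree and its refraction values below are illustrative, not the actual trees from the figures:

```python
# Toy decision-tree representation: each interior node names a chart
# distance; the subject's 'red'/'green' (darker/bolder) answer picks
# the branch; leaves carry a final refraction estimate in diopters.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    distance_m: Optional[float] = None      # chart distance for this step
    red: Optional["Node"] = None            # next step if red side is bolder
    green: Optional["Node"] = None          # next step if green side is bolder
    refraction_d: Optional[float] = None    # set on leaf nodes only

def run_tree(node: Node, ask) -> float:
    """ask(distance_m) -> 'red' or 'green'; walk the tree to a leaf."""
    while node.refraction_d is None:
        node = node.red if ask(node.distance_m) == 'red' else node.green
    return node.refraction_d

# Toy two-level tree: start at 1 m; red -> retest at 0.67 m, green -> at 2 m
tree = Node(1.0,
            red=Node(0.67, red=Node(refraction_d=-1.75),
                     green=Node(refraction_d=-1.25)),
            green=Node(2.0, red=Node(refraction_d=-0.75),
                       green=Node(refraction_d=-0.25)))
```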
[0050] Other methods for both mobile devices and purpose-built instruments typically rely on finding the maximum distance at which the subject can see specific sized target images before they become blurry. This pushes the subject to strain to see the images at a distance, as is the case with general visual acuity tests or the approach of finding the Maximum Distance of Best Acuity (MDBA), for example as described in U.S. Patent No. 9,549,669. The MDBA method is fundamentally different from the methods disclosed herein. MDBA and other visual acuity methods push the test distance and optotype size to the blur limit of the patient's vision, forcing them to strain and guess at the images to determine their specific limit on visual acuity. In contrast to the MDBA method, the disclosed methods do not require that a subject get a 'right' or 'correct' answer on which small line/character they are looking at, but rather simplify the process of an eye exam by asking a subject to compare characters of multiple sizes on different backgrounds for darkness and boldness. Getting it 'right' between an 'E' and an 'F' (for example) at some blurred distance is not the end point; rather, the refractive power of the eye (which is different for red and green colors, as discussed below) distinguishes the darkness and boldness of the duochrome target image at each predefined distance to guide the results to the final refraction. The result is that the disclosed methods provide for an eye exam that is both simpler and more rapid than other methods that rely on MDBA, for example as described in U.S. Patent No. 9,549,669.
[0051] The novel methods disclosed herein start with the general principle of refractive measurements of the eye in terms of diopters, and the associated distances from the eye that correspond to diopters of power. The inventors discovered that when implemented on a mobile computing device, such as a smartphone, a subject's specific refraction can be deduced from this general principle using a sequence of predefined distances and a duochrome target image to solicit subjective responses from the subject. The subject's specific responses are used in the method to guide the eye test down one of several predefined paths toward the final refraction determination (see, for example, Figures 8 and 9). As discussed above, this approach is fundamentally different from the traditional and common method of pushing the images out to the blur limit of the subject's vision to measure visual acuity.
[0052] Detailed Description on Several Embodiments
[0053] Disclosed herein are methods, devices and systems, such as telemedicine systems, to determine the refraction of a subject’s eye(s). The disclosed methods, devices and systems determine the refraction of a subject’s eye(s) based on a Rapid Deductive Refraction (RDR) stepwise method, using a mobile computing device, such as a smart phone, for example an appropriately programmed
smartphone. As disclosed herein, an appropriately programmed mobile computing device can be used to determine the spherical equivalent far-point, visual acuity, an astigmatism axis, and the astigmatism magnitude for one or both eyes.
[0054] Figure 11 describes an exemplary method of determining a subject's refraction, for example without prior knowledge of their approximate refractive error. As shown in Figure 11, in embodiments the method can begin with using a photoscreener, such as the GoCheck device (see, for example, US Patent No. 9,380,938, which is hereby incorporated herein by reference) to make an initial measurement of the refractive error of an eye of the subject. It is of course contemplated that the test, methods, apparatus, etc. can be used on both eyes of the subject. A photoscreener is able to measure moderate to severe myopia and hyperopia, but cannot detect low myopia and low hyperopia within a range generally referred to as the null zone. The RDR method as disclosed herein is particularly useful when the refraction falls within this null zone. In embodiments, the RDR method begins with one or more visual acuity tests to distinguish myopia, emmetropia, and/or hyperopia. A low-contrast visual acuity chart is preferentially used for the best sensitivity (see, for example, Figure 12), but a regular high-contrast acuity chart can also be used, for example as displayed on a display of a mobile computing device. If the subject eye has normal acuity (equal to or better than 20/20, or logarithm of minimum angle of resolution [logMAR] of 0.0) at the far distance (e.g. 20 feet or 4 meters), then the eye is approximately emmetropic (neither hyperopic nor myopic). If, however, the subject eye is worse than logMAR 0.0, then the acuity test is repeated at a near distance. If the near acuity is better (i.e. at the near distance) than the far acuity (i.e. at the far distance), then the eye is myopic (nearsighted). Conversely, if the far acuity is better than the near acuity, then the eye is hyperopic (farsighted). As an alternative to the near acuity test, a plus lens (e.g. +3.0 D) may be placed over the eye to repeat the far acuity test.
If the eye sees the distance chart better with the plus lens, then the eye is hyperopic. If the subject is nearsighted, then the Rapid Deductive Refraction (RDR) stepwise method is continued using the duochrome tests as disclosed herein to determine the spherical equivalent refraction. If the subject is hyperopic, then the RDR stepwise method may be performed using a +3D lens in front of the test eye. The RDR stepwise method establishes the duochrome neutrality distance (DND), which is the distance at which the optotypes with red or green backgrounds appear equally bold or sharp. Or, if equality is not found at any distance, the DND is the bracketed distance in between a distance at which the optotype with the red background appears sharper/bolder and a distance at which the optotype with the green background appears sharper or bolder. Either way, an endpoint is reached. The DND is an estimate of the far point of the eye. The spherical equivalent refraction of the eye is the reciprocal of the far point as estimated by the DND. For example, if the DND is 2 meters, then the spherical equivalent refraction is one divided by two meters, or 0.5 diopters. After the DND is established using the duochrome test on the mobile computing device, the visual acuity is again tested at the DND. Again, a low-contrast visual acuity test is preferred (see, for example, Figure 12), but a high-contrast acuity test can also be used. If the visual acuity is normal at the DND, then the subject eye is determined to have no significant astigmatism and no further testing is required. However, if the visual acuity is worse than logMAR 0.0 at the DND, then the subject eye should be tested for astigmatism, for example as described below. In embodiments, the astigmatism axis is first determined using the half-clock chart. Once the astigmatism axis is determined, then the DND is determined for line targets oriented at the plus and minus axis orientations (see, for example, Figures 5 and 6).
The astigmatism magnitude is determined by the difference between the refraction at the plus and minus axes, which are calculated by taking the reciprocals of the DNDs for line targets oriented at the plus and minus axis orientations.
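The screening step of paragraph [0054] — classifying the eye from far and, if needed, near low-contrast acuity — can be sketched as a small decision function. The function name and return strings are illustrative assumptions, not an exact encoding of the flow chart of Figure 11:

```python
# Classify an eye from low-contrast visual acuity measured at a far
# distance and, when the far acuity is worse than logMAR 0.0, at a
# near distance as well (lower logMAR = better acuity).
def classify_eye(far_logmar, near_logmar=None):
    if far_logmar <= 0.0:            # 20/20 or better at distance
        return "emmetropic"
    if near_logmar is None:          # need the second measurement
        return "retest at near distance"
    if near_logmar < far_logmar:     # sees better up close
        return "myopic"
    return "hyperopic"               # sees better at distance

print(classify_eye(0.0))        # emmetropic
print(classify_eye(0.3, 0.0))   # myopic
print(classify_eye(0.3, 0.4))   # hyperopic
```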
[0055] In embodiments, the spherical equivalent far-point is obtained by the use of a Rapid Deductive Refraction method of presenting a sequence of duochrome eye charts displayed on the mobile computing device at specific predefined distances from the subject based on their subjective responses. In embodiments, determining an initial or final refractive value of an eye using the RDR method includes: accurately determining one or more of the refractive power, spherical equivalent, astigmatism axis, and astigmatism power in the eye with a predefined stepwise method using images displayed on a mobile computing device, such as on a smart phone, and soliciting subjective responses about the image quality from the subject, for example whether the image or portions of the image are 'darker' and/or 'bolder'. In embodiments, the disclosed method includes providing, with the mobile computing device, short sequences of duochrome target images at predetermined distances. In embodiments, the predetermined distances are separated by dioptric increments. After presentation of the duochrome target images, for example after each duochrome target image, the subject is queried as to whether the characters presented in the duochrome image are, for example, darker and/or bolder on one of the two colored backgrounds of the duochrome target image. The resulting subjective responses from the subject direct the test to the next appropriate test distance, for example, presentation of an image at either a greater or lesser predetermined distance. In embodiments, the disclosed method can be a single step, for example where the correct predetermined distance is found initially. In other embodiments, the disclosed method can include multiple steps (see, for example, Figures 8 and 9).
[0056] Various techniques can be used to minimize accommodation in the eye under test, such as a different color background and a fogging lens on the fellow eye. In embodiments, the accommodative response of eyes is minimized using various methods, including a different colored background in the target image, a fogging lens in front of the fellow eye, and/or rapid fluctuations between a highly blurred image and a clear image. As an example, a blue background and/or black contrasting lines tend to relax accommodation in younger patients. The eye refracts the blue wavelength more strongly than the red or green in the duochrome image, and the blue background thus appears farther away to the subject, which helps minimize the accommodative amplitude of the eye being tested so a more accurate result can be determined. Various character types and sizes can be used in these duochrome target images, for example having 20/20 sized letters on the lower rows and larger sizes in the upper rows so the patient can find the best readable target for determining darker or bolder in case their vision is significantly worse than 20/20 due to astigmatism or spherical refractive error (see, for example, the duochrome image in Figure 3).
[0057] In certain embodiments, the disclosed method can be used in combination with a photo screener device, such as a smart phone, to provide a broad measurement range for refractive error with acceptable accuracy. In embodiments, the astigmatism axis is measured by a chart of radial lines displayed on the mobile computing device, for example as a clock chart or fraction thereof. In embodiments, the astigmatism magnitude is measured by measuring the far-point of a duochrome eye chart of lines oriented to the axis of plus cylinder and another duochrome eye chart of lines oriented to the axis of minus cylinder. In certain embodiments, the distance measurements needed for calculations during the eye exam are measured by a distance sensor built into or integrated with the mobile computing device. In certain embodiments, different images, colors, backgrounds, and optotype/content are used to minimize accommodation. For example, a duochrome target image using red and green may also include a blue background to minimize accommodation (see, for example, Figure 3). Other duochrome color combinations and background combinations are also contemplated. Alternatively, presenting a blurred image to the subject that quickly changes to a clearly defined image and then quickly back to a blurred image can minimize the accommodation of the subject's eye. This can happen multiple times in succession. In some embodiments, a +2-3D lens placed in front of the fellow eye during testing of the other eye can minimize accommodation. The disclosed methods can be combined with a mobile device photo refractor. A photo refractor device, for example the GoCheck Kids mobile photo screener, can be used to capture part of the refractive error range (see, for example, US Patent No. 9,380,938, which is hereby incorporated herein by reference). Furthermore, the RDR technique can be used to test in the null zone of the photo screener to enable measurement through the entire refractive range.
[0058] Using the RDR method, shown for example in Figure 8 for older patients and Figure 9 for younger patients, a given subject's refraction can be determined with the disclosed methods. The duochrome chart with a blue background to minimize accommodation, as in Figure 3, was used in a pilot study of patients across age ranges. All tests took between 20 and 60 seconds to administer. The results of this pilot study are shown in Figure 10, where 83% of patients overall had a refraction measurement within +/- 0.5D of their manifest refraction taken that same day. Further, 71% of patients in the lower age group, where accommodation minimization was important, had refractions measured within +/- 0.5D of their manifest refraction taken that day. These are important and valuable early indications of the power of the RDR method as disclosed herein.
[0059] Using the duochrome test (see Figure 3), as implemented in the disclosed methods, devices, and systems, to determine the spherical equivalent far-point represents a novel aspect of this disclosure. Additionally, the use of a duochrome test provides a more precise refraction endpoint than the alternative of relying on the smallest size of target that can be seen by the subject, for example as described in the patent application by Lee and Dallek (US Patent Publication 2014/0268060 A1).
In addition, the use of the duochrome test, as implemented in the disclosed methods, devices, and systems, to determine astigmatism magnitude also represents a novel approach. Oriented lines (see Figure 4) had not been used on duochrome charts before to measure separate far-points along the plus and minus axes. This innovation allows more precise measurement of astigmatism than relying on measurement of distortion as described in Lee and Dallek. Using a mobile computing device distance sensor to determine the distance from the displayed eye charts represents a novel advance that is both faster and more convenient than using a tape measure as is typically employed. Furthermore, adjusting the physical size of the optotype and eye chart according to test distance to maintain the same visual-angle size is yet another innovation that allows an eye chart to be presented at a wide range of test distances.
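The constant-visual-angle scaling mentioned above follows from simple trigonometry: the physical optotype height is h = 2 d tan(θ/2), where θ is the letter's visual angle. The sketch below assumes the conventional 5-arcminute letter height at logMAR 0.0, scaled by 10**logMAR for larger lines; converting the result to screen pixels would additionally require the device's pixel density, which is not shown here:

```python
# Scale an optotype's physical size with test distance so that it
# always subtends the same visual angle at the subject's eye.
import math

ARCMIN = math.pi / (180 * 60)  # one arcminute in radians

def optotype_height_m(distance_m: float, logmar: float = 0.0) -> float:
    """Physical letter height (m) for a given distance and logMAR line,
    assuming a 5-arcminute letter at logMAR 0.0 (20/20)."""
    visual_angle = 5 * (10 ** logmar) * ARCMIN
    return 2 * distance_m * math.tan(visual_angle / 2)

# A 20/20 letter at 4 m is about 5.8 mm tall; halving the distance
# halves the required height, keeping the visual angle constant.
print(round(optotype_height_m(4.0) * 1000, 2))  # 5.82
```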
[0060] Example Apparatus
[0061] FIG. 1 illustrates a simplified diagram of a mobile computing device 100 for conducting an eye exam, in accordance with embodiments herein. In certain embodiments, the mobile computing device comprises a smart phone, such as a commercially available smart phone, for example an iPhone®, Samsung Galaxy®, Nokia Lumia®, Motorola Droid®, and the like. In certain embodiments, the smartphone 100 is an iPhone, for example an iPhone X. In embodiments, the mobile computing device 100 includes a touch-screen display 110 and a distance measuring module 120. In certain embodiments, the distance measuring module 120 includes an infrared dot projector 121 and an infrared camera 122. However, other
smartphones with similar features are also contemplated. For example, in alternative embodiments, the distance measuring module 120 can be based on other principles, such as laser range finding, sonar, or dual-camera imaging. In embodiments, mobile computing device 100 includes a number of components, such as one or more processors 140 and at least one communication module 142. In various
embodiments, the one or more processors 140 each include one or more processor cores. In various embodiments, the at least one communication module 142 is physically and electrically coupled to the one or more processors 140. In further implementations, the communication module 142 is part of the one or more
processors 140. In various embodiments, mobile computing device 100 includes printed circuit board (PCB) 155. For these embodiments, the one or more processors 140 and communication module 142 are disposed thereon. Depending on its
applications, mobile computing device 100 includes other components that may or may not be physically and electrically coupled to the PCB. These other components include, but are not limited to, a memory controller (not shown), volatile memory (e.g., dynamic random access memory (DRAM) (not shown)), non-volatile memory (not shown) such as read only memory (ROM), flash memory (not shown), an I/O port (not shown), a digital signal processor (not shown), a crypto processor (not shown), a graphics processor (not shown), one or more antenna (not shown), a touch-screen display 110, a touch-screen display controller (not shown), a battery (not shown), an audio codec (not shown), a video codec (not shown), a global positioning system (GPS) device (not shown), a compass (not shown), an accelerometer (not shown), a gyroscope (not shown), a speaker 141, a camera (not shown), a mass storage device (such as a hard disk drive, a solid state drive, a compact disk (CD) (not shown), or a digital versatile disk (DVD) (not shown)), a microphone 146, and so forth.
[0062] In some embodiments, the one or more processors 140 is operatively coupled to system memory through one or more links (e.g., interconnects, buses, etc). In embodiments, system memory is capable of storing information that the one or more processors 140 utilizes to operate and execute programs and operating systems. In different embodiments, system memory is any usable type of readable and writeable memory such as a form of dynamic random access memory (DRAM).
In embodiments, the mobile computing device 100 includes a microphone 146 configured to capture audio. In embodiments, the mobile computing device 100 includes a speaker 141 configured to transmit audio. In embodiments, mobile computing device 100 includes or is otherwise associated with various input and output/feedback devices to enable user interaction with the mobile computing device 100 and/or peripheral components or devices associated with the mobile computing device 100 by way of one or more user interfaces or peripheral component interfaces. In embodiments, the user interfaces include, but are not limited to, a physical keyboard or keypad, a touchpad, a display device (touchscreen or non-touchscreen), speakers, microphones, image sensors, haptic feedback devices and/or one or more actuators, and the like. In some embodiments, the mobile computing device can comprise a memory element (not shown), which can exist within a removable smart chip or a secure digital ("SD") card or which can be embedded within a fixed chip on the device. In certain example embodiments, Subscriber Identity Module ("SIM") cards may be used. In various embodiments, the memory element may allow a software application to be resident on the device.
[0063] In embodiments, an I/O link connecting a peripheral device to a mobile computing device is protocol-specific with a protocol-specific connector port that allows a compatible peripheral device to be attached to the protocol-specific connector port (i.e. , a USB keyboard device would be plugged into a USB port, a router device would be plugged into a LAN/Ethernet port, etc.) with a protocol-specific cable. Any single connector port would be limited to peripheral devices with a compatible plug and compatible protocol. Once a compatible peripheral device is plugged into the connector port, a communication link would be established between the peripheral device and a protocol-specific controller.
[0064] In embodiments, a non-protocol-specific connector port is configured to couple the I/O interconnect with a connector port of the mobile computing device 100, allowing multiple device types to attach to the mobile computing device 100 through a single physical connector port. Moreover, the I/O link between the mobile
computing device 100 and the I/O complex is configured to carry multiple I/O protocols (e.g., PCI Express®, USB, DisplayPort, HDMI, etc.) simultaneously. In various embodiments, the connector port is capable of providing the full bandwidth of the link in both directions with no sharing of bandwidth between ports or between upstream and downstream directions. In various embodiments, the connection between the I/O interconnect and the mobile computing device 100 supports electrical connections, optical connections, or both.
[0065] In some embodiments, the one or more processors 140, flash memory, and/or a storage device includes associated firmware storing programming
instructions configured to enable the mobile computing device 100, in response to execution of the programming instructions by one or more processors 140, to practice all or selected aspects of a method of conducting an eye exam using a mobile computing device, in accordance with embodiments of the present disclosure.
[0066] In embodiments, the communication module 142 enables wired and/or wireless communications for the transfer of data to and from the mobile computing device 100. In various embodiments, the mobile computing device 100 also includes a network interface configured to connect the mobile computing device 100 to one or more networked computing devices wirelessly via a transmitter and a receiver (or optionally a transceiver) and/or via a wired connection using a communications port. In embodiments, the network interface and the transmitter/receiver and/or
communications port are collectively referred to as a“communication module”. In embodiments, the wireless transmitter/receiver and/or transceiver may be configured to operate in accordance with one or more wireless communications standards. The term "wireless" and its derivatives may be used to describe circuits, devices, systems, methods, techniques, communications channels, etc., that may
communicate data through the use of modulated electromagnetic radiation through a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some embodiments they might not. In embodiments, the mobile computing device 100 includes a wireless communication module 142 for transmitting to and receiving data, for example for transmitting and receiving data from a network, such as a telecommunications network. In examples, the
communication module transmits data, including video data, through a cellular network or mobile network, such as a Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), cdmaOne, CDMA2000, Evolution-Data Optimized (EV-DO), Enhanced Data Rates for GSM Evolution (EDGE),
Universal Mobile Telecommunications System (UMTS), Digital Enhanced Cordless Telecommunications (DECT), Digital AMPS (IS-136/TDMA), Integrated Digital Enhanced Network (iDEN), Long-Term Evolution (LTE), 3rd generation mobile network (3G), 4th generation mobile network (4G), and/or 5th generation mobile network (5G) networks. In embodiments, the mobile computing device 100 is directly connected with one or more devices via a direct wireless connection by using, for example, Bluetooth and/or BLE protocols, WiFi protocols, Infrared Data Association (IrDA) protocols, ANT and/or ANT+ protocols, LTE ProSe standards, and the like. In embodiments, the communications port is configured to operate in accordance with one or more known wired communications protocols, such as a serial communications protocol (e.g., the Universal Serial Bus (USB), FireWire, Serial Digital Interface (SDI), and/or other like serial communications protocols), a parallel communications protocol (e.g., IEEE 1284, Computer Automated Measurement And
Control (CAMAC), and/or other like parallel communications protocols), and/or a network communications protocol (e.g., Ethernet, token ring, Fiber Distributed Data Interface (FDDI), and/or other like network communications protocols).
[0067] In embodiments, the mobile computing device 100 is configured to run, execute, or otherwise operate one or more applications. In embodiments, the applications include native applications, web applications, and hybrid applications. For example, the native applications are used for operating the mobile computing device 100, such as using a camera or other like sensor of the mobile computing device 100, cellular phone functionality of the mobile computing device 100, and other like functions of the mobile computing device 100. In embodiments, native applications are platform or operating system (OS) specific or non-specific. In embodiments, native applications are developed for a specific platform using platform-specific development tools, programming languages, and the like. Such platform-specific development tools and/or programming languages are provided by a platform vendor. In embodiments, native applications are pre-installed on mobile computing device 100 during manufacturing, or provided to the mobile computing device 100 by an application server via a network. Web applications are applications that load into a web browser of the mobile computing device 100 in response to requesting the web application from a service provider. In embodiments, the web applications are websites that are designed or customized to run on a mobile computing device by taking into account various mobile computing device
parameters, such as resource availability, display size, touch-screen input, and the like. In this way, web applications may provide an experience that is similar to a native application within a web browser. Web applications may be any server-side application that is developed with any server-side development tools and/or programming languages, such as PHP, Node.js, ASP.NET, and/or any other like technology that renders HTML. Hybrid applications may be a hybrid between native applications and web applications. Hybrid applications may be standalone, skeleton, or other like application containers that load a website within the application container. Hybrid applications may be written using website development tools and/or programming languages, such as HTML5, CSS, JavaScript, and the like. In embodiments, hybrid applications use the browser engine of the mobile computing device 100, without using a web browser of the mobile computing device 100, to render a website’s services locally. In some embodiments, hybrid applications also access mobile computing device capabilities that are not accessible in web
applications, such as the accelerometer, camera, local storage, and the like.
[0068] Any combination of one or more computer usable or computer readable medium(s) may be utilized with the embodiments disclosed herein. The computer- usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non- exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a transmission media such as those supporting the Internet or an intranet, or a magnetic storage device. Note that the computer- usable or computer-readable medium can even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer- usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer usable program code may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc.
[0069] Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming
languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's mobile computing device, partly on the user's mobile computing device, as a stand-alone software package, partly on the user's mobile computing device and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's mobile computing device, through any type of network, including a local area network (LAN) or a wide area network (WAN), or the
connection may be made to an external mobile computing device, (for example, through the Internet using an Internet Service Provider), or wireless network, such as described above.
[0070] Furthermore, example embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium. A code segment may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, program code, a software package, a class, or any combination of instructions, data structures, program statements, and the like.
[0071] In various embodiments, an article of manufacture may be employed to implement one or more methods as disclosed herein. The article of manufacture may include a computer-readable non-transitory storage medium and a storage medium. The storage medium may include programming instructions configured to cause an apparatus to practice some or all aspects of a method of conducting an eye exam using a mobile computing device, in accordance with embodiments of the present disclosure.
[0072] The storage medium may represent a broad range of persistent storage medium known in the art, including but not limited to flash memory, optical disks or magnetic disks. The programming instructions, in particular, may enable an
apparatus, in response to their execution by the apparatus, to perform various operations described herein. For example, the storage medium may include programming instructions configured to cause an apparatus to practice some or all aspects of a method of conducting an eye exam using a mobile computing device, in accordance with embodiments of the present disclosure.
[0073] Example Method
[0074] Figure 2 depicts a computer implemented method for conducting an eye exam using a mobile computing device, in accordance with example
embodiments. The methods disclosed herein are preferably performed by a human operator or user in addition to the human subject, although in certain situations the operator and the subject can be the same individual; in that case, that individual acts as both the subject and the operator or user. The method 200 is described with reference to the components, flowcharts, and eye charts illustrated in Figures 1 and 3-12.
[0075] In block 210 an optional preliminary refraction is determined. In certain embodiments, a preliminary refraction is obtained, for example using a smartphone-based photorefraction as described in US Patent 9,380,938, which is hereby incorporated herein in its entirety. Alternatively, a previous manifest refraction or an eyeglass prescription value can be used as the preliminary refraction. Alternatively, a preliminary refraction can be determined using low contrast visual acuity with a smart phone programmed with low contrast acuity images (see, for example, Figure 12). Another method for quickly determining whether the subject is myopic or hyperopic is to place a +3 diopter lens in front of the eye being tested, with a chart at a known dioptric distance, and ask which view is clearer: clearer without the lens indicates myopia, and clearer with the lens indicates hyperopia. In embodiments, the preliminary refraction is entered into the mobile computing device 100, and/or stored in memory of the mobile computing device 100, for example via the mobile computing device touch-screen display 110. If no preliminary refraction is available, then -0.25 diopters (D) can be entered into the memory of the mobile computing device 100 and/or used as a default starting value for further analysis. Other values for the default preliminary refraction can also be used without departing from the disclosure. Values for one or both eyes can be entered, defaulted, or otherwise determined. For each eye undergoing evaluation, the preliminary refraction may be converted, using the mobile computing device 100 processor 140, into the spherical equivalent power by the formula:
SE = Sph + 0.5 Cyl Eq. 1
where SE is the spherical equivalent power in diopters; Sph is the spherical component of refraction in diopters, and Cyl is the magnitude of the cylinder component of the refraction in diopters.
[0076] In certain embodiments, the SE is determined using the RDR method of stepwise subjective evaluation of visual response to selected duochrome images at predefined test distances to determine the final DND distance, for example as outlined in the decision trees shown in Figures 8 and 9. The patient’s response to presented characters on different colored backgrounds as darker or bolder leads to a rapid conclusion of the spherical equivalent power in the test eye by following a predefined series of test steps and distances. In additional embodiments, the SE is determined by calibrated photorefraction to objectively measure the refractive power of the eye.
[0077] In certain embodiments, the SE is further converted, using the mobile computing device 100 processor 140, to the preliminary far-point using the formula:
FP = -1 / SE Eq. 2
where FP is the far-point distance in meters (or, in some embodiments, another unit of distance measure) and SE is the spherical equivalent power in diopters; the negative sign yields a positive far-point distance for myopic (negative) SE values.
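Equations 1 and 2 together convert a refraction to a far-point test distance. A minimal Python sketch follows (the function names are illustrative, not from the disclosure); the reciprocal relation of Eq. 2 is consistent with the values in Table 1, e.g. -0.25 D corresponds to 4.00 m and -2.00 D to 0.50 m.

```python
def spherical_equivalent(sph: float, cyl: float) -> float:
    """Eq. 1: spherical equivalent power (D) from sphere and cylinder (D)."""
    return sph + 0.5 * cyl


def far_point_m(se: float) -> float:
    """Eq. 2: far-point distance in meters for a myopic spherical equivalent.

    A myopic eye with SE = -2.00 D has its far point at 1/2.00 = 0.50 m.
    The far point is only at a finite positive distance for negative SE.
    """
    if se >= 0:
        raise ValueError("far point is only finite for myopic (negative) SE")
    return -1.0 / se


# Example from paragraph [0090]: -1.25 D Sph, -1.00 D Cyl
se = spherical_equivalent(-1.25, -1.00)   # -1.75 D
fp = far_point_m(se)                      # ~0.571 m (Table 1 lists 0.57 m)
```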
[0078] In certain embodiments, the refractive measurement can be further refined using the RDR method for astigmatism power and axis to separate the spherical power and astigmatic power from the initial SE measurement. In additional embodiments the astigmatism power can be quantified using two or more axes of photo refraction measurement to determine the plus and minus distances for astigmatism power calculation, and this can be used to break out the spherical power from SE.
[0079] Once the FP has been determined, the mobile computing device 100 display 110, such as touch screen display, is placed facing the subject. For example, the operator may hold the mobile computing device 100 display 110 (Figure 1 ) facing the test subject. In certain embodiments, the operator enters data using the mobile computing device 100. Alternatively, the methods of the current disclosure can also be performed by the subject alone. In this case the mobile computing device 100 would be positioned facing the subject, for example on a stand, table, or other appropriate place. In certain embodiments, the subject would enter responses into the mobile computing device by voice command and/or recognition, for example as received by microphone 146. Alternatively, the operator could enter the responses by voice command, for example as received by microphone 146. Typically, when the right eye is tested, the left eye is closed or occluded with a hand, occluder, or patch. Similarly, when the left eye is tested, the right eye is closed or occluded with a hand, occluder, or patch. Directions as to which eye is being tested can be communicated to the subject and/or operator by visual or auditory signals, for example with display 110 and/or speaker 141.
[0080] At block 220 visual acuity is tested. After the SE far-point is established, the visual acuity is tested using the mobile computing device 100 at that distance. To test the visual acuity of the subject, the size of the optotype is decreased by the processor 140 of the mobile computing device 100 until the subject cannot read the letters anymore. The visual acuity is determined by the smallest optotype that the subject can read correctly, and is measured in 0.1 logMAR units. In embodiments, the size of the optotype 151 is displayed on the touch-screen display 110 and can be increased by tapping the up-arrow 152 or decreased by tapping the down-arrow 153. In embodiments, this is done automatically by the processor 140, or by oral indication. The acuity in logarithm of minimum angle of resolution (logMAR) units can be converted by the processor 140 of the mobile computing device 100 to the Snellen notation for vision at 20 feet (i.e., 20/20, 20/25, etc.). The conversion calculation is well known to those of ordinary skill in the art.
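As a sketch of that well-known conversion (illustrative code, not part of the disclosure): the Snellen denominator at a 20-foot test distance is 20 × 10^logMAR, rounded to the nearest integer.

```python
def logmar_to_snellen(logmar: float) -> str:
    """Convert a logMAR acuity value to approximate Snellen notation at 20 ft.

    logMAR 0.0 -> 20/20, logMAR 0.1 -> 20/25, logMAR 0.3 -> 20/40, etc.
    """
    denominator = 20 * 10 ** logmar
    return f"20/{round(denominator)}"
```

In practice the result may be snapped to the nearest standard chart line rather than an arbitrary integer; the rounding shown here is a simplification.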
[0081] The range of test distance between 4 m and 0.33 m as shown on Table 1 limits the range of refraction to between -0.25 to -3.00 D for testing eyes without the aid of glasses. For subjects with SE refractive error beyond this range, testing as above can be done using a photo refractor such as one on a mobile device or in an optometry clinic, or over existing eyeglasses to measure the residual refractive error. The prescription of the existing eyeglasses and the residual refractive error can then be added. If the subject does not have existing glasses, then standard test glasses with spherical corrections can be used. For example, if a subject is known to be hyperopic (seeing better in the distance than up close) and the preliminary refraction SE was +2.00 D, then the +3.00 D test glasses should be put on the subject to bring the preliminary refraction to -1.00 D (1.00 m) as the starting point of testing. The recommended test glasses power according to the preliminary refraction SE is given in Table 2, as can be stored in the memory of mobile computing device 100 and queried by processor 140.
[0082] It should be understood that the specific range and increments described above are only one embodiment of the present disclosure. They can be varied in other embodiments of the disclosure. For example, the test range as shown in Tables 1 and 2 can be changed to 2.50 D instead of 3.00 D, or the smallest test increment can be changed to ¼ D instead of 1/8 D, etc.
[0083] Table 2. Recommended Test Glasses Power according to
Preliminary Refraction
[Table 2 is presented as an image (imgf000028_0001) in the original publication and is not reproduced here.]
[0084] The duochrome chart 130 preferably utilizes optotype sizes that are near the limit of normal visual acuity, such as 20/20 (logMAR 0.0) or 20/25 (logMAR 0.1). Since the test distance is varied by the processor 140 according to methods of the present disclosure, the optotype size is adjusted by the processor to maintain the same size in terms of visual angle. The standard 20/20 optotype is approximately 5 minutes of arc in visual angle and is used at a 20-foot (6 m) distance. The physical optotype size in millimeters is calculated to maintain the same visual-angle size over the range of test distances (Table 3, as can be stored in the memory of mobile computing device 100 and queried by processor 140).
[0085] Table 3. Optotype Size as a Function of Test Distance
Distance (m)   logMAR 0.0 (20/20) (mm)   logMAR 0.1 (20/25) (mm)   Magnification Relative to 4 m
4.00           5.82                      7.32                      100.0%
2.67           3.88                      4.88                      66.7%
2.00           2.91                      3.66                      50.0%
1.60           2.33                      2.93                      40.0%
1.33           1.94                      2.44                      33.3%
1.14           1.66                      2.09                      28.6%
1.00           1.45                      1.83                      25.0%
0.89           1.29                      1.63                      22.2%
0.80           1.16                      1.46                      20.0%
0.73           1.06                      1.33                      18.2%
0.67           0.97                      1.22                      16.7%
0.62           0.90                      1.13                      15.4%
0.57           0.83                      1.05                      14.3%
0.53           0.78                      0.98                      13.3%
0.50           0.73                      0.92                      12.5%
0.47           0.68                      0.86                      11.8%
0.44           0.65                      0.81                      11.1%
0.42           0.61                      0.77                      10.5%
0.40           0.58                      0.73                      10.0%
0.38           0.55                      0.70                      9.5%
0.36           0.53                      0.67                      9.1%
0.35           0.51                      0.64                      8.7%
0.33           0.48                      0.61                      8.3%
0.32           0.47                      0.59                      8.0%
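The Table 3 values follow directly from the 5-arcminute visual angle of the standard 20/20 optotype. A sketch of the calculation (illustrative function name; the small-angle tangent relation reproduces the tabulated sizes to within rounding):

```python
import math

ARCMIN = math.pi / (180 * 60)  # radians per arcminute


def optotype_size_mm(distance_m: float, logmar: float = 0.0) -> float:
    """Physical optotype height (mm) that subtends 5 arcmin x 10**logMAR
    at the given test distance, per Table 3.

    At 4.00 m and logMAR 0.0 this is ~5.82 mm; at logMAR 0.1, ~7.32 mm.
    """
    return distance_m * 1000 * math.tan(5 * ARCMIN) * 10 ** logmar
```

The magnification column of Table 3 is simply the ratio of the test distance to 4 m, since the physical size scales linearly with distance.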
[0086] The test distance is preferably monitored by the distance sensor 120 (see Figures 1 and 3). If the subject is too far or too close, instructions are provided by the mobile computing device 100 processor 140 to “move closer” or “move away” using speaker 141. However, the current disclosure can still be practiced even if the mobile computing device 100 does not have a distance sensor. In this case, a tape measure with distance markings that correspond to the dioptric steps shown in Table 1 can be used. In embodiments, the operator looks at the mobile computing device 100 touch-screen display 110 for the distance setting 150 (Figure 3) and then walks to the position at which the mobile computing device 100 touch-screen display 110 is at the correct distance from the subject’s eyes.
[0087] In block 230 the mobile computing device 100 is used to determine a final far-point distance, and the processor 140 converts the results of this test to the spherical equivalent refraction, for example using Equations 1 and 2. A final far-point distance is tested and/or determined using a duochrome eye chart 130, such as shown in Figure 3 and/or Figure 10, as displayed on the display 110 of the mobile computing device 100. In the embodiment shown in Figure 3, half of the chart 131 has a red background and the other half 132 has a green background, both over a larger blue background. The duochrome test is based on the principle that the eye refracts shorter wavelength light (e.g. green) more strongly than longer wavelength light (e.g. red). Myopia is caused by the refractive power of the eye being too strong. Therefore, an uncorrected or undercorrected myopic eye would see the green half of the chart as less dark, bold, or clear, and the red half of the chart as darker, bolder, or sharper. Given this principle, other color pairs can be used, for example red and blue. As shown in Figure 3, the chart halves 131 and 132 are labeled by large bold symbols, such as the numbers 1 and 2, so that the subject can report these numbers even if they are colorblind and cannot distinguish red from green in the red/green implementation. Note that the duochrome principle is based on the wavelength dependence of refractive index in the media of the eye and therefore is not affected by the photopigment deficit that causes color blindness. In additional embodiments, a third color can be used for the background in the chart to help minimize the accommodative response and improve accuracy as described previously (see, for example, Figure 3). Additionally, a fogging lens can be used on the fellow eye to also minimize accommodative response, and these techniques can be used together.
Fogging of the eye contralateral to the test eye is used to help relax accommodation and improve the accuracy of RDR. This is particularly helpful for non-presbyopic subjects (usually those younger than 45 years of age). Fogging is preferably done by placing a lens over the fellow eye that is more plus than the lens used on the test eye. For example, in the most common situation, when the test eye has low myopia, no lens is put over the test eye and a +3 D lens is placed over the fellow eye. In another example, when the test eye has low hyperopia, a +3 D lens is placed over the test eye and a +6 D lens is placed over the fellow eye. In yet another example, when the test eye has moderate myopia, a -3 D lens is placed over the test eye and no lens is placed over the fellow eye.
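All three fogging examples above are consistent with the fellow-eye lens being 3 D more plus than the test-eye lens. Under that assumption (the text gives examples, not an explicit general rule, so the offset is treated here as a parameter), lens selection could be sketched as:

```python
def fogging_lenses(test_eye_lens_d: float, fog_offset_d: float = 3.0) -> dict:
    """Suggest a fellow-eye fogging lens for a given test-eye lens.

    Assumption (inferred from the examples in the text): the fellow eye
    receives a lens `fog_offset_d` diopters more plus than the test eye.
    """
    return {
        "test_eye": test_eye_lens_d,
        "fellow_eye": test_eye_lens_d + fog_offset_d,
    }
```

For example, `fogging_lenses(0.0)` reproduces the low-myopia case (no test-eye lens, +3 D fellow-eye lens), and `fogging_lenses(-3.0)` the moderate-myopia case.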
[0088] The duochrome chart 130 is first displayed on the mobile computing device 100 display 110 at the far-point distance FP matching the preliminary SE. Because the SE is typically specified in 1/8 D increments, the FP can be determined using the conversion chart in Table 1, which can be contained and/or stored within memory of the mobile computing device 100 and queried by processor 140. Table 1 starts from the distance of 4 meters because a longer distance is typically not practical due to consideration of room size and display size. This would also be the default distance if no preliminary refraction was available. However, this distance and the associated table can be modified or augmented for longer distances as needed. Table 1 ends at 0.32 m because a closer distance is difficult to maintain accurately and also due to display resolution considerations. However, this distance and the associated table can be modified or augmented for shorter distances as needed. [0089] Table 1. Conversion between spherical equivalent power and far-point distance
Spherical Equivalent (D)   Far-point (m)      Spherical Equivalent (D)   Far-point (m)
-0.25                      4.00               -1.75                      0.57
-0.38                      2.67               -1.88                      0.53
-0.50                      2.00               -2.00                      0.50
-0.63                      1.60               -2.13                      0.47
-0.75                      1.33               -2.25                      0.44
-0.88                      1.14               -2.38                      0.42
-1.00                      1.00               -2.50                      0.40
-1.13                      0.89               -2.63                      0.38
-1.25                      0.80               -2.75                      0.36
-1.38                      0.73               -2.88                      0.35
-1.50                      0.67               -3.00                      0.33
-1.63                      0.62               -3.13                      0.32
[0090] By way of example, Table 1 can be interrogated by the processor 140 of the mobile computing device 100 as follows. In this example, assuming as a starting point that the subject’s previous eyeglasses refraction was -1.25 D Sph, -1.00 D Cyl ×
180°, this gives an SE of -1.75 D according to Equation 1, equivalent to an FP of 0.57 m according to Table 1. The duochrome chart 130 would then be displayed initially at a 0.57 m distance. The mobile computing device 100 continually monitors the test distance between the mobile computing device 100 and the subject’s eyes using the distance sensing module 120. The mobile computing device 100 is prompted by the processor 140, based on the distance measurement returned by the distance sensing module 120, to give either a visual or audio cue as to when or if the subject is at the correct distance. For example, a chime or a voiced command such as “move closer” or “move away” through its speaker 141 (see, for example, Figure 3), or a visual signal such as colors or other graphics, indicates to the operator or subject to “move closer” or “move away” until the mobile computing device 100 and the subject’s eye(s) are at the correct distance, at which point a chime (or other sound) or visual cue is used to signal the operator to stop.
[0091] In certain embodiments, the mobile computing device 100 is prompted by the processor 140 to query the subject as to which chart looks darker, bolder, or sharper, for example, “which looks darker, bolder, or sharper, the letters on the red chart or the green chart?” This can be done visually or audibly. In certain embodiments, the operator can double-tap or otherwise indicate the chart 131 or 132 (Figure 3) that appears darker, bolder, or sharper to the subject on the mobile computing device 100 display 110. Conversely, the operator can click, double-tap, or otherwise indicate the chart 131 or 132 (Figure 3) that appears less dark, bold, or sharp to the subject on the mobile computing device 100 display 110. In certain embodiments, the choice can be made by oral indication, for example by speaking such that voice recognition software on the mobile computing device 100 receives the indication, for example via microphone 146. If the subject indicates the red and green charts 131 and 132 appear equally dark, bold, or sharp, then the operator may click or otherwise make the indication on both charts 131 and 132 to indicate that both are equal. If the subject indicates he or she cannot see the letters at all, then the operator may tap the “cannot see” button 133 on the screen or make this oral indication as described above. If the green chart 132 appears sharper, or if both appear equal, this indicates that the mobile computing device 100 should be moved further away from the subject. If the red chart 131 appears darker, bolder, or sharper, or if the subject cannot see the letters on the chart 130, this indicates that the mobile computing device 100 should be moved closer to the subject.
In certain embodiments, this distance adjustment is done using a bracketing search algorithm that starts in ½ D steps and refines to 1/8 D, for example as given in Figures 8 and 9. If the subject cannot read the optotype even at the closest distance, then a larger optotype can be used and the test repeated. In the embodiment shown, the size of the optotype is listed (151) on the display 110. In embodiments, the duochrome chart 130 and the optotype within can be made larger by tapping the up-arrow 152 or made smaller by tapping the down-arrow 153. If the subject can see the optotype and still indicates that the red chart appears sharper at the closest distance, then test glasses with stronger minus or weaker plus power (Table 2) should be used and the far-point test restarted at the furthest distance (i.e., 4 m).
[0092] In this example, the subject says the characters on the red chart 131 are darker, bolder, or sharper at 0.57 m (-1.75 D), so the processor 140 of the mobile computing device 100 changes the test distance to 0.44 m (-2.25 D, an increment of -0.50 D from -1.75 D), using Table 1. Again, the subject says the characters on the red chart 131 appear darker, bolder, or sharper, so the processor 140 of the mobile computing device 100 changes the test distance to 0.36 m (-2.75 D), using Table 1.
At this point the subject says both charts 131, 132 look equally dark, bold, or sharp. Because the search direction has reversed, the search increment will now be successively reduced to ¼ D and then 1/8 D. So the processor 140 of the mobile computing device 100 changes the test distance to 0.40 m (-2.50 D, an increment of 0.25 D from -2.75 D), using Table 1. The subject says both charts look equally dark, bold, or sharp. So the processor 140 of the mobile computing device 100 changes the final test distance to the 0.42 m (-2.38 D, an increment of 0.13 D from -2.50 D) endpoint. The subject says the characters on the red chart 131 look darker, bolder, or sharper. So the processor 140 of the mobile computing device 100 determines that the FP is 0.40 m and the SE is therefore -2.50 D. The FP is defined as the furthest distance at which characters on the red and green halves of the duochrome chart 130 appear equally dark, bold, or sharp to the subject. This result can be an accurate measure of the spherical equivalent power of the eye being tested.
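The bracketing search traced in paragraphs [0091]-[0092] can be sketched as follows. This is an illustrative reconstruction, not the flowchart of Figures 8 and 9: step toward more minus in ½ D increments while the red half looks sharper; once the direction first reverses, halve the increment to ¼ D and then ⅛ D; and take the far point as the least-minus spherical equivalent at which the two halves looked equal (or the green half looked sharper).

```python
def rdr_far_point_se(red_sharper, start_se=-1.75, min_step=0.125):
    """red_sharper(se) -> True if the red half looks sharper at the
    distance corresponding to spherical equivalent `se` (diopters)."""
    se, step, refining = start_se, 0.50, False
    equal_or_green = []                 # SEs where red was NOT sharper
    for _ in range(50):                 # guard against non-terminating input
        if red_sharper(se):
            direction = -1              # under-minused: move closer (more minus)
        else:
            equal_or_green.append(se)
            direction = +1              # move away (less minus)
        if refining or direction > 0:   # first reversal starts refinement
            if refining and step <= min_step:
                break                   # finished the 1/8 D probe
            refining = True
            step /= 2
        se += direction * step
    return max(equal_or_green)          # furthest distance where halves matched

# Simulated subject from the worked example (true SE of about -2.50 D):
subject = lambda se: se > -2.50 + 1e-9
```

With this simulated subject the function visits -1.75, -2.25, -2.75, -2.50, and -2.38 D, reproducing the example's -2.50 D endpoint.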
[0093] At block 240 the astigmatism axis is determined using the mobile computing device 100. Determination of the astigmatism axis comes after the SE far-point measurement and is typically tested at a distance further than the SE far point to introduce myopic blur. The preferred additional distance is 1 D equivalent, but this is not critical; 1.50 D or 0.50 D can also be used. So to continue the example above, where the far point was found to be 0.40 m corresponding to -2.50 D (Table 1), the axis should be tested at 0.67 m (-1.50 D, 1 D further than -2.50 D).
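The arithmetic for this test distance is a one-line extension of the far-point relationship; a sketch (the function name is ours, and the 1 D default mirrors the preferred additional blur above):

```python
def axis_test_distance_m(se_far_point_d, extra_blur_d=1.0):
    """Axis-test distance: `extra_blur_d` diopters of myopic blur beyond the
    SE far point, e.g. a -2.50 D far point -> test at -1.50 D = 0.67 m."""
    return 1.0 / (abs(se_far_point_d) - extra_blur_d)
```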
[0094] The astigmatism axis is tested using an astigmatism axis test chart on a touch-screen display of the mobile computing device, wherein the astigmatism axis test chart comprises a collection of lines oriented in different radial directions, for example the half-clock test chart 160 displayed on the mobile computing device 100 touch-screen display 110 (Figure 4). Although a half-clock chart is shown, it is contemplated that the astigmatism axis test chart can comprise any collection of lines oriented in different radial directions; examples of such charts include, but are not limited to, clock-type charts or any fraction thereof. The chart 160 is displayed at full size if the test distance is 4 m. For closer test distances, the size of the chart 160 is reduced by the processor 140 of the mobile computing device 100 according to the relative magnification factor shown in Table 3. The chart 160 is composed of lines oriented at angular increments of 1/3 clock hour (20 minutes). The subject is asked which line or two lines appear the sharpest or boldest. If no line appears sharper than any other, then the subject's eye has no astigmatism and the subjective refraction test is complete. In this case the operator taps the "No Astigmatism" button 162 or otherwise makes this indication, for example orally. If one line appears sharpest, then the orientation of that line is the axis of plus cylinder. In this case the operator double-taps on the clock hour:minute numbers next to the line or otherwise makes this indication, for example orally. If two lines appear equally sharp, then the angle that lies between those two lines is the axis of plus cylinder. In this case the operator taps the numbers next to both of those lines or otherwise makes this indication, for example orally.
Since the half-clock display 160 might be too small for the operator to finger-tap at close testing distances, the operator can tap the "Full Size/Test Size" toggle button 161 to make the half-clock display full sized for the purpose of entering the axis, or otherwise make this indication, for example orally. To continue the example, suppose the subject says both the 3 o'clock and 3:20 lines appear sharp. The operator then taps the number 3 and the 3:20 circle button to enter the axis. The clock-hour position is then converted by the processor 140 to degrees for the specification of the astigmatism plus-cylinder axis according to Table 4, which can be stored in the memory of the mobile computing device 100 and queried by the processor 140. Since the axis is between 3:00 and 3:20 in the example, the equivalent axis of plus cylinder is 175 degrees. The axis of minus cylinder is 90 degrees away, which is 85 degrees.
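Table 4's conversion is linear (30° per clock hour, falling from 180° at 3:00 to 0° at 9:00), so it can equally be computed rather than looked up. A sketch, with names of our own choosing:

```python
def plus_cyl_axis_deg(clock_hour):
    """Plus-cylinder axis in degrees for a clock-hour position in [3.00, 9.00],
    reproducing Table 4: 180 degrees at 3:00, minus 30 degrees per hour."""
    return round(180 - (clock_hour - 3.0) * 30)

def minus_cyl_axis_deg(plus_axis_deg):
    """The minus-cylinder axis lies 90 degrees from the plus-cylinder axis."""
    return (plus_axis_deg + 90) % 180
```

For the example, the midpoint of 3:00 and 3:20 is clock hour 3 + 10/60, for which plus_cyl_axis_deg returns 175, and minus_cyl_axis_deg(175) returns the 85-degree minus-cylinder axis.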
[0095] This portion of the method concerns making astigmatism measurements, which involves determining the axis and magnitude of the eye's astigmatism. The axis is determined using a radial chart on the mobile device and having the subject select the radial direction in which the line appears clearer, darker, or bolder. The magnitude is determined starting with this axis and the final distance from the RDR spherical equivalent measurement. Then an RDR method decision tree is used to determine the plus- and minus-axis distances and the resulting power values. The difference in these powers is the astigmatism magnitude.
[0096] Table 4. Conversion between Clock-Hour and Plus-Cylinder Axis in Degrees

Clock Hour   Hour:Min   Degrees      Clock Hour   Hour:Min   Degrees
3.00         3:00       180          6.00         6:00       90
3.17         3:10       175          6.17         6:10       85
3.33         3:20       170          6.33         6:20       80
3.50         3:30       165          6.50         6:30       75
3.67         3:40       160          6.67         6:40       70
3.83         3:50       155          6.83         6:50       65
4.00         4:00       150          7.00         7:00       60
4.17         4:10       145          7.17         7:10       55
4.33         4:20       140          7.33         7:20       50
4.50         4:30       135          7.50         7:30       45
4.67         4:40       130          7.67         7:40       40
4.83         4:50       125          7.83         7:50       35
5.00         5:00       120          8.00         8:00       30
5.17         5:10       115          8.17         8:10       25
5.33         5:20       110          8.33         8:20       20
5.50         5:30       105          8.50         8:30       15
5.67         5:40       100          8.67         8:40       10
5.83         5:50       95           8.83         8:50       5
                                     9.00         9:00       0
[0097] At block 250 the astigmatism magnitude is determined. The measurement is performed using a duochrome chart 170 of line targets 171 and 172 oriented along the plus axis (Figure 5) and a duochrome chart 180 of line targets 181 and 182 oriented along the minus axis (Figure 6). The far points for both charts are determined using the RDR method steps already described for the SE far-point test. The astigmatism magnitude is the dioptric difference between the plus-axis and minus-axis far points. To continue the example, suppose the far point for the plus axis is 0.50 m (-2.00 D) and the far point for the minus axis is 0.33 m (-3.00 D); then the magnitude of astigmatism is the difference between the two dioptric values: 1.00 D. Since we know the SE is -2.50 D and the plus-cylinder axis is 175 degrees, the manifest refraction is fully measured. The final result can be written as -3.00 D Sph, +1.00 D Cyl x 175° in plus-cylinder notation. Alternatively, it can be written as -2.00 D Sph, -1.00 D Cyl x 85° in minus-cylinder notation. The method to convert between notations is known to those of ordinary skill in the art.
[0098] The method ends at block 260, at which time the results can be displayed and/or communicated. Alternatively, if only one eye has been tested, assuming the subject has more than one eye, the mobile computing device can prompt the operator to test the second eye.
[0099] Example Telemedicine System
[00100] FIG. 7 illustrates a networked telemedicine system 700, in accordance with embodiments herein. The networked telemedicine system 700 includes a network 705 and the mobile computing device 100 in wireless communication therewith. The networked telemedicine system 700 also includes other networked devices 710, which may be in wired or wireless communication with the network 705. In some embodiments, the mobile computing device 100 includes application software with executable instructions configured to transmit information to and receive information from the network 705. The information can be transmitted to and/or received from another device, such as one or more networked devices 710, through the network. In certain examples, the mobile computing device 100 is also capable of transmitting information about an eye exam of a subject to one or more of a doctor, such as an eye doctor, another medical practitioner, or an eyeglasses provider.
[00101] As depicted in FIG. 7, the telemedicine system 700 distributes information to, and receives information from, one or more networked devices 710 through one or more networks 705. According to various embodiments, network 705 may be any network that allows computers to exchange data. In some embodiments, network 705 includes one or more network elements (not shown) capable of physically or logically connecting computers. The network 705 may include any appropriate network, including an intranet, the Internet, a cellular network, a local area network (LAN), a wide area network (WAN), a personal network, or any other such network or combination thereof. Components used for such a system can depend at least in part upon the type of network and/or environment selected. Protocols and components for communicating via such a network are well known and will not be discussed herein in detail. In embodiments, communication over the network 705 is enabled by wired or wireless connections, and combinations thereof. Each network 705 includes a wired or wireless telecommunication means by which network systems (including the mobile computing device 100 and networked devices 710) may communicate and exchange data. For example, each network 705 is implemented as, or may be a part of, a storage area network (SAN), personal area network (PAN), metropolitan area network (MAN), local area network (LAN), wide area network (WAN), wireless local area network (WLAN), virtual private network (VPN), intranet, the Internet, a mobile telephone network, such as Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), cdmaOne, CDMA2000, Evolution-Data Optimized (EV-DO), Enhanced Data Rates for GSM Evolution (EDGE), Universal Mobile Telecommunications System (UMTS), Digital Enhanced Cordless Telecommunications (DECT), Digital AMPS (IS-136/TDMA), or Integrated Digital Enhanced Network (iDEN), Long-Term Evolution (LTE), 3rd generation (3G), 4th generation (4G), and/or 5th generation (5G) mobile networks, a card network, Bluetooth, a near field communication (NFC) network, any form of standardized radio frequency, or any combination thereof, or any other appropriate architecture or system that facilitates the communication of signals, data, and/or messages (generally referred to as data). Throughout this specification, it should be understood that the terms "data" and "information" are used interchangeably herein to refer to text, images, audio, video, or any other form of information that can exist in a computer-based environment.
[00102] In an example embodiment, each network system (including the mobile computing device 100 and networked devices 710) includes a device having a communication component capable of transmitting and/or receiving data over the network 705. For example, each networked device 710 may comprise a server, personal computer, mobile device (for example, notebook computer, tablet computer, netbook computer, personal digital assistant (PDA), video game device, GPS locator device, cellular telephone, smartphone, or other mobile device), a television with one or more processors embedded therein and/or coupled thereto, or other appropriate technology that includes or is coupled to a web browser or other application for communicating via the network 705.
[00103] Although various example methods, apparatus, systems, and articles of manufacture have been described herein, the scope of coverage of the present disclosure is not limited thereto. On the contrary, the present disclosure covers all methods, apparatus, and articles of manufacture fairly falling within the scope of the appended claims either literally or under the doctrine of equivalents. For example, although the above discloses example systems including, among other components, software or firmware executed on hardware, it should be noted that such systems are merely illustrative and should not be considered as limiting. In particular, it is contemplated that any or all of the disclosed hardware, software, and/or firmware components can be embodied exclusively in hardware, exclusively in software, exclusively in firmware or in some combination of hardware, software, and/or firmware.

Claims

What is claimed is:
1. A computer-implemented method for conducting a Rapid Deductive Refraction (RDR) stepwise eye exam of a subject's eye with a mobile computing device, comprising:
initiating, using the mobile computing device, an eye exam for a subject; presenting to the subject, using a screen of the mobile computing
device, one or more duochrome target images at defined distances that
correspond to one or more diopter powers;
querying, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome target images; entering and/or storing the results of the query in memory of the mobile computing device; and
using the results of the query to determine a refractive value of an eye of the subject.
2. The computer-implemented method of claim 1 , wherein the one or more duochrome target images are a sequence of duochrome images selected from a decision tree based on the subjective responses from the subject.
3. The computer-implemented method of claim 1 , wherein the subjective responses from the subject are at least in part used to select one of a plurality of predefined decision trees for final refraction determination.
4. The computer-implemented method of claim 1 , wherein the duochrome target image comprises a duochrome chart with a range in
optotypes or pattern size and/or a range in visual angle.
5. The computer-implemented method of claim 4, wherein the range in optotypes is from logMAR 0.0 (20/20) to logMAR 0.50 (20/63).
6. The computer-implemented method of claim 1 , further comprising soliciting, from the subject, a response of darker, bolder, or sharper in response to the query.
7. The computer-implemented method of claim 1 , wherein the method is used to measure a spherical equivalent refraction to establish a starting-point refraction value for manifest refraction or retinoscopy.
8. The computer-implemented method of claim 1 , further comprising:
monitoring, using a distance-sensing module of the mobile computing device, a distance from the mobile computing device to the subject to set a correct distance for each duochrome target image at the defined distances.
9. The computer-implemented method of claim 1 , further comprising:
monitoring, using a distance-sensing module of the mobile computing device, a distance from the mobile computing device to the subject, and
continuously adjusting an optotype size for a correct measurement.
10. The computer-implemented method of claim 1 , wherein the duochrome target image comprises a first color field and a second, different color field where the first color field has a color with a shorter wavelength light than a color of the second color field.
11. The computer-implemented method of claim 10, wherein the duochrome target image includes a third background color that has a shorter wavelength than the first color field and the second color field to help minimize accommodation of the subject.
12. The computer-implemented method of claim 1 , wherein the duochrome target image further includes contrasting lines to help minimize accommodation of the subject.
13. The computer-implemented method of claim 1 , wherein a fogging lens with positive dioptric power is placed in front of a fellow eye to help minimize accommodation of the subject.
14. The method of claim 1 , wherein a photorefractor is used to expand a refractive range of the mobile computing device.
15. The computer-implemented method of claim 1 , wherein the method is used to determine one or more of an astigmatism axis, an astigmatism power, a spherical refraction value, and a cylindrical refraction value.
16. The computer-implemented method of claim 1 , further comprising entering, using the mobile computing device, a preliminary refraction value in memory of the mobile computing device, wherein the preliminary refraction value comprises a manifest refraction or an eye glass prescription value.
17. The computer-implemented method of claim 2, wherein the sequence of duochrome images is selected from the decision tree using a bracketing search algorithm.
18. The computer-implemented method of claim 1 , further comprising determining visual acuity using the mobile computing device, the method further comprising:
presenting to the subject, using a screen of the mobile computing
device, a low-contrast visual acuity chart;
querying the subject as to a smallest sized optotype the subject can read correctly at the visual acuity test distance;
entering and/or storing a response to the query in memory of the mobile computing device; and
storing the visual acuity based on the visual acuity test distance and the smallest readable optotype size.
19. The computer-implemented method of claim 18, further
comprising:
performing the preliminary visual acuity tests at two distances to
determine if the subject eye is nearsighted or farsighted.
20. The computer-implemented method of claim 1 , further
comprising:
determining an astigmatism axis using the mobile computing device, the method further comprising:
displaying, using the mobile computing device, an astigmatism axis test chart on a display of the mobile computing device, wherein the astigmatism axis test chart comprises a collection of lines orientated in different radial directions;
querying the subject as to which line(s) on the astigmatism axis test chart appear the sharpest; and
entering and/or storing a response to the query in memory of the mobile computing device thereby determining the astigmatism axis.
21. The computer-implemented method of claim 1 , further
comprising determining an astigmatism magnitude using the mobile computing device, the method further comprising:
displaying, using the mobile computing device, a duochrome chart of line targets oriented along a plus axis;
querying, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome chart of line targets;
storing the results of the query in memory of the mobile computing device; displaying, using the mobile computing device, a duochrome chart of line targets oriented along a minus axis;
querying, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome chart of line targets;
storing the results of the query in memory of the mobile computing device; and determining the dioptric difference between the plus-axis and the minus-axis, thereby determining the astigmatism magnitude.
22. The computer-implemented method of claim 1 , wherein the
mobile computing device is a smart phone.
23. The computer-implemented method of claim 22, wherein the smart phone is an iPhone.
24. The computer-implemented method of claim 1 , wherein the
mobile computing device is in communication with a network.
25. The computer-implemented method of claim 24, wherein the network is a telecommunications network.
26. The computer-implemented method of claim 1 , further comprising:
initiating, by the mobile computing device, a telemedicine session.
27. The computer-implemented method of claim 1 , further
comprising: placing, over the eye of the subject being tested, a lens having a spherical diopter power more positive than the preliminary expected spherical equivalent refraction of the eye of the subject.
28. A non-transitory computer-readable storage medium with an executable program stored thereon for conducting a Rapid Deductive
Refraction (RDR) stepwise eye exam of a subject’s eye with a mobile
computing device, wherein the program instructs a microprocessor to perform the steps of:
initiating, using the mobile computing device, an eye exam for a subject; presenting to the subject, using the mobile computing device, one or more duochrome target images at defined distances that correspond to one or more diopter powers;
querying, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome target images; entering and/or storing the results of the query in memory of the mobile computing device; and
using the results of the query to determine a refractive value of an eye of the subject.
29. The non-transitory computer-readable storage medium of claim 28, wherein the one or more duochrome target images are a sequence of duochrome images selected from a decision tree based on the subjective responses from the subject.
30. The non-transitory computer-readable storage medium of claim 28, wherein the subjective responses from the subject are at least in part used to select one of a plurality of predefined decision trees for final refraction determination.
31. The non-transitory computer-readable storage medium of claim 28, wherein the duochrome target image comprises a duochrome chart with a range in optotypes or pattern size and/or a range in visual angle.
32. The non-transitory computer-readable storage medium of claim 31 , wherein the range in optotypes is from logMAR 0.0 (20/20) to logMAR 0.50 (20/63).
33. The non-transitory computer-readable storage medium of claim 28, wherein the program instructs the microprocessor to further perform the steps of:
monitoring, using a distance-sensing module of the mobile computing device, a distance from the mobile computing device to the subject to set a correct distance for each duochrome target image at the defined distances.
34. The non-transitory computer-readable storage medium of claim 28, wherein the program instructs a microprocessor to further perform the steps of:
monitoring, using a distance-sensing module of the mobile computing device, a distance from the mobile computing device to the subject, and
continuously adjusting an optotype size for a correct measurement.
35. The non-transitory computer-readable storage medium of claim 28, wherein the duochrome target image comprises a first color field and a second, different color field where the first color field has a color with a shorter wavelength light than a color of the second color field.
36. The non-transitory computer-readable storage medium of claim 35, wherein the duochrome target image includes a third background color that has a shorter wavelength than the first color field and the second color field to help minimize accommodation of the subject.
37. The non-transitory computer-readable storage medium of claim 28, wherein the duochrome target image further includes contrasting lines to help minimize accommodation of the subject.
38. The non-transitory computer-readable storage medium of claim
28, wherein a photorefractor is used to expand a refractive range of the mobile computing device.
39. The non-transitory computer-readable storage medium of claim
29, wherein the sequence of duochrome images are selected from the decision tree using a bracketing search algorithm.
40. The non-transitory computer-readable storage medium of claim 28, wherein the program instructs a microprocessor to further perform the steps of:
presenting to the subject, using a screen of the mobile computing device, a low-contrast visual acuity chart;
querying the subject as to a smallest sized optotype the subject can read correctly at the visual acuity test distance;
entering and/or storing a response to the query in memory of the mobile computing device; and storing the visual acuity based on the visual acuity test distance and the smallest readable optotype size.
41. The non-transitory computer-readable storage medium of claim
40, wherein the program instructs a microprocessor to further perform the
steps of:
performing the preliminary visual acuity tests at two distances to
determine if the subject eye is nearsighted or farsighted.
42. The non-transitory computer-readable storage medium of claim
28, wherein the program instructs a microprocessor to further perform the
steps of:
displaying, using the mobile computing device, an astigmatism axis test chart on a display of the mobile computing device, wherein the astigmatism axis test chart comprises a collection of lines orientated in different radial directions;
querying the subject as to which line(s) on the astigmatism axis test chart appear the sharpest; and
entering and/or storing a response to the query in memory of the mobile computing device thereby determining the astigmatism axis.
43. The non-transitory computer-readable storage medium of claim
28, wherein the program instructs a microprocessor to further perform the
steps of:
displaying, using the mobile computing device, a duochrome chart of line targets oriented along a plus axis;
querying, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome chart of line targets;
storing the results of the query in memory of the mobile computing device; displaying, using the mobile computing device, a duochrome chart of line targets oriented along a minus axis;
querying, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome chart of line targets;
storing the results of the query in memory of the mobile computing device; and determining the dioptric difference between the plus-axis and the minus-axis, thereby determining the astigmatism magnitude.
44. The non-transitory computer-readable storage medium of claim
28, wherein the program instructs a microprocessor to further perform the
steps of initiating, by the mobile computing device, a telemedicine session.
45. An apparatus for conducting a Rapid Deductive Refraction (RDR) stepwise eye exam of a subject's eye, the apparatus comprising a mobile computing device, comprising:
a display; and
a processor coupled to memory, wherein the processor is programmed to:
initiate an eye exam for a subject;
present one or more duochrome target images at defined distances that correspond to one or more diopter powers;
query the subject for subjective responses as to the quality of individual color fields in the duochrome target images;
store the results of the query in the memory of the mobile computing device; and
use the results of the query to determine a refractive value of an eye of the subject.
46. The apparatus of claim 45, wherein the one or more duochrome target images are a sequence of duochrome images selected from a decision tree based on the subjective responses from the subject.
47. The apparatus of claim 45, wherein the subjective responses from the subject are at least in part used to select one of a plurality of predefined decision trees for final refraction determination.
48. The apparatus of claim 45, wherein the duochrome target image comprises a duochrome chart with a range in optotypes or pattern size and/or a range in visual angle.
49. The apparatus of claim 48, wherein the range in optotypes is from logMAR 0.0 (20/20) to logMAR 0.50 (20/63).
50. The apparatus of claim 45, wherein the processor is programmed to solicit, from the subject, a response of darker, bolder, or sharper in response to the query.
51. The apparatus of claim 45, wherein the processor is programmed to monitor, using a distance-sensing module of the mobile computing device, a distance from the subject to the mobile computing device to set a correct distance for each of the duochrome target images at the defined distances.
52. The apparatus of claim 45, wherein the processor is programmed to monitor, using a distance-sensing module of the mobile computing device, a distance from the subject to the mobile computing device, and
adjust an optotype size for a correct measurement.
53. The apparatus of claim 45, wherein the duochrome target image comprises a first color field and a second, different color field where the first color field has a color with a shorter wavelength light than a color of the second color field.
54. The apparatus of claim 53, wherein the duochrome target image includes a third background color that has a shorter wavelength than the first color field and the second color field to help minimize accommodation of the subject.
55. The apparatus of claim 45, wherein the duochrome target image further includes contrasting lines to help minimize the subject’s
accommodation.
56. The apparatus of claim 45, wherein a photorefractor is used to expand a refractive range of the mobile computing device.
57. The apparatus of claim 45, wherein the apparatus is used to determine one or more of an astigmatism axis, an astigmatism power, a spherical refraction value, and a cylindrical refraction value.
58. The apparatus of claim 45, wherein the processor is programmed to: enter a preliminary refraction value in memory of the mobile computing device, wherein the preliminary refraction value comprises a manifest refraction or an eye glass prescription value.
59. The apparatus of claim 46, wherein the sequence of duochrome images is selected from the decision tree using a bracketing search algorithm.
60. The apparatus of claim 45, wherein the processor is programmed to:
present to the subject, using a screen of the mobile computing device, a low- contrast visual acuity chart;
query the subject as to a smallest sized optotype the subject can read correctly at the visual acuity test distance;
enter and/or store a response to the query in memory of the mobile computing device; and
store the visual acuity based on the visual acuity test distance and the smallest readable optotype size.
61. The apparatus of claim 60, wherein the processor is programmed to:
perform the preliminary visual acuity tests at two distances to
determine if the subject eye is nearsighted or farsighted.
62. The apparatus of claim 45, wherein the processor is programmed to: display an astigmatism axis test chart on a touch-screen display of the mobile computing device, wherein the astigmatism axis test chart comprises a collection of lines orientated in different radial directions;
query the subject as to which line(s) on the astigmatism axis test chart appear the sharpest;
enter and/or store a response to the query in memory of the mobile computing device thereby determining the astigmatism axis.
63. The apparatus of claim 45, wherein the processor is programmed to: display, using the mobile computing device, a duochrome chart of line targets oriented along a plus axis; query, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome chart of line targets;
store the results of the query in memory of the mobile computing device; display, using the mobile computing device, a duochrome chart of line targets oriented along a minus axis;
query, using the mobile computing device, the subject for subjective responses as to the quality of individual color fields in the duochrome chart of line targets;
store the results of the query in memory of the mobile computing device; and determine the dioptric difference between the plus-axis and the minus-axis, thereby determining the astigmatism magnitude.
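The final step of claim 63 is simple arithmetic: the cylinder magnitude is the dioptric difference between the duochrome endpoints found along the two meridians. The example values below are illustrative, not from the patent.

```python
# Illustrative arithmetic for claim 63's last step: if the duochrome
# endpoint along the minus axis is -1.50 D and along the plus axis is
# -2.75 D, the astigmatism magnitude is their dioptric difference.
def astigmatism_magnitude(plus_axis_endpoint_d, minus_axis_endpoint_d):
    return abs(plus_axis_endpoint_d - minus_axis_endpoint_d)
```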
64. The apparatus of claim 45, wherein the mobile computing device is a smart phone.
65. The apparatus of claim 64, wherein the smart phone is an
iPhone.
66. The apparatus of claim 45, wherein the mobile computing device is in communication with a network.
67. The apparatus of claim 66, wherein the network is a
telecommunications network.
68. The apparatus of claim 45, wherein the processor is programmed to: initiate a telemedicine session.
PCT/US2018/061694 2017-11-17 2018-11-16 Smartphone-based measurements of the refractive error in an eye WO2019099952A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762588304P 2017-11-17 2017-11-17
US62/588,304 2017-11-17

Publications (1)

Publication Number Publication Date
WO2019099952A1 true WO2019099952A1 (en) 2019-05-23

Family

ID=66539932

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/061694 WO2019099952A1 (en) 2017-11-17 2018-11-16 Smartphone-based measurements of the refractive error in an eye

Country Status (1)

Country Link
WO (1) WO2019099952A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130128229A1 (en) * 2011-11-21 2013-05-23 Icheck Health Connection, Inc. Video game to monitor retinal diseases
US20140211166A1 (en) * 2011-08-09 2014-07-31 ESSILOR INTERNATIONAL (Compagnie Generate d'Optiqu Device for determining a group of vision aids suitable for a person
US20160120402A1 (en) * 2013-06-06 2016-05-05 Ofer Limon System and method for measurement of refractive error of an eye based on subjective distance metering
US9380938B2 (en) * 2011-09-08 2016-07-05 Gobiquity, Inc. System and methods for documenting and recording of the pupillary red reflex test and corneal light reflex screening of the eye in infants and young children
KR101784599B1 (en) * 2016-04-08 2017-10-11 이선구 (Eye Measuring System and Operation Method thereof

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112043230A (en) * 2019-06-07 2020-12-08 斯派克斯有限公司 Eye testing
WO2020243771A1 (en) * 2019-06-07 2020-12-10 SPEQS Limited Eye test
JP7166473B2 (en) 2019-06-07 2022-11-07 スペックス リミテッド eye examination
JP2022526867A (en) * 2019-06-07 2022-05-26 スペックス リミテッド Eye examination
AU2020286342B2 (en) * 2019-06-07 2021-07-22 SPEQS Limited Eye test
USD938485S1 (en) 2019-09-17 2021-12-14 Lombart Brothers, Inc. Display screen or portion thereof with graphical user interface
US11779202B2 (en) 2019-09-17 2023-10-10 Lombart Brothers, Inc. Systems and methods for automated subjective refractions
USD938986S1 (en) 2019-09-17 2021-12-21 Lombart Brothers, Inc. Display screen or portion thereof with graphical user interface
USD1012124S1 (en) 2019-09-17 2024-01-23 Lombart Brothers, Inc. Display screen or portion thereof with graphical user interface
USD986277S1 (en) 2019-09-17 2023-05-16 Lombart Brothers, Inc. Display screen or portion thereof with graphical user interface
USD986278S1 (en) 2019-09-17 2023-05-16 Lombart Brothers, Inc. Display screen or portion thereof with graphical user interface
USD986922S1 (en) 2019-09-17 2023-05-23 Lombart Brothers, Inc. Display screen or portion thereof with graphical user interface
USD1012123S1 (en) 2019-09-17 2024-01-23 Lombart Brothers, Inc. Display screen or portion thereof with graphical user interface
WO2021078880A1 (en) * 2019-10-24 2021-04-29 Essilor International Method and system for determining a prescription for an eye of a person
EP3811849A1 (en) * 2019-10-24 2021-04-28 Essilor International Method and system for determining a prescription for an eye of a person
EP3940449A1 (en) * 2020-07-15 2022-01-19 Essilor International System and method for determining a rounded value of an optical feature of an ophtalmic lens adapted to provide a dioptric correction for improving the vision of a subject
WO2022013231A1 (en) * 2020-07-15 2022-01-20 Essilor International System and method for determining a rounded value of an optical feature of an ophthalmic lens adapted to provide a dioptric correction for improving the vision of a subject
WO2022082247A1 (en) * 2020-10-21 2022-04-28 Bischel Roland Computer-implemented method for optometric color testing
GB2612366A (en) * 2021-11-01 2023-05-03 Ibisvision Ltd Method and system for eye testing
WO2023148372A1 (en) * 2022-02-06 2023-08-10 Visionapp Solutions S.L. A computer-implemented systems and methods for interactively measuring either or both sides of the interval of clear vision of the eye

Similar Documents

Publication Publication Date Title
WO2019099952A1 (en) Smartphone-based measurements of the refractive error in an eye
Mohammadpour et al. Smartphones, tele-ophthalmology, and VISION 2020
US20110267577A1 (en) Ophthalmic diagnostic apparatus
US11278200B2 (en) Measurement method for the determination of a value of a visual correction need for near vision of an individual
US20100292999A1 (en) Ophthalmic diagnostic apparatus
WO2018107108A1 (en) Method for visual field perimetry testing
CN109645953B (en) Visual detection and training method and device and VR equipment
US11129526B2 (en) Devices, method, and computer programs for determining the refraction of the eye
CN113840566A (en) Apparatus, system and method for determining one or more parameters of refractive error of an eye under test
KR102304369B1 (en) SYSTEM AND METHOD FOR EXAMINATING ophthalmic using VR
US20220151488A1 (en) Computer-implemented method and system for interactively measuring ocular refractive errors, addition and power of reading glasses
US11744462B2 (en) Head-mounted vision detection equipment, vision detection method and electronic device
CN111479104A (en) Method for calculating line-of-sight convergence distance
CN112205960B (en) Vision monitoring method, system, management end and storage medium
CN116634920A (en) Subjective refraction inspection system
US11256110B2 (en) System and method of utilizing computer-aided optics
JP2023517521A (en) Systems and associated methods for determining subjective values of optical characteristics of at least corrective lenses fitted to a subject's eye
EP4029455A1 (en) Head-position sway measuring device, head-position sway measuring method, and biological information acquisition system using said device and method
KR102204112B1 (en) Diagnostic method of bppv using pupil and iris
KR102189783B1 (en) Diagnosis name marking method of bppv
CN116171124A (en) Method for determining a near point, method for determining a near point distance, method for determining a sphere power and method for producing an ophthalmic lens, and corresponding mobile terminal and computer program
WO2023148372A1 (en) A computer-implemented systems and methods for interactively measuring either or both sides of the interval of clear vision of the eye
US20230363637A1 (en) Remote subjective refraction techniques
US20210212560A1 (en) Device, method and booth for automatic determination of the subjective ocular refraction of a patient

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18879422

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18879422

Country of ref document: EP

Kind code of ref document: A1