WO2017108952A1 - System and method for dynamically adjusting a visual acuity test - Google Patents


Info

Publication number
WO2017108952A1
Authority
WO
WIPO (PCT)
Prior art keywords
subject
characters
visual acuity
display
distance
Application number
PCT/EP2016/082186
Other languages
French (fr)
Inventor
Jennifer Caffarel
Gijs Geleijnse
Original Assignee
Koninklijke Philips N.V.
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2017108952A1 publication Critical patent/WO2017108952A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/02 - Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B3/028 - Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing visual acuity; for determination of refraction, e.g. phoropters
    • A61B3/032 - Devices for presenting test symbols or characters, e.g. test chart projectors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/147 - Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/0016 - Operational features thereof
    • A61B3/0041 - Operational features thereof characterised by display arrangements
    • A61B3/005 - Constructional features of the display
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 - Aspects of display data processing
    • G09G2340/04 - Changes in size, position or resolution of an image
    • G09G2340/045 - Zooming at least part of an image, i.e. enlarging it or shrinking it
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 - Aspects of display data processing
    • G09G2340/14 - Solving problems related to the presentation of information to be displayed
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00 - Specific applications
    • G09G2380/08 - Biomedical applications

Definitions

  • the present disclosure pertains to a method and system for dynamically adjusting a visual acuity test.
  • A visual acuity test is usually performed in an optician's or a physician's office using a specialized chart such as a Snellen chart or a LogMAR chart.
  • The visual acuity test follows a structured setting in accordance with various testing requirements. For example, a Snellen chart or a LogMAR chart is placed twenty feet away from a subject whose long vision eyesight is to be tested. In another example, the Snellen chart or the LogMAR chart is placed sixteen inches from the subject for testing near vision sight. The subject is asked to identify the smallest letters they can read without the use of glasses. Such a test is performed for each eye, after which an assessment of visual acuity can be made.
  • Mobile applications such as “Visual Acuity Test” provide visual acuity test schemes that allow a person to perform a visual acuity test on a subject or to perform a self-test on the mobile device.
  • These test schemes require the subject to be in a fixed position with respect to the device, which may not always be possible given the environmental circumstances in which the test is conducted. Therefore, there exists a need for a simplified visual acuity test that can be set up flexibly to perform a visual acuity test or a self-test when the physician and the specialized settings are not available or not feasible.
  • one or more aspects of the present disclosure relate to a system configured for dynamically adjusting a visual acuity test.
  • the system comprises an interface configured to receive an input request to test visual acuity; a display comprising a first surface configured to display one or more characters for testing the visual acuity; one or more sensors configured to generate a first signal related to a first distance of a subject to the first surface; and at least one processor operatively communicated with the interface and with the one or more sensors.
  • the at least one processor receives the first signal and is configured by machine-readable instructions to determine the first distance between the subject and the first surface based on the first signal; determine a display size of one or more characters for testing the visual acuity of the subject based on the first distance between the subject and the first surface; and cause the one or more characters to be displayed on the first surface.
  • Yet another aspect of the present disclosure relates to a method for dynamically adjusting a visual acuity test with a system.
  • the system comprises an interface, a display having a first surface, one or more sensors, and at least one processor.
  • the method comprises receiving a request to test visual acuity with the interface; communicating the request to test the visual acuity with the at least one processor; generating a first signal related to a first distance of a subject to the first surface with the one or more sensors; receiving the first signal with the at least one processor; determining the first distance between the subject and the first surface based on the first signal; determining a display size of one or more characters for testing the visual acuity of the subject based on the first distance between the subject and the first surface; and causing the one or more characters to be displayed on the first surface.
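  • The sequence of method steps recited above (sense a distance, derive a character size from it, display the characters) can be sketched as follows. The `sensor` and `display` objects and the linear sizing constant are illustrative assumptions for this sketch, not part of the disclosure:

```python
def determine_display_size_mm(distance_mm: float, scale: float = 0.00145) -> float:
    """Illustrative sizing rule: character height grows linearly with the
    measured viewing distance (i.e., a fixed, small visual angle)."""
    return scale * distance_mm


def run_acuity_test(sensor, display):
    """Sketch of the claimed method steps, in order."""
    first_signal = sensor.read()                       # generate a first signal
    distance_mm = sensor.to_distance_mm(first_signal)  # determine the first distance
    size_mm = determine_display_size_mm(distance_mm)   # determine a display size
    display.show_characters(size_mm)                   # cause the characters to be displayed
    return distance_mm, size_mm
```

Any object exposing `read`/`to_distance_mm` and `show_characters` (both hypothetical names) could serve as the sensor and display here.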
  • Still another aspect of the present disclosure relates to a system configured for dynamically adjusting a visual acuity test.
  • the system comprises means for receiving a request from a subject to test visual acuity with a subject interface; means for displaying one or more characters for testing the visual acuity with a first surface; means for generating a first signal related to a first distance of the subject to the first surface with one or more sensors; and means for receiving the first signal and executing machine-readable instructions with at least one processor.
  • the machine-readable instructions comprise instructions for determining the first distance between the subject and the first surface based on the first signal; determining a display size of one or more characters for testing the visual acuity of the subject based on the first distance between the subject and the first surface; and causing the one or more characters to be displayed on the first surface.
  • FIG. 1 illustrates an exemplary configuration for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching.
  • FIG. 2 illustrates another exemplary configuration for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching.
  • FIG. 3 illustrates an exemplary system for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching.
  • FIG. 4 illustrates an exemplary system for dynamically adjusting a visual acuity test in accordance with yet another embodiment of the present teaching.
  • FIG. 5 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching.
  • FIG. 6 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with another embodiment of the present teaching.
  • FIG. 7 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with yet another embodiment of the present teaching.
  • the word “unitary” means a component is created as a single piece or unit. That is, a component that includes pieces that are created separately and then coupled together as a unit is not a “unitary” component or body.
  • the statement that two or more parts or components "engage” one another shall mean that the parts exert a force against one another either directly or through one or more intermediate parts or components.
  • the term “number” shall mean one or an integer greater than one (i.e., a plurality).
  • Directional phrases used herein, such as top, bottom, left, right, upper, lower, front, back, and derivatives thereof, relate to the orientation of the elements shown in the drawings and are not limiting upon the claims unless expressly recited therein.
  • FIG. 1 illustrates an exemplary configuration for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching.
  • a subject 102 is at a distance 110 from interface 104 for a visual acuity self-test.
  • Interface 104 may be implemented on a computing device (not shown) that is capable of displaying a testing chart 106 on interface 104.
  • Testing chart 106 may include one or more characters, the display size of which is determined based on distance 110 between subject 102 and interface 104.
  • a visual acuity measurement 108 may be automatically determined based on distance 110 and the display size of the one or more characters on testing chart 106.
  • the visual acuity measurement of the subject's right eye may be outputted as 2/2 (or other output indicative of normal visual acuity).
  • FIG. 2 illustrates another exemplary configuration for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching.
  • the computing device on which interface 104 is implemented detects the change of the position of subject 102 and determines a new distance between subject 102 and interface 104 (e.g., automatic determination of the new distance responsive to the detected change).
  • the display size of the one or more characters included in testing chart 106 is adjusted based on the newly determined distance.
  • visual acuity measurement 108 is automatically adjusted based on the newly determined distance and the display size of the one or more characters. For example, if subject 102 can read all of the smallest characters displayed on the bottom line (or other portion) of testing chart 106 using the right eye at distance 202, the visual acuity measurement of the subject's right eye may be outputted as 1/1 (or other output indicative of normal visual acuity). In some embodiments, the display size of the one or more characters decreases when subject 102 moves closer to interface 104, and increases when subject 102 moves away from interface 104. Testing chart 106, displaying four lines of characters for visual acuity testing in FIG. 1 and FIG. 2, is one example of an adjusted test chart according to the illustrated embodiments.
  • test chart 106 may be generated to include more or fewer lines of characters for the visual acuity test (and/or to include other arrangements or formats of the characters or other units or symbols) depending on the distance between the subject and the interface as well as a screen resolution of the interface or other criteria.
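  • The relationship described above (smaller characters as the subject approaches, larger as the subject retreats) follows from keeping the visual angle of the character constant. A minimal sketch, assuming the conventional 5-arcminute whole-optotype angle:

```python
import math

def optotype_height_mm(distance_mm: float, arcmin: float = 5.0) -> float:
    """Character height that subtends `arcmin` minutes of arc at the given
    viewing distance; a standard optotype subtends 5 arcmin at the
    threshold of normal acuity."""
    theta_rad = math.radians(arcmin / 60.0)
    return 2.0 * distance_mm * math.tan(theta_rad / 2.0)
```

At 6 m this gives roughly the familiar 8.7 mm height of a 6/6 Snellen letter, and halving the distance halves the required height, matching the behavior described in the text.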
  • FIG. 3 illustrates an exemplary system (300) for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching.
  • System 300 comprises an interface 104, one or more sensors 302, a first surface 336, a recorder 338, a speaker 344, and a processor 304.
  • Interface 104 is configured to receive an input request to test visual acuity.
  • interface 104 is a keyboard, a mouse, or a combination thereof connected to a computing device via one or more wired or wireless connections.
  • interface 104 may be a touch screen of a smart phone, a tablet, a laptop, a desktop, or other computing device.
  • interface 104 may be a wearable device that receives the input request and remotely synchronizes the input request with a smart phone, a tablet, a laptop, a desktop, and/or any other devices to perform the visual acuity test.
  • interface 104 may be a voice recognition component implemented on a smart phone, a tablet, a laptop, a desktop, a wearable device, and/or any other devices that automatically detects a voice input, and determines as to whether the voice input comprises a request to test visual acuity (e.g., via speech recognition, natural language processing, or other techniques).
  • First surface 336 is configured to display one or more characters for testing the visual acuity.
  • first surface 336 is a non-touchable screen separate from interface 104.
  • first surface 336 is a touchable screen capable of receiving input and displaying content, and as such, first surface 336 and interface 104 may be a single integrated component.
  • Interface 104, first surface 336, recorder 338, and speaker 344 in FIG. 3 are for illustrative purposes only, and the present disclosure is not intended to be limiting.
  • the shapes and the dispositions of these components may be determined in accordance with the device manufacturing specification.
  • One or more sensors 302 are configured to detect a position of subject 102 upon receiving the input request at interface 104, and generate a first signal related to a first distance 340 of subject 102 to first surface 336.
  • One or more sensors 302 may be a capacitive displacement sensor, an inductive sensor, an ultrasonic sensor, and/or any type of sensors that can assess the distance.
  • one or more sensors 302 may be used alone or with an addition of a sensing accessory, for example, a marker used by subject 102 as a reference point for the sensor.
  • a built-in camera of a device (for example, a smart phone, a tablet, a laptop, etc.) may be used to estimate the first distance.
  • one or more sensors 302 may be an integrated component of the system. In some other embodiments, one or more sensors 302 may be a separate component connected to the system by wire or wirelessly. As illustrated in FIG. 3, one or more sensors 302 are disposed in front of interface 104 and face subject 102. However, the one or more sensors may be deployed anywhere to perform the distance estimation.
  • Processor 304 is operatively communicated with interface 104, first surface 336, recorder 338, speaker 344, and one or more sensors 302.
  • Processor 304 receives the first signal from one or more sensors 302 and processes information conveyed by the first signal.
  • Processor 304 may include one or more of a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, and/or other mechanisms for electronically processing information.
  • processor 304 may include one or more processing units.
  • the one or more processing units may be physically located within the same device, or processor 304 may represent processing functionality of a plurality of devices operating in coordination. Processor 304 may be configured to execute one or more computer program components.
  • the one or more computer program components comprise a data processing component 306, a test initializing component 308, a voice recognition component 318, a visual acuity determination component 320, a communication component 322, a visual display component 324, and an audio play component 326.
  • Test initializing component 308 may further comprise a distance evaluation component 310, a display properties determination component 312, a character generation component 314, and a measurement scaling component 316.
  • Processor 304 may be configured to execute components 306, 308, 310, 312, 314, 316, 318, 320, 322, 324, and 326 by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor 304.
  • Each of the one or more computer program components comprises a set of algorithms implemented on processor 304 that instructs processor 304 to perform one or more functions related to dynamically adjusting the visual acuity test, and/or other operations.
  • data processing component 306 comprises algorithms implemented on processor 304 that instruct processor 304 to receive output signals from one or more sensors 302 and process data read out from the output signals;
  • distance evaluation component 310 comprises algorithms implemented on processor 304 that instruct processor 304 to estimate a first distance between subject 102 and first surface 336;
  • display properties determination component 312 comprises algorithms implemented on processor 304 that instruct processor 304 to obtain information related to display properties of first surface 336;
  • character generation component 314 comprises algorithms implemented on processor 304 that instruct processor 304 to determine one or more characters for the visual acuity test and a display size of one or more characters based on the estimated first distance and the display properties;
  • measurement scaling component 316 comprises algorithms implemented on processor 304 that instruct processor 304 to scale the visual acuity test measurement in accordance with the display size of the one or more characters.
  • Although components 306, 308, 310, 312, 314, 316, 318, 320, 322, 324, and 326 are illustrated in FIG. 3 as being co-located within a single processing unit, in implementations in which processor 304 includes multiple processing units, one or more of these components may be located remotely from the other components.
  • the description of the functionality provided by the different components 306, 308, 310, 312, 314, 316, 318, 320, 322, 324, and 326 described below is for illustrative purposes, and is not intended to be limiting, as any of components 306, 308, 310, 312, 314, 316, 318, 320, 322, 324, and 326 may provide more or less functionality than is described.
  • processor 304 may be configured to execute one or more additional components that may perform some or all of the functionality attributed below to one of components 306, 308, 310, 312, 314, 316, 318, 320, 322, 324, and 326.
  • Data processing component 306 is configured to receive output signals from one or more sensors 302 and process data read out from the output signals so that reliable information is forwarded to test initializing component 308.
  • Data collected from one or more sensors 302 may sometimes comprise one or more types of interfering signals from the surrounding environment and/or from other sources that affect the accuracy of information read out from the output signals.
  • the collected data related to the proximity of the subject may include reflected and/or scattered sound waves from other objects in the area.
  • Data processing component 306 may be configured to filter out the interfering signals based on one or more algorithms such that data after filtering provides more accurate readings related to the distance of subject 102 relative to first surface 336.
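  • One simple way to realize such filtering is a running median, which suppresses isolated spikes from stray echoes while letting genuine distance changes through. The window size and the choice of filter are assumptions for this sketch; the disclosure does not name a specific algorithm:

```python
from statistics import median

def filter_distance_readings(readings, window: int = 5):
    """Running median over the last `window` samples: isolated spikes from
    stray echoes are suppressed, while sustained distance changes survive."""
    filtered = []
    for i in range(len(readings)):
        start = max(0, i - window + 1)
        filtered.append(median(readings[start:i + 1]))
    return filtered
```

For example, a single outlier reading of 500 mm in a stream near 100 mm is replaced by a value close to its neighbors.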
  • Test initializing component 308 is configured to dynamically set up the visual acuity test for subject 102 based on a first distance between subject 102 and first surface 336. As such, test initializing component 308 may further comprise a distance evaluation component 310, a display properties determination component 312, a character generation component 314, and a measurement scaling component 316.
  • Distance evaluation component 310 is configured to receive the output from data processing component 306 and estimate a first distance between subject 102 and first surface 336.
  • the output from data processing component 306 may comprise a direct measurement of first distance 340.
  • the output from data processing component 306 may include a time interval between sending the signal from the ultrasonic sensor and receiving the echo from the subject.
  • Distance evaluation component 310 determines the first distance based on the time interval and the speed at which the sound waves are transmitted via the ultrasonic sensor.
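  • The ultrasonic relation is that the pulse travels to the subject and back, so the one-way distance is half of speed × echo time. A sketch, assuming sound in dry air at room temperature:

```python
SPEED_OF_SOUND_M_S = 343.0  # dry air at roughly 20 °C

def ultrasonic_distance_m(echo_interval_s: float,
                          speed_m_s: float = SPEED_OF_SOUND_M_S) -> float:
    """Round-trip echo time to one-way distance: the pulse covers the
    distance twice, so divide the path length by two."""
    return speed_m_s * echo_interval_s / 2.0
```

A 10 ms echo interval thus corresponds to a subject about 1.7 m from the sensor.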
  • the output from data processing component 306 may include a plurality of measurements of the capacitance in the area between a probe of the capacitive displacement sensor and the subject.
  • Distance evaluation component 310 can determine the first distance based on the changes of the capacitance in this area.
  • the processed data from data processing component 306 may comprise an image of subject 102.
  • Distance evaluation component 310 may apply an image processing algorithm to extract information related to the first distance based on the subject size in the received image and the parameters configured in the camera.
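  • A common pinhole-camera form of this estimate assumes a reference feature of known physical size (for example, a calibrated marker held by the subject) and a focal length expressed in pixels; the farther the subject, the smaller the feature appears in the image. The function and parameter names here are illustrative:

```python
def distance_from_image_mm(real_height_mm: float,
                           focal_length_px: float,
                           image_height_px: float) -> float:
    """Pinhole-camera estimate: distance = real size × focal length (px)
    divided by the apparent size in the image (px)."""
    return real_height_mm * focal_length_px / image_height_px
```

For instance, a 240 mm reference that spans 200 px under a 1000 px focal length implies a distance of 1200 mm.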
  • any type of sensing technology in addition to the above examples may be employed to determine the proximity of a subject for a visual acuity test, and as such, the signals generated from the sensor indicative of a subject's proximity may be in various formats.
  • Data processing component 306 may be further configured to adapt to the various formats of those signals, and distance evaluation component 310 may be further configured with adaptive schemes and algorithms to calculate the first distance between the subject and the first surface based on the processed sensing signals.
  • the functionalities of data processing component 306 and distance evaluation component 310 may be configured and/or updated remotely from server 332 via network 330.
  • Display properties determination component 312 is configured to obtain information related to display properties of first surface 336.
  • the display properties may include a screen size and a resolution of first surface 336.
  • the display properties may be retrieved directly from configuration information stored in a memory of the computing device (e.g., a tablet, a smart phone, a laptop, a desktop, etc.).
  • Character generation component 314 is configured to determine one or more characters to be displayed for the visual acuity test, and a display size of the one or more characters based on first distance 340 and the display properties of first surface 336. Given the screen resolution of first surface 336, character generation component 314 may select a type of font used to display the characters for the visual acuity test. Further, character generation component 314 may determine a height of the selected font based on the distance between the subject and the first surface and using a font-to-height conversion table. In some embodiments, character generation component 314 may determine the number of lines of characters based on the screen size of first surface 336. For example, character generation component 314 may determine to display four lines of characters on a 13-inch laptop screen and six lines of characters on a 24-inch desktop monitor.
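  • These two decisions (font height from distance, line count from screen size) might be sketched as follows. The pixel-density conversion (96 dpi CSS convention) and the diagonal cut-offs are assumptions that mirror the 13-inch/24-inch example above, not values from the disclosure:

```python
def font_size_pt(target_height_mm: float, px_per_mm: float,
                 px_per_pt: float = 96.0 / 72.0) -> float:
    """Point size that renders a glyph of the requested physical height,
    given the screen's pixel density (a stand-in for the font-to-height
    conversion table mentioned in the text)."""
    return target_height_mm * px_per_mm / px_per_pt

def lines_for_screen(diagonal_inches: float) -> int:
    """Illustrative rule matching the example: four lines on a 13-inch
    screen, six on a 24-inch monitor, fewer on smaller screens."""
    if diagonal_inches >= 24:
        return 6
    if diagonal_inches >= 13:
        return 4
    return 3
```

On a 4 px/mm screen, an 8.7 mm target height corresponds to roughly a 26 pt font under these assumptions.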
  • Measurement scaling component 316 is configured to generate the visual acuity test measurement scales that correspond to the determined display size of the one or more characters.
  • the visual acuity test measurement scales may be automatically generated once the display size of the one or more characters is determined. Referring to FIG. 1, when testing chart 106 is determined to include four lines of characters, the corresponding visual acuity measurement 108 is automatically selected and displayed.
  • one or more sensors 302 may detect a distance change between subject 102 and first surface 336 during a visual acuity test.
  • Distance evaluation component 310 receives the signals from one or more sensors 302 indicative of the distance change, and determines whether the distance change is significant.
  • Distance evaluation component 310 may set a variation threshold, for example, a distance change of 10% with respect to the distance between subject 102 and first surface 336. When a distance change above the variation threshold is detected, distance evaluation component 310 may re-estimate the distance, and subject 102 is signaled that the visual acuity test needs to be reset.
  • one or more sensors 302 may also detect a change in a direction of subject 102 relative to first surface 336, alone or in combination with the distance change. The system may adjust the configuration of the visual acuity test based on the changes being observed.
  • Voice recognition component 318 is configured to recognize the content from received voice input from subject 102. During the visual acuity test, subject 102 reads out the displayed characters. Voice recognition component 318 receives the voice input and recognizes the exact characters that subject 102 reads. Voice recognition component 318 further determines whether the recognized characters match the displayed characters and sends the result to visual acuity determination component 320.
  • Visual acuity determination component 320 is configured to determine a visual acuity measurement based on the result from voice recognition component 318. Referring to FIG. 1,
  • visual acuity determination component 320 presents to subject 102 that the visual acuity measurement of the subject's right eye is 2/2.
  • visual acuity determination component 320 may determine the visual acuity measurement to be 2/2 when the subject can correctly read seven of the nine characters on the bottom line (e.g., depending on the visual acuity measurement scale used).
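  • The pass criterion can be sketched as a position-wise match between the displayed line and the characters the recognizer heard. The 7-of-9 fraction follows the example above; real measurement scales vary:

```python
def passes_line(displayed: str, recognized: str,
                min_fraction: float = 7 / 9) -> bool:
    """Compare the displayed line with the recognized characters position
    by position; the subject passes when the fraction read correctly
    meets the scale's criterion (7 of 9 in the example)."""
    correct = sum(1 for d, r in zip(displayed, recognized) if d == r)
    return correct / len(displayed) >= min_fraction
```

The nine-letter line used here is an arbitrary example, not one taken from a standard chart.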
  • Visual display component 324 is configured to cause one or more characters to be displayed on first surface 336.
  • Visual display component 324 may further display the corresponding visual acuity measurement scales alongside the one or more characters, a measurement result to the subject, an alert message, and/or other information related to the test.
  • Audio play component 326 is configured to play an audio signal to the subject.
  • audio play component 326 may play the characters recognized from the voice input of the subject, a message to the subject requesting confirmation of the recognized characters, a measurement result of the visual acuity test measurement, and/or any communications between the system and the subject.
  • Communication component 322 is configured to perform communications within one or more components of processor 304, and between processor 304 and other components of the system and/or other network components.
  • communication component 322 communicates with server 332 remotely connected to network 330 and downloads one or more software packages from database 334 to modify and/or upgrade the functionalities of one or more components of processor 304.
  • communication component 322 communicates with electronic storage 328 locally connected to processor 304 or database 334 remotely connected to network 330 to retrieve historical information related to past visual acuity tests associated with the subject and/or other subjects, and provide the historical information to a user to determine and/or adjust one or more parameters related to a visual acuity test.
  • the present disclosure contemplates any techniques for communication including but not limited to hard-wired and wireless communications.
  • Electronic storage 328 is configured to electronically store information in electronic storage media.
  • Electronic storage 328 may comprise one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge- based storage media (e.g., EPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media.
  • the electronic storage media of electronic storage 328 may comprise one or both of system storage that is provided integrally (i.e., substantially non-removable) with system 300 and/or removable storage that is removably connectable to system 300 via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.).
  • Electronic storage 328 may store software packages, information related to past visual acuity tests, information related to a plurality of subjects that receive the visual acuity tests, information processed by processor 304, information received via interface 104, and/or other information that enables system 300 to function properly. Data stored in electronic storage 328 may be further uploaded to a remote database 334 for backup and management.
  • Network 330 is configured to transmit information among a plurality of network components. For example, a request inputted via interface 104 is received at server 332 via network 330 to retrieve historical information related to past visual acuity tests on the subject and/or other subjects for analysis. Network 330 forwards an instruction from server 332 to retrieve the requested historical information from electronic storage 328 or database 334.
  • Network 330 may be a single network or a combination of multiple networks.
  • network 330 may be a local area network (LAN), a wide area network (WAN), a public network, a private network, a proprietary network, a Public Switched Telephone Network (PSTN), the Internet, a wireless communication network, a virtual network, and/or any combination thereof.
  • FIG. 4 illustrates an exemplary system for dynamically adjusting a visual acuity test in accordance with yet another embodiment of the present teaching.
  • System 400 may include a second surface 402 and an auxiliary sensor 408 in addition to the system components as illustrated in FIG. 3.
  • the one or more characters for testing a visual acuity test may be further displayed on second surface 402 in place of and/or in addition to first surface 336.
  • Second surface 402 may be any surface capable of displaying the one or more characters with clarity for testing, such as a projector screen, a wall, a TV screen, etc.
  • Auxiliary sensor 408 is configured to detect the proximity of second surface 402 and assist in determining a second distance 406 between second surface 402 and a projecting point.
  • Auxiliary sensor 408 may be one of one or more sensors 302 described above, or a separate sensor that can be coupled to system 300.
  • For example, when the one or more characters are projected on a TV screen via a cable, an Apple TV set-top box, or an Amazon Fire set-top box, etc., the projecting point is the smart phone, tablet, laptop, or desktop from which the one or more characters are projected.
  • the projecting point may be a lens surface of a projector 404 that is connected to system 300 by wire or wirelessly.
  • the display size of the one or more characters on second surface 402 may be further adjusted using geometry, based on second distance 406 between the projecting point and second surface 402 or in conjunction with the display properties of second surface 402.
  • the components of system 400 are for illustrative purposes and the present disclosure is not intended to be limiting. System 400 may be adapted to include more or fewer components depending on the requirements of the visual acuity test.
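The geometry-based size adjustment for the projected test chart can be sketched as follows. This is a minimal illustration under an assumed optical model in which the projected image magnification grows linearly with the throw distance (second distance 406); the `magnification_per_m` constant and the function name are hypothetical and are not specified in the present disclosure.

```python
def source_character_height(target_height_m, throw_distance_m, magnification_per_m=0.5):
    """Height at which to render a character at the projecting point so
    that it appears target_height_m tall on the second surface.

    Assumes linear magnification proportional to the throw distance;
    magnification_per_m is an illustrative optical constant.
    """
    magnification = magnification_per_m * throw_distance_m
    if magnification <= 0:
        raise ValueError("throw distance must be positive")
    return target_height_m / magnification

# Doubling the throw distance doubles the magnification, so the same
# on-wall height requires a source character half as tall.
```

In practice the constant would come from the lens properties of projector 404 or from a one-time calibration against second surface 402.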
  • FIG. 5 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching.
  • the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed.
  • operation 502 is performed by an interface the same as or similar to interface 104 (shown in FIG. 3 and described herein).
  • the request to test the visual acuity is communicated with the at least one processor.
  • operation 504 is performed by a processor the same as or similar to processor 304 (shown in FIG. 3 and described herein).
  • a first signal related to a first distance of a subject to the first surface with the one or more sensors is generated.
  • operation 506 is performed by one or more sensors the same as or similar to one or more sensors 302 (shown in FIG. 3 and described herein).
  • the first signal is received with the at least one processor.
  • operation 508 is performed by a processor the same as or similar to processor 304 (shown in FIG. 3 and described herein).
  • operation 510 is performed by a distance evaluation component the same as or similar to distance evaluation component 310 (shown in FIG. 3 and described herein).
  • a display size of one or more characters for testing the visual acuity of the subject is determined based on the first distance between the subject and the first surface.
  • operation 512 is performed by a character generation component the same as or similar to character generation component 314 (shown in FIG. 3 and described herein).
  • At operation 514, the one or more characters are caused to be displayed on the first surface.
  • operation 514 is performed by a visual display component the same as or similar to visual display component 324 (shown in FIG. 3 and described herein).
  • FIG. 6 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with another embodiment of the present teaching.
  • the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process are illustrated in FIG. 6 and described below is not intended to be limiting.
  • a second signal related to a second distance of a projector relative to a second surface is generated with the one or more sensors. In some embodiments, operation 602 is performed by one or more sensors the same as or similar to one or more sensors 302 (shown in FIG. 4 and described herein).
  • the second signal is received with the at least one processor.
  • operation 604 is performed by a processor the same as or similar to processor 304 (shown in FIG. 4 and described herein).
  • operation 606 is performed by a distance evaluation component the same as or similar to distance evaluation component 310 (shown in FIG. 4 and described herein).
  • At operation 608, the display size of the one or more characters is adjusted based on the second distance.
  • operation 608 is performed by a character generation component the same as or similar to character generation component 314 (shown in FIG. 4 and described herein).
  • At operation 610, the one or more characters are projected onto the second surface.
  • operation 610 is performed by a projector the same as or similar to projector 404 (shown in FIG. 4 and described herein).
  • FIG. 7 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with yet another embodiment of the present teaching.
  • the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process are illustrated in FIG. 7 and described below is not intended to be limiting.
  • an audio input is received from a subject.
  • operation 702 is performed by an interface and/or a recorder the same as or similar to interface 104 and recorder 338 (shown in FIG. 3 and described herein).
  • At operation 704, one or more characters are identified from the audio input.
  • operation 704 is performed by a voice recognition component the same as or similar to voice recognition component 318 (shown in FIG. 3 and described herein).
  • operation 706 is performed by a voice recognition component the same as or similar to voice recognition component 318 (shown in FIG. 3 and described herein).
  • a visual acuity measurement is determined based on the decision.
  • operation 708 is performed by a visual acuity determination component the same as or similar to visual acuity determination component 320 (shown in FIG. 3 and described herein).
  • At operation 710, the visual acuity measurement is caused to be presented to the subject.
  • operation 710 is performed by a visual display component the same as or similar to visual display component 324 (shown in FIG. 3 and described herein).
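The matching step in the FIG. 7 process, in which characters recognized from the audio input are compared against the displayed characters, might be sketched as follows. The position-by-position comparison and the 0.6 pass threshold are illustrative assumptions, not requirements of the present disclosure.

```python
def score_line(displayed, recognized):
    """Fraction of displayed characters that were correctly read aloud,
    compared position by position (case-insensitive)."""
    displayed, recognized = displayed.upper(), recognized.upper()
    hits = sum(1 for d, r in zip(displayed, recognized) if d == r)
    return hits / len(displayed)

def line_passed(displayed, recognized, threshold=0.6):
    """Credit a line when at least `threshold` of its characters match;
    the threshold value is an assumption for illustration."""
    return score_line(displayed, recognized) >= threshold

# e.g. reading "EDFCZB" for the displayed line "EDFCZP" scores 5/6,
# which is above the illustrative 0.6 pass threshold.
```

The visual acuity measurement would then be derived from the smallest line of characters that the subject still passes.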
  • Although the characters are shown as letters, in alternative embodiments they can be numbers, symbols, pictures, and/or objects that can be visualized and identified by the subject.
  • the present invention in its broadest sense is not limited with regard to the type of character used.
  • the above-illustrated embodiments dynamically adjust the one or more characters to be displayed in a visual acuity test chart based on the distance between a subject and a display surface.
  • the present disclosure may also enable a manual input of the parameters that are required to set up the visual acuity test.
  • the one or more sensors may also detect if the subject is wearing glasses or contact lenses using image processing techniques or with the help of other sensors (e.g., to detect reflection of the glasses). Based on the detected information, the subject may be prompted to remove the visual aids.
  • the user may be prompted to enter the prescription of the glasses or contact lenses to calculate the display sizes of the one or more characters for display and adjust the visual acuity measurements.
  • the above-illustrated embodiments may also be adapted to allow a better estimation of far-sight vision by projecting the test chart on a wall rather than visualizing it with depth rendering on a screen.
  • Either the distance between the projector and the projection screen has to be known and entered manually, or the distance from the wall to the projector may be automatically estimated, similarly to the distance between the subject and the computing device, taking into account the optical magnifying properties of the projecting device.
  • indications may be given to the subject if the set-up is sub-optimal for carrying out the test; e.g., limitations such as screen size and resolution impose a boundary condition on the minimum distance that a subject should be from the testing device.
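The boundary condition on minimum distance can be made concrete with a small calculation. The sketch below assumes Snellen-style optotypes subtending 5 arcminutes and requires the smallest letter to span at least 10 pixels (roughly 2 pixels per stroke of a 5x5 optotype grid); both constants are illustrative, not taken from the present disclosure.

```python
import math

ARCMIN = math.pi / (180 * 60)  # one minute of arc, in radians

def min_test_distance(px_per_m, min_px=10, mar=1.0):
    """Minimum subject-to-screen distance (metres) at which the
    smallest optotype (5 arcmin x MAR) still spans min_px pixels.

    px_per_m: display resolution in pixels per metre.
    """
    angle = 5 * mar * ARCMIN
    # required physical height: h = 2 * d * tan(angle / 2) >= min_px / px_per_m
    return (min_px / px_per_m) / (2 * math.tan(angle / 2))

# A 6000 px/m screen (e.g. 1080 px over 0.18 m) needs roughly 1.15 m.
```

A subject closer than this limit could be prompted to move back before the test begins.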
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • the word “comprising” or “including” does not exclude the presence of elements or steps other than those listed in a claim.
  • the word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements.
  • In any device claim enumerating several means, several of these means may be embodied by one and the same item of hardware.
  • the mere fact that certain elements are recited in mutually different dependent claims does not indicate that these elements cannot be used in combination.

Abstract

The present disclosure pertains to a method and system configured for dynamically adjusting a visual acuity test. In some embodiments, the system comprises an interface, a display, one or more sensors, at least one processor, and/or other components. The visual acuity test is dynamically adjusted by generating a first signal related to a first distance of a subject to a first surface of the display, determining the first distance between the subject and the first surface based on the first signal, determining a display size of one or more characters for testing the visual acuity of the subject based on the first distance between the subject and the first surface, and displaying the one or more characters on the first surface.

Description

SYSTEM AND METHOD FOR DYNAMICALLY ADJUSTING
A VISUAL ACUITY TEST
BACKGROUND
1. Field
[01] The present disclosure pertains to a method and system for dynamically adjusting a visual acuity test.
2. Description of the Related Art
[02] A visual acuity test is usually performed in an optician's or a physician's office using a specialized chart such as a Snellen chart or a LogMAR chart. The visual acuity test follows a structured setting in accordance with various testing requirements. For example, a Snellen chart or a LogMAR chart is placed twenty feet away from a subject whose distance vision is to be tested. In another example, the Snellen chart or the LogMAR chart is placed sixteen inches from the subject for testing near vision. The subject is asked to identify the smallest letters they can read without the use of glasses. Such a test is performed for each eye, after which an assessment of visual acuity can be made.
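For reference, the chart geometry above follows from the standard optometric convention that a letter read at the threshold of normal acuity subtends five minutes of arc at the eye. A sketch of that relation (standard optometry, not specific to the present disclosure):

```python
import math

ARCMIN = math.pi / (180 * 60)  # one minute of arc, in radians

def optotype_height(distance_m, mar=1.0):
    """Physical letter height (metres) so that the optotype subtends
    5 arcminutes x MAR at the given viewing distance.

    mar is the minimum-angle-of-resolution multiplier: 1.0 for the
    6/6 (20/20) line, 2.0 for the 6/12 (20/40) line, and so on.
    """
    angle = 5 * mar * ARCMIN
    return 2 * distance_m * math.tan(angle / 2)

# At 6 m, a 6/6 letter is about 8.7 mm tall; at twenty feet it is
# the familiar ~8.9 mm Snellen letter height.
```

This is the same relation a dynamically adjusted test must satisfy at whatever distance the subject happens to stand.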
[03] For an elderly patient with chronic conditions, sufficient sight is important for all activities of daily living. Problems with sight affect the health and well-being of the patient. As an example, such problems may lead to reduced mobility and activity and may cause various dangerous incidents such as falls and medication intake errors.
Regular checkups of an individual's eye sight may help to identify sight-related problems early and enable early-stage intervention when necessary.
[04] Existing visual acuity test apps implemented on a mobile device such as
Visual Acuity Test, Vision Test 2.0, Eye Test Pro, Peek Vision, etc., provide visual acuity test schemes to allow a person to perform a visual acuity test on a subject or perform a self-test on the mobile device. However, such test schemes require the subject to be in a fixed position with respect to the device, which may not always be possible given the environmental circumstances in which the test is conducted. Therefore, there exists a need for a simplified visual acuity test that can be set up flexibly, so that a test or a self-test can be performed when the physician and the specialized settings are not available or not feasible.
SUMMARY
[05] Accordingly, one or more aspects of the present disclosure relate to a system configured for dynamically adjusting a visual acuity test. The system comprises an interface configured to receive an input request to test visual acuity; a display comprising a first surface configured to display one or more characters for testing the visual acuity; one or more sensors configured to generate a first signal related to a first distance of a subject to the first surface; and at least one processor in operative communication with the interface and with the one or more sensors. The at least one processor receives the first signal and is configured by machine-readable instructions to determine the first distance between the subject and the first surface based on the first signal; determine a display size of one or more characters for testing the visual acuity of the subject based on the first distance between the subject and the first surface; and cause the one or more characters to be displayed on the first surface.
[06] Yet another aspect of the present disclosure relates to a method for
dynamically adjusting a visual acuity test implemented in a system. The system comprises an interface, a display having a first surface, one or more sensors, and at least one processor. The method comprises receiving a request to test visual acuity with the interface; communicating the request to test the visual acuity with the at least one processor; generating a first signal related to a first distance of a subject to the first surface with the one or more sensors; receiving the first signal with the at least one processor; determining the first distance between the subject and the first surface based on the first signal; determining a display size of one or more characters for testing the visual acuity of the subject based on the first distance between the subject and the first surface; and causing the one or more characters to be displayed on the first surface.
[07] Still another aspect of the present disclosure relates to a system configured for dynamically adjusting a visual acuity test. The system comprises means for receiving a request from a subject to test visual acuity with a subject interface; means for displaying one or more characters for testing the visual acuity with a first surface; means for generating a first signal related to a first distance of the subject to the first surface with one or more sensors; and means for receiving the first signal and executing machine- readable instructions with at least one processor. The machine-readable instructions comprise instructions for determining the first distance between the subject and the first surface based on the first signal; determining a display size of one or more characters for testing the visual acuity of the subject based on the first distance between the subject and the first surface; and causing the one or more characters to be displayed on the first surface.
[08] These and other objects, features, and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[09] FIG. 1 illustrates an exemplary configuration for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching;
[10] FIG. 2 illustrates another exemplary configuration for dynamically
adjusting a visual acuity test in accordance with an embodiment of the present teaching;
[11] FIG. 3 illustrates an exemplary system for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching;
[12] FIG. 4 illustrates an exemplary system for dynamically adjusting a visual acuity test in accordance with yet another embodiment of the present teaching;
[13] FIG. 5 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching;
[14] FIG. 6 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with another embodiment of the present teaching; and
[15] FIG. 7 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with yet another embodiment of the present teaching.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
[16] As used herein, the singular form of "a", "an", and "the" include plural references unless the context clearly dictates otherwise. As used herein, the statement that two or more parts or components are "coupled" shall mean that the parts are joined or operate together either directly or indirectly, i.e., through one or more intermediate parts or components, so long as a link occurs. As used herein, "directly coupled" means that two elements are directly in contact with each other. As used herein, "fixedly coupled" or "fixed" means that two components are coupled so as to move as one while maintaining a constant orientation relative to each other.
[17] As used herein, the word "unitary" means a component is created as a single piece or unit. That is, a component that includes pieces that are created separately and then coupled together as a unit is not a "unitary" component or body. As employed herein, the statement that two or more parts or components "engage" one another shall mean that the parts exert a force against one another either directly or through one or more intermediate parts or components. As employed herein, the term "number" shall mean one or an integer greater than one (i.e., a plurality).
[18] Directional phrases used herein, such as, for example and without
limitation, top, bottom, left, right, upper, lower, front, back, and derivatives thereof, relate to the orientation of the elements shown in the drawings and are not limiting upon the claims unless expressly recited therein.
[19] FIG. 1 illustrates an exemplary configuration for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching. According to the illustrated embodiment, a subject 102 is at a distance 110 from interface 104 for a visual acuity self-test. Interface 104 may be implemented on a computing device (not shown) that is capable of displaying a testing chart 106 on interface 104. Testing chart 106 may include one or more characters, the display size of which is determined based on distance 110 between subject 102 and interface 104. A visual acuity measurement 108 may be automatically determined based on distance 110 and the display size of the one or more characters on testing chart 106. For example, if subject 102 can read all of the smallest characters displayed on the bottom line (or other portion) of testing chart 106 using the right eye at distance 110, the visual acuity measurement of the subject's right eye may be outputted as 2/2 (or other output indicative of normal visual acuity).
[20] FIG. 2 illustrates another exemplary configuration for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching. According to the illustrated embodiment, when subject 102 moves closer to interface 104 at a distance 202, the computing device on which interface 104 is implemented (not shown) detects the change of the position of subject 102 and determines a new distance between subject 102 and interface 104 (e.g., automatic determination of the new distance responsive to the detected change). The display size of the one or more characters included in testing chart 106 is adjusted based on the newly determined distance
(e.g., automatic adjustment of the display size based on the newly determined distance). Further, for instance, visual acuity measurement 108 is automatically adjusted based on the newly determined distance and the display size of the one or more characters. For example, if subject 102 can read all of the smallest characters displayed on the bottom line (or other portion) of testing chart 106 using the right eye at distance 202, the visual acuity measurement of the subject's right eye may be outputted as 1/1 (or other output indicative of normal visual acuity). In some embodiments, the display size of the one or more characters decreases when subject 102 moves closer to interface 104, and the display size of the one or more characters increases when subject 102 moves away from interface 104. Testing chart 106 displaying four lines of characters for visual acuity testing in FIG. 1 and FIG. 2 is one example of an adjusted test chart according to the illustrated
embodiments. It should be appreciated that test chart 106 may be generated to include more or fewer lines of characters for the visual acuity test (and/or to include other arrangements or formats of the characters or other units or symbols) depending on the distance between the subject and the interface as well as a screen resolution of the interface or other criteria.
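The adjustment illustrated in FIG. 1 and FIG. 2 amounts to keeping the visual angle of each character constant as the subject moves. A minimal sketch under the small-angle approximation (the function name is illustrative, not from the present disclosure):

```python
def rescale_display_size(height_px, old_distance_m, new_distance_m):
    """Scale a character's on-screen height so that its visual angle
    stays constant when the subject moves: under the small-angle
    approximation, height is proportional to viewing distance."""
    if old_distance_m <= 0 or new_distance_m <= 0:
        raise ValueError("distances must be positive")
    return height_px * new_distance_m / old_distance_m

# A subject moving from 2.0 m to 1.0 m halves the required height,
# consistent with the smaller chart shown in FIG. 2.
```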
[21] FIG. 3 illustrates an exemplary system 300 for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching. System 300 comprises an interface 104, one or more sensors 302, a first surface 336, a recorder 338, a speaker 344, and a processor 304. Interface 104 is configured to receive an input request to test visual acuity. In some embodiments, interface 104 is a keyboard, a mouse, or a combination thereof connected to a computing device via one or more wired or wireless connections. In some other embodiments, interface 104 may be a touch screen of a smart phone, a tablet, a laptop, a desktop, or other computing device. In yet another embodiment, interface 104 may be a wearable device that receives the input request and remotely synchronizes the input request with a smart phone, a tablet, a laptop, a desktop, and/or any other devices to perform the visual acuity test. In yet another embodiment, interface 104 may be a voice recognition component implemented on a smart phone, a tablet, a laptop, a desktop, a wearable device, and/or any other devices that automatically detects a voice input and determines whether the voice input comprises a request to test visual acuity (e.g., via speech recognition, natural language processing, or other techniques). If it is determined that the voice input comprises a request to test visual acuity, the voice recognition component cooperates with other components of the system to set up the visual acuity test. The voice recognition component may utilize recorder 338 to receive the voice input and speaker 344 to communicate with the person inputting the voice in order to accurately determine the content of the voice input. First surface 336 is configured to display one or more characters for testing the visual acuity. In some embodiments, first surface 336 is a non-touch screen that is separate from interface 104.
In some other embodiments, first surface 336 is a touch screen capable of receiving input and displaying content, and as such, first surface 336 and interface 104 may be a single integrated component. Interface 104, first surface 336, recorder 338, and speaker 344 in FIG. 3 are for illustrative purposes only, and the present disclosure is not intended to be limiting. The shapes and the dispositions of these components may be determined in accordance with the device manufacturing specification.
[22] One or more sensors 302 are configured to detect a position of subject 102 upon receiving the input request at interface 104, and generate a first signal related to a first distance 340 of subject 102 to first surface 336. One or more sensors 302 may be a capacitive displacement sensor, an inductive sensor, an ultrasonic sensor, and/or any type of sensor that can assess the distance. In some embodiments, one or more sensors 302 may be used alone or with the addition of a sensing accessory, for example, a marker used by subject 102 as a reference point for the sensor. In some embodiments, a built-in camera of a device, for example, a smart phone, a tablet, a laptop, etc. may be used to estimate the first distance. Under such circumstances, image processing algorithms may be applied to analyze the image of subject 102, from which the first distance can be estimated. In some embodiments, one or more sensors 302 may be an integrated component of the system. In some other embodiments, one or more sensors 302 may be a separate component connected to the system by wire or wirelessly. As illustrated in FIG. 3, one or more sensors 302 are disposed in front of interface 104 and face subject 102. However, the one or more sensors may be deployed anywhere to perform the distance estimation.
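When a built-in camera is used, the first distance can be estimated with a pinhole-camera model from the apparent size of the subject in the image. The sketch below is one such scheme; the focal length in pixels and the assumed average face height are calibration values, not parameters given in the present disclosure.

```python
def distance_from_image(face_height_px, focal_length_px, real_face_height_m=0.22):
    """Pinhole-camera estimate: distance = f * H / h, where f is the
    focal length expressed in pixels, H the true height of the face,
    and h the detected face height in the image.

    real_face_height_m = 0.22 is an assumed average, for illustration.
    """
    if face_height_px <= 0:
        raise ValueError("detected face height must be positive")
    return focal_length_px * real_face_height_m / face_height_px

# A face detected 220 px tall with a 1000 px focal length -> ~1.0 m;
# the face shrinking to 110 px would indicate ~2.0 m.
```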
[23] Processor 304 operatively communicates with interface 104, first
surface 336, and one or more sensors 302. Processor 304 receives the first signal from one or more sensors 302 and processes information conveyed by the first signal. Processor 304 may include one or more of a digital processor(s), analog processor(s), a digital circuit designed to process information, an analog circuit designed to process
information, a state machine, a transmitter, a receiver, and/or other mechanism(s) or processor(s) for electronically processing information. Although processor 304 is shown in FIG. 3 as a single entity, this is for illustrative purposes only. In some embodiments, processor 304 may include one or more processing units. The one or more processing units may be physically located within a same device, or processor 304 may represent processing functionality of a plurality of devices operating in coordination. Processor 304 may be configured to execute one or more computer program components. The one or more computer program components comprise a data processing component 306, a test initializing component 308, a voice recognition component 318, a visual acuity determination component 320, a communication component 322, a visual display component 324, and an audio play component 326. Test initializing component 308 may further comprise a distance evaluation component 310, a display properties determination component 312, a character generation component 314, and a measurement scaling component 316. Processor 304 may be configured to execute components 306, 308, 310, 312, 314, 316, 318, 320, 322, 324, and 326 by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor 304.
[24] Each of the one or more computer program components comprises a set of algorithms implemented on processor 304 that instructs processor 304 to perform one or more functions related to dynamically adjusting the visual acuity test, and/or other operations. For example, data processing component 306 comprises algorithms implemented on processor 304 that instruct processor 304 to receive output signals from one or more sensors 302 and process data read out from the output signals; distance evaluation component 310 comprises algorithms implemented on processor 304 that instruct processor 304 to estimate a first distance between subject 102 and first surface 336; display properties determination component 312 comprises algorithms implemented on processor 304 that instruct processor 304 to obtain information related to display properties of first surface 336; character generation component 314 comprises algorithms implemented on processor 304 that instruct processor 304 to determine one or more characters for the visual acuity test and a display size of the one or more characters based on the estimated first distance and the display properties; measurement scaling component 316 comprises algorithms implemented on processor 304 that instruct processor 304 to scale the visual acuity test measurement in accordance with the display size of the one or more characters; voice recognition component 318 comprises algorithms implemented on processor 304 that instruct processor 304 to recognize one or more characters from an audio input of subject 102 and determine whether the one or more characters from the audio input match one or more displayed characters for the visual acuity test; visual acuity determination component 320 comprises algorithms implemented on processor 304 that instruct processor 304 to determine a visual acuity measurement based on the result from voice recognition component 318; visual display component 324 comprises algorithms implemented on processor 304 that instruct
processor 304 to display one or more characters for the visual acuity test on first surface 336; audio play component 326 comprises algorithms implemented on processor 304 that instruct processor 304 to play audio instructions, alerts, and/or test results to subject 102; and communication component 322 comprises algorithms implemented on processor 304 that instruct processor 304 to perform communications among the one or more components of processor 304, and between processor 304 and other components of the system and/or other network components.
[25] It should be appreciated that although components 306, 308, 310, 312,
314, 316, 318, 320, 322, 324, and 326 are illustrated in FIG. 3 as being co-located within a single processing unit, in implementations in which processor 304 includes multiple processing units, one or more of these components may be located remotely from the other components. The description of the functionality provided by the different components 306, 308, 310, 312, 314, 316, 318, 320, 322, 324, and 326 described below is for illustrative purposes, and is not intended to be limiting, as any of components 306, 308, 310, 312, 314, 316, 318, 320, 322, 324, and 326 may provide more or less functionality than is described. For example, one or more of components 306, 308, 310, 312, 314, 316, 318, 320, 322, 324, and 326 may be eliminated, and some or all of its functionality may be provided by other ones of components 306, 308, 310, 312, 314, 316, 318, 320, 322, 324, and 326. As another example, processor 304 may be configured to execute one or more additional components that may perform some or all of the functionality attributed below to one of components 306, 308, 310, 312, 314, 316, 318, 320, 322, 324, and 326.
[26] Data processing component 306 is configured to receive output signals from one or more sensors 302 and process data read out from the output signals so that reliable information is forwarded to test initializing component 308. Data collected from one or more sensors 302 may sometimes comprise one or more types of interfering signals from the surrounding environment and/or from other sources that affect the accuracy of information read out from the output signals. For example, when an ultrasonic sensor is used to determine the proximity of a subject, the collected data related to the proximity of the subject may include reflected and/or scattered sound waves from other objects in the area. Data processing component 306 may be configured to filter out the interfering signals based on one or more algorithms such that data after filtering provides more accurate readings related to the distance of subject 102 relative to first surface 336.
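One simple filtering scheme of the kind described is a sliding-window median, which suppresses isolated spurious echoes in raw proximity readings; the present disclosure does not prescribe a particular algorithm, so this is illustrative only.

```python
def median_filter(readings, window=3):
    """Replace each reading with the median of a window centred on it,
    discarding isolated outliers such as a stray echo from another
    object in the area."""
    half = window // 2
    filtered = []
    for i in range(len(readings)):
        neighbourhood = sorted(readings[max(0, i - half):i + half + 1])
        filtered.append(neighbourhood[len(neighbourhood) // 2])
    return filtered

# A spurious 9.9 m echo amid ~2 m readings is removed:
# median_filter([2.0, 2.1, 9.9, 2.0, 2.1]) -> [2.1, 2.1, 2.1, 2.1, 2.1]
```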
[27] Test initializing component 308 is configured to dynamically set up the visual acuity test for subject 102 based on a first distance between subject 102 and first surface 336. As such, test initializing component 308 may further comprise a distance evaluation component 310, a display properties determination component 312, a character generation component 314, and a measurement scaling component 316.
[28] Distance evaluation component 310 is configured to receive the output from data processing component 306 and estimate a first distance between subject 102 and first surface 336. In some embodiments, the output from data processing component 306 may comprise a direct measurement of first distance 340. For example, when an ultrasonic sensor is used, the output from data processing component 306 may include a time interval between sending the signal from the ultrasonic sensor and receiving the echo from the subject. Distance evaluation component 310 determines the first distance based on the time interval and the speed at which the sound waves travel from the ultrasonic sensor. In another example, when a capacitive displacement sensor is used, the output from data processing component 306 may include a plurality of measurements of the capacitance in the area between a probe of the capacitive displacement sensor and the subject. Distance evaluation component 310 can determine the first distance based on the changes of the capacitance in this area. In some other embodiments, the processed data from data processing component 306 may comprise an image of subject 102.
Distance evaluation component 310 may apply an image processing algorithm to extract information related to the first distance based on the subject size in the received image and the parameters configured in the camera. [29] It should be understood that any type of sensing technology in addition to the above examples may be employed to determine the proximity of a subject for a visual acuity test, and as such, the signals generated from the sensor indicative of a subject's proximity may be in various formats. Data processing component 306 may be further configured to adapt to the various formats of those signals, and distance evaluation component 310 may be further configured with adaptive schemes and algorithms to calculate the first distance between the subject and the first surface based on the processed sensing signals. It should be further understood that the functionalities of data processing component 306 and distance evaluation component 310 may be configured and/or updated remotely from server 332 via network 330.
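For the ultrasonic example, the round-trip time-of-flight relation can be sketched as follows: the one-way distance is the speed of sound multiplied by the measured time interval, halved because the pulse travels out to the subject and back. The constant below assumes sound in dry air at roughly 20 °C; this is an illustrative assumption, not a value from the source.

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # dry air at about 20 degrees C (an assumption)

def distance_from_echo(round_trip_time_s):
    """Subject distance from an ultrasonic round-trip time: the pulse
    travels to the subject and back, so the one-way distance is half
    the total path length."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0
```

A 10 ms echo therefore corresponds to a subject roughly 1.7 m from the sensor.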
[30] Display properties determination component 312 is configured to
determine the display properties of first surface 336. The display properties may include a screen size and a resolution of first surface 336. In some embodiments, the display properties may be retrieved directly from configuration information stored in a memory of the computing device (e.g., a tablet, a smart phone, a laptop, a desktop, etc.).
[31] Character generation component 314 is configured to determine one or more characters to be displayed for the visual acuity test, and a display size of the one or more characters based on first distance 340 and the display properties of first surface 336. Given the screen resolution of first surface 336, character generation component 314 may select a type of font used to display the characters for the visual acuity test. Further, character generation component 314 may determine a height of the selected font based on the distance between the subject and the first surface and using a font-to-height conversion table. In some embodiments, character generation component 314 may determine the lines of characters based on the screen size of first surface 336. For example, character generation component 314 may determine to display four lines of characters on a 13-inch laptop screen and six lines of characters on a 24-inch desktop monitor.
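One plausible basis for such a distance-to-height conversion (an assumption here; the source does not specify the table's contents) is the standard optotype convention that a reference-line character subtends 5 arcminutes of visual angle at the viewing distance, converted to pixels via the display's pixel density:

```python
import math

ARCMIN_TO_RAD = math.pi / (180 * 60)

def optotype_height_mm(distance_mm, arcminutes=5.0):
    """Physical character height needed to subtend `arcminutes` of
    visual angle at the given viewing distance (5 arcmin is the
    conventional 20/20 reference)."""
    return 2.0 * distance_mm * math.tan(arcminutes * ARCMIN_TO_RAD / 2.0)

def height_in_pixels(height_mm, pixels_per_inch):
    """Convert the physical height to pixels via the display's PPI."""
    return round(height_mm / 25.4 * pixels_per_inch)
```

At a 6 m viewing distance this yields the familiar ~8.7 mm letter height for the 20/20 line, about 33 pixels on a 96 PPI screen.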
[32] Measurement scaling component 316 is configured to generate the visual acuity test measurement scales that correspond to the determined display size of the one or more characters. The visual acuity test measurement scales may be automatically generated once the display size of the one or more characters is determined. Referring to FIG. 1, when testing chart 106 is determined to include four lines of characters, corresponding visual acuity measurement 108 is automatically selected and displayed.
[33] In some embodiments, one or more sensors 302 may detect a distance change between subject 102 and first surface 336 during a visual acuity test. Distance evaluation component 310 receives the signals from one or more sensors 302 indicative of the distance change, and determines whether the distance change is significant. Distance evaluation component 310 may set a variation threshold, for example, a 10% change with respect to the distance between subject 102 and first surface 336. When a distance change above the variation threshold is detected, distance evaluation component 310 may re-estimate the distance, and subject 102 is signaled that the visual acuity test needs to be reset. In some embodiments, one or more sensors 302 may also detect a change in a direction of subject 102 relative to first surface 336, alone or in combination with the distance change. The system may adjust the configuration of the visual acuity test based on the observed changes.
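The variation-threshold check described above can be sketched as a single comparison; function and parameter names are illustrative, and the 10% default mirrors the example in the text.

```python
def needs_reset(initial_distance_mm, current_distance_mm, threshold=0.10):
    """True when the subject has moved by more than the variation
    threshold (10% of the initial distance in the example above),
    signaling that the test should be re-initialized."""
    change = abs(current_distance_mm - initial_distance_mm)
    return change / initial_distance_mm > threshold
```

A subject who drifts from 600 mm to 650 mm stays within tolerance, while a move to 700 mm triggers a reset.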
[34] Distance evaluation component 310, display properties determination component 312, character generation component 314, and measurement scaling component 316 included in test initializing component 308 described above are for illustrative purposes only. The present disclosure is not intended to be limiting. System 300 may be further developed and/or modified to encompass more functional components than illustrated. In yet another example, system 300 may be further developed and/or modified to integrate the plurality of functional components such that test initializing component 308 includes fewer functional components than illustrated.
[35] Voice recognition component 318 is configured to recognize the content of voice input received from subject 102. During the visual acuity test, subject 102 reads out the displayed characters. Voice recognition component 318 receives the voice input and recognizes the exact characters that subject 102 reads. Voice recognition component 318 further determines whether the recognized characters match the displayed characters and sends the result to visual acuity determination component 320. [36] Visual acuity determination component 320 is configured to determine a visual acuity measurement based on the result from voice recognition component 318. Referring to FIG. 1, for example, if subject 102 correctly reads all nine characters on the bottom line of testing chart 106 using the right eye, visual acuity determination component 320 presents to subject 102 that the visual acuity measurement of the subject's right eye is 2/2. In some embodiments, visual acuity determination component 320 may determine the visual acuity measurement to be 2/2 when the subject correctly reads seven of the nine characters on the bottom line (e.g., depending on the visual acuity measurement scale used).
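The matching-and-scoring step might look like the following sketch. The 7-of-9 default mirrors the example criterion above; real measurement scales vary, and all names here are illustrative rather than taken from the disclosure.

```python
def line_passed(displayed, recognized, min_correct_fraction=7/9):
    """Score one chart line: count positional matches between the
    displayed characters and those recognized from the voice input,
    then pass the line if enough are correct.  The 7-of-9 default
    mirrors the example scale above; real scales vary."""
    correct = sum(d == r for d, r in zip(displayed, recognized))
    return correct / len(displayed) >= min_correct_fraction
```

So a line read with two errors out of nine still passes, while three errors fail it.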
[37] Visual display component 324 is configured to cause one or more
characters to be displayed on first surface 336 for the visual acuity test. Visual display component 324 may further display the corresponding visual acuity measurement scales alongside the one or more characters, a measurement result to the subject, an alert message, and/or other information related to the test.
[38] Audio play component 326 is configured to play an audio signal to the subject. For example, audio play component 326 may play the characters recognized from the voice input of the subject, a message to the subject requesting confirmation of the recognized characters, a measurement result of the visual acuity test measurement, and/or any communications between the system and the subject.
[39] Communication component 322 is configured to perform communications within one or more components of processor 304, and between processor 304 and other components of the system and/or other network components. In some embodiments, communication component 322 communicates with server 332 remotely connected to network 330 and downloads one or more software packages from database 334 to modify and/or upgrade the functionalities of one or more components of processor 304. In some embodiments, communication component 322 communicates with electronic storage 328 locally connected to processor 304 or database 334 remotely connected to network 330 to retrieve historical information related to past visual acuity tests associated with the subject and/or other subjects, and provide the historical information to a user to determine and/or adjust one or more parameters related to a visual acuity test. The present disclosure contemplates any techniques for communication including but not limited to hard-wired and wireless communications.
[40] Electronic storage 328 is configured to electronically store information on electronic storage media. Electronic storage 328 may comprise one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. The electronic storage media of electronic storage 328 may comprise one or both of system storage that is provided integrally (i.e., substantially non-removable) with system 300 and/or removable storage that is removably connectable to system 300 via, for example, a port (e.g., a USB port, a FireWire port, etc.) or a drive (e.g., a disk drive, etc.). Electronic storage 328 may store software packages, information related to past visual acuity tests, information related to a plurality of subjects that receive the visual acuity tests, information processed by processor 304, information received via interface 104, and/or other information that enables system 300 to function properly. Data stored in electronic storage 328 may be further uploaded to remote database 334 for backup and management.
[41] Network 330 is configured to transmit information among a plurality of network components. For example, a request inputted via interface 104 is received at server 332 via network 330 to retrieve historical information related to past visual acuity tests on the subject and/or other subjects for analysis. Network 330 forwards an instruction from server 332 to retrieve the requested historical information from electronic storage 328 or database 334. Network 330 may be a single network or a combination of multiple networks. For example, network 330 may be a local area network (LAN), a wide area network (WAN), a public network, a private network, a proprietary network, a Public Switched Telephone Network (PSTN), the Internet, a wireless communication network, a virtual network, and/or any combination thereof.
[42] FIG. 4 illustrates an exemplary system for dynamically adjusting a visual acuity test in accordance with yet another embodiment of the present teaching. System 400 may include a second surface 402 and an auxiliary sensor 408 in addition to the system components as illustrated in FIG. 3. According to the illustrated embodiment, the one or more characters for a visual acuity test may be further displayed on second surface 402 in place of and/or in addition to first surface 336. Second surface 402 may be any surface capable of displaying the one or more characters with clarity for testing, such as a projector screen, a wall, a TV screen, etc. Auxiliary sensor 408 is configured to detect the proximity of second surface 402 and assist in determining a second distance 406 between second surface 402 and a projecting point. Auxiliary sensor 408 may be one of one or more sensors 302 described above, or a separate sensor that can be coupled to system 300. In some embodiments, when the one or more characters are projected on a TV screen via a cable, or an Apple TV set-top box, or an Amazon Fire set-top box, etc., the projecting point is a smart phone, a tablet, a laptop, or a desktop, from which the one or more characters are projected. In some other embodiments, when the one or more characters are projected to a wall or a projector screen hanging on a wall, the projecting point may be a lens surface of a projector 404 that is connected to system 300 by wire or wirelessly. The display size of the one or more characters on second surface 402 may be further adjusted using geometry based on second distance 406 between the projecting point and second surface 402 or in conjunction with display properties of second surface 402. The components of system 400 are for illustrative purposes and the present disclosure is not intended to be limiting. System 400 may be adapted to include more or fewer components depending on the requirements of the visual acuity test.
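Under a simple pinhole-projection assumption (ignoring projector zoom and lens offset, which the source does not detail), the projected image grows linearly with throw distance, so the geometric adjustment mentioned above can be sketched as scaling the rendered character by the inverse distance ratio. All names and the reference-throw calibration are illustrative assumptions.

```python
def projected_scale(reference_throw_mm, actual_throw_mm):
    """Projected image size grows linearly with throw distance under a
    pinhole model, so content calibrated at a reference throw must be
    scaled by the inverse ratio to keep the same size on the wall."""
    return reference_throw_mm / actual_throw_mm

def adjusted_char_height(height_at_reference_px, reference_throw_mm, actual_throw_mm):
    """Rendered character height (in pixels) compensated for the
    actual projector-to-surface distance."""
    return height_at_reference_px * projected_scale(reference_throw_mm, actual_throw_mm)
```

Doubling the throw distance halves the rendered height needed to keep the character the same physical size on the wall.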
[43] FIG. 5 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with an embodiment of the present teaching. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed.
Additionally, the order of the operations of the process as illustrated in FIG. 5 and described below is not intended to be limiting.
[44] At operation 502, a request to test visual acuity is received with the
interface. In some embodiments, operation 502 is performed by an interface the same as or similar to interface 104 (shown in FIG. 3 and described herein). [45] At operation 504, the request to test the visual acuity is communicated with the at least one processor. In some embodiments, operation 504 is performed by a processor the same as or similar to processor 304 (shown in FIG. 3 and described herein).
[46] At operation 506, a first signal related to a first distance of a subject to the first surface with the one or more sensors is generated. In some embodiments, operation 506 is performed by one or more sensors the same as or similar to one or more sensors 302 (shown in FIG. 3 and described herein).
[47] At operation 508, the first signal is received with the at least one
processor. In some embodiments, operation 508 is performed by a processor the same as or similar to processor 304 (shown in FIG. 3 and described herein).
[48] At operation 510, the first distance between the subject and the first
surface is determined based on the first signal. In some embodiments, operation 510 is performed by a distance evaluation component the same as or similar to distance evaluation component 310 (shown in FIG. 3 and described herein).
[49] At operation 512, a display size of one or more characters for testing the visual acuity of the subject is determined based on the first distance between the subject and the first surface. In some embodiments, operation 512 is performed by a character generation component the same as or similar to character generation component 314 (shown in FIG. 3 and described herein).
[50] At operation 514, the one or more characters are caused to be displayed on the first surface. In some embodiments, operation 514 is performed by a visual display component the same as or similar to visual display component 324 (shown in FIG. 3 and described herein).
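Operations 502 through 514 can be sketched end to end as follows, again under the 5-arcminute optotype assumption introduced for illustration; the callback names (`measure_distance_mm`, `render`) are hypothetical stand-ins for the sensor and display components.

```python
import math

def run_test_setup(measure_distance_mm, display_ppi, render):
    """End-to-end sketch of the FIG. 5 flow: measure the subject
    distance (operations 506-510), size a reference optotype to
    subtend 5 arcminutes at that distance (operation 512), then hand
    the pixel height to a render callback (operation 514).  All
    function and parameter names are illustrative assumptions."""
    distance_mm = measure_distance_mm()
    half_angle_rad = 2.5 * math.pi / (180 * 60)   # half of 5 arcmin
    height_mm = 2.0 * distance_mm * math.tan(half_angle_rad)
    height_px = round(height_mm / 25.4 * display_ppi)
    render(height_px)
    return height_px
```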
[51] FIG. 6 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with another embodiment of the present teaching. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order of the operations of the process as illustrated in FIG. 6 and described below is not intended to be limiting. [52] At operation 602, a second signal related to a second distance of a projector relative to a second surface is generated with the one or more sensors. In some embodiments, operation 602 is performed by one or more sensors the same as or similar to one or more sensors 302 (shown in FIG. 4 and described herein).
[53] At operation 604, the second signal is received with the at least one
processor. In some embodiments, operation 604 is performed by a processor the same as or similar to processor 304 (shown in FIG. 4 and described herein).
[54] At operation 606, the second distance between the projector and the
second surface is determined based on the second signal. In some embodiments, operation 606 is performed by a distance evaluation component the same as or similar to distance evaluation component 310 (shown in FIG. 4 and described herein
[55] At operation 608, the display size of the one or more characters is adjusted based on the second distance. In some embodiments, operation 608 is performed by a character generation component the same as or similar to character generation component 314 (shown in FIG. 4 and described herein).
[56] At operation 610, the one or more characters are projected to the second surface. In some embodiments, operation 610 is performed by a projector the same as or similar to projector 404 (shown in FIG. 4 and described herein).
[57] FIG. 7 illustrates an exemplary flowchart of the process for dynamically adjusting a visual acuity test in accordance with yet another embodiment of the present teaching. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order of the operations of the process as illustrated in FIG. 7 and described below is not intended to be limiting.
[58] At operation 702, an audio input is received from a subject. In some
embodiments, operation 702 is performed by an interface and/or a recorder the same as or similar to interface 104 and recorder 338 (shown in FIG. 3 and described herein).
[59] At operation 704, one or more characters are identified from the audio input. In some embodiments, operation 704 is performed by a voice recognition component the same as or similar to voice recognition component 318 (shown in FIG. 3 and described herein).
[60] At operation 706, a decision is made as to whether the one or more
characters match the displayed characters. In some embodiments, operation 706 is performed by a voice recognition component the same as or similar to voice recognition component 318 (shown in FIG. 3 and described herein).
[61] At operation 708, a visual acuity measurement is determined based on the decision. In some embodiments, operation 708 is performed by a visual acuity determination component the same as or similar to visual acuity determination component 320 (shown in FIG. 3 and described herein).
[62] At operation 710, the visual acuity measurement is caused to be presented to the subject. In some embodiments, operation 710 is performed by a visual display component the same as or similar to visual display component 324 (shown in FIG. 3 and described herein).
[63] While in the illustrated embodiments the characters are shown as letters, in alternative embodiments they can be numbers, symbols, pictures, and/or objects that can be visualized and identified by the subject. Thus, the present invention in its broadest sense is not limited in regards to the type of character used.
[64] The above illustrated embodiments dynamically adjust the one or more characters to be displayed in a visual acuity test table based on the distance between a subject and a display surface. However, the present disclosure may also enable a manual input of the parameters that are required to set up the visual acuity test. For example, the one or more sensors may also detect whether the subject is wearing glasses or contact lenses using image processing techniques or with the help of other sensors (e.g., to detect reflection of the glasses). Based on the detected information, the subject may be prompted to remove the visual aids. In the alternative, the user may be prompted to enter the prescription of the glasses or contact lenses to calculate the display sizes of the one or more characters and adjust the visual acuity measurements.
[65] The above illustrated embodiments may also be adapted to allow a better estimation of far-sight vision by projecting the test chart on a wall rather than rendering it on a screen. In some embodiments, the distance between the projector and the projection screen would have to be known and entered manually. In yet another embodiment, the distance from the wall to the projector may be automatically estimated, similarly to the distance between the subject and the computing device, taking into account the optical magnifying properties of the projecting device. In some embodiments, indications may be given to the subject if the set-up is sub-optimal for carrying out the test; e.g., limitations such as screen size and resolution impose a boundary condition on the minimum distance that a subject should be from the testing device.
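The boundary condition mentioned above can be made concrete: given the display's pixel density, the smallest optotype must still span some minimum number of pixels to be rendered faithfully, which bounds the viewing distance from below. The sketch assumes a 5-arcminute reference optotype and an illustrative 10-pixel minimum; neither figure is specified in the source.

```python
import math

def minimum_test_distance_mm(pixels_per_inch, min_pixels=10, arcminutes=5.0):
    """Smallest viewing distance at which an optotype subtending
    `arcminutes` of visual angle still spans at least `min_pixels`
    on this display -- one way to express the set-up boundary
    condition.  The 10-pixel minimum is an illustrative assumption."""
    half_angle_rad = (arcminutes / 2.0) * math.pi / (180 * 60)
    height_mm_needed = min_pixels / pixels_per_inch * 25.4
    return height_mm_needed / (2.0 * math.tan(half_angle_rad))
```

On a 96 PPI screen this works out to a minimum distance of roughly 1.8 m; a sharper (higher-PPI) display permits a correspondingly shorter test distance.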
In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" or "including" does not exclude the presence of elements or steps other than those listed in a claim. In a device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The mere fact that certain elements are recited in mutually different dependent claims does not indicate that these elements cannot be used in combination.
Although the description provided above provides detail for the purpose of illustration based on what is currently considered to be the most practical and preferred embodiments, it is to be understood that such detail is solely for that purpose and that the disclosure is not limited to the expressly disclosed embodiments, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present disclosure contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.

Claims

What is Claimed is:
1. A system for dynamically adjusting a visual acuity test, the system comprising:
an interface configured to receive an input request to test visual acuity;
a display comprising a first surface configured to display one or more characters for testing the visual acuity;
one or more sensors configured to generate a first signal related to a first distance of a subject to the first surface; and
at least one processor operatively communicated with the interface and with the one or more sensors, the at least one processor receiving the first signal and configured by machine-readable instructions to:
determine the first distance between the subject and the first surface based on the first signal;
determine a display size of one or more characters for testing the visual acuity of the subject based on the first distance between the subject and the first surface; and
cause the one or more characters to be displayed on the first surface.
2. The system of claim 1, further comprising a projector operatively communicated with the at least one processor, wherein the one or more sensors generate a second signal related to a distance of the projector relative to a second surface, and wherein the at least one processor is further configured by the machine-readable instructions to:
determine a second distance between the projector and the second surface based on the second signal;
adjust the display sizes of the one or more characters based on the second distance; and project the one or more characters to the second surface.
3. The system of claim 1, wherein the at least one processor is further configured by the machine-readable instructions to:
receive an audio input from the subject;
determine whether one or more characters received from the audio input match the displayed one or more characters; and
based on the determination, cause a visual acuity measurement to be presented to the subject.
4. The system of claim 1, wherein the at least one processor is further configured by machine-readable instructions to obtain one or more display properties of the first surface, wherein the one or more display properties of the first surface comprise a display size and a resolution of the first surface.
5. The system of claim 1, wherein the at least one processor is further configured by the machine-readable instructions to:
determine visual acuity measurement scales corresponding to the display sizes of the one or more characters.
6. The system of claim 1, wherein the display comprises a mobile device display and wherein the first surface comprises a screen of the mobile device display.
7. The system of claim 1, wherein the display comprises a computer display and wherein the first surface comprises a screen of the computer display.
8. The system of claim 1, wherein the display comprises a projector and wherein the first surface comprises a surface onto which an image from the projector is received.
9. A method for dynamically adjusting a visual acuity test implemented in a system, the system comprising an interface, a display having a first surface, one or more sensors, and at least one processor, the method comprising:
receiving a request to test visual acuity with the interface;
communicating the request to test the visual acuity with the at least one processor;
generating a first signal related to a first distance of a subject to the first surface with the one or more sensors;
receiving the first signal with the at least one processor;
determining the first distance between the subject and the first surface based on the first signal;
determining a display size of one or more characters for testing the visual acuity of the subject based on the first distance between the subject and the first surface; and
causing the one or more characters to be displayed on the first surface.
10. The method of claim 9, further comprising:
generating a second signal related to a second distance of a projector relative to a second surface with the one or more sensors;
receiving the second signal with the at least one processor;
determining the second distance between the projector and the second surface based on the second signal;
adjusting the display size of the one or more characters based on the second distance; and
projecting the one or more characters to the second surface.
11. The method of claim 9, further comprising:
receiving an audio input from the subject; determining whether one or more characters received from the audio input match the displayed one or more characters; and
based on the determination, causing a visual acuity measurement to be presented to the subject.
12. The method of claim 9, further comprising:
obtaining one or more display properties of the first surface, wherein the one or more display properties of the first surface comprise a display size and a resolution of the first surface.
13. The method of claim 9, further comprising:
determining visual acuity measurement scales corresponding to the display sizes of the one or more characters.
14. A system for dynamically adjusting a visual acuity test, the system comprising:
means for receiving a request from a subject to test visual acuity with a subject interface;
means for displaying one or more characters for testing the visual acuity with a first surface;
means for generating a first signal related to a first distance of the subject to the first surface with one or more sensors; and
means for receiving the first signal and executing machine-readable instructions with at least one processor, wherein the machine-readable instructions comprise:
determining the first distance between the subject and the first surface based on the first signal;
determining a display size of one or more characters for testing the visual acuity of the subject based on the first distance between the subject and the first surface; and causing the one or more characters to be displayed on the first surface.
15. The system of claim 14, further comprising:
means for generating a second signal related to a distance of a projector relative to a second surface;
means for receiving the second signal and executing the machine-readable instructions with the at least one processor, wherein the machine-readable instructions comprise:
determining a second distance between the projector and the second surface based on the second signal;
adjusting the display size of the one or more characters based on the second distance; and
projecting the one or more characters to the second surface.
16. The system of claim 14, wherein the machine-readable instructions further comprise:
receiving an audio input from the subject;
determining whether one or more characters received from the audio input match the displayed one or more characters; and
based on the determination, causing a visual acuity measurement to be presented to the subject.
17. The system of claim 14, further comprising:
means for obtaining one or more display properties of the first surface, wherein the one or more display properties of the first surface comprise a display size and a resolution of the first surface.
18. The system of claim 14, further comprising: means for determining visual acuity measurement scales corresponding to the display sizes of the one or more characters.
19. The system of claim 14, wherein the display comprises a mobile device display and wherein the first surface comprises a screen of the mobile device display.
20. The system of claim 14, wherein the display comprises a computer display and wherein the first surface comprises a screen of the computer display.
PCT/EP2016/082186 2015-12-22 2016-12-21 System and method for dynamically adjusting a visual acuity test WO2017108952A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562270855P 2015-12-22 2015-12-22
US62/270,855 2015-12-22

Publications (1)

Publication Number Publication Date
WO2017108952A1 true WO2017108952A1 (en) 2017-06-29

Family

ID=57609907

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/082186 WO2017108952A1 (en) 2015-12-22 2016-12-21 System and method for dynamically adjusting a visual acuity test

Country Status (1)

Country Link
WO (1) WO2017108952A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108042102A (en) * 2017-12-20 2018-05-18 宋营营 A kind of device of the inspection eyesight of dormitory
EP3430974A1 (en) * 2017-07-19 2019-01-23 Sony Corporation Main module, system and method for self-examination of a user's eye
IT201700101120A1 (en) * 2017-09-11 2019-03-11 Idm Srl EQUIPMENT FOR IMPROVEMENT, TRAINING AND / OR REHABILITATION OF THE VISUAL FUNCTION
US10413172B2 (en) 2017-12-11 2019-09-17 1-800 Contacts, Inc. Digital visual acuity eye examination for remote physician assessment
FR3080760A1 (en) * 2018-05-03 2019-11-08 Padoa EVALUATION OF VISUAL ACUTE
GB2612366A (en) * 2021-11-01 2023-05-03 Ibisvision Ltd Method and system for eye testing

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030214630A1 (en) * 2002-05-17 2003-11-20 Winterbotham Chloe Tyler Interactive occlusion system
US20050124375A1 (en) * 2002-03-12 2005-06-09 Janusz Nowosielski Multifunctional mobile phone for medical diagnosis and rehabilitation
US20080189173A1 (en) * 2004-09-03 2008-08-07 Panaseca, Inc. Vision Center Kiosk
US20080259278A1 (en) * 2007-04-20 2008-10-23 Nidek Co., Ltd Visual acuity testing apparatus
JP2012040430A (en) * 2011-11-29 2012-03-01 Panasonic Corp Visual performance inspection device
US20120068998A1 (en) * 2010-09-20 2012-03-22 Samsung Electronics Co., Ltd. Display apparatus and image processing method thereof
CN203609407U (en) * 2013-07-11 2014-05-28 广州奥翼电子科技有限公司 Eyesight chart
FR3010890A1 (en) * 2013-09-23 2015-03-27 Acep France DEVICE FOR MEASURING THE VISUAL ACUTE

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3430974A1 (en) * 2017-07-19 2019-01-23 Sony Corporation Main module, system and method for self-examination of a user's eye
CN109285602A (en) * 2017-07-19 2019-01-29 索尼公司 Main module, system and method for self-examination of a user's eye
US10709328B2 (en) 2017-07-19 2020-07-14 Sony Corporation Main module, system and method for self-examination of a user's eye
IT201700101120A1 (en) * 2017-09-11 2019-03-11 Idm Srl Equipment for the improvement, training and/or rehabilitation of visual function
US10413172B2 (en) 2017-12-11 2019-09-17 1-800 Contacts, Inc. Digital visual acuity eye examination for remote physician assessment
CN108042102A (en) * 2017-12-20 2018-05-18 宋营营 Eyesight-testing device for a dormitory
FR3080760A1 (en) * 2018-05-03 2019-11-08 Padoa Evaluation of visual acuity
GB2612366A (en) * 2021-11-01 2023-05-03 Ibisvision Ltd Method and system for eye testing

Similar Documents

Publication Publication Date Title
WO2017108952A1 (en) System and method for dynamically adjusting a visual acuity test
US11330977B2 (en) Digital visual acuity eye examination for remote physician assessment
CN109363620B (en) Vision detection method and device, electronic equipment and computer storage medium
US7959287B1 (en) Eyeglass frame sizing systems and methods
KR102190812B1 (en) Method for determining at least one value of a parameter for customising a visual compensation device
AU2017202152B2 (en) Method and system for determining postural balance of a person
EP3320829A8 (en) System for integrally measuring clinical parameters of visual function
CN103576853A (en) Method and display apparatus for providing content
CN111344222A (en) Method of performing an eye examination test
CN108697389B (en) System and method for supporting neurological state assessment and neurological rehabilitation, in particular cognitive and/or speech dysfunction
CN105224084B (en) Determine the method and device of virtual article position in Virtual Space
GB2514529A (en) Visual function testing device
JP2019202131A (en) Information processing apparatus, information processing method, and program
US20210315545A1 (en) Ultrasonic diagnostic apparatus and ultrasonic diagnostic system
WO2016073572A1 (en) System and methods for diplopia assessment
Bungay et al. Contactless hand tremor amplitude measurement using smartphones: development and pilot evaluation
TW201907860A (en) Vision test device and vision test method capable of automatically adjusting size of optotype including a display screen, a database module, a distance measuring module, a calculating module, and a control module
WO2017219641A1 (en) Touch time obtaining method and system, and touch graphic displaying method and system
EP3138478B1 (en) A heart rate sensing wearable device
RU2752373C1 (en) Method for measuring the duration of individual steps of the left and right feet
KR20190108909A (en) System and method for measurement of visual acuity based on the analysis of user eye movements
WO2023242635A2 (en) Single device remote visual acuity testing systems and methods
US20230033256A1 (en) Program, information processing apparatus, and terminal device
US11468787B1 (en) Diabetic treatment management system
TWI810495B (en) Pulse-measuring equipment and pulse-measuring method for pulse diagnosis

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 16816696

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 16816696

Country of ref document: EP

Kind code of ref document: A1