CN110167421A - System for integrally measuring clinical parameters of visual function - Google Patents

System for integrally measuring clinical parameters of visual function

Info

Publication number
CN110167421A
CN110167421A (application CN201780082894.5A)
Authority
CN
China
Prior art keywords
user
display unit
interface
scene
visual performance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201780082894.5A
Other languages
Chinese (zh)
Other versions
CN110167421B (en)
Inventor
Eva García Ramos
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Electronic Medical Technology Solutions Co Ltd
Original Assignee
Electronic Medical Technology Solutions Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Electronic Medical Technology Solutions Co Ltd filed Critical Electronic Medical Technology Solutions Co Ltd
Publication of CN110167421A publication Critical patent/CN110167421A/en
Application granted granted Critical
Publication of CN110167421B publication Critical patent/CN110167421B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/0016: Operational features thereof
    • A61B 3/0025: Operational features characterised by electronic signal processing, e.g. eye models
    • A61B 3/0033: Operational features characterised by user input arrangements
    • A61B 3/0041: Operational features characterised by display arrangements
    • A61B 3/005: Constructional features of the display
    • A61B 3/02: Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B 3/024: Subjective types for determining the visual field, e.g. perimeter types
    • A61B 3/028: Subjective types for testing visual acuity; for determination of refraction, e.g. phoropters
    • A61B 3/06: Subjective types for testing light sensitivity, e.g. adaptation; for testing colour vision
    • A61B 3/066: Subjective types for testing colour vision
    • A61B 3/08: Subjective types for testing binocular or stereoscopic vision, e.g. strabismus
    • A61B 3/09: Subjective types for testing accommodation
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: Objective types for determining or recording eye movement
    • A61B 3/18: Arrangement of plural eye-testing or -examining apparatus
    • A61B 90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/50: Supports for surgical instruments, e.g. articulated arms
    • A61B 2090/502: Headgear, e.g. helmet, spectacles

Abstract

A system for integrally measuring clinical parameters of visual function, comprising: a display unit (20) for representing a scene with 3D objects having variable features, such as the virtual position and virtual volume of the 3D object within the scene; motion sensors (60) for detecting the position of the user's head and its distance from the display unit (20); tracking sensors (10) for detecting the position of the user's pupils and the interpupillary distance; an interface (30) for interaction between the user and the scene; and a processing unit (42, 44) for analysing the user's responses based on the data from the sensors (60, 10) and the interface (30), combined with the changes in the features of the 3D objects, and for estimating multiple clinical parameters of visual function related to binocular vision, accommodation, ocular motility and visual perception.

Description

System for integrally measuring clinical parameters of visual function
Technical field
The invention belongs to the field of systems and methods for measuring clinical parameters of visual function, and more particularly to the use of immersive virtual reality technology to measure such parameters.
Background art
At present, measuring these clinical parameters of visual function requires a clinical specialist who examines the patient during a consultation by means of repeated tests and optotypes. In general, the personal and manual components of the measurement process yield results that are subjective, difficult to reproduce and merely qualitative.
Moreover, each measurement is performed in isolation, according to the visual function to be examined. Because the influence of other factors is not taken into account, the results are sometimes not valid. It is known, for example, that patients often use other visual functions to compensate for a specific anomaly or impairment of their overall visual function.
In short, the patient's capacity to adapt is currently not considered, so an action intended to correct a specific anomaly may in practice cause an overall deterioration of the patient's vision. In addition, the measurements and tests are affected by the subjectivity of the specialist who performs them, which greatly limits the repeatability and consistency of the results obtained.
Summary of the invention
The present invention relates to a system for integrally measuring (preferably in real time) ocular, oculomotor and visual function parameters, and for generating therapies and training routines for improving visual function.
To this end, the system uses a tracking sensor that detects the position of the user's pupils, and a three-dimensional (3D) display unit that reproduces scenes for the user containing 3D objects with predetermined characteristics such as size, shape, colour and speed. Both are selected according to the type of test performed during the consultation session. A motion sensor detects the user's movements, enabling the display unit to adapt the scene and give it its immersive character.
The system also includes an interface through which the user can interact. In particular, the interface conveys commands from the user for interacting with the system. These commands can be registered in a number of different ways (control buttons, voice commands, gloves, etc.).
The system also comprises a processing unit that manages the display unit, the sensors and the interface in a coordinated manner. The user's responses to the visual stimuli generated on the display unit are thus captured by the sensors and sent to the processing unit in order to measure the clinical parameters.
The invention focuses on virtual reality technology that allows environments to be generated with which the user interacts. In particular, what is sought is the ability to immerse the user in a virtual environment. This is of special interest for creating conditions similar to a real environment for the user, which can then be repeated and reproduced as many times as required. To this end, the display unit must be coupled to the motion sensor carried by the user. In some embodiments the display unit may be virtual reality glasses; in other embodiments, a 3D screen and polarised glasses. In any case, coupling the motion sensor and the display unit allows the displayed 3D image to adapt to the person's movement or position, so that the user perceives the image as moving through the displayed virtual environment, that is, feels immersed in it, preferably with a field of view of at least 60°, so that visual function can be properly evaluated, treated and trained. To achieve these goals, precise coordination between the elements taking part in the session is important. The user is therefore first introduced into a 3D virtual reality environment in which he or she is immersed. Within this 3D environment, certain 3D objects are shown in the manner of optotypes; these optotypes are intended to act as stimuli on which the user must focus his or her gaze, and are related to the test to be performed.
Brief description of the drawings
Fig. 1 shows a simplified block diagram of a possible embodiment of the invention.
Figs. 2A and 2B show an example of measurements on a healthy user viewing a scene with a moving 3D object.
Figs. 3A and 3B show an example of measurements on a user with a dysfunction viewing a scene with a moving 3D object.
Fig. 4 shows an overview of the general steps carried out in an embodiment.
Detailed description
With reference to the above figures, exemplary non-limiting embodiments are described in more detail below.
Fig. 1 shows a system for integrally measuring clinical parameters of visual function in real time, comprising several components. A tracking sensor 10 periodically detects the position of the user's pupils; it can therefore measure not only changes of direction but also their speed. In general, the tracking sensor 10 makes it possible to measure multiple parameters, depending on the specific test performed during the session; Figs. 2 and 3 illustrate this aspect of the invention in more detail. For example, the tracking sensor 10 can obtain values for the positions of the right and left eyes, the position of the object the user is looking at (binocularly and monocularly), the eye-to-sensor distance, pupil size, interocular distance, eye movement speed, and so on. Typically, to perform these measurements the tracking sensor 10 includes a pair of cameras aimed at the user's eyes that capture their movement and position. This requires a sufficiently high sampling frequency to capture the rapid movements of the eyes. The position at which the user is looking within the generated virtual environment must also be calculated. The tracking sensor 10 is essential for correct optometric measurements: a large number of dysfunctions are detected through anomalous eye movements in response to certain stimuli. For clarity, Figs. 2 and 3 show how the measurements made by the sensors 10, 60 relate to the user's visual condition, with and without a possible dysfunction.
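Why the sampling frequency matters can be illustrated with a small sketch that is not part of the patent: the angular speed of the eye is estimated from two consecutive gaze-direction samples, and fast saccadic movements (which can exceed several hundred degrees per second) are only observable if the samples are close enough in time. The function name and the 4 ms interval are illustrative assumptions.

```python
import math

def angular_velocity(g1, g2, dt):
    """Angular speed (deg/s) between two unit gaze vectors sampled dt seconds apart."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(g1, g2))))
    return math.degrees(math.acos(dot)) / dt

# Two samples 4 ms apart (i.e. a 250 Hz tracker, an assumed figure) showing a
# 2-degree shift of the gaze direction: 2 deg / 0.004 s = 500 deg/s,
# a speed typical of a saccade; a slow tracker would miss this entirely.
v = angular_velocity(
    (0.0, 0.0, 1.0),
    (math.sin(math.radians(2.0)), 0.0, math.cos(math.radians(2.0))),
    0.004,
)
```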
A 3D display unit 20 with immersive capability reproduces or projects scenes with depth characteristics for the user, including 3D objects with predetermined attributes such as size, shape, colour, position in the scene, distance from the user, and static or moving state. These scenes, whose 3D objects act as optotypes, can be selected in the system according to the type of test to be performed, so that specific visual stimuli are generated in the user. Multiple scenes with different visual challenges and stimuli can therefore be designed for the user, for evaluating, treating or training visual function.
The system also includes an interface 30 for user interaction. In particular, the interface conveys commands from the user to control the display unit 20 and the other components of the system. The interface 30 can in turn send test instructions to the user. The system can thus measure the responses to the user's actions (movements, position in the 3D environment, key presses, etc.).
The system further includes a processing unit 40, preferably implemented as a server 42 and a terminal 44, which share the coordinated management and control of the display unit 20, the sensors 10 and the interface 30; the user's eye responses are detected by the sensor 10 and transmitted to the server 42 in order to measure the clinical parameters of visual function. In addition, the display unit 20 allows the represented 3D image to be adjusted according to the user's movement. The display unit 20 may include a separation system (for example, polarised glasses).
Preferably, the test is started via a 3D interface. While a specific scene is being visualised, the visual stimuli of the user detected by the sensor 10 at a given instant are associated with the 3D object represented on the display unit 20 at that instant. Changes in the position of the user's pupils are detected and combined with the movements of the user's head, which are detected by a motion sensor 60. Coupling the motion sensor 60 to the display unit 20 allows a 3D image to be displayed that adapts to the person's movement or position, so that users feel as if they were actually moving through the visualised virtual environment, that is, as if they were immersed in it.
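The coupling between the motion sensor 60 and the display unit 20 amounts to re-rendering the scene from the measured head position, which produces motion parallax. A minimal sketch of that idea, assuming a simple pinhole projection (the projection model and names are illustrative, not taken from the patent):

```python
def project_point(point, head_pos, focal=1.0):
    """Project a 3D scene point onto the display plane as seen from head_pos.

    Moving the head shifts the projection (motion parallax), which is what
    makes the user perceive the displayed scene as a stable 3D environment.
    """
    x, y, z = (p - h for p, h in zip(point, head_pos))
    if z <= 0:
        raise ValueError("point is behind the viewer")
    return (focal * x / z, focal * y / z)

# The same scene point projects differently after a 10 cm lateral head shift:
centred = project_point((0.0, 0.0, 1.0), (0.0, 0.0, 0.0))
shifted = project_point((0.0, 0.0, 1.0), (0.1, 0.0, 0.0))
```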
The data are processed so as to associate the attributes of the 3D objects with the stimuli detected by the sensors 10, 60. This allows the clinical parameters of visual function to be measured under reproducible and controllable conditions. By suitably processing the acquired data, the user's visual behaviour, eye movements, convergence, etc. can thus be determined. The clinical parameters of visual function can also be compared with expected ranges in order to assess whether any problem exists.
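Associating the attributes of the 3D object with the detected stimuli is essentially a time-alignment problem: each sensor sample is paired with the object state that was on screen at its timestamp. A hypothetical sketch (the data layout is an assumption, not the patent's format):

```python
def align(stimuli, samples):
    """Pair each gaze sample with the 3D-object state active at its timestamp.

    `stimuli`: list of {"t": onset_time, "object": state} dicts;
    `samples`: list of (timestamp, gaze) tuples.
    """
    paired = []
    for t, gaze in samples:
        # the active stimulus is the most recent one whose onset precedes t
        active = max((s for s in stimuli if s["t"] <= t), key=lambda s: s["t"])
        paired.append({"t": t, "gaze": gaze, "object": active["object"]})
    return paired

pairs = align(
    stimuli=[{"t": 0.0, "object": "train at A"}, {"t": 1.0, "object": "train at B"}],
    samples=[(0.5, (0.1, 0.2)), (1.5, (0.3, 0.1))],
)
```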
As indicated above, together with the visualisation of the 3D objects on the display unit 20, the tracking sensor 10 tracks the user's gaze within the virtual reality environment. The tracking sensor 10 records:
the position of each eye (left and right);
the point at which each eye is looking (separately);
the point in the 3D environment at which the user is looking with both eyes combined.
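The third recorded quantity, the point the user is looking at with both eyes combined, can be estimated by triangulating the two gaze rays; since measured rays rarely intersect exactly, a common choice is the midpoint of the shortest segment between them. A sketch under that assumption (names and the midpoint criterion are illustrative, not the patent's method):

```python
def _dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def binocular_gaze_point(p_l, d_l, p_r, d_r):
    """Fixation point as the midpoint of the shortest segment between the
    left-eye ray (p_l + t*d_l) and the right-eye ray (p_r + s*d_r)."""
    w0 = tuple(a - b for a, b in zip(p_l, p_r))
    a, b, c = _dot(d_l, d_l), _dot(d_l, d_r), _dot(d_r, d_r)
    d, e = _dot(d_l, w0), _dot(d_r, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("gaze rays are parallel (no convergence)")
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q_l = tuple(p + t * v for p, v in zip(p_l, d_l))
    q_r = tuple(p + s * v for p, v in zip(p_r, d_r))
    return tuple((u + v) / 2 for u, v in zip(q_l, q_r))

# Eyes 6 cm apart, both aimed at a target 40 cm straight ahead:
fix = binocular_gaze_point((-0.03, 0.0, 0.0), (0.03, 0.0, 0.4),
                           (0.03, 0.0, 0.0), (-0.03, 0.0, 0.4))
```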
At the same time, instructions can be displayed to guide the user by explaining what he or she must do at each moment. These instructions can be given as text or audio through the interface 30. The interface 30 also allows the user to interact with the 3D objects of the scene represented on the display unit 20. From this interaction with the user, the responses given to the displayed stimuli (i.e. the measured values) must be recorded.
For example, these user responses may include:
movement of the device (in any direction in space);
the position of the device in the virtual reality environment;
pressing a button on the device;
voice commands.
For the tasks described above, although the tasks are provided (downloaded) from the external server 42, the process is preferably executed in the client terminal 44. This distributed environment makes it possible to reduce the technical requirements of the terminal 44, to centrally control the tests executed by different users, to access statistical data, and so on. For example, the heaviest operations and calculations can be executed on the server 42, offloading processing work from the terminal 44. The features that define a test and are established from the server 42 may include:
the virtual reality environment to be used;
the 3D objects and their features (size, distance, colour, movement, ...);
the type of instructions given to the user;
when the tracking sensor 10 is used to capture information;
when the user interface 30 is used to capture information;
which data are recorded and exported as the result of the executed task.
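The features listed above could be grouped into a single test-definition object that the terminal 44 downloads from the server 42. A hypothetical sketch; every field name and default here is an illustrative assumption, not the patent's data model:

```python
from dataclasses import dataclass, field

@dataclass
class TestDefinition:
    """Everything the terminal needs to run one test (illustrative names)."""
    environment: str                                 # virtual environment to use
    stimuli: list = field(default_factory=list)      # 3D objects and their features
    instructions: str = ""                           # what the user is told to do
    use_eye_tracker: bool = True                     # when sensor 10 captures data
    use_interface: bool = True                       # when interface 30 captures data
    recorded_fields: tuple = ("gaze", "head_pose", "interface_events")

# Example definition for the train scene of Figs. 2A-2B:
train_test = TestDefinition(
    environment="railway",
    stimuli=[{"model": "train", "size": 1.0, "speed": 0.5, "color": "red"}],
    instructions="Follow the train with your eyes",
)
```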
As for the data to be recorded, there are data from the sensors 10, 60 as well as data from the interface 30 with which the user interacts.
Once the local processing of all the data is finished, the data are grouped and sent to the server 42 for storage and subsequent analysis. Statistics, new tests, suggestions, therapies, etc. can then be produced.
For example, it can be verified whether the value obtained for a given parameter falls within a range of tolerance, according to scientific studies stored on the server 42. New scenes can also be designed as suggestions, serving as therapy or training to improve those functions for which the tests gave poor results.
Figs. 2A and 2B show an example of a user interacting with the system at two moments in time. At the initial moment t = ti, the system represents on the display unit 20 a 3D model corresponding to a train running along a railway track.
The user carries a motion sensor 60 that records the head movement at the two moments, (Xic, Yic) and (Xfc, Yfc), as well as the distances Dic, Dfc to the display unit 20. Similarly, the tracking sensor 10 records the movement of the user's pupils at the two moments, providing further information about the position of both pupils. Right eye: (xi1, yi1, zi1), (xf1, yf1, zf1); left eye: (xi2, yi2, zi2), (xf2, yf2, zf2).
The display unit 20, meanwhile, represents the 3D object at two different virtual positions (xio, yio, zio), (xfo, yfo, zfo) and, at each moment, with two different volumes Vio, Vfo. Other attributes, such as the colour of the 3D object, may change as a function of the test being performed in the session.
By processing the above values it is checked whether the user's eyes are properly coordinated with the movement of the 3D object in the scene. This visual behaviour corresponds to a healthy individual.
Figs. 3A and 3B schematically show the same situation for a user whose visual behaviour does not respond appropriately to the stimulus. As shown in Fig. 3A, the user does not correctly align the visual axis of the left eye (xi2, yi2, zi2) with the object of interest (Vio), revealing a limitation of binocular vision (strabismus). In this case the deviation angle (Fig. 3B) remains unchanged when the object of interest is moved (Vfo), indicating a comitant condition, that is, one with the same deviation angle in the different positions of gaze. This information is essential for determining the severity of the condition and the type of vision therapy recommended in order to re-establish the patient's binocularity.
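The comitance check of Figs. 3A and 3B can be sketched as follows: for each position of the object of interest, compute the angle between the measured visual axis and the ideal eye-to-target direction; a deviation that stays roughly constant across positions suggests a comitant pattern. The function names and the 2° tolerance are illustrative assumptions, not the patent's method:

```python
import math

def deviation_deg(eye_pos, gaze_dir, target_pos):
    """Angle between the measured gaze direction and the ideal eye-to-target direction."""
    to_target = tuple(t - e for t, e in zip(target_pos, eye_pos))
    def norm(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)
    g, ideal = norm(gaze_dir), norm(to_target)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(g, ideal))))
    return math.degrees(math.acos(dot))

def is_comitant(deviations, tolerance_deg=2.0):
    """Roughly equal deviation in all gaze positions -> comitant pattern."""
    return max(deviations) - min(deviations) <= tolerance_deg

# A left eye aiming 5 degrees off target, for two positions of the object:
devs = [
    deviation_deg((0, 0, 0),
                  (math.sin(math.radians(5)), 0, math.cos(math.radians(5))),
                  (0, 0, 1)),
    deviation_deg((0, 0, 0),
                  (math.sin(math.radians(35)), 0, math.cos(math.radians(35))),
                  (math.sin(math.radians(30)), 0, math.cos(math.radians(30)))),
]
```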
Obviously, the scenes described are merely examples. Other scenes could be an aquarium in which fish of different shapes, colours and sizes keep appearing and disappearing; a road with vehicles approaching the user; a cave from which moles emerge at random; and so on. In these scenes, all the parameters can be measured simultaneously and objectively, without underestimating the influence they have on one another.
Fig. 4 briefly illustrates a possible sequence of actions during the operation of the system for a test. In a first step 50, the user's personal information is registered. Preferably, the data introduced are gender, age, habits, etc., for which the interface 30 can be used. Through the terminal 44, the user sends a request as a client to the server 42, and the application associated with the selected type of test is installed.
The user is positioned in front of the display unit 20, and instructions are given through the interface 30 or the display unit 20 so that the tracking sensor 10 is correctly placed, or so that, according to the motion sensor 60, the tracking sensor 10 is in an appropriate position relative to the display unit. Then, in step 51, the scene associated with the selected test is represented using one or more 3D objects whose attributes change over time or with the user's interaction.
The user interacts with the display unit in step 52 through the interface 30. In general, the user can be guided during the test with graphics or audio. For example, the interface 30 may include any element that makes it easier for the user to interact with the 3D objects of the scene represented on the display unit 20.
In step 53, the sensors 10, 60 detect values while the scene is being reproduced. These data must be sent to the terminal 44 with minimum delay.
In step 54, the terminal 44 receives the captured data, pre-processes them and sends them to the server 42 in order to obtain the values of the clinical parameters of visual function.
When the test, or the session comprising different tests, is finished, the terminal 44 sends the acquired data to the server 42 for storage and further processing. In particular, in step 55, the parameters are compared with the expected ranges for the user's profile.
In step 56, when the server 42 processes the data obtained, it associates them with possible dysfunctions.
Finally, in step 57, the server generates possible suggestions for improving the dysfunction and sends them to the terminal 44 so that they are displayed to the user together with the results obtained.
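Steps 54 to 57 can be condensed into a short sketch: local preprocessing at the terminal, comparison against the expected ranges, and generation of suggestions. Parameter names, ranges and the averaging step are hypothetical placeholders, not clinical values from the patent:

```python
def analyse_session(sensor_data, reference_ranges):
    """Condensed sketch of steps 54-57: preprocess, compare, suggest."""
    # step 54: local preprocessing at the terminal (here: averaging raw samples)
    params = {name: sum(vals) / len(vals) for name, vals in sensor_data.items()}
    # step 55: compare each parameter with the expected range for the profile
    out_of_range = {n: v for n, v in params.items()
                    if not reference_ranges[n][0] <= v <= reference_ranges[n][1]}
    # steps 56-57: associate out-of-range values with possible dysfunctions
    # and generate a training suggestion for each
    suggestions = ["training suggested for " + n for n in sorted(out_of_range)]
    return params, out_of_range, suggestions

params, issues, tips = analyse_session(
    sensor_data={"convergence_cm": [12.0, 14.0], "pupil_mm": [3.9, 4.1]},
    reference_ranges={"convergence_cm": (0.0, 10.0), "pupil_mm": (2.0, 8.0)},
)
```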
Thanks to the technology used, the tests are executed in an objective, integral and personalised way, and different visual dysfunctions can be identified. In particular: dysfunctions that limit the ability to align the eyes on the object of interest (convergence insufficiency, convergence excess, inflexibility of vergence); that limit the ability to focus (accommodative insufficiency, accommodative excess, accommodative inflexibility); that limit changing fixation from one object to another (saccadic movements); that limit following an object (smooth pursuit movements); or that limit the visual abilities needed to identify and manage the information of the environment. All of them can be evaluated and trained in a personalised way, based not only on the conditions under which the test is executed but also on the development of each working session. Moreover, varied visualisations can be provided for the same exercise, which allows daily visual demands to be better catered for while keeping up interest and attention during the exercise.
It should be noted that the invention can be used not only to identify dysfunctions, but also to train the physical and technical activity of healthy users through visual stimuli and challenges. It can be applied directly to sportspeople and children, and extends to the specific visual demands of professionals (drivers, pilots, ...) and amateurs (handling miniatures, recreational games, ...).
Finally, one of the advantages of the invention is that only a reduced space is needed to house the necessary equipment. For example, in embodiments in which the display unit 20 is configured as a screen, it can be placed on a table at a certain distance from the user, who preferably sits between 50 cm and 1 m away, together with a computer forming part of the processing unit 40. The remaining components are carried on the user's head and hands (control device, gloves, etc.). There are even fewer elements when, in an embodiment, the display unit 20 is a pair of VR glasses.

Claims (13)

1. A system for integrally measuring clinical parameters of visual function, characterised in that it comprises:
a display unit (20) configured to represent a scene in which at least one 3D object has variable features for eliciting an eye response from a user, wherein the variable features include at least the virtual position (Xo, Yo, Zo) and the virtual volume (Vo) of said 3D object in the scene;
a plurality of motion sensors (60) configured to detect the position of the user's head (Xc, Yc) and the distance (Dc) between the user's head and the display unit (20);
a plurality of tracking sensors (10) configured to detect the position of the user's pupils (Xp, Yp, Zp) and the pupillary diameter (dp);
an interface (30) configured to allow the user to interact with the scene;
a processing unit (42, 44) configured to analyse the user's response by:
associating the data from the sensors (60, 10) and the interface (30) with the changes in the features of the 3D object represented on the display unit;
estimating a plurality of clinical parameters of the user's visual function.
2. The system according to claim 1, wherein said features are variable according to predetermined programming as a function of time.
3. The system according to claim 1 or 2, wherein the variable features further include the colour of the 3D object.
4. The system according to any one of the preceding claims, wherein the features are variable as a function of the user's interaction through the interface (30).
5. The system according to claim 4, wherein the interface (30) includes at least one of the following: a digital pen, gloves, a control device, etc.
6. The system according to any one of the preceding claims, wherein the display unit (20) includes a 3D screen.
7. The system according to any one of the preceding claims, wherein the display unit (20) includes virtual reality glasses.
8. The system according to any one of the preceding claims, wherein the display unit (20) includes a separation system.
9. The system according to any one of the preceding claims, wherein the processing unit (42, 44) is further configured to compare the estimated clinical parameters of visual function with stored reference ranges and to establish a possible visual dysfunction based on the comparison.
10. The system according to claim 8 or 9, wherein the comparison with the reference ranges is made on the basis of a user profile containing at least age information.
11. The system according to any one of the preceding claims, wherein the processing unit includes a client terminal (44) and a server (42), wherein the client terminal (44) is configured to receive and process the data measured by the sensors (10, 60) and to send them to the server (42).
12. The system according to claim 11, wherein the server (42) is configured to compare said values with a database of reference values.
13. The system according to any one of the preceding claims, wherein the clinical parameters of visual function relate to at least one of the following: binocular vision, accommodation, eye movement and visual perception.
CN201780082894.5A 2016-11-10 2017-10-27 System for integrally measuring clinical parameters of visual function Active CN110167421B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP16382521.9 2016-11-10
EP16382521.9A EP3320829A1 (en) 2016-11-10 2016-11-10 System for integrally measuring clinical parameters of visual function
PCT/ES2017/070721 WO2018087408A1 (en) 2016-11-10 2017-10-27 System for integrally measuring clinical parameters of visual function

Publications (2)

Publication Number Publication Date
CN110167421A true CN110167421A (en) 2019-08-23
CN110167421B CN110167421B (en) 2022-03-04

Family

ID=57708479

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780082894.5A Active CN110167421B (en) 2016-11-10 2017-10-27 System for integrally measuring clinical parameters of visual function

Country Status (11)

Country Link
US (1) US11559202B2 (en)
EP (1) EP3320829A1 (en)
JP (1) JP7344795B2 (en)
KR (1) KR102489677B1 (en)
CN (1) CN110167421B (en)
AU (1) AU2017359293B2 (en)
BR (1) BR112019009614A8 (en)
CA (1) CA3043276C (en)
IL (1) IL266461B2 (en)
RU (1) RU2754195C2 (en)
WO (1) WO2018087408A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112866679A (en) * 2021-04-23 2021-05-28 广东视明科技发展有限公司 Multi-point stereoscopic vision detection method in motion state

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3062142B1 (en) 2015-02-26 2018-10-03 Nokia Technologies OY Apparatus for a near-eye display
CN106569339B (en) * 2016-11-08 2019-11-15 歌尔科技有限公司 VR helmet and control method thereof
US10650552B2 (en) 2016-12-29 2020-05-12 Magic Leap, Inc. Systems and methods for augmented reality
EP4300160A2 (en) 2016-12-30 2024-01-03 Magic Leap, Inc. Polychromatic light out-coupling apparatus, near-eye displays comprising the same, and method of out-coupling polychromatic light
US10578870B2 (en) 2017-07-26 2020-03-03 Magic Leap, Inc. Exit pupil expander
US11280937B2 (en) 2017-12-10 2022-03-22 Magic Leap, Inc. Anti-reflective coatings on optical waveguides
WO2019126331A1 (en) 2017-12-20 2019-06-27 Magic Leap, Inc. Insert for augmented reality viewing device
US10755676B2 (en) 2018-03-15 2020-08-25 Magic Leap, Inc. Image correction due to deformation of components of a viewing device
WO2019232282A1 (en) 2018-05-30 2019-12-05 Magic Leap, Inc. Compact variable focus configurations
JP7319303B2 (en) 2018-05-31 2023-08-01 マジック リープ, インコーポレイテッド Radar head pose localization
CN112400157A (en) 2018-06-05 2021-02-23 奇跃公司 Homography transformation matrix based temperature calibration of viewing systems
EP3803545A4 (en) 2018-06-08 2022-01-26 Magic Leap, Inc. Augmented reality viewer with automated surface selection placement and content orientation placement
US11579441B2 (en) 2018-07-02 2023-02-14 Magic Leap, Inc. Pixel intensity modulation using modifying gain values
WO2020010226A1 (en) 2018-07-03 2020-01-09 Magic Leap, Inc. Systems and methods for virtual and augmented reality
US11856479B2 (en) 2018-07-03 2023-12-26 Magic Leap, Inc. Systems and methods for virtual and augmented reality along a route with markers
JP7426982B2 (en) 2018-07-24 2024-02-02 マジック リープ, インコーポレイテッド Temperature-dependent calibration of movement sensing devices
WO2020023543A1 (en) 2018-07-24 2020-01-30 Magic Leap, Inc. Viewing device with dust seal integration
CN112740665A (en) 2018-08-02 2021-04-30 奇跃公司 Observation system for interpupillary distance compensation based on head movement
CN116820239A (en) 2018-08-03 2023-09-29 奇跃公司 Unfused pose-based drift correction of a fused pose of a totem in a user interaction system
US10914949B2 (en) 2018-11-16 2021-02-09 Magic Leap, Inc. Image size triggered clarification to maintain image sharpness
KR102184972B1 (en) * 2019-02-01 2020-12-01 주식회사 룩시드랩스 Apparatus and method for measuring distance between pupil centers
EP3921720A4 (en) 2019-02-06 2022-06-29 Magic Leap, Inc. Target intent-based clock speed determination and adjustment to limit total heat generated by multiple processors
EP3939030A4 (en) 2019-03-12 2022-11-30 Magic Leap, Inc. Registration of local content between first and second augmented reality viewers
CN114127837A (en) 2019-05-01 2022-03-01 奇跃公司 Content providing system and method
US20220240775A1 (en) 2019-05-29 2022-08-04 E-Health Technical Solutions, S.L. System for Measuring Clinical Parameters of Visual Function
CN114174895A (en) 2019-07-26 2022-03-11 奇跃公司 System and method for augmented reality
JP2023502927A (en) 2019-11-15 2023-01-26 マジック リープ, インコーポレイテッド Visualization system for use in a surgical environment

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1550039A (en) * 2001-07-06 2004-11-24 Imaging system and methodology employing reciprocal space optical design
US20050175218A1 (en) * 2003-11-14 2005-08-11 Roel Vertegaal Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections
JP2008012223A (en) * 2006-07-10 2008-01-24 Nippon Telegr & Teleph Corp <Ntt> Device for adding function for game table having sight line analyzing and tracking function, and method for adding function for game table
TW200913957A (en) * 2007-09-17 2009-04-01 Liao Li Shi System for detecting and compensating vision defect and method thereof
CN101727531A (en) * 2008-10-16 2010-06-09 国际商业机器公司 Method and system used for interaction in virtual environment
CN203253067U (en) * 2011-03-15 2013-10-30 迈克尔·格特纳 System for applying focused ultrasound energy on patient or nerve around artery of patient
CN104244806A (en) * 2012-03-08 2014-12-24 埃西勒国际通用光学公司 Method for determining a behavioural, postural or geometric-morphological characteristic of a person wearing spectacles
CN104382552A (en) * 2014-11-27 2015-03-04 毕宏生 Equipment and method for testing comprehensive vision functions
CN104545787A (en) * 2014-12-12 2015-04-29 许昌红 Wearable pupil light reflex measurement equipment
CN104994776A (en) * 2012-10-22 2015-10-21 真正视野有限责任公司 Network of devices for performing optical/optometric/ophthalmological tests, and method for controlling said network of devices
CN105208917A (en) * 2012-11-26 2015-12-30 澳大利亚国立大学 Clustered volley method and apparatus
CN105212890A (en) * 2006-01-26 2016-01-06 诺基亚公司 Eye tracker equipment
CN105578954A (en) * 2013-09-25 2016-05-11 迈恩德玛泽股份有限公司 Physiological parameter measurement and feedback system
CN105832502A (en) * 2016-03-15 2016-08-10 广东卫明眼视光研究院 Intelligent visual function training method and instrument
US20160262608A1 (en) * 2014-07-08 2016-09-15 Krueger Wesley W O Systems and methods using virtual reality or augmented reality environments for the measurement and/or improvement of human vestibulo-ocular performance

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
UY28083A1 (en) * 2003-11-14 2003-12-31 Nicolas Fernandez Tourn Suarez VESTIBULAR REHABILITATION UNIT
US20060005846A1 (en) * 2004-07-07 2006-01-12 Krueger Wesley W Method for balance enhancement through vestibular, visual, proprioceptive, and cognitive stimulation
JP4890060B2 (en) 2005-03-31 2012-03-07 株式会社トプコン Ophthalmic equipment
US9149222B1 (en) * 2008-08-29 2015-10-06 Engineering Acoustics, Inc Enhanced system and method for assessment of disequilibrium, balance and motion disorders
US9345957B2 (en) * 2011-09-30 2016-05-24 Microsoft Technology Licensing, Llc Enhancing a sport using an augmented reality display
US9706910B1 (en) * 2014-05-29 2017-07-18 Vivid Vision, Inc. Interactive system for vision assessment and correction
JP6663441B2 (en) * 2015-03-01 2020-03-11 ノバサイト リミテッド System for measuring eye movement
JP6887953B2 (en) * 2015-03-16 2021-06-16 マジック リープ,インコーポレイティド Methods and systems for diagnosing and treating health-impairing illnesses

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LETICIA FLORES-PULIDO et al.: "Radial Basis Function for Visual Image Retrieval", 2010 Electronics, Robotics and Automotive Mechanics Conference *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112866679A (en) * 2021-04-23 2021-05-28 广东视明科技发展有限公司 Multi-point stereoscopic vision detection method in motion state
CN112866679B (en) * 2021-04-23 2021-08-10 广东视明科技发展有限公司 Multi-point stereoscopic vision detection method in motion state

Also Published As

Publication number Publication date
EP3320829A1 (en) 2018-05-16
CA3043276A1 (en) 2018-05-17
KR20190104137A (en) 2019-09-06
US20190254519A1 (en) 2019-08-22
KR102489677B1 (en) 2023-01-16
RU2019116179A3 (en) 2021-02-09
US11559202B2 (en) 2023-01-24
AU2017359293A1 (en) 2019-06-06
IL266461B2 (en) 2023-09-01
BR112019009614A2 (en) 2019-08-13
JP2019535401A (en) 2019-12-12
AU2017359293B2 (en) 2022-11-17
WO2018087408A1 (en) 2018-05-17
CA3043276C (en) 2023-10-10
IL266461B1 (en) 2023-05-01
BR112019009614A8 (en) 2023-03-21
CN110167421B (en) 2022-03-04
RU2754195C2 (en) 2021-08-30
EP3320829A8 (en) 2018-07-18
JP7344795B2 (en) 2023-09-14
RU2019116179A (en) 2020-12-10
IL266461A (en) 2019-06-30

Similar Documents

Publication Publication Date Title
CN110167421A (en) Integrally measure the system of the clinical parameter of visual performance
EP3384437B1 (en) Systems, computer medium and methods for management training systems
Nyström et al. The influence of calibration method and eye physiology on eyetracking data quality
KR20210060595A (en) Human-computer interface using high-speed and accurate tracking of user interactions
US20170150907A1 (en) Method and system for quantitative assessment of visual motor response
CN107592798A Method and apparatus for determining user's eyesight
EP3621276A1 (en) Apparatus, method and program for determining a cognitive state of a user of a mobile device
JP7146800B2 (en) A decentralized network for securely collecting, analyzing, and sharing data across platforms
JP7442596B2 (en) Platform for Biomarker Identification Using Navigation Tasks and Treatment Using Navigation Tasks
AU2016410178A1 (en) Method and system for quantitative assessment of visual motor response
WO2019210087A1 (en) Methods, systems, and computer readable media for testing visual function using virtual mobility tests
CN115191018A (en) Evaluation of a person or system by measuring physiological data
US11666259B1 (en) Assessing developmental disorders via eye tracking
Wang et al. TAT-HUM: Trajectory Analysis Toolkit for Human Movements in Python
De Bruin Automated usability analysis and visualisation of eye tracking data
Sasaoka et al. Ease of hand rotation during active exploration of views of a 3-D object modulates view generalization
EP4325517A1 (en) Methods and devices in performing a vision testing procedure on a person
Spenthof Development and evaluation of a naturalistic dyadic eye-tracking paradigm
Gutierrez et al. Monitoring and management System of patients with shoulder tendinopathy in rehabilitation using Kinect 2.0
US20210133429A1 (en) Augmented reality system for measurement and therapeutic influence of mental processes
Estemyr et al. Designing a VR user experience test regarding the Vergence-Accommodation Conflict: An investigation surrounding the relations to Depth Perception
Eriksson et al. Eye Diagnostics with Virtual Reality
Renaud et al. The use of virtual reality in clinical psychology research: Focusing on approach and avoidance behaviors

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant