CN101657145A - Unitary vision and coordination testing center - Google Patents
- Publication number
- CN101657145A CN101657145A CN200880011931A CN200880011931A CN101657145A CN 101657145 A CN101657145 A CN 101657145A CN 200880011931 A CN200880011931 A CN 200880011931A CN 200880011931 A CN200880011931 A CN 200880011931A CN 101657145 A CN101657145 A CN 101657145A
- Authority
- CN
- China
- Prior art keywords
- test
- visual
- input
- visual capacity
- capacity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Abstract
Systems and methods for testing and/or training a subject's visual ability are provided. More specifically, the method may include testing various aspects of the subject's visual acuity, such as clarity, contrast, tracking, etc. By using various tests, a more efficient examination may be administered. In accordance with the invention, an individual may be tested using methods of testing and/or training at a unitary center, where the unitary center is capable of presenting visual tests to the individual, receiving input from the individual, and processing the received input. Such a unitary test center may further be configurable, so that the tests administered may vary based on the needs of the individual. The received input may then, for example, be used to compute data related to the user's visual acuity, both overall and for each individual test.
Description
Cross-reference to related applications
This application claims priority to U.S. Provisional Patent Application No. 60/923,434, filed April 13, 2007, entitled "System and Method for Testing Visual Ability During Simulated Activity," which is incorporated herein by reference. This application also claims priority to U.S. Provisional Patent Application No. 60/941,915, filed June 4, 2007, entitled "System and Method for Decoupled Visual Ability Testing," which is incorporated herein by reference.
Statement regarding federally sponsored research or development
Inapplicable.
Technical field
The present invention relates generally to the evaluation and/or training of an individual's visual ability.
Background technology
When participating in an activity, such as a sport, an individual's vision plays a role in the individual's performance in addition to physical ability. Typically, to improve at a sport or activity, an individual will work on improving physical abilities in order to raise overall performance. However, an individual's performance can also be improved by testing and training the individual's visual abilities.
Brief summary of the invention
This Summary introduces a selection of concepts in a simplified form that are further described below in the Detailed Description. It is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In accordance with the present invention, methods for testing and/or training a subject's visual ability are provided. In particular, the methods may test various aspects of the subject's visual ability, such as clarity, contrast, tracking, and the like. By using a variety of tests, a more efficient examination can be administered. In accordance with the invention, an individual may receive such testing and/or training at a unitary center capable of presenting visual tests to the individual, receiving input from the individual, and processing the received input. Such a unitary test center may further be configurable, so that the tests administered can vary based on the needs of the individual. The received input may then, for example, be used to compute data related to the individual's visual ability, both overall and for each individual test.
Description of drawings
The present invention is described in detail below with reference to the accompanying drawings, in which:
Fig. 1 is a block diagram of a computing system environment suitable for implementing the present invention;
Fig. 2 is a block diagram of an exemplary test suite used in accordance with an embodiment of the invention;
Fig. 3 is a block diagram of exemplary processing components for implementing the present invention;
Fig. 4 shows an exemplary unitary vision test unit in accordance with an embodiment of the invention;
Fig. 5 shows another embodiment of a unitary vision test unit in accordance with the present invention; and
Fig. 6 is a flow chart of a method for testing a subject's visual ability at a single location in accordance with an embodiment of the invention.
Detailed description
The subject matter of the present invention is described with specificity herein to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventors contemplate that the claimed subject matter might also be embodied in other ways, to include different steps or combinations of steps similar to those described in this document, in conjunction with other present or future technologies.
In accordance with the present invention, systems and methods are provided for testing a subject's visual ability at a single, unitary test unit. Such methods may include testing various aspects of the subject's visual acuity, such as clarity, contrast, tracking, and the like, at a unitary test unit that can also process the resulting data and/or transmit the data over a network to another location for processing. In doing so, a unitary test center can streamline the process of testing a subject's visual ability and can reduce the resources (e.g., equipment) required for testing. Additionally, the unitary test center may be configurable, so that the tests administered can vary based on the needs of the individual. The received input may then, for example, be used to compute results related to the individual's visual ability, both overall and for each individual test.
In one embodiment, a test unit for testing a subject's visual ability is provided. Such a test unit may include a presenting component, an input component, and a processing component. The presenting component may present a visual clarity test, a contrast sensitivity test, a visual tracking test, a distance focusing test, and a visual aiming test to the subject. The subject may respond to each test, providing input to the test unit. The input component is configured to receive the input, and the processing component may be configured to process the received input.
In another embodiment, a method of testing a subject's visual ability is provided, where the method occurs at a single location. The method comprises, in part, administering two or more visual ability tests to the subject; receiving input from the subject in response to each test; and processing the input received from the subject.
Referring to the drawings generally, and initially to Fig. 1, a block diagram of an exemplary computing system configured to test a subject's visual ability is shown and designated generally as computing system 100. It will be understood and appreciated by those of ordinary skill in the art that the computing system 100 shown in Fig. 1 is merely an example of one suitable computing system environment and is not intended to suggest any limitation as to the scope of use or functionality of embodiments of the present invention. Neither should the computing system 100 be interpreted as having any dependency or requirement related to any single component or combination of components illustrated therein.
If the input device 102 is an eye-tracking device, it can monitor the position and/or focus of the subject's eyes and record an input when the eyes are in the proper position and/or focus.
If the input device 102 is a gesture-recognition system, a variety of systems and/or methods may be used to receive inputs. For example, one or more cameras, in conjunction with appropriate hardware and/or software, may be used to monitor the movement of the subject's body, limbs, and/or extremities and to record an input when the subject makes an appropriate gesture. A gesture-recognition system may also utilize optical markers attached to the subject to facilitate motion tracking. One or more transmitters attached to the subject (for example, using infrared, sonic, subsonic, or ultrasonic transmission) may also be used as part of a gesture-recognition system.
If the input device 102 is a touch screen, any type of touch screen may be utilized. Likewise, a touch-sensitive overlay may be used with a display that is not itself touch-sensitive in order to receive touch inputs. Such an overlay may be at any distance from the display.
Returning to Fig. 1, in the illustrated embodiment of the present invention, the test unit 110 may include a presenting component 112, an input component 114, a test suite 116, and a processing component 118. It will be apparent to those skilled in the art that the components 112, 114, 116, and 118 shown in Fig. 1 are exemplary in nature and in number and should not be construed as limiting. Any number of components may be employed to achieve the desired functionality within the scope of embodiments of the present invention.
The presenting component 112 may display video output visually observable by the subject and may be any type of computer, testing apparatus, or television monitor, including a cathode ray tube, liquid crystal display, plasma screen, or any other display type, or may comprise a screen upon which images are projected, either from the front or from the rear.
In one embodiment, the presenting component 112 may be a device utilizing mirrors and/or lenses strategically placed to create a visual perspective of distance within a limited amount of space (e.g., providing mirror edges configured to create a tunnel effect). An example of such a device is a perspective testing apparatus that uses mirrors to create a perception of distance. Such a device may include a mirror that displays a visual marker in the foveal area (i.e., directly in front of the subject) and may further include side mirrors that display visual markers to test peripheral visual ability.
In another embodiment, a device may include lenses that alter the perceived distance and/or size of the displayed visual marker to achieve a simulated distance. As a result, such a device can provide a display in which the visual marker appears to the subject to be nearer or farther than it actually is. Thus, this configuration creates a perspective of optical infinity for the subject.
Those skilled in the art will appreciate that the presenting component 112 may comprise multiple devices that, in combination, display the visual stimuli typical of a particular activity. In one embodiment, a single device may be used to present multiple displays of visual markers (e.g., a split screen).
The presenting component 112 may alternatively comprise display eyewear, goggles, visors, or the like, wearable by the subject, thereby providing the subject with a visual display that is typically not visible to others. Additionally, the presenting component 112 may provide two-dimensional or three-dimensional images to the subject. Three-dimensional displays may include virtual reality or holographic presentations to the subject.
In operation, the presenting component 112 may be configured to present one or more visual markers to the subject. As described more fully below, the presenting component 112 may present visual markers in different ways in order to test different aspects of the subject's visual ability. In general, each visual marker may possess one or more traits. A trait may be, for example, a directional orientation (e.g., an arrow, a Landolt C, a tumbling E, or the like), a position on a user interface (e.g., located within a particular sector of the display), one trait out of a predetermined number of mutually exclusive traits (e.g., up, down, right, and left indicators on a face), or any combination of traits. Further, it will be understood and appreciated by those of skill in the art that other traits may be utilized, and that the present invention is not limited to any particular trait.
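The idea of a marker carrying one trait out of a fixed set of mutually exclusive traits, and of a response being judged against that trait, can be sketched as follows. This is a minimal illustration, not the patent's implementation; the names `make_indicium` and `response_correct` are invented for the sketch.

```python
import random

# Mutually exclusive traits a marker such as a Landolt C may carry.
TRAITS = ("up", "down", "left", "right")

def make_indicium(rng=random):
    """Pick one trait at random for the next presented marker."""
    return {"kind": "landolt_c", "trait": rng.choice(TRAITS)}

def response_correct(indicium, response):
    """A response is correct when it names the marker's presented trait."""
    return response == indicium["trait"]

marker = make_indicium(random.Random(0))
```

Randomizing the trait on each presentation prevents the subject from anticipating the answer, which is why the trait set is described as mutually exclusive.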
The input component 114 may be configured to receive input from the subject (e.g., by utilizing the input device 102). In accordance with the present invention, any suitable receiving component capable of receiving the input provided by the subject may be utilized. By way of example, without limitation, the subject may provide input using a keyboard, joystick, trackball, or the like. The input may depend on the presenting component. For example, if the presenting component is touch-sensitive, the subject may provide input by touching the presenting component. In another embodiment, the input component may have voice-recognition capability, and the subject may provide input with a spoken response that is recognized by the input component. It will be understood and appreciated by those of skill in the art that any suitable input component may be utilized in accordance with the present invention. As explained above, some types may be preferred based on the tests presented by the presenting component and on the capabilities of the presenting component. Upon receiving input from the subject, the input component 114 may store the input, for example, in the database 104 for future reference.
The processing component 118 is provided to process the input received by the input component 114. As shown in Fig. 3, the processing component 118 may include a scoring component 310, a data collection component 312, a training development component 314, and a delivery component 316. The scoring component 310 may be configured to utilize a scoring algorithm to derive a score based on the subject's responses to the presented tests. The score may be determined by comparing the subject's responses with responses received from a particular population, typically retrieved from the database 104. The scoring component 310 may thereby provide an assessment of the subject's visual ability based on one or more responses to the presented visual markers. Once a score (e.g., a percentile) has been determined, it may be presented to the subject by the presenting component 112. The score may be presented at the completion of each test, at the completion of all tests, or a combination thereof.
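Comparing a subject's result against stored population data to produce a percentile, as the scoring component is described as doing, can be sketched as follows. The function name and the population numbers are illustrative assumptions, not values from the patent.

```python
from bisect import bisect_left

def percentile_rank(score, population_scores):
    """Percentage of the reference population scoring strictly below `score`."""
    ordered = sorted(population_scores)
    return 100.0 * bisect_left(ordered, score) / len(ordered)

# A subject answering 18 of 20 trials correctly, compared against
# illustrative stored scores for a reference population.
population = [10, 12, 14, 15, 16, 17, 18, 19, 20, 20]
rank = percentile_rank(18, population)  # 60.0: six of ten scores fall below 18
```

A strict-below convention is only one choice; a real scoring algorithm might interpolate or handle ties differently.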
Those skilled in the art will appreciate that the delivery component 316 may transmit information from the test unit 110 at any desired frequency. That is, information may be sent to the desired location, for example, after the subject has completed all tests or, alternatively, after the completion of each individual test. If the information is being sent to the center 106 or to the database 104 for storage and/or processing, the information for all subjects may be sent together at the completion of testing. The frequency may depend upon the storage and processing capabilities of the test unit 110 and upon the intended use of the information.
Referring now to Fig. 2, the test suite 116 is further described. The test suite 116 may include a visual clarity component 210, a contrast sensitivity component 212, a visual tracking component 214, a near/far focus component 216, and a fixation disparity component 218. Additional components may include target acquisition, visual acuity, dynamic visual acuity, gaze stability, depth perception, binocular vision, or any other visual skill. The test unit 110 may utilize each of these components to test a different aspect of an individual's visual ability. Those skilled in the art will appreciate that other visual tests may be utilized within the scope of the present invention, and that any combination of these and/or other test components may be utilized.
The visual clarity component 210 is configured to test the subject's visual clarity and may involve displaying a visual marker at a predetermined distance and requiring the subject to identify the marker. For example, the visual marker may be presented at successively smaller sizes until the subject can no longer identify it. Such a test may utilize a typical eye chart. Alternatively, a Landolt C may be presented to the subject. In one embodiment, a testing apparatus may be utilized as the presenting component 112 to present visual markers so as to simulate the subject's distance from the visual marker. It will be understood and appreciated by those of skill in the art that the visual clarity component 210 may utilize any suitable test of the subject's visual clarity.
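A descending size sweep of this kind can be sketched as a simple staircase. This is one plausible procedure under stated assumptions (largest-to-smallest presentation, stop at the first miss); the patent does not prescribe a particular algorithm, and `identifies` merely stands in for the subject's response.

```python
def acuity_threshold(sizes, identifies):
    """Present markers from largest to smallest; stop at the first miss.

    `identifies(size)` models the subject's response. The returned threshold
    is the smallest size the subject still identified, or None if even the
    largest marker was missed."""
    threshold = None
    for size in sorted(sizes, reverse=True):
        if identifies(size):
            threshold = size
        else:
            break  # first failure ends the descending run
    return threshold

# A simulated subject who resolves markers of size 4 and larger.
result = acuity_threshold([10, 8, 6, 4, 2], lambda s: s >= 4)  # -> 4
```

Clinical staircases usually reverse direction several times and average the reversal points; a single descending pass keeps the sketch short.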
The contrast sensitivity component 212 is configured to test the subject's contrast sensitivity. Such a test may be used to measure the subject's ability to perceive different luminances in a static image. In one embodiment, a visual marker having two different luminances is presented to the subject. The subject may then be required to identify a trait of the visual marker, where the trait requires the subject to distinguish the different luminances presented. It will be understood and appreciated by those of skill in the art that the contrast sensitivity component 212 may utilize any suitable contrast sensitivity test.
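A standard way to quantify the difference between two presented luminances is Michelson contrast. The formula is general optics background rather than something taken from the patent, and the luminance values below are illustrative.

```python
def michelson_contrast(l_max, l_min):
    """Michelson contrast between the brightest and darkest luminances shown."""
    return (l_max - l_min) / (l_max + l_min)

# A marker drawn with illustrative luminances of 120 and 80 cd/m^2.
c = michelson_contrast(120.0, 80.0)
```

A contrast sensitivity test would typically reduce this quantity across trials until the subject can no longer report the marker's trait.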
The visual tracking component 214 is configured to test the subject's visual tracking ability. Any suitable test may be utilized within the scope of the present invention. By way of example, without limitation, a visual marker, such as a box, may be presented to the subject. The box may then move across the display device in various directions and at various speeds. The subject must then identify a second visual marker, for example a Landolt C, which may be presented inside the box. Such a test requires the subject to track the first visual marker before identifying the second visual marker, thereby testing the subject's visual tracking ability.
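The motion of the box can be modeled as a sequence of screen positions over successive frames. This is a bare sketch under the assumption of constant velocity; a real test would vary direction and speed between trials, as the description notes.

```python
def box_positions(start, velocity, frames):
    """Screen positions of the moving box over a run of frames,
    assuming constant per-frame velocity."""
    x, y = start
    vx, vy = velocity
    return [(x + vx * t, y + vy * t) for t in range(frames)]

# A box starting at the origin and drifting right and down for 5 frames.
path = box_positions((0, 0), (3, 1), 5)
```

After the subject has followed the path, the inner marker is revealed and the response is scored against its trait, exactly as in a static trial.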
The near/far focus component 216 is configured to test the subject's ability to shift visual focus from near to far and from far to near. By way of example, without limitation, this component may utilize two or more display devices as the presenting component 112, with a far display device located at or near optical infinity from the subject and a near display device located at a different distance from the subject. The subject is then required to identify visual markers presented on the two display devices, thereby testing the subject's ability to focus at varying distances.
Referring now to Fig. 4, an exemplary vision testing system 400 in accordance with an embodiment of the present invention is described. A subject 410 participating in testing may utilize the testing system 400 to test the visual ability of the subject 410 during the course of an activity. The test unit 412 includes a display device 414 and an input device 416. In this embodiment, the test unit 412 may receive input from the subject 410 via the input device 416, which is connected (e.g., by a wired or wireless connection) to the test unit 412. Those skilled in the art will appreciate that the subject 410 may provide input to the test unit 412 with any type of input device capable of providing an appropriate response, and that the subject 410 may interact with the input device 416 in any manner (e.g., physical interaction, voice recognition).
With a unitary test unit, such as the test unit 412, that can present several tests to the subject, an overall assessment of the subject's visual ability can be provided. Further, because the test unit 412 may include processing capability, it can process the data and determine a score and/or a training program for the subject.
Fig. 5 illustrates a vision testing system 500 in accordance with an embodiment of the present invention. In this embodiment, two display devices 518 and 520 are utilized to present the vision tests. Having two display devices allows both devices to be used to present visual markers to the subject. As discussed above, some visual tests (e.g., near/far focus) may work better if multiple display devices are utilized. The input device 516 shown in Fig. 5 is illustrated as a joystick; however, any type of input device is contemplated within the scope of the present invention.
Referring to Fig. 6, a flow chart 600 illustrates a method of testing a subject's visual ability. Although the terms "step" and "block" are used below to denote different elements of the method employed, these terms should not be interpreted as implying any particular order among or between the various steps disclosed herein, unless and except when the order of individual steps is explicitly stated. Initially, two or more visual ability tests are administered to the subject (e.g., utilizing the test unit 110 of Fig. 1). This is shown at block 610. Those skilled in the art will appreciate that any of the visual ability tests described above may be administered, as well as any other test that measures an individual's visual ability. While the individual tests administered by the test unit may occur in any order, the visual clarity test is typically administered first in order to provide baseline information about the subject's vision. As the tests are administered, the subject may provide appropriate responses by interacting with an input device, which is connected to the test unit through an input component. This is shown at block 620. A processing component (e.g., the processing component 118 of Fig. 1) may then process the received input, for example, by collecting data, determining a score, and the like. The data may be stored, for example, in the database 104, or sent by a delivery component to, for example, the center 106. This is shown at block 630.
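The first three blocks of Fig. 6 (administer each test, receive the response, process the input) can be sketched as a single loop. The test names, expected answers, and response model below are illustrative assumptions, not part of the patent.

```python
def run_test_battery(tests, respond):
    """Administer each test in order (block 610), collect the subject's
    response (block 620), and record whether it was correct (block 630)."""
    results = {}
    for name, expected in tests:
        results[name] = (respond(name) == expected)
    return results

# Visual clarity runs first to establish baseline information, as the
# description suggests; the lambda stands in for the subject's responses.
battery = [("visual_clarity", "up"), ("contrast", "left"), ("tracking", "down")]
outcome = run_test_battery(
    battery, lambda name: "up" if name == "visual_clarity" else "down"
)
```

In a real unit each `respond` call would block on the input component; here it is a pure function so the flow itself is visible.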
Alternatively, at block 640, the data derived from the input received from the subject for each administered test may be used to determine a score for the subject. An individual score may be determined for each test, and an overall score may be determined based on the data from all tests. The score may further be based on corresponding data for a particular population, against which the subject's score can be compared (e.g., the subject may be given a percentile for a given performance ability). At block 650, a training program may be developed for the subject, for example, based on the subject's determined scores and the input received in response to the visual ability tests, in order to train his or her visual ability.
The present invention has been described in relation to particular embodiments, which are intended in all respects to be illustrative rather than restrictive. Alternative embodiments will become apparent to those skilled in the art without departing from its scope.
From the foregoing, it will be seen that this invention is one well adapted to attain all the ends and objects set forth above, together with other advantages which are obvious and inherent to the system and method. It will be understood that certain features and subcombinations are of utility and may be employed without reference to other features and subcombinations. This is contemplated by and is within the scope of the claims.
Claims (21)
1. A device for testing a subject's visual ability, comprising:
a presenting component configured to present two or more visual ability tests, wherein the subject provides an input in response to each test; an input component configured to receive the input provided by the subject; and a processing component configured to process the received input.
2. The device according to claim 1, wherein the processing component comprises a scoring component that determines a score based on the received input.
3. The device according to claim 2, wherein the processing component further comprises a training development component that provides a training program based on the determined score.
4. The device according to claim 1, wherein one of the two or more visual ability tests comprises a visual clarity test.
5. The device according to claim 4, wherein the visual clarity test comprises presenting to the subject a visual marker of a particular size, and wherein the size increases until the visual marker is identified by the subject.
6. The device according to claim 5, wherein the visual marker is a Landolt C.
7. The device according to claim 1, wherein one of the two or more visual ability tests comprises a contrast sensitivity test.
8. The device according to claim 7, wherein the contrast sensitivity test comprises presenting to the subject at least one visual marker having one or more particular luminances.
9. The device according to claim 1, wherein one of the two or more visual ability tests comprises a distance focusing test.
10. The device according to claim 9, wherein the distance focusing test comprises presenting a first visual marker at a distance near the subject and presenting a second visual marker at a distance far from the subject.
11. A method of testing a subject's visual ability, wherein the method occurs at a single location, the method comprising:
administering two or more visual ability tests to the subject; receiving input from the subject in response to each test; and processing the received input.
12. The method according to claim 11, further comprising determining a score based on the received input.
13. The method according to claim 12, further comprising providing a training program based on the determined score.
14. The method according to claim 11, wherein one of the two or more visual ability tests comprises a visual clarity test.
15. The method according to claim 14, wherein the visual clarity test comprises presenting to the subject a visual marker of a particular size, and wherein the size increases until the visual marker is identified by the subject.
16. The method according to claim 15, wherein the visual marker is a Landolt C.
17. The method according to claim 11, wherein one of the two or more visual ability tests comprises a contrast sensitivity test.
18. The method according to claim 17, wherein the contrast sensitivity test comprises presenting to the subject at least one visual marker having one or more particular luminances.
19. The method according to claim 11, wherein one of the two or more visual ability tests comprises a distance focusing test.
20. The method according to claim 19, wherein the distance focusing test comprises presenting a first visual marker at a distance near the subject and presenting a second visual marker at a distance far from the subject.
21. A device for testing a subject's visual ability, comprising:
a presenting component configured to present two or more visual ability tests, wherein one of the two or more visual ability tests comprises a distance focusing test, and wherein the subject provides an input in response to each test; an input component configured to receive the input provided by the subject; and a processing component configured to process the received input.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US92343407P | 2007-04-13 | 2007-04-13 | |
US60/923,434 | 2007-04-13 | ||
US94191507P | 2007-06-04 | 2007-06-04 | |
US60/941,915 | 2007-06-04 | ||
PCT/US2008/060229 WO2008128178A1 (en) | 2007-04-13 | 2008-04-14 | Unitary vision testing center |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101657145A true CN101657145A (en) | 2010-02-24 |
CN101657145B CN101657145B (en) | 2012-01-25 |
Family
ID=41711088
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2008800119314A Expired - Fee Related CN101657145B (en) | 2007-04-13 | 2008-04-14 | Unitary vision testing center |
CN200880011961.5A Expired - Fee Related CN101657846B (en) | 2007-04-13 | 2008-04-14 | The method and system of visual cognition and coordination testing and training |
CN200880011916XA Expired - Fee Related CN101657144B (en) | 2007-04-13 | 2008-04-14 | Unitary vision and neuro-processing testing center |
CN2008800118947A Expired - Fee Related CN101657143B (en) | 2007-04-13 | 2008-04-14 | Unitary vision and neuro-processing testing center |
CN200880011994XA Expired - Fee Related CN101657146B (en) | 2007-04-13 | 2008-04-14 | Syetems and methods for testing and/or training near and farvisual abilities |
Family Applications After (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200880011961.5A Expired - Fee Related CN101657846B (en) | 2007-04-13 | 2008-04-14 | The method and system of visual cognition and coordination testing and training |
CN200880011916XA Expired - Fee Related CN101657144B (en) | 2007-04-13 | 2008-04-14 | Unitary vision and neuro-processing testing center |
CN2008800118947A Expired - Fee Related CN101657143B (en) | 2007-04-13 | 2008-04-14 | Unitary vision and neuro-processing testing center |
CN200880011994XA Expired - Fee Related CN101657146B (en) | 2007-04-13 | 2008-04-14 | Syetems and methods for testing and/or training near and farvisual abilities |
Country Status (1)
Country | Link |
---|---|
CN (5) | CN101657145B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104887467A (en) * | 2015-06-03 | 2015-09-09 | 侯跃双 | Child vision correction recovery instrument |
CN109998491A (en) * | 2019-04-25 | 2019-07-12 | 淮南师范学院 | A kind of glasses and method of test depth perceptibility |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3041412A1 (en) * | 2013-09-02 | 2016-07-13 | Ocuspecto OY | Testing and determining a threshold value |
CN105407800B (en) * | 2013-09-11 | 2019-04-26 | 麦克赛尔株式会社 | Brain disorder evaluating apparatus and storage medium |
EP3057508B1 (en) * | 2013-10-17 | 2020-11-04 | Children's Healthcare Of Atlanta, Inc. | Methods for assessing infant and child development via eye tracking |
FR3014673A1 (en) * | 2013-12-17 | 2015-06-19 | Essilor Int | APPARATUS AND METHOD FOR DETECTING VIEW FAULTS AND VISUAL ACUTE MEASUREMENT OF A USER |
CN104970763A (en) * | 2014-04-09 | 2015-10-14 | 冯保平 | Full-automatic vision detecting training instrument |
CN104382560A (en) * | 2014-12-08 | 2015-03-04 | 丹阳市司徒镇合玉健身器械厂 | Ataxia detector |
CN104586403A (en) * | 2015-01-21 | 2015-05-06 | 陕西省人民医院 | Finger movement mode monitoring and analysis device and use method thereof |
CN104851326A (en) * | 2015-05-18 | 2015-08-19 | 吉首大学 | Ideological and political work demonstration and teaching instrument |
CN105496347B (en) * | 2016-01-12 | 2017-06-06 | 哈尔滨学院 | Visual-depth electronic measuring device |
ES2702484T3 (en) | 2016-01-15 | 2019-03-01 | Centre Nat Rech Scient | Device and procedure for determining eye movements by touch interface |
US20190076077A1 (en) * | 2016-03-31 | 2019-03-14 | Koninklijke Philips N.V. | Device and system for detecting muscle seizure of a subject |
CN106726388B (en) * | 2017-01-04 | 2019-02-05 | 深圳市眼科医院 | Training device for extraocular-muscle neurofeedback and control method thereof |
CN107736889B (en) * | 2017-09-08 | 2021-01-08 | 燕山大学 | Detection method of human body coordination detection device |
CN109727508B (en) * | 2018-12-11 | 2021-11-23 | 中山大学中山眼科中心 | Visual training method for improving visual ability based on dynamic brain fitness |
CN109744994A (en) * | 2019-03-12 | 2019-05-14 | 西安爱特眼动信息科技有限公司 | Perimetry device based on multi-head display |
CN114929105A (en) * | 2019-10-30 | 2022-08-19 | 朱拉隆功大学 | Stimulation system for cooperative brain and body function |
CN113018124A (en) * | 2021-03-02 | 2021-06-25 | 常州市第一人民医院 | Rehabilitation device for unilateral neglect of patient |
CN115969677B (en) * | 2022-12-26 | 2023-12-08 | 广州视景医疗软件有限公司 | Eyeball movement training device |
CN116115981A (en) * | 2023-02-09 | 2023-05-16 | 湖南理工学院 | Table tennis player service action recognition training instrument |
CN116172560B (en) * | 2023-04-20 | 2023-08-29 | 浙江强脑科技有限公司 | Reaction speed evaluation method for reaction force training, terminal equipment and storage medium |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4528989A (en) * | 1982-10-29 | 1985-07-16 | Weinblatt Lee S | Screening method for monitoring physiological variables |
US4618231A (en) * | 1984-02-22 | 1986-10-21 | The United States Of America As Represented By The Secretary Of The Air Force | Accommodative amplitude and speed measuring instrument |
US5088810A (en) * | 1989-01-23 | 1992-02-18 | Galanter Stephen M | Vision training method and apparatus |
CN1077873A (en) * | 1992-04-22 | 1993-11-03 | 四川大学 | Computerized comprehensive test system for visual sense |
US5825460A (en) * | 1994-04-30 | 1998-10-20 | Canon Kabushiki Kaisha | Visual function measuring apparatus |
US5812239A (en) * | 1996-10-22 | 1998-09-22 | Eger; Jeffrey J. | Method of and arrangement for the enhancement of vision and/or hand-eye coordination |
US6092058A (en) * | 1998-01-08 | 2000-07-18 | The United States Of America As Represented By The Secretary Of The Army | Automatic aiding of human cognitive functions with computerized displays |
US6066105A (en) * | 1998-04-15 | 2000-05-23 | Guillen; Diego | Reflex tester and method for measurement |
US6364845B1 (en) * | 1998-09-17 | 2002-04-02 | University Of Rochester | Methods for diagnosing visuospatial disorientation or assessing visuospatial orientation capacity |
US6454412B1 (en) * | 2000-05-31 | 2002-09-24 | Prio Corporation | Display screen and vision tester apparatus |
US6632174B1 (en) * | 2000-07-06 | 2003-10-14 | Cognifit Ltd (Naiot) | Method and apparatus for testing and training cognitive ability |
-
2008
- 2008-04-14 CN CN2008800119314A patent/CN101657145B/en not_active Expired - Fee Related
- 2008-04-14 CN CN200880011961.5A patent/CN101657846B/en not_active Expired - Fee Related
- 2008-04-14 CN CN200880011916XA patent/CN101657144B/en not_active Expired - Fee Related
- 2008-04-14 CN CN2008800118947A patent/CN101657143B/en not_active Expired - Fee Related
- 2008-04-14 CN CN200880011994XA patent/CN101657146B/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
CN101657144B (en) | 2012-05-30 |
CN101657143B (en) | 2012-05-30 |
CN101657146A (en) | 2010-02-24 |
CN101657145B (en) | 2012-01-25 |
CN101657846B (en) | 2016-03-09 |
CN101657144A (en) | 2010-02-24 |
CN101657146B (en) | 2012-01-18 |
CN101657846A (en) | 2010-02-24 |
CN101657143A (en) | 2010-02-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101657145B (en) | Unitary vision testing center | |
CA2683808C (en) | Unitary vision testing center | |
CA2683723C (en) | Unitary vision and coordination testing center | |
CN102481094B (en) | System and method for testing/training visual perception speed and/or span | |
Chandra et al. | Eye tracking based human computer interaction: Applications and their uses | |
Naceri et al. | Depth perception within peripersonal space using head-mounted display | |
US8317324B2 (en) | Unitary vision and neuro-processing testing center | |
EP2525880A1 (en) | Apparatus and method for detecting and classifying so as to be able to evaluate and self-evaluate exercises | |
CN109171638A (en) | The method of eyesight detection, wears display equipment and vision inspection system at terminal | |
Hurter et al. | Commercial virtual reality displays: Issues of performance and simulator sickness from exocentric depth-perception tasks | |
Zhao | Enhancing undergraduate research experience with cutting edge technologies | |
Parit et al. | Eye tracking based human computer interaction | |
Piantanida et al. | Audition and Vision in Virtual Environments. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
ASS | Succession or assignment of patent right |
Owner name: NIKE INNOVATION LIMITED PARTNERSHIP

Free format text: FORMER OWNER: NIKE INTERNATIONAL LTD.

Effective date: 20141117 |
|
C41 | Transfer of patent application or patent right or utility model | ||
TR01 | Transfer of patent right |
Effective date of registration: 20141117

Address after: Oregon

Patentee after: NIKE INNOVATE C.V.

Address before: Oregon

Patentee before: Nike International Ltd. |
|
CF01 | Termination of patent right due to non-payment of annual fee | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20120125 |