EP3104764A1 - Method and apparatus for vision training - Google Patents
- Publication number
- EP3104764A1 (application number EP14881439.5A)
- Authority
- EP
- European Patent Office
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A61H5/00: Exercisers for the eyes (A: Human necessities; A61: Medical or veterinary science; hygiene; A61H: Physical therapy apparatus)
- A61H2201/5048: Audio interfaces, e.g. voice or music controlled (under A61H2201/00 Characteristics of apparatus; A61H2201/50 Control means thereof; A61H2201/5023 Interfaces to the user)
- A61H23/02: Percussion or vibration massage, e.g. using supersonic vibration, with electric or magnetic drive (under A61H23/00)
Definitions
- the present invention relates to an apparatus and methodology to retrain visual function in patients who have sustained damage to areas of visual processing in the brain.
- This phenomenon of unconscious visual processing, called "blindsight," has been investigated in both humans and animals.
- Human subjects were generally stroke or accident victims who lost all or a substantial portion of their visual fields.
- the animals had been surgically altered to eliminate all cortex associated with conscious vision.
- Huxlin U.S. Patent No. 7,549,743 created a vision training device with the following features: 1. Use of moving stimuli, which are believed to be more effective than stationary lights in stimulating the cortical and sub-cortical cells of the visual system. Huxlin employs random dot kinematograms of which some proportion (from 0 - 100%) of the small dots move in the same direction.
- auditory feedback is provided to indicate a correct keyboard response.
- the target is a contrast modulated sinusoidal grating.
- the data input device includes an eye tracker. According to Huxlin et al., when patients attend to visual stimuli in a stationary environment, they show improved motion awareness in the blind hemifield.
- test stimulus is briefly presented (for approximately 500 milliseconds) and the patient either correctly responds to it or fails to respond. Moments later a new target with different parameters (location or motion) ensues.
- a training session involves hundreds of trials.
- the patient indicates target detection with a button press.
- In Sabel, the patient's response speed is fed back to the software as an indirect measure of visual function, e.g., those test areas corresponding to an absent or delayed response are assumed to represent either blind or visually degraded field. Performance feedback is not implemented; Sabel assumes that the mere act of focusing attention upon the blind field is therapeutic.
- In Huxlin, one of four keyboard buttons must be pressed to indicate the perceived direction of target motion. This assumes the process of conscious motion discrimination to be the therapeutic element. In some embodiments of Huxlin, an auditory signal serves as feedback to indicate that the correct "motion direction" key was pressed.
- the present approach uses multimodal stimuli (such as sound and vibration) to accompany each onset of the stimulus, as well as biofeedback principles to train conscious perception.
- the present advance in the art is also based on the realization that any device or method which does not provide a "dark-ON" stimulus, does not fully train visual function.
- Targets employed with the present approach have spatial characteristics to stimulate both light and dark detectors.
- the present approach does not involve the mapping of transitional zones or selecting only a portion of the blind field to train. This is because clinical testing has shown the blind field to be non-uniform, with areas of relative sensitivity interspersed with those of deep blindness; a finding that could not be predicted from perimetrically evaluated fields.
- the outcome of visual training using the present invention shows a widening of the entire field (including the sighted hemifield) even when visual targets are randomly presented anywhere within the blind field (and despite the fact that the sighted field is not specifically stimulated), as more fully described below.
- the present approach does not confine training to a single plane. Instead, placement of the fixation point is independent of the display screen and can be varied along the x, y, z dimensions, with the only requirement being that it is placed so that the training device falls into the perimetrically blind field. In rare cases of complete cortical blindness, the patient is positioned to face the display monitor without regard to a specific fixation point. In the present approach, large unauthorized departures from fixation (by more than 2 degrees of visual angle from the fixation point) are interpreted as "cheating" (e.g., seeking the target by using the intact (sighted) field).
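The 2-degree fixation criterion above can be checked from eye-tracker output. A minimal sketch in Python (not part of the patent; the function names and the millimetre-based inputs are assumptions):

```python
import math

def gaze_deviation_deg(offset_mm: float, viewing_distance_mm: float) -> float:
    """Convert a lateral gaze offset on the display plane to degrees of visual angle."""
    return math.degrees(math.atan2(abs(offset_mm), viewing_distance_mm))

def is_fixation_break(offset_mm: float, viewing_distance_mm: float,
                      threshold_deg: float = 2.0) -> bool:
    """Flag departures from fixation larger than the training criterion (2 degrees),
    which the procedure interprets as seeking the target with the sighted field."""
    return gaze_deviation_deg(offset_mm, viewing_distance_mm) > threshold_deg
```

At a typical 570 mm viewing distance, 2 degrees of visual angle corresponds to roughly 20 mm on the display plane.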
- the same (temporally changing) target is repeatedly cycled for a flexible but relatively long duration (generally determined entirely by the patient).
- a new trial begins only when the patient initiates it with a key press.
- an easily detected target might be viewed for a few seconds before the next trial is initiated.
- a target which is not detected will be displayed for as long as the patient wishes. It has been determined that new patients need upwards of five (and frequently twenty) minutes with a single target in order to understand/recognize it. Thus, an hour's session may involve working with only a few targets for very long durations.
- presentation of the visual stimulus is always accompanied ("shadowed") by a stimulus of another modality which exactly mimics the temporal characteristics of the target. For example, if the visual target has a frequency of 0.5 Hz, then the companion ("shadow") click or vibration occurs in synchrony with this visual target.
- the purpose of this non-visual accompaniment is to aid the patient in knowing "what he is seeking".
- this non-visual input will provide an additional and reliable source of excitation for these weakly responding visual cells.
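The "shadowing" described above amounts to driving the non-visual channel from the same timing source as the visual flicker. A minimal sketch, assuming only that shared onset times are derived from the target's temporal frequency (names hypothetical, not the patent's software):

```python
def shadow_schedule(freq_hz: float, n_cycles: int) -> list:
    """Onset times (seconds) shared by the visual target and its auditory or
    vibratory 'shadow', so both modalities stay in exact synchrony."""
    period = 1.0 / freq_hz
    return [i * period for i in range(n_cycles)]

# For the 0.5 Hz example in the text: onsets at 0, 2, 4 ... seconds.
# The same list would drive both the display phase reversal and the
# companion click or vibration pulse.
onsets = shadow_schedule(0.5, 3)
```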
- the subject is enabled to "hear/feel" the accuracy of his visuo-motor estimates of target location to help isolate and identify the visual neural responses specific to the target.
- feedback indicates the accuracy of his motor search for the target by increasing its temporal frequency as his hand nears the target and decreasing as he goes off course.
- Correct hand/stylus placement is associated with the maximal and very rapid frequency of audible sound/vibration.
- FIG. 1 schematically illustrates an embodiment for a retraining system for patients with post-retinal damage to the visual system.
- FIG. 2A illustrates the patient seated at the training apparatus.
- FIG. 2B shows three possible target choices and how each pair appears in its phase-reversed configuration (T1 and T2).
- FIG. 2C demonstrates a timeline for target presentation.
- FIG. 2D demonstrates one possible embodiment for determining feedback frequency by associating the target area with concentric distance/reward zones.
- FIG. 3 illustrates a sample menu for patient trials, as well as for some research options.
- FIG. 4 represents the procedure for a single trial.
- FIG. 5 is a flow chart which demonstrates a sample training sequence.
- FIG. 6A and FIG. 6C illustrate empirical data for a first subject (S1), collected during two sessions (one at baseline and another after approximately one month of training).
- FIG. 6B and FIG. 6D illustrate empirical data for a second subject (S2), collected during two sessions, (one at baseline and another, after approximately one month of training).
- FIGS. 7A, 7B, 7C and 7D illustrate changes in visual field for one patient, from baseline to various time points during training (as independently assessed by the Humphrey Perimeter).
- FIGS. 8A and 8B illustrate changes in visual field for a second patient from baseline to two months into training (as independently assessed by the Humphrey Perimeter).
- "Subject" and "patient" are both used herein to refer to an individual using the retraining system and method disclosed herein.
- the preferred embodiment for retraining the visual system is comprised of a conventional computer 10 including a CPU (Central Processing Unit) and having a hard drive containing one or more computer programs in a format executable by the CPU.
- the CPU containing the software can be connected via internet to the training device.
- Other programmable devices which can be used include a game box, or virtual reality device.
- the computer or other programmable device is connected to the following peripheral devices.
- a computer monitor 20 (or any visual display capable of displaying a light or image specified by the programs), for example a CRT, LCD, array of LEDs, OLED, virtual reality goggles and the like is connected to computer 10.
- Touch device 30 represents an interface for detecting a patient's hand position, for example a touch screen overlay (such as is available from Keytec Inc., TX, USA), a light pen or photocell (such as is available from Interactive Computer Products, Inc., CA, USA), or a virtual reality glove (also known as a data glove or cyber glove).
- a keyboard 40 or any equivalent input device known to the art
- a stylus 50 is held during the search task assigned to the patient and is capable of communicating hand/target position to the computer 10 and/or providing vibrational feedback to the patient.
- the stylus can be a handheld photocell which responds with increased voltage to increased target proximity. If the monitor 20 is a CRT, the stylus can be a lightpen (such as that made by Interactive Computer Products, Inc).
- An embodiment which delivers vibrational feedback requires the conversion of a computer generated algorithm into an electrical pulse pattern.
- Communication between the computer software and an external vibrator can be accomplished by any interface known in the art for this purpose, for example, the programmable device produced by Phidgets; (SSR Relay Board (Item# 3052) and the Phidget Interface Kit (Item# 1018)).
- a commercially available mouse-glove may also be modified for this purpose.
- Standard audio speakers 60 are connected to computer 10. Sound intensity can be adjusted to a level which is comfortable to the patient.
- An eye movement detector 70 can be any device known in the art, capable of detecting gross eye movements; such detector 70 is commercially available from ISCAN Inc. (Burlington, Mass.). Information regarding eye position is fed back to the software residing on computer 10 to activate instructional voice clips.
- the eye tracking device is mounted above a fixation point, as more fully described below.
- the eye tracking device can be worn by the patient.
- a fixation point generator, such as a light 80, which can be, for example, a 3 volt red LED activated by a lithium battery, is positioned near the borderline of the subject's blind/sighted field.
- This light (whether freestanding or attached to the computer by sliding/adjustable hinges) can be positioned anywhere in x, y, z space, enabling training to occur at any depth or portion of the visual field. Except when the embodiment involves virtual reality, the fixation point 80 is the only device in FIG. 1 which does not connect to the computer 10.
- a competing stimulus device 90 such as a light, is positioned in the sighted field and has temporal characteristics that are synchronized to the target displayed in the blind field.
- the competing light 90 can be an LED or visual image capable of rapid recycling at the same rate as the target.
- the competing stimulus device 90 displayed in FIG. 1 is an LED encased in a gooseneck lamp frame. Initiation of the voltage output which activates this competing light is determined by the software, in accordance with a pulse supplied by a USB port of computer 10. To meet LED voltage requirements, which can be greater than the 5V USB output, a battery pack may be inserted into the circuit between the USB port and the LED lamp. Software instructions to control the USB output are channeled through the already mentioned Phidgets interface system (FIG. 1, numeral 50), although it will be recognized by those familiar with the art that other means of generating an output pulse (for example, through an RS-232 port of computer 10) are possible. In the embodiment of virtual reality, the competing light may be programmed by the software and presented as a virtual image in the sighted field.
- a hand held control 100 can regulate characteristics of the light of the competing device 90, and can comprise:
- a rheostat to adjust voltage input to the light of the competing device 90, in order to raise or lower its luminance. The control may include the following attachments (not shown):
- this procedure can be adapted to a virtual reality device in which the target and fixation points are projected into virtual space and the patient's limb position is monitored with a virtual reality glove.
- Virtual reality would allow for the creation of three dimensional targets and fixation points of different depths.
- the training procedure can be adapted to goggles sensitive to eye position, where correct target localization results in auditory feedback.
- FIG. 2A shows a patient with left sided blindness seated at the training apparatus. He is facing the fixation point and eye monitor. For a patient with right sided blindness, a mirror image arrangement would be used.
- FIG. 2B shows three of the many possible target choices (a circle, or two sizes of checkerboards) and how each pair appears in its two phase-reversed configurations (at times T1 and T2).
- FIG. 2C illustrates a timeline for target display during which the two phases of target configurations (T1 and T2) alternate in time.
- T1 and T2 targets spatially overlap, but they may also be placed in near proximity to give the illusion of movement.
- multiple targets may be displayed at the same time, or in close succession so as to mimic motion.
- the T1 and T2 combinations can vary in size and spatial location, so that during the course of a trial, the smallest size travels a short distance (while simultaneously expanding) into the largest size, and then "explodes" (with corresponding sound effects indicating motion and a "pop").
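The T1/T2 phase reversal described above can be generated by inverting a checkerboard pattern on each half-cycle. A small illustrative sketch (not the patent's software; cell values 0/1 stand in for dark and light):

```python
def checkerboard(n: int, phase: int = 0) -> list:
    """Return an n x n checkerboard of 0/1 cells.
    phase=0 gives the T1 frame; phase=1 gives the contrast-reversed T2 frame."""
    return [[(row + col + phase) % 2 for col in range(n)] for row in range(n)]

t1 = checkerboard(4, phase=0)
t2 = checkerboard(4, phase=1)

# Every cell flips between the two phases; this temporal modulation is what
# the synchronized click or vibration "shadows".
assert all(a != b for r1, r2 in zip(t1, t2) for a, b in zip(r1, r2))
```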
- target is intended to include any type of temporally changing visual stimulus which can be associated with additional non-visual sensory information.
- the spatial configurations of the target can include all those to which the normal visual system is responsive, including those typically used in vision research, such as sinusoidal gratings, checkerboards, spirals, etc.
- a brief click is played to mimic the temporal frequency of the visual information.
- a tactile pulse can be synchronized to the visual display frequency.
- FIG. 2D shows one embodiment for search feedback. All targets are associated with concentric distance-related "zones.” When the patient's hand touches the zone directly over the target, he is rewarded with a rapidly recycling sound/vibration (which continues as long as his hand is in contact with the screen). Sound feedback is probably sufficient for patients with normal hearing. Vibrational feedback (conveyed via the stylus) is necessary for deaf patients. In some embodiments, both types of feedback can be used simultaneously. It remains to be clinically determined whether the combination of sound and touch feedback is superior to unimodal reinforcement.
- the feedback frequency of the sound decreases.
- Prerecorded sound clips are associated with each feedback zone.
- the precise distance of the hand to the target can be calculated, for example, by using coordinate data of the guessed position and the actual position of the target, converted by a mathematical algorithm into a pulse frequency, which then activates an external sound generating semiconductor chip and associated circuitry (not shown).
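One plausible realization of the distance-to-frequency conversion described above. The linear fall-off and the numeric bounds are assumptions for illustration; the patent does not specify the mapping:

```python
import math

def feedback_freq_hz(hand_xy, target_xy, max_freq=20.0, min_freq=1.0,
                     max_dist=400.0) -> float:
    """Map hand-to-target distance (screen units) to a feedback pulse frequency:
    maximal, rapid recycling directly over the target, slowing with distance."""
    d = math.dist(hand_xy, target_xy)
    if d >= max_dist:
        return min_freq
    return min_freq + (max_freq - min_freq) * (1.0 - d / max_dist)
```

The returned frequency would then drive the sound clip or vibrator pulse train, so the patient hears or feels his search accuracy continuously.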
- the present approach is intended to include all ways known to those familiar with the art, in which the feedback information can be made to vary according to target position guessed by the patient. With an appropriate command such as a stylus tap, the patient can turn the feedback off or on.
- Because reinforcement zones outline the target area, it is possible for the patient to use this multimodal feedback to locate and learn (with his auditory and motor systems) the spatial details (shape/size/spatial envelope of motion) of a visual target which he cannot see.
- all reward zones (with the exception of the one containing the target) can be deactivated, to aid in the recognition of target boundaries.
- FIG. 3A depicts menu options for the target stimuli.
- the target parameters include the targets (T1 and T2) already described in FIG. 2B, and various options for size, color and temporal frequency.
- FIG. 3B allows for choice of screen color and contrast with respect to the target.
- FIG. 3C demonstrates one training protocol for a subject's first experience with the procedure.
- the option for "custom" parameters allows the user to select his own spatial and temporal parameters and also to upload his own visual stimuli. This option is desirable for those conducting research in blindsight and consciousness.
- FIG. 4 describes a trial format for a subject. While he fixates ahead (410), a clicking target is presented (420) at a random location within his blind field. At 430 he is encouraged to place his hand or the stylus upon the target and to be guided by feedback at 440. Active motor involvement not only maximizes the contribution of unconscious visual-motor pathways to learning, it is more effective than passive activity (i.e., verbal report) in establishing a visual-spatial map (Hein et al., 1970).
- the subject is encouraged to concentrate upon the target and try to determine why this location is correct.
- he may be told to look directly at the target with his sighted field and then return his gaze to fixation.
- he is encouraged to manually explore the region around the target and to observe the change in feedback as he deviates from the correct location. The patient may develop his own strategy for "understanding" the location of the target.
- the search can be repeated with a competing light in the sighted hemifield, adjustments may be made in the intensity of the competing light and it may be turned on and off by the patient.
- FIG. 5 shows the format of a training sequence for new and more experienced patients.
- Patient data is inputted (step 510). New patients typically begin training with the largest, brightest target presented on a black background (step 520). After several sessions, levels of difficulty may be increased (steps 530 and 540).
- a first trial is initiated at step 550, during which the patient searches for a desired time, (step 555). At any time during this search, he has the option of using a competing light at step 558, as described below. Or, by hitting the keyboard 40 (Fig 1), the subject may initiate a new trial (step 560), in which the same target is displayed in a different location. The same sequence of steps is repeated at 565 and 568. This procedure is iterated as many times as the subject desires.
- a last trial is conducted at step 570. At the conclusion of the session, the search data is printed and stored, at step 580.
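The session flow of FIG. 5 (steps 510-580) can be summarized as a simple loop. The callback names here are placeholders for the apparatus I/O, not the patent's software:

```python
def run_session(trials, present_target, wait_for_key, log):
    """Sketch of the FIG. 5 sequence: each trial displays the same target at a
    new random location (step 550) and lasts until the patient presses a key
    to advance (steps 555-560); search data is stored per trial (step 580)."""
    for trial_no in range(1, trials + 1):
        location = present_target()       # target placed at a random blind-field spot
        wait_for_key()                    # patient searches for as long as desired
        log.append((trial_no, location))  # record for printing/storage at session end
    return log

# Example with stubbed I/O callbacks:
record = run_session(3, lambda: (0, 0), lambda: None, [])
```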
- Levels of the trials include but are not limited to:
  a. Smaller target sizes.
  b. Dimmer targets.
  c. Counterphasing checkerboard targets of varying spatial frequency.
  d. Lower target/background contrast ratios.
  e. Increasing the number of simultaneous targets; the patient is required not only to locate them but to bisect the space between them.
  f. Presentation of large dark targets (flickering or jiggling) in a small area (on white or grey screens).
  g. Competing illumination (of increasing intensity) from the good field.
- The use of a competing light in training is based upon the assumption that the blindness experienced by brain-injured patients results from an active suppression generated by the intact brain upon the weak/damaged areas (Richards, 1973). The greater the stimulation of the good brain (e.g., the brighter the room illumination), the more substantial its blinding suppressive effect upon the weaker brain will be (Harrington, 1970). The present technique seeks to regulate this inhibition through the following requirements:
- This competing stimulus may be (but is not limited to) a light that flickers in synchrony with the test target.
- the patient can: a. control its size, color, and pattern information (by using masks, filters and transparent overlays, respectively) and spatial position (by moving it closer or further); b. regulate its luminance and/or turn it on and off at will using the control 100 (FIG. 1).
- Referring to FIG. 6A and FIG. 6C for a first patient, and to FIG. 6B and FIG. 6D for a second patient, the typical change in search accuracy is shown from the baseline condition to that noted after one month (approximately 10 hrs) of training.
- Each drawing documents all search paths made for several targets in a single sixty to ninety minute session.
- the search path for each target was created in color as the hand moved across the screen (each target having its own associated color path to differentiate it from the search paths for other targets shown in that session).
- FIGS. 7A, 7B, 7C and 7D, as well as FIGS. 8A and 8B show the change in visual field for two patients as demonstrated by the Humphrey perimeter.
- This device presents extremely brief target lights onto a dimly lit background, making it different from (and far more difficult than) the training paradigm, in which the target is large and presented on a dark background for a long duration.
- The four fields presented herein were obtained at baseline (FIG. 7A), after five weeks of training (FIG. 7B), and at the last session after five months of training (FIG. 7C).
- One year after training, a follow-up field was taken (FIG. 7D). Not only was the improvement preserved, but the patient had returned to work doing surgical consulting, which included reading x-rays.
- FIG. 8 demonstrates the visual fields of a second patient, a seventy-seven-year-old man with hemi-blindness due to occipital stroke.
- His CAT scan showed (1) a low density area in the left occipital lobe with effacement of the sulci and (2) obliteration of the left occipital horn. He was first seen fifteen months post-traumatically. His baseline evaluation showed total absence of vision in the right field. After two months of training (seventeen sessions), his functional field crossed the midline, enabling him to read and to see his entire face in the mirror. For all patients, the portion of the visual field whose increase can be documented with the Humphrey Perimeter shows color and form which appear subjectively normal.
- The data that is saved, and used in a manner different from that of the prior art, includes:
- Session parameters: name, date, target size, etc.
- The software makes it possible for researchers and clinicians to obtain measures of the time required to locate a target (at a given level of difficulty).
- Search time decreases as proficiency improves.
- However, this information is less meaningful than the search path, since the target can occasionally be located by accident, without sight.
- The patient may also delay the immediate search and instead simply contemplate possible target locations, without touching the screen, until a certain measure of certainty develops.
- Trial duration is automatically recorded. In general, less time is spent exploring targets in locations of greater sensitivity. However, a trial could also be rejected if the target is randomly placed in a location very similar to that of an earlier trial in the same session. Thus, this information may be less valuable.
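The saved per-trial data just listed can be sketched as a small record structure. Everything here is an illustrative assumption, not the patent's code: the class and field names, and the 50-pixel spacing threshold used to flag a target placed in a "very similar location" to an earlier trial.

```python
# Hypothetical sketch of per-trial bookkeeping: session parameters,
# automatically recorded search time, and rejection of placements too
# close to an earlier target in the same session.
import math

MIN_TARGET_SPACING = 50  # pixels; assumed threshold for "very similar location"

class Session:
    def __init__(self, name, date, target_size):
        # Session parameters saved with every trial (name, date, target size, etc.).
        self.params = {"name": name, "date": date, "target_size": target_size}
        self.trials = []

    def placement_ok(self, x, y):
        """Reject a placement too close to any earlier target in this session."""
        return all(math.hypot(x - tx, y - ty) >= MIN_TARGET_SPACING
                   for (tx, ty) in (t["target"] for t in self.trials))

    def record_trial(self, x, y, start, end):
        """Store one accepted trial; search time is recorded automatically."""
        self.trials.append({"target": (x, y), "search_time": end - start})

s = Session("patient-A", "2014-02-10", 40)
assert s.placement_ok(100, 100)
s.record_trial(100, 100, start=0.0, end=12.5)
assert not s.placement_ok(110, 105)   # within 50 px of an earlier target
```

A falling trend in the stored `search_time` values across sessions would then reflect the improving proficiency described above.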
- The blind field is not uniformly blind. Even if the entire visual field is trainable, some areas will show improvement before others. After only a few hours of training, the patient may report a first "intuition" that "something is there," but he is reluctant to label this experience as visual. This intuition is eventually replaced by a halo which emanates from "somewhere" in the blind field but has no identifiable source. When he locates the target by sound, it may suddenly appear brighter, but it is still non-localized.
- Later, the brightness will seem more concentrated and may assume a location in space, either in its true position or appearing closer to the patient than it really is.
- The association of sound and sight is crucial. When the patient withdraws his hand from the screen, the experience of the target lessens.
- A typical behavior of a patient who has learned to see with the sound feedback is to concentrate upon the target, occasionally refreshing his image by placing his hand upon it for the sound reinforcement. Later behaviors include placing the hand above the target (without activating the sound) and confirming accuracy by looking with the intact field.
- Fluctuation of the visual experience is extremely common.
- The same target which is mastered at one time during a session may have to be retrained later in that session. This is particularly true after a very difficult condition is introduced, for example, if room illumination is raised. Under this circumstance, an "easy target" may suddenly become invisible for several minutes, even if complete darkness is restored. (This is suggestive of a long-standing inhibitory effect.)
- The general trend, however, is toward improvement over sessions.
- Stray light which enters the good field is of little value in pinpointing the target location. The naive subject will report that he sees nothing and that he cannot locate the target except by sound. In cases where stray light is detected, the patient commonly begins his search along the border of his sighted field, annoyed by the absence of feedback. For targets far from the midline, stray light frequently goes unnoticed; a patient may sit beside a brilliantly flashing target asking, "Tell me when we're ready to start."
Landscapes
- Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Epidemiology (AREA)
- Pain & Pain Management (AREA)
- Physical Education & Sports Medicine (AREA)
- Rehabilitation Therapy (AREA)
- Life Sciences & Earth Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Rehabilitation Tools (AREA)
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2014/015523 WO2015119630A1 (fr) | 2014-02-10 | 2014-02-10 | Procédé et appareil d'entraînement de la vision |
Publications (3)
Publication Number | Publication Date |
---|---|
EP3104764A1 true EP3104764A1 (fr) | 2016-12-21 |
EP3104764A4 EP3104764A4 (fr) | 2017-03-22 |
EP3104764B1 EP3104764B1 (fr) | 2019-01-09 |
Family
ID=53778308
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP14881439.5A Active EP3104764B1 (fr) | 2014-02-10 | 2014-02-10 | Procédé et appareil d'entraînement de la vision |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3104764B1 (fr) |
WO (1) | WO2015119630A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107661198A (zh) * | 2017-08-29 | 2018-02-06 | 中山市爱明天使视光科技有限公司 | 基于恢复及提升视力的眼睫状肌体训练的模拟场景互动系统 |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11301993B2 (en) | 2017-02-17 | 2022-04-12 | The Schepens Eye Research Institute, Inc. | Treatment of ocular disorders using a content guide for viewing images |
GB201711916D0 (en) * | 2017-07-24 | 2017-09-06 | Moon Hub | Virtual reality training system and method |
US11790804B2 (en) | 2018-09-14 | 2023-10-17 | De Oro Devices, Inc. | Cueing device and method for treating walking disorders |
CN113439682A (zh) * | 2021-06-07 | 2021-09-28 | 中国人民解放军军事科学院军事医学研究院 | 用于灵长类动物三维运动目标拦截的训练装置及方法 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4786058A (en) * | 1987-06-22 | 1988-11-22 | Baughman James S | Electric target and display |
US7819818B2 (en) * | 2004-02-11 | 2010-10-26 | Jamshid Ghajar | Cognition and motor timing diagnosis using smooth eye pursuit analysis |
AU2005278771B2 (en) | 2004-09-03 | 2010-03-25 | Ucansi Inc | Systems and methods for improving visual perception |
WO2006074434A2 (fr) * | 2005-01-06 | 2006-07-13 | University Of Rochester | Systemes et methodes permettant d'ameliorer la discrimination visuelle |
CA2655438A1 (fr) * | 2006-06-30 | 2008-01-10 | Novavision, Inc. | Diagnostic et systeme therapeutique pour vision peripherique |
US20110066069A1 (en) | 2009-09-16 | 2011-03-17 | Duffy Charles J | Method and system for quantitative assessment of visual form discrimination |
US8646910B1 (en) * | 2009-11-27 | 2014-02-11 | Joyce Schenkein | Vision training method and apparatus |
CN202801563U (zh) * | 2012-08-07 | 2013-03-20 | 北京嘉铖视欣数字医疗技术有限公司 | 基于双眼立体视知觉矫治训练系统 |
2014
- 2014-02-10 WO PCT/US2014/015523 patent/WO2015119630A1/fr active Application Filing
- 2014-02-10 EP EP14881439.5A patent/EP3104764B1/fr active Active
Also Published As
Publication number | Publication date |
---|---|
EP3104764B1 (fr) | 2019-01-09 |
WO2015119630A1 (fr) | 2015-08-13 |
EP3104764A4 (fr) | 2017-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8646910B1 (en) | Vision training method and apparatus | |
US7549743B2 (en) | Systems and methods for improving visual discrimination | |
O'regan et al. | A sensorimotor account of vision and visual consciousness | |
Spence et al. | Spatial constraints on visual-tactile cross-modal distractor congruency effects | |
EP3104764B1 (fr) | Procédé et appareil d'entraînement de la vision | |
US20080013047A1 (en) | Diagnostic and Therapeutic System for Eccentric Viewing | |
O'Regan et al. | Acting out our sensory experience | |
CUMMING | Eye movements and visual perception | |
Gallese et al. | Mirror neurons: A sensorimotor representation system | |
Clark et al. | Sensorimotor chauvinism? | |
Ryan et al. | The existence of internal visual memory representations | |
Block | Behaviorism revisited | |
Schaadt et al. | Vision and visual processing deficits | |
Scholl et al. | Change blindness, Gibson, and the sensorimotor theory of vision | |
Revonsuo | Dreaming and the place of consciousness in nature | |
Bartolomeo et al. | Visual awareness relies on exogenous orienting of attention: Evidence from unilateral neglect | |
Cohen | Whither visual representations? Whither qualia? | |
Whittaker et al. | Managing Peripheral Visual Field Loss and Neglect | |
Blackmore | Three experiments to test the sensorimotor theory of vision | |
Chan et al. | Gaze and attention: mechanisms underlying the therapeutic effect of optokinetic stimulation in spatial neglect | |
Van Gulick | Still room for representations | |
Tatler | Re-presenting the case for representation | |
De Graef et al. | Trans-saccadic representation makes your Porsche go places | |
Hardcastle | Visual perception is not visual awareness | |
Humphrey | Doing it my way: Sensation, perception–and feeling red |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20160907 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 20170222 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61B 5/00 20060101ALI20170216BHEP Ipc: A61B 3/00 20060101AFI20170216BHEP |
|
DAX | Request for extension of the european patent (deleted) | ||
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61B 3/00 20060101AFI20180829BHEP Ipc: A61B 5/00 20060101ALI20180829BHEP |
|
INTG | Intention to grant announced |
Effective date: 20180928 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP Ref country code: AT Ref legal event code: REF Ref document number: 1086302 Country of ref document: AT Kind code of ref document: T Effective date: 20190115 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602014039818 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20190109 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1086302 Country of ref document: AT Kind code of ref document: T Effective date: 20190109 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190509 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190409 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190409 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190509 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190410 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602014039818 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190210 Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 |
|
26N | No opposition filed |
Effective date: 20191010 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190228 Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190228 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190210 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20190210 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20140210 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20190109 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20240103 Year of fee payment: 11 Ref country code: GB Payment date: 20240109 Year of fee payment: 11 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20240221 Year of fee payment: 11 Ref country code: BE Payment date: 20240220 Year of fee payment: 11 |