WO2017164383A1 - Visual filter identification method and apparatus - Google Patents
Visual filter identification method and apparatus
- Publication number
- WO2017164383A1 (application PCT/JP2017/012067)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pattern image
- stimulus pattern
- presentation
- stimulus
- image
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0008—Apparatus for testing the eyes; Instruments for examining the eyes provided with illuminating means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0091—Fixation targets for viewing direction
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/02—Subjective types, i.e. testing apparatus requiring the active assistance of the patient
- A61B3/024—Subjective types, i.e. testing apparatus requiring the active assistance of the patient for determining the visual field, e.g. perimeter types
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/02—Subjective types, i.e. testing apparatus requiring the active assistance of the patient
- A61B3/028—Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing visual acuity; for determination of refraction, e.g. phoropters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/145—Illumination specially adapted for pattern recognition, e.g. using gratings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
Definitions
- the present invention relates to a method and apparatus for non-invasively, objectively and quantitatively identifying a visual filter representing the basic performance of a visual function.
- techniques for evaluating visual function include, for example, the "visual acuity test" and the "visual field test". These techniques are used to evaluate the "spatial resolution" and the "viewable area" of vision. As a technique for measuring acuity when viewing a moving object there is, for example, the "dynamic visual acuity test". This technique is used to evaluate visual "spatial resolution" for moving objects.
- in these tests, the subject's subjective judgment is included in the report contents and measurement results.
- the subject can also lie. If the subject reports an answer that differs from the actual perception, whether intentionally or through misunderstanding, data with sufficient objectivity cannot be acquired.
- Non-Patent Document 1 described above does not describe a technique for identifying, non-invasively, objectively and quantitatively, a visual filter representing the basic properties of visual function using eye movement reflexes.
- the present invention proposes the following method as an example.
- (a) sequentially presenting, on a monitor arranged in front of the examination target, an initial image having uniform luminance, a first stimulus pattern image having the same average luminance as the initial image, and a second stimulus pattern image that, together with the first stimulus pattern image, evokes an apparent motion;
- (b) measuring eye movement within a predetermined period during presentation of the second stimulus pattern image, and storing it in association with the presentation time length of the first stimulus pattern image used at the time of measurement;
- (c) taking the execution of the steps shown in (a) and (b) as one trial, and repeating the trial a plurality of times while changing the setting of the presentation time length of the first stimulus pattern image used in each trial;
- (d) calculating the change in gaze direction for each trial of the step shown in (c), based on the measured eye movement data;
- (e) calculating a simulation result by inputting the first stimulus pattern image and the second stimulus pattern image into a motion energy model of the eye movement response;
- (f) optimizing the parameter values of the motion energy model so as to minimize the difference between the measurement waveform specified by the change in gaze direction obtained in the step shown in (d) and its associated presentation time length, and the simulation result obtained in the step shown in (e), thereby calculating a time filter specific to the examination target;
- a visual filter identification method characterized by comprising these steps.
- this makes it possible to identify a visual filter representing a basic property of visual function non-invasively, objectively and quantitatively using the eye movement reflex.
- a figure explaining the measurement section used for identification of the time filter in the ISI inspection method; a figure showing the eyeball position change data measured over multiple trials of the ISI inspection method; a figure arranging the change data shown in FIG. 12 in order of presentation time length; and a figure showing the response characteristics measured by combining the MOD inspection method and the ISI inspection method.
- FIG. 1 shows a schematic configuration of a visual filter identification device 1 used by the inventors for experiments.
- the visual filter identification device 1 includes a stimulus presentation monitor 2, a visual stimulus presentation device 3, an experiment control / data recording device 4, an eye movement measurement device 5, and a data analysis device 6.
- the visual stimulus presentation device 3, the experiment control / data recording device 4, and the data analysis device 6 are all configured by a computer, and functions corresponding to each device are realized through execution of a program.
- the computer is composed of an input / output device that exchanges data with the outside, a storage device that records data, a control device that controls the execution status of the program and the state of each device, an arithmetic device that performs data calculation and processing, and the like.
- the visual stimulus presentation device 3, the experiment control / data recording device 4, and the data analysis device 6 are all realized by independent computers. However, all or some of the functions corresponding to these devices may be realized on one computer.
- the stimulus presentation monitor 2 is a device that presents various images used for time filter identification, and is arranged in front of the examination target (for example, human or animal).
- for the stimulus presentation monitor 2, for example, a CRT (Cathode Ray Tube), a flat panel display such as a liquid crystal display or an organic EL display, or a projector is used.
- a 19-inch CRT monitor (size: 360 × 270 mm, resolution: 1280 × 1024 pixels, refresh rate: 100 Hz) was used.
- the head to be examined was fixed during measurement.
- a method of resting the jaw on a pedestal (chin rest), a method of biting a dental impression called a bite block, or the like is used.
- the distance between the inspection object and the monitor screen was 63.4 cm.
- this figure is an example.
- the visual stimulus presentation device 3 is a device that generates a predetermined image (including a first stimulus pattern image and a second stimulus pattern image described later) to be presented on the stimulus presentation monitor 2.
- MATLAB (registered trademark) and the free software Psychtoolbox, developed in the field of psychology, were used for image generation.
- the experiment control / data recording device 4 is a device that controls the output of various images in conjunction with the visual stimulus presentation device 3.
- the software REX (Real-time EXperimental system), developed by the NIH (National Institutes of Health), was used. REX can be replaced with commercially available software such as LabVIEW (trademark).
- the experiment control / data recording device 4 includes at least an A / D converter and a UDP communication function as hardware.
- REX also records the eyeball position of the examination target.
- the voltage value representing the eyeball position to be examined is converted into a 12-bit data value through the A / D converter.
- the converted data values are collected every 1 millisecond and recorded on a hard disk.
- the experiment control / data recording device 4 instructs the visual stimulus presentation device 3 to present a circular gaze target 0.5° in diameter at positions 5° up, down, left, and right, and, while each gaze target is presented, the voltage output measured by the eye movement measuring device 5 is recorded on a hard disk or the like.
- the eye movement measuring device 5 is a device that measures the eye movement caused by presentation of the image that evokes the apparent motion. Basically, eye movement is measured continuously while the examination target is watching the monitor. However, eye movement measurement may be performed only during the period necessary for the data analysis described later, or may be performed while any image, including a gray image, is displayed on the stimulus presentation monitor 2.
- a method using the first and fourth Purkinje images, a search coil method, and a scleral reflection method can be used.
- a method in which the eyeball part to be examined is imaged with a video camera and the movement of the eyeball is extracted from the captured image.
- the eyeball position (the direction of the line of sight) of the subject was measured using a method using the first and fourth Purkinje images (Dual-Purkinje Image Eye Tracker, produced by Forward Technology).
- the eyeball position is obtained as a voltage signal of the eye movement measuring device.
- the data analysis device 6 is a device that analyzes the data measured during presentation of the images that evoke the apparent motion and executes a calculation process for identifying a visual filter (including a time filter) specific to the examination target.
- the time filter is identified by optimizing the parameters of the motion energy model so that the simulation result calculated from the motion energy model of the eye movement response matches the measurement data.
- the term "motion energy model of the eye movement response" is used here to mean any motion energy model that can explain the eye movement response.
- the model proposed by Adelson and Bergen in 1985 to explain the subjective perception of motion is used as an example of the motion energy model of the eye movement response.
- the Elaborated Reichardt model or other models may also be used.
- (1-2) MOD (Motion Onset Delay) Inspection Method
- a method called the MOD inspection method is applied as a method for measuring the reflexively induced eye movement response.
- the measurement data required for identifying the time filter is collected by repeating the trial, which is a unit of image presentation, a plurality of times.
- FIGS. 2 and 3 show the arrangement of the images constituting one trial, the unit of presentation, in the MOD inspection method. The presentation of the images for one trial shown in FIGS. 2 and 3 is executed by the visual stimulus presentation device 3 under the control of the experiment control / data recording device 4.
- each trial consists of (1) a period during which the gaze target and the first gray image are presented on the monitor screen, (2) a period during which the gaze target and the first stimulus pattern image are presented on the monitor screen, (3) a period during which the second stimulus pattern image is presented on the monitor screen, and (4) a period during which the second gray image is presented on the monitor screen.
- the first gray image and the second gray image both have a uniform luminance value (for example, 5 cd/m²) over the entire screen.
- the luminance value of the first gray image and the luminance value of the second gray image may not be the same.
- the luminance value of the first gray image and the average luminance value of the first stimulus pattern image and the second stimulus pattern image are the same.
- Each trial is started by presenting a circular gaze target with a diameter of 0.5 ° in the center of the first gray image.
- the presentation time of the first gray image is randomly varied between 500 and 1000 milliseconds so as to differ for each trial.
- the presentation time of the first gray image may be a fixed length between 500 and 1000 milliseconds, for example.
- the first stimulus pattern image is a vertical stripe pattern whose luminance varies sinusoidally along the horizontal direction (spatial frequency: 0.25 cycles/deg, Michelson contrast: 32%, average luminance: 5 cd/m²).
- the luminance of the first stimulus pattern image is expressed by 2048 gray levels.
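As an illustration of the stimuli described above, the sinusoidal gratings could be generated as in the following sketch (NumPy; the screen size in degrees and the pixels-per-degree value are illustrative assumptions, not taken from the patent):

```python
import numpy as np

# Sketch of the first and second stimulus pattern images: vertical gratings
# whose luminance varies sinusoidally along the horizontal direction.
# Spatial frequency 0.25 cycles/deg, Michelson contrast 32%, and mean
# luminance 5 cd/m^2 follow the text; width/height/px_per_deg are assumed.
def make_grating(width_deg, height_deg, px_per_deg, phase_deg=0.0,
                 sf_cpd=0.25, contrast=0.32, mean_lum=5.0):
    x_deg = np.arange(int(width_deg * px_per_deg)) / px_per_deg
    row = mean_lum * (1.0 + contrast *
                      np.sin(2 * np.pi * sf_cpd * x_deg
                             + np.radians(phase_deg)))
    return np.tile(row, (int(height_deg * px_per_deg), 1))

first = make_grating(32, 24, 40)                 # first stimulus pattern image
second = make_grating(32, 24, 40, phase_deg=90)  # 90 deg (1/4 wavelength) shift
```

The 90° phase shift used for `second` corresponds to the second stimulus pattern image described later; the sign of the shift determines the direction of the apparent motion.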
- the presentation time length of the first stimulus pattern image is randomly varied so as to differ for each trial. This prevents the examination target from predicting the presentation time length. However, if it is guaranteed that prediction by the examination target has no influence, the order need not be random, and the presentation time length of each trial may be changed based on a predetermined pattern.
- the presentation time length of the first stimulus pattern image in each trial is randomly selected from, for example, 0, 10, 20, 30, 40, 60, 80, 160, 320, or 640 milliseconds.
- the presentation time length of 0 milliseconds means that the first stimulus pattern image is not presented.
- one presentation time length is randomly selected from the 10 presentation time lengths so that the same presentation time length does not appear repeatedly during 10 trials, and is used to present the first stimulus pattern image. Note that an order in which the presentation time length monotonically increases, or monotonically decreases, over the 10 trials is not desirable.
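The block randomization described above could be sketched as follows (a sketch; rejecting monotone orders by re-shuffling is one simple way to satisfy the stated condition):

```python
import random

# The 10 candidate presentation time lengths given in the text (milliseconds).
DURATIONS_MS = [0, 10, 20, 30, 40, 60, 80, 160, 320, 640]

def block_schedule(rng=random):
    # Shuffle so each duration appears exactly once per 10-trial block, and
    # re-shuffle if the order is monotonically increasing or decreasing,
    # which the text says is undesirable.
    order = DURATIONS_MS[:]
    while True:
        rng.shuffle(order)
        if order != sorted(order) and order != sorted(order, reverse=True):
            return order
```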
- the first stimulus pattern image may be any pattern that causes an apparent motion between the first stimulus pattern image and the second stimulus pattern image. Accordingly, the first stimulus pattern image is not limited to the vertical stripe pattern described above; a horizontal stripe pattern whose luminance varies sinusoidally along the vertical direction, a diagonal stripe pattern whose luminance varies sinusoidally along a diagonal direction, or a lattice pattern may also be used. Further, the luminance variation in the first stimulus pattern image is not limited to a sine wave and may vary in a binary manner.
- the gaze target disappears from the monitor screen, and at the same time, the presentation of the second stimulus pattern image is started.
- the above-mentioned MOD (Motion Onset Delay) is the time interval from when the first stimulus pattern image is presented until the second stimulus pattern image is presented (that is, the presentation time length of the first stimulus pattern image).
- the second stimulus pattern image in the present embodiment is the same pattern image as the first stimulus pattern image (in this embodiment, a sinusoidal vertical stripe pattern), but with its phase shifted to the right or left by 90° (1/4 wavelength) with respect to the first stimulus pattern image. Note that the phase shift φ is arbitrary as long as it is in the range 0° < φ < 180°. However, the amount of phase shift is the same for every trial.
- the second stimulus pattern image is presented for 200 milliseconds. Of course, 200 milliseconds is an example.
- for the measurement data of the change in eyeball position that occurs reflexively when the image presented on the monitor screen switches from the first stimulus pattern image to the second stimulus pattern image, an interval of 50 to 200 milliseconds (preferably 80 to 160 milliseconds) after presentation of the second stimulus pattern image starts is used. This is because the reflexive eye movement is delayed relative to the change of the stimulus pattern.
- the section used for measuring eye movement is an example, and it is desirable to set an appropriate time according to the contrast and spatial frequency of the apparent movement stimulus.
- in order to execute the data analysis described later, which excludes trials containing saccadic movements (gaze shifts due to high-speed eye movement), it is desirable to start recording eye movement at least from a point before the presentation of the second stimulus pattern image starts (for example, 50 milliseconds before).
- when the presentation time length of the second stimulus pattern image has elapsed, the second gray image is presented on the monitor screen. No gaze target is presented in the second gray image.
- the presentation time length of the second gray image in the present embodiment is about 1.5 seconds, and when this presentation time length elapses, the next trial is started.
- a total of 20 trials (10 types of presentation time length prepared for the first stimulus pattern image × 2 motion directions, rightward and leftward) are executed as one block.
- the movement direction may be only one direction.
- the presentation time length or the type of movement direction may be increased.
- it is desirable that trials with the same presentation time length and the same motion direction are executed a plurality of times. If a plurality of measurement data can be collected for the same condition, measurement noise can be reduced by calculating their average value.
- FIG. 4 shows an example of a waveform obtained by converting data (change in eyeball position) measured under a stimulus condition in which the presentation time length of the first stimulus pattern image is 10 milliseconds into a change speed of the eyeball position.
- FIG. 4 shows the waveform of the change speed corresponding to the average of multiple measurement data acquired over multiple trials performed under the same stimulus condition. Specifically, it shows the waveform obtained by subtracting the measurement data when leftward motion was given from the measurement data when rightward motion was given.
- the horizontal axis of FIG. 4 is the elapsed time (milliseconds), with zero at the start of presentation of the second stimulus pattern image in the measurement section shown in FIG. 3, and the vertical axis is the change speed of the eyeball position.
- the process of converting the eyeball position into the change speed is executed by the data analysis device 6. Further, the experiment control / data recording device 4 records the measurement data of each measurement on the hard disk or the like in association with the presentation time length of the first stimulus pattern image and the direction of the apparent motion stimulus. Therefore, the measurement data shown in FIG. 5 are recorded on the hard disk at the end of all trials.
- FIG. 4 is a diagram for explaining the mechanism of the data analysis. In the actual data analysis operation, the eyeball position change data corresponding to each presentation time length are calculated directly from the position data, without calculating the average of the eyeball position change speed data acquired over multiple trials.
- FIG. 5 is a diagram in which the measurement data (the change speed of the eyeball position shown in FIG. 4) measured with various presentation time lengths of the first stimulus pattern image are rearranged in order of the presentation time length used at the time of measurement.
- the integral value over the period of 80 to 160 milliseconds indicates the change in eyeball position corresponding to each presentation time length, and from it a time filter specific to the examination target is identified.
- FIG. 5 is also a drawing for explaining the mechanism of the data analysis. In the actual data analysis operation, the eyeball position change data corresponding to each presentation time length are calculated directly from the eyeball position data, without calculating the integral value.
- the data analysis device 6 analyzes the measurement data read from the experiment control / data recording device 4 using MATLAB (registered trademark). Specifically, the eyeball position change data corresponding to each presentation time length are analyzed using a motion energy model to identify the time filter. Data analysis is performed according to the following procedure.
- the data analysis device 6 uses a digital low-pass filter (for example, a 4-pole Butterworth filter, -3 dB at 25 Hz) to remove high-frequency noise from the eyeball position data measured in each trial.
- the data analysis device 6 takes the time difference of the noise-removed eyeball position data to calculate the change speed (velocity) of the eyeball position.
- the data analysis device 6 takes the time difference of the eyeball velocity data to calculate the acceleration of the eyeball position change.
- using the eyeball velocity and acceleration data, the data analysis device 6 excludes trials in which a saccadic movement (gaze shift due to high-speed eye movement) occurred within the time interval from 50 milliseconds before the start of presentation of the second stimulus pattern image (i.e., during the presentation period of the first stimulus pattern image) to 200 milliseconds after the start of presentation of the second stimulus pattern image. In this embodiment, a saccadic movement is defined as an eye movement whose eyeball velocity exceeds 30 deg/s or whose eyeball acceleration exceeds 1000 deg/s².
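The saccade-exclusion criterion above could be sketched as follows (the velocity and acceleration thresholds are those given in the text; everything else is illustrative):

```python
import numpy as np

# Sketch of detecting saccadic movement in a trial. Position is in degrees,
# sampled every millisecond; velocity and acceleration are obtained by time
# differences, and a saccade is defined as velocity exceeding 30 deg/s or
# acceleration exceeding 1000 deg/s^2.
def contains_saccade(position_deg, dt_s=0.001,
                     vel_limit=30.0, acc_limit=1000.0):
    velocity = np.diff(position_deg) / dt_s        # deg/s
    acceleration = np.diff(velocity) / dt_s        # deg/s^2
    return bool(np.any(np.abs(velocity) > vel_limit)
                or np.any(np.abs(acceleration) > acc_limit))
```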
- the data analysis device 6 then calculates the change (deg) in eyeball position based only on the measurement data of trials in which no saccadic movement was detected, using the data measured up to 160 milliseconds after the start of presentation of the second stimulus pattern image.
- to calculate the eye movement response generated in the 80 to 160 millisecond interval from the start of presentation of the second stimulus pattern image, the change in eye position was calculated by subtracting the eyeball position data at the time corresponding to 80 milliseconds from the eyeball position data at the time corresponding to 160 milliseconds.
- an integral value of a change speed of the eyeball position in a period of, for example, 80 milliseconds to 160 milliseconds may be used instead.
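The position-difference computation described above could be sketched as follows (a sketch assuming 1 kHz sampling, so milliseconds index samples directly):

```python
import numpy as np

# Change in eye position over the 80-160 ms window after the onset of the
# second stimulus pattern image, computed as a position difference.
def response_magnitude(position_deg, onset_idx, t0_ms=80, t1_ms=160):
    return position_deg[onset_idx + t1_ms] - position_deg[onset_idx + t0_ms]
```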
- the data analysis device 6 calculates the average of a plurality of measurement data collected for the same stimulus condition in order to obtain an eye movement response corresponding to each presentation time length.
- the data analysis device 6 calculates the difference between the average eye movement response when the second stimulus pattern image moves rightward with respect to the first stimulus pattern image and the average eye movement response when it moves leftward. Note that when the direction of the apparent motion stimulus is reversed, the direction of the eye movement response is also reversed (the sign flips), so calculating the difference has the effect of doubling the measured value.
- in this way, the data analysis device 6 obtains the response characteristics (measurement data) of the examination target to each stimulus condition.
- FIG. 6 shows an image of the measured response characteristics.
- the horizontal axis in FIG. 6 is the presentation time length of the first stimulus pattern image, and the vertical axis is the magnitude of the corresponding response (the magnitude of the change in eyeball position occurring between 80 and 160 milliseconds from the start of presentation of the second stimulus pattern image).
- the data analysis device 6 applies the images of the same stimulus conditions as input values to the motion energy model (Adelson & Bergen, 1985) including the time filter given by the following equation, and calculates the corresponding simulation results.
- k is the time scale, b is the magnitude of the negative component, and N is a parameter that defines the order.
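The equation itself did not survive extraction here. A commonly used form of the Adelson–Bergen temporal filter with exactly these three parameters is the following; this is a hedged reconstruction, not necessarily the patent's exact expression:

```latex
f(t) = (kt)^{N} e^{-kt} \left[ \frac{1}{N!} - b\,\frac{(kt)^{2}}{(N+2)!} \right]
```

Here k sets the time scale, b the magnitude of the negative (biphasic) lobe, and N the order of the filter, matching the parameter descriptions above.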
- the motion energy model includes two time filters, a Fast filter and a Slow filter, which differ only in order.
- the order of the Fast filter is N_fast and the order of the Slow filter is N_slow (> N_fast).
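Assuming the standard Adelson–Bergen functional form sketched above (an assumption; the patent's exact expression is not reproduced here), the two filters could be computed as:

```python
import numpy as np
from math import factorial

# Sketch of the Fast and Slow temporal filters, which differ only in order.
# f(t) = (kt)^N exp(-kt) [1/N! - b (kt)^2 / (N+2)!]; the k and b values are
# illustrative, while N = 3 (fast) and N = 6 (slow) follow the later text.
def temporal_filter(t_s, k=100.0, b=0.9, n=3):
    kt = k * np.asarray(t_s)
    return (kt ** n) * np.exp(-kt) * (1.0 / factorial(n)
                                      - b * kt ** 2 / factorial(n + 2))

t = np.arange(300) / 1000.0        # 0-299 ms at 1 kHz
fast = temporal_filter(t, n=3)     # Fast filter
slow = temporal_filter(t, n=6)     # Slow filter peaks later
```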
- the initial value prepared for each parameter is used for the first simulation calculation.
- the data analysis device 6 takes as inputs the first stimulus pattern image and the second stimulus pattern image of each trial, repeated a plurality of times while changing the presentation time length of the first stimulus pattern image, and calculates a simulation result for each of these multiple inputs.
- the output of the motion energy model is the difference between the rightward energy and the leftward energy (hereinafter referred to as the "motion energy model output"), and is given as a function of time.
- the average value of the motion energy model output over the time corresponding to 0 to 160 milliseconds is used as the simulation result.
- the interval for calculating the average value is not limited to 0 to 160 milliseconds.
- an integrated value may be used instead of the average value.
- the data analysis device 6 optimizes the parameters of the time filter in the motion energy model so that the difference between the simulation results of the motion energy model and the response characteristics (measurement data) quantified for each stimulus condition is minimized.
- in this embodiment, the order N_fast of the fast time filter and the order N_slow of the slow time filter are fixed to 3 and 6, respectively.
- N_fast and N_slow may also be optimized simultaneously.
- the parameter values of the motion energy model are optimized so that the difference between the simulation results and the average eye movement response corresponding to each presentation time length is minimized. Rather than fitting the time waveform of the eye movement itself, the optimization matches the average magnitude of the eye movement response, which is determined by the presentation time length of the first stimulus pattern image, to the output of the model.
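The optimization step could be sketched as follows (SciPy; `simulate` here is a deliberately simplified stand-in for the full motion energy model simulation, used only to illustrate the least-squares fitting loop):

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder response model: a saturating function of the presentation time
# length with two free parameters. This is an illustrative assumption, not
# the patent's actual model computation.
def simulate(params, durations_ms):
    k, b = params
    t = np.asarray(durations_ms, dtype=float) / 1000.0
    return b * (1.0 - np.exp(-k * t))

def fit_parameters(durations_ms, measured, x0=(5.0, 1.0)):
    # Minimize the squared difference between simulated and measured
    # responses over all presentation time lengths.
    def loss(params):
        return np.sum((simulate(params, durations_ms) - measured) ** 2)
    return minimize(loss, x0, method="Nelder-Mead").x
```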
- FIG. 7 shows an image of optimization.
- when the parameter values that minimize the difference between the simulation results of the motion energy model and the response characteristics (measurement data) quantified for each stimulus condition are determined, the time filter of the visual system, which underlies the temporal resolution of the vision of the examination target, is quantitatively identified.
- FIG. 8 shows an example of the identified time filter. The figure is an example of a time filter that best reproduces the measurement data.
- the data analysis device 6 performs Fourier analysis on the identified time filter, and calculates characteristics in the frequency domain.
- FIG. 9 shows an example of the frequency characteristic of the calculated visual filter.
- the visual filter includes the filter representing the frequency characteristics as well as the time filter described above. From the visual filter, quantitative information representing the temporal resolution and properties of the visual system can be obtained, such as the optimal temporal frequency, the passband, and the optimal speed.
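The frequency-domain step could be sketched as follows (a sketch; defining the passband as the half-amplitude band is an assumption for illustration):

```python
import numpy as np

# Fourier analysis of an identified time filter: amplitude spectrum, the
# frequency at its peak (optimal temporal frequency), and the band where
# the amplitude stays above half of the peak.
def frequency_characteristics(filter_taps, fs_hz=1000.0):
    spectrum = np.abs(np.fft.rfft(filter_taps))
    freqs = np.fft.rfftfreq(len(filter_taps), d=1.0 / fs_hz)
    optimal_hz = freqs[np.argmax(spectrum)]
    passband = freqs[spectrum >= spectrum.max() / 2.0]
    return optimal_hz, (float(passband.min()), float(passband.max()))
```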
- the technology according to the present embodiment examines the performance of the most basic function (the visual filter) of the visual system (retina, optic nerve, primary visual cortex, and higher visual cortex, i.e., the occipital and parietal association areas), and can also be used to develop new quantitative diagnostic methods for normal and abnormal motion vision and spatial vision. Of course, the technology according to the present embodiment can also be used to develop test methods for measuring the effects of treatment, rehabilitation, training, and the like.
- the technology according to the embodiment can be used for the examination of basic visual functions in various medical departments (pediatrics, ophthalmology, neurology, psychiatry, rehabilitation, etc.) in the fields of medicine and welfare, and for the development of testing equipment. Moreover, (1) infants and children through adults, (2) patients who cannot answer questions properly, and (3) non-human animals can all be included as examination targets.
- since the technique according to the embodiment is based on the reflexive eye movement response, it can be applied repeatedly to the same person. Longitudinal examinations are therefore possible, and the technique can be used to measure the effects of development and treatment. For example, it can be used in devices and products for evaluating changes in visual function associated with development and aging, and visual function disorders associated with neuropsychiatric disorders.
- the visual filter that can be quantified in this embodiment is one of the factors that determine the dynamic visual acuity.
- conventional dynamic visual acuity tests measure the spatial resolution of "how much detail can be seen in a moving object", whereas the technology according to this embodiment measures the temporal resolution for seeing "changes" in the visual image, and relates to the recognition of the motion itself.
- therefore, a dimension of dynamic visual acuity that has not been measured so far can be measured. For example, it can be expected to be used in new visual function tests for evaluating the visual ability of athletes, and can also be applied to product development in industry.
- the MOD inspection method described in the first embodiment can be used in combination with the ISI inspection method.
- the ISI inspection method is an inspection method for measuring the tracking eye movement induced when an image having the average luminance value of the stimuli is inserted between a first stimulus pattern image and a second stimulus pattern image that together give an apparent motion stimulus. The inserted image causes the examination subject to perceive motion in the direction opposite to the movement direction of the apparent motion stimulus.
- the measurement based on the ISI inspection method may be executed after the measurement based on the MOD inspection method, or the measurement based on the MOD inspection method may be executed after the measurement based on the ISI inspection method; measurements based on the two methods may be performed alternately; or trials of the MOD inspection method and trials of the ISI inspection method may be mixed and measured in random order.
- an ISI inspection method using the visual filter identification device 1 of FIG. 1 is described below.
- FIGS. 10 and 11 show the image layout of one trial in the ISI inspection method.
- the presentation of an image corresponding to one trial shown in FIGS. 10 and 11 is executed by the visual stimulus presentation device 3 under the control of the experiment control / data recording device 4.
- each trial consists of (1) a period in which the gaze target and the first gray image are presented on the monitor screen, (2) a period in which the gaze target and the first stimulus pattern image are presented on the monitor screen, (3) a period in which the gaze target and the second gray image are presented on the monitor screen, (4) a period in which the second stimulus pattern image is presented on the monitor screen, and (5) a period in which the third gray image is presented on the monitor screen.
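As a rough illustration, the five periods of one trial can be written as an ordered schedule. The helper below is a hypothetical sketch (the function and field names are not part of the embodiment); the durations follow the values given in this embodiment, with the first gray image randomized between 500 and 1000 milliseconds.

```python
import random

# Candidate durations (ms) for the second gray image, per the embodiment.
GRAY2_CHOICES_MS = [0, 10, 20, 30, 40, 60, 80, 160, 320, 640]

def make_trial_schedule(gray2_ms):
    """Return the ordered periods of one ISI trial as
    (label, duration_ms, gaze_target_shown) tuples."""
    return [
        ("first_gray",      random.randint(500, 1000), True),   # period (1)
        ("first_stimulus",  320,                       True),   # period (2)
        ("second_gray",     gray2_ms,                  True),   # period (3)
        ("second_stimulus", 200,                       False),  # period (4), gaze target gone
        ("third_gray",      1500,                      False),  # period (5), ~1.5 s
    ]

schedule = make_trial_schedule(gray2_ms=40)
```

A `gray2_ms` of 0 simply means period (3) is skipped, matching the 0-millisecond case described below.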
- the difference from the first embodiment is that the gaze target and the second gray image are presented between the first stimulus pattern image and the second stimulus pattern image.
- the first gray image, the second gray image, and the third gray image all have a uniform luminance value (for example, 5 cd/m²) over the entire screen.
- the luminance value of the first gray image and the luminance value of the second gray image may be the same as the average luminance value of the first stimulus pattern image and the average luminance value of the second stimulus pattern image.
- the presentation time of the gaze target and the first gray image is randomly varied between 500 and 1000 milliseconds so as to differ for each trial.
- the presentation time of the first gray image may be a fixed length between 500 and 1000 milliseconds, for example.
- the gaze target and the first stimulus pattern image are presented on the monitor screen.
- as the first stimulus pattern image, the same image as in the first embodiment is used. That is, a vertical stripe pattern whose luminance varies sinusoidally along the horizontal direction (spatial frequency: 0.25 cycles/deg, Michelson contrast: 32%, average luminance: 5 cd/m²) is used as the first stimulus pattern image.
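The sinusoidal luminance profile of this stimulus can be illustrated as follows. The spatial frequency, contrast, and mean luminance follow the text; the display geometry (pixels per degree, image width) and the function name are assumptions for the sketch. A `phase_deg` of 90 reproduces the quarter-wavelength shift used for the second stimulus pattern image.

```python
import numpy as np

def make_grating(width_deg=40.0, px_per_deg=16, sf_cpd=0.25,
                 contrast=0.32, mean_lum=5.0, phase_deg=0.0):
    """Return a 1-D horizontal luminance profile in cd/m^2; replicating it
    across rows yields the vertical-stripe pattern image."""
    n_px = int(width_deg * px_per_deg)
    x_deg = np.arange(n_px) / px_per_deg            # horizontal position (deg)
    phase = np.deg2rad(phase_deg)
    # Luminance varies sinusoidally around the mean with Michelson contrast.
    return mean_lum * (1.0 + contrast * np.sin(2 * np.pi * sf_cpd * x_deg + phase))

first = make_grating(phase_deg=0.0)    # first stimulus pattern
second = make_grating(phase_deg=90.0)  # 90-degree (1/4-wavelength) shifted pattern
```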
- the presentation time length of the first stimulus pattern image is a fixed length of 320 milliseconds.
- the gaze target and the second gray image are presented on the monitor screen.
- the presentation time length of the second gray image is randomly varied so as to differ for each trial. This prevents the examination subject from predicting the presentation time length.
- the presentation time length of the second gray image in each trial is randomly selected from, for example, 0, 10, 20, 30, 40, 60, 80, 160, 320, or 640 milliseconds.
- a presentation time length of 0 milliseconds means that the second gray image is not presented.
- within each block of 10 trials, one presentation time length is randomly selected from the 10 presentation time lengths so that the same presentation time length does not appear twice in the block, and the second gray image is presented for the selected length.
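The block-randomized selection described above, in which each of the 10 durations is used exactly once per block of 10 trials, might be sketched as follows; the function name and block count are illustrative assumptions.

```python
import random

DURATIONS_MS = [0, 10, 20, 30, 40, 60, 80, 160, 320, 640]

def blocked_order(n_blocks, rng=random):
    """Return a trial-by-trial list of second-gray-image durations:
    each block of 10 trials is a shuffled permutation of DURATIONS_MS."""
    order = []
    for _ in range(n_blocks):
        block = DURATIONS_MS[:]
        rng.shuffle(block)   # random within the block, so no monotone ramp
        order.extend(block)  # and no duration repeated inside a block
    return order

trial_durations = blocked_order(n_blocks=4)  # 40 trials
```

Shuffling within blocks keeps every duration equally represented while avoiding the undesirable monotonically increasing or decreasing sequences mentioned below.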
- an output pattern in which the presentation time length monotonically increases or monotonically decreases is undesirable.
- the gaze target then disappears from the monitor screen, and at the same time the presentation of the second stimulus pattern image begins.
- as the second stimulus pattern image, the same image as in the first embodiment is used. That is, the second stimulus pattern image is the same pattern as the first stimulus pattern image but with its phase shifted 90° (1/4 wavelength) to the right or left relative to the first stimulus pattern image. The amount of phase shift is the same in every trial. In this embodiment, the second stimulus pattern image is presented for 200 milliseconds. The period used for measuring the reflexive change in eyeball position is the same as in the first embodiment.
- the third gray image is presented on the monitor screen.
- a gaze target is not presented in the third gray image.
- the presentation time length of the third gray image is about 1.5 seconds; when this time elapses, the next trial begins. The number of trials is the same as in the first embodiment.
- FIG. 12 shows an example of a waveform obtained by converting the data (change in eyeball position) measured under the stimulus condition in which the presentation time length of the second gray image is 10 milliseconds into the velocity of the eyeball position.
- FIG. 12 shows the velocity waveform corresponding to the average of the measurement data acquired over multiple trials performed under the same stimulus condition. More precisely, it shows the waveform obtained by subtracting the measurement data obtained when leftward motion was given from the measurement data obtained when rightward motion was given.
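The averaging and direction-subtraction described above can be sketched as follows, assuming the eye-velocity traces are stored as NumPy arrays (the array names and shapes are assumptions, not part of the embodiment).

```python
import numpy as np

def directional_difference(right_trials, left_trials):
    """right_trials, left_trials: (n_trials, n_samples) arrays of eye velocity
    for rightward and leftward apparent motion under one stimulus condition.
    Returns the rightward-average minus the leftward-average trace, which
    cancels direction-independent components of the response."""
    return np.mean(right_trials, axis=0) - np.mean(left_trials, axis=0)
```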
- the horizontal axis in FIG. 12 is the elapsed time (in milliseconds), with zero at the start of presentation of the second stimulus pattern image, and the vertical axis is the velocity of the eyeball position.
- the data analysis device 6 executes the conversion from eyeball position to velocity. In addition, the experiment control / data recording device 4 associates the measurement data of each measurement with the presentation time length of the second gray image and the direction of the apparent motion, and records them on the hard disk or the like. Therefore, at the end of all trials, the measurement data shown in FIG. 13 are recorded on the hard disk.
- FIG. 13 shows the measurement data (the velocity of the eyeball position shown in FIG. 12) measured at the various presentation time lengths of the second gray image, arranged in order of the presentation time length used at the time of measurement.
- the integral over the period from 80 to 160 milliseconds indicates the change in eyeball position corresponding to each presentation time length, and a time filter specific to the examination subject is identified from these values.
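The 80–160 millisecond integration described above could be implemented as in the following sketch; the sampling rate is an assumed value, and time zero is the onset of the second stimulus pattern image as stated for FIG. 12.

```python
import numpy as np

def position_change(velocity, fs_hz=1000, t0_ms=80, t1_ms=160):
    """Integrate an eye-velocity trace (deg/s, sampled at fs_hz) over the
    window [t0_ms, t1_ms), with t=0 at second-stimulus onset.
    Returns the eyeball position change in degrees."""
    i0 = int(t0_ms * fs_hz / 1000)
    i1 = int(t1_ms * fs_hz / 1000)
    dt = 1.0 / fs_hz
    return float(np.sum(velocity[i0:i1]) * dt)  # rectangular-rule integral
```

For example, a constant velocity of 10 deg/s over the 80 ms window would yield a position change of 0.8 deg.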
- FIG. 14 shows the response characteristics obtained by arranging the measurement data of the response characteristics measured using the MOD inspection method and those measured using the ISI inspection method on the same time axis.
- because the presentation time length of the first stimulus pattern image is as long as 320 milliseconds, the ISI measurement data are arranged after the measurement data corresponding to the MOD inspection method.
- the maximum value of the presentation time length of the first stimulus pattern image in the MOD inspection method is 320 milliseconds.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Biophysics (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Pathology (AREA)
- Eye Examination Apparatus (AREA)
Abstract
Description
(a) a step of sequentially presenting, on a monitor placed in front of an examination subject, an initial image having a uniformly constant luminance, a first stimulus pattern image having the same average luminance as the initial image, and a second stimulus pattern image that, together with the first stimulus pattern image, induces apparent motion;
(b) a step of measuring eye movement within a fixed period during the presentation of the second stimulus pattern image, and storing it in association with the presentation time length of the first stimulus pattern image used at the time of measurement;
(c) a step of repeating a plurality of trials, with one execution of steps (a) and (b) constituting one trial, while changing the setting of the presentation time length of the first stimulus pattern image used in each trial;
(d) a step of calculating, for each trial in step (c), a change in gaze direction based on the measured eye movement data;
(e) a step of inputting the first stimulus pattern image and the second stimulus pattern image into a motion energy model of the eye movement response and calculating a simulation result;
(f) a step of optimizing parameter values of the motion energy model so as to minimize the difference between the simulation result obtained in step (e) and a measurement waveform specified by the changes in gaze direction obtained in step (d) and the presentation time lengths associated with those changes, and thereby calculating a time filter specific to the examination subject;
a visual filter identification method characterized by comprising the above steps.
(1-1) Configuration of the visual filter identification device
FIG. 1 shows the schematic configuration of the visual filter identification device 1 that the inventors used in their experiments. The visual filter identification device 1 comprises a stimulus presentation monitor 2, a visual stimulus presentation device 3, an experiment control / data recording device 4, an eye movement measuring device 5, and a data analysis device 6. Of these, the visual stimulus presentation device 3, the experiment control / data recording device 4, and the data analysis device 6 are each implemented on a computer, and the functions corresponding to each device are realized through the execution of programs.
In this embodiment, a technique called the MOD inspection method is applied as a method for measuring reflexively induced eye movement responses. In this embodiment, the measurement data required for identifying the time filter are collected by repeating a trial, the unit of image presentation, a plurality of times. FIGS. 2 and 3 show the arrangement of images constituting one trial as the presentation unit in the MOD inspection method. The presentation of the images corresponding to one trial shown in FIGS. 2 and 3 is executed by the visual stimulus presentation device 3 under the control of the experiment control / data recording device 4.
The data analysis operations executed by the data analysis device 6 are described below. The data analysis device 6 analyzes the measurement data read from the experiment control / data recording device 4 using MATLAB (registered trademark). Specifically, the data on the change in eyeball position corresponding to each presentation time length are analyzed using a motion energy model, and the time filter is identified. The data analysis is executed in the following procedure.
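The core of this analysis, fitting the motion energy model's parameters so that the simulated waveform matches the measured one, amounts to a least-squares optimization. The patent performs this in MATLAB; the Python sketch below, with a generic `simulate` callback standing in for the motion energy model, is a hypothetical illustration only (function names, optimizer choice, and interfaces are assumptions).

```python
import numpy as np
from scipy.optimize import minimize

def fit_time_filter(params0, measured, simulate):
    """Optimize model parameters so that simulate(params) matches the
    measured waveform in the least-squares sense.

    params0  : initial parameter vector for the model
    measured : observed waveform (1-D array)
    simulate : callable mapping a parameter vector to a model waveform
    """
    def loss(params):
        # Sum of squared differences between simulation and measurement.
        return float(np.sum((simulate(params) - measured) ** 2))
    # Gradient-free simplex search; the model need not be differentiable.
    return minimize(loss, params0, method="Nelder-Mead")
```

In the actual procedure, `simulate` would run the motion energy model on the first and second stimulus pattern images, and the optimized parameters would define the time filter specific to the examination subject.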
As described above, because the visual filter identification device 1 of this embodiment uses eye movement responses, it avoids the problems of the prior art (e.g., "the subject must understand the examiner's instructions," "the subject must respond by moving the body voluntarily," "sufficiently objective data cannot be obtained because of the intrusion of arbitrariness"), and the time filter of the visual system, which represents the temporal resolution of the visual system, can be identified non-invasively, objectively, and quantitatively.
(2-1) ISI (Inter-Stimulus Interval) inspection method
The MOD inspection method described in Embodiment 1 can also be used in combination with the ISI inspection method. The ISI inspection method is an inspection method that measures the tracking eye movement induced when an image having the average luminance value of a first stimulus pattern image and a second stimulus pattern image, which together give an apparent motion stimulus, is inserted between them; it causes the examination subject to perceive motion in the direction opposite to the movement direction of the apparent motion stimulus.
As described above, by executing the MOD inspection and the ISI inspection while changing the combination of images presented on the monitor screen and analyzing their measurement data, visual filters relating not only to eye movement responses but also to subjective visual perception can be identified non-invasively, objectively, and quantitatively using a single visual filter identification device 1.
2 … stimulus presentation monitor,
3 … visual stimulus presentation device,
4 … experiment control / data recording device,
5 … eye movement measuring device,
6 … data analysis device.
Claims (8)
- (a) a step of sequentially presenting, on a monitor placed in front of an examination subject, an initial image having a uniformly constant luminance, a first stimulus pattern image having the same average luminance as the initial image, and a second stimulus pattern image that, together with the first stimulus pattern image, induces apparent motion;
(b) a step of measuring eye movement within a fixed period during the presentation of the second stimulus pattern image, and storing it in association with the presentation time length of the first stimulus pattern image used at the time of measurement;
(c) a step of repeating a plurality of trials, with one execution of steps (a) and (b) constituting one trial, while changing the setting of the presentation time length of the first stimulus pattern image used in each trial;
(d) a step of calculating, for each trial in step (c), a change in gaze direction based on the measured eye movement data;
(e) a step of inputting the first stimulus pattern image and the second stimulus pattern image into a motion energy model of the eye movement response and calculating a simulation result;
(f) a step of optimizing parameter values of the motion energy model so as to minimize the difference between the simulation result obtained in step (e) and a measurement waveform specified by the changes in gaze direction obtained in step (d) and the presentation time lengths associated with those changes, and thereby calculating a time filter specific to the examination subject;
a visual filter identification method characterized by comprising the above steps. - The visual filter identification method according to claim 1, wherein,
in step (d),
an average value of a plurality of the measurement data measured for the same presentation time length is used as a representative value of the change in gaze direction for that presentation time length. - The visual filter identification method according to claim 1, wherein,
in steps (a) and (b), for each presentation time length of the first stimulus pattern image, a first trial in which a pattern image inducing apparent motion in a first direction is presented as the second stimulus pattern image, and a second trial in which a pattern image inducing apparent motion in a second direction opposite to the first direction is presented, are performed, and
in step (d), the difference between the change in the first trial and the change in the second trial is further calculated for each presentation time length, and the calculated value is used as the representative value of the change corresponding to that presentation time length. - The visual filter identification method according to claim 1, further comprising
(g) a step of Fourier-analyzing the time filter to identify its characteristics in the frequency domain. - The visual filter identification method according to claim 1, wherein
the fixed period in step (b) is the interval from 50 milliseconds to 200 milliseconds after the start of presentation of the second stimulus pattern image. - The visual filter identification method according to claim 1, wherein
the first stimulus pattern image and the second stimulus pattern image are both patterns of equal spatial frequency, and the phase shift θ of the second stimulus pattern image relative to the first stimulus pattern image satisfies 0° < θ < 180° with respect to a given direction. - The visual filter identification method according to claim 1, wherein
a gaze target is displayed in the initial image and the first stimulus pattern image. - A visual filter identification device comprising: a visual stimulus presentation unit that sequentially presents, on a monitor placed in front of an examination subject, an initial image having a uniformly constant luminance, a first stimulus pattern image having the same average luminance as the initial image, and a second stimulus pattern image that, together with the first stimulus pattern image, induces apparent motion, the visual stimulus presentation unit repeating the presentation of the initial image, the first stimulus pattern image, and the second stimulus pattern image a plurality of times while changing the setting of the presentation time length of the first stimulus pattern image used in each trial;
a data recording unit that records measurement data of eye movement measured within a fixed period during the presentation of the second stimulus pattern image, in association with the presentation time length of the first stimulus pattern image used at the time of measurement; and
a data analysis unit having a first calculation unit that calculates changes in gaze direction based on the eye movement measurement data measured for each presentation time length, a second calculation unit that inputs the first stimulus pattern image and the second stimulus pattern image into a motion energy model of the eye movement response and calculates a simulation result, and a third calculation unit that optimizes parameter values of the motion energy model so as to minimize the difference between the simulation result and a measurement waveform specified by the presentation time lengths and the corresponding changes in gaze direction, and calculates a time filter specific to the examination subject;
the visual filter identification device being characterized by the above.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018507448A JP6765131B2 (ja) | 2016-03-24 | 2017-03-24 | Visual filter identification method and device |
EP17770423.6A EP3434194A4 (en) | 2016-03-24 | 2017-03-24 | METHOD AND DEVICE FOR IDENTIFYING VISUAL FILTER |
US16/087,807 US20210177257A1 (en) | 2016-03-24 | 2017-03-24 | Visual filter identification method and device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016059657 | 2016-03-24 | ||
JP2016-059657 | 2016-03-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017164383A1 true WO2017164383A1 (ja) | 2017-09-28 |
Family
ID=59900436
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/012067 WO2017164383A1 (ja) | 2016-03-24 | 2017-03-24 | 視覚フィルタ同定方法及び装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20210177257A1 (ja) |
EP (1) | EP3434194A4 (ja) |
JP (1) | JP6765131B2 (ja) |
WO (1) | WO2017164383A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019167899A1 (ja) * | 2018-03-01 | 2019-09-06 | 株式会社Jvcケンウッド | Visual function detection device, visual function detection method, and program |
JP2019150253A (ja) * | 2018-03-01 | 2019-09-12 | 株式会社Jvcケンウッド | Visual function detection device, visual function detection method, and program |
JP2019150252A (ja) * | 2018-03-01 | 2019-09-12 | 株式会社Jvcケンウッド | Visual function detection device, visual function detection method, and program |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014021673A (ja) * | 2012-07-17 | 2014-02-03 | Toshiba Corp | Image presentation device and method |
JP2014236944A (ja) * | 2013-06-07 | 2014-12-18 | イスイックス・ワールド株式会社 | Eye movement measuring device |
JP2015072709A (ja) * | 2014-11-28 | 2015-04-16 | 株式会社東芝 | Image presentation device, method, and program |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4570796B2 (ja) * | 2001-02-08 | 2010-10-27 | 株式会社トプコン | Contrast chart device |
WO2002062208A1 (en) * | 2001-02-08 | 2002-08-15 | Topcon Corporation | Contrast chart apparatus, contrast sensitivity measuring apparatus, and contrast sensitivity measuring method |
GB0513603D0 (en) * | 2005-06-30 | 2005-08-10 | Univ Aberdeen | Vision exercising apparatus |
US20070166675A1 (en) * | 2005-12-15 | 2007-07-19 | Posit Science Corporation | Cognitive training using visual stimuli |
US8992019B2 (en) * | 2012-01-06 | 2015-03-31 | Baylor College Of Medicine | System and method for evaluating ocular health |
-
2017
- 2017-03-24 EP EP17770423.6A patent/EP3434194A4/en not_active Withdrawn
- 2017-03-24 WO PCT/JP2017/012067 patent/WO2017164383A1/ja active Application Filing
- 2017-03-24 JP JP2018507448A patent/JP6765131B2/ja not_active Expired - Fee Related
- 2017-03-24 US US16/087,807 patent/US20210177257A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014021673A (ja) * | 2012-07-17 | 2014-02-03 | Toshiba Corp | Image presentation device and method |
JP2014236944A (ja) * | 2013-06-07 | 2014-12-18 | イスイックス・ワールド株式会社 | Eye movement measuring device |
JP2015072709A (ja) * | 2014-11-28 | 2015-04-16 | 株式会社東芝 | Image presentation device, method, and program |
Non-Patent Citations (2)
Title |
---|
Burr & Morrone, J. Opt. Soc. Am. A, 1993 |
See also references of EP3434194A4 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019167899A1 (ja) * | 2018-03-01 | 2019-09-06 | 株式会社Jvcケンウッド | Visual function detection device, visual function detection method, and program |
JP2019150253A (ja) * | 2018-03-01 | 2019-09-12 | 株式会社Jvcケンウッド | Visual function detection device, visual function detection method, and program |
JP2019150252A (ja) * | 2018-03-01 | 2019-09-12 | 株式会社Jvcケンウッド | Visual function detection device, visual function detection method, and program |
JP7043890B2 (ja) | 2018-03-01 | 2022-03-30 | 株式会社Jvcケンウッド | Visual function detection device, visual function detection method, and program |
JP7043889B2 (ja) | 2018-03-01 | 2022-03-30 | 株式会社Jvcケンウッド | Visual function detection device, visual function detection method, and program |
Also Published As
Publication number | Publication date |
---|---|
US20210177257A1 (en) | 2021-06-17 |
JPWO2017164383A1 (ja) | 2019-02-07 |
EP3434194A1 (en) | 2019-01-30 |
EP3434194A4 (en) | 2019-11-13 |
JP6765131B2 (ja) | 2020-10-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12059207B2 (en) | Cognitive training system with binocular coordination analysis and cognitive timing training feedback | |
US8668337B2 (en) | System for the physiological evaluation of brain function | |
US9101312B2 (en) | System for the physiological evaluation of brain function | |
JP6585058B2 (ja) | Apparatus and method for determining physiological perturbations of a patient | |
EP2482710B1 (en) | System and method for applied kinesiology feedback | |
JP6530239B2 (ja) | Binocular measurement device, binocular measurement method, and binocular measurement program | |
JP2019122816A (ja) | System and method for detecting neurological disorders | |
JP2018520820A (ja) | Method and system for examining aspects of vision | |
JP2012050759A (ja) | Visual fatigue level detection device, visual fatigue level control device, and visual fatigue level detection method | |
US11317861B2 (en) | Vestibular-ocular reflex test and training system | |
WO2017164383A1 (ja) | Visual filter identification method and device | |
AU2017372951A1 (en) | Stimulus and eye tracking system | |
JP2024512045A (ja) | Visual system for diagnosing and monitoring mental health | |
RU2480142C2 (ru) | Устройство и способ дистанционной оценки характеристик зрительного анализатора человека и проведения тренинговых упражнений для развития бинокулярных и высших зрительных функций | |
JP2008206830A (ja) | Schizophrenia diagnosis device and program | |
JP6497005B2 (ja) | Visual function measuring device and visual function measuring program | |
Różanowski et al. | Estimation of operators’ fatigue using optical methods for determination of pupil activity | |
Ichige et al. | Visual assessment of distorted view for metamorphopsia patient by interactive line manipulation | |
RU98682U1 (ru) | Device for assessing the psychophysiological state of a person | |
DK3287074T3 (en) | Method of recording the cerebral cognition time and device for recording the cerebral cognition time | |
JP3937871B2 (ja) | Nystagmus examination device, and program and recording medium used therefor | |
JP7572435B2 (ja) | Ocular system for deception detection | |
US20240268660A1 (en) | Determining a visual performance of an eye of a person | |
JP2006167276A (ja) | Method, device, and program for estimating the amount of fixational eye movement, and recording medium on which the program is recorded | |
RU2484760C1 (ru) | Method for diagnosing the state of the oculomotor muscles | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2018507448 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017770423 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017770423 Country of ref document: EP Effective date: 20181024 |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17770423 Country of ref document: EP Kind code of ref document: A1 |