WO2024043041A1 - Ophthalmologic device, method for controlling ophthalmologic device, program, and recording medium

Info

Publication number: WO2024043041A1
Authority: WO (WIPO, PCT)
Application number: PCT/JP2023/028514
Other languages: French (fr), Japanese (ja)
Inventors: 達夫 山口, 龍 坂東
Applicant: 株式会社トプコン


Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 — Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 — Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/13 — Ophthalmic microscopes
    • A61B 3/135 — Slit-lamp microscopes

Definitions

  • the present invention relates to an ophthalmologic apparatus, a method for controlling an ophthalmologic apparatus, a program, and a recording medium.
  • Ophthalmological equipment includes slit lamp microscopes, fundus cameras, scanning laser ophthalmoscopes (SLO), and optical coherence tomography (OCT) apparatuses.
  • various inspection and measurement devices such as refractometers, keratometers, tonometers, specular microscopes, wavefront analyzers, and microperimeters are also equipped with functions for photographing the anterior segment and fundus of the eye.
  • One such ophthalmological device is the slit lamp microscope, which is also called a stethoscope for ophthalmologists.
  • a slit lamp microscope is an ophthalmological device that illuminates a subject's eye with slit light and observes or photographs the illuminated cross section from the side with a microscope (see, for example, Patent Documents 1 and 2).
  • A slit lamp microscope is also known that can scan a three-dimensional area of the eye to be examined at high speed by using an optical system configured to satisfy the Scheimpflug condition (see, for example, Patent Document 3).
  • An imaging method that scans an object with slit light using a rolling shutter camera is also known.
  • Slit lamp microscopes are used not only to observe ocular tissues (eyelids, conjunctiva, cornea, iris, anterior chamber, crystalline lens, vitreous, retina, etc.), but also to evaluate floating objects in the aqueous humor (in the anterior chamber) (see, for example, Patent Documents 4 and 5). Suspended substances in the aqueous humor include inflammatory cells, white blood cells (macrophages, lymphocytes, etc.), and blood proteins.
  • Patent Document 1: JP 2016-159073 A
  • Patent Document 2: JP 2016-179004 A
  • Patent Document 3: JP 2019-213733 A
  • Patent Document 4: WO 2018/003906
  • Patent Document 5: JP 2022-520832 A
  • One purpose of the present invention is to improve the quality of evaluation of suspended matter in the aqueous humor.
  • One exemplary aspect of the embodiment is an ophthalmological apparatus including a first imaging unit, a second imaging unit, and an evaluation processing unit.
  • the first imaging unit includes a first optical system that satisfies Scheimpflug conditions, and is configured to apply first imaging under first imaging conditions to the anterior segment of the subject's eye.
  • The second imaging unit includes a second optical system that satisfies the Scheimpflug condition, and is configured to apply to the anterior segment, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition.
  • The evaluation processing unit is configured to generate evaluation information of floating objects in the aqueous humor based on the first image generated by the first imaging performed by the first imaging unit and the second image generated by the second imaging performed by the second imaging unit.
  • Another exemplary aspect of the embodiment is a method of controlling an ophthalmological apparatus. This ophthalmologic apparatus includes a first imaging unit, a second imaging unit, and a processor.
  • the first imaging section includes a first optical system that satisfies Scheimpflug conditions.
  • the second photographing section includes a second optical system that satisfies Scheimpflug conditions.
  • the method of this aspect causes the processor of the ophthalmological apparatus to function as a control unit and an evaluation processing unit.
  • The control unit operates to control the first imaging unit to apply first imaging under a first imaging condition to the anterior segment of the eye to be examined, and to control the second imaging unit to apply to the anterior segment, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition.
  • The evaluation processing unit operates to generate evaluation information of floating objects in the aqueous humor based on the first image generated by the first imaging performed by the first imaging unit and the second image generated by the second imaging performed by the second imaging unit.
  • Yet another exemplary aspect of the embodiment is a program for operating an ophthalmological device.
  • This ophthalmologic apparatus includes a first imaging section, a second imaging section, and a processor.
  • the first imaging section includes a first optical system that satisfies Scheimpflug conditions.
  • the second photographing section includes a second optical system that satisfies Scheimpflug conditions.
  • the program of this aspect causes the processor of the ophthalmological apparatus to function as a control unit and an evaluation processing unit.
  • The control unit operates to control the first imaging unit to apply first imaging under a first imaging condition to the anterior segment of the eye to be examined, and to control the second imaging unit to apply to the anterior segment, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition.
  • The evaluation processing unit operates to generate evaluation information of floating objects in the aqueous humor based on the first image generated by the first imaging performed by the first imaging unit and the second image generated by the second imaging performed by the second imaging unit.
  • Yet another exemplary aspect of the embodiment is a computer-readable non-transitory recording medium on which a program for operating an ophthalmological device is recorded.
  • This ophthalmologic apparatus includes a first imaging section, a second imaging section, and a processor.
  • the first imaging section includes a first optical system that satisfies Scheimpflug conditions.
  • the second photographing section includes a second optical system that satisfies Scheimpflug conditions.
  • the program recorded on the recording medium of this aspect causes the processor of the ophthalmological apparatus to function as a control unit and an evaluation processing unit.
  • The control unit operates to control the first imaging unit to apply first imaging under a first imaging condition to the anterior segment of the eye to be examined, and to control the second imaging unit to apply to the anterior segment, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition.
  • The evaluation processing unit operates to generate evaluation information of floating objects in the aqueous humor based on the first image generated by the first imaging performed by the first imaging unit and the second image generated by the second imaging performed by the second imaging unit.
  • FIG. 1 is a schematic diagram illustrating the configuration of an ophthalmologic apparatus according to an exemplary aspect of an embodiment.
  • FIGS. 2A to 2E are timing charts illustrating processing performed by an ophthalmologic apparatus according to an exemplary aspect of the embodiment.
  • The remaining figures are schematic diagrams illustrating configurations of ophthalmologic apparatuses according to exemplary aspects of the embodiments, schematic diagrams for explaining examples of processing performed by such apparatuses, and flowcharts illustrating processing performed by such apparatuses.
  • any known technology can be combined with any aspect of the present disclosure.
  • any matter disclosed in the documents cited herein can be combined with any aspect of the present disclosure.
  • any known technology in the technical field related to the present disclosure can be combined with any aspect of the present disclosure.
  • any technical matter disclosed by the applicant of the present application regarding technology related to the present disclosure can be combined with any aspect of the present disclosure.
  • In the present disclosure, the term circuit configuration or processing circuit configuration includes a general-purpose processor, a special-purpose processor, an integrated circuit, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an ASIC (Application Specific Integrated Circuit), a programmable logic device (for example, an SPLD (Simple Programmable Logic Device), a CPLD (Complex Programmable Logic Device), or an FPGA (Field Programmable Gate Array)), a conventional circuit configuration, or any combination of these, configured and/or programmed to perform at least some of the disclosed functions. A processor is considered to be processing circuitry or circuitry that includes transistors and/or other circuitry.
  • In the present disclosure, the terms circuitry, unit, means, and similar terminology refer to hardware that performs at least some of the functions disclosed herein, hardware that is programmed to perform at least some of those functions, or known hardware programmed and/or configured to perform at least some of the described functions. The hardware may be the processor described above, which may be considered a type of circuitry.
  • In the present disclosure, circuitry, unit, means, or similar terms may also refer to a combination of hardware and software, the software being used to configure the hardware and/or the processor.
  • One purpose of the embodiments of the present disclosure is to improve the quality of evaluation of suspended matter in the aqueous humor.
  • The embodiments of the present disclosure perform a plurality of parallel imaging operations (multiple simultaneous imaging) using a plurality of optical systems that satisfy the Scheimpflug condition, under mutually different imaging conditions, and are configured to evaluate floating objects in the aqueous humor based on the plurality of images (image group) generated by these parallel imaging operations.
  • the object of evaluation according to the embodiments of the present disclosure is not limited to floating objects present in the aqueous humor.
  • Some exemplary embodiments may be configured to evaluate other floating objects present within the eye (e.g., floating objects present in the vitreous), floating objects present on the ocular surface (e.g., floating objects present in the tear fluid), or floating objects present in the ocular appendages (orbit, eyelid, conjunctiva, lacrimal organs, ocular muscles). In such embodiments, one purpose is likewise to improve the quality of evaluation of floating objects within the eye, on the ocular surface, or in the ocular appendages: a plurality of parallel imaging operations (multiple simultaneous imaging) using a plurality of optical systems that satisfy the Scheimpflug condition are performed under mutually different imaging conditions, and the floating objects are evaluated based on the plurality of images (image group) generated by these parallel imaging operations.
  • In the following, an embodiment that deals with floating objects in the aqueous humor will be described in detail, but similar operations can be performed with a similar configuration even when other floating objects are targeted.
  • the present disclosure does not intentionally exclude embodiments that deal with floating matter other than floating matter in the aqueous humor (floating matter that is or can become an evaluation target in ophthalmology).
  • In some exemplary aspects, a group of images is collected from a three-dimensional region of the anterior segment of the eye by repeatedly performing the plurality of parallel imaging operations while moving the plurality of optical systems with respect to the eye to be examined, and suspended matter in the aqueous humor in this three-dimensional region is evaluated based on the image group.
  • Imaging of this kind, which collects images from a three-dimensional region of the anterior segment using optical systems that satisfy the Scheimpflug condition, is referred to as a three-dimensional scan. Embodiments according to the present disclosure utilize this three-dimensional scanning, and are further characterized in that the plurality of parallel imaging operations using the plurality of optical systems are performed under mutually different imaging conditions, and floating objects in the aqueous humor are evaluated based on the plurality of images collected by the parallel imaging.
  • The Scheimpflug condition is a condition relating to the pair of an optical system that projects illumination light onto an object (illumination optical system) and an optical system that photographs the object (photographing optical system). It stipulates that the illumination optical system and photographing optical system are configured so that the object plane, the principal plane of the lens, and the imaging (film) plane intersect on the same straight line.
  • In a camera satisfying the Scheimpflug condition, the object plane is not parallel to the principal plane of the lens, so it is possible to focus simultaneously over a wide depth range, from near objects to far objects.
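  • As an illustration of the geometry just described, the following Python sketch (not part of the patent) numerically checks whether three planes, each given as a normal vector and a point, intersect on a single straight line; the plane orientations used in the example are arbitrary values.

```python
# A minimal numerical check of the Scheimpflug condition: the object plane,
# the lens principal plane, and the image (sensor) plane must intersect on a
# single straight line. Plane orientations below are arbitrary example values.
import numpy as np

def plane_line_intersection(n1, p1, n2, p2):
    """Return (point, direction) of the line where two planes meet."""
    d = np.cross(n1, n2)                      # line direction
    # One point on both planes: n1.x = n1.p1, n2.x = n2.p2, d.x = 0
    A = np.vstack([n1, n2, d])
    b = np.array([n1 @ p1, n2 @ p2, 0.0])
    return np.linalg.solve(A, b), d / np.linalg.norm(d)

def satisfies_scheimpflug(obj, lens, img, tol=1e-9):
    """Each plane is a (normal, point) pair; True if all three share a line."""
    q, d = plane_line_intersection(lens[0], lens[1], img[0], img[1])
    n_obj, p_obj = obj
    on_plane = abs(n_obj @ (q - p_obj)) < tol   # line point lies in object plane
    parallel = abs(n_obj @ d) < tol             # line direction lies in object plane
    return on_plane and parallel

# Example: lens plane x=0, image plane tilted about the z-axis, and an object
# plane chosen to pass through their intersection line (the z-axis).
lens = (np.array([1.0, 0.0, 0.0]), np.zeros(3))
img  = (np.array([np.cos(0.3), np.sin(0.3), 0.0]), np.zeros(3))
obj  = (np.array([np.cos(-0.8), np.sin(-0.8), 0.0]), np.zeros(3))
print(satisfies_scheimpflug(obj, lens, img))  # True
```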
  • the number of optical systems that satisfy Scheimpflug conditions may be arbitrary.
  • Strictly speaking, the Scheimpflug condition is a condition regarding a pair of an illumination optical system and a photographing optical system, so the "number of optical systems" in the embodiments of the present disclosure refers, more precisely, to the number of photographing optical systems.
  • In some exemplary aspects, the number of illumination optical systems (α) and the number of photographing optical systems (β) are different (α ≠ β).
  • When the number of illumination optical systems is smaller than the number of photographing optical systems (α < β), at least one illumination optical system is shared by two or more photographing optical systems.
  • When the number of illumination optical systems is greater than the number of photographing optical systems (α > β), at least one photographing optical system performs photographing using two or more illumination optical systems.
  • In the exemplary embodiments described below, the ophthalmic device includes at least two optical systems (at least a first optical system and a second optical system). The first optical system includes a first illumination optical system and a first photographing optical system configured to satisfy the Scheimpflug condition, and the second optical system includes a second illumination optical system and a second photographing optical system configured to satisfy the Scheimpflug condition. The first illumination optical system and the second illumination optical system are common (that is, the first illumination optical system and the second illumination optical system are the same), while the first photographing optical system and the second photographing optical system are different.
  • Those skilled in the art will be able to understand the configuration of an ophthalmological apparatus in which the number of illumination optical systems and/or the number of imaging optical systems differs from that of this embodiment through the description of this embodiment.
  • The plurality of parallel imaging operations using a plurality of optical systems that satisfy the Scheimpflug condition, performed according to the embodiments of the present disclosure, are executed under mutually different imaging conditions.
  • the imaging conditions may be any conditions that can be employed in ophthalmological imaging.
  • Although the photographing conditions in the exemplary embodiments detailed in this disclosure include at least the exposure time conditions described below, the embodiments are not limited thereto.
  • the imaging conditions in some exemplary embodiments may include conditions that have the same or similar effect as the exposure time conditions.
  • The imaging conditions of some exemplary aspects may include conditions that, in combination with the exposure time conditions, contribute to improving the quality of the embodiments of the present disclosure (for example, improving imaging quality, improving imaging efficiency, or reducing the burden on subjects and/or patients).
  • Types of photographing conditions include, for example, conditions related to the optical system (optical system conditions), conditions related to movement of the optical system (movement conditions), and conditions other than these.
  • Types of optical system conditions include conditions related to the illumination optical system for projecting illumination light onto the eye to be examined (illumination conditions), conditions for the photographing optical system to photograph the eye to be examined using an image sensor (exposure conditions), and the like.
  • Types of illumination conditions include conditions regarding the intensity of illumination light (illumination intensity conditions), conditions regarding the projection time of illumination light (illumination time conditions), etc.
  • the projection time of the illumination light is the length of the period (projection period) during which the illumination light is projected onto the eye to be examined.
  • the projection time of illumination light may be referred to as illumination time.
  • Illumination intensity conditions include conditions related to the light source that emits illumination light (for example, the height of the light source control pulse), conditions related to a neutral density filter provided in the illumination optical system (for example, conditions related to the selection of two or more neutral density filters, or conditions related to control of a variable neutral density filter), and the like.
  • illumination time conditions include conditions related to the light source that emits illumination light (for example, the width of the light source control pulse), conditions related to the shutter provided in the illumination optical system (for example, the opening time of the shutter), etc.
  • The exposure conditions include, for example, conditions related to the image sensor (e.g., the exposure time of the image sensor), conditions related to a shutter provided in the photographing optical system (e.g., the shutter opening time), and the like.
  • the exposure time is the length of a period (exposure period) during which the image sensor can receive light.
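  • To make this condition hierarchy concrete, the following is one possible data representation of imaging conditions (illumination conditions plus exposure conditions); all class and field names are illustrative assumptions, not terms from the patent.

```python
# One possible data representation of the imaging-condition hierarchy described
# above (illumination intensity/time conditions and exposure conditions). All
# class and field names are illustrative, not taken from the patent.
from dataclasses import dataclass

@dataclass
class IlluminationCondition:
    pulse_height: float      # light-source control pulse height -> intensity
    pulse_width_ms: float    # light-source control pulse width  -> illumination time
    nd_filter: str           # selected neutral density filter, if any

@dataclass
class ExposureCondition:
    exposure_time_ms: float  # exposure time of the image sensor

@dataclass
class ImagingCondition:
    illumination: IlluminationCondition
    exposure: ExposureCondition

# Two parallel imaging units with mutually different exposure time conditions:
short_exp = ImagingCondition(IlluminationCondition(1.0, 50.0, "none"),
                             ExposureCondition(exposure_time_ms=2.0))
long_exp  = ImagingCondition(IlluminationCondition(1.0, 50.0, "none"),
                             ExposureCondition(exposure_time_ms=20.0))
assert short_exp.exposure.exposure_time_ms < long_exp.exposure.exposure_time_ms
```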
  • An ophthalmologic apparatus according to the embodiments includes a plurality of optical systems (each a pair of an illumination optical system and a photographing optical system) that satisfy the Scheimpflug condition.
  • In the following, various matters will be explained for the case where two optical systems satisfying the Scheimpflug condition are provided; those skilled in the art will understand that the same matters also hold when three or more optical systems satisfying the Scheimpflug condition are provided.
  • The two optical systems that each satisfy the Scheimpflug condition are referred to as a first optical system and a second optical system. That is, the first optical system satisfies the Scheimpflug condition, and the second optical system satisfies the Scheimpflug condition.
  • The imaging unit including the first optical system is referred to as a first imaging unit, and the imaging unit including the second optical system is referred to as a second imaging unit.
  • In addition to the elements included in the first optical system (optical elements, etc.), the first imaging unit includes, for example, electrical or electronic elements (circuits, connection lines, connectors, etc.), drive mechanisms, fittings, and the like. The same applies to the second imaging unit.
  • The first imaging unit (first optical system) and the second imaging unit (second optical system) perform imaging in parallel in time under mutually different imaging conditions. That is, in some exemplary aspects, while the first imaging unit performs imaging under a first imaging condition (first imaging), the second imaging unit performs imaging under a second imaging condition different from the first imaging condition (second imaging).
  • In some exemplary aspects, the first imaging is performed by the first imaging unit under a first exposure time condition, while the second imaging is performed by the second imaging unit under a second exposure time condition different from the first exposure time condition.
  • In this embodiment, the first optical system and the second optical system include separate image sensors; the image sensor of the first optical system is called the first image sensor, and the image sensor of the second optical system is called the second image sensor.
  • In some exemplary aspects, the first optical system may use a first region of an image sensor as the first image sensor, and the second optical system may use a second region of the same image sensor (a region different from the first region) as the second image sensor.
  • the first exposure time condition includes the exposure time of the first image sensor in the first imaging unit (referred to as the first exposure time). Further, the second exposure time condition includes the exposure time of the second image sensor in the second imaging section (referred to as the second exposure time).
  • the value of the second exposure time for the second imaging section is set larger than the value of the first exposure time for the first imaging section. Note that when three or more imaging units are provided, the three or more exposure times corresponding to the three or more imaging units are different from each other.
  • The value of the first exposure time and the value of the second exposure time may be determined based on arbitrary parameters, such as the characteristics (size, movement speed) of the suspended matter in the aqueous humor that is the detection target, the intensity (light amount) of the illumination light, and the projection time of the illumination light.
  • the value of the first exposure time and/or the value of the second exposure time may be constant or variable.
  • For example, a plurality of values may be provided, each corresponding to one of a plurality of types of suspended matter in the aqueous humor.
  • the value of the first exposure time and/or the value of the second exposure time may be determined by analyzing a preliminary image obtained by photographing the anterior segment of the eye to be examined.
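  • As a sketch of how such a determination might work, the following hypothetical heuristic derives the two exposure times from the mean brightness of a preliminary anterior segment image; the scaling rule and all constants are invented for illustration, not values from the patent.

```python
# A hypothetical heuristic for choosing the two exposure times from a
# preliminary anterior-segment image, as suggested above. The scaling rule and
# all constants are illustrative assumptions, not values from the patent.
import numpy as np

def choose_exposure_times(preliminary_image: np.ndarray,
                          target_mean: float = 100.0,
                          base_short_ms: float = 2.0,
                          long_to_short_ratio: float = 10.0):
    """Scale a baseline short exposure so the mean brightness approaches
    target_mean, then derive the long exposure from a fixed ratio."""
    mean_brightness = float(preliminary_image.mean()) or 1.0  # avoid div by zero
    first_exposure = base_short_ms * target_mean / mean_brightness
    second_exposure = first_exposure * long_to_short_ratio
    return first_exposure, second_exposure

prelim = np.random.default_rng(0).integers(0, 80, size=(480, 640)).astype(float)
t1, t2 = choose_exposure_times(prelim)
print(f"first exposure = {t1:.2f} ms, second exposure = {t2:.2f} ms")
```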
  • the movement conditions are imaging conditions used when performing a three-dimensional scan of the anterior segment of the eye using an optical system that satisfies Scheimpflug conditions.
  • Types of movement conditions include the movement range of the optical system (scan range, scan start position, scan end position), movement distance (scan distance), movement speed (scan speed), movement timing (scan start timing, scan end timing, etc.), and movement modes (continuous movement, intermittent movement, etc.).
  • Conditions other than optical system conditions and movement conditions include conditions related to linkage (synchronization) of two or more conditions, conditions related to the eye to be examined, etc.
  • Conditions related to coordination of two or more conditions include conditions related to coordination of two or more optical system conditions, conditions related to coordination of two or more movement conditions, and conditions related to coordination of one or more optical system conditions with one or more movement conditions.
  • conditions related to the eye to be examined include conditions related to fixation, conditions related to use of a contrast medium, conditions related to use of a mydriatic agent, conditions related to eye characteristics, etc.
  • The eye characteristics may include parameters that affect or can affect the brightness of the eye image, such as pupil diameter and iris color.
  • Embodiments according to the present disclosure may be configured to execute control for realizing the above-described imaging function (imaging).
  • This control may include, for example, one of the following, or a combination of two or more of them: control of the optical system (illumination optical system and/or photographing optical system), control of elements of the imaging unit other than the optical system, and control of elements other than the imaging unit.
  • the embodiment according to the present disclosure may be configured to be able to execute control for realizing functions other than the above-mentioned photographing function.
  • Control of the illumination optical system may be any type of control; for example, it may be electrical control such as light source control (on/off) or electronic shutter control, mechanical control such as mechanical shutter control or rotary shutter control, or a combination of electrical control and mechanical control. Note that these shutters are provided in the illumination optical system and function to switch between passing and blocking the illumination light output from the light source (that is, switching between projecting and not projecting the illumination light onto the subject's eye).
  • Control of the illumination optical system is not limited to control for switching between a state in which illumination light is projected onto the eye to be examined (projection state) and a state in which it is not projected (non-projection state); it may also be control for modulating the intensity or amount of the illumination light.
  • switching between the projection state and the non-projection state corresponds to switching the intensity of the illumination light projected onto the eye to be examined between a positive value and zero.
  • intensity modulation corresponds to switching the intensity of illumination light projected onto the eye to be examined between a first value and a second value that are different from each other.
  • both the first value and the second value are non-negative values, and either one or both of the first value and the second value is a positive value. Therefore, switching between the projection state and the non-projection state can be said to correspond to one example of intensity modulation.
  • Control of the photographing optical system may be any type of control; for example, it may be electrical control such as control of an image sensor or control of an electronic shutter, mechanical control such as control of a mechanical shutter or a rotary shutter, or a combination of electrical control and mechanical control.
  • Embodiments according to the present disclosure employ the imaging method described above; that is, a plurality of parallel imaging operations using a plurality of optical systems satisfying the Scheimpflug condition are performed under mutually different imaging conditions, and suspended matter in the aqueous humor is evaluated based on the plurality of images thus acquired.
  • FIG. 1 shows the configuration of an ophthalmological apparatus according to one aspect of the embodiment.
  • the ophthalmological apparatus 1000 of this embodiment includes an imaging section 1100, a control section 1200, an evaluation processing section 1300, and a movement mechanism 1400.
  • the photographing unit 1100 generates a digital image (Scheimpflug image) by photographing the anterior segment of the eye to be examined using an optical system that satisfies Scheimpflug conditions.
  • the photographing section 1100 includes a first photographing section 1110 and a second photographing section 1120.
  • the first imaging unit 1110 includes an optical system (first optical system) 1111 that satisfies Scheimpflug conditions.
  • the first optical system 1111 includes an image sensor (first image sensor) 1112 for generating a digital image.
  • The first imaging unit 1110 applies imaging to the anterior segment of the subject's eye under preset imaging conditions (first imaging conditions). Imaging performed by the first imaging unit 1110 under the first imaging conditions is referred to as first imaging.
  • the image generated by the first photographing is called a first image.
  • the second photographing unit 1120 includes an optical system (second optical system) 1121 that satisfies Scheimpflug conditions.
  • the second optical system 1121 includes an image sensor (second image sensor) 1122 for generating a digital image.
  • the second imaging unit 1120 applies imaging to the anterior segment of the subject's eye under preset imaging conditions (second imaging conditions). Photographing performed by the second photographing unit 1120 under the second photographing condition is referred to as second photographing.
  • the image generated by the second shooting is called a second image.
  • the configuration of the first imaging unit 1110 and the configuration of the second imaging unit 1120 are (substantially) the same. Note that in some exemplary embodiments, the configuration of the first imaging unit and the configuration of the second imaging unit may be different.
  • A part of the first optical system in the first imaging unit and a part of the second optical system in the second imaging unit may be shared.
  • a configuration may be adopted in which a common objective lens is used as the objective lens of the first optical system and the objective lens of the second optical system.
  • a configuration is adopted in which the light guided by the common optical path of the first optical system and the second optical system is split into two and detected by the first image sensor and the second image sensor, respectively.
  • the configuration in which the first optical system and the second optical system are partially shared is not limited to these examples.
  • the first imaging and the second imaging are performed in parallel. Further, the first imaging condition and the second imaging condition are different from each other.
  • The first imaging condition includes the value of the exposure time of the first image sensor 1112 (first exposure time), and the second imaging condition includes the value of the exposure time of the second image sensor 1122 (second exposure time). In this aspect, it is assumed that the second exposure time is longer than the first exposure time. Since it suffices that the relationship between the exposure time of one imaging unit and the exposure time of the other imaging unit is asymmetric, generality is not lost by this assumption.
  • The control unit 1200 is configured to control each part of the ophthalmologic apparatus 1000. As shown in FIG. 1, the control unit 1200 controls the imaging unit 1100, the evaluation processing unit 1300, the moving mechanism 1400, etc. based on predetermined imaging conditions.
  • control unit 1200 may be configured to control any element of the ophthalmologic apparatus 1000 and/or peripheral devices of the ophthalmologic apparatus 1000.
  • elements and/or peripheral devices of the ophthalmological apparatus 1000 include a user interface, a communication device, an element other than the imaging unit 1100, an element for testing an eye to be examined, and another device in a system including the ophthalmological apparatus 1000. , another device used with ophthalmologic device 1000, and the like.
  • the control unit 1200 includes hardware elements such as a processor and a storage device.
  • the storage device stores computer programs such as control programs.
  • the functions of the control unit 1200 are realized by cooperation between software such as a control program and hardware such as a processor.
  • The control unit 1200 controls the imaging unit 1100 so that the first imaging by the first imaging unit 1110 and the second imaging by the second imaging unit 1120 are executed in parallel and under mutually different conditions. That is, the control unit 1200 of this aspect controls the imaging unit 1100 to cause the first imaging unit 1110 to perform the first imaging under the first imaging condition while causing the second imaging unit 1120 to perform the second imaging under the second imaging condition. More specifically, the control unit 1200 of this embodiment controls the imaging unit 1100 to cause the first imaging unit 1110 to perform the first imaging, in which the exposure time of the first image sensor 1112 is set to the first exposure time, while causing the second imaging unit 1120 to perform the second imaging, in which the exposure time of the second image sensor 1122 is set to a second exposure time longer than the first exposure time.
  • As a result of this control, an image depicting the movement of suspended matter in the aqueous humor over a relatively short period and an image depicting the movement of suspended matter in the aqueous humor over a relatively long period are obtained.
  • That is, the first image generated by the first imaging unit 1110 is an image depicting the movement of suspended matter in the aqueous humor over a relatively short period, and the second image generated by the second imaging unit 1120 is an image depicting the movement of suspended matter in the aqueous humor over a relatively long period.
  • the ophthalmological apparatus 1000 of this embodiment has a novel function of evaluating floating matter in the aqueous humor by comparing these two images. This evaluation process will be described later.
  • the moving mechanism 1400 is configured to move the imaging unit 1100 (at least the first optical system 1111 and the second optical system 1121).
  • The moving mechanism 1400 may include a mechanism that has the same function as moving the imaging unit 1100 (in other words, a mechanism that has the same effect as moving the imaging unit 1100).
  • Examples of such mechanisms include a mechanism that moves the illumination position by deflecting the illumination light (slit light) (an illumination scanner or movable illumination mirror), and a mechanism that moves the photographing position by deflecting the light traveling from the subject's eye toward the imaging unit 1100.
  • The control unit 1200 combines control of the imaging unit 1100 (control of the first imaging unit 1110 and control of the second imaging unit 1120) with control of the moving mechanism 1400 to cause the first imaging unit 1110 and the second imaging unit 1120 to generate a plurality of pairs of a first image and a second image.
  • That is, by combining the control of the first imaging unit 1110, the control of the second imaging unit 1120, and the control of the moving mechanism 1400, the control unit 1200 can perform a scan of a three-dimensional region of the anterior segment of the subject's eye (anterior segment scan), that is, image collection from a three-dimensional region of the anterior segment of the eye to be examined.
  • The anterior segment scan is an imaging mode in which the paired operation of the first imaging by the first imaging unit 1110 and the second imaging by the second imaging unit 1120 is performed multiple times while the positions of the first optical system 1111 and the second optical system 1121 with respect to the anterior segment are changed.
  • As a result, a plurality of image pairs (a plurality of pairs of a first image and a second image) each depicting a different part of the anterior segment are obtained.
  • The scan shown in FIG. 2A is realized by linked control (synchronous control) among the light emission of the light source (output of illumination light; projection of illumination light onto the subject's eye), the exposure of camera A (first image sensor 1112), the exposure of camera B (second image sensor 1122), and the scan position (the position of the moving imaging unit 1100).
  • Specifically, the scan of FIG. 2A combines control to continuously output illumination light, control to repeatedly perform exposure of the first image sensor 1112 with a relatively short exposure time (first exposure time) "a", control to repeatedly perform exposure of the second image sensor 1122 with a relatively long exposure time (second exposure time) "b", and control to move the imaging unit 1100 continuously from the scan start position to the scan end position.
  • Repetitive exposure of the first image sensor 1112 is performed by alternately repeating exposure during the first exposure time and charge transfer (and exposure standby).
  • the repetitive exposure of the second image sensor 1122 is performed by alternately repeating exposure at the second exposure time and charge transfer (and exposure standby).
  • the scanning shown in FIG. 2A has the advantage that illumination light control is simple and scanning can be achieved through relatively simple synchronous control.
  • the intensity (light amount) of the illumination light can be modulated according to the exposure timing of the two image sensors 1112 and 1122.
  • the intensity of the illumination light may be relatively high during the exposure period of the first image sensor 1112, and the intensity of the illumination light may be relatively low during the exposure period of the second image sensor 1122.
  • With such intensity modulation, the brightness of the image obtained by the first image sensor 1112 with a relatively short exposure time can be increased, making the image clearer and of higher definition, while saturation of the brightness of the image obtained by the second image sensor 1122 with a relatively long exposure time can be prevented.
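  • The following sketch models the FIG. 2A style synchronous control described above: continuous illumination, repeated short exposures of camera A, repeated long exposures of camera B, and continuous movement of the optical system from the scan start position to the scan end position; all timing values and the readout model are invented examples, not parameters from the patent.

```python
# A simplified model of the FIG. 2A-style synchronous control: continuous
# illumination, camera A repeating short exposures "a", camera B repeating
# long exposures "b", and the optical system moving continuously from the scan
# start position to the end position. All timing values are invented.
def build_schedule(a_ms=2.0, b_ms=20.0, readout_ms=1.0,
                   scan_ms=200.0, scan_mm=10.0):
    def windows(exp_ms):
        t, out = 0.0, []
        while t + exp_ms <= scan_ms:
            out.append((t, t + exp_ms))       # (exposure start, exposure end)
            t += exp_ms + readout_ms          # charge transfer between frames
        return out
    position = lambda t: scan_mm * t / scan_ms  # continuous movement
    return {"illumination": [(0.0, scan_ms)],   # light stays on for whole scan
            "camera_A": windows(a_ms),
            "camera_B": windows(b_ms),
            "position_at": position}

sched = build_schedule()
print(len(sched["camera_A"]), "short exposures,",
      len(sched["camera_B"]), "long exposures")
print("scan position at t=100 ms:", sched["position_at"](100.0), "mm")
```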
  • The scan shown in FIG. 2B is executed in the same manner as the scan of FIG. 2A with regard to the exposure control of the first image sensor 1112, the exposure control of the second image sensor 1122, and the scan position control; however, as for the illumination control, illumination light is output intermittently (pulsed light emission), unlike the scan of FIG. 2A in which illumination light is emitted continuously.
  • Specifically, the scan of FIG. 2B combines control to intermittently output illumination light, control to repeatedly perform exposure of the first image sensor 1112 with the relatively short first exposure time "a", control to repeatedly perform exposure of the second image sensor 1122 with the relatively long second exposure time "b", and control to move the imaging unit 1100 continuously from the scan start position to the scan end position.
  • Although the scan of FIG. 2B has the disadvantage that the synchronization control is more complicated than that of the scan of FIG. 2A, it has the advantage that the time during which the illumination light is projected onto the eye to be examined can be shortened, so the burden on the subject can be reduced.
  • the intensity (light amount) of the illumination light may be modulated according to the exposure timing of the two image sensors 1112 and 1122.
  • The scan shown in FIG. 2D is executed in the same manner as the scan of FIG. 2A with respect to the illumination light emission control, the exposure control of the first image sensor 1112, and the exposure control of the second image sensor 1122; however, as for the scan position control, the imaging unit 1100 is moved intermittently, unlike the scan of FIG. 2A in which the imaging unit 1100 is moved continuously.
  • Specifically, the scan of FIG. 2D combines control to continuously output illumination light, control to repeatedly perform exposure of the first image sensor 1112 with the relatively short first exposure time "a", control to repeatedly perform exposure of the second image sensor 1122 with the relatively long second exposure time "b", and control to move the imaging unit 1100 intermittently from the scan start position to the scan end position.
  • The scan shown in FIG. 2E is executed in the same manner as the scan of FIG. 2B with regard to the illumination light emission control, the exposure control of the first image sensor 1112, and the exposure control of the second image sensor 1122; however, as for the scan position control, the imaging unit 1100 is moved intermittently, unlike the scan of FIG. 2B in which the imaging unit 1100 is moved continuously.
  • Specifically, the scan of FIG. 2E combines control to intermittently output illumination light, control to repeatedly perform exposure of the first image sensor 1112 with the relatively short first exposure time "a", control to repeatedly perform exposure of the second image sensor 1122 with the relatively long second exposure time "b", and control to move the imaging unit 1100 intermittently from the scan start position to the scan end position.
  • In the scans with intermittent movement, the complexity of the synchronization control increases, and vibrations can occur due to the repeated sudden starts and stops of the imaging unit 1100; on the other hand, there is the advantage that each imaging can be performed with the imaging unit 1100 stopped, so that blurring of the image due to movement of the imaging unit 1100 does not occur.
  • The modes of the anterior segment scan of this aspect are not limited to these. Furthermore, it is possible to select the scan mode to be applied based on the advantages and disadvantages described above, or to combine two or more scan modes at least partially.
  • It is also possible to perform evaluation based on two or more images acquired with the optical system stationary (that is, images acquired at the same optical system position), without performing a scan in which the optical system is moved.
  • Such non-scan imaging and evaluation can be performed by an ophthalmological apparatus equipped with a moving mechanism, like the ophthalmological apparatus 1000 of this embodiment, or by an ophthalmic apparatus not equipped with a moving mechanism.
  • In an ophthalmologic apparatus equipped with a moving mechanism, it may be possible to select between a mode in which scan-based imaging and evaluation are performed and a mode in which non-scan imaging and evaluation are performed.
  • The evaluation processing unit 1300 is configured to generate evaluation information on suspended matter in the aqueous humor in the anterior segment (anterior chamber) of the eye to be examined, based on the first image generated by the first imaging performed by the first imaging unit 1110 under the first imaging condition and the second image generated by the second imaging performed by the second imaging unit 1120 under the second imaging condition.
  • the evaluation information may be any information regarding suspended matter in the aqueous humor.
  • the evaluation information may include information regarding any item indicating the state of the floating object in the aqueous humor, information regarding any item grasped based on the state of the floating object in the aqueous humor, and the like.
  • The evaluation information may include numerical information (values of predetermined parameters, etc.), information indicating judgment results (a grade indicating the pathology of a specific disease, a grade indicating the degree of progression of a specific disease, the presence or absence or probability of suspicion of a specific disease, etc.), information indicating the data used to derive a judgment result, information visualizing any of these, and the like.
  • the evaluation processing unit 1300 includes hardware elements such as a processor and a storage device.
  • a computer program such as an evaluation processing program is stored in the storage device.
  • the functions of the evaluation processing unit 1300 are realized by cooperation between software such as an evaluation processing program and hardware such as a processor.
  • the evaluation processing section 1300 includes a floating object image detection section 1310 and an evaluation information generation section 1320.
  • the functions of the floating object image detection unit 1310 are realized by cooperation between software such as a floating object image detection program and hardware such as a processor.
  • the functions of the evaluation information generation unit 1320 are realized by cooperation between software such as an evaluation information generation program and hardware such as a processor.
  • The floating object image detection unit 1310 is configured to detect a first floating object image in the first image and a second floating object image in the second image that correspond to the same floating object in the aqueous humor. More specifically, the floating object image detection unit 1310 is configured to detect an image of a floating object in the aqueous humor (first floating object image) from the first image generated by the first imaging performed by the first imaging unit 1110 under the first imaging condition, and to detect an image of the same floating object (second floating object image) from the second image generated by the second imaging performed by the second imaging unit 1120 under the second imaging condition.
  • The evaluation information generation unit 1320 is configured to generate evaluation information of a floating object in the aqueous humor based on a pair of a first floating object image and a second floating object image, detected by the floating object image detection unit 1310, that correspond to the same floating object in the aqueous humor.
  • the first imaging and the second imaging are performed in parallel.
  • In particular, the exposure period of the first image sensor 1112 for the first imaging (first exposure period) corresponds to a part of the exposure period of the second image sensor 1122 for the second imaging (second exposure period).
  • the configuration of the first imaging unit 1110, the configuration of the second imaging unit 1120, and the relative positional relationship between the first imaging unit 1110 and the second imaging unit 1120 are known. Therefore, it is possible to determine the correspondence between the pixel positions (coordinates) of the first image and the pixel positions (coordinates) of the second image.
  • the floating object image detection unit 1310 may be configured to detect the same image of the floating object in the aqueous humor from both images by using this correspondence (coordinate transformation).
  • In one example, the floating object image detection unit 1310 first performs image segmentation to identify an anterior chamber region corresponding to the anterior chamber in the first image, and then performs image segmentation to identify floating object images in the identified anterior chamber region.
  • image segmentations may be machine learning-based processes, non-machine learning-based processes, or a combination of machine learning-based processes and non-machine learning-based processes.
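  • As a minimal non-machine-learning illustration of this two-stage segmentation, the following sketch restricts attention to an anterior chamber mask and labels bright connected components as candidate floating object images; the threshold, the mask, and the minimum blob size are placeholder assumptions standing in for the real segmentation.

```python
# A minimal non-machine-learning example of the two-stage segmentation
# described above: restrict attention to an anterior chamber mask, then label
# bright connected components as candidate floating object images.
import numpy as np
from scipy import ndimage

def detect_floater_images(image: np.ndarray, chamber_mask: np.ndarray,
                          threshold: float = 50.0, min_pixels: int = 3):
    """Return a list of (centroid_row, centroid_col) for bright blobs
    inside the anterior chamber region."""
    candidates = (image > threshold) & chamber_mask
    labels, n = ndimage.label(candidates)               # connected components
    centroids = []
    for idx in range(1, n + 1):
        blob = labels == idx
        if blob.sum() >= min_pixels:                    # reject single-pixel noise
            centroids.append(ndimage.center_of_mass(blob))
    return centroids

# Toy data: a dark chamber with two bright specks.
img = np.zeros((100, 100)); img[30:32, 40:42] = 200; img[60, 70] = 180
mask = np.ones_like(img, dtype=bool)
print(detect_floater_images(img, mask, min_pixels=1))
```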
  • As a result of such segmentation, one or more floating object images are detected from the first image.
  • Here, it is assumed that multiple floating object images are detected from the first image.
  • The set of floating object images detected from the first image is called a first image set.
  • Similarly, a second image set, which is the set of floating object images detected from the second image, is obtained.
  • The floating object image detection unit 1310 determines a positional correspondence between the first image set and the second image set based on the known correspondence relationship (coordinate transformation) between the pixel positions (coordinates) of the first image and the pixel positions (coordinates) of the second image.
  • This positional correspondence is information representing a pairing relationship between an element of the first image set (a first floating object image) and an element of the second image set (a second floating object image).
  • In other words, this positional correspondence determines pairs each consisting of one first floating object image and one second floating object image.
  • That is, two images of a floating object existing at substantially the same position are determined.
  • In other words, two images of the same floating object are determined.
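  • One possible implementation of this pairing step is sketched below: each centroid detected in the first image is mapped into the coordinate system of the second image using the known transformation, and is paired with the nearest centroid detected there; the affine offset used as the transformation is a stand-in assumption for the device's calibrated correspondence.

```python
# An illustrative pairing step: map each centroid detected in the first image
# into the second image's coordinate system using the known transformation,
# then pair it with the nearest centroid detected there.
import numpy as np

def pair_floaters(first_centroids, second_centroids, transform,
                  max_dist: float = 5.0):
    """Return index pairs (i, j) linking first- and second-image floaters."""
    pairs = []
    second = np.asarray(second_centroids, dtype=float)
    for i, c in enumerate(first_centroids):
        mapped = transform(np.asarray(c, dtype=float))
        d = np.linalg.norm(second - mapped, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_dist:                 # accept only nearby matches
            pairs.append((i, j))
    return pairs

# Assumed identity-like calibration with a small fixed offset:
transform = lambda p: p + np.array([0.5, -0.3])
first = [(30.5, 40.5), (60.0, 70.0)]
second = [(60.4, 69.8), (31.1, 40.1)]
print(pair_floaters(first, second, transform))  # [(0, 1), (1, 0)]
```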
  • The evaluation information generation unit 1320 of this embodiment may be configured to generate evaluation information of the suspended matter in the aqueous humor corresponding to each pair of images of the same floating object (a pair of a first floating object image and a second floating object image) determined by the above-mentioned positional correspondence.
  • In another example, the floating object image detection unit 1310 first performs image segmentation to identify an anterior chamber region corresponding to the anterior chamber in the first image, and then performs image segmentation to identify floating object images in that region. As a result, one or more floating object images are detected from the first image. One of the floating object images detected from the first image is called a first floating object image.
  • the floating object image detection unit 1310 determines the coordinates of the first floating object image detected from the first image.
  • The coordinates of the first floating object image may be, for example, any of the following: the coordinates of the center of gravity of the first floating object image, the coordinates of the center of the first floating object image, the coordinates of one or more points on the outline of the first floating object image, or the coordinates of a representative position of a figure approximating the outline of the first floating object image (for example, the center of an approximate ellipse or the center of an approximate circle).
  • Next, based on the known correspondence relationship (coordinate transformation) between the pixel positions (coordinates) of the first image and the pixel positions (coordinates) of the second image, the floating object image detection unit 1310 detects, from the second image, a floating object image at a position corresponding to the coordinates of the first floating object image. Note that, since this aspect assumes that the second exposure time is longer than the first exposure time, the floating object image detection unit 1310 detects a floating object image that exists so as to include the position (coordinates) in the second image corresponding to the coordinates of the first floating object image, as the floating object image in the second image (second floating object image) corresponding to the same floating object as the first floating object image.
  • The evaluation information generation unit 1320 of this aspect may be configured to generate evaluation information on the suspended matter in the aqueous humor corresponding to the pair of images of the same floating object (the pair of the first floating object image and the second floating object image) determined in this way.
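  • A minimal sketch of the containment test just described, assuming the segmentation of the second image is available as a labeled mask (this representation and the names are assumptions, not data structures prescribed by the disclosure):

```python
import numpy as np

def find_containing_floater(labeled_mask, point):
    """Return the label of the second floating object image that contains
    `point` (x, y), or None if the point falls on background.

    labeled_mask: 2D integer array where 0 is background and each positive
    integer identifies one segmented floating object image. `point` is the
    second-image position corresponding to the coordinates of the first
    floating object image (after the coordinate transformation).
    """
    x, y = int(round(point[0])), int(round(point[1]))
    h, w = labeled_mask.shape
    if not (0 <= y < h and 0 <= x < w):
        return None
    label = labeled_mask[y, x]  # row index = y, column index = x
    return int(label) if label != 0 else None
```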
  • Conversely, the floating object image detection unit 1310 may first perform image segmentation to identify an anterior chamber region corresponding to the anterior chamber in the second image, and then perform image segmentation to identify floating object images in that region. As a result, one or more floating object images are detected from the second image. One of the floating object images detected from the second image is called a second floating object image.
  • the floating object image detection unit 1310 determines the coordinates of the second floating object image detected from the second image.
  • The coordinates of the second floating object image may be, for example, the coordinates of the center of gravity of the second floating object image, the coordinates of the center of the second floating object image, the coordinates of one or more points on the outline of the second floating object image, or the coordinates of a representative position of a figure approximating that outline.
  • Next, based on the known correspondence relationship (coordinate transformation) between pixel positions (coordinates) of the first image and pixel positions (coordinates) of the second image, the floating object image detection unit 1310 detects, from the first image, a floating object image at a position corresponding to the coordinates of the second floating object image; for example, the floating object image located at the position in the first image corresponding to the coordinates of the second floating object image is detected as the floating object image in the first image (first floating object image) corresponding to the same floating object as the second floating object image.
  • The evaluation information generation unit 1320 of this aspect may be configured to generate evaluation information on the suspended matter in the aqueous humor corresponding to the pair of images of the same floating object (the pair of the first floating object image and the second floating object image) determined in this way.
  • The evaluation processing unit 1300 of this example is configured to estimate the movement direction of the suspended matter in the aqueous humor based on the second image generated by the second imaging performed by the second imaging unit 1120 under the second imaging condition. The direction of movement of suspended matter in the aqueous humor is one example of evaluation information.
  • The image 2100 shown in FIG. 4A is a first image generated by the first imaging performed by the first imaging unit 1110 under the first imaging condition, and the image 2200 is a second image generated by the second imaging performed by the second imaging unit 1120 under the second imaging condition. The first imaging and the second imaging were performed in parallel.
  • In FIG. 4A, "Z" indicates the direction along the axis of the subject's eye (Z direction), "X" indicates the left-right direction for the subject (horizontal direction, X direction) among the directions orthogonal to the Z direction, and "Y" indicates the direction orthogonal to both the X direction and the Z direction (vertical direction, body axis direction, Y direction). The same applies to the other drawings (FIGS. 4B to 4D, etc.).
  • Reference numeral 2110 indicates an image of a floating object in the aqueous humor (first floating object image) detected from the first image 2100 by the floating object image detection unit 1310, and reference numeral 2210 indicates an image of the same floating object in the aqueous humor (second floating object image) detected from the second image 2200.
  • The first floating object image 2110 may be a figure approximating the image of the floating object in the aqueous humor (for example, an approximate circle or an approximate ellipse), and the second floating object image 2210 may likewise be such an approximate figure.
  • The evaluation information generation unit 1320 of this example generates evaluation information from the second image 2200. First, it calculates the maximum dimension of the second floating object image 2210 by analyzing the second floating object image 2210 detected from the second image 2200 by the floating object image detection unit 1310.
  • The maximum dimension is the maximum value of the distance between two points on the boundary (outline, outer edge, edge) of the second floating object image 2210; in other words, it is the maximum length of a line segment connecting two points on the boundary of the second floating object image 2210. If the second floating object image 2210 is circular, its maximum dimension is the diameter of the circle; if it is elliptical, its maximum dimension is the length of the major axis of the ellipse. Reference numeral 2220 in FIG. 4B indicates the major axis of the elliptical second floating object image 2210.
  • The evaluation information generation unit 1320 of this example sets the direction along the line segment that defines the maximum dimension of the second floating object image 2210 as the moving direction of the floating object in the aqueous humor. In other words, it sets the direction of the straight line passing through the two boundary points that define the maximum dimension of the second floating object image 2210 as the moving direction of the floating object in the aqueous humor.
  • Reference numeral 2230 in FIG. 4C indicates the moving direction of the floating object in the aqueous humor, which corresponds to the major axis 2220 of the elliptical second floating object image 2210 shown in FIG. 4B.
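  • One conventional way to realize this direction estimate is a principal-axis analysis of the blob's pixels; the sketch below is an assumption, not the disclosure's stated algorithm, and it returns the unit vector of the major axis of a binary mask of the second floating object image. A single long-exposure image fixes only the movement axis; the sense along that axis (initial region to final region) is resolved with the first image, as described next.

```python
import numpy as np

def movement_direction_from_blob(mask):
    """Estimate the movement direction as the major-axis direction of the
    second floating object image, given as a 2D boolean mask.

    Returns a unit vector (dx, dy). The sign is ambiguous: the long-exposure
    image alone constrains the movement axis, not its sense.
    """
    ys, xs = np.nonzero(mask)
    x0, y0 = xs.mean(), ys.mean()                  # centroid of the blob
    cov = np.cov(np.vstack((xs - x0, ys - y0)))    # 2x2 covariance of pixel positions
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]         # eigenvector of the largest eigenvalue
    return major / np.linalg.norm(major)
```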
  • Reference numeral 2110a in FIG. 4D indicates the image (region) obtained by mapping the first floating object image 2110 in the first image 2100 shown in FIG. 4A onto the second image 2200. Note that the exposure period of the first image sensor 1112 for the first imaging (first exposure period) is assumed here to correspond to the initial part of the exposure period of the second image sensor 1122 (second exposure period).
  • a region 2110a in the second image 2200 shown in FIG. 4D is a region (initial position region) that is estimated to be the initial position of the floating object in the aqueous humor during the second exposure period.
  • a region 2110b in the second image 2200 shown in FIG. 4D is a region (final position region) estimated to be the final position of the object suspended in the aqueous humor during the second exposure period.
  • the movement direction 2230 (FIG. 4C) determined by the evaluation information generation unit 1320 indicates the direction from the initial position area 2110a to the final position area 2110b.
  • the moving direction 2230 is information representing the moving state of the suspended matter in the aqueous humor during the second exposure period.
  • The evaluation processing unit 1300 of this example is configured to estimate the movement vector of the suspended matter in the aqueous humor based on the first image generated by the first imaging performed by the first imaging unit 1110 under the first imaging condition and the second image generated by the second imaging performed by the second imaging unit 1120 under the second imaging condition.
  • the movement vector includes the movement direction and movement amount (movement distance) of the floating object in the aqueous humor, and is an example of evaluation information.
  • FIG. 5A shows a first image 2300 and a second image 2400 that are processed in this example.
  • the floating object image detection unit 1310 of this example detects the first floating object image 2310 in the first image 2300 and the second floating object image 2410 in the second image 2400.
  • The evaluation information generation unit 1320 of this example estimates the movement vector of the floating object in the aqueous humor based on the first floating object image 2310 and the second floating object image 2410 detected by the floating object image detection unit 1310.
  • For example, the evaluation information generation unit 1320 may be configured to determine a feature point of the first floating object image 2310 and to estimate the movement vector of the suspended matter in the aqueous humor based on this feature point and the second floating object image 2410. The feature point of the first floating object image 2310 may be any representative point of the first floating object image 2310, for example, the center, the center of gravity, or one or more points on the boundary.
  • Reference numeral 2320 in FIG. 5B indicates one example of a feature point of the first floating object image 2310. The following explanation uses this exemplary feature point 2320.
  • For example, the evaluation information generation unit 1320 may specify the position in the second floating object image 2410 that corresponds to the feature point 2320 of the first floating object image 2310, and estimate the movement vector of the floating object in the aqueous humor based on that position.
  • Reference numeral 2320a in FIG. 5C indicates the position of an image obtained by mapping the feature point 2320 of the first floating object image 2310 to the second image 2400 by coordinate transformation between the first image 2300 and the second image 2400.
  • the evaluation information generation unit 1320 of this example estimates the movement vector of the floating object in the aqueous humor based on the position 2320a in the second floating object image 2410 that corresponds to the feature point 2320 of the first floating object image 2310.
  • More specifically, the evaluation information generation unit 1320 may be configured to estimate the movement vector of the suspended matter in the aqueous humor based on the position 2320a in the second floating object image 2410 corresponding to the feature point 2320 of the first floating object image 2310, together with the timing relationship between the exposure period of the first image sensor 1112 (the exposure period in the first photographing; the first exposure period) and the exposure period of the second image sensor 1122 (the exposure period in the second photographing; the second exposure period).
  • In this example, the first exposure period corresponds to a part (partial period) of the second exposure period. The timing relationship between the first exposure period and the second exposure period (exposure timing relationship) is information representing the (temporal) position of the first exposure period within the second exposure period, that is, the position of this partial period within the second exposure period. The position of the partial period corresponding to the first exposure period may be defined, for example, by any point in time within the partial period, for example, by the start, the middle, or the end of the first exposure period.
  • The evaluation information generation unit 1320 of this example can estimate the moving direction of the floating object in the aqueous humor based on the second floating object image 2410 in the second image 2400, for example, in the same manner as in the first example described above.
  • Furthermore, the evaluation information generation unit 1320 refers to the second floating object image 2410, to the position 2320a in the second floating object image 2410 corresponding to the feature point 2320 of the first floating object image 2310, and to the exposure timing relationship. Based on the temporal positional relationship between the first exposure period and the second exposure period indicated by the exposure timing relationship, the evaluation information generation unit 1320 determines which point in the movement of the suspended matter in the aqueous humor during the second exposure period (which point on the movement route) the position 2320a corresponding to the feature point 2320 represents.
  • For example, when the first exposure period corresponds to the beginning of the second exposure period, the position 2320a corresponding to the feature point 2320 of the first floating object image 2310 represents the start point of the movement of the suspended matter in the aqueous humor during the second exposure period (the starting point of the movement route). When the first exposure period corresponds to the end of the second exposure period, the position 2320a represents the end point of that movement (the end point of the movement route). When the first exposure period corresponds to the middle of the second exposure period, the position 2320a represents the center point of that movement (the center point of the movement route). More generally, when the first exposure period corresponds to the p percent point between the start point (0 percent point) and the end point (100 percent point) of the second exposure period (0 ≤ p ≤ 100), the position 2320a represents the p percent point between the start and the end of the movement of the suspended matter in the aqueous humor during the second exposure period (the p percent point from the starting point to the ending point of the movement route). The time points (points) determined in this manner will be referred to as corresponding time points (corresponding points).
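  • The exposure timing relationship can thus be reduced to the scalar p. A minimal sketch, assuming each exposure period is given by start and end timestamps in seconds and using the midpoint of the first exposure as its representative time point (the start or end could be used instead, as noted above):

```python
def corresponding_point_fraction(first_start, first_end, second_start, second_end):
    """Return p in [0, 100]: where the first exposure period sits within the
    second exposure period, using the midpoint of the first exposure as its
    representative time point.
    """
    reference = 0.5 * (first_start + first_end)   # midpoint of the first exposure
    span = second_end - second_start              # length of the second exposure
    p = 100.0 * (reference - second_start) / span
    return min(max(p, 0.0), 100.0)                # clamp to [0, 100]
```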
  • Using the corresponding time point determined as described above, the spread (range, boundary) of the second floating object image 2410, and the moving direction of the floating object in the aqueous humor, the evaluation information generation unit 1320 can estimate the amount of movement of the suspended matter in the aqueous humor.
  • Consider the case where the position 2320a corresponding to the feature point 2320 of the first floating object image 2310 represents the starting point of the movement route of the floating object in the aqueous humor during the second exposure period. In this case, the evaluation information generation unit 1320 can estimate the end point of that movement (the end point of the movement route) based on the moving direction of the floating object in the aqueous humor and the spread (range, boundary) of the second floating object image 2410.
  • the evaluation information generation unit 1320 finds two intersections 2410a and 2410b between a straight line 2420 along the moving direction of the floating object in the aqueous humor and the boundary of the second floating object image 2410.
  • the first intersection 2410a is located on the side closer to the starting point (position 2320a) of the movement route of the suspended matter in the aqueous humor.
  • the evaluation information generation unit 1320 calculates the distance D between the position 2320a and the first intersection 2410a.
  • the evaluation information generation unit 1320 determines a position 2320b displaced by a distance D from the second intersection 2410b along the straight line 2420 in the direction of the position 2320a (first intersection 2410a). This position 2320b is estimated to be the end point of the movement route of the suspended matter in the aqueous humor during the second exposure period.
  • the evaluation information generation unit 1320 calculates the distance between the position 2320a and the position 2320b. This distance is estimated to be the moving distance of the suspended matter in the aqueous humor during the second exposure period.
  • Reference numeral 2430 shown in FIG. 5E is an estimated movement vector of the floating object in the aqueous humor during the second exposure period. This movement vector is information representing the direction and amount of movement of the objects suspended in the aqueous humor during the second exposure period, and is information representing the state of movement of the objects suspended within the aqueous humor during the second exposure period.
  • Although the case where the first exposure period corresponds to the beginning of the second exposure period has been described, the movement vector of the suspended matter in the aqueous humor during the second exposure period can be obtained by a similar method when the first exposure period corresponds to the end of the second exposure period, when it corresponds to the middle of the second exposure period, and, in general, when it corresponds to an arbitrary partial period of the second exposure period.
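  • The geometric construction of FIGS. 5D and 5E can be written compactly. The sketch below is an illustration under the start-point case (position 2320a is the start of the movement route); its inputs are the two intersections 2410a and 2410b of the straight line 2420 with the boundary of the second floating object image, and the names are ours, not the disclosure's:

```python
import numpy as np

def movement_vector(start_pos, near_intersection, far_intersection):
    """Estimate the movement vector of a floater during the second exposure.

    start_pos:         position 2320a (start of the movement route)
    near_intersection: intersection 2410a of the direction line with the blob
                       boundary, on the side closer to start_pos
    far_intersection:  the opposite intersection 2410b
    """
    start = np.asarray(start_pos, dtype=float)
    a = np.asarray(near_intersection, dtype=float)
    b = np.asarray(far_intersection, dtype=float)
    d = np.linalg.norm(start - a)        # distance D between 2320a and 2410a
    u = (a - b) / np.linalg.norm(a - b)  # unit vector from 2410b toward 2410a
    end = b + d * u                      # position 2320b: end point of the route
    return end - start                   # vector 2430 (direction and distance)
```

For the end-point or p-percent cases, the same arithmetic applies after re-anchoring the movement route with the corresponding point computed above.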
  • FIG. 6 shows one configuration example of an ophthalmological apparatus that can be adopted to realize the first example and second example described above.
  • The ophthalmological apparatus 1000A of this example is obtained by adding an illumination system 1130 to the ophthalmological apparatus 1000 of FIG. 1.
  • the illumination system 1130 is included in the imaging unit 1100 and is configured to project a slit light onto the anterior segment of the subject's eye.
  • In this aspect, the first imaging unit 1110 and the second imaging unit 1120 share the single illumination system 1130; however, as described above, an illumination system for the first imaging unit 1110 and an illumination system for the second imaging unit 1120 may be provided separately.
  • the longitudinal direction of the beam shape (cross-sectional shape of the beam) of the slit light is aligned with a predetermined direction, for example, aligned with the Y direction.
  • the moving mechanism 1400 is configured to integrally move the first imaging section 1110, the second imaging section 1120, and the illumination system 1130.
  • the moving direction is, for example, a direction perpendicular to the longitudinal direction of the beam shape of the slit light, and is, for example, the X direction. This makes it possible to scan a three-dimensional region of the anterior segment of the subject's eye and collect a series of Scheimpflug images. Regarding such three-dimensional scanning, please refer to Patent Document 3 (Japanese Patent Laid-Open No. 2019-213733).
  • By such scanning, a series of Scheimpflug images collected by the first imaging unit 1110 (a plurality of first images) and a series of Scheimpflug images collected by the second imaging unit 1120 (a plurality of second images) are obtained. In other words, a plurality of pairs of a first image and a second image, that is, a plurality of image pairs in which different parts of the anterior segment are respectively depicted, are obtained.
  • Each Scheimpflug image (first image, second image) acquired by the ophthalmological apparatus 1000A of this aspect is an image representing a cross section of the anterior segment expressed in a two-dimensional coordinate system defined by a coordinate axis along the longitudinal direction of the beam shape of the slit light and a coordinate axis along the projection direction of the slit light onto the subject's eye.
  • That is, the ophthalmological apparatus 1000A of this embodiment can obtain, for example, an image pair such as the pair of the first image 2100 and the second image 2200 shown in FIG. 4A, or the pair of the first image 2300 and the second image 2400 shown in FIG. 5A.
  • Furthermore, by moving the first imaging unit 1110, the second imaging unit 1120, and the illumination system 1130, the ophthalmological apparatus 1000A of this embodiment can collect a plurality of image pairs, each corresponding to a different cross section of the anterior segment.
  • the evaluation processing unit 1300 can generate evaluation information for floating objects in the aqueous humor based on each of a plurality of image pairs collected by three-dimensional scanning. This makes it possible to generate evaluation information for floating objects in the aqueous humor in a plurality of cross sections (for example, YZ cross sections) of the anterior segment. In other words, it is possible to generate evaluation information of floating substances in the aqueous humor in a three-dimensional region of the eye to be examined.
  • The evaluation processing unit 1300 can generate a three-dimensional distribution of a predetermined evaluation value regarding the floating objects in the aqueous humor based on the plurality of pieces of evaluation information on the plurality of cross sections of the anterior segment generated from the plurality of image pairs.
  • This evaluation value may be any type of evaluation value described in this disclosure.
  • Visual information (e.g., a color map) may be created from the generated three-dimensional distribution, and the generated three-dimensional distribution can be analyzed.
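  • Assembling the per-cross-section evaluation values into a volume for color-map rendering or statistics might look like the following sketch (the slice layout, with the scan position supplying the X index, is an assumption about how such data could be organized):

```python
import numpy as np

def build_evaluation_volume(slice_maps):
    """Stack per-cross-section (YZ) maps of an evaluation value into a 3D array.

    slice_maps: list of 2D arrays, one per scan position (X coordinate), each
    holding the evaluation value over the corresponding Y-Z cross section.
    Returns an array of shape (num_x, num_y, num_z); NaN marks missing data.
    """
    vol = np.full((len(slice_maps),) + slice_maps[0].shape, np.nan)
    for ix, m in enumerate(slice_maps):
        vol[ix] = m
    return vol  # e.g., render a color map per slice, or compute statistics
```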
  • The ophthalmological apparatus 1000A of this aspect can perform, in parallel, the first imaging under the first imaging condition by the first imaging unit 1110 and the second imaging under the second imaging condition by the second imaging unit 1120 while the illumination system 1130 projects the slit light onto the anterior segment of the eye.
  • Furthermore, using the evaluation processing unit 1300, the ophthalmological apparatus 1000A of this embodiment can generate evaluation information on suspended matter in the aqueous humor based on the pair of the first image and the second image generated by the first imaging and the second imaging performed together with the projection of the slit light.
  • The floating object image detection unit 1310 of the ophthalmological apparatus 1000A of this aspect can detect a first floating object image from the first image and a second floating object image from the second image. Furthermore, based on the first floating object image detected from the first image and the second floating object image detected from the second image, the evaluation information generation unit 1320 can estimate the amount of movement of the suspended matter in the aqueous humor in a first direction (e.g., the Z direction), which is the projection direction of the slit light applied to the anterior segment by the illumination system 1130, and in a second direction (e.g., the Y direction), which is the longitudinal direction of the beam shape of this slit light. Thereby, the movement state of the suspended matter in the aqueous humor in a cross section parallel to the plane defined by the first direction and the second direction (for example, the YZ plane), that is, the movement vector of the suspended matter in that cross section, can be detected.
  • The process of estimating the amount of movement in the first direction and the process of estimating the amount of movement in the second direction can be performed, for example, by the method described above, but the estimation method is not limited to this. For example, the amount of movement in the first direction and the amount of movement in the second direction can be estimated based on feature points of the floating object images. Such a feature point may be any representative point of the floating object image, such as the center, the center of gravity, or one or more points on the boundary.
  • the evaluation information generation unit 1320 of the ophthalmological apparatus 1000A of this aspect operates in a third direction (for example, the X direction) orthogonal to both the first direction (for example, the Z direction) and the second direction (for example, the Y direction). It may be configured to estimate the amount of movement of suspended matter in the aqueous humor.
  • FIG. 7A shows a first image 2500 and a second image 2600 that are processed in this example. Here, the first direction is the Z direction, the second direction is the Y direction, and the third direction is the X direction.
  • Floating object image detection section 1310 detects first floating object image 2510 in first image 2500 and second floating object image 2610 in second image 2600.
  • Using any of the methods described in the present disclosure, the evaluation information generation unit 1320 can estimate the movement vector of the floating object in the aqueous humor in the YZ plane based on the first floating object image 2510 detected from the first image 2500 and the second floating object image 2610 detected from the second image 2600; that is, it can estimate the movement amount Δy of the floating object in the Y direction and the movement amount Δz of the floating object in the Z direction.
  • Furthermore, the evaluation information generation unit 1320 can estimate the movement amount Δx of the floating object in the aqueous humor in the X direction based on the second floating object image 2610 detected from the second image 2600. For example, the evaluation information generation unit 1320 may be configured to estimate the movement amount Δx based on the degree of blurring of the second floating object image 2610. More specifically, it can estimate the movement amount Δx based on the dimensions of the second floating object image 2610 detected from the second image 2600 generated by the second imaging unit 1120 and the numerical aperture (NA) of the objective lens of the second imaging unit 1120.
  • The dimension of the second floating object image 2610 may be, for example, the width w of the second floating object image 2610. The width w is, for example, the dimension of the second floating object image in the direction perpendicular to the major axis 2220 of the second floating object image 2210 shown in FIG. 4B, that is, the length of the minor axis of the second floating object image 2210 (see FIG. 7B). Even when the second floating object image 2610 is not elliptical, its dimension (width) can be determined in a similar manner.
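  • The width w can be obtained from the same moment analysis used for the direction estimate; the sketch below computes it as the extent along the minor axis. How w and the numerical aperture NA combine into Δx is not spelled out here, so the second function is explicitly a hypothetical placeholder (the in-focus width `w0` and the scaling by 2·NA are assumptions, not the disclosure's formula):

```python
import numpy as np

def blob_width(mask):
    """Width w of a floating object image: its extent perpendicular to the
    major axis (for an ellipse, the minor-axis length)."""
    ys, xs = np.nonzero(mask)
    pts = np.vstack((xs - xs.mean(), ys - ys.mean()))
    eigvals, eigvecs = np.linalg.eigh(np.cov(pts))
    minor = eigvecs[:, np.argmin(eigvals)]  # unit vector along the minor axis
    proj = minor @ pts                      # pixel projections onto that axis
    return proj.max() - proj.min()          # extent = width w

def estimate_dx(w, w0, na):
    """Hypothetical mapping from blur width to X movement: the excess of the
    observed width w over the in-focus width w0 is attributed to defocus,
    which grows with the objective's numerical aperture NA. This formula is
    an assumption, not the disclosure's."""
    return max(w - w0, 0.0) / (2.0 * na)
```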
  • The evaluation information generation unit 1320 can also calculate the moving speed of floating objects in the aqueous humor. For example, the evaluation information generation unit 1320 can calculate the moving speed V2 of a floating object based on the first exposure time "a" of the first image sensor 1112 of the first imaging unit 1110, the second exposure time "b" of the second image sensor 1122 of the second imaging unit 1120, the movement amount Δx of the floating object in the X direction, the movement amount Δy in the Y direction, and the movement amount Δz in the Z direction. This moving speed V2 is an estimate of the three-dimensional moving speed (velocity) of the suspended matter in the aqueous humor during the period from the end of the exposure period of the first imaging unit 1110 (first exposure period) to the end of the exposure period of the second imaging unit 1120 (second exposure period).
  • In this way, the evaluation information generation unit 1320 can estimate the moving speed of suspended objects in the aqueous humor based on the first floating object image and the second floating object image detected from the first image and the second image, respectively, by the floating object image detection unit 1310.
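  • Under the stated timing, the movement amounts accrue between the end of the first exposure and the end of the second; if the two exposures start together, that interval equals b − a. With that assumption (the disclosure does not spell out the formula here), a speed estimate could be:

```python
import math

def moving_speed(dx, dy, dz, a, b):
    """Estimate the 3D moving speed V2 of a floater in the aqueous humor.

    dx, dy, dz: estimated movement amounts in the X, Y, and Z directions
    a, b:       first and second exposure times (b > a), assumed to start at
                the same instant, so the elapsed time is b - a.
    This is a sketch under the stated assumption, not the patent's formula.
    """
    elapsed = b - a
    return math.sqrt(dx * dx + dy * dy + dz * dz) / elapsed
```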
  • Next, the evaluation processing unit 1300 according to some exemplary aspects will be described. As shown in FIG. 8, the evaluation processing unit 1300 of this embodiment includes a type identification unit 1330 in addition to a floating object image detection unit 1310 and an evaluation information generation unit 1320 similar to those of the embodiment described above.
  • The type identification unit 1330 is configured to identify the type of suspended matter in the aqueous humor based on the images acquired by the imaging unit 1100 (the first image generated by the first imaging unit 1110 under the first imaging condition and the second image generated by the second imaging unit 1120 under the second imaging condition).
  • Types of suspended matter in the aqueous humor include inflammatory cells, white blood cells (macrophages, lymphocytes, etc.), and blood proteins.
  • the function of the type identification unit 1330 is realized by cooperation between software such as a type identification program and hardware such as a processor.
  • When a plurality of floating object images are detected, the type identification unit 1330 can identify, for each of the detected floating object images, the type of the floating object in the aqueous humor that corresponds to that floating object image. Further, the type identification unit 1330 can classify the plurality of floating object images based on the result of the type identification.
  • The type identification unit 1330 may be configured to identify the type of the suspended matter in the aqueous humor depicted as a floating object image in the image, based on the characteristics of the floating object image detected by the floating object image detection unit 1310 from the image acquired by the imaging unit 1100. Examples of the characteristics of the floating object image used as criteria for type identification include the brightness, size, shape, and distribution state (number, density, position, etc.) of the floating object image, as well as its movement state (moving direction, amount of movement, movement speed, etc.).
  • The type identification unit 1330 may also be configured to identify the type of floater in the aqueous humor depicted in the image of the anterior segment of the subject (patient) based on the subject's medical data. The medical data referred to for type identification in this example include disease names (diagnosis names, suspected disease names, etc.), test data, doctor's findings, and the like. The medical data are obtained, for example, from patient data (electronic medical records, reports, medical documents, etc.) stored in a database.
  • The type identification unit 1330 may also be configured to identify the type of the object suspended in the aqueous humor based on an image obtained by imaging using polarized light. Photographing using polarized light is performed, for example, by projecting illumination light with a first polarization direction onto the subject's eye and extracting and detecting a component with a second polarization direction from the returning light from the subject's eye. Thereby, a component whose polarization direction has changed due to reflection at the subject's eye can be detected; as a result, a diffuse reflection image of the anterior segment of the eye is obtained.
  • the evaluation processing unit 1300 of this embodiment may be configured to identify the type of floating matter in the aqueous humor based on the intensity ratio between the specular reflection image and the diffuse reflection image. First, the evaluation processing unit 1300 of this embodiment detects a floating object image from a specular reflection image and also detects a floating object image from a diffuse reflection image using a floating object image detection unit 1310.
  • Next, the type identification unit 1330 performs registration between the specular reflection image and the diffuse reflection image and, based on the result of this registration, establishes a correspondence between the positions (coordinates) of the one or more floating object images identified from the specular reflection image and the positions (coordinates) of the one or more floating object images identified from the diffuse reflection image. As a result, the floating object image in the specular reflection image and the floating object image in the diffuse reflection image that correspond to the same floating object in the aqueous humor are identified and associated with each other. The type identification unit 1330 then obtains, for one floating object in the aqueous humor, the intensity value of the corresponding floating object image in the specular reflection image and the intensity value of the corresponding floating object image in the diffuse reflection image. Intensity values are determined based on pixel values.
  • The intensity value may be any statistic calculated from pixel values in the floating object image, for example, an average, variance, standard deviation, maximum value, minimum value, mode, or median.
  • The type identification unit 1330 compares the intensity value of the floating object image in the specular reflection image (specular reflection floating object image) corresponding to one floating object in the aqueous humor with the intensity value of the floating object image in the diffuse reflection image (diffuse reflection floating object image) corresponding to the same floating object. For example, the type identification unit 1330 calculates the ratio T1/T2 between the intensity T1 of the specular reflection floating object image and the intensity T2 of the diffuse reflection floating object image, and compares this ratio with a predetermined threshold TH. For example, the type identification unit 1330 may be configured to estimate that the floating object is a macrophage if the absolute value abs(T1/T2) of the ratio T1/T2 is greater than or equal to the threshold TH, and to estimate that the floating object is a lymphocyte if abs(T1/T2) is below the threshold TH.
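  • The decision rule just described is a one-liner; a sketch (the value of the threshold TH is device- and protocol-dependent, and the macrophage/lymphocyte split is the example given above):

```python
def classify_floater(t1, t2, th):
    """Classify a floater from the intensity ratio of its specular-reflection
    image (t1) to its diffuse-reflection image (t2), per the rule above.
    """
    ratio = t1 / t2
    return "macrophage" if abs(ratio) >= th else "lymphocyte"
```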
  • the evaluation information generating section 1320 of the evaluation processing section 1300 of this embodiment can generate evaluation information according to the type of floating matter in the aqueous humor specified by the type specifying section 1330.
  • the evaluation information generation unit 1320 may be configured to generate evaluation information for each type of floating matter in the aqueous humor.
  • The type identification unit 1330 may be configured to refer to color information of the image in addition to, or instead of, the intensity values described above. If the first image acquired by the first imaging unit 1110 and/or the second image acquired by the second imaging unit 1120 is a color image, the type identification unit 1330 may, for example, identify the type of floating matter in the aqueous humor using at least one of the three color component images (R component image, G component image, B component image), may identify the type using information generated based on the three color component images (for example, a luminance signal value (Y)), or may convert the color image into a monochrome image to identify the type of floating matter.
  • the type identification method executed by the type identification unit 1330 is not limited to the above example.
  • the type identifying unit 1330 can identify the type of floating matter in the aqueous humor based on evaluation of reflection wavelength characteristics using optical coherence tomography (OCT).
  • The type identification method in this example is disclosed, for example, in the following document: Ruobing Qian, Ryan P. McNabb, Kevin C. Zhou, Hazem M. Mousa, Daniel R. Saban, Victor L. Perez, Anthony N. Kuo, and Joseph A. Izatt, "In vivo quantitative analysis of anterior chamber white blood cell mixture composition using spectroscopic optical coherence tomography", Vol. 12, No.
  • the ophthalmologic apparatus may include, for example, a known OCT scanner in addition to the configuration shown in FIG. 1 and the configuration shown in FIG. 8.
  • the ophthalmologic apparatus may be configured to apply the type identification method of this example to an OCT image acquired by a separately provided OCT scanner.
  • In the first operation example, the ophthalmological apparatus 1000 first applies first imaging under the first imaging condition and second imaging under the second imaging condition to the anterior segment of the eye to be examined (S1).
  • the first imaging is performed by the first imaging unit 1110, and the second imaging is performed by the second imaging unit 1120.
  • Parallel execution of the first imaging and the second imaging is realized by synchronous control executed by the control unit 1200.
  • Next, based on the first image and the second image generated by the first imaging and the second imaging, respectively, performed in parallel in step S1, the ophthalmological apparatus 1000 generates evaluation information regarding floating objects present in the aqueous humor (in the anterior chamber) of the anterior segment of the subject's eye (S2).
  • the evaluation information generated in step S2 may include, for example, the moving direction, moving amount, moving vector, moving speed, moving acceleration, etc. of the suspended matter in the aqueous humor.
  • the evaluation information generated in step S2 has various uses.
  • the evaluation information is stored in a storage device (not shown), and/or provided to generate visual information displayed on a display device (not shown), and/or provided to analysis processing by a computer (not shown).
  • In the second operation example, the ophthalmological apparatus 1000 first applies, in parallel, first imaging under a first imaging condition and second imaging under a second imaging condition to the anterior segment of the eye to be examined, in the same manner as step S1 of the first operation example (S11).
  • Next, the ophthalmological apparatus 1000 detects a first floating object image corresponding to a floating object in the aqueous humor from the first image generated by the first imaging in step S11 (S12), and detects a second floating object image corresponding to the same floating object in the aqueous humor from the second image generated by the second imaging (S13).
  • The step of detecting the first floating object image and the step of detecting the second floating object image may be performed in either order, or in parallel.
  • Next, based on the first floating object image detected from the first image in step S12 and the second floating object image detected from the second image in step S13, the ophthalmological apparatus 1000 generates evaluation information for the floating object in the aqueous humor (S14). The generated evaluation information is provided for various purposes.
  • a third operation example will be described with reference to FIG. 11.
  • the ophthalmological apparatus 1000 first applies a scan to a three-dimensional region of the anterior segment of the eye to be examined (S21).
  • The three-dimensional scan in step S21 is realized by combining the parallel execution of first imaging under the first imaging condition and second imaging under the second imaging condition with movement of the imaging position.
  • the parallel execution of the first photographing under the first photographing condition and the second photographing under the second photographing condition is realized in the same manner as step S1 of the first operation example.
  • the movement of the photographing position is realized by the movement mechanism 1400 moving the photographing section 1100 (first optical system 1111 and second optical system 1121) under the control of the control section 1200.
  • By this three-dimensional scan, a plurality of pairs of the first image and the second image generated by the first and second imaging performed in parallel are collected from the three-dimensional region of the anterior segment of the subject's eye. That is, the three-dimensional scan in step S21 generates a plurality of image pairs (pairs of a first image and a second image) that respectively depict the states of a plurality of cross sections in the three-dimensional region of the anterior segment of the subject's eye.
  • Next, for each imaging position of the three-dimensional scan in step S21, the ophthalmological apparatus 1000 generates evaluation information on the floating matter in the aqueous humor present in the cross section corresponding to that imaging position, based on the pair of the first image and the second image acquired at that position (S22). As a result, a plurality of pieces of evaluation information respectively corresponding to the plurality of imaging positions are obtained.
  • Next, based on these pieces of evaluation information, the ophthalmological apparatus 1000 generates a three-dimensional distribution of a predetermined evaluation value regarding the floating matter in the aqueous humor present in the three-dimensional region of the anterior segment to which the three-dimensional scan in step S21 was applied (S23).
  • the three-dimensional distribution generated in step S23 can be provided for various purposes.
  • the three-dimensional distribution is stored in a storage device (not shown), and/or provided to generate visual information displayed on a display device (not shown), and/or provided to analysis processing by a computer (not shown).
  • the plural pieces of evaluation information generated in step S22 can also be provided for various purposes.
  • In another operation example, the ophthalmological apparatus 1000 first applies, in parallel, first imaging under a first imaging condition and second imaging under a second imaging condition to the anterior segment of the eye to be examined, in the same manner as step S1 of the first operation example (S31).
  • Next, the ophthalmological apparatus 1000 identifies the type of floating matter in the aqueous humor depicted in the first image and the second image generated by the first imaging and the second imaging in step S31 (S32).
  • the ophthalmological apparatus 1000 generates evaluation information of the floating matter in the aqueous humor according to the type of the floating matter in the aqueous humor identified in step S32 (S33).
  • the evaluation information generated in step S33 according to the type of suspended matter in the aqueous humor is provided for various purposes.
  • the evaluation information generated in this operation example is stored according to the type of floating matter in the aqueous humor, and/or provided to generate visual information according to the type of floating matter in the aqueous humor, and /Or provided for analysis processing according to the type of suspended matter in the aqueous humor.
  • FIG. 13 shows one example of a more specific configuration of the ophthalmologic apparatus according to the embodiment.
  • FIG. 13 is a top view.
  • The direction along the axis of the eye E to be examined is the Z direction; among the directions perpendicular to this, the left-right direction for the examinee is the X direction, and the direction perpendicular to both the X direction and the Z direction (vertical direction, body axis direction) is the Y direction.
  • The ophthalmological apparatus of this example is a slit lamp microscope system 1 having a configuration similar to that disclosed in Patent Document 3 (Japanese Patent Laid-Open No. 2019-213733), and includes an illumination optical system 2, a photographing optical system 3, a moving image photographing optical system 4, an optical path coupling element 5, a moving mechanism 6, a control unit 7, a data processing unit 8, a communication unit 9, and a user interface 10.
  • the cornea of the eye E to be examined is indicated by the symbol C, and the crystalline lens is indicated by the symbol CL.
  • the anterior chamber corresponds to the area between the cornea C and the crystalline lens CL (the area between the cornea C and the iris).
  • the illumination optical system 2 projects a slit light onto the anterior segment of the eye E to be examined.
  • Reference numeral 2a indicates the optical axis (illumination optical axis) of the illumination optical system 2.
  • the photographing optical system 3 photographs the anterior segment of the eye onto which the slit light from the illumination optical system 2 is projected.
  • Reference numeral 3a indicates the optical axis (photographing optical axis) of the photographing optical system 3.
  • The optical system 3A of the photographing optical system 3 guides light from the anterior segment of the subject's eye E onto which the slit light is projected to the image sensor 3B.
  • the image sensor 3B receives the light guided by the optical system 3A on its imaging surface.
  • the image sensor 3B includes an area sensor (CCD area sensor, CMOS area sensor, etc.) having a two-dimensional imaging area.
  • the photographing optical system 3 includes two photographing optical systems (a first photographing section and a second photographing section). The two photographing optical systems will be described later with reference to FIG. 14.
  • The illumination optical system 2 and the photographing optical system 3 function as a Scheimpflug camera and are designed so that the object plane along the illumination optical axis 2a, the principal plane of the optical system 3A, and the imaging surface of the image sensor 3B satisfy the Scheimpflug condition. More specifically, the YZ plane passing through the illumination optical axis 2a (which includes the object plane), the principal plane of the optical system 3A, and the imaging surface of the image sensor 3B intersect on the same straight line. Thereby, the illumination optical system 2 and the photographing optical system 3 can perform photographing with at least the range from the posterior surface of the cornea C to the anterior surface of the crystalline lens CL (the anterior chamber) in focus.
  • the video photographing optical system 4 is a video camera, and takes a video of the anterior segment of the eye E in parallel with the photographing of the eye by the illumination optical system 2 and the photographing optical system 3.
  • the optical path coupling element 5 couples the optical path of the illumination optical system 2 (illumination optical path) and the optical path of the video imaging optical system 4 (video imaging optical path).
  • A specific example of the optical system including the illumination optical system 2, the photographing optical system 3, the moving image photographing optical system 4, and the optical path coupling element 5 is shown in FIG. 14. The optical system shown in FIG. 14 includes an illumination optical system 20, which is an example of the illumination optical system 2; a left photographing optical system 30L and a right photographing optical system 30R, which are examples of the photographing optical system 3; a moving image photographing optical system 40, which is an example of the moving image photographing optical system 4; and a beam splitter 47, which is an example of the optical path coupling element 5.
  • Reference numeral 20a indicates the optical axis of the illumination optical system 20 (illumination optical axis), reference numeral 30La indicates the optical axis of the left photographing optical system 30L (left photographing optical axis), and reference numeral 30Ra indicates the optical axis of the right photographing optical system 30R (right photographing optical axis).
  • The angle θL indicates the angle between the illumination optical axis 20a and the left photographing optical axis 30La, and the angle θR indicates the angle between the illumination optical axis 20a and the right photographing optical axis 30Ra.
  • the moving mechanism 6 moves the illumination optical system 20, the left photographing optical system 30L, and the right photographing optical system 30R in the direction shown by the arrow 49 (X direction).
  • the illumination light source 21 of the illumination optical system 20 outputs illumination light (for example, visible light), and the positive lens 22 refracts the illumination light.
  • the slit forming section 23 forms a slit to allow part of the illumination light to pass through.
  • the generated slit light is refracted by the objective lens groups 24 and 25, reflected by the beam splitter 47, and projected onto the anterior segment of the eye E to be examined.
  • The reflector 31L and the imaging lens 32L of the left photographing optical system 30L guide light from the anterior segment of the eye onto which the slit light is projected by the illumination optical system 20 (light traveling in the direction of the left photographing optical system 30L) to the image sensor 33L, which receives the guided light at an imaging surface 34L.
  • the left photographing optical system 30L repeatedly performs photographing in parallel with the movement of the illumination optical system 20, left photographing optical system 30L, and right photographing optical system 30R by the moving mechanism 6. As a result, a plurality of anterior segment images (a series of Scheimpflug images) are obtained.
  • the object surface along the illumination optical axis 20a, the optical system including the reflector 31L and the imaging lens 32L, and the imaging surface 34L satisfy the Scheimpflug condition.
  • the right photographing optical system 30R also has a similar configuration and function.
  • Scheimpflug image collection by the left photographing optical system 30L and Scheimpflug image collection by the right photographing optical system 30R are performed in parallel with each other.
  • For example, the left photographing optical system 30L is used for the first photographing under the first photographing condition, and the right photographing optical system 30R is used for the second photographing under the second photographing condition. Conversely, the left photographing optical system 30L may be used for the second photographing under the second photographing condition, and the right photographing optical system 30R for the first photographing under the first photographing condition.
  • the control unit 7 can synchronize repeated shooting by the left shooting optical system 30L and repeated shooting by the right shooting optical system 30R. Thereby, a correspondence relationship between the series of Scheimpflug images obtained by the left photographing optical system 30L and the series of Scheimpflug images obtained by the right photographing optical system 30R is obtained.
  • The process of determining the correspondence between the plurality of anterior eye segment images obtained by the left photographing optical system 30L and the plurality of anterior eye segment images obtained by the right photographing optical system 30R may be executed by the control unit 7 or the data processing unit 8.
  • the video photographing optical system 40 photographs a video of the anterior segment of the subject's eye E from a fixed position in parallel with the photographing by the left photographing optical system 30L and the photographing by the right photographing optical system 30R.
  • The light that has passed through the beam splitter 47 is reflected by a reflector 48 and enters the moving image photographing optical system 40.
  • the light incident on the moving image photographing optical system 40 is refracted by an objective lens 41 and then imaged by an imaging lens 42 on an imaging surface of an image sensor 43 (area sensor).
  • the moving image photographing optical system 40 is used for monitoring the movement of the eye E to be examined, alignment, tracking, processing of collected Scheimpflug images, and the like.
  • the moving mechanism 6 moves the illumination optical system 2 and the photographing optical system 3 integrally in the X direction.
  • the control unit 7 controls each part of the slit lamp microscope system 1.
  • By executing the control of the illumination optical system 2, the photographing optical system 3, and the moving mechanism 6 in parallel with the control of the video photographing optical system 4, the control unit 7 can perform, in parallel, a three-dimensional scan of the anterior segment of the eye (collection of a series of Scheimpflug images) and video recording of the anterior segment (collection of a series of time-series images).
  • Furthermore, by controlling the illumination optical system 2, the photographing optical system 3, and the moving mechanism 6 in synchronization with the control of the video photographing optical system 4, the control unit 7 can synchronize the three-dimensional scan of the anterior segment and the video recording of the anterior segment with each other.
  • In addition, the control unit 7 can synchronize the repeated photographing by the left photographing optical system 30L (collection of a Scheimpflug image group) and the repeated photographing by the right photographing optical system 30R with each other.
  • Thereby, the paired operation of the first photographing under the first photographing condition and the second photographing under the second photographing condition can be executed in combination with (in parallel and in synchronization with) the movement of the photographing position.
  • the control unit 7 includes a processor, a storage device, and the like.
  • the storage device stores computer programs such as various control programs.
  • the functions of the control unit 7 are realized by cooperation between software such as a control program and hardware such as a processor.
  • The control unit 7 controls the illumination optical system 2, the photographing optical system 3, and the moving mechanism 6 in order to scan the three-dimensional region of the anterior segment of the eye E to be examined with the slit light (see Patent Document 3: Japanese Unexamined Patent Publication No. 2019-213733).
  • the control unit 7 has the function of the control unit 1200 of the ophthalmological apparatus 1000.
  • the functions of the control unit 7 are not limited to those described here.
  • the data processing unit 8 executes various data processing.
  • the data processing unit 8 includes a processor, a storage device, and the like.
  • the storage device stores computer programs such as various data processing programs.
  • the functions of the data processing section 8 are realized by cooperation between software such as a data processing program and hardware such as a processor.
  • the data processing section 8 has the function of the evaluation processing section 1300 of the ophthalmological apparatus 1000. The function of the data processing section 8 is not limited to this.
  • the communication unit 9 performs data communication between the slit lamp microscope system 1 and other devices.
  • the user interface 10 includes any user interface devices such as a display device and an operation device.
  • the slit lamp microscope system 1 shown in FIGS. 13 and 14 is a non-limiting example, and the configuration for implementing the ophthalmologic apparatus 1000 (1000A) is not limited to the slit lamp microscope system 1.
  • a first aspect of the ophthalmological apparatus includes: a first imaging unit that includes a first optical system satisfying the Scheimpflug condition and applies first imaging under a first imaging condition to the anterior segment of the eye to be examined; and a second imaging unit that includes a second optical system satisfying the Scheimpflug condition and applies, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment.
  • the ophthalmological apparatus further includes an evaluation processing unit that generates evaluation information of a floating object in the aqueous humor based on a first image generated in the first imaging and a second image generated in the second imaging.
  • the second aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the first aspect.
  • the first optical system includes a first image sensor
  • the second optical system includes a second image sensor
  • the first imaging condition includes a first exposure time that is the exposure time of the first image sensor.
  • the second imaging condition includes a second exposure time that is the exposure time of the second image sensor, and the second exposure time is longer than the first exposure time.
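Illustrative sketch (Python): a minimal encoding of the exposure-time relationship stated in this aspect, namely that the second exposure time must exceed the first. The field names are illustrative, not taken from the patent.

from dataclasses import dataclass

@dataclass(frozen=True)
class ExposurePair:
    t1_ms: float  # first exposure time: short, freezes floater motion
    t2_ms: float  # second exposure time: long, records a motion streak

    def __post_init__(self):
        if self.t2_ms <= self.t1_ms:
            raise ValueError("second exposure time must exceed the first")

pair = ExposurePair(t1_ms=1.0, t2_ms=10.0)  # valid: t2 > t1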
  • the third aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the second aspect.
  • the evaluation processing unit includes a floating object image detection unit that detects, in the first image and the second image respectively, a first floating object image and a second floating object image corresponding to the same floating object in the aqueous humor, and an evaluation information generation unit that generates the evaluation information of the floating object in the aqueous humor based on the first floating object image and the second floating object image.
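Illustrative sketch (Python): one way the floating object image detection could be realized, under simple assumptions (bright blobs on a dark background, a fixed illustrative threshold); a real detector would also need background subtraction and artifact rejection.

import numpy as np
from scipy import ndimage

def detect_floaters(image, threshold=0.5):
    """Return (row, col) centroids of connected bright regions."""
    mask = image > threshold
    labels, n = ndimage.label(mask)
    return ndimage.center_of_mass(mask, labels, range(1, n + 1))

img = np.zeros((64, 64))
img[10:13, 20:23] = 1.0        # synthetic floating-object image
print(detect_floaters(img))    # -> [(11.0, 21.0)]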
  • the fourth aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the third aspect.
  • the evaluation information generation unit estimates the moving direction of the floating object in the aqueous humor based on the second floating object image.
  • the fifth aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the third or fourth aspect.
  • the evaluation information generation unit estimates a movement vector of the floating object in the aqueous humor based on the first floating object image and the second floating object image.
  • the sixth aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the fifth aspect.
  • the evaluation information generation unit obtains feature points of the first floating object image, and estimates the movement vector based on the feature points of the first floating object image and the second floating object image.
  • the seventh aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the sixth aspect.
  • the evaluation information generation unit specifies a position in the second floating object image that corresponds to the feature point of the first floating object image, and estimates the movement vector based on the position corresponding to the feature point.
  • the eighth aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the seventh aspect.
  • the evaluation information generation unit estimates the movement vector based on the position in the second floating object image corresponding to the feature point of the first floating object image, and on the timing relationship between the exposure period of the first image sensor and the exposure period of the second image sensor.
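Illustrative sketch (Python): movement-vector estimation under the geometry these aspects suggest: the short-exposure image localizes the floater at a feature point, and a corresponding position is found in the long-exposure streak image. How the exposure-period timing enters is a modeling choice; here the displacement is simply divided by an assumed effective motion interval.

import numpy as np

def movement_vector(p_feature, p_corresponding, motion_interval_s):
    """In-plane movement vector (pixels per second)."""
    d = np.asarray(p_corresponding, float) - np.asarray(p_feature, float)
    return d / motion_interval_s

# Feature point from the short exposure, corresponding streak end from
# the long exposure; 9 ms assumed between the two exposure midpoints.
v = movement_vector((11.0, 21.0), (14.0, 25.0), motion_interval_s=0.009)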
  • the ninth aspect of the ophthalmological apparatus has the following non-limiting features in addition to the non-limiting features of any of the third to eighth aspects.
  • the tenth aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the ninth aspect.
  • the evaluation information generation unit estimates the amount of movement of the floating object in the aqueous humor in a third direction perpendicular to both the first direction and the second direction.
  • the eleventh aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the tenth aspect.
  • the second optical system includes an objective lens, and the evaluation information generation unit estimates the amount of movement in the third direction based on the dimensions of the second floating object image and the numerical aperture of the objective lens.
  • the twelfth aspect of the ophthalmologic apparatus includes the following non-limiting features in addition to the non-limiting features of the eleventh aspect.
  • the evaluation information generation unit calculates the width of the second floating object image as the dimension of the second floating object image.
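Illustrative sketch (Python): one plausible geometric-optics model for the third-direction (axial) estimate in these aspects. A point displaced by dz from the focal plane blurs to a spot of diameter roughly 2·NA·dz in object space, so the excess width of the second floating object image over its in-focus width yields dz. Both the model and the in-focus width are assumptions for illustration.

def axial_displacement_um(streak_width_um, in_focus_width_um, numerical_aperture):
    # Excess blur width attributed to defocus accumulated during the
    # long exposure; clamp at zero for widths below the in-focus width.
    excess = max(streak_width_um - in_focus_width_um, 0.0)
    return excess / (2.0 * numerical_aperture)

dz = axial_displacement_um(8.0, 3.0, 0.1)  # -> 25.0 (micrometers)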
  • the thirteenth aspect of the ophthalmological apparatus has the following non-limiting features in addition to the non-limiting features of any of the tenth to twelfth aspects.
  • the evaluation information generation unit estimates the moving speed of the floating object in the aqueous humor based on the second exposure time of the second image sensor, the amount of movement in the first direction, the amount of movement in the second direction, and the amount of movement in the third direction.
  • the fourteenth aspect of the ophthalmological apparatus has the following non-limiting features in addition to the non-limiting features of any of the tenth to thirteenth aspects.
  • the evaluation information generation unit estimates the moving speed of the floating object in the aqueous humor based on the first exposure time of the first image sensor, the second exposure time of the second image sensor, the amount of movement in the first direction, the amount of movement in the second direction, and the amount of movement in the third direction.
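Illustrative sketch (Python): a speed estimate combining the three movement components with the exposure times. Whether the denominator should be the second exposure time alone (thirteenth aspect) or a quantity derived from both exposure times (fourteenth aspect) depends on the timing model; the difference t2 - t1 is used here purely as one example.

import math

def floater_speed_um_s(dx_um, dy_um, dz_um, t1_s, t2_s):
    displacement = math.sqrt(dx_um**2 + dy_um**2 + dz_um**2)
    return displacement / (t2_s - t1_s)

print(floater_speed_um_s(4.0, 3.0, 25.0, t1_s=0.001, t2_s=0.010))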
  • the fifteenth aspect of the ophthalmological apparatus has the following non-limiting features in addition to the non-limiting features of any one of the third to fourteenth aspects.
  • the evaluation information generation unit estimates the moving speed of the floating object in the aqueous humor based on the first floating object image and the second floating object image.
  • the sixteenth aspect of the ophthalmological apparatus has the following non-limiting features in addition to the non-limiting features of any of the third to fifteenth aspects.
  • the floating object image detection unit detects, from the first image, a first image set including images of a plurality of floating objects in the aqueous humor, detects, from the second image, a second image set including images of the plurality of floating objects, and determines a positional correspondence relationship between the first image set and the second image set; the evaluation information generation unit generates, based on a pair consisting of one element of the first image set and one element of the second image set, evaluation information of the floating object in the aqueous humor corresponding to that pair.
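Illustrative sketch (Python): a positional correspondence between the two detected image sets, pairing each centroid from the first image with the nearest centroid in the second image. A practical implementation would add a distance gate and enforce one-to-one assignment.

import numpy as np

def match_sets(set1, set2):
    a = np.asarray(set1, float)
    b = np.asarray(set2, float)
    return [(i, int(np.argmin(np.linalg.norm(b - p, axis=1))))
            for i, p in enumerate(a)]

print(match_sets([(11, 21), (40, 40)], [(41, 43), (13, 24)]))
# -> [(0, 1), (1, 0)]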
  • the seventeenth aspect of the ophthalmological apparatus has the following non-limiting features in addition to the non-limiting features of any one of the third to sixteenth aspects.
  • the floating object image detection unit detects a floating object image from one of the first image and the second image, determines the coordinates of that floating object image in the one image, and detects, in the other of the first image and the second image, a floating object image at a position corresponding to those coordinates; the evaluation information generation unit generates, based on the pair consisting of the floating object image detected from the one image and the floating object image detected from the other image, evaluation information of the floating object in the aqueous humor corresponding to that pair.
  • the eighteenth aspect of the ophthalmological apparatus has the following non-limiting features in addition to the non-limiting features of any of the second to seventeenth aspects.
  • the ophthalmological apparatus further includes a control unit that causes the first imaging unit and the second imaging unit to generate a plurality of pairs of the first image and the second image.
  • the nineteenth aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the eighteenth aspect.
  • the evaluation processing section generates evaluation information based on each of the plurality of pairs.
  • the twentieth aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the nineteenth aspect.
  • the evaluation processing unit generates a three-dimensional distribution of predetermined evaluation values based on the plurality of pieces of evaluation information generated from the plurality of pairs, as the sketch below illustrates.
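Illustrative sketch (Python): assembling a three-dimensional distribution of an evaluation value (for example, estimated speed) from (x, y, z, value) samples collected over the scan. Grid shape and voxel size are illustrative choices.

import numpy as np

def evaluation_volume(samples, shape=(32, 32, 32), voxel_um=100.0):
    total = np.zeros(shape)
    count = np.zeros(shape)
    for x, y, z, value in samples:
        i, j, k = int(x // voxel_um), int(y // voxel_um), int(z // voxel_um)
        if 0 <= i < shape[0] and 0 <= j < shape[1] and 0 <= k < shape[2]:
            total[i, j, k] += value
            count[i, j, k] += 1
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(count > 0, total / count, np.nan)

vol = evaluation_volume([(150, 150, 150, 2.0), (160, 140, 150, 4.0)])
print(vol[1, 1, 1])  # -> 3.0 (mean evaluation value in that voxel)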
  • the twenty-first aspect of the ophthalmological apparatus has the following non-limiting features in addition to the non-limiting features of any of the second to twentieth aspects.
  • the evaluation processing section includes a type specifying section that specifies the type of floating matter in the aqueous humor depicted in the first image and the second image.
  • the twenty-second aspect of the ophthalmological apparatus includes the following non-limiting features in addition to the non-limiting features of the twenty-first aspect.
  • the evaluation processing section generates evaluation information according to the type of the floating object in the aqueous humor specified by the type specifying section.
  • an ophthalmological apparatus having any of these non-limiting features makes it possible to improve the quality of the process of evaluating suspended matter in the aqueous humor based on images acquired by ophthalmic imaging.
  • while embodiments have been described above as ophthalmological apparatuses, embodiments according to the present disclosure are not limited to ophthalmological apparatuses.
  • Embodiments other than ophthalmological devices include a method for controlling an ophthalmological device, a program, a recording medium, and the like. Similar to the embodiments of the ophthalmological device, these embodiments also allow for improved quality of the evaluation of suspended matter in the aqueous humor.
  • a method according to one embodiment is a method for controlling an ophthalmological apparatus.
  • This ophthalmological apparatus includes a first imaging section, a second imaging section, and a processor.
  • the first imaging section includes a first optical system that satisfies Scheimpflug conditions.
  • the second photographing section includes a second optical system that satisfies Scheimpflug conditions.
  • the method according to this embodiment is configured to cause a processor included in the ophthalmologic apparatus to function as a control unit and an evaluation processing unit.
  • according to the method of the present embodiment, the processor functioning as the control unit controls the first imaging unit to apply first imaging under a first imaging condition to the anterior segment of the subject's eye, and controls the second imaging unit to apply, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment.
  • the processor functioning as the evaluation processing unit generates evaluation information of the floating object in the aqueous humor based on the first image generated in the first imaging and the second image generated in the second imaging.
  • a program according to one embodiment is a program for operating an ophthalmological apparatus.
  • This ophthalmological apparatus includes a first imaging section, a second imaging section, and a processor.
  • the first imaging section includes a first optical system that satisfies Scheimpflug conditions.
  • the second photographing section includes a second optical system that satisfies Scheimpflug conditions.
  • the program according to this embodiment is configured to cause a processor included in an ophthalmologic apparatus to function as a control unit and an evaluation processing unit.
  • according to the program of the present embodiment, the processor functioning as the control unit controls the first imaging unit to apply first imaging under a first imaging condition to the anterior segment of the subject's eye, and controls the second imaging unit to apply, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment.
  • the processor functioning as the evaluation processing unit generates evaluation information of the floating object in the aqueous humor based on the first image generated in the first imaging and the second image generated in the second imaging.
  • the recording medium according to one embodiment is a computer-readable non-transitory recording medium on which a program for operating an ophthalmological apparatus is recorded.
  • This ophthalmological apparatus includes a first imaging section, a second imaging section, and a processor.
  • the first imaging section includes a first optical system that satisfies Scheimpflug conditions.
  • the second photographing section includes a second optical system that satisfies Scheimpflug conditions.
  • the program recorded on the recording medium is configured to cause the processor included in the ophthalmologic apparatus to function as a control unit and an evaluation processing unit.
  • according to the program recorded on the recording medium, the processor functioning as the control unit executes control of the first imaging unit to apply first imaging under a first imaging condition to the anterior segment of the subject's eye, and control of the second imaging unit to apply, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment.
  • according to the program recorded on the recording medium of the present embodiment, the processor functioning as the evaluation processing unit generates evaluation information of the floating object in the aqueous humor based on the first image generated in the first imaging and the second image generated in the second imaging.
  • the computer-readable non-transitory recording medium according to the present embodiment may be any type of recording medium, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • matters arbitrarily selected from the various matters described as optional aspects of the ophthalmological apparatus according to the embodiments can be combined with the embodiment of the method for controlling an ophthalmological apparatus, the embodiment of the program, the embodiment of the recording medium, and the like.
  • likewise, any of the matters described in the present disclosure can be combined with the embodiment of the method for controlling an ophthalmological apparatus, the embodiment of the program, the embodiment of the recording medium, and the like.


Abstract

The ophthalmologic device according to one embodiment includes a first imaging unit, a second imaging unit and an evaluation processing unit. The first imaging unit includes a first optical system that meets Scheimpflug conditions, and is configured such that first imaging under a first imaging condition is applied to an anterior eye segment of an eye to be examined. The second imaging unit includes a second optical system that meets Scheimpflug conditions, and is configured such that second imaging under a second imaging condition that is different from the first imaging condition is applied to the anterior eye segment in parallel with the first imaging. The evaluation processing unit is configured so as to generate evaluation information about a floating substance in an aqueous humor on the basis of a first image generated in the first imaging by the first imaging unit and a second image generated in the second imaging by the second imaging unit.

Description

Patent Document 1: JP 2016-159073 A
Patent Document 2: JP 2016-179004 A
Patent Document 3: JP 2019-213733 A
Patent Document 4: WO 2018/003906
Patent Document 5: JP 2022-520832 A
 One exemplary aspect of the embodiments is an ophthalmological apparatus including a first imaging unit, a second imaging unit, and an evaluation processing unit. The first imaging unit includes a first optical system that satisfies the Scheimpflug condition, and is configured to apply first imaging under a first imaging condition to the anterior segment of the subject's eye. The second imaging unit includes a second optical system that satisfies the Scheimpflug condition, and is configured to apply, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment. The evaluation processing unit is configured to generate evaluation information of a floating object in the aqueous humor based on a first image generated by the first imaging of the first imaging unit and a second image generated by the second imaging of the second imaging unit.
 Another exemplary aspect of the embodiments is a method of controlling an ophthalmological apparatus. This ophthalmological apparatus includes a first imaging unit, a second imaging unit, and a processor. The first imaging unit includes a first optical system that satisfies the Scheimpflug condition. The second imaging unit includes a second optical system that satisfies the Scheimpflug condition. The method of this aspect causes the processor of the ophthalmological apparatus to function as a control unit and an evaluation processing unit. The control unit operates to execute control of the first imaging unit for applying first imaging under a first imaging condition to the anterior segment of the subject's eye, and control of the second imaging unit for applying, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment. The evaluation processing unit operates to generate evaluation information of a floating object in the aqueous humor based on a first image generated by the first imaging of the first imaging unit and a second image generated by the second imaging of the second imaging unit.
 Yet another exemplary aspect of the embodiments is a program for operating an ophthalmological apparatus. This ophthalmological apparatus includes a first imaging unit, a second imaging unit, and a processor. The first imaging unit includes a first optical system that satisfies the Scheimpflug condition. The second imaging unit includes a second optical system that satisfies the Scheimpflug condition. The program of this aspect causes the processor of the ophthalmological apparatus to function as a control unit and an evaluation processing unit, which operate in the same manner as in the method described above.
 Yet another exemplary aspect of the embodiments is a computer-readable non-transitory recording medium on which a program for operating an ophthalmological apparatus is recorded. This ophthalmological apparatus includes a first imaging unit, a second imaging unit, and a processor, configured as described above. The program recorded on the recording medium of this aspect causes the processor of the ophthalmological apparatus to function as a control unit and an evaluation processing unit, which operate in the same manner as in the method described above.
 According to the embodiments, it is possible to improve the quality of evaluation of suspended matter in the aqueous humor.
[Brief description of the drawings] The drawings include schematic diagrams illustrating configurations of ophthalmological apparatuses according to exemplary aspects of the embodiments, timing charts and flowcharts representing processing executed by those apparatuses, and schematic diagrams for explaining examples of processing executed by those apparatuses.
 Some non-limiting exemplary aspects of the embodiments will be described in detail with reference to the drawings.
 Any known technology can be combined with any aspect of the present disclosure. For example, any matter disclosed in the documents cited herein can be combined with any aspect of the present disclosure. Furthermore, any known technology in the technical field related to the present disclosure can be combined with any aspect of the present disclosure.
 The entire contents disclosed in Patent Document 3 (JP 2019-213733 A) are incorporated into the present disclosure by reference. In addition, any technical matter disclosed by the applicant of the present application regarding technology related to the present disclosure (matters disclosed in patent applications, papers, etc.) can be combined with any aspect of the present disclosure.
 Any two or more of the various aspects of the present disclosure can be combined, at least partially.
 At least some of the functions of the elements described in the present disclosure are implemented using circuitry or processing circuitry. The circuitry or processing circuitry includes any of the following, configured and/or programmed to perform at least some of the disclosed functions: a general-purpose processor, a special-purpose processor, an integrated circuit, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an ASIC (Application Specific Integrated Circuit), a programmable logic device (for example, an SPLD (Simple Programmable Logic Device), a CPLD (Complex Programmable Logic Device), or an FPGA (Field Programmable Gate Array)), a conventional circuit configuration, and any combination thereof. A processor is regarded as processing circuitry or circuitry that includes transistors and/or other circuit components. In the present disclosure, "circuitry", "unit", "means", and similar terms refer to hardware that performs at least some of the disclosed functions, or hardware that is programmed to perform at least some of the disclosed functions. The hardware may be the hardware disclosed herein, or known hardware that is programmed and/or configured to perform at least some of the described functions. Where the hardware is a processor, which may be regarded as a type of circuitry, "circuitry", "unit", "means", and similar terms refer to a combination of hardware and software, the software being used to configure the hardware and/or the processor.
<Overview of the embodiments>
 One purpose of the embodiments according to the present disclosure is to improve the quality of evaluation of suspended matter in the aqueous humor. To this end, the embodiments according to the present disclosure perform a plurality of parallel (simultaneous) imaging operations, using a plurality of optical systems each satisfying the Scheimpflug condition, under mutually different imaging conditions, and evaluate floating objects in the aqueous humor based on the plurality of images (image group) generated by these parallel imaging operations.
 Note that the object of evaluation according to the embodiments of the present disclosure is not limited to floating objects present in the aqueous humor. Some exemplary aspects may be configured to evaluate other floating objects present within the eye (for example, floaters in the vitreous), floating objects present on the ocular surface (for example, in the tear fluid), or floating objects present in the ocular adnexa (orbit, eyelids, conjunctiva, lacrimal apparatus, extraocular muscles). Accordingly, one purpose of the embodiments of the present disclosure is to improve the quality of evaluation of floating objects in any of the eye, the ocular surface, and the ocular adnexa; to this end, the embodiments perform a plurality of parallel (simultaneous) imaging operations using a plurality of optical systems satisfying the Scheimpflug condition under mutually different imaging conditions, and evaluate the floating objects based on the plurality of images (image group) generated by these parallel imaging operations. The present disclosure describes in detail embodiments that deal with floating objects in the aqueous humor, but it is clear that similar operations can be performed with a similar configuration when other floating objects are the target. The present disclosure does not intentionally exclude embodiments that deal with floating objects other than those in the aqueous humor (floating objects that are, or can become, evaluation targets in ophthalmology).
 Some aspects collect image groups from a three-dimensional region of the anterior segment by repeatedly performing the plurality of parallel imaging operations while moving the plurality of optical systems relative to the subject's eye, and evaluate floating objects in the aqueous humor within this three-dimensional region based on the collected image groups. For three-dimensional scanning of the anterior segment using an optical system that satisfies the Scheimpflug condition (that is, image collection from a three-dimensional region of the anterior segment), see Patent Document 3 (JP 2019-213733 A). The embodiments according to the present disclosure utilize this three-dimensional scan, and are further characterized by performing a plurality of parallel imaging operations using a plurality of optical systems under mutually different imaging conditions, and by evaluating floating objects in the aqueous humor based on the plurality of images collected by these parallel imaging operations.
 As is widely known, the Scheimpflug condition is a condition relating to an optical system that projects illumination light onto an object (illumination optical system) and an optical system that photographs the object (photographing optical system); it stipulates that the illumination optical system and the photographing optical system are configured such that the object plane, the lens principal plane, and the film plane intersect in a common straight line. In an optical system that satisfies the Scheimpflug condition, the object plane is not parallel to the lens principal plane, so a Scheimpflug camera can focus simultaneously over a wide depth range, from near objects to far objects. For example, in anterior segment imaging, it is possible to photograph with at least the wide depth range from the anterior surface of the cornea to the posterior surface of the crystalline lens in focus, and thus to depict the entirety of the main observation targets of the anterior segment in high definition.
 In the embodiments according to the present disclosure, the number of optical systems that satisfy the Scheimpflug condition may be arbitrary. As described above, the Scheimpflug condition is a condition regarding a pair consisting of an illumination optical system and a photographing optical system, so the "number of optical systems" in the embodiments refers to the number of such pairs, and more specifically, to the number of photographing optical systems.
 In some exemplary aspects, the number of illumination optical systems (α) and the number of photographing optical systems (β) are equal (α = β), and the number of pairs of an illumination optical system and a photographing optical system is also α (= β).
 In some other exemplary aspects, the number of illumination optical systems (α) and the number of photographing optical systems (β) are different (α ≠ β). When the number of illumination optical systems is smaller than the number of photographing optical systems (α < β), at least one illumination optical system is shared by two or more photographing optical systems. Conversely, when the number of illumination optical systems is greater than the number of photographing optical systems (α > β), at least one photographing optical system performs photographing using two or more illumination optical systems.
 In the exemplary aspects described later, the present disclosure details an ophthalmological apparatus that satisfies the following conditions, although embodiments are not limited to this: the ophthalmological apparatus includes at least two optical systems (at least a first optical system and a second optical system); the first optical system includes a first illumination optical system and a first photographing optical system configured to satisfy the Scheimpflug condition; the second optical system includes a second illumination optical system and a second photographing optical system configured to satisfy the Scheimpflug condition; the first illumination optical system and the second illumination optical system are common (that is, the first illumination optical system and the second illumination optical system are the same); and the first photographing optical system and the second photographing optical system are different. From the description of this aspect, those skilled in the art will also be able to understand configurations of ophthalmological apparatuses in which the number of illumination optical systems and/or the number of photographing optical systems differs from this aspect.
 The plurality of parallel imaging operations using the plurality of optical systems satisfying the Scheimpflug condition, executed by the embodiments according to the present disclosure, are performed under mutually different imaging conditions. The imaging conditions may be any conditions that can be employed in ophthalmic imaging.
 Note that the imaging conditions in the exemplary aspects detailed in the present disclosure include at least the exposure time condition described later, but embodiments are not limited to this. For example, the imaging conditions in some exemplary aspects may include a condition that produces the same or a similar effect as the exposure time condition. The imaging conditions of some exemplary aspects may also include a condition that, in combination with the exposure time condition, contributes to improving the quality of the embodiments according to the present disclosure (improving imaging quality, improving imaging efficiency, reducing the burden that imaging places on the subject and/or the subject's eye, and so on).
 Some examples of imaging conditions that can be employed in the embodiments according to the present disclosure will now be described. Types of imaging conditions include, for example, conditions related to the optical systems (optical system conditions), conditions related to movement of the optical systems (movement conditions), and other conditions.
 Types of optical system conditions include conditions related to the illumination optical system for projecting illumination light onto the subject's eye (illumination conditions) and conditions related to the photographing optical system for photographing the subject's eye using an image sensor (exposure conditions).
 Types of illumination conditions include conditions regarding the intensity of the illumination light (illumination intensity conditions) and conditions regarding the projection time of the illumination light (illumination time conditions). The projection time of the illumination light is the length of the period (projection period) during which the illumination light is projected onto the subject's eye; in the present disclosure, it may be referred to as the illumination time.
 Examples of illumination intensity conditions include conditions related to the light source that emits the illumination light (for example, the height of the light source control pulse) and conditions related to a neutral density filter provided in the illumination optical system (for example, conditions related to selection from two or more neutral density filters, or conditions related to control of a variable neutral density filter).
 Examples of illumination time conditions include conditions related to the light source that emits the illumination light (for example, the width of the light source control pulse) and conditions related to a shutter provided in the illumination optical system (for example, the shutter opening time).
 One type of exposure condition is the exposure time condition. Examples of exposure time conditions include conditions related to the image sensor (for example, the exposure time of the image sensor) and conditions related to a shutter provided in the photographing optical system (for example, the shutter opening time). The exposure time is the length of the period (exposure period) during which the image sensor can receive light.
 前述したように、また、後で詳述するように、本開示に係る幾つかの例示的な態様における撮影条件は露光時間条件を含んでいる。本開示に係る実施形態の眼科装置は、シャインプルーフの条件を満足する光学系(照明光学系と撮影光学系とのペア)を複数個含んでいる。以下、シャインプルーフの条件を満足する光学系が2つ設けられている場合について様々な事項を説明するが、シャインプルーフの条件を満足する光学系が3つ以上設けられている場合においてもそれらの事項が成立することは、当業者であれば理解することができるであろう。 As mentioned above and as will be described in detail later, the imaging conditions in some exemplary embodiments of the present disclosure include exposure time conditions. An ophthalmologic apparatus according to an embodiment of the present disclosure includes a plurality of optical systems (a pair of an illumination optical system and a photographing optical system) that satisfy Scheimpflug conditions. Below, various matters will be explained in the case where two optical systems that satisfy the Scheimpflug condition are provided, but they will also be explained when there are three or more optical systems that satisfy the Scheimpflug condition. Those skilled in the art will understand that the following matters hold true.
 それぞれがシャインプルーフの条件を満足する2つの光学系を第1の光学系及び第2の光学系と呼ぶ。つまり、第1の光学系はシャインプルーフの条件を満足し、且つ、第2の光学系はシャインプルーフの条件を満足する。第1の光学系を含む撮影ユニットを第1の撮影部と呼び、第2の光学系を含む撮影ユニットを第2の撮影部と呼ぶ。第1の撮影部は、第1の光学系に加えて、例えば、電気的又は電子的な要素(回路、接続線、コネクタなど)、第1の光学系に含まれる要素(光学素子など)を駆動する機構、金具などを含んでいる。第2の撮影部も同様である。 Two optical systems that each satisfy Scheimpflug conditions are referred to as a first optical system and a second optical system. That is, the first optical system satisfies the Scheimpflug condition, and the second optical system satisfies the Scheimpflug condition. The photographing unit including the first optical system is referred to as a first photographing section, and the photographing unit including the second optical system is referred to as a second photographing section. In addition to the first optical system, the first imaging unit includes, for example, electrical or electronic elements (circuits, connection lines, connectors, etc.) and elements included in the first optical system (optical elements, etc.). It includes the driving mechanism, metal fittings, etc. The same applies to the second photographing section.
 第1の撮影部(第1の光学系)と第2の撮影部(第2の光学系)とは、互いに異なる撮影条件の下に、時間的に並行して撮影を行う。すなわち、幾つかの例示的な態様は、第1の撮影条件での撮影(第1の撮影)を第1の撮影部によって実行しながら、第1の撮影条件と異なる第2の撮影条件での撮影(第2の撮影)を第2の撮影部によって実行する。 The first imaging unit (first optical system) and the second imaging unit (second optical system) perform imaging in parallel in time under mutually different imaging conditions. That is, in some exemplary aspects, while the first imaging section performs imaging under a first imaging condition (first imaging), the imaging unit performs imaging under a second imaging condition different from the first imaging condition. Photographing (second photographing) is performed by the second photographing section.
 幾つかの例示的な態様は、第1の露光時間条件での第1の撮影を第1の撮影部によって実行しながら、第1の露光時間条件と異なる第2の露光時間条件での第2の撮影を第2の撮影部によって実行する。例えば、第1の光学系及び第2の光学系は別々の撮像素子を含んでおり、第1の光学系の撮像素子を第1の撮像素子と呼び、第2の光学系の撮像素子を第2の撮像素子と呼ぶ。別の例において、第1の光学系は撮像素子の第1の領域を第1の撮像素子として使用し、第2の光学系は同じ撮像素子の第2の領域(第1の領域と異なる領域)を第2の撮像素子として使用するように構成されていてよい。 Some example aspects provide a method for performing a first image capture at a first exposure time condition by a first image capturing unit, while performing a second image capture at a second exposure time condition different from the first exposure time condition. Photographing is performed by the second photographing section. For example, the first optical system and the second optical system include separate image sensors, and the image sensor of the first optical system is called the first image sensor, and the image sensor of the second optical system is called the first image sensor. This is called the second image sensor. In another example, the first optical system uses a first region of the image sensor as the first image sensor, and the second optical system uses a second region of the same image sensor (a region different from the first region). ) may be configured to be used as the second image sensor.
 第1の露光時間条件は、第1の撮影部における第1の撮像素子の露光時間(第1の露光時間と呼ぶ)を含む。また、第2の露光時間条件は、第2の撮影部における第2の撮像素子の露光時間(第2の露光時間と呼ぶ)を含む。第2の撮影部のための第2の露光時間の値は、第1の撮影部のための第1の露光時間の値よりも大きく設定される。なお、3つ以上の撮影部が設けられている場合、3つ以上の撮影部にそれぞれ対応する3つ以上の露光時間は互いに異なる。 The first exposure time condition includes the exposure time of the first image sensor in the first imaging unit (referred to as the first exposure time). Further, the second exposure time condition includes the exposure time of the second image sensor in the second imaging section (referred to as the second exposure time). The value of the second exposure time for the second imaging section is set larger than the value of the first exposure time for the first imaging section. Note that when three or more imaging units are provided, the three or more exposure times corresponding to the three or more imaging units are different from each other.
 第1の露光時間の値及び第2の露光時間の値は、任意のパラメータに基づいて決定されてよく、例えば、検出目標とされる房水内浮遊物の特性(寸法、移動速度)、照明光の強度(光量)、照明光の投射時間などのパラメータに基づき決定されてよい。第1の露光時間の値及び/又は第2の露光時間の値は、一定でもよいし、可変でもよい。第1の露光時間の値及び/又は第2の露光時間について、複数の種類の房水内浮遊物にそれぞれ対応する複数の値が設けられていてもよい。被検眼の前眼部を準備的に撮影して得られた準備的画像を解析することによって第1の露光時間の値及び/又は第2の露光時間の値を決定してもよい。 The value of the first exposure time and the value of the second exposure time may be determined based on arbitrary parameters, such as the characteristics (size, movement speed) of the suspended matter in the aqueous humor that is the detection target, the illumination It may be determined based on parameters such as light intensity (light amount) and illumination light projection time. The value of the first exposure time and/or the value of the second exposure time may be constant or variable. Regarding the value of the first exposure time and/or the second exposure time, a plurality of values may be provided, each corresponding to a plurality of types of suspended matter in the aqueous humor. The value of the first exposure time and/or the value of the second exposure time may be determined by analyzing a preliminary image obtained by photographing the anterior segment of the eye to be examined.
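Illustrative sketch (Python): one way the exposure-time values might be derived from the parameters mentioned above: the first (short) exposure from a motion-blur budget given an expected maximum floater speed, the second as a fixed multiple of it. The blur budget and the factor of 10 are illustrative assumptions, not values from the patent.

def choose_exposures_s(max_speed_um_s, pixel_size_um,
                       blur_budget_px=0.5, streak_factor=10.0):
    # Keep motion blur in the first image under `blur_budget_px` pixels.
    t1 = blur_budget_px * pixel_size_um / max_speed_um_s
    # Make the second exposure long enough to record a usable streak.
    t2 = streak_factor * t1
    return t1, t2

t1, t2 = choose_exposures_s(max_speed_um_s=500.0, pixel_size_um=2.0)
# -> t1 = 0.002 s, t2 = 0.02 s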
 The movement conditions are imaging conditions used when executing a three-dimensional scan of the anterior segment with optical systems that satisfy the Scheimpflug condition. Types of movement conditions include the movement range of the optical systems (scan range, scan start position, scan end position), movement distance (scan distance), movement speed (scan speed), movement timing (scan start timing, scan end timing, etc.), and movement mode (continuous movement, intermittent movement, etc.).
 Conditions other than the optical system conditions and the movement conditions include conditions regarding the linkage (synchronization) of two or more conditions and conditions regarding the subject's eye.
 Examples of conditions regarding the linkage of two or more conditions include conditions regarding the linkage of two or more optical system conditions, conditions regarding the linkage of two or more movement conditions, and conditions regarding the linkage of one or more optical system conditions with one or more movement conditions.
 Examples of conditions regarding the subject's eye include conditions regarding fixation, conditions regarding whether a contrast agent is used, conditions regarding whether a mydriatic agent is used, and conditions regarding characteristics of the eye. These eye characteristics may include parameters that affect, or can affect, the brightness of the eye image, such as the pupil diameter and the iris color.
Embodiments according to the present disclosure may be configured to execute control for realizing the imaging functions described above. This control may include, for example, one of, or a combination of two or more of, the following: control of the optical systems (the illumination optical system and/or the imaging optical system), control of elements of the imaging unit other than the optical systems, and control of elements other than the imaging unit. Embodiments according to the present disclosure may also be configured to execute control for realizing functions other than the imaging functions described above.
The control of the illumination optical system may be of any type: for example, electrical control such as control of the light source (on/off) or control of an electronic shutter, mechanical control such as control of a mechanical shutter or a rotary shutter, or a combination of electrical and mechanical control. These shutters are provided in the illumination optical system and function to switch between passing and blocking the illumination light output from the light source (that is, to switch between projecting and not projecting the illumination light onto the subject's eye).
The control of the illumination optical system is not limited to control for switching between a state in which the illumination light is projected onto the subject's eye (projection state) and a state in which it is not projected (non-projection state); it may also be control for modulating the intensity or amount of the illumination light projected onto the subject's eye.
Note that switching between the projection state and the non-projection state corresponds to switching the intensity of the illumination light projected onto the subject's eye between a positive value and zero. Intensity modulation, by contrast, corresponds to switching that intensity between a first value and a second value that differ from each other, where both values are non-negative and at least one of them is positive. Switching between the projection state and the non-projection state can therefore be regarded as one example of intensity modulation.
The control of the imaging optical system may likewise be of any type: for example, electrical control such as control of the image sensor or control of an electronic shutter, mechanical control such as control of a mechanical shutter or a rotary shutter, or a combination of electrical and mechanical control.
Embodiments according to the present disclosure are configured to perform the imaging scheme described above, that is, to execute a plurality of parallel imaging operations using a plurality of optical systems satisfying the Scheimpflug condition under mutually different imaging conditions, and to evaluate floating objects in the aqueous humor based on the plurality of images thus acquired. Several non-limiting examples of the evaluation of floating objects in the aqueous humor are described in the exemplary aspects below.
Several exemplary aspects of the embodiments outlined above will now be described. The present disclosure mainly describes exemplary aspects of an ophthalmologic apparatus, of a method for controlling an ophthalmologic apparatus, of a program, and of a recording medium, but the aspects of the embodiments are not limited to these. For example, those skilled in the art will understand that the present disclosure also provides various aspects of medical methods, imaging methods, data processing methods, and the like.
<Ophthalmologic apparatus>
Several exemplary aspects of the ophthalmologic apparatus according to the embodiments are provided below.
FIG. 1 shows the configuration of an ophthalmologic apparatus according to one aspect of the embodiments. The ophthalmologic apparatus 1000 of this aspect includes an imaging unit 1100, a control unit 1200, an evaluation processing unit 1300, and a movement mechanism 1400.
The imaging unit 1100 generates digital images (Scheimpflug images) by photographing the anterior segment of the subject's eye using optical systems that satisfy the Scheimpflug condition. The imaging unit 1100 includes a first imaging unit 1110 and a second imaging unit 1120.
The first imaging unit 1110 includes an optical system (first optical system) 1111 that satisfies the Scheimpflug condition. The first optical system 1111 includes an image sensor (first image sensor) 1112 for generating digital images. Several non-limiting concrete examples of the configuration of the first imaging unit 1110 are described later. The first imaging unit 1110 applies imaging under preset imaging conditions (first imaging conditions) to the anterior segment of the subject's eye. The imaging performed by the first imaging unit 1110 under the first imaging conditions is referred to as the first imaging, and the image generated by the first imaging is referred to as the first image.
The second imaging unit 1120 includes an optical system (second optical system) 1121 that satisfies the Scheimpflug condition. The second optical system 1121 includes an image sensor (second image sensor) 1122 for generating digital images. Several non-limiting concrete examples of the configuration of the second imaging unit 1120 are described later. The second imaging unit 1120 applies imaging under preset imaging conditions (second imaging conditions) to the anterior segment of the subject's eye. The imaging performed by the second imaging unit 1120 under the second imaging conditions is referred to as the second imaging, and the image generated by the second imaging is referred to as the second image.
Thus, in this aspect, the configuration of the first imaging unit 1110 and the configuration of the second imaging unit 1120 are (substantially) the same. In some exemplary aspects, however, the configurations of the first and second imaging units may differ.
In some exemplary aspects, part of the first optical system in the first imaging unit and part of the second optical system in the second imaging unit may be shared. For example, a configuration may be adopted in which a common objective lens serves as the objective lens of both the first optical system and the second optical system. A configuration may also be adopted in which light guided along an optical path common to the first and second optical systems is split into two beams that are detected by the first image sensor and the second image sensor, respectively. Configurations in which the first and second optical systems are partially shared are not limited to these examples.
The first imaging and the second imaging are performed in parallel, and the first imaging conditions and the second imaging conditions differ from each other. In this aspect, the first imaging conditions include the value of the exposure time of the first image sensor 1112 (first exposure time), and the second imaging conditions include the value of the exposure time of the second image sensor 1122 (second exposure time). In this aspect, the second exposure time is longer than the first exposure time.
Considering that, as noted above, the configuration of the first imaging unit 1110 and that of the second imaging unit 1120 are substantially the same, no generality is lost by assuming that the second exposure time is longer than the first exposure time. In aspects in which the configurations of the two imaging units differ substantially, the relationship between the exposure time of one imaging unit and that of the other may be asymmetric; it is clear from the present disclosure that this aspect also covers such cases.
The control unit 1200 is configured to control each part of the ophthalmologic apparatus 1000. As shown in FIG. 1, the control unit 1200 executes control of the imaging unit 1100 based on predetermined imaging conditions, control of the evaluation processing unit 1300, control of the movement mechanism 1400, and so on.
Although not shown in the figure, the control unit 1200 may be configured to control any element of the ophthalmologic apparatus 1000 and/or peripheral devices of the ophthalmologic apparatus 1000. Examples of such elements and peripheral devices include a user interface, a communication device, elements other than the imaging unit 1100, elements for examining the subject's eye, another apparatus in a system including the ophthalmologic apparatus 1000, and another apparatus used together with the ophthalmologic apparatus 1000.
The control unit 1200 includes hardware elements such as a processor and a storage device. The storage device stores computer programs such as a control program. The functions of the control unit 1200 are realized through cooperation between software such as the control program and hardware such as the processor.
In this aspect, the control unit 1200 controls the imaging unit 1100 so that the first imaging by the first imaging unit 1110 and the second imaging by the second imaging unit 1120 are executed in parallel and under mutually different imaging conditions. That is, the control unit 1200 of this aspect controls the imaging unit 1100 so that, while the first imaging unit 1110 performs the first imaging under the first imaging conditions, the second imaging unit 1120 performs the second imaging under the second imaging conditions, which differ from the first. More specifically, the control unit 1200 of this aspect controls the imaging unit 1100 so that, while the first imaging unit 1110 performs the first imaging with the exposure time of the first image sensor 1112 set to the first exposure time, the second imaging unit 1120 performs the second imaging with the exposure time of the second image sensor 1122 set to the second exposure time, which is longer than the first exposure time.
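As an illustration of this parallel control, the following is a minimal software sketch, not part of the disclosed apparatus: it assumes a hypothetical camera API (`set_exposure`, `grab_frame`) and uses threads so that the short exposure period falls within the long one; a real device would more likely rely on hardware triggering for this synchronization.

```python
import threading

def capture(camera, exposure_s: float, frames: list) -> None:
    """Expose one camera once and store its frame (hypothetical camera API)."""
    camera.set_exposure(exposure_s)
    frames.append(camera.grab_frame())

def parallel_capture(camera_a, camera_b, t_short: float, t_long: float):
    """Run the short- and long-exposure captures concurrently, so that the
    short exposure period falls inside the long exposure period."""
    frames_a, frames_b = [], []
    ta = threading.Thread(target=capture, args=(camera_a, t_short, frames_a))
    tb = threading.Thread(target=capture, args=(camera_b, t_long, frames_b))
    tb.start()  # start the long exposure first
    ta.start()  # then the short one (hardware sync would be more precise)
    ta.join(); tb.join()
    return frames_a[0], frames_b[0]
```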
By performing two imaging operations with different exposure times in parallel, an image depicting the movement of a floating object in the aqueous humor over a relatively short period and an image depicting its movement over a relatively long period are obtained. In this aspect, the first image generated by the first imaging unit 1110 depicts the movement of the floating object over the relatively short period, and the second image generated by the second imaging unit 1120 depicts its movement over the relatively long period. The ophthalmologic apparatus 1000 of this aspect has the novel function of evaluating floating objects in the aqueous humor by comparing these two images. This evaluation processing is described later.
The movement mechanism 1400 is configured to move the imaging unit 1100 (at least the first optical system 1111 and the second optical system 1121). The movement mechanism 1400 may also include a mechanism having a function equivalent to moving the imaging unit 1100 (in other words, a mechanism producing the same effect as moving the imaging unit 1100). Examples of such mechanisms include a mechanism that moves the illumination position by deflecting the illumination light (slit light) (illumination scanner, movable illumination mirror) and a mechanism that moves the imaging position by deflecting the light traveling from the subject's eye toward the imaging unit 1100 (imaging scanner, movable imaging mirror).
The control unit 1200 executes control of the imaging unit 1100 (control of the first imaging unit 1110 and control of the second imaging unit 1120) in combination with control of the movement mechanism 1400, thereby causing the first imaging unit 1110 and the second imaging unit 1120 to generate a plurality of pairs of first and second images. In other words, by combining control of the first imaging unit 1110, control of the second imaging unit 1120, and control of the movement mechanism 1400, the control unit 1200 executes a scan of a three-dimensional region of the anterior segment of the subject's eye (anterior segment scan), that is, image acquisition from a three-dimensional region of the anterior segment. The anterior segment scan is an imaging mode in which a paired operation, consisting of the first imaging by the first imaging unit 1110 and the second imaging by the second imaging unit 1120, is performed multiple times while the positions of the first optical system 1111 and the second optical system 1121 relative to the anterior segment are changed. This yields a plurality of image pairs (a plurality of pairs of a first image and a second image), each depicting a different part of the anterior segment.
Several examples of the anterior segment scan, implemented by the control unit 1200 executing control of the imaging unit 1100 in combination with control of the movement mechanism 1400, are described below.
The scan shown in FIG. 2A is realized by coordinated control (synchronous control) of the light emission of the light source (output of the illumination light, projection of the illumination light onto the subject's eye), the exposure of camera A (the first image sensor 1112), the exposure of camera B (the second image sensor 1122), and the scan position (the position of the imaging unit 1100 being moved).
More specifically, the scan of FIG. 2A synchronously combines control that outputs the illumination light continuously, control that repeatedly executes exposure of the first image sensor 1112 with a relatively short exposure time (first exposure time) "a", control that repeatedly executes exposure of the second image sensor 1122 with a relatively long exposure time (second exposure time) "b", and control that moves the imaging unit 1100 continuously from the scan start position to the scan end position. The first exposure time "a" and the second exposure time "b" are shown in FIG. 2C.
The repetitive exposure of the first image sensor 1112 is performed by alternately repeating exposure for the first exposure time and charge transfer (and exposure standby). Similarly, the repetitive exposure of the second image sensor 1122 is performed by alternately repeating exposure for the second exposure time and charge transfer (and exposure standby).
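This alternation of exposure and charge transfer can be modeled as a simple schedule of exposure windows. The sketch below is an illustration only; the phase durations are hypothetical example values, not parameters from the disclosure.

```python
def exposure_schedule(t_exp: float, t_transfer: float, t_total: float):
    """Yield (start, end) exposure windows for one sensor, alternating
    exposure and charge-transfer/standby phases (a simplified model)."""
    t = 0.0
    while t + t_exp <= t_total:
        yield (t, t + t_exp)
        t += t_exp + t_transfer

# Short-exposure camera A and long-exposure camera B over a 1 s scan
a_windows = list(exposure_schedule(t_exp=0.01, t_transfer=0.005, t_total=1.0))
b_windows = list(exposure_schedule(t_exp=0.10, t_transfer=0.005, t_total=1.0))
```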
The scan of FIG. 2A has the advantages that the control of the illumination light is simple and that the scan can be realized with relatively simple synchronous control.
In the scan of FIG. 2A, the intensity (amount) of the illumination light can be modulated according to the exposure timing of the two image sensors 1112 and 1122. For example, by modulating the illumination light so that its intensity is relatively high during the exposure periods of the first image sensor 1112 and relatively low during the exposure periods of the second image sensor 1122, the brightness of the images obtained with the relatively short exposure time by the first image sensor 1112 can be increased, making those images clearer and more detailed, while saturation of the brightness of the images obtained with the relatively long exposure time by the second image sensor 1122 can be prevented.
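A minimal sketch of this exposure-synchronized intensity modulation, assuming the exposure windows of camera A are available (for example, from a schedule like the one above); the function and its names are illustrative, not part of the disclosure.

```python
def illumination_intensity(t: float, a_windows, high: float, low: float) -> float:
    """Return the light-source drive level at time t: high while the
    short-exposure sensor (camera A) is exposing, low otherwise (i.e.
    while only the long-exposure sensor, camera B, is integrating)."""
    for start, end in a_windows:
        if start <= t < end:
            return high
    return low
```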
The scan shown in FIG. 2B is executed in the same manner as the scan of FIG. 2A with respect to the exposure control of the first image sensor 1112, the exposure control of the second image sensor 1122, and the control of the scan position; as for the light emission control of the light source, however, unlike the scan of FIG. 2A, in which the illumination light is emitted continuously, the illumination light is output intermittently (pulsed emission).
More specifically, the scan of FIG. 2B synchronously combines control that outputs the illumination light intermittently, control that repeatedly executes exposure of the first image sensor 1112 with the relatively short exposure time (first exposure time) "a", control that repeatedly executes exposure of the second image sensor 1122 with the relatively long exposure time (second exposure time) "b", and control that moves the imaging unit 1100 continuously from the scan start position to the scan end position. The first exposure time "a" and the second exposure time "b" are shown in FIG. 2C.
Compared with the scan of FIG. 2A, the scan of FIG. 2B has the disadvantage that the synchronous control becomes more complicated, but it has the advantage that the time during which the illumination light is projected onto the subject's eye can be shortened, reducing the burden on the subject.
As in the scan of FIG. 2A, the intensity (amount) of the illumination light may also be modulated in the scan of FIG. 2B according to the exposure timing of the two image sensors 1112 and 1122.
The scan shown in FIG. 2D is executed in the same manner as the scan of FIG. 2A with respect to the emission control of the illumination light, the exposure control of the first image sensor 1112, and the exposure control of the second image sensor 1122; as for the control of the scan position, however, unlike the scan of FIG. 2A, in which the imaging unit 1100 is moved continuously, the imaging unit 1100 is moved intermittently.
More specifically, the scan of FIG. 2D synchronously combines control that outputs the illumination light continuously, control that repeatedly executes exposure of the first image sensor 1112 with the relatively short exposure time (first exposure time) "a", control that repeatedly executes exposure of the second image sensor 1122 with the relatively long exposure time (second exposure time) "b", and control that moves the imaging unit 1100 intermittently from the scan start position to the scan end position. The first exposure time "a" and the second exposure time "b" are shown in FIG. 2C.
The scan shown in FIG. 2E is executed in the same manner as the scan of FIG. 2B with respect to the emission control of the illumination light, the exposure control of the first image sensor 1112, and the exposure control of the second image sensor 1122; as for the control of the scan position, however, unlike the scan of FIG. 2B, in which the imaging unit 1100 is moved continuously, the imaging unit 1100 is moved intermittently.
More specifically, the scan of FIG. 2E synchronously combines control that outputs the illumination light intermittently, control that repeatedly executes exposure of the first image sensor 1112 with the relatively short exposure time (first exposure time) "a", control that repeatedly executes exposure of the second image sensor 1122 with the relatively long exposure time (second exposure time) "b", and control that moves the imaging unit 1100 intermittently from the scan start position to the scan end position. The first exposure time "a" and the second exposure time "b" are shown in FIG. 2C.
When the imaging unit 1100 is moved intermittently, as in the scans of FIG. 2D and FIG. 2E, there are disadvantages such as increased complexity of the synchronous control and the risk of vibration caused by the repeated abrupt starts and stops of the imaging unit 1100; however, since each imaging operation can be performed with the imaging unit 1100 stationary, there is the advantage that no image blur arises from movement of the imaging unit 1100.
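For illustration, a stop-and-go scan loop of the kind just described might be sketched as follows; the `stage` and `cameras` objects and their methods are hypothetical stand-ins, not a real device API.

```python
def intermittent_scan(stage, cameras, positions):
    """Stop-and-go scan: move, settle, expose both cameras, repeat.
    `stage` moves the imaging unit; `cameras.a` / `cameras.b` are the
    short- and long-exposure cameras (all hypothetical interfaces)."""
    pairs = []
    for x in positions:
        stage.move_to(x)       # abrupt move to the next slit position
        stage.wait_settled()   # let vibrations decay before exposing
        short_img = cameras.a.grab_frame()
        long_img = cameras.b.grab_frame()
        pairs.append((short_img, long_img))
    return pairs
```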
Several examples of the anterior segment scan of this aspect have been described above, but the anterior segment scan is not limited to these. Based on the advantages and disadvantages described above, it is also possible to select the scan mode to be applied to the anterior segment, or to combine two or more scan modes at least partially.
Note that in some exemplary aspects, floating objects in the aqueous humor can be evaluated based on two or more images acquired with the optical systems stationary, without performing a scan (moving the optical systems), that is, two or more images acquired at the same optical system position. Such non-scanning imaging and evaluation can be performed both by an ophthalmologic apparatus equipped with a movement mechanism, like the ophthalmologic apparatus 1000 of this aspect, and by an ophthalmologic apparatus without one. In an ophthalmologic apparatus equipped with a movement mechanism, it may be possible to select between a mode that performs scanning imaging and evaluation and a mode that performs non-scanning imaging and evaluation.
The evaluation processing unit 1300 is configured to generate evaluation information on floating objects in the aqueous humor in the anterior segment (anterior chamber) of the subject's eye based on the first image generated by the first imaging performed by the first imaging unit 1110 under the first imaging conditions and the second image generated by the second imaging performed by the second imaging unit 1120 under the second imaging conditions.
The evaluation information may be any information concerning floating objects in the aqueous humor. For example, it may include information on any matter indicating the state of the floating objects, or on any matter ascertained from that state. The evaluation information may also include numerical information (such as values of predetermined parameters), information indicating judgment results (a grade indicating the pathology of a specific disease, a grade indicating the degree of progression of a specific disease, the presence or absence, or the probability, of suspicion of a specific disease, etc.), information indicating the data used to derive a judgment result, and visualizations of any of these.
The evaluation processing unit 1300 includes hardware elements such as a processor and a storage device. The storage device stores computer programs such as an evaluation processing program. The functions of the evaluation processing unit 1300 are realized through cooperation between software such as the evaluation processing program and hardware such as the processor.
Several exemplary aspects of the evaluation processing unit 1300 (configuration examples, processing examples, etc.) are described below.
In one exemplary aspect shown in FIG. 3, the evaluation processing unit 1300 includes a floating object image detection unit 1310 and an evaluation information generation unit 1320. The functions of the floating object image detection unit 1310 are realized through cooperation between software such as a floating object image detection program and hardware such as a processor, and the functions of the evaluation information generation unit 1320 are realized through cooperation between software such as an evaluation information generation program and hardware such as a processor.
The floating object image detection unit 1310 is configured to detect a first floating object image in the first image and a second floating object image in the second image that correspond to the same floating object in the aqueous humor. More specifically, the floating object image detection unit 1310 is configured to detect an image of a floating object in the aqueous humor (first floating object image) from the first image generated by the first imaging performed by the first imaging unit 1110 under the first imaging conditions, and to detect an image of the same floating object (second floating object image) from the second image generated by the second imaging performed by the second imaging unit 1120 under the second imaging conditions.
The evaluation information generation unit 1320 is configured to generate evaluation information on a floating object in the aqueous humor based on the pair of the first floating object image and the second floating object image corresponding to that same floating object, as detected by the floating object image detection unit 1310.
As described above, the first imaging and the second imaging are performed in parallel. For example, in the examples shown in FIGS. 2A to 2E, the exposure period of the first image sensor 1112 for the first imaging (first exposure period) corresponds to part of the exposure period of the second image sensor 1122 for the second imaging (second exposure period). Furthermore, the configuration of the first imaging unit 1110, the configuration of the second imaging unit 1120, and the relative positional relationship between the first imaging unit 1110 and the second imaging unit 1120 are known. It is therefore possible to determine the correspondence between pixel positions (coordinates) in the first image and pixel positions (coordinates) in the second image, that is, the coordinate transformation between the coordinate system defining pixel positions in the first image and the coordinate system defining pixel positions in the second image. The floating object image detection unit 1310 may be configured to detect images of the same floating object in the aqueous humor from both images by using this correspondence (coordinate transformation).
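As a minimal illustration of such a coordinate transformation, an affine mapping determined in advance by calibration could be applied as follows; the matrix and offset values here are placeholders, not calibration data from the disclosure.

```python
import numpy as np

# Hypothetical calibration: an affine transform mapping pixel coordinates
# in the first image to pixel coordinates in the second image, determined
# in advance from the known geometry of the two imaging units.
A = np.array([[1.0, 0.0],
              [0.0, 1.0]])     # rotation/scale part (identity here)
t = np.array([12.0, -3.0])     # translation part (example values)

def to_second_image(p_first: np.ndarray) -> np.ndarray:
    """Map a pixel coordinate from the first image into the second image."""
    return A @ p_first + t
```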
In some exemplary aspects, the floating object image detection unit 1310 first performs image segmentation to identify the anterior chamber region corresponding to the anterior chamber in the first image, and then performs image segmentation to identify floating object images within that anterior chamber region. These image segmentations may be machine-learning-based processing or non-machine-learning-based processing, or a combination of machine-learning-based and non-machine-learning-based processing.
As a result, one or more floating object images are detected from the first image; typically, a plurality of floating object images are detected. The set of floating object images detected from the first image is referred to as the first image set. Similarly, a second image set, the set of floating object images detected from the second image, is obtained.
Furthermore, the floating object image detection unit 1310 determines the positional correspondence between the first image set and the second image set based on the known correspondence (coordinate transformation) between pixel positions (coordinates) in the first image and pixel positions (coordinates) in the second image. This positional correspondence is information representing a pairing relationship between elements of the first image set (first floating object images) and elements of the second image set (second floating object images). In other words, this positional correspondence determines pairs, each consisting of one first floating object image and one second floating object image; that is, it determines two images of a floating object present at substantially the same position, in other words, two images of the same floating object.
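A simplified sketch of this pairing step, assuming each floating object detected in the first image is represented by its centroid, each detection in the second image by a region object exposing a hypothetical `contains` predicate, and reusing the `to_second_image` mapping sketched above:

```python
def pair_floating_object_images(first_centroids, second_regions, to_second_image):
    """Pair each floating object detected in the first image with the
    second-image region that contains its mapped centroid (if any)."""
    pairs = []
    for c1 in first_centroids:
        p = to_second_image(c1)        # map into 2nd-image coordinates
        for region in second_regions:
            if region.contains(p):     # mapped point lies inside the region
                pairs.append((c1, region))
                break                  # at most one partner per object
    return pairs
```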
The evaluation information generation unit 1320 of this aspect may be configured to generate, based on each pair of images of the same floating object (a pair of a first floating object image and a second floating object image) determined by the above positional correspondence, evaluation information on the floating object in the aqueous humor corresponding to that pair.
In some other exemplary aspects, the floating object image detection unit 1310 first performs image segmentation to identify the anterior chamber region corresponding to the anterior chamber in the first image, and then performs image segmentation to identify floating object images within that anterior chamber region. As a result, one or more floating object images are detected from the first image. One of the floating object images detected from the first image is referred to as the first floating object image.
Next, the floating object image detection unit 1310 determines the coordinates of the first floating object image detected from the first image. The coordinates of the first floating object image may be, for example, any of the following: the coordinates of the centroid of the first floating object image, the coordinates of its center, the coordinates of one or more points on its contour, or the coordinates of a representative position of a figure approximating its contour (for example, the center of an approximating ellipse or the center of an approximating circle).
Next, based on the known correspondence (coordinate transformation) between pixel positions (coordinates) in the first image and pixel positions (coordinates) in the second image, the floating object image detection unit 1310 detects, from the second image, the floating object image at the position corresponding to the coordinates of the first floating object image. Since in this aspect the second exposure time is assumed to be longer than the first exposure time, the floating object image detection unit 1310 detects the floating object image that contains the position (coordinates) in the second image corresponding to the coordinates of the first floating object image as the floating object image in the second image corresponding to the same floating object as the first floating object image (second floating object image).
The evaluation information generation unit 1320 of this aspect may be configured to generate, based on the pair of images of the same floating object determined in this way (the pair of the first floating object image and the second floating object image), evaluation information on the floating object in the aqueous humor corresponding to that pair.
In some other exemplary aspects, the floating object image detection unit 1310 first performs image segmentation to identify the anterior chamber region corresponding to the anterior chamber in the second image, and then performs image segmentation to identify floating object images within that anterior chamber region. As a result, one or more floating object images are detected from the second image. One of the floating object images detected from the second image is referred to as the second floating object image.
Next, the floating object image detection unit 1310 determines the coordinates of the second floating object image detected from the second image. The coordinates of the second floating object image may be, for example, any of the following: the coordinates of the centroid of the second floating object image, the coordinates of its center, the coordinates of one or more points on its contour, or the coordinates of a representative position of a figure approximating its contour.
Next, based on the known correspondence (coordinate transformation) between pixel positions (coordinates) in the first image and pixel positions (coordinates) in the second image, the floating object image detection unit 1310 detects, from the first image, the floating object image at the position corresponding to the coordinates of the second floating object image. Since in this aspect the second exposure time is assumed to be longer than the first exposure time, the floating object image detection unit 1310 detects, for example, either the floating object image that contains the position (coordinates) in the first image corresponding to the coordinates of the second floating object image, or the floating object image closest to that position (coordinates) in the first image, as the floating object image in the first image corresponding to the same floating object as the second floating object image (first floating object image).
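A sketch of this reverse matching, under the same hypothetical region interface as above (regions expose `contains` and `centroid`): containment is preferred, with the nearest centroid as the fallback.

```python
import numpy as np

def match_in_first_image(p_mapped: np.ndarray, first_regions):
    """Find the first-image floating object for a second-image one:
    prefer a region containing the mapped point; otherwise fall back to
    the region whose centroid lies nearest to it."""
    for region in first_regions:
        if region.contains(p_mapped):
            return region
    return min(first_regions,
               key=lambda r: np.linalg.norm(r.centroid - p_mapped))
```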
The evaluation information generation unit 1320 of this aspect may be configured to generate, based on the pair of images of the same floating object determined in this way (the pair of the first floating object image and the second floating object image), evaluation information on the floating object in the aqueous humor corresponding to that pair.
Several examples of the processing executed by the evaluation processing unit 1300 are described below.
A first example of the processing executed by the evaluation processing unit 1300 is described with reference to FIGS. 4A to 4D. The evaluation processing unit 1300 of this example is configured to estimate the movement direction of a floating object in the aqueous humor based on the second image generated by the second imaging performed by the second imaging unit 1120 under the second imaging conditions. The movement direction of a floating object in the aqueous humor is one example of evaluation information.
The image 2100 shown in FIG. 4A is a first image generated by the first imaging performed by the first imaging unit 1110 under the first imaging conditions, and the image 2200 is a second image generated by the second imaging performed by the second imaging unit 1120 under the second imaging conditions. Here, the first imaging and the second imaging were performed in parallel.
In FIG. 4A, "Z" indicates the direction along the axis of the subject's eye (Z direction), "X" indicates, among the directions orthogonal to the Z direction, the left-right direction for the subject (horizontal direction, X direction), and "Y" indicates the direction orthogonal to both the X direction and the Z direction (vertical direction, body axis direction, Y direction). The same applies to the other drawings (FIGS. 4B to 4D, etc.).
Reference numeral 2110 denotes the image of a floating object in the aqueous humor detected from the first image 2100 by the floating object image detection unit 1310 (first floating object image), and reference numeral 2210 denotes the image of the same floating object in the aqueous humor detected from the second image 2200 by the floating object image detection unit 1310 (second floating object image). Note that the first floating object image 2110 may be a figure approximating the image of the floating object (for example, an approximating circle or an approximating ellipse), and the second floating object image 2210 may likewise be a figure approximating the image of the same floating object (for example, an approximating circle or an approximating ellipse).
The evaluation information generation unit 1320 of this example generates evaluation information from the second image 2200. First, the evaluation information generation unit 1320 determines the maximum dimension of the second floating object image 2210 by analyzing the second floating object image 2210 detected from the second image 2200 by the floating object image detection unit 1310.
The maximum dimension is the maximum distance between two points on the boundary (contour, outer edge, edge) of the second floating object image 2210; in other words, it is the maximum length of a line segment connecting two points on that boundary. If the second floating object image 2210 is circular, its maximum dimension is the diameter of the circle; if it is elliptical, its maximum dimension is the length of the major axis of the ellipse. Reference numeral 2220 in FIG. 4B denotes the major axis of the elliptical second floating object image 2210.
Furthermore, the evaluation information generation unit 1320 of this example takes the direction along the line segment defining the maximum dimension of the second floating object image 2210 as the movement direction of the floating object in the aqueous humor. In other words, the evaluation information generation unit 1320 of this example takes the orientation of the straight line passing through the two points on the boundary of the second floating object image 2210 that define its maximum dimension as the movement direction of the floating object. Reference numeral 2230 in FIG. 4C denotes the movement direction of the floating object in the aqueous humor corresponding to the major axis 2220 of the elliptical second floating object image 2210 shown in FIG. 4B.
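A brute-force sketch of this maximum-dimension computation, as one plausible reading of the step just described (names are illustrative):

```python
import numpy as np
from itertools import combinations

def movement_direction(boundary: np.ndarray):
    """Given boundary points of a floating-object streak (N x 2 array),
    return (max_dimension, unit_vector_along_it). Note that a single
    streak leaves the sign of the direction ambiguous; the text resolves
    it using the initial-position region (FIG. 4D)."""
    best_d, best_u = 0.0, None
    for p, q in combinations(boundary, 2):   # all boundary point pairs
        d = np.linalg.norm(q - p)
        if d > best_d:
            best_d, best_u = d, (q - p) / d
    return best_d, best_u
```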
Reference numeral 2110a in FIG. 4D denotes the image (region) obtained by mapping the first floating object image 2110 in the first image 2100 shown in FIG. 4A into the second image 2200 by the coordinate transformation between the first image 2100 and the second image 2200.
In this example, as shown in FIGS. 2A to 2E, the exposure period of the first image sensor 1112 for the first imaging (first exposure period) is assumed to correspond to the initial part of the exposure period of the second image sensor 1122 for the second imaging (second exposure period).
In this case, the region 2110a in the second image 2200 shown in FIG. 4D is the region estimated to be the initial position of the floating object in the aqueous humor during the second exposure period (initial position region). On the other hand, the region 2110b in the second image 2200 shown in FIG. 4D is the region estimated to be the final position of the floating object in the aqueous humor during the second exposure period (final position region).
The movement direction 2230 (FIG. 4C) determined by the evaluation information generation unit 1320 indicates the direction from the initial position region 2110a toward the final position region 2110b. The movement direction 2230 is thus information representing the movement of the floating object in the aqueous humor during the second exposure period.
This concludes the description of the first example of the processing executed by the evaluation processing unit 1300 (estimation of the movement direction of a floating object in the aqueous humor based on the second image).
Next, a second example of the processing executed by the evaluation processing unit 1300 is described. The evaluation processing unit 1300 of this example is configured to estimate the movement vector of a floating object in the aqueous humor based on the first image generated by the first imaging performed by the first imaging unit 1110 under the first imaging conditions and the second image generated by the second imaging performed by the second imaging unit 1120 under the second imaging conditions. The movement vector includes the movement direction and the movement amount (movement distance) of the floating object, and is one example of evaluation information.
FIG. 5A shows a first image 2300 and a second image 2400 processed in this example. The floating object image detection unit 1310 of this example detects a first floating object image 2310 in the first image 2300 and a second floating object image 2410 in the second image 2400. The evaluation information generation unit 1320 of this example estimates the movement vector of the floating object in the aqueous humor based on the first floating object image 2310 and the second floating object image 2410 detected by the floating object image detection unit 1310.
In some exemplary aspects, the evaluation information generation unit 1320 may be configured to determine a feature point of the first floating object image 2310 and to estimate the movement vector of the floating object based on that feature point and the second floating object image 2410.
The feature point of the first floating object image 2310 may be a representative point of the first floating object image 2310, for example its center, its centroid, or one or more points on its boundary. Reference numeral 2320 in FIG. 5B denotes one example of a feature point of the first floating object image 2310. The following description uses this exemplary feature point 2320.
In a more specific aspect, the evaluation information generation unit 1320 may be configured to identify the position in the second floating object image 2410 corresponding to the feature point 2320 of the first floating object image 2310, and to estimate the movement vector of the floating object based on the identified position (the position in the second floating object image 2410 corresponding to the feature point 2320).
 図5Cの符号2320aは、第1の浮遊物像2310の特徴点2320を第1の画像2300と第2の画像2400との間の座標変換によって第2の画像2400に写した像の位置を示す。本例の評価情報生成部1320は、第1の浮遊物像2310の特徴点2320に対応する第2の浮遊物像2410中の位置2320aに基づいて房水内浮遊物の移動ベクトルを推定する。 Reference numeral 2320a in FIG. 5C indicates the position of an image obtained by mapping the feature point 2320 of the first floating object image 2310 to the second image 2400 by coordinate transformation between the first image 2300 and the second image 2400. . The evaluation information generation unit 1320 of this example estimates the movement vector of the floating object in the aqueous humor based on the position 2320a in the second floating object image 2410 that corresponds to the feature point 2320 of the first floating object image 2310.
 更に具体的な態様において、評価情報生成部1320は、第1の浮遊物像2310の特徴点2320に対応する第2の浮遊物像2410中の位置2320aに加えて、第1の撮像素子1112の露光期間(第1の撮影における露光期間、第1の露光期間)と第2の撮像素子1122の露光期間(第2の撮影における露光期間、第2の露光期間)との間のタイミング関係に基づいて、房水内浮遊物の移動ベクトルを推定するように構成されていてよい。 In a more specific aspect, the evaluation information generation unit 1320 includes a position 2320a in the second floating object image 2410 corresponding to the feature point 2320 of the first floating object image 2310, as well as a position 2320a of the first image sensor 1112. Based on the timing relationship between the exposure period (the exposure period in the first photographing, the first exposure period) and the exposure period of the second image sensor 1122 (the exposure period in the second photographing, the second exposure period) may be configured to estimate a movement vector of suspended matter in the aqueous humor.
 前述したように、第1の露光期間は第2の露光期間の一部(部分期間)に相当する。第1の露光期間と第2の露光期間との間のタイミング関係(露光タイミング関係)は、第2の露光期間における第1の露光期間の(時間的な)位置、つまり、第2の露光期間における上記部分期間の位置を表す情報である。第1の露光期間に対応する部分期間の位置は、例えば、当該部分期間における任意の時点によって定義されてよく、例えば、第1の露光期間の開始時点、中間時点、又は終了時点によって定義されてよい。 As described above, the first exposure period corresponds to a part (partial period) of the second exposure period. The timing relationship (exposure timing relationship) between the first exposure period and the second exposure period is the (temporal) position of the first exposure period in the second exposure period, that is, the second exposure period. This is information representing the position of the partial period in . The position of the sub-period corresponding to the first exposure period may be defined, for example, by any point in time in the sub-period, for example by the start, middle, or end of the first exposure period. good.
 本例の評価情報生成部1320は、例えば、前述した第1の例と同様に、第2の画像2400における第2の浮遊物像2410に基づいて房水内浮遊物の移動方向を推定することができる。 The evaluation information generation unit 1320 of this example estimates the moving direction of the floating object in the aqueous humor based on the second floating object image 2410 in the second image 2400, for example, as in the first example described above. I can do it.
 一方、房水内浮遊物の移動量(移動距離)の推定においては、例えば、評価情報生成部1320は、第2の浮遊物像2410と、第1の浮遊物像2310の特徴点2320に対応する第2の浮遊物像2410中の位置2320aと、露光タイミング関係とが参照される。 On the other hand, in estimating the moving amount (moving distance) of floating objects in the aqueous humor, for example, the evaluation information generation unit 1320 corresponds to the feature points 2320 of the second floating object image 2410 and the first floating object image 2310 The position 2320a in the second floating object image 2410 and the exposure timing relationship are referenced.
 具体的には、評価情報生成部1320は、露光タイミング関係に示された第1の露光期間と第2の露光期間との間の時間的な位置関係に基づいて、第1の浮遊物像2310の特徴点2320に対応する位置2320aが、第2の露光期間における房水内浮遊物の移動のどの時点(移動ルートのどの地点)を表しているか決定する。 Specifically, the evaluation information generation unit 1320 generates the first floating object image 2310 based on the temporal positional relationship between the first exposure period and the second exposure period shown in the exposure timing relationship. The position 2320a corresponding to the feature point 2320 represents which point in the movement of the suspended matter in the aqueous humor (which point on the movement route) in the second exposure period is determined.
 図2A~図2Eに示すように第1の露光期間が第2の露光期間の初期に相当する場合、第1の浮遊物像2310の特徴点2320に対応する位置2320aは、第2の露光期間における房水内浮遊物の移動の開始時点(移動ルートの起点)を表している。また、第1の露光期間が第2の露光期間の終期に相当する場合、第1の浮遊物像2310の特徴点2320に対応する位置2320aは、第2の露光期間における房水内浮遊物の移動の終了時点(移動ルートの終点)を表している。また、第1の露光期間が第2の露光期間の中央時点に相当する場合、第1の浮遊物像2310の特徴点2320に対応する位置2320aは、第2の露光期間における房水内浮遊物の移動の中央時点(移動ルートの中央点)を表している。一般に、第1の露光期間が第2の露光期間の開始時点(0パーセント時点)から終了時点(100パーセント時点)までの間のpパーセント時点に相当する場合(0≦p≦100)、第1の浮遊物像2310の特徴点2320に対応する位置2320aは、第2の露光期間における房水内浮遊物の移動の開始時点から終了時点までの間のpパーセント時点(移動ルートの起点から終点までの間のpパーセント地点)を表している。このようにして決定される時点(地点)を対応時点(対応地点)と呼ぶことにする。 As shown in FIGS. 2A to 2E, when the first exposure period corresponds to the beginning of the second exposure period, the position 2320a corresponding to the feature point 2320 of the first floating object image 2310 is located during the second exposure period. represents the start point of movement of suspended matter in the aqueous humor (starting point of the movement route). Further, when the first exposure period corresponds to the end of the second exposure period, the position 2320a corresponding to the feature point 2320 of the first floating object image 2310 corresponds to the position 2320a of the floating object in the aqueous humor during the second exposure period. It represents the end point of movement (the end point of the movement route). Further, when the first exposure period corresponds to the middle point of the second exposure period, the position 2320a corresponding to the feature point 2320 of the first floating object image 2310 is the position of the floating object in the aqueous humor during the second exposure period. represents the center point of movement (the center point of the movement route). In general, if the first exposure period corresponds to p percent points between the start point (0 percent point) and the end point (100 percent point) of the second exposure period (0≦p≦100), then the first The position 2320a corresponding to the feature point 2320 of the floating object image 2310 is p percent point in time between the start and end of the movement of the floating object in the aqueous humor during the second exposure period (from the starting point to the ending point of the movement route). p percent points between The time points (points) determined in this manner will be referred to as corresponding time points (corresponding points).
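As a non-limiting illustration, the corresponding time point might be computed as in the following sketch, which assumes that the exposure periods are given as start and end times on a common clock and that the first exposure period is represented by its midpoint (the start or end point could be used instead, as noted above); the function name is hypothetical.

```python
def exposure_fraction(t1_start: float, t1_end: float,
                      t2_start: float, t2_end: float) -> float:
    """Return p (0 <= p <= 100): the position of the first exposure period
    within the second exposure period, here represented by its midpoint."""
    if not (t2_start <= t1_start and t1_end <= t2_end):
        raise ValueError("first exposure period must lie within the second")
    t1_mid = 0.5 * (t1_start + t1_end)
    return 100.0 * (t1_mid - t2_start) / (t2_end - t2_start)
```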
Next, the evaluation information generation unit 1320 can estimate the movement amount of the floating object in the aqueous humor based on the corresponding time point determined as described above, the extent (range, boundary) of the second floating object image 2410, and the movement direction of the floating object in the aqueous humor.
The case in which the first exposure period corresponds to the beginning of the second exposure period will be described with further reference to FIGS. 5D and 5E. In this case, the position 2320a corresponding to the feature point 2320 of the first floating object image 2310 represents the starting point of the movement route of the floating object in the aqueous humor during the second exposure period.
The evaluation information generation unit 1320 can estimate the end of the movement of the floating object in the aqueous humor during the second exposure period (the end point of the movement route) based on the movement direction of the floating object and the extent (range, boundary) of the second floating object image 2410.
For example, as shown in FIG. 5D, the evaluation information generation unit 1320 finds the two intersections 2410a and 2410b between a straight line 2420 along the movement direction of the floating object in the aqueous humor and the boundary of the second floating object image 2410. In this example, the first intersection 2410a is located on the side closer to the starting point (position 2320a) of the movement route of the floating object. The evaluation information generation unit 1320 then calculates the distance D between the position 2320a and the first intersection 2410a.
Furthermore, the evaluation information generation unit 1320 determines a position 2320b displaced from the second intersection 2410b along the straight line 2420 by the distance D in the direction of the position 2320a (the first intersection 2410a). This position 2320b is estimated to be the end point of the movement route of the floating object in the aqueous humor during the second exposure period.
The evaluation information generation unit 1320 calculates the distance between the position 2320a and the position 2320b. This distance is estimated to be the movement distance of the floating object in the aqueous humor during the second exposure period. Reference numeral 2430 in FIG. 5E indicates the estimated movement vector of the floating object in the aqueous humor during the second exposure period. This movement vector is information representing the movement direction and the movement amount of the floating object during the second exposure period, that is, information representing the movement state of the floating object during the second exposure period.
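As a non-limiting illustration, the geometric construction of FIGS. 5D and 5E might be sketched as follows, assuming that the two intersections of the straight line 2420 with the boundary of the second floating object image 2410 have already been found; the function and parameter names are hypothetical.

```python
import numpy as np

def estimate_movement_vector(start: np.ndarray,
                             near_hit: np.ndarray,
                             far_hit: np.ndarray) -> np.ndarray:
    """Estimate the movement vector 2430 during the second exposure period.

    start    -- position 2320a (starting point of the movement route)
    near_hit -- intersection 2410a (boundary crossing nearer to start)
    far_hit  -- intersection 2410b (boundary crossing farther from start)
    All arguments are 2D coordinates lying on the straight line 2420.
    """
    d = np.linalg.norm(start - near_hit)  # distance D
    unit = (near_hit - far_hit) / np.linalg.norm(near_hit - far_hit)
    end = far_hit + d * unit              # position 2320b (estimated end point)
    return end - start                    # direction and amount of movement
```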
Note that the movement vector of the floating object in the aqueous humor during the second exposure period can be determined by a method similar to the above not only when the first exposure period corresponds to the end or to the midpoint of the second exposure period, but, in general, whenever the first exposure period corresponds to an arbitrary partial period of the second exposure period.
This concludes the description of the second example of the processing executed by the evaluation processing unit 1300 (estimation of the movement vector of a floating object in the aqueous humor based on the first image and the second image).
FIG. 6 shows one configuration example of an ophthalmologic apparatus that can be employed to realize the first and second examples described above. The ophthalmologic apparatus 1000A of this example is obtained by adding an illumination system 1130 to the ophthalmologic apparatus 1000 of FIG. 1.
The illumination system 1130 is included in the imaging unit 1100 and is configured to project slit light onto the anterior segment of the subject's eye. In this example, the first imaging unit 1110 and the second imaging unit 1120 share a single illumination system 1130; however, as described above, an illumination system for the first imaging unit 1110 and an illumination system for the second imaging unit 1120 may be provided separately. Some non-limiting specific examples of the illumination system 1130 will be described later.
The longitudinal direction of the beam shape (beam cross-sectional shape) of the slit light is aligned with a predetermined direction, for example, the Y direction. The movement mechanism 1400 is configured to move the first imaging unit 1110, the second imaging unit 1120, and the illumination system 1130 as a unit. The movement direction is, for example, a direction orthogonal to the longitudinal direction of the beam shape of the slit light, for example, the X direction. This makes it possible to scan a three-dimensional region of the anterior segment of the subject's eye and collect a series of Scheimpflug images. For such a three-dimensional scan, see Patent Document 3 (Japanese Unexamined Patent Application Publication No. 2019-213733).
According to the three-dimensional scan of this aspect, a series of Scheimpflug images (a plurality of first images) collected by the first imaging unit 1110 and a series of Scheimpflug images (a plurality of second images) collected by the second imaging unit 1120 are obtained. In other words, the three-dimensional scan of this aspect yields a plurality of pairs of a first image and a second image, that is, a plurality of image pairs in which different parts of the anterior segment are respectively depicted.
Each Scheimpflug image (first image, second image) acquired by the ophthalmologic apparatus 1000A of this aspect is an image representing a cross section of the anterior segment expressed in a two-dimensional coordinate system spanned by a coordinate axis along the longitudinal direction of the beam shape of the slit light and a coordinate axis along the projection direction of the slit light onto the subject's eye. When the longitudinal direction of the beam shape of the slit light is the Y direction and the projection direction of the slit light is the Z direction, the ophthalmologic apparatus 1000A of this aspect can acquire image pairs such as the pair of the first image 2100 and the second image 2200 shown in FIG. 4A (the pair of the first image 2300 and the second image 2400 shown in FIG. 5A).
Furthermore, the ophthalmologic apparatus 1000A of this aspect can collect a plurality of image pairs respectively corresponding to a plurality of cross sections arranged in the X direction by moving the first imaging unit 1110, the second imaging unit 1120, and the illumination system 1130 in the X direction to perform a three-dimensional scan.
The evaluation processing unit 1300 can generate evaluation information of floating objects in the aqueous humor based on each of the plurality of image pairs collected by the three-dimensional scan. This makes it possible to generate evaluation information of floating objects in the aqueous humor for a plurality of cross sections (for example, YZ cross sections) of the anterior segment, that is, to generate evaluation information of floating objects in the aqueous humor in a three-dimensional region of the subject's eye.
Furthermore, the evaluation processing unit 1300 can generate a three-dimensional distribution of a predetermined evaluation value regarding floating objects in the aqueous humor based on the plurality of pieces of evaluation information for the plurality of cross sections of the anterior segment generated from the plurality of pairs. This evaluation value may be any type of evaluation value described in this disclosure. Visual information (for example, a color map) can be created from the generated three-dimensional distribution and displayed. The generated three-dimensional distribution can also be analyzed.
The ophthalmologic apparatus 1000A of this aspect can also perform, in parallel, the first imaging under the first imaging condition by the first imaging unit 1110 and the second imaging under the second imaging condition by the second imaging unit 1120 while the illumination system 1130 projects the slit light onto the anterior segment.
Furthermore, in the ophthalmologic apparatus 1000A of this aspect, the evaluation processing unit 1300 can generate evaluation information of the floating object in the aqueous humor based on the pair of the first image and the second image generated by the first imaging and the second imaging performed together with the projection of the slit light.
More specifically, the floating object image detection unit 1310 of the ophthalmologic apparatus 1000A of this aspect can detect a first floating object image from the first image and a second floating object image from the second image. Furthermore, based on the first floating object image detected from the first image and the second floating object image detected from the second image, the evaluation information generation unit 1320 of the ophthalmologic apparatus 1000A of this aspect can estimate the movement amount of the floating object in the aqueous humor in a first direction (for example, the Z direction), which is the projection direction of the slit light applied to the anterior segment by the illumination system 1130, and the movement amount of the floating object in a second direction (for example, the Y direction), which is the longitudinal direction of the beam shape of the slit light.
This makes it possible to detect the movement state of the floating object in the aqueous humor in the first direction and the movement state of the floating object in the second direction. Furthermore, it is possible to detect the movement state of the floating object in a cross section parallel to the plane defined by the first direction and the second direction (for example, the YZ plane), that is, the movement vector of the floating object in that cross section.
The process of estimating the movement amount in the first direction and the process of estimating the movement amount in the second direction can be performed, for example, by the method described above; however, the estimation method is not limited to this. For example, the movement amount in the first direction and the movement amount in the second direction can be estimated based on the displacement between a feature point of the first floating object image detected from the first image and a feature point of the second floating object image detected from the second image, as in the sketch below. As described above, the feature point of a floating object image may be any representative point of the floating object image, for example, the center, the centroid, or one or more points on the boundary.
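As a non-limiting illustration, the displacement-based variant mentioned above might look as follows, assuming that the two feature points have been expressed in a common registered coordinate frame and that the image rows and columns correspond to the Y and Z axes with known pixel pitches; all names and the axis assignment are hypothetical.

```python
import numpy as np

def displacement_yz(feature1: np.ndarray, feature2: np.ndarray,
                    pitch_y: float, pitch_z: float) -> tuple[float, float]:
    """Estimate the movement amounts (dy, dz) from matched feature points.

    feature1, feature2 -- (row, column) coordinates of the feature points of
    the first and second floating object images in a common frame.
    pitch_y, pitch_z   -- physical size of one pixel along Y and Z.
    """
    d_row, d_col = feature2 - feature1
    return float(d_row * pitch_y), float(d_col * pitch_z)
```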
The evaluation information generation unit 1320 of the ophthalmologic apparatus 1000A of this aspect may be configured to estimate the movement amount of the floating object in the aqueous humor in a third direction (for example, the X direction) orthogonal to both the first direction (for example, the Z direction) and the second direction (for example, the Y direction).
FIG. 7A shows a first image 2500 and a second image 2600 processed in this example. In this example, the first direction is the Z direction, the second direction is the Y direction, and the third direction is the X direction. The floating object image detection unit 1310 detects a first floating object image 2510 in the first image 2500 and a second floating object image 2610 in the second image 2600.
The evaluation information generation unit 1320 can estimate the movement vector of the floating object in the aqueous humor within the YZ plane, that is, the movement amount Δy of the floating object in the Y direction and the movement amount Δz of the floating object in the Z direction, based on the first floating object image 2510 detected from the first image 2500 and the second floating object image 2610 detected from the second image 2600, for example, by any of the methods described in this disclosure.
The evaluation information generation unit 1320 can also estimate the movement amount Δx of the floating object in the aqueous humor in the X direction based on the second floating object image 2610 detected from the second image 2600. In some exemplary aspects, the evaluation information generation unit 1320 may be configured to estimate the movement amount Δx in the X direction based on the degree of blurring of the second floating object image 2610.
An example of the process of estimating the movement amount Δx of the floating object in the aqueous humor in the X direction will now be described. Assume that the objective lens of the second optical system 1121 of the second imaging unit 1120 has a numerical aperture NA. The evaluation information generation unit 1320 can estimate the movement amount Δx of the floating object in the X direction based on the dimension of the second floating object image 2610 detected from the second image 2600 generated by the second imaging unit 1120 and the numerical aperture NA of the objective lens of the second imaging unit 1120.
The dimension of the second floating object image 2610 may be, for example, the width w of the second floating object image 2610. The width w may be, for example, the dimension of the second floating object image 2610 in the direction orthogonal to the major axis 2220 of the second floating object image 2210 shown in FIG. 4C, in other words, the length of the minor axis of the elliptical second floating object image 2210 (see FIG. 7B). Even when the second floating object image 2610 is not elliptical, its dimension (width) can be determined in a similar manner.
Based on the width w of the second floating object image 2610 and the numerical aperture NA of the objective lens of the second imaging unit 1120, the evaluation information generation unit 1320 can calculate the movement amount Δx in the X direction by the following formula: Δx = w / (2 × NA). Here, it is assumed that the imaging unit 1100 (the second imaging unit 1120) has sufficient resolution in the depth direction (Z direction).
Furthermore, the evaluation information generation unit 1320 can calculate the three-dimensional movement amount D of the floating object in the aqueous humor by the following formula: D = (Δx^2 + Δy^2 + Δz^2)^(1/2). The evaluation information generation unit 1320 can also obtain the three-dimensional movement vector (X, Y, Z) = (Δx, Δy, Δz) of the floating object in the aqueous humor.
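As a non-limiting illustration, the formulas Δx = w / (2 × NA) and D = (Δx^2 + Δy^2 + Δz^2)^(1/2) might be combined as in the following sketch; the function names are hypothetical.

```python
import math

def axial_shift(width_w: float, na: float) -> float:
    """Estimate the X-direction movement amount: dx = w / (2 * NA)."""
    return width_w / (2.0 * na)

def movement_3d(dx: float, dy: float, dz: float) -> tuple[float, tuple[float, float, float]]:
    """Return the 3D movement amount D and the movement vector (dx, dy, dz)."""
    d = math.sqrt(dx**2 + dy**2 + dz**2)
    return d, (dx, dy, dz)
```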
Furthermore, the evaluation information generation unit 1320 can also calculate the movement speed of the floating object in the aqueous humor. For example, based on the second exposure time b of the second image sensor 1122 of the second imaging unit 1120, the movement amount Δx of the floating object in the X direction, the movement amount Δy in the Y direction, and the movement amount Δz in the Z direction, the evaluation information generation unit 1320 can calculate the movement speed V1 of the floating object by the following formula: V1 = D/b = [(Δx^2 + Δy^2 + Δz^2)^(1/2)]/b. This movement speed V1 is an estimate of the magnitude (speed) of the three-dimensional movement of the floating object during the exposure period of the second imaging unit 1120 (the second exposure period). The evaluation information generation unit 1320 can also obtain the three-dimensional velocity vector (vx, vy, vz) = (Δx/b, Δy/b, Δz/b) of the floating object in the aqueous humor.
As another example, based on the first exposure time a of the first image sensor 1112 of the first imaging unit 1110, the second exposure time b of the second image sensor 1122 of the second imaging unit 1120, and the movement amounts Δx, Δy, and Δz of the floating object in the X, Y, and Z directions, the evaluation information generation unit 1320 can calculate the movement speed V2 of the floating object by the following formula: V2 = D/(b - a) = [(Δx^2 + Δy^2 + Δz^2)^(1/2)]/(b - a). This movement speed V2 is an estimate of the magnitude (speed) of the three-dimensional movement of the floating object during the period from the end of the exposure period of the first imaging unit 1110 (the first exposure period) to the end of the exposure period of the second imaging unit 1120 (the second exposure period). The evaluation information generation unit 1320 can also obtain the three-dimensional velocity vector (vx, vy, vz) = (Δx/(b - a), Δy/(b - a), Δz/(b - a)).
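As a non-limiting illustration, the two speed estimates might be computed together as in the following sketch, assuming the exposure times a and b are given in seconds with a < b; the function name is hypothetical.

```python
import math

def movement_speeds(dx: float, dy: float, dz: float,
                    a: float, b: float) -> tuple[float, float]:
    """Return (V1, V2) as defined above.

    V1 = D / b       -- speed over the second exposure period
    V2 = D / (b - a) -- speed from the end of the first exposure period
                        to the end of the second exposure period
    """
    d = math.sqrt(dx**2 + dy**2 + dz**2)
    return d / b, d / (b - a)
```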
In this way, the evaluation information generation unit 1320 can estimate the movement speed of the floating object in the aqueous humor based on the first floating object image and the second floating object image detected from the first image and the second image, respectively, by the floating object image detection unit 1310.
An evaluation processing unit 1300 according to some exemplary aspects will now be described. As shown in FIG. 8, the evaluation processing unit 1300 of this aspect includes a type identification unit 1330 in addition to a floating object image detection unit 1310 and an evaluation information generation unit 1320 similar to those of the aspect of FIG. 3.
The type identification unit 1330 is configured to identify the type of the floating object in the aqueous humor depicted in the images acquired by the imaging unit 1100 (the first image generated by the first imaging unit 1110 under the first imaging condition and the second image generated by the second imaging unit 1120 under the second imaging condition). Types of floating objects in the aqueous humor include inflammatory cells, white blood cells (macrophages, lymphocytes, etc.), and blood proteins. The function of the type identification unit 1330 is realized by cooperation between software such as a type identification program and hardware such as a processor.
When a plurality of floating object images are detected from an image, the type identification unit 1330 can identify, for each of the detected floating object images, the type of the floating object in the aqueous humor corresponding to that floating object image. Furthermore, the type identification unit 1330 can classify the plurality of floating object images based on the result of the type identification.
In some exemplary aspects, the type identification unit 1330 may be configured to identify the type of the floating object in the aqueous humor depicted in the image acquired by the imaging unit 1100 based on characteristics of the floating object image detected from that image by the floating object image detection unit 1310. Examples of the characteristics of the floating object image used as criteria for type identification in this example include the brightness, dimensions, shape, distribution state (number, density, position, etc.), and movement state (movement direction, movement amount, movement speed, etc.) of the floating object image.
In some exemplary aspects, the type identification unit 1330 may be configured to identify the type of the floating object in the aqueous humor depicted in the anterior segment image of a subject (patient) based on the subject's medical data. The medical data referred to in the type identification of this example includes disease names (diagnosis names, suspected disease names, etc.), examination data, physician's findings, and the like. The medical data is obtained, for example, from subject data stored in a database (electronic medical records, reports, medical documents, etc.).
In some exemplary aspects, the type identification unit 1330 may be configured to identify the type of the floating object in the aqueous humor based on images acquired by imaging using polarized light. Imaging using polarized light is performed, for example, by projecting illumination light having a first polarization direction onto the subject's eye and extracting and detecting a component having a second polarization direction from the light returning from the subject's eye.
By performing imaging with the first polarization direction and the second polarization direction parallel to each other, it is possible to detect the component whose polarization direction was not changed by reflection at the subject's eye (specular reflection component). A specular reflection image of the anterior segment is thereby obtained.
By performing imaging with the first polarization direction and the second polarization direction orthogonal to each other, it is possible to detect the component whose polarization direction was changed by reflection at the subject's eye (diffuse reflection component). A diffuse reflection image of the anterior segment is thereby obtained.
The evaluation processing unit 1300 of this aspect may be configured to identify the type of the floating object in the aqueous humor based on the intensity ratio between the specular reflection image and the diffuse reflection image. First, the evaluation processing unit 1300 of this aspect uses the floating object image detection unit 1310 to detect a floating object image from the specular reflection image and a floating object image from the diffuse reflection image.
If necessary, the type identification unit 1330 performs registration between the specular reflection image and the diffuse reflection image and, based on the result of this registration, associates the positions (coordinates) of one or more floating object images identified from the specular reflection image with the positions (coordinates) of one or more floating object images identified from the diffuse reflection image. In this way, a floating object image in the specular reflection image and a floating object image in the diffuse reflection image corresponding to the same floating object in the aqueous humor are identified and associated with each other.
Next, the type identification unit 1330 determines the intensity value of the floating object image in the specular reflection image corresponding to one floating object in the aqueous humor and the intensity value of the floating object image in the diffuse reflection image corresponding to the same floating object. The intensity values are determined based on pixel values.
For example, the intensity value may be any statistic calculated from the pixel values in the floating object image, such as the mean, variance, standard deviation, maximum, minimum, mode, or median.
Next, the type identification unit 1330 compares the intensity value of the floating object image in the specular reflection image corresponding to one floating object in the aqueous humor (specular reflection floating object image) with the intensity value of the floating object image in the diffuse reflection image corresponding to the same floating object (diffuse reflection floating object image).
In some exemplary aspects, the type identification unit 1330 calculates the ratio T1/T2 between the intensity T1 of the specular reflection floating object image and the intensity T2 of the diffuse reflection floating object image, and compares this ratio T1/T2 with a predetermined threshold TH. For example, the type identification unit 1330 may be configured to presume that the floating object is a macrophage when the absolute value abs(T1/T2) of the ratio T1/T2 is greater than or equal to the threshold TH, and to presume that the floating object is a lymphocyte when the absolute value abs(T1/T2) is less than the threshold TH.
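As a non-limiting illustration, this thresholding rule might be expressed as in the following sketch; the function name and the string labels are hypothetical.

```python
def classify_by_reflection_ratio(t1: float, t2: float, th: float) -> str:
    """Classify a floating object from the specular/diffuse intensity ratio.

    t1 -- intensity of the specular reflection floating object image
    t2 -- intensity of the diffuse reflection floating object image
    th -- predetermined threshold TH
    """
    if abs(t1 / t2) >= th:
        return "macrophage"
    return "lymphocyte"
```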
The evaluation information generation unit 1320 of the evaluation processing unit 1300 of this aspect can generate evaluation information according to the type of the floating object in the aqueous humor identified by the type identification unit 1330. For example, the evaluation information generation unit 1320 may be configured to generate evaluation information for each type of floating object in the aqueous humor.
In some exemplary aspects, the type identification unit 1330 may be configured to refer to color information of the image in addition to, or instead of, the above intensity values. When the first image acquired by the first imaging unit 1110 and/or the second image acquired by the second imaging unit 1120 is a color image, the type identification unit 1330 may be configured, for example, to identify the type of the floating object in the aqueous humor using at least one of the three color component images of the color image (R component image, G component image, B component image), to identify the type using information generated from the three color component images (for example, a luminance signal value (Y)), or to identify the type after converting the color image into a monochrome image.
The type identification method executed by the type identification unit 1330 is not limited to the above examples. For example, the type identification unit 1330 can identify the type of the floating object in the aqueous humor based on an evaluation of reflection wavelength characteristics using optical coherence tomography (OCT). A type identification method of this kind is disclosed, for example, in the following document: RUOBING QIAN, RYAN P. MCNABB, KEVIN C. ZHOU, HAZEM M. MOUSA, DANIEL R. SABAN, VICTOR L. PEREZ, ANTHONY N. KUO, AND JOSEPH A. IZATT, "In vivo quantitative analysis of anterior chamber white blood cell mixture composition using spectroscopic optical coherence tomography", Biomedical Optics Express, Vol. 12, No. 4, 1 April 2021, pp. 2134-2148. When this type identification method is employed, the ophthalmologic apparatus may include, for example, a known OCT scanner in addition to the configuration shown in FIG. 1 and the configuration shown in FIG. 8. Alternatively, the ophthalmologic apparatus may be configured to apply this type identification method to OCT images acquired by a separately provided OCT scanner.
Several examples of the operation of the ophthalmologic apparatus according to the embodiment will be described. Two or more operation examples may be at least partially combined. Matters described in one operation example may be applied to another operation example. Any matter described in this disclosure, or an equivalent thereof, may be combined with each operation example.
A first operation example will be described with reference to FIG. 9. In this operation example, the ophthalmologic apparatus 1000 first applies, in parallel, the first imaging under the first imaging condition and the second imaging under the second imaging condition to the anterior segment of the subject's eye (S1).
The first imaging is performed by the first imaging unit 1110, and the second imaging is performed by the second imaging unit 1120. The parallel execution of the first imaging and the second imaging is realized by synchronization control executed by the control unit 1200.
Furthermore, the ophthalmologic apparatus 1000 generates evaluation information regarding floating objects present in the aqueous humor (in the anterior chamber) of the anterior segment of the subject's eye, based on the first image and the second image respectively generated by the first imaging and the second imaging performed in parallel in step S1 (S2).
The evaluation information generated in step S2 may include, for example, the movement direction, movement amount, movement vector, movement speed, and movement acceleration of the floating object in the aqueous humor.
The evaluation information generated in step S2 has various uses. For example, the evaluation information may be stored in a storage device (not shown), and/or provided for the generation of visual information displayed on a display device (not shown), and/or provided for analysis processing by a computer (not shown).
A second operation example will be described with reference to FIG. 10. In this operation example, the ophthalmologic apparatus 1000 first applies, in parallel, the first imaging under the first imaging condition and the second imaging under the second imaging condition to the anterior segment of the subject's eye, in the same manner as step S1 of the first operation example (S11).
Next, the ophthalmologic apparatus 1000 detects a first floating object image corresponding to a floating object in the aqueous humor from the first image generated by the first imaging in step S11 (S12), and detects a second floating object image corresponding to the same floating object from the second image generated by the second imaging in step S11 (S13).
Note that the step of detecting the first floating object image and the step of detecting the second floating object image may be performed in any order, or these steps may be performed in parallel.
Next, the ophthalmologic apparatus 1000 generates evaluation information of the floating object in the aqueous humor based on the first floating object image detected from the first image in step S12 and the second floating object image detected from the second image in step S13 (S14). The generated evaluation information is provided for various uses.
A third operation example will be described with reference to FIG. 11. In this operation example, the ophthalmologic apparatus 1000 first applies a scan to a three-dimensional region of the anterior segment of the subject's eye (S21).
The three-dimensional scan in step S21 is realized by combining the parallel execution of the first imaging under the first imaging condition and the second imaging under the second imaging condition with the movement of the imaging position. The parallel execution of the first imaging and the second imaging is realized in the same manner as step S1 of the first operation example. The movement of the imaging position is realized by the movement mechanism 1400 moving the imaging unit 1100 (the first optical system 1111 and the second optical system 1121) under the control of the control unit 1200.
The three-dimensional scan in step S21 collects, from the three-dimensional region of the anterior segment of the subject's eye, a plurality of pairs of a first image and a second image respectively generated by the first imaging and the second imaging performed in parallel. That is, the three-dimensional scan in step S21 generates a plurality of image pairs (pairs of a first image and a second image) respectively depicting the states of a plurality of cross sections in the three-dimensional region of the anterior segment of the subject's eye.
Next, for each of the plurality of imaging positions (scan positions), the ophthalmologic apparatus 1000 generates evaluation information of the floating objects in the aqueous humor present in the anterior segment cross section corresponding to that imaging position, based on the pair of the first image and the second image corresponding to that imaging position (S22). A plurality of pieces of evaluation information respectively corresponding to the plurality of imaging positions are thereby obtained.
Next, based on the plurality of pieces of evaluation information generated in step S22, the ophthalmologic apparatus 1000 generates a three-dimensional distribution of a predetermined evaluation value regarding the floating objects in the aqueous humor present in the three-dimensional region of the anterior segment to which the three-dimensional scan of step S21 was applied (S23).
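As a non-limiting illustration, step S23 might assemble the three-dimensional distribution as in the following sketch, under the assumption that each scan position yields a 2D map of a scalar evaluation value (for example, movement speed) over its YZ cross section; the function name and data layout are hypothetical.

```python
import numpy as np

def build_3d_distribution(section_maps: list[np.ndarray]) -> np.ndarray:
    """Stack per-cross-section evaluation maps into a 3D distribution.

    section_maps -- one 2D array (Y x Z) of the evaluation value for each
    X-direction scan position, ordered by scan position.
    Returns an array indexed as (x, y, z).
    """
    if not section_maps:
        raise ValueError("no cross-section evaluation maps supplied")
    return np.stack(section_maps, axis=0)
```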
 ステップS23で生成された3次元分布を様々な用途に提供することができる。例えば、3次元分布は、図示しない記憶装置に保存され、及び/又は、図示しない表示装置に表示される視覚的情報の生成に提供され、及び/又は、図示しないコンピュータによる解析処理に提供される。同様に、ステップS22で生成された複数の評価情報についても様々な用途に提供することができる。 The three-dimensional distribution generated in step S23 can be provided for various purposes. For example, the three-dimensional distribution is stored in a storage device (not shown), and/or provided to generate visual information displayed on a display device (not shown), and/or provided to analysis processing by a computer (not shown). . Similarly, the plural pieces of evaluation information generated in step S22 can also be provided for various purposes.
 図12を参照して第4の動作例を説明する。本動作例において、眼科装置1000は、まず、第1の動作例のステップS1と同じ要領で、第1の撮影条件での第1の撮影と第2の撮影条件での第2の撮影とを並行して被検眼の前眼部に適用する(S31)。 A fourth operation example will be described with reference to FIG. 12. In this operational example, the ophthalmological apparatus 1000 first performs a first imaging under a first imaging condition and a second imaging under a second imaging condition in the same manner as step S1 of the first operational example. In parallel, it is applied to the anterior segment of the eye to be examined (S31).
 次に、眼科装置1000は、ステップS31における第1の撮影及び第2の撮影でそれぞれ生成された第1の画像及び第2の画像に描出されている房水内浮遊物の種別を特定する(S32)。 Next, the ophthalmological apparatus 1000 specifies the type of floating matter in the aqueous humor depicted in the first image and second image generated in the first imaging and second imaging in step S31 ( S32).
 次に、眼科装置1000は、ステップS32で特定された房水内浮遊物の種別に応じて、房水内浮遊物の評価情報の生成を行う(S33)。 Next, the ophthalmological apparatus 1000 generates evaluation information of the floating matter in the aqueous humor according to the type of the floating matter in the aqueous humor identified in step S32 (S33).
 ステップS33において房水内浮遊物の種別に応じて生成された評価情報は、様々な用途に提供される。例えば、本動作例で生成された評価情報は、房水内浮遊物の種別に応じて保存され、及び/又は、房水内浮遊物の種別に応じた視覚的情報の生成に提供され、及び/又は、房水内浮遊物の種別に応じた解析処理に提供される。 The evaluation information generated in step S33 according to the type of suspended matter in the aqueous humor is provided for various purposes. For example, the evaluation information generated in this operation example is stored according to the type of floating matter in the aqueous humor, and/or provided to generate visual information according to the type of floating matter in the aqueous humor, and /Or provided for analysis processing according to the type of suspended matter in the aqueous humor.
 実施形態に係る眼科装置のより具体的な構成の1つの例を図13に示す。図13は上面図である。 FIG. 13 shows one example of a more specific configuration of the ophthalmologic apparatus according to the embodiment. FIG. 13 is a top view.
 被検眼Eの軸に沿う方向をZ方向とし、これに直交する方向のうち被検者にとって左右の方向をX方向とし、X方向及びZ方向の双方に直交する方向(上下方向、体軸方向)をY方向とする。 The direction along the axis of the eye E to be examined is the Z direction, and among the directions perpendicular to this, the left and right directions for the examinee are the X direction, and the directions perpendicular to both the X direction and the Z direction (vertical direction, body axis direction) ) is the Y direction.
 本例の眼科装置は、特許文献3(特開2019-213733号公報)に開示されているものと同様の構成を有するスリットランプ顕微鏡システム1であり、照明光学系2と、撮影光学系3と、動画撮影光学系4と、光路結合素子5と、移動機構6と、制御部7と、データ処理部8と、通信部9と、ユーザーインターフェイス10とを含む。 The ophthalmological apparatus of this example is a slit lamp microscope system 1 having a configuration similar to that disclosed in Patent Document 3 (Japanese Patent Laid-Open No. 2019-213733), and includes an illumination optical system 2 and a photographing optical system 3. , a moving image photographing optical system 4, an optical path coupling element 5, a moving mechanism 6, a control section 7, a data processing section 8, a communication section 9, and a user interface 10.
 被検眼Eの角膜を符号Cで示し、水晶体を符号CLで示す。前房は、角膜Cと水晶体CLとの間の領域(角膜Cと虹彩との間の領域)に相当する。 The cornea of the eye E to be examined is indicated by the symbol C, and the crystalline lens is indicated by the symbol CL. The anterior chamber corresponds to the area between the cornea C and the crystalline lens CL (the area between the cornea C and the iris).
 スリットランプ顕微鏡システム1の各要素の詳細については、特許文献3(特開2019-213733号公報)を参照されたい。 For details of each element of the slit lamp microscope system 1, please refer to Patent Document 3 (Japanese Patent Application Laid-Open No. 2019-213733).
 照明光学系2は、被検眼Eの前眼部にスリット光を投射する。符号2aは、照明光学系2の光軸(照明光軸)を示す。 The illumination optical system 2 projects a slit light onto the anterior segment of the eye E to be examined. Reference numeral 2a indicates the optical axis (illumination optical axis) of the illumination optical system 2.
 撮影光学系3は、照明光学系2からのスリット光が投射されている前眼部を撮影する。符号3aは、撮影光学系3の光軸(撮影光軸)を示す。光学系3Aは、スリット光が投射されている被検眼Eの前眼部からの光を撮像素子3Bに導く。撮像素子3Bは、光学系3Aにより導かれた光を撮像面にて受光する。撮像素子3Bは、2次元の撮像エリアを有するエリアセンサ(CCDエリアセンサ、CMOSエリアセンサなど)を含む。図13では図示が省略されているが、撮影光学系3は2つの撮影光学系(第1の撮影部及び第2の撮影部)を含んでいる。2つの撮影光学系については図14を参照して後述する。 The photographing optical system 3 photographs the anterior segment of the eye onto which the slit light from the illumination optical system 2 is projected. Reference numeral 3a indicates the optical axis (photographing optical axis) of the photographing optical system 3. The optical system 3A guides light from the anterior segment of the subject's eye E onto which the slit light is projected to the image sensor 3B. The image sensor 3B receives the light guided by the optical system 3A on its imaging surface. The image sensor 3B includes an area sensor (CCD area sensor, CMOS area sensor, etc.) having a two-dimensional imaging area. Although not shown in FIG. 13, the photographing optical system 3 includes two photographing optical systems (a first photographing section and a second photographing section). The two photographing optical systems will be described later with reference to FIG. 14.
The illumination optical system 2 and the photographing optical system 3 function as a Scheimpflug camera. They are configured so that the object plane along the illumination optical axis 2a, the principal plane of the optical system 3A, and the imaging surface of the image sensor 3B satisfy the Scheimpflug condition; that is, the YZ plane passing through the illumination optical axis 2a (which contains the object plane), the principal plane of the optical system 3A, and the imaging surface of the image sensor 3B intersect along the same straight line.

This allows the illumination optical system 2 and the photographing optical system 3 to perform photography with at least the range from the posterior surface of the cornea C to the anterior surface of the crystalline lens CL (the anterior chamber) in focus. They can also perform photography with at least the range from the apex of the anterior corneal surface (Z = Z1) to the apex of the posterior surface of the crystalline lens CL (Z = Z2) in focus. The coordinate Z = Z0 denotes the intersection of the illumination optical axis 2a and the photographing optical axis 3a.
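The geometric content of the Scheimpflug condition is that three planes (the object plane, the lens principal plane, and the imaging surface) intersect along a single straight line. The following Python sketch is offered only as an illustration of that geometry; the function names and tolerance are assumptions, not part of the embodiment.

```python
import numpy as np

def plane_pair_line(p1, n1, p2, n2):
    """Line (point, unit direction) shared by two planes, each given
    by a point p on the plane and a normal vector n."""
    p1, n1, p2, n2 = (np.asarray(v, float) for v in (p1, n1, p2, n2))
    d = np.cross(n1, n2)
    if np.linalg.norm(d) < 1e-12:
        raise ValueError("planes are parallel")
    # One point on the line: solve n1.x = n1.p1, n2.x = n2.p2, d.x = 0.
    A = np.stack([n1, n2, d])
    b = np.array([n1 @ p1, n2 @ p2, 0.0])
    return np.linalg.solve(A, b), d / np.linalg.norm(d)

def scheimpflug_holds(object_plane, lens_plane, image_plane, tol=1e-9):
    """True if the three planes meet along one common straight line.
    Each argument is a (point, normal) pair of 3-vectors."""
    point, direction = plane_pair_line(*object_plane, *lens_plane)
    p3, n3 = (np.asarray(v, float) for v in image_plane)
    # The shared line lies in the third plane iff one of its points is
    # on that plane and its direction is perpendicular to the normal.
    return abs(n3 @ (point - p3)) < tol and abs(n3 @ direction) < tol
```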
The video photographing optical system 4 is a video camera that captures moving images of the anterior segment of the subject's eye E in parallel with the photography performed by the illumination optical system 2 and the photographing optical system 3. The optical path coupling element 5 couples the optical path of the illumination optical system 2 (illumination optical path) with the optical path of the video photographing optical system 4 (video photographing optical path).

FIG. 14 shows a specific example of an optical system that includes the illumination optical system 2, the photographing optical system 3, the video photographing optical system 4, and the optical path coupling element 5. The optical system shown in FIG. 14 includes an illumination optical system 20 as an example of the illumination optical system 2, a left photographing optical system 30L and a right photographing optical system 30R as examples of the photographing optical system 3, a video photographing optical system 40 as an example of the video photographing optical system 4, and a beam splitter 47 as an example of the optical path coupling element 5.

Reference symbol 20a denotes the optical axis (illumination optical axis) of the illumination optical system 20, reference symbol 30La the optical axis (left photographing optical axis) of the left photographing optical system 30L, and reference symbol 30Ra the optical axis (right photographing optical axis) of the right photographing optical system 30R. The angle θL is the angle between the illumination optical axis 20a and the left photographing optical axis 30La, and the angle θR is the angle between the illumination optical axis 20a and the right photographing optical axis 30Ra. The coordinate Z = Z0 denotes the intersection of the illumination optical axis 20a, the left photographing optical axis 30La, and the right photographing optical axis 30Ra.
The movement mechanism 6 moves the illumination optical system 20, the left photographing optical system 30L, and the right photographing optical system 30R in the direction indicated by the arrow 49 (the X direction).

The illumination light source 21 of the illumination optical system 20 outputs illumination light (e.g., visible light), and the positive lens 22 refracts this light. The slit forming unit 23 forms a slit that passes part of the illumination light. The slit light thus generated is refracted by the objective lens groups 24 and 25, reflected by the beam splitter 47, and projected onto the anterior segment of the subject's eye E.
The reflector 31L and the imaging lens 32L of the left photographing optical system 30L guide light from the anterior segment onto which the slit light is projected by the illumination optical system 20 (light traveling toward the left photographing optical system 30L) to the image sensor 33L, which receives the guided light on its imaging surface 34L.

The left photographing optical system 30L performs repeated photography in parallel with the movement of the illumination optical system 20, the left photographing optical system 30L, and the right photographing optical system 30R by the movement mechanism 6, whereby a plurality of anterior segment images (a series of Scheimpflug images) is obtained.

The object plane along the illumination optical axis 20a, the optical system including the reflector 31L and the imaging lens 32L, and the imaging surface 34L satisfy the Scheimpflug condition. The right photographing optical system 30R has a similar configuration and function.
Scheimpflug image collection by the left photographing optical system 30L and Scheimpflug image collection by the right photographing optical system 30R are performed in parallel with each other. For example, the left photographing optical system 30L is used for the first imaging under the first imaging condition while the right photographing optical system 30R is used for the second imaging under the second imaging condition, or, conversely, the left photographing optical system 30L is used for the second imaging under the second imaging condition while the right photographing optical system 30R is used for the first imaging under the first imaging condition.

The control unit 7 can synchronize the repeated photography by the left photographing optical system 30L with the repeated photography by the right photographing optical system 30R. A correspondence is thereby established between the series of Scheimpflug images obtained by the left photographing optical system 30L and the series obtained by the right photographing optical system 30R.

Alternatively, the process of determining the correspondence between the plurality of anterior segment images obtained by the left photographing optical system 30L and the plurality obtained by the right photographing optical system 30R may be executed by the control unit 7 or the data processing unit 8.
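Although the disclosure leaves the implementation of this correspondence open, one simple realization is timestamp-based pairing of the synchronized frames. The sketch below is an illustrative assumption (it presumes each frame record carries a capture timestamp in seconds):

```python
from bisect import bisect_left

def pair_frames(left, right, tol_s=0.002):
    """Pair (timestamp, frame) records from the two Scheimpflug cameras.

    `left` and `right` are lists of (timestamp, frame) tuples sorted by
    timestamp; a pair is formed when the nearest right-frame timestamp
    lies within `tol_s` seconds of the left-frame timestamp.
    """
    right_ts = [t for t, _ in right]
    pairs = []
    for t, frame_l in left:
        i = bisect_left(right_ts, t)
        # Examine the neighbors around the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(right)]
        if not candidates:
            continue
        j = min(candidates, key=lambda k: abs(right_ts[k] - t))
        if abs(right_ts[j] - t) <= tol_s:
            pairs.append((frame_l, right[j][1]))
    return pairs
```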
The video photographing optical system 40 captures moving images of the anterior segment of the subject's eye E from a fixed position, in parallel with the photography by the left photographing optical system 30L and the right photographing optical system 30R. Light that has passed through the beam splitter 47 is reflected by the reflector 48 and enters the video photographing optical system 40, where it is refracted by the objective lens 41 and then focused by the imaging lens 42 onto the imaging surface of the image sensor 43 (an area sensor). The video photographing optical system 40 is used for monitoring the movement of the subject's eye E, for alignment, for tracking, for processing the collected Scheimpflug images, and so on.

Returning to FIG. 13, the movement mechanism 6 moves the illumination optical system 2 and the photographing optical system 3 integrally in the X direction.
The control unit 7 controls each part of the slit lamp microscope system 1. By executing control of the illumination optical system 2, the photographing optical system 3, and the movement mechanism 6 in parallel with control of the video photographing optical system 4, the control unit 7 can cause a three-dimensional scan of the anterior segment (collection of a series of Scheimpflug images) and video photography of the anterior segment (collection of a series of time-series images) to be performed in parallel.

Furthermore, by executing these two sets of control in synchronization with each other, the control unit 7 can synchronize the three-dimensional scan of the anterior segment with the video photography of the anterior segment.

When the photographing optical system 3 includes the left photographing optical system 30L and the right photographing optical system 30R, the control unit 7 can synchronize the repeated photography (collection of a group of Scheimpflug images) by the left photographing optical system 30L with the repeated photography (collection of a group of Scheimpflug images) by the right photographing optical system 30R. This makes it possible to combine, in parallel and in synchronization, the paired operation of the first imaging under the first imaging condition and the second imaging under the second imaging condition with the movement of the imaging position.
The control unit 7 includes a processor, a storage device, and the like. The storage device stores computer programs such as various control programs, and the functions of the control unit 7 are realized by cooperation between software such as these control programs and hardware such as the processor. The control unit 7 controls the illumination optical system 2, the photographing optical system 3, and the movement mechanism 6 so as to scan a three-dimensional region of the anterior segment of the subject's eye E with the slit light; for details of this control, refer to Patent Document 3 (Japanese Unexamined Patent Application Publication No. 2019-213733). The control unit 7 has the functions of the control unit 1200 of the ophthalmologic apparatus 1000, and its functions are not limited to those described here.

The data processing unit 8 executes various kinds of data processing. It includes a processor, a storage device, and the like; the storage device stores computer programs such as various data processing programs, and the functions of the data processing unit 8 are realized by cooperation between software such as these data processing programs and hardware such as the processor. The data processing unit 8 has the functions of the evaluation processing unit 1300 of the ophthalmologic apparatus 1000, and its functions are not limited to these.

The communication unit 9 performs data communication between the slit lamp microscope system 1 and other apparatuses. The user interface 10 includes arbitrary user interface devices such as a display device and an operation device.

The slit lamp microscope system 1 shown in FIGS. 13 and 14 is a non-limiting example, and the configuration for implementing the ophthalmologic apparatus 1000 (1000A) is not limited to the slit lamp microscope system 1.
Some non-limiting features of the ophthalmologic apparatus according to the embodiments will now be described.
A first example aspect of the ophthalmologic apparatus according to the embodiments is an ophthalmologic apparatus including: a first imaging unit that includes a first optical system satisfying the Scheimpflug condition and applies first imaging under a first imaging condition to the anterior segment of a subject's eye; a second imaging unit that includes a second optical system satisfying the Scheimpflug condition and applies, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment; and an evaluation processing unit that generates evaluation information on floating objects in the aqueous humor based on a first image generated by the first imaging and a second image generated by the second imaging.

A second example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the first example aspect, the following non-limiting features: the first optical system includes a first image sensor; the second optical system includes a second image sensor; the first imaging condition includes a first exposure time, which is the exposure time of the first image sensor; the second imaging condition includes a second exposure time, which is the exposure time of the second image sensor; and the second exposure time is longer than the first exposure time.

A third example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the second example aspect, the following non-limiting features: the evaluation processing unit includes a floating object image detection unit that detects a first floating object image in the first image and a second floating object image in the second image, both corresponding to the same floating object in the aqueous humor, and an evaluation information generation unit that generates evaluation information on that floating object based on the first floating object image and the second floating object image.
A fourth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the third example aspect, the following non-limiting feature: the evaluation information generation unit estimates the movement direction of the floating object in the aqueous humor based on the second floating object image.

A fifth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the third or fourth example aspect, the following non-limiting feature: the evaluation information generation unit estimates the movement vector of the floating object in the aqueous humor based on the first floating object image and the second floating object image.

A sixth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the fifth example aspect, the following non-limiting feature: the evaluation information generation unit determines a feature point of the first floating object image and estimates the movement vector based on that feature point and the second floating object image.

A seventh example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the sixth example aspect, the following non-limiting feature: the evaluation information generation unit identifies the position in the second floating object image that corresponds to the feature point of the first floating object image and estimates the movement vector based on that position.

An eighth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the seventh example aspect, the following non-limiting feature: the evaluation information generation unit estimates the movement vector based on the position in the second floating object image corresponding to the feature point of the first floating object image and on the timing relationship between the exposure period of the first image sensor and the exposure period of the second image sensor.
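As a concrete (and deliberately simplified) reading of the sixth to eighth example aspects: with a short first exposure the floater appears nearly point-like, so its centroid can serve as the feature point, while the long second exposure draws the floater as a streak. If the two exposures are assumed to start at the same instant and the motion is assumed uniform, the streak endpoint nearest the feature point marks the floater's position at the start of the long exposure, and the vector to the other endpoint is the in-plane displacement over the second exposure time. The following Python sketch illustrates only this simplified model; none of its names or conventions are prescribed by the embodiment.

```python
import numpy as np

def estimate_movement_vector(feature_pt, streak_start, streak_end, t2):
    """In-plane movement estimate for one floater.

    feature_pt               : centroid in the short-exposure image
    streak_start, streak_end : endpoints of the same floater's streak
                               in the long-exposure image (any order)
    t2                       : second (long) exposure time in seconds
    Returns (displacement, velocity) as 2-vectors in pixel units.
    """
    p = np.asarray(feature_pt, float)
    a = np.asarray(streak_start, float)
    b = np.asarray(streak_end, float)
    # Endpoint closest to the feature point = position at exposure start.
    near, far = (a, b) if np.linalg.norm(a - p) <= np.linalg.norm(b - p) else (b, a)
    displacement = far - near          # motion accrued over the long exposure
    return displacement, displacement / t2
```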
A ninth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of any of the third to eighth example aspects, the following non-limiting features: the apparatus further includes an illumination system that projects slit light onto the anterior segment; the first imaging and the second imaging are applied to the anterior segment onto which the slit light is projected; and the evaluation information generation unit estimates the movement amount of the floating object in the aqueous humor in a first direction, which is the projection direction of the slit light, and the movement amount of the floating object in the aqueous humor in a second direction, which is the longitudinal direction of the beam shape of the slit light.

A tenth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the ninth example aspect, the following non-limiting feature: the evaluation information generation unit estimates the movement amount of the floating object in the aqueous humor in a third direction orthogonal to both the first direction and the second direction.

An eleventh example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the tenth example aspect, the following non-limiting features: the second optical system includes an objective lens, and the evaluation information generation unit estimates the movement amount in the third direction based on a dimension of the second floating object image and the numerical aperture of the objective lens.

A twelfth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the eleventh example aspect, the following non-limiting feature: the evaluation information generation unit calculates the width of the second floating object image as that dimension.
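One way to read the eleventh and twelfth example aspects, under a simple geometric-optics assumption, is that the defocus blur diameter of a point grows roughly as twice the numerical aperture times the defocus distance, so the excess width of the floater image over its in-focus width can be attributed to motion along the depth direction. The sketch below encodes only this approximation; the formula is an assumption for illustration, not an equation fixed by the disclosure.

```python
def axial_movement_amount(streak_width, in_focus_width, numerical_aperture):
    """Rough |dz| estimate from defocus blur (geometric optics).

    Assumes the blur diameter of a point grows as ~2 * NA * dz in
    object space, so both widths and the returned value share one
    length unit (e.g. micrometres).
    """
    excess = max(streak_width - in_focus_width, 0.0)
    return excess / (2.0 * numerical_aperture)
```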
A thirteenth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of any of the tenth to twelfth example aspects, the following non-limiting feature: the evaluation information generation unit estimates the movement speed of the floating object in the aqueous humor based on the second exposure time of the second image sensor, the movement amount in the first direction, the movement amount in the second direction, and the movement amount in the third direction.

A fourteenth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of any of the tenth to thirteenth example aspects, the following non-limiting feature: the evaluation information generation unit estimates the movement speed of the floating object in the aqueous humor based on the first exposure time of the first image sensor, the second exposure time of the second image sensor, the movement amount in the first direction, the movement amount in the second direction, and the movement amount in the third direction.
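Combining the three per-axis movement amounts with the exposure timing yields a speed estimate. The sketch below assumes the displacements accrue over the long exposure and, when the first exposure time is also supplied, shortens the effective interval by it; this convention is one plausible reading of the thirteenth and fourteenth example aspects, not a formula fixed by the disclosure.

```python
import math

def movement_speed(dx, dy, dz, t2, t1=None):
    """Estimate the floater's speed from per-axis movement amounts.

    dx, dy, dz : movement amounts along the slit projection direction,
                 the slit's longitudinal direction, and the direction
                 orthogonal to both (same length unit each)
    t2         : long (second) exposure time in seconds
    t1         : optional short (first) exposure time; if given, the
                 displacement is assumed to span from the short exposure
                 to the end of the long one (an illustrative convention)
    """
    effective_time = t2 - t1 if t1 is not None else t2
    if effective_time <= 0:
        raise ValueError("exposure times are inconsistent")
    return math.sqrt(dx * dx + dy * dy + dz * dz) / effective_time
```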
A fifteenth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of any of the third to fourteenth example aspects, the following non-limiting feature: the evaluation information generation unit estimates the movement speed of the floating object in the aqueous humor based on the first floating object image and the second floating object image.

A sixteenth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of any of the third to fifteenth example aspects, the following non-limiting features: the floating object image detection unit detects, from the first image, a first image set including images of a plurality of floating objects in the aqueous humor, detects, from the second image, a second image set including images of a plurality of floating objects in the aqueous humor, and determines a positional correspondence between the first image set and the second image set; and the evaluation information generation unit generates, based on a pair consisting of one element of the first image set and one element of the second image set that are associated by the positional correspondence, evaluation information on the floating object corresponding to that pair.

A seventeenth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of any of the third to sixteenth example aspects, the following non-limiting features: the floating object image detection unit detects a floating object image from one of the first image and the second image, determines the coordinates of that floating object image in the one image, and detects, from the other image, a floating object image at the position corresponding to those coordinates; and the evaluation information generation unit generates, based on the pair consisting of the floating object image detected from the one image and the floating object image detected from the other image, evaluation information on the floating object corresponding to that pair.
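A straightforward realization of this coordinate-based pairing is a mutual nearest-neighbour search with a distance gate, sketched below. The centroid detector that would supply the coordinate arrays is assumed to exist upstream, and the gate radius is likewise an illustrative assumption.

```python
import numpy as np

def match_by_coordinates(coords_a, coords_b, max_dist=10.0):
    """Pair floater detections between the first and second images.

    coords_a, coords_b : arrays of shape (N, 2) and (M, 2) holding
    detected floater-image centroids in a shared coordinate system.
    Returns (i, j) index pairs that are mutual nearest neighbours and
    lie within max_dist pixels of each other.
    """
    a = np.asarray(coords_a, float)
    b = np.asarray(coords_b, float)
    if len(a) == 0 or len(b) == 0:
        return []
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)  # (N, M)
    pairs = []
    for i in range(len(a)):
        j = int(np.argmin(d[i]))
        if d[i, j] <= max_dist and int(np.argmin(d[:, j])) == i:
            pairs.append((i, j))
    return pairs
```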
An eighteenth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of any of the second to seventeenth example aspects, the following non-limiting features: the apparatus further includes a movement mechanism that moves the first optical system and the second optical system, and a control unit that causes the first imaging unit and the second imaging unit to generate a plurality of pairs of the first image and the second image by combining control of the first imaging unit, control of the second imaging unit, and control of the movement mechanism.

A nineteenth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the eighteenth example aspect, the following non-limiting feature: the evaluation processing unit generates evaluation information based on each of the plurality of pairs.

A twentieth example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the nineteenth example aspect, the following non-limiting feature: the evaluation processing unit generates a three-dimensional distribution of a predetermined evaluation value based on the plurality of pieces of evaluation information generated from the plurality of pairs.

A twenty-first example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of any of the second to twentieth example aspects, the following non-limiting feature: the evaluation processing unit includes a type identification unit that identifies the type of floating object in the aqueous humor depicted in the first image and the second image.

A twenty-second example aspect of the ophthalmologic apparatus according to the embodiments has, in addition to the non-limiting features of the twenty-first example aspect, the following non-limiting feature: the evaluation processing unit generates evaluation information according to the type of the floating object in the aqueous humor identified by the type identification unit.
As described in the present disclosure, an ophthalmologic apparatus having these non-limiting features makes it possible to improve the quality of the process of evaluating floating objects in the aqueous humor based on images acquired by ophthalmic imaging.

Those skilled in the art will understand that, by combining any of the matters described in the present disclosure with an ophthalmologic apparatus having any of these non-limiting features, it is possible to further improve the quality of aqueous humor floater evaluation and to provide various applications of such evaluation.
<Other embodiments>
Embodiments of the ophthalmologic apparatus have been described above, but embodiments according to the present disclosure are not limited to ophthalmologic apparatuses. Other embodiments include a method of controlling an ophthalmologic apparatus, a program, and a recording medium. Like the apparatus embodiments, these embodiments can also improve the quality of aqueous humor floater evaluation.
A control method according to one embodiment is a method of controlling an ophthalmologic apparatus.

This ophthalmologic apparatus includes a first imaging unit, a second imaging unit, and a processor. The first imaging unit includes a first optical system that satisfies the Scheimpflug condition, and the second imaging unit includes a second optical system that satisfies the Scheimpflug condition.

The method according to this embodiment is configured to cause the processor included in the ophthalmologic apparatus to function as a control unit and an evaluation processing unit.
The processor functioning as the control unit under the method according to this embodiment executes control of the first imaging unit so as to apply first imaging under a first imaging condition to the anterior segment of a subject's eye, and control of the second imaging unit so as to apply, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment.

The processor functioning as the evaluation processing unit under the method according to this embodiment generates evaluation information on floating objects in the aqueous humor based on the first image generated by the first imaging and the second image generated by the second imaging.

Any of the matters described in the present disclosure can be combined with the method according to this embodiment.
A program according to one embodiment is a program for operating an ophthalmologic apparatus.

This ophthalmologic apparatus includes a first imaging unit, a second imaging unit, and a processor. The first imaging unit includes a first optical system that satisfies the Scheimpflug condition, and the second imaging unit includes a second optical system that satisfies the Scheimpflug condition.

The program according to this embodiment is configured to cause the processor included in the ophthalmologic apparatus to function as a control unit and an evaluation processing unit.

The processor functioning as the control unit under the program according to this embodiment executes control of the first imaging unit so as to apply first imaging under a first imaging condition to the anterior segment of a subject's eye, and control of the second imaging unit so as to apply, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment.

The processor functioning as the evaluation processing unit under the program according to this embodiment generates evaluation information on floating objects in the aqueous humor based on the first image generated by the first imaging and the second image generated by the second imaging.

Any of the matters described in the present disclosure can be combined with the program according to this embodiment.
A recording medium according to one embodiment is a computer-readable non-transitory recording medium on which a program for operating an ophthalmologic apparatus is recorded.

This ophthalmologic apparatus includes a first imaging unit, a second imaging unit, and a processor. The first imaging unit includes a first optical system that satisfies the Scheimpflug condition, and the second imaging unit includes a second optical system that satisfies the Scheimpflug condition.

The program recorded on the recording medium according to this embodiment is configured to cause the processor included in the ophthalmologic apparatus to function as a control unit and an evaluation processing unit.

The processor functioning as the control unit under the program recorded on the recording medium according to this embodiment executes control of the first imaging unit so as to apply first imaging under a first imaging condition to the anterior segment of a subject's eye, and control of the second imaging unit so as to apply, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment.

The processor functioning as the evaluation processing unit under the program recorded on the recording medium according to this embodiment generates evaluation information on floating objects in the aqueous humor based on the first image generated by the first imaging and the second image generated by the second imaging.

Any of the matters described in the present disclosure can be combined with the recording medium according to this embodiment.

The computer-readable non-transitory recording medium usable as the recording medium according to this embodiment may be a recording medium of any form, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
Any of the matters described in the embodiments of the ophthalmologic apparatus can be combined with the embodiments other than the ophthalmologic apparatus.

For example, matters arbitrarily selected from the various matters described as optional aspects of the ophthalmologic apparatus according to the embodiments can be combined with the embodiments of the control method, the program, the recording medium, and so on.

Likewise, any of the matters described in the present disclosure can be combined with the embodiments of the control method of the ophthalmologic apparatus, the program, the recording medium, and so on.

The present disclosure presents several embodiments and some exemplary aspects thereof. These embodiments and aspects are merely illustrative of the present invention; accordingly, any modification (omission, substitution, addition, etc.) within the scope of the gist of the present invention may be applied to the embodiments and aspects presented in the present disclosure.
1 Slit lamp microscope system (ophthalmologic apparatus)
2 Illumination optical system
3 Photographing optical system
3B Image sensor
6 Movement mechanism
7 Control unit
20 Illumination optical system
30L Left photographing optical system
30R Right photographing optical system
33L, 33R Image sensors
1000, 1000A Ophthalmologic apparatus
1100 Imaging unit
1110 First imaging unit
1111 First optical system
1112 First image sensor
1120 Second imaging unit
1121 Second optical system
1122 Second image sensor
1130 Illumination system
1200 Control unit
1300 Evaluation processing unit
1310 Floating object image detection unit
1320 Evaluation information generation unit
1330 Type identification unit
1400 Movement mechanism

Claims (25)

1.  An ophthalmologic apparatus comprising:
    a first imaging unit that includes a first optical system satisfying a Scheimpflug condition and applies first imaging under a first imaging condition to an anterior segment of a subject's eye;
    a second imaging unit that includes a second optical system satisfying the Scheimpflug condition and applies, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment; and
    an evaluation processing unit that generates evaluation information on a floating object in aqueous humor based on a first image generated by the first imaging and a second image generated by the second imaging.

2.  The ophthalmologic apparatus of claim 1, wherein
    the first optical system includes a first image sensor,
    the second optical system includes a second image sensor,
    the first imaging condition includes a first exposure time that is an exposure time of the first image sensor,
    the second imaging condition includes a second exposure time that is an exposure time of the second image sensor, and
    the second exposure time is longer than the first exposure time.

3.  The ophthalmologic apparatus of claim 2, wherein the evaluation processing unit includes:
    a floating object image detection unit that detects a first floating object image in the first image and a second floating object image in the second image that correspond to a same floating object in the aqueous humor; and
    an evaluation information generation unit that generates evaluation information on that floating object based on the first floating object image and the second floating object image.
4.  The ophthalmologic apparatus of claim 3, wherein the evaluation information generation unit estimates a movement direction of the floating object in the aqueous humor based on the second floating object image.

5.  The ophthalmologic apparatus of claim 3, wherein the evaluation information generation unit estimates a movement vector of the floating object in the aqueous humor based on the first floating object image and the second floating object image.

6.  The ophthalmologic apparatus of claim 5, wherein the evaluation information generation unit determines a feature point of the first floating object image and estimates the movement vector based on the feature point of the first floating object image and the second floating object image.

7.  The ophthalmologic apparatus of claim 6, wherein the evaluation information generation unit identifies a position in the second floating object image that corresponds to the feature point of the first floating object image and estimates the movement vector based on the position corresponding to the feature point.

8.  The ophthalmologic apparatus of claim 7, wherein the evaluation information generation unit estimates the movement vector based on the position in the second floating object image corresponding to the feature point of the first floating object image and on a timing relationship between an exposure period of the first image sensor and an exposure period of the second image sensor.
9.  The ophthalmologic apparatus of claim 3, further comprising an illumination system that projects slit light onto the anterior segment, wherein
    the first imaging and the second imaging are applied to the anterior segment onto which the slit light is projected, and
    the evaluation information generation unit estimates a movement amount of the floating object in the aqueous humor in a first direction that is a projection direction of the slit light and a movement amount of the floating object in the aqueous humor in a second direction that is a longitudinal direction of a beam shape of the slit light.

10.  The ophthalmologic apparatus of claim 9, wherein the evaluation information generation unit estimates a movement amount of the floating object in the aqueous humor in a third direction orthogonal to both the first direction and the second direction.

11.  The ophthalmologic apparatus of claim 10, wherein the second optical system includes an objective lens, and the evaluation information generation unit estimates the movement amount in the third direction based on a dimension of the second floating object image and a numerical aperture of the objective lens.

12.  The ophthalmologic apparatus of claim 11, wherein the evaluation information generation unit calculates a width of the second floating object image as the dimension of the second floating object image.

13.  The ophthalmologic apparatus of claim 10, wherein the evaluation information generation unit estimates a movement speed of the floating object in the aqueous humor based on the second exposure time of the second image sensor, the movement amount in the first direction, the movement amount in the second direction, and the movement amount in the third direction.

14.  The ophthalmologic apparatus of claim 10, wherein the evaluation information generation unit estimates a movement speed of the floating object in the aqueous humor based on the first exposure time of the first image sensor, the second exposure time of the second image sensor, the movement amount in the first direction, the movement amount in the second direction, and the movement amount in the third direction.
15.  The ophthalmologic apparatus of claim 3, wherein the evaluation information generation unit estimates a movement speed of the floating object in the aqueous humor based on the first floating object image and the second floating object image.

16.  The ophthalmologic apparatus of claim 3, wherein
    the floating object image detection unit detects, from the first image, a first image set including images of a plurality of floating objects in the aqueous humor, detects, from the second image, a second image set including images of a plurality of floating objects in the aqueous humor, and determines a positional correspondence between the first image set and the second image set, and
    the evaluation information generation unit generates, based on a pair of one element of the first image set and one element of the second image set that are associated by the positional correspondence, evaluation information on the floating object in the aqueous humor corresponding to that pair.

17.  The ophthalmologic apparatus of claim 3, wherein
    the floating object image detection unit detects a floating object image from one of the first image and the second image, determines coordinates of that floating object image in the one image, and detects, from the other of the first image and the second image, a floating object image at a position corresponding to the coordinates, and
    the evaluation information generation unit generates, based on a pair of the floating object image detected from the one image and the floating object image detected from the other image, evaluation information on the floating object in the aqueous humor corresponding to that pair.
18.  The ophthalmologic apparatus of claim 2, further comprising:
    a movement mechanism that moves the first optical system and the second optical system; and
    a control unit that causes the first imaging unit and the second imaging unit to generate a plurality of pairs of the first image and the second image by combining control of the first imaging unit, control of the second imaging unit, and control of the movement mechanism.

19.  The ophthalmologic apparatus of claim 18, wherein the evaluation processing unit generates evaluation information based on each of the plurality of pairs.

20.  The ophthalmologic apparatus of claim 19, wherein the evaluation processing unit generates a three-dimensional distribution of a predetermined evaluation value based on a plurality of pieces of evaluation information generated based on the plurality of pairs.

21.  The ophthalmologic apparatus of claim 2, wherein the evaluation processing unit includes a type identification unit that identifies a type of the floating object in the aqueous humor depicted in the first image and the second image.

22.  The ophthalmologic apparatus of claim 21, wherein the evaluation processing unit generates evaluation information according to the type of the floating object in the aqueous humor identified by the type identification unit.
23.  A method of controlling an ophthalmologic apparatus that includes a first imaging unit including a first optical system satisfying a Scheimpflug condition, a second imaging unit including a second optical system satisfying the Scheimpflug condition, and a processor, the method causing the processor to function as:
    a control unit that executes control of the first imaging unit to apply first imaging under a first imaging condition to an anterior segment of a subject's eye and control of the second imaging unit to apply, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment; and
    an evaluation processing unit that generates evaluation information on a floating object in aqueous humor based on a first image generated by the first imaging and a second image generated by the second imaging.

24.  A program for operating an ophthalmologic apparatus that includes a first imaging unit including a first optical system satisfying a Scheimpflug condition, a second imaging unit including a second optical system satisfying the Scheimpflug condition, and a processor, the program causing the processor to function as:
    a control unit that executes control of the first imaging unit to apply first imaging under a first imaging condition to an anterior segment of a subject's eye and control of the second imaging unit to apply, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment; and
    an evaluation processing unit that generates evaluation information on a floating object in aqueous humor based on a first image generated by the first imaging and a second image generated by the second imaging.

25.  A computer-readable non-transitory recording medium on which a program for operating an ophthalmologic apparatus is recorded, the ophthalmologic apparatus including a first imaging unit including a first optical system satisfying a Scheimpflug condition, a second imaging unit including a second optical system satisfying the Scheimpflug condition, and a processor, the program causing the processor to function as:
    a control unit that executes control of the first imaging unit to apply first imaging under a first imaging condition to an anterior segment of a subject's eye and control of the second imaging unit to apply, in parallel with the first imaging, second imaging under a second imaging condition different from the first imaging condition to the anterior segment; and
    an evaluation processing unit that generates evaluation information on a floating object in aqueous humor based on a first image generated by the first imaging and a second image generated by the second imaging.

PCT/JP2023/028514 2022-08-22 2023-08-04 Ophthalmologic device, method for controlling ophthalmologic device, program, and recording medium WO2024043041A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022131654A JP2024029416A (en) 2022-08-22 2022-08-22 Ophthalmological device, method for controlling ophthalmological device, program, and recording medium
JP2022-131654 2022-08-22

Publications (1)

Publication Number Publication Date
WO2024043041A1 true WO2024043041A1 (en) 2024-02-29

Family

ID=90013050

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/028514 WO2024043041A1 (en) 2022-08-22 2023-08-04 Ophthalmologic device, method for controlling ophthalmologic device, program, and recording medium

Country Status (2)

Country Link
JP (1) JP2024029416A (en)
WO (1) WO2024043041A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010156596A (en) * 2008-12-26 2010-07-15 Fujifilm Corp Measuring device and measuring method
JP2019126735A (en) * 2018-01-25 2019-08-01 ハーグ−シュトライト アーゲー Eye examining device
JP2022035168A (en) * 2020-08-20 2022-03-04 株式会社トプコン Slit lamp microscope system
JP2022044113A (en) * 2020-09-07 2022-03-17 キヤノン株式会社 Aberration estimation method, aberration estimation device, program and storage medium

Also Published As

Publication number Publication date
JP2024029416A (en) 2024-03-06

Similar Documents

Publication Publication Date Title
JP6899632B2 (en) Ophthalmologic imaging equipment
US5719659A (en) Ophthalmic apparatus having light polarizing means
JP2022040372A (en) Ophthalmologic apparatus
JP7080076B2 (en) Ophthalmic device and its control method
JP2022027879A (en) Ophthalmologic imaging device, control method thereof, program, and recording medium
JP6923392B2 (en) Ophthalmic equipment
JP6633468B2 (en) Blood flow measurement device
WO2024043041A1 (en) Ophthalmologic device, method for controlling ophthalmologic device, program, and recording medium
JP7384987B2 (en) ophthalmology equipment
JP6934747B2 (en) Ophthalmic device and its control method
WO2021085020A1 (en) Ophthalmic device and method for controlling same
JP6864484B2 (en) Ophthalmic equipment
JP2021191551A (en) Ophthalmologic inspection device
JP2020195883A (en) Ophthalmologic inspection device
JP2022053081A (en) Fundus camera and control method of the same
WO2024034298A1 (en) Ophthalmologic device, method for controlling ophthalmologic device, and recording medium
WO2023238729A1 (en) Ophthalmologic device, method for controlling ophthalmologic device, program, and recording medium
WO2024018788A1 (en) Ophthalmologic device, method for controlling ophthalmologic device, program, and recording medium
WO2024004455A1 (en) Opthalmic information processing device, opthalmic device, opthalmic information processing method, and program
WO2023037658A1 (en) Ophthalmological device, method for controlling ophthalmological device, method for processing eye image, program, and recording medium
JP7374272B2 (en) ophthalmology equipment
US20240032787A1 (en) Ophthalmic apparatus, method of controlling ophthalmic apparatus, and recording medium
JP7096391B2 (en) Ophthalmic equipment
JP2019054974A (en) Ophthalmologic apparatus
JP6954831B2 (en) Ophthalmologic imaging equipment, its control method, programs, and recording media

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23857155

Country of ref document: EP

Kind code of ref document: A1