EP4238482A1 - Ophthalmic observation apparatus, control method therefor, program, and storage medium - Google Patents

Ophthalmic observation apparatus, control method therefor, program, and storage medium

Info

Publication number
EP4238482A1
EP4238482A1
Authority
EP
European Patent Office
Prior art keywords
image
eye
site
subject
moving image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20959942.2A
Other languages
German (de)
English (en)
Inventor
Kazuhiro Yamada
Kazuhiro Oomori
Yasufumi Fukuma
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Topcon Corp
Original Assignee
Topcon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Topcon Corp filed Critical Topcon Corp
Publication of EP4238482A1

Classifications

    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
            • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
              • A61B 3/117 Objective types for examining the anterior chamber or the anterior chamber angle, e.g. gonioscopes
              • A61B 3/12 Objective types for looking at the eye fundus, e.g. ophthalmoscopes
              • A61B 3/13 Ophthalmic microscopes
              • A61B 3/14 Arrangements specially adapted for eye photography
                • A61B 3/145 Arrangements specially adapted for eye photography by video means
                • A61B 3/15 Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection; with means for relaxing
                  • A61B 3/152 Arrangements specially adapted for eye photography with means for aligning
          • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
            • A61B 90/08 Accessories or related features not otherwise provided for
              • A61B 2090/0807 Indication means
    • G PHYSICS
      • G02 OPTICS
        • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
          • G02B 21/00 Microscopes
            • G02B 21/0004 Microscopes specially adapted for specific applications
              • G02B 21/0012 Surgical microscopes
            • G02B 21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
              • G02B 21/365 Control or image processing arrangements for digital or video microscopes
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 7/00 Image analysis
            • G06T 7/0002 Inspection of images, e.g. flaw detection
              • G06T 7/0012 Biomedical image inspection
            • G06T 7/97 Determining parameters from multiple pictures
          • G06T 2207/00 Indexing scheme for image analysis or image enhancement
            • G06T 2207/10 Image acquisition modality
              • G06T 2207/10016 Video; Image sequence
              • G06T 2207/10056 Microscopic image
            • G06T 2207/30 Subject of image; Context of image processing
              • G06T 2207/30004 Biomedical image processing
                • G06T 2207/30041 Eye; Retina; Ophthalmic
              • G06T 2207/30168 Image quality inspection
    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
            • H04N 23/60 Control of cameras or camera modules
              • H04N 23/67 Focus control based on electronic image sensor signals
            • H04N 23/70 Circuitry for compensating brightness variation in the scene
              • H04N 23/71 Circuitry for evaluating the brightness variation
              • H04N 23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
              • H04N 23/76 Circuitry for compensating brightness variation in the scene by influencing the image signals

Definitions

  • the present disclosure relates generally to an ophthalmic observation apparatus, a method of controlling the same, a program, and a recording medium.
  • An ophthalmic observation apparatus is an apparatus for observing an eye of a patient (which will be referred to as a subject's eye hereinafter). Ophthalmic observation is conducted to grasp the condition of the subject's eye in various situations such as examination, surgery, and treatment.
  • Conventional ophthalmic observation apparatuses are configured to provide a user with a magnified image formed by an objective lens, a variable magnification optical system, etc. via an eyepiece.
  • Some ophthalmic observation apparatuses are configured to photograph a magnified image formed by an objective lens, a variable magnification optical system, etc. with an image sensor, and to display the photographed image obtained (such an ophthalmic observation apparatus will be referred to as a digital ophthalmic observation apparatus).
  • Examples of such digital ophthalmic observation apparatuses include surgical microscopes, slit lamp microscopes, and fundus cameras (retinal cameras).
  • Ophthalmic examination apparatuses such as refractometers, keratometers, tonometers, specular microscopes, wavefront analyzers, and microperimeters are also provided with the function of the digital ophthalmic observation apparatus.
  • Some ophthalmic observation apparatuses of recent years use optical scanning (scanning-type ophthalmic observation apparatuses).
  • Examples of such scanning-type ophthalmic observation apparatuses include scanning laser ophthalmoscopes (SLOs) and optical coherence tomography (OCT) apparatuses.
  • An ophthalmic observation apparatus is configured to provide a moving image of a subject's eye to a user (e.g., a health professional (health care practitioner) such as a doctor).
  • A typical digital ophthalmic observation apparatus is configured to perform photographing of a moving image using infrared light and/or visible light as illumination light, and real-time display of the moving image obtained by the moving image photography.
  • A typical scanning-type ophthalmic observation apparatus is configured to perform data collection (data acquisition) by repetitive optical scanning, real-time image reconstruction based on datasets sequentially collected, and real-time moving image display of images sequentially reconstructed.
  • The real-time moving image provided in these ways is referred to as an observation image or a live image.
  • An ophthalmic observation apparatus capable of providing a real-time moving image is also used in surgery.
  • There are various types of ophthalmic surgery.
  • Some typical surgical methods involve implantation of an artificial object (artificial material, artificial device, artificial instrument, or the like) in the eye.
  • Typical examples of such surgical methods include cataract surgery to replace the crystalline lens with an intraocular lens (IOL), refractive surgery to implant an intraocular contact lens (ICL; also known as a phakic IOL) in the anterior chamber, and minimally invasive glaucoma surgery (MIGS) to place a stent in the trabecular meshwork.
  • PATENT DOCUMENT 1: Japanese Unexamined Patent Application Publication No. 2019-162336
  • Lens centering is an operation of aligning the center of the lens with the axis of the eye.
  • MIGS involves an operation of positioning the stent with respect to the trabecular meshwork to ensure the effectiveness of aqueous humor drainage.
  • An example of the current situation is that rough positioning or rough alignment for lens centering is performed while referring to the state (condition) of light reflection depicted in the image of the eye provided through an eyepiece or a display device. This is because there is no position indicator available for reference in performing the operation of lens centering.
  • One object of the present disclosure is to provide a novel method or technique for facilitating ophthalmic observation.
  • Some aspect examples are an ophthalmic observation apparatus for observing a subject's eye, including: a moving image generating unit configured to generate a moving image by photographing the subject's eye into which an artificial object has been inserted; an analyzing processor configured to analyze a still image included in the moving image to identify a first site image corresponding to a predetermined site of the subject's eye and a second site image corresponding to a predetermined site of the artificial object; and a display controller configured to display, on a display device, the moving image, first position information that represents a position of the first site image, and second position information that represents a position of the second site image.
  • The display controller may be configured to display information on the basis of a positional difference between the first site image and the second site image, the positional difference being determined from the position of the first site image and the position of the second site image.
  • The display controller may be configured to display first guide information that represents a movement direction of the artificial object.
  • The display controller may be configured to display second guide information that represents a movement distance of the artificial object.
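The first and second guide information described above can be understood as a simple geometric computation on the two identified positions. The sketch below is only illustrative: the function name `guide_info`, the pixel scale `mm_per_pixel`, and the coordinate convention are assumptions, not details taken from the disclosure.

```python
import math

def guide_info(first_pos, second_pos, mm_per_pixel=0.02):
    """Derive hypothetical guide information from the positional difference
    between the first site image (e.g. the pupil center of the subject's eye)
    and the second site image (e.g. the center of the intraocular lens).

    Returns (direction_deg, distance_mm): the movement direction of the
    artificial object as an angle in image coordinates, and the movement
    distance converted with an assumed image scale."""
    dx = first_pos[0] - second_pos[0]  # vector from the lens site toward the eye site
    dy = first_pos[1] - second_pos[1]
    direction_deg = math.degrees(math.atan2(dy, dx))
    distance_mm = math.hypot(dx, dy) * mm_per_pixel
    return direction_deg, distance_mm

# e.g. a lens center detected 10 px to the left of the pupil center:
direction, distance = guide_info((100.0, 100.0), (90.0, 100.0))
```

Here the direction would be 0 degrees (move in the +x direction) and the distance 10 px times the assumed scale.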
  • The display controller may be configured to display the first position information and the second position information in mutually different manners.
  • The analyzing processor may be configured to sequentially identify first site images and second site images from still images sequentially generated by the moving image generating unit, in parallel with generation of the moving image performed by the moving image generating unit, and the display controller may be configured to sequentially update the first position information and the second position information displayed together with the moving image, in parallel with sequential generation of the still images performed by the moving image generating unit and sequential analysis of the still images performed by the analyzing processor.
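This parallel generate-analyze-display behavior amounts to a per-frame loop. The sketch below uses hypothetical callbacks `analyze` and `show` as stand-ins for the analyzing processor and the display controller; it is a structural illustration, not the disclosed implementation.

```python
def run_observation(frames, analyze, show):
    """For each still image of the moving image, identify the first and
    second site images and refresh the overlaid position information.
    `analyze` returns (first_pos, second_pos); `show` renders the frame
    together with both pieces of position information."""
    for frame in frames:
        first_pos, second_pos = analyze(frame)  # analyzing processor
        show(frame, first_pos, second_pos)      # display controller

# usage with stub callbacks that merely record what would be displayed:
shown = []
run_observation(
    frames=["f0", "f1"],
    analyze=lambda f: ((10, 10), (12, 11)),
    show=lambda f, p1, p2: shown.append((f, p1, p2)),
)
```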
  • The predetermined site of the artificial object may be at least one of an edge of an intraocular lens, an approximate figure to the edge, a center of the intraocular lens, and a hole of the intraocular lens.
  • The predetermined site of the subject's eye may be at least one of a corneal ring, an approximate figure to the corneal ring, a corneal center, a pupil edge, an approximate figure to the pupil edge, and a pupil center.
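Identifying such a site image could be done in many ways (thresholding, edge extraction, circle or ellipse fitting, machine learning, etc.). As one hedged illustration only, the snippet below computes the centroid of an already-segmented binary mask with NumPy; the function name and the assumption that a mask is available are hypothetical.

```python
import numpy as np

def site_centroid(mask):
    """Centroid (x, y) of a binary site mask, e.g. a segmented pupil region
    or IOL edge region. A stand-in for whatever detector an implementation
    might actually use to locate a site image in a still image."""
    ys, xs = np.nonzero(mask)           # pixel coordinates of the site
    if xs.size == 0:
        raise ValueError("site not found in the still image")
    return float(xs.mean()), float(ys.mean())

# toy 5x5 frame with two marked site pixels at (x=1, y=2) and (x=3, y=2):
mask = np.zeros((5, 5), dtype=bool)
mask[2, 1] = mask[2, 3] = True
center = site_centroid(mask)
```

The resulting position could then serve as the first or second position information overlaid on the moving image.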
  • Some aspect examples are a method of controlling an ophthalmic observation apparatus including an optical system for generating a moving image of a subject's eye and a processor, the method including: causing the optical system to generate a moving image of the subject's eye into which an artificial object has been inserted; causing the processor to analyze a still image included in the moving image to identify a first site image corresponding to a predetermined site of the subject's eye and a second site image corresponding to a predetermined site of the artificial object; and causing the processor to display, on a display device, the moving image, first position information that represents a position of the first site image, and second position information that represents a position of the second site image.
  • Some aspect examples are a program configured to cause a computer to execute the method of some aspect examples.
  • Some aspect examples are a computer-readable non-transitory recording medium storing the program of some aspect examples.
  • The ophthalmic observation apparatus is used in medical practice (healthcare practice) such as surgery, examination, and treatment, in order to grasp (understand, recognize, find) the state of the subject's eye.
  • The ophthalmic observation apparatus of the aspect examples described herein is mainly a surgical microscope system.
  • However, ophthalmic observation apparatuses of embodiments are not limited to surgical microscope systems.
  • The ophthalmic observation apparatus of some aspect examples may be any of a slit lamp microscope, a fundus camera, a refractometer, a keratometer, a tonometer, a specular microscope, a wavefront analyzer, a microperimeter, an SLO, and an OCT apparatus.
  • The ophthalmic observation apparatus of some aspect examples may be a system that includes any one or more of these apparatus examples. In a wider sense, the ophthalmic observation apparatus of some aspect examples may be any type of ophthalmic apparatus having an observation function.
  • A target ocular site for observation (ocular site to be observed, ocular site subject to observation) using the ophthalmic observation apparatus may be any site of the subject's eye, and may be any site of the anterior eye segment and/or any site of the posterior eye segment.
  • Examples of the observation target sites of the anterior eye segment include the cornea, iris, anterior chamber, anterior chamber angle, crystalline lens, ciliary body, and zonule of Zinn.
  • Examples of the observation target sites of the posterior eye segment include the retina, choroid, sclera, and vitreous body.
  • The observation target site is not limited to tissues of an eyeball, and may be any site to be observed in ophthalmic medical practice (and/or medical practice in other medical fields), such as the eyelid, meibomian gland, and orbit (eye socket, eye pit).
  • The ophthalmic observation apparatus is also used for observing an artificial object inserted into the eye.
  • The artificial object may be a freely selected device (instrument) that is implanted (placed) in the eye.
  • Examples of such an artificial object include an intraocular lens, an intraocular contact lens, and an MIGS device (stent).
  • The artificial object may also be a freely selected medical instrument such as a surgical instrument, an examination instrument, or a treatment instrument.
  • The terms "circuit configuration" (circuitry) and "processing circuit configuration" (processing circuitry) are used herein.
  • The circuitry or the processing circuitry includes any of the following, all of which are configured and/or programmed to execute at least one or more functions disclosed herein: a general purpose processor, a dedicated processor, an integrated circuit, a central processing unit (CPU), a graphics processing unit (GPU), an application specific integrated circuit (ASIC), a programmable logic device (e.g., a simple programmable logic device (SPLD), a complex programmable logic device (CPLD), or a field programmable gate array (FPGA)), conventional circuitry, and any combination of these.
  • A processor is considered to be processing circuitry or circuitry that includes a transistor and/or other circuitry.
  • Circuitry, a unit, a means, or a similar term is hardware that executes at least one or more functions disclosed herein, or hardware that is programmed to execute at least one or more functions disclosed herein.
  • Hardware may be the hardware disclosed herein, or alternatively, known hardware that is programmed and/or configured to execute at least one or more functions described herein.
  • If the hardware is a processor, it may be considered a certain type of circuitry.
  • Circuitry, a unit, a means, or a similar term may also be a combination of hardware and software.
  • In this case, the software is used to configure the hardware and/or the processor.
  • FIG. 1 shows the configuration of the ophthalmic observation apparatus of some aspect examples.
  • The ophthalmic observation apparatus 1 (surgical microscope system, operation microscope system) according to the present embodiment includes the operation device 2, the display device 3, and the surgical microscope (operation microscope) 10.
  • The surgical microscope 10 may include at least one of the operation device 2 and the display device 3.
  • The display device 3 may not be included in the ophthalmic observation apparatus 1. In other words, the display device 3 may be a peripheral device of the ophthalmic observation apparatus 1.
  • The operation device 2 includes an operation device and/or an input device.
  • The operation device 2 may include any of a button, a switch, a mouse, a keyboard, a trackball, an operation panel, a dial, and so forth.
  • The operation device 2 includes a foot switch, like standard (general, normal, usual) ophthalmic surgical microscopes.
  • The operation device 2 may also be configured in such a manner that the user performs operations using voice recognition, line-of-sight (gaze) input, or similar input technologies.
  • The display device 3 displays an image of the subject's eye acquired by the surgical microscope 10.
  • The display device 3 includes a display device such as a flat panel display.
  • The display device 3 may include any of various kinds of display devices such as a touch panel.
  • The display device 3 of some typical aspects includes a display device with a large screen.
  • The display device 3 includes one or more display devices. In the case where the display device 3 includes two or more display devices, for example, one may be a display device with a relatively large screen and one of the other(s) may be a display device with a relatively small screen. Also, a configuration may be employed in which a plurality of display regions is provided in one display device to display a plurality of pieces of information.
  • The operation device 2 and the display device 3 do not have to be separate devices.
  • A device having both the operation function and the display function, such as a touch panel, may be used as the display device 3.
  • In this case, the operation device 2 may include a computer program in addition to the touch panel.
  • The content of an operation made via the operation device 2 is sent to a processor (not shown in the drawings) as an electric signal.
  • A graphical user interface (GUI) displayed on the display device 3 and the operation device 2 may be used to conduct operations (give instructions) and input information.
  • The functions of the operation device 2 and the display device 3 may be implemented with a touch screen.
  • The surgical microscope 10 is used for observation of the eye of a patient (subject's eye) in the supine position.
  • The surgical microscope 10 performs photographing of the subject's eye to generate digital image data.
  • The surgical microscope 10 generates a moving image of the subject's eye.
  • The moving image (video, movie) generated by the surgical microscope 10 is transmitted to the display device 3 through a wired and/or wireless signal path and displayed on the display device 3.
  • The surgical microscope 10 of some aspects may also be capable of providing observation through an eyepiece as in conventional technology.
  • The surgical microscope 10 includes a communication device for transmitting and receiving electrical signals to and from the operation device 2.
  • The operation device 2 receives an operation (instruction) performed by the user and generates an electric signal (operation signal) corresponding to the operation.
  • The operation signal is transmitted to the surgical microscope 10 through a wired and/or wireless signal path.
  • The surgical microscope 10 executes processing corresponding to the operation signal received.
  • The z direction is defined to be the optical axis direction (direction along the optical axis) of the objective lens (the z direction is, for example, the vertical direction, the up and down direction during surgery);
  • The x direction is defined to be a predetermined direction perpendicular to the z direction (the x direction is, for example, the horizontal direction during surgery, and the left and right direction for the surgeon and the patient during surgery);
  • The y direction is defined to be the direction perpendicular to both the z and x directions (the y direction is, for example, the horizontal direction during surgery, the front and back direction for the surgeon during surgery, and the body axis direction (direction along the body axis) for the patient during surgery).
  • The observation optical system includes a pair of left and right optical systems (optical systems capable of binocular observation).
  • The observation optical system of some other aspects may have an optical system for monocular observation, and it will be understood by those skilled in the art that the configuration described below may be incorporated into aspects for monocular observation.
  • FIG. 2 shows an example of the configuration of the optical system of the surgical microscope 10.
  • FIG. 2 illustrates a schematic top view of the optical system (viewed from above) and a schematic side view of the optical system (viewed from the side) in association with each other.
  • The illumination optical system 30, arranged above the objective lens 20, is omitted from the top view.
  • The surgical microscope 10 includes the objective lens 20, the dichroic mirror DM1, the illumination optical system 30, and the observation optical system 40.
  • The observation optical system 40 includes the zoom expander 50 and the imaging camera 60.
  • The illumination optical system 30 or the observation optical system 40 includes the dichroic mirror DM1.
  • The objective lens 20 is arranged to face the subject's eye.
  • The objective lens 20 is arranged such that its optical axis is oriented along the z direction.
  • The objective lens 20 may include two or more lenses.
  • The dichroic mirror DM1 couples the optical path of the illumination optical system 30 and the optical path of the observation optical system 40 with each other.
  • The dichroic mirror DM1 is arranged between the illumination optical system 30 and the objective lens 20.
  • The dichroic mirror DM1 transmits illumination light from the illumination optical system 30 and directs the illumination light to the subject's eye through the objective lens 20.
  • The dichroic mirror DM1 reflects return light from the subject's eye incident through the objective lens 20 and directs the return light to the imaging camera 60 of the observation optical system 40.
  • The dichroic mirror DM1 coaxially couples the optical path of the illumination optical system 30 and the optical path of the observation optical system 40 with each other. In other words, the optical axis of the illumination optical system 30 and the optical axis of the observation optical system 40 intersect at the dichroic mirror DM1.
  • The dichroic mirror DM1 coaxially couples the optical path of the illumination optical system for the left eye (the first illumination optical system 31L) and the optical path of the observation optical system for the left eye 40L with each other, and coaxially couples the optical path of the illumination optical system for the right eye (the first illumination optical system 31R) and the optical path of the observation optical system for the right eye 40R with each other.
  • The illumination optical system 30 is an optical system for illuminating the subject's eye through the objective lens 20.
  • The illumination optical system 30 may be configured to selectively illuminate the subject's eye with two or more kinds of illumination light having different color temperatures.
  • The illumination optical system 30 projects illumination light having a designated color temperature onto the subject's eye under the control of a controller (the controller 200 described later).
  • The illumination optical system 30 includes the first illumination optical systems 31L and 31R and the second illumination optical system 32.
  • Each of the optical axis OL of the first illumination optical system 31L and the optical axis OR of the first illumination optical system 31R is arranged substantially coaxially with the optical axis of the objective lens 20.
  • Such arrangements enable a coaxial illumination mode and therefore make it possible to obtain a red reflex image (transillumination image) formed by utilizing diffuse reflection from the eye fundus.
  • The present aspect allows the red reflex image of the subject's eye to be observed with both eyes.
  • The second illumination optical system 32 is arranged in such a manner that its optical axis OS is eccentric (deviated, shifted) from the optical axis of the objective lens 20.
  • The first illumination optical systems 31L and 31R and the second illumination optical system 32 are arranged such that the deviation of the optical axis OS with respect to the optical axis of the objective lens 20 is larger than the deviations of the optical axes OL and OR with respect to the optical axis of the objective lens 20.
  • Such arrangements enable an illumination mode referred to as "angled illumination" (oblique illumination) and therefore enable binocular observation of the subject's eye while preventing ghosting caused by corneal reflection or the like.
  • The arrangements also enable detailed observation of unevenness and irregularities of sites and tissues of the subject's eye.
  • The first illumination optical system 31L includes the light source 31LA and the condenser lens 31LB.
  • The light source 31LA outputs illumination light having a wavelength in the visible range (visible region) corresponding to a color temperature of 3000 K (kelvins), for example.
  • The illumination light emitted from the light source 31LA passes through the condenser lens 31LB, passes through the dichroic mirror DM1, passes through the objective lens 20, and then is incident on the subject's eye.
  • The first illumination optical system 31R includes the light source 31RA and the condenser lens 31RB.
  • The light source 31RA also outputs illumination light having a wavelength in the visible range corresponding to a color temperature of 3000 K, for example.
  • The illumination light emitted from the light source 31RA passes through the condenser lens 31RB, passes through the dichroic mirror DM1, passes through the objective lens 20, and then is incident on the subject's eye.
  • The second illumination optical system 32 includes the light source 32A and the condenser lens 32B.
  • The light source 32A outputs illumination light having a wavelength in the visible range corresponding to a color temperature within the range of 4000 K to 6000 K, for example.
  • The illumination light emitted from the light source 32A passes through the condenser lens 32B, passes through the objective lens 20 without passing through the dichroic mirror DM1, and then is incident on the subject's eye.
  • The color temperature of the illumination light from the first illumination optical systems 31L and 31R is lower than the color temperature of the illumination light from the second illumination optical system 32.
  • each of the optical axes OL and OR is movable relative to the optical axis of the objective lens 20.
  • the direction of the relative movement is a direction that intersects the optical axis of the objective lens 20, and the relative movement is represented by a displacement vector in which at least one of the x component and the y component is not zero.
  • the optical axes OL and OR may be mutually independently movable.
  • the optical axes OL and OR may be integrally movable.
  • the surgical microscope 10 includes a movement mechanism (31d) configured to move the first illumination optical systems 31L and 31R, mutually independently or integrally, in a direction intersecting the optical axis of the objective lens 20.
  • the movement mechanism operates under the control of a controller (the controller 200 described later).
  • the optical axis OS is movable relative to the optical axis of the objective lens 20.
  • the direction of the relative movement is a direction that intersects the optical axis of the objective lens 20, and the relative movement is represented by a displacement vector in which at least one of the x component and the y component is not zero.
  • the surgical microscope 10 includes a movement mechanism (32d) configured to move the second illumination optical system 32 in a direction that intersects the optical axis of the objective lens 20.
  • the movement mechanism operates under the control of a controller (the controller 200 described later).
  • the present aspect is configured such that the illumination optical system 30 is arranged at the position directly above the objective lens 20 (the position in the transmission direction of the dichroic mirror DM1) and the observation optical system 40 is arranged at the position in the reflection direction of the dichroic mirror DM1.
  • the observation optical system 40 may be arranged in such a manner that the angle formed by the optical axis of the observation optical system 40 and the plane perpendicular to the optical axis of the objective lens 20 (the xy plane) falls within the range of -20 degrees to +20 degrees.
  • the observation optical system 40, which typically has a longer optical path length than the illumination optical system 30, is arranged substantially parallel to the xy plane.
  • unlike conventional surgical microscopes, whose observation optical system stands vertically in front of the surgeon's eyes, the observation optical system 40 of the present aspect does not obstruct the surgeon's field of view. Therefore, the surgeon can easily see the screen of the display device 3 arranged in front of him or her. In other words, the visibility of displayed information (images and videos of the subject's eye, and other various kinds of reference information) during surgery and the like is improved.
  • in addition, since the housing is not placed in front of the surgeon's eyes, it does not give the surgeon a sense of oppression, thereby reducing the burden on the surgeon.
  • the observation optical system 40 is an optical system for observing an image formed based on return light of the illumination light, which enters from the subject's eye through the objective lens 20.
  • the observation optical system 40 guides the image to an image sensor of the imaging camera 60.
  • the observation optical system 40 includes the observation optical system for left eye 40L and the observation optical system for right eye 40R.
  • the configuration of the observation optical system for left eye 40L and the configuration of the observation optical system for right eye 40R are the same as or similar to each other.
  • the observation optical system for left eye 40L and the observation optical system for right eye 40R may be configured in such a manner that their optical arrangements can be changed independently of each other.
  • the zoom expander 50 is also referred to as a beam expander, a variable beam expander, or the like.
  • the zoom expander 50 includes the zoom expander for left eye 50L and the zoom expander for right eye 50R.
  • the configuration of the zoom expander for left eye 50L and the configuration of the zoom expander for right eye 50R are the same as or similar to each other.
  • the zoom expander for left eye 50L and the zoom expander for right eye 50R may be configured in such a manner that their optical arrangements can be changed independently of each other.
  • the zoom expander for left eye 50L includes the plurality of zoom lenses 51L, 52L, and 53L. At least one of the zoom lenses 51L, 52L, and 53L is movable in the direction along the optical axis with a variable magnification mechanism (not shown in the drawings).
  • the zoom expander for right eye 50R includes the plurality of zoom lenses 51R, 52R, and 53R, and at least one of the zoom lenses 51R, 52R, and 53R is movable in the direction along the optical axis with a variable magnification mechanism (not shown in the drawings).
  • variable magnification mechanism(s) may be configured to move a zoom lens of the zoom expander for left eye 50L and a zoom lens of the zoom expander for right eye 50R mutually independently or integrally in the directions along the optical axes. As a result of this, the magnification ratio for photographing the subject's eye is changed.
  • the variable magnification mechanism(s) operates under the control of a controller (the controller 200 described later).
  • the imaging camera 60 is a device that photographs an image formed by the observation optical system 40 and generates digital image data.
  • the imaging camera 60 is typically a digital camera (digital video camera).
  • the imaging camera 60 includes the imaging camera for left eye 60L and the imaging camera for right eye 60R.
  • the configuration of the imaging camera for left eye 60L and the configuration of the imaging camera for right eye 60R are the same as or similar to each other.
  • the imaging camera for left eye 60L and the imaging camera for right eye 60R may be configured such that their optical arrangements can be changed independently of each other.
  • the imaging camera for left eye 60L includes the imaging lens 61L and the image sensor 62L.
  • the imaging lens 61L forms an image based on the return light that has passed through the zoom expander for left eye 50L, on the imaging surface (light receiving surface) of the image sensor 62L.
  • the image sensor 62L is an area sensor, and may typically be a charge-coupled device (CCD) image sensor or a complementary metal oxide semiconductor (CMOS) image sensor.
  • the imaging camera for right eye 60R includes the imaging lens 61R and the image sensor 62R.
  • the imaging lens 61R forms an image based on the return light that has passed through the zoom expander for right eye 50R, on the imaging surface (light receiving surface) of the image sensor 62R.
  • the image sensor 62R is an area sensor, and may typically be a CCD image sensor or a CMOS image sensor.
  • the image sensor 62R operates under the control of a controller (the controller 200 described later).
  • the processing system of the ophthalmic observation apparatus 1 will be described. Some configuration examples of the processing system are shown in FIG. 3 and FIG. 4 . Any two or more of the various kinds of configuration examples described below may be combined at least in part. Note that the configuration of the processing system is not limited to the examples described below.
  • the controller 200 executes a control of each part of the ophthalmic observation apparatus 1.
  • the controller 200 includes the main controller 201 and the memory 202.
  • the main controller 201 includes a processor and executes a control of each part of the ophthalmic observation apparatus 1.
  • the processor may load and run a program stored in the memory 202 or another storage device, thereby implementing a function according to the present aspect.
  • the processor may use (e.g., referring, processing, calculating, etc.) data and/or information stored in the memory 202 or another storage device in order to implement a function according to the present aspect.
  • the main controller 201 may control the light sources 31LA, 31RA, and 32A of the illumination optical system 30, the image sensors 62L and 62R of the observation optical system 40, the movement mechanisms 31d and 32d, the variable magnification mechanisms 50Ld and 50Rd, the operation device 2, the display device 3, and other component parts.
  • Controls of the light source 31LA include turning on and off the light source, adjusting the light amount, adjusting the diaphragm (aperture), and so forth.
  • Controls of the light source 31RA include turning on and off the light source, adjusting the light amount, adjusting the diaphragm (aperture), and so forth.
  • the main controller 201 may perform mutually exclusive controls of the light sources 31LA and 31RA.
  • Controls of the light source 32A include turning on and off the light source, adjusting the light amount, adjusting the diaphragm (aperture), and so forth.
  • the main controller 201 may change the color temperature of emitted illumination light by controlling such a light source.
  • Controls of the image sensor 62L include exposure adjustment, gain adjustment, photographing rate adjustment, and so forth.
  • Controls of the image sensor 62R include exposure adjustment, gain adjustment, photographing rate adjustment, and so forth.
  • the main controller 201 may control the image sensors 62L and 62R in such a manner that the photographing timings of the image sensors 62L and 62R match each other, or in such a manner that the difference between the photographing timings of the image sensors 62L and 62R lies within a predetermined time.
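  • As an illustrative sketch of the timing condition in the preceding bullet, the pairing of left and right frames by capture timestamp can be expressed as follows. The function name and the 5 ms tolerance are assumptions for illustration only and are not specified by the present aspect.

```python
# Sketch: accept a left/right frame pair only when their capture
# timestamps differ by no more than a predetermined tolerance
# (assumed here, for illustration, to be 5 ms).
TOLERANCE_S = 0.005  # hypothetical "predetermined time"

def frames_are_paired(t_left: float, t_right: float,
                      tolerance: float = TOLERANCE_S) -> bool:
    """Return True if the two capture timestamps (in seconds) match
    closely enough to be treated as a stereo pair."""
    return abs(t_left - t_right) <= tolerance
```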
  • the main controller 201 may perform a control of loading digital data obtained by the image sensors 62L and 62R.
  • the movement mechanism 31d moves the light sources 31LA and 31RA mutually independently or integrally in a direction that intersects the optical axis of the objective lens 20.
  • the main controller 201 moves the optical axes OL and OR mutually independently or integrally with respect to the optical axis of the objective lens 20.
  • the movement mechanism 32d moves the light source 32A in a direction that intersects the optical axis of the objective lens 20.
  • the main controller 201 moves the optical axis OS with respect to the optical axis of the objective lens 20.
  • the movement mechanism 70 moves the surgical microscope 10.
  • the movement mechanism 70 is configured to integrally move at least part of the illumination optical system 30 and the observation optical system 40. This configuration makes it possible to change the relative positions of the at least part of the illumination optical system 30 and the observation optical system 40 with respect to the subject's eye while maintaining the relative positional relationship between at least part of the illumination optical system 30 and the observation optical system 40.
  • the movement mechanism 70 is configured to integrally move the first illumination optical systems 31L and 31R and the observation optical system 40. With this, the relative positions of the first illumination optical systems 31L and 31R with respect to the subject's eye and the relative position of the observation optical system 40 with respect to the subject's eye can be changed while maintaining the state (condition) of coaxial illumination.
  • the movement mechanism 70 is configured to integrally move the second illumination optical system 32 and the observation optical system 40. With this, the relative positions of the second illumination optical system 32 and the observation optical system 40 with respect to the subject's eye can be changed while maintaining the illumination angle for oblique illumination.
  • the movement mechanism 70 is configured to integrally move the first illumination optical systems 31L and 31R, the second illumination optical system 32, and the observation optical system 40. This makes it possible to change the relative positions of the illumination optical system 30 and the observation optical system 40 with respect to the subject's eye while maintaining both the state (condition) of coaxial illumination and the illumination angle for oblique illumination.
  • the movement mechanism 70 operates under a control of the controller 200.
  • the main controller 201 may be configured to control at least two of the movement mechanisms 31d, 32d, and 70 in an interlocking manner.
  • the variable magnification mechanism 50Ld moves at least one of the plurality of zoom lenses 51L to 53L of the zoom expander for left eye 50L in the optical axis direction (the direction along the optical axis).
  • the main controller 201 changes the magnification ratio of the observation optical system for left eye 40L by controlling the variable magnification mechanism 50Ld.
  • the variable magnification mechanism 50Rd moves at least one of the plurality of zoom lenses 51R to 53R of the zoom expander for right eye 50R in the optical axis direction (the direction along the optical axis).
  • the main controller 201 changes the magnification ratio of the observation optical system for right eye 40R by controlling the variable magnification mechanism 50Rd.
  • Controls for the operation device 2 include an operation permission control, an operation prohibition control, a control of transmitting operation signals to and/or receiving operation signals from the operation device 2, and other controls.
  • the main controller 201 receives an operation signal generated by the operation device 2 and executes a control corresponding to the operation signal received.
  • Controls for the display device 3 include an information display control and other controls.
  • the main controller 201 displays an image based on digital image data generated by the image sensors 62L and 62R on the display device 3.
  • the main controller 201 may display a moving image (video, movie) based on digital image data (video signal) generated by the image sensors 62L and 62R on the display device 3.
  • the main controller 201 may display a still image (frame) included in the moving image on the display device 3.
  • the main controller 201 may display an image (a moving image, a still image, etc.) obtained by processing the digital image data generated by the image sensors 62L and 62R on the display device 3.
  • the main controller 201 may display, on the display device 3, any information generated by the ophthalmic observation apparatus 1, any information acquired from the outside by the ophthalmic observation apparatus 1, and other types of information.
  • the main controller 201 may create an image for left eye from the digital image data generated by the image sensor 62L and create an image for right eye from the digital image data generated by the image sensor 62R, and then display the created image for left eye and the created image for right eye on the display device 3 in such a manner as to enable stereoscopic vision.
  • the main controller 201 may create a pair of left and right parallax images from the image for left eye and the image for right eye, and display the pair of parallax images on the display device 3.
  • with this, the user (e.g., the surgeon) can view the subject's eye stereoscopically.
  • the stereoscopic method applicable to the present aspect may be freely selected, and for example, may be any of the following methods: a stereoscopic method for naked eyes; a stereoscopic method using an auxiliary device (polarized glasses, etc.); a stereoscopic method by applying image processing (image synthesis, image composition, rendering, etc.) to an image for left eye and an image for right eye; a stereoscopic method by displaying a pair of parallax images simultaneously; a stereoscopic method by alternately displaying a pair of parallax images; and a stereoscopic method of a combination of two or more of the above methods.
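  • One of the listed methods, displaying a pair of parallax images simultaneously, can be sketched as a simple side-by-side composition of the image for left eye and the image for right eye. The NumPy-based function below is an illustrative assumption and not part of the described apparatus.

```python
import numpy as np

def side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Compose a left/right parallax pair into one side-by-side frame,
    as used by displays that present both parallax images at once."""
    if left.shape != right.shape:
        raise ValueError("parallax images must have identical shapes")
    # Horizontal concatenation: left image occupies the left half,
    # right image the right half of the composed frame.
    return np.concatenate([left, right], axis=1)
```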
  • the data processor 210 executes various kinds of data processes. Some examples of processing that may be executed by the data processor 210 will be described below.
  • the data processor 210 (each element thereof) includes a processor that operates on the basis of predetermined software (program), and is implemented by the cooperation of hardware and software.
  • FIG. 4 shows a configuration example of the data processor 210 (and related elements thereto).
  • the data processor 210A shown in FIG. 4 is an example of the data processor 210 in FIG. 3 and includes the analyzing processor 211.
  • the surgical microscope 10 is configured to generate a moving image (live image) by photographing the subject's eye into which an artificial object (e.g., an intraocular lens, an intraocular contact lens, an MIGS device, etc.) has been inserted.
  • the controller 200 is configured to capture a moving image frame (still image) and input the captured frame into the analyzing processor 211. In some examples, the controller 200 captures a frame from a moving image upon receiving a manual or automatic trigger.
  • the analyzing processor 211 is configured to analyze the frame captured from the moving image to identify both an image region corresponding to a predetermined site of the subject's eye and an image region corresponding to a predetermined site of the artificial object having been inserted into the subject's eye.
  • the image region corresponding to the predetermined site of the subject's eye may sometimes be referred to as the first site image
  • the image region corresponding to the predetermined site of the artificial object may sometimes be referred to as the second site image.
  • the image region corresponding to the predetermined site of the subject's eye may be, for example, an image region identified as an image of the corresponding site, or an image region obtained by applying processing to an image of the corresponding site (and its vicinity).
  • the image region obtained by applying processing to the image of the site (and its vicinity) may be an approximate figure such as an approximate ellipse or an approximate circle. The same applies to the image region corresponding to the predetermined site of the artificial object.
  • the predetermined site of the subject's eye detected by the analyzing processor 211 may be a freely selected or determined site (part) of the subject's eye.
  • the analyzing processor 211 may be configured to detect any of the following sites, for example: the pupil (its entirety, or a feature site such as the pupil edge, the pupil center, or the pupil center of gravity), the cornea (its entirety, or a feature site such as the corneal ring, the corneal edge, the corneal center, or the corneal apex), the iris (its entirety, or a feature site such as the iris inner edge, the iris outer edge, or the iris pattern), the anterior chamber (its entirety, or a feature site such as the anterior border or the posterior border), the corner angle (its entirety, a peripheral site, etc.), the crystalline lens (its entirety, or a feature site such as the lens capsule or the anterior capsule), and so forth.
  • the analyzing processor 211 may be configured to detect any of the following sites, for example: the optic nerve head, the macula, a blood vessel, the retina (its entirety, the surface, or one or more sub-tissues), the choroid (its entirety, the anterior surface, the posterior surface, or one or more sub-tissues), the sclera (its entirety, the anterior surface, the posterior surface, or one or more sub-tissues), the vitreous body (its entirety, an opaque region, a floating object (floater), a detached tissue, etc.), and a lesion.
  • the analyzing processor 211 may be configured to detect a freely selected site or tissue such as the eyelid, the meibomian glands, or the orbit (eye socket, eye pit).
  • the site to be detected by the analyzing processor 211 may be selected or determined depending on an illumination method employed, a site subject to surgery, a surgical method conducted, or other factors.
  • the analyzing processor 211 may be configured to detect an image of a predetermined site of the subject's eye from a still image using a freely selected region extraction method or technique. When detecting an image of a site characterized by its brightness, the analyzing processor 211 may detect the image of this site from a still image using brightness thresholding such as binarization. When detecting an image of a site characterized by its shape, the analyzing processor 211 may detect the image of this site from a still image using shape analysis processing such as pattern matching.
  • likewise, when detecting an image of a site characterized by its color, the analyzing processor 211 may detect the image of this site from a still image using color analysis processing such as feature color extraction.
  • the analyzing processor 211 may be configured to detect an image of a predetermined site of the subject's eye by applying segmentation to a still image to identify an image of this site of the subject's eye.
  • segmentation is image processing for identifying a partial region (subregion) in a given image. Segmentation may include any known image processing technique, and some examples of which may include image processing such as edge detection and/or machine learning (e.g., deep learning) based segmentation.
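  • As an illustrative sketch of the brightness thresholding (binarization) mentioned above, the operation can be written as follows. The function and threshold value are assumptions for illustration; a subsequent connected-component or contour step (not shown) would extract the candidate site region from the binary mask.

```python
import numpy as np

def binarize(image: np.ndarray, threshold: int) -> np.ndarray:
    """Brightness thresholding (binarization): pixels whose brightness
    is at or above the threshold become 1, all others become 0."""
    return (image >= threshold).astype(np.uint8)
```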
  • the predetermined site of the artificial object detected by the analyzing processor 211 may be a freely selected or determined site (part) of a freely selected or determined artificial object, and may be, for example, an intraocular lens (its entirety, the optics (lens), the haptics, the outer edge of the optics, the center of the optics, or the hole of the optics), an intraocular contact lens (its entirety, the optics (lens), the haptics, the outer edge of the optics, the center of the optics, the hole of the optics, the outer edge of the haptics, or the hole of the haptics), an MIGS device (its entirety or a part thereof), and so forth. Similar to detection of an image of the predetermined site of the subject's eye, the analyzing processor 211 may detect an image of the predetermined site of the artificial object from a still image using a freely selected or designed region extraction method or technique.
  • the controller 200 is configured to receive, from the surgical microscope 10, a live image of the subject's eye into which the artificial object has been inserted and display the live image on the display device 3, and also simultaneously (in parallel) display, on the display device 3, the first position information that represents a position of the first site image detected by the analyzing processor 211 and the second position information that represents a position of the second site image detected also by the analyzing processor 211.
  • the position of the first site image is the position of the predetermined site of the subject's eye
  • the position of the second site image is the position of the predetermined site of the artificial object.
  • the position of the predetermined site of the subject's eye and the position of the predetermined site of the artificial object can be provided together with the live image.
  • This enables the user to know the positional relationship between the predetermined site of the subject's eye and the predetermined site of the artificial object in real time.
  • This allows the user to easily perform the operation of placing the artificial object in the correct position during surgery to implant the artificial object in the eye, and also to easily confirm whether the artificial object is placed in the proper position, for example. Therefore, according to the ophthalmic observation apparatus 1, it becomes possible to facilitate ophthalmic surgery.
  • the controller 200 may be configured to display, in mutually different modes or aspects, the first position information that represents the position of the predetermined site of the subject's eye and the second position information that represents the position of the predetermined site of the artificial object.
  • a display mode of the first position information and a display mode of the second position information are different, for example, in at least one item (feature) from among colors, patterns, shapes, and sizes.
  • Either one of or both the display mode of the first position information and the display mode of the second position information may be changed automatically or manually.
  • the data processor 210A is configured to find a positional relationship (e.g., distance, direction, etc.) between the predetermined site of the subject's eye and the predetermined site of the artificial object based on the first site image and the second site image detected by the analyzing processor 211.
  • the controller 200 is configured to change the display mode of the first position information and/or the display mode of the second position information based on the positional relationship between the predetermined site of the subject's eye and the predetermined site of the artificial object obtained by the data processor 210A. This allows the user to easily (intuitively) understand the positional relationship between the predetermined site of the subject's eye and the predetermined site of the artificial object.
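  • A minimal sketch of such a display-mode change, assuming the positional relationship is reduced to a pixel distance between the two sites: the thresholds and colors below are illustrative assumptions, not values taken from the apparatus.

```python
# Sketch: choose a display color for the position information from the
# distance (in pixels) between the predetermined site of the subject's
# eye and the predetermined site of the artificial object.
def marker_color(distance_px: float) -> str:
    if distance_px < 5:
        return "green"   # well aligned
    if distance_px < 20:
        return "yellow"  # nearly aligned
    return "red"         # large positional difference
```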
  • the controller 200 may be configured to display information based on the positional difference between the first site image and the second site image detected by the analyzing processor 211, along with the live image generated by the surgical microscope 10. Note that the controller 200 may display this information together with the live image and with at least one of the first position information that represents the position of the predetermined site of the subject's eye and the second position information that represents the position of the predetermined site of the artificial object.
  • the information on the basis of the positional difference between the first site image and the second site image may include, for example, any of the following kinds of information: information that represents a positional difference of the artificial object with respect to the subject's eye; information that represents a positional difference of the actual position of the artificial object with respect to the position where the artificial object should be placed; information that represents a movement direction of the artificial object (a direction in which the artificial object should be moved); information that represents a movement distance of the artificial object (a distance (amount) by which the artificial object should be moved); information that represents a rotation direction of the artificial object (a direction in which the artificial object should be rotated); and information that represents a rotation angle of the artificial object (an angle (amount) by which the artificial object should be rotated).
  • the information on the basis of the positional difference between the first site image and the second site image may be any of the following kinds of information: information representing a positional difference; information for eliminating or canceling out the positional difference (e.g., guidance or guide information for users); and other kinds of information.
  • the information on the basis of the positional difference between the first site image and the second site image may be in any form, and may be, for example, an image, a figure, a character string, or other forms.
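  • The distance and movement direction listed above can be derived from the two site positions as sketched below; the function name and coordinate convention (image x to the right, y downward) are illustrative assumptions.

```python
import math

def positional_difference(eye_site, artifact_site):
    """Return (distance, direction in degrees) of the artificial
    object's site relative to the subject's eye site, computed from
    their (x, y) positions in the frame."""
    dx = artifact_site[0] - eye_site[0]
    dy = artifact_site[1] - eye_site[1]
    distance = math.hypot(dx, dy)          # how far to move
    direction = math.degrees(math.atan2(dy, dx))  # which way to move
    return distance, direction
```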
  • the ophthalmic observation apparatus 1 may be configured to update the first position information and the second position information displayed together with the live image generated by the surgical microscope 10 in real time.
  • the first position information is information that represents the position of the predetermined site of the subject's eye
  • the second position information is information that represents the position of the predetermined site of the artificial object.
  • the analyzing processor 211 may be configured to sequentially identify images of the predetermined site of the subject's eye (first site images) and images of the predetermined site of the artificial object (second site images) from frames (still images) sequentially generated by the surgical microscope 10, in parallel with generation of the moving image performed by the ophthalmic surgical microscope 10.
  • the controller 200 may be configured to sequentially update the first position information and the second position information displayed together with the live image, in parallel with the sequential generation of the still images performed by the surgical microscope 10 and the sequential analysis of the still images performed by the analyzing processor 211. Updating of the first position information is performed based on the first site images sequentially obtained from the live image (the first site images are time-series images). Similarly, updating of the second position information is performed based on the second site images sequentially obtained from the live image (the second site images are time-series images).
  • the user can become aware, in real time, of changes such as a change in the position of the predetermined site of the subject's eye, a change in the position of the predetermined site of the artificial object, and a change in the positional relationship between the predetermined site of the subject's eye and the predetermined site of the artificial object.
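  • The sequential detection and update flow described in the preceding bullets can be sketched as a per-frame loop. The callables `detect_first_site`, `detect_second_site`, and `overlay` are hypothetical stand-ins for the analyzing processor 211 and the display control by the controller 200.

```python
# Sketch: for each frame of the live image, detect both site images and
# refresh the displayed first/second position information.
def update_loop(frames, detect_first_site, detect_second_site, overlay):
    for frame in frames:
        first = detect_first_site(frame)    # e.g., pupil/corneal center
        second = detect_second_site(frame)  # e.g., IOL optic center
        overlay(frame, first, second)       # update displayed position info
```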
  • FIG. 5 shows an example of the operation and the usage mode of the ophthalmic observation apparatus 1. While the present example describes an application to a centering operation (that is, an operation or task for aligning the center of the intraocular lens with the center of the eye) conducted in intraocular lens implantation surgery, substantially the same or a similar operation and usage mode can be implemented when applying the apparatus to other kinds of surgery or other kinds of artificial objects, apart from differences in the surgery or the artificial object concerned.
  • the user performs a predetermined operation using the operation device 2 to cause the ophthalmic observation apparatus 1 to start generating and displaying a live image of the subject's eye (the anterior eye segment thereof).
  • the surgical microscope 10 illuminates the subject's eye by the illumination optical system 30, and at the same time (in parallel, simultaneously) generates digital image data (moving image, video) of the subject's eye by the image sensors 62L and 62R.
  • the generated moving image (the live image 301) is displayed in real time on the display device 3 (see FIG. 6A ).
  • a moving image acquired by the surgical microscope 10 is displayed on the display device 3 as a live image (as an observation image). The user can conduct surgery while observing the live image.
  • the analyzing processor 211 detects a center position of the eye (eye center position) from the live image (from a frame(s) thereof) whose generation has started in the step S1.
  • the eye center position is a position defined in advance as the center position of the subject's eye, and may typically be the corneal center or the pupil center.
  • the analyzing processor 211 first applies image processing, such as binarization, edge detection, segmentation, or other methods, to a frame of the live image 301 to detect the corneal ring 302 (see FIG. 6B ).
  • the analyzing processor 211 identifies the position (pixel) 303 corresponding to the corneal center based on the corneal ring 302 detected (see FIG. 6C ).
  • the analyzing processor 211 finds an approximate ellipse (approximate circle) to the corneal ring 302 detected, and then identifies the center of this approximate ellipse as the corneal center 303.
  • the process of identifying the center of the approximate ellipse is performed, for example, by means of known geometrical techniques.
  • the analyzing processor 211 first applies image processing, such as binarization, edge detection, segmentation, or other methods, to a frame of the live image 301 to detect the pupil edge 304 (see FIG. 6D ).
  • the analyzing processor 211 identifies the position (pixel) 305 corresponding to the pupil center based on the pupil edge 304 detected (see FIG. 6E ).
  • the analyzing processor 211 finds an approximate ellipse (approximate circle) to the pupil edge 304 detected, and then identifies the center of this approximate ellipse as the pupil center 305.
  • the process of identifying the center of this approximate ellipse is performed, for example, by means of known geometrical techniques.
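The center identification described above (fitting an approximate circle to a detected contour such as the corneal ring 302 or the pupil edge 304, and taking its center) can be sketched as follows. This is a minimal Python illustration: the Kasa least-squares circle fit shown here is merely one example of the "known geometrical techniques" the description refers to, and the function name and array layout are assumptions of this sketch, not part of the apparatus.

```python
import numpy as np

def fit_circle_center(points):
    """Estimate the center of an approximately circular contour
    (e.g. a detected corneal ring or pupil edge) by a linear
    least-squares circle fit (Kasa method).

    points: (N, 2) array-like of (x, y) edge pixel coordinates.
    Returns the fitted center (cx, cy).
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    # A circle x^2 + y^2 + D*x + E*y + F = 0 is linear in (D, E, F),
    # so the fit reduces to an ordinary least-squares problem.
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x ** 2 + y ** 2)
    D, E, _F = np.linalg.lstsq(A, b, rcond=None)[0]
    # The center follows from completing the square.
    return (-D / 2.0, -E / 2.0)
```

An elliptic contour would be handled analogously with a conic (ellipse) fit; the circle case is shown only because it is the shortest complete illustration.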
  • the processing of the step S2 may be applied to the frames that are sequentially acquired as the live image 301.
  • the processing of the step S2 may be applied to individual frames acquired as the live image 301 (time-series images).
  • the processing of the step S2 may be applied to one or more frames selected from all the frames acquired as the live image 301.
  • the process of selecting frames may include, for example, thinning of selecting frames at intervals of a predetermined number of frames.
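The frame-thinning selection mentioned above (selecting frames at intervals of a predetermined number of frames) can be illustrated by a short sketch; `thin_frames` is a hypothetical helper name, not an element of the apparatus.

```python
def thin_frames(frames, step):
    """Select frames at intervals of `step` frames (simple thinning).

    frames: sequence of frames (or frame indices) acquired as the live image.
    step:   keep one frame out of every `step` consecutive frames.
    """
    return [frame for i, frame in enumerate(frames) if i % step == 0]
```

For example, with `step=3`, frames 0, 3, 6, 9, ... of the live image would be passed to the analysis, reducing the processing load while still updating the displayed position information periodically.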
  • the controller 200 may display information that represents the eye center position detected in the step S2 (referred to as eye center position information) on the live image 301.
  • the controller 200 may update the display of the eye center position information (the displayed eye center position information) each time a new eye center position is detected. This enables the surgeon to know the eye center position of the subject's eye in real time.
  • the surgeon inserts the intraocular lens into the subject's eye and then starts the process of determining the position in which the intraocular lens is placed (fixed) in conformity with the procedures and techniques of typical cataract surgery (crystalline lens replacement surgery).
  • the position determination process is referred to as positioning, centering, or the like.
  • the analyzing processor 211 detects an eye center position and an intraocular lens center position from a frame of the live image 301. Detection of the eye center position may be executed in the same or a similar manner as or to the processing in the step S2.
  • the present example describes a case of performing detection of the corneal ring 302 and detection of the corneal center 303 (see FIG. 6F and FIG. 6G ).
  • Detection of the intraocular lens center position may also be executed in the same or a similar manner.
  • the analyzing processor 211 first applies image processing, such as binarization, edge detection, segmentation, or other methods, to a frame of the live image 301 to detect the outer edge 306 of the optics (lens) of the intraocular lens (see FIG. 6F ).
  • the analyzing processor 211 identifies the position (pixel) 307 corresponding to the center of the optics of the intraocular lens based on the outer edge 306 detected (see FIG. 6G ).
  • the analyzing processor 211 finds an approximate ellipse (approximate circle) to the outer edge 306 detected and then identifies the center of this approximate ellipse as the intraocular lens center position 307.
  • the process of identifying the center of this approximate ellipse is performed, for example, by means of known geometrical methods.
  • the processing of the step S4 can be applied to the frames that are sequentially acquired as the live image 301.
  • an eye center position and an intraocular lens center position can be detected by applying the processing of the step S4 to all the frames acquired as the live image 301 (time-series images).
  • an eye center position and an intraocular lens center position can be detected by applying the processing of the step S4 to one or more frames selected from all the frames acquired as the live image 301.
  • the process of selecting frames may include, for example, thinning of selecting frames at intervals of a predetermined number of frames.
  • the controller 200 displays information that represents the eye center position (referred to as eye center position information) and information that represents the intraocular lens center position (referred to as intraocular lens center position information) on the live image.
  • FIG. 6H shows an example of the mode of the display in the step S5.
  • in the live image 301 of FIG. 6H, the cornea 308 of the subject's eye, the pupil (not shown), the intraocular lens 309 inserted in the step S3, and so forth are depicted.
  • the eye center position information 310 that represents the eye center position detected from the live image 301 in the step S4 and the intraocular lens center position information 311 that represents the intraocular lens center position also detected from the live image 301 in the step S4 are displayed on the live image 301 of the present example.
  • the eye center position information 310 and the intraocular lens center position information 311 are displayed in mutually different aspects (mutually different visual aspects, mutually different appearances, mutually different ways or manners).
  • the user can easily recognize both the eye center position and the intraocular lens center position, and can also easily understand the positional relationship (relative position) between the eye center position and the intraocular lens center position.
  • Such a display mode contributes to simplification and facilitation of the intraocular lens positioning.
  • FIG. 6I shows another example of the mode of the display in the step S5.
  • in the live image 301 of the present example, as in the case of FIG. 6H, the cornea 308 of the subject's eye, the pupil (not shown), the intraocular lens 309 inserted in the step S3, and so forth are displayed. Further, the eye center position information 310 and the intraocular lens center position information 311 are displayed on the live image 301.
  • the controller 200 displays the guide information 312 for eliminating or canceling out the positional difference.
  • the guide information 312 includes a group of arrow-shaped images that indicates a direction in which the intraocular lens 309 should be moved.
  • the guide information 312 of the present example includes the three arrow-shaped images each pointing to the left.
  • the direction indicated by the arrow-shaped images represents a direction in which the intraocular lens 309 should be moved, that is, the movement direction of the intraocular lens 309 in order to eliminate or cancel out the misalignment (displacement, deviation) of the intraocular lens 309 with respect to the subject's eye.
  • other guide information may be displayed that represents a direction of misalignment of the intraocular lens 309 with respect to the subject's eye.
  • the number of arrow-shaped images indicates a distance by which the intraocular lens 309 should be moved, that is, the amount or distance of the misalignment of the intraocular lens 309 with respect to the subject's eye.
  • the amount of misalignment corresponding to one arrow-shaped image is determined in advance.
  • from the guide information 312 of the present example, the user can understand that the intraocular lens 309 is misaligned to the right with respect to the subject's eye, and that the intraocular lens 309 should be moved to the left by a distance corresponding to the number of the displayed arrow-shaped images, which is three.
  • the guide information 312 of the present example is generated based on the eye center position and the intraocular lens center position acquired in the step S4.
  • the direction of the group of arrow-shaped images in the guide information 312 is determined as the direction of the vector whose initial point is located at the intraocular lens center position and whose terminal point is located at the eye center position, and the number of arrow-shaped images is determined based on the magnitude (length) of this vector.
  • the guide information 312 represents the movement direction and the movement amount of the intraocular lens 309 in order to align the intraocular lens center position with the eye center position (in order to match these positions), that is, the movement direction and the movement amount of the intraocular lens 309 to achieve centering of the intraocular lens 309.
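The rule just described (arrow direction from the vector whose initial point is the intraocular lens center and whose terminal point is the eye center, arrow count from the magnitude of that vector) can be sketched as follows. The helper name, the four-way direction quantization, and the per-arrow unit distance are illustrative assumptions of this sketch.

```python
import math

def guide_arrows(lens_center, eye_center, unit_distance):
    """Compute guide information for centering an intraocular lens.

    Returns (direction, n_arrows), where `direction` is the principal
    movement direction (quantized to left/right/up/down in image
    coordinates, +y pointing down) and `n_arrows` is the number of
    arrow-shaped images, one arrow per `unit_distance` of misalignment.
    """
    # Vector from the lens center (initial point) to the eye center
    # (terminal point): the direction the lens should be moved.
    dx = eye_center[0] - lens_center[0]
    dy = eye_center[1] - lens_center[1]
    magnitude = math.hypot(dx, dy)
    if abs(dx) >= abs(dy):
        direction = 'right' if dx >= 0 else 'left'
    else:
        direction = 'down' if dy >= 0 else 'up'
    n_arrows = math.ceil(magnitude / unit_distance) if magnitude > 0 else 0
    return direction, n_arrows
```

With the lens center 10 pixels to the right of the eye center and a 4-pixel unit distance, this yields three left-pointing arrows, matching the situation depicted in FIG. 6I.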
  • the user can conduct the centering operation of the intraocular lens 309 by referring to the guide information 312 displayed.
  • the user can conduct the centering operation of the intraocular lens 309 by referring to these pieces of information displayed.
  • the steps S4 and S5, as well as the centering operation of the intraocular lens 309 by the user (surgeon, operator) are repeated and continued until the position determination (positioning) of the intraocular lens 309 is completed (S6: No).
  • the steps S4 and S5 are performed at predetermined time intervals. With this, the user can perform the positioning operation (centering operation) of the intraocular lens 309 while referring to the eye center position information 310 and the intraocular lens center position information 311 that represent the positional relationship between the subject's eye and the intraocular lens 309 in real time.
  • the user can perform the positioning operation of the intraocular lens 309 while referring to the guide information 312 that represents the direction in which the intraocular lens 309 should be moved and the distance by which the intraocular lens 309 should be moved.
  • when the positioning of the intraocular lens 309 is completed (S6: Yes), the detection of the eye center position and the intraocular lens center position, the display of the eye center position information 310 and the intraocular lens center position information 311, and the display of the guide information 312 may be ended (End). Then, the user can move on to the next step of cataract surgery.
  • an intraocular contact lens that has a hole formed in the center of the lens has recently been developed.
  • Such an intraocular contact lens is referred to as a hole ICL.
  • the hole ICL allows aqueous humor to move through the hole, thereby eliminating the need for iridotomy.
  • the hole is very small and has no effect on vision.
  • the ophthalmic observation apparatus 1 may detect the hole formed in the center by the analyzing processor 211, and then display information that represents the position of the detected hole together with a live image by the controller 200. This makes it possible to obtain the same or similar effects as those of the above-described embodiment examples. It should be noted that when the visibility of the hole of the hole ICL depicted in the live image is sufficiently high, the detection of the hole and the display of the position information of the hole may be unnecessary. In such a case, the user can perform positioning (centering) of the intraocular contact lens by referring to the image of the hole itself depicted in the live image and the eye center position information.
  • a hole ICL is an example of an artificial object that has a feature point (e.g., a hole).
  • the position of the feature point of such an artificial object is designed with high accuracy (high precision) in advance. Therefore, the positional relationship between the feature point and the center of the optics (lens) is known. Note that the feature point may or may not coincide with the center of the optics.
  • the position information of the feature point of the artificial object may be stored in advance in the controller 200 (in the memory 202), the position of the center of the optics (or another site of the optics) of the artificial object may be calculated by referring to the stored position information, and information that represents the calculated position may be displayed together with a live image.
  • information on an artificial object to be implanted may be registered in advance, and information that represents the center position of the optics (or a predetermined site of the optics) may be displayed together with a live image.
  • the information on the artificial object may include any of the following kinds of information, for example: position information of a feature part of the artificial object (e.g., hole, mark, character string, unevenness, pattern, etc.); shape information of the whole or part of the artificial object; thickness distribution information of the whole or part of the artificial object; color information of the whole or part of the artificial object; and information of other kinds.
  • Display of the position information of a predetermined site of the artificial object may be started, for example, in response to commencement of depiction of an image of the artificial object within the pupil region of the live image.
  • the eye center position is identified based on the corneal ring of the subject's eye, but the method of identifying an eye center position is not limited to this. Even in the case where an eye center position is identified by means of another method, it is possible to present the same or similar guide information as or to any of those of the above embodiment examples.
  • Examples of other methods for identifying an eye center position include a method based on a site of an eye other than the corneal ring (e.g., pupil, iris), a method based on projected light, a method based on an instrument or a mark (sign) placed on or attached to the eye, and a method based on information obtained from other systems (e.g., augmented reality system, virtual reality system).
  • the positional relationship between the subject's eye and the lens is determined based on the positional relationship between the center of the eye and the center of the lens; however, the method for determining the positional relationship between the subject's eye and the lens is not limited to this.
  • the positional relationship between the subject's eye and the lens may be determined based on the positional relationship between the corneal ring (or, the pupil edge, the iris inner edge, the iris outer edge) and the lens edge (the outer edge of the lens).
  • the positional relationship between the subject's eye and the lens can be determined based on a distance distribution between the corneal ring (or, the pupil edge, the iris inner edge, the iris outer edge) and the lens edge (the outer edge) by utilizing the fact that the corneal ring (or, the pupil edge, the iris inner edge, the iris outer edge) and the lens edge (the outer edge) are both of substantially circular shapes or substantially elliptic shapes.
  • the positional relationship between the subject's eye and the lens can be determined based on a distance distribution between an approximate figure to the corneal ring (or, the pupil edge, the iris inner edge, the iris outer edge) and an approximate figure to the lens edge (the outer edge).
  • the ophthalmic observation apparatus 1 is capable of displaying the live image 401 of the subject's eye on the display device 3, displaying the corneal ring image 402, the pupil edge image 403, and the intraocular lens outer edge image 404 on the live image 401, and further displaying the vertical line image 405 and the horizontal line image 406.
  • Both the vertical line image 405 and the horizontal line image 406 are arranged, for example, to pass through the center of the corneal ring image 402 having a circular shape or an elliptic shape. Note that only one of the vertical line image 405 and the horizontal line image 406 may be displayed, a line image(s) oriented in an oblique direction may be displayed, or a line image that has a shape other than a straight line (e.g., a curved line) may be displayed. In some examples, the line image may be arranged to pass through the center of the pupil edge image 403 or the center of the intraocular lens outer edge image 404.
  • the positional relationship between the corneal ring image 402 (the pupil edge image 403) and the intraocular lens outer edge image 404 can be automatically assessed. Also, the user can easily understand the positional relationship between the corneal ring image 402 (the pupil edge image 403) and the intraocular lens outer edge image 404 by referring to the line image(s).
  • the automatic assessment processing of the positional relationship between the corneal ring image 402 (the pupil edge image 403) and the intraocular lens outer edge image 404 includes a process of calculating a predetermined assessment parameter and a determination process based on the assessment parameter calculated.
  • the assessment parameter may be, for example, either one of the degree of the uniformity of a distribution of the space (gap, distance) between the corneal ring image 402 (the pupil edge image 403) and the intraocular lens outer edge image 404, or the degree of the concentricity (or eccentricity) between the corneal ring image 402 (the pupil edge image 403) and the intraocular lens outer edge image 404.
  • the ophthalmic observation apparatus 1 may display, for example, guide information for uniformizing the space distribution (gap distribution, distance distribution) between the corneal ring image 402 (the pupil edge image 403) and the intraocular lens outer edge image 404, or guide information for increasing the degree of the concentricity.
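The assessment parameters described above (the uniformity of the gap distribution between the two contours, and their concentricity) admit a simple sketch for the approximately circular case. The function name, the angular sampling scheme, and the use of standard deviation as the uniformity measure are assumptions of this illustration, not elements specified by the apparatus.

```python
import math

def assess_centering(eye_center, eye_radius, lens_center, lens_radius,
                     n_samples=360):
    """Assess lens centering from two approximately circular contours.

    Returns (gap_stddev, center_offset):
      gap_stddev    - spread of the radial gap between the lens outer edge
                      and the corneal ring (0 when the gap is uniform)
      center_offset - distance between the two centers (0 when concentric)
    """
    gaps = []
    for k in range(n_samples):
        theta = 2 * math.pi * k / n_samples
        # Sample a point on the lens outer edge.
        px = lens_center[0] + lens_radius * math.cos(theta)
        py = lens_center[1] + lens_radius * math.sin(theta)
        # Radial gap from that point to the corneal ring.
        gaps.append(eye_radius - math.hypot(px - eye_center[0],
                                            py - eye_center[1]))
    mean = sum(gaps) / n_samples
    gap_stddev = math.sqrt(sum((g - mean) ** 2 for g in gaps) / n_samples)
    center_offset = math.hypot(lens_center[0] - eye_center[0],
                               lens_center[1] - eye_center[1])
    return gap_stddev, center_offset
```

A determination process could then, for example, compare `gap_stddev` or `center_offset` against predetermined thresholds to decide whether centering is acceptable.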
  • the guide information 407, including a group of arrow-shaped images, may be displayed to represent the direction in which the intraocular lens should be moved and the distance by which the intraocular lens should be moved. It should be noted that the guide information is not limited to the present example.
  • Some embodiment examples provide a method of controlling an ophthalmic observation apparatus. It is possible to combine any items or matters relating to the ophthalmic observation apparatus 1 of the above embodiment examples with the example of the method described below.
  • An ophthalmic observation apparatus controlled by the method of an aspect example includes an optical system for generating a moving image of a subject's eye (e.g., the illumination optical system 30 and the observation optical system 40) and a processor (e.g., the controller 200 and the data processor 210).
  • the method of the present aspect example first causes the optical system to generate a moving image of the subject's eye into which an artificial object has been inserted. Further, the method of the present aspect example causes the processor to analyze a still image included in the generated moving image to identify the first site image corresponding to a predetermined site of the subject's eye and the second site image corresponding to a predetermined site of the artificial object. In addition, the method of the present aspect example causes the processor to display, on a display device, the moving image generated, the first position information that represents a position of the first site image identified, and the second position information that represents a position of the second site image identified.
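The control flow of the method of the present aspect example can be summarized by a minimal sketch, in which the detector and display callables stand in for the analyzing processor and the controller; all names are illustrative and no specific detection algorithm is implied.

```python
def observe_loop(frames, detect_first_site, detect_second_site, display):
    """Per-frame sketch of the described method: for each still image
    (frame) of the moving image, identify the first site image
    (predetermined site of the subject's eye) and the second site image
    (predetermined site of the artificial object), then display the
    frame together with both pieces of position information.
    """
    results = []
    for frame in frames:
        first_pos = detect_first_site(frame)    # e.g. eye center position
        second_pos = detect_second_site(frame)  # e.g. lens center position
        display(frame, first_pos, second_pos)   # overlay on the live image
        results.append((first_pos, second_pos))
    return results
```

Running this loop over the sequentially acquired frames corresponds to the real-time updating of the first and second position information alongside the live image.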
  • the method of the present aspect example is capable of achieving the same actions and effects as those of the ophthalmic observation apparatus 1 of the above embodiment examples.
  • the resulting method becomes capable of achieving the actions and effects corresponding to the combined matters and/or items.
  • Some embodiment examples provide a program causing a computer to execute the method of the aspect example described above. It is possible to combine any of the matters and items relating to the ophthalmic observation apparatus 1 of the above embodiment examples with such a program.
  • the program thus configured is capable of achieving the same actions and effects as those of the ophthalmic observation apparatus 1 of the above embodiment examples.
  • the resulting program is capable of achieving the actions and effects corresponding to the combined matters and/or items.
  • Some embodiment examples provide a computer-readable non-transitory recording medium storing the program described above. It is possible to combine any of the matters and items relating to the ophthalmic observation apparatus 1 of the above embodiment examples with such a recording medium.
  • the non-transitory recording medium may be in any form, and examples thereof include magnetic disks, optical disks, magneto-optical disks, and semiconductor memories.
  • the recording medium thus configured is capable of achieving the same actions and effects as those of the ophthalmic observation apparatus 1 of the above embodiment examples.
  • the resulting recording medium is capable of achieving the actions and effects corresponding to the combined matters and/or items.

EP20959942.2A 2020-10-27 2020-12-09 Dispositif d'observation ophtalmologique, son procédé de commande, programme et support de stockage Pending EP4238482A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063106087P 2020-10-27 2020-10-27
PCT/JP2020/045754 WO2022091428A1 (fr) 2020-10-27 2020-12-09 Dispositif d'observation ophtalmologique, son procédé de commande, programme et support de stockage

Publications (1)

Publication Number Publication Date
EP4238482A1 true EP4238482A1 (fr) 2023-09-06



