WO2014084231A1 - Fundus imaging apparatus - Google Patents
Fundus imaging apparatus
- Publication number
- WO2014084231A1 (PCT/JP2013/081856)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- fundus
- light
- unit
- image
- optical system
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/102—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for optical coherence tomography [OCT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/12—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
- A61B3/0041—Operational features thereof characterised by display arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0091—Fixation targets for viewing direction
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/1025—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for confocal scanning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/12—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
- A61B3/1208—Multiple lens hand-held instruments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/14—Arrangements specially adapted for eye photography
- A61B3/15—Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection; with means for relaxing
- A61B3/152—Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection; with means for relaxing for aligning
Definitions
- the present invention relates to a fundus imaging apparatus.
- a fundus imaging apparatus such as an optical scanning ophthalmoscope or an optical coherence tomography apparatus
- a fundus imaging apparatus that scans the fundus using a laser beam and detects the return light to acquire a fundus image
- an apparatus that projects an alignment index onto an eye to be examined and positions the apparatus main body is known (for example, see Patent Document 2).
- a fundus imaging system including a fundus imaging apparatus such as a fundus camera, an optical coherence tomography apparatus, or an optical scanning ophthalmoscope
- the fundus image and the subject are associated with each other by a subject ID (for example, see Patent Document 3).
- a medical data search system that links medical data (such as disease names) described in a patient's name and medical record and fundus images and stores them as electronic data (for example, see Patent Document 4).
- Such a fundus imaging apparatus is desirably small and lightweight.
- providing a dedicated optical system for projecting and receiving an alignment index and a focus index, as in a general stationary fundus photographing apparatus, is conceivable.
- providing such a dedicated optical system leads to an increase in size and complexity of the apparatus, and is not suitable for a portable type that is required to be reduced in size, weight, or compact.
- the present invention has been made in view of the above circumstances, and its object is to provide a fundus imaging apparatus that can perform alignment and focusing without providing a dedicated optical system for projecting an index for alignment or focusing.
- the fundus imaging apparatus of the embodiment includes a scanning optical system, a control circuit unit, and an image construction unit.
- the scanning optical system scans the fundus of the subject's eye with the light from the light source unit, and receives the return light from the fundus at the light receiving unit.
- the control circuit unit controls the scanning optical system so that a scanning locus by light is formed on the fundus.
- the image constructing unit constructs an image of the fundus based on the light reception signal from the light receiving unit and the scanning locus position.
- the control circuit unit can operate as an alignment mode for controlling the scanning optical system so that an alignment index for performing alignment of the scanning optical system with respect to the eye to be examined is projected onto the fundus based on light from the light source unit.
- alignment and focusing can be performed without providing a dedicated optical system for projecting an index for alignment or focusing.
- in FIG. 1, reference numeral 1 denotes an optical scanning ophthalmoscope main body as the apparatus main body, and reference numeral 2 denotes a handle member.
- an eyepiece tube portion 3 is provided on the front side (side facing the subject) of the optical scanning ophthalmoscope main body 1.
- a transparent plate 4 is provided on the back side of the optical scanning ophthalmoscope main body 1 (the side facing the examiner). The examiner can visually recognize the liquid crystal display screen 5 of the monitor unit described later via the transparent plate 4.
- the optical scanning ophthalmoscope main body 1 includes a laser scanning optical system, a control circuit unit that controls the laser scanning optical system, a fundus image construction unit (image construction unit), a lighting control circuit unit, a monitor unit, a power supply unit, and the other drive mechanisms necessary for fundus observation and photography.
- a power button B for power on/off is provided as shown in FIG.
- the eyepiece tube portion 3 of this embodiment is provided on the left side as seen from the side facing the examiner as shown in FIG.
- the handle member 2 is provided with a gripping portion 6 and a detachable protrusion portion 7.
- a trapezoidal concave portion 8 indicated by a dotted line in FIG. 2 is provided at a position corresponding to the arrangement position of the eyepiece tube portion 3 on the lower surface of the optical scanning ophthalmoscope main body 1.
- the detachable protrusion 7 has a shape corresponding to the shape of the recess 8 and is fitted into the recess 8.
- One of the detachable protrusion 7 and the recess 8 is provided with a magnet member (not shown).
- a magnetic member that is acted on by the attractive force of the magnet member is provided on the other of the detachable protrusion 7 and the recess 8.
- the handle member 2 is attached to and detached from the optical scanning ophthalmoscope main body 1 by the attractive force of the magnet member, but is not limited to this configuration.
- the grip portion 6 of the handle member 2 is provided with a photographing button 9, an alignment button 10, and a focus button 11 (these functions will be described later).
- the optical scanning ophthalmoscope of this embodiment can be used even when mounted on the support base 12 as shown in FIG.
- a forehead support 13 is provided at the top of the support 12.
- An engaging protrusion (not shown) is formed on the inclined portion 14 of the support base 12.
- the optical scanning ophthalmoscope main body 1 is supported and fixed to the support base 12 by fitting the concave portion 8 to the engaging projection.
- on the upper surface of the optical scanning ophthalmoscope main body 1, a photographing button 9', an alignment button 10', and a focusing button 11' are provided.
- the operations on these buttons 9 'to 11' are effective when the optical scanning ophthalmoscope main body 1 is mounted on the support base 12.
- the photographing button 9 ′, the alignment button 10 ′, and the focusing button 11 ′ are switched between valid / invalid via a detection switch (not shown) provided on the support 12.
- it can also be configured so that the photographing button 9', the alignment button 10', and the focusing button 11' are usable even when the main body is not mounted on the support base 12.
- a plug (USB) 14 for connecting a portable information device such as a smartphone, a tablet terminal, or a PDA is provided.
- the plug 14 is connected to the portable information device 16 shown in FIG.
- the portable information device 16 is provided with a plurality of operation buttons 17.
- the plurality of operation buttons 17 are used as substitutes for the photographing button 9, the alignment button 10, and the focus button 11, but the configuration is not limited thereto.
- the fundus image EGr is displayed on the display screen 18 of the portable information device 16, but the present invention is not limited to this.
- the fundus image EGr may be stored in a built-in memory unit described later, and the fundus image EGr may be output in response to the operation of the output button OB shown in FIG.
- a scanning optical system and a control circuit unit in a portable type optical scanning ophthalmoscope will be described.
- the configuration according to this embodiment can also be applied to an optical scanning ophthalmoscope that can be used as both a portable type and a stationary type, or to a dedicated stationary-type optical scanning ophthalmoscope.
- the fundus image EGr is sequentially stored in the built-in memory unit.
- the fundus image EGr may be transmitted to a medical institution via a wired telephone line or a wireless telephone line.
- FIG. 7 is a block diagram showing a scanning optical system and a control unit of the optical scanning ophthalmoscope according to this embodiment.
- Reference numeral 20 denotes an illumination light source unit that emits illumination light.
- Reference numeral 21 denotes a light receiving portion.
- the illumination light source unit 20 includes an infrared light source 20a that emits infrared light, a blue light source 20b that generates B light (blue light), a green light source 20g that generates G light (green light), and a red light source 20r that generates R light (red light).
- each of these light sources is a highly spatially coherent light source, such as a semiconductor laser (including a wavelength-swept laser or a superluminescent diode), a solid-state laser, a gas laser, a fiber laser, or light emitted from one of these and coupled into an optical fiber.
- the infrared light Pa emitted from the infrared light source 20a is condensed by the condenser lens 20a ′ and guided to the reflection mirror 20a ′′.
- the blue light B emitted from the blue light source 20b is condensed by the condenser lens 20b′ and guided to the dichroic mirror 20b′′.
- the green light G emitted from the green light source 20g is condensed by the condenser lens 20g ′ and guided to the dichroic mirror 20g ′′.
- the red light R emitted from the red light source 20r is condensed by the condenser lens 20r′ and guided to the dichroic mirror 20r′′.
- the dichroic mirror 20b "transmits infrared light Pa and functions to reflect blue light B.
- the dichroic mirror 20g′′ transmits infrared light Pa and blue light B and reflects green light G.
- the dichroic mirror 20r ′′ transmits red light R and reflects green light G, blue light B, and infrared light Pa.
- the optical paths of the infrared light Pa, the blue light B, the green light G, and the red light R are combined by the reflection mirror 20a′′ and the dichroic mirrors 20b′′, 20g′′, and 20r′′; these lights are then guided to the beam splitter 22, pass through it, and reach the MEMS mirror 23.
- the MEMS mirror 23 functions as a biaxial optical scanner.
- the biaxial optical scanner is a MEMS mirror, but this is not a limitation.
- a two-axis optical scanner may be configured by combining two uniaxial mirror scanners such as a galvanometer mirror optical scanner and a polygon mirror optical scanner.
- a relay lens optical system may be provided between two uniaxial mirror scanners included in the biaxial optical scanner.
- the infrared light Pa, blue light B, green light G, and red light R are two-dimensionally deflected by the MEMS mirror 23. Further, these lights are guided to the relay lens 24 and imaged in the air as spot lights on a surface Er ′ conjugate with the fundus Er of the eye E to be examined.
- the light imaged in the air on the conjugate plane Er′ passes through the objective lens 25 serving as a focusing lens, is guided into the eye through the pupil Ep and the crystalline lens Ec of the eye E, and is imaged as spot light Sp on the plane Er′′ conjugate with the conjugate plane Er′.
- the objective lens 25 is provided in the eyepiece tube section 3 and is moved in the axial direction by a manual operation.
- when an annular ring member (not shown) of the eyepiece tube portion 3 is rotated according to the refractive power of the eye E, the objective lens 25 is moved in the optical axis direction; accordingly, the conjugate plane Er′′ is matched with the fundus Er, and as a result a clear spot light Sp is formed on the fundus Er.
- the fundus Er is scanned two-dimensionally by the action of the MEMS mirror 23.
- symbol SL indicates a scanning locus by the scanning spot light Sp
- symbol Br indicates a blood vessel
- symbol Pu indicates a papilla
- symbol Qu indicates a macular portion.
- the spot reflected light from the fundus Er passes through the crystalline lens Ec and the pupil Ep to the objective lens 25, is once imaged in the air on the conjugate plane Er′, is collimated by the relay lens 24, passes via the MEMS mirror 23, and is led to the beam splitter 22.
- the spot reflected light reflected by the beam splitter 22 is guided to the light receiving unit 21.
- the spot reflected light from the fundus Er is light returning from the position of the scanning spot light or its vicinity (return light), and includes, for example, at least one of specularly reflected light of the scanning spot light, scattered light of the scanning spot light, and fluorescence excited by the scanning spot light together with its scattered light.
- the light receiving unit 21 includes a dichroic mirror 21r ′′, a dichroic mirror 21g ′′, and a dichroic mirror 21b ′′.
- the dichroic mirror 21r′′ reflects the red light R and transmits the green light G, the blue light B, and the infrared light Pa.
- the dichroic mirror 21g ′′ reflects green light G and transmits blue light B and infrared light Pa.
- the dichroic mirror 21b ′′ reflects blue light B and transmits infrared light Pa.
- An imaging lens 21r ' is provided in the direction of reflection by the dichroic mirror 21r ".
- the red light R is imaged on the PD sensor 21r as an image receiving element by the imaging lens 21r'.
- An imaging lens 21g ' is provided in the direction of reflection by the dichroic mirror 21g ".
- the green light G is imaged on the PD sensor 21g as an image receiving element by the imaging lens 21g'.
- An imaging lens 21b ' is provided in the direction of reflection by the dichroic mirror 21b ".
- the blue light B is imaged on the PD sensor 21b as an image receiving element by the imaging lens 21b'.
- An imaging lens 21a ' is provided in the transmission direction by the dichroic mirror 21b ".
- the infrared light Pa is imaged on the PD sensor 21a as an image receiving element by the imaging lens 21a'.
- the light reception signals from the PD sensors 21a, 21b, 21g and 21r are input to the fundus image construction unit described later.
- the PD sensor 21a has sensitivity in the infrared region
- the PD sensor 21b has sensitivity in the blue wavelength region
- the PD sensor 21g has sensitivity in the green wavelength region
- the PD sensor 21r has sensitivity in the red wavelength region.
- the optical scanning ophthalmoscope main body 1 includes a movable case 1A indicated by a broken line in FIG. 1 and a drive mechanism (not shown) for moving the movable case 1A in the front-rear, up-down, and left-right directions.
- the movable case 1A and the drive mechanism correspond to a “drive unit”.
- in the movable case 1A, the illumination light source unit 20, the light receiving unit 21, the relay lens 24, the MEMS mirror 23, and the beam splitter 22 are provided.
- the optical system and the eyepiece tube portion 3 move integrally with the movable case 1A.
- the optical scanning ophthalmoscope main body 1 includes a power supply circuit unit 26, a control circuit unit 27, a lighting control circuit unit 28, a fundus image construction unit 29, and a monitor unit 30.
- the power supply circuit unit 26 is, for example, a type that can replace a battery or a type that can charge a battery.
- the power supply circuit unit 26 supplies power to the control circuit unit 27, the fundus image construction unit 29, the lighting control circuit unit 28, and the like.
- the control circuit unit 27 is provided with a program for controlling the illumination light source unit 20, a program for controlling the MEMS mirror 23, and a program for controlling the fundus image construction unit 29.
- the program for controlling the illumination light source unit 20 and the program for controlling the MEMS mirror 23 are provided in accordance with, for example, the following operation modes: an alignment adjustment mode for aligning the apparatus main body with respect to the eye E to be examined; a focusing adjustment mode for focusing on the fundus Er; a fundus observation mode; and an imaging mode.
- by operating the power button B, each circuit is activated. Since alignment adjustment and focus adjustment are not directly related to this embodiment, it is assumed here that these adjustment operations have been completed. Hereinafter, the observation mode and the imaging mode will be described, and then the association between the fundus image and personal information will be described.
- the optical scanning ophthalmoscope main body 1 automatically enters the personal information acquisition mode, and then shifts to the observation mode.
- the personal information acquisition mode will be described later, and the observation mode will be described first.
- the lighting control circuit unit 28 turns on the infrared light source 20a. Thereby, as shown in FIG. 8, spot light Sp which is infrared light is formed on the fundus Er.
- according to the program for controlling it, the MEMS mirror 23 is driven to draw a plurality of scanning trajectories SL from left to right, in order from the top, as shown in FIG. 8, for example. Thereby, the predetermined range of the fundus Er is scanned with the spot light Sp.
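The scan described above — successive left-to-right sweeps taken in order from the top — can be sketched as the sequence of deflection targets handed to a two-axis scanner. This is only an illustrative sketch; the function name, grid resolution, and (column, row) coordinate convention are assumptions, not the patent's actual MEMS drive program.

```python
def raster_scan_positions(n_rows, n_cols):
    """Deflection targets for a left-to-right, top-to-bottom raster
    scan: one (col, row) pair per spot position on the fundus."""
    positions = []
    for row in range(n_rows):        # trajectories in order from the top
        for col in range(n_cols):    # each trajectory runs left to right
            positions.append((col, row))
    return positions

# Example: a tiny 2-row, 3-column raster.
print(raster_scan_positions(2, 3))
# [(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (2, 1)]
```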
- the reflected light of the spot light Sp from the fundus Er is received by the infrared PD sensor 21a.
- the light reception signal from the PD sensor 21a is input to the fundus image construction unit 29.
- the fundus image constructing unit 29 includes an image processing unit 29a and a built-in memory unit 29b.
- the pixel position signal corresponding to the scanning position (scanning locus position in the horizontal direction, scanning locus position in the vertical direction) of the spot light Sp is input to the image processing unit 29a in accordance with a program for controlling the MEMS mirror 23.
- the image processing unit 29a constructs a fundus image for observation based on the input pixel position signal and the received light signal corresponding to the pixel position signal.
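The pairing of pixel position signals with the corresponding light reception signals can be illustrated with a minimal sketch: each received-light sample is written into the pixel addressed by its scan position. The function name and the list-of-lists image representation are assumptions for illustration, not the patent's implementation.

```python
def build_fundus_image(samples, n_rows, n_cols):
    """Construct an observation image: write each received-light
    intensity into the pixel addressed by its scan-position signal.
    `samples` is an iterable of ((col, row), intensity) pairs."""
    image = [[0] * n_cols for _ in range(n_rows)]
    for (col, row), intensity in samples:
        image[row][col] = intensity
    return image

samples = [((0, 0), 10), ((1, 0), 20), ((0, 1), 30), ((1, 1), 40)]
print(build_fundus_image(samples, 2, 2))  # [[10, 20], [30, 40]]
```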
- the image processing unit 29a outputs an image signal of the constructed fundus image to the monitor unit 30.
- the fundus image EGr is displayed on the liquid crystal display screen 5 of the monitor unit 30 (see FIG. 9).
- the examiner operates the photographing button 9 while observing the displayed fundus image EGr.
- the operation mode of the optical scanning ophthalmoscope is switched to the imaging mode.
- the lighting control circuit unit 28 turns on the blue light source 20b, the green light source 20g, and the red light source 20r at the same time.
- the lighting time is set to 100 milliseconds, for example. While the blue light source 20b, the green light source 20g, and the red light source 20r are turned on, the MEMS mirror 23 is driven so as to draw the same scanning locus SL as the scanning locus SL when the fundus image for observation is constructed.
- the fundus Er is scanned in the same manner as in the observation mode, using the white spot light Sp that is visible light.
- the reflected light of the white spot light Sp is received by the PD sensors 21b, 21g and 21r.
- light reception signals from the PD sensors 21b, 21g, and 21r are input to the fundus image construction unit 29.
- the fundus image constructing unit 29 constructs a color fundus image EGr based on the light reception signals from the PD sensors 21b, 21g, and 21r.
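Merging the three light reception channels into a color image amounts to combining, pixel by pixel, the intensities recorded by the red, green, and blue sensors. A minimal sketch, assuming 8-bit channel values packed into one RGB integer (the packing scheme and function names are assumptions, not something the patent specifies):

```python
def compose_color_pixel(r, g, b):
    """Pack per-channel intensities (assumed 0-255) from the red,
    green and blue PD sensors into one RGB pixel value."""
    return (r << 16) | (g << 8) | b

def compose_color_image(red, green, blue):
    """Merge three equal-shape single-channel images (2-D lists)
    into one color fundus image of packed RGB pixels."""
    return [[compose_color_pixel(r, g, b)
             for r, g, b in zip(row_r, row_g, row_b)]
            for row_r, row_g, row_b in zip(red, green, blue)]

print(hex(compose_color_pixel(255, 0, 0)))  # 0xff0000 (pure red)
```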
- the color fundus image EGr is stored in the built-in memory unit 29b.
- the optical scanning ophthalmoscope main body 1 can be provided with a playback button (not shown).
- in response to an operation on the playback button, the color fundus image EGr can be displayed on the liquid crystal display screen 5.
- the acquired color fundus image EGr may also be automatically transmitted to a medical institution.
- the operation mode of the optical scanning ophthalmoscope is automatically switched to the observation mode.
- the oblique light beam Qp, which enters the anterior eye part obliquely from the periphery of the illumination light flux from the light source unit 20, is cut off symmetrically by the iris Rp.
- the amount of light reaching the periphery of the fundus Er is reduced.
- the amount of the spot light Sp formed on the periphery of the fundus Er is reduced, the periphery of the fundus Er is illuminated darkly, and the center of the fundus is illuminated brightly.
- the broken line in FIG. 10 indicates that the light beam is blocked by the iris Rp.
- the light amount of the spot light Sp formed on one of the peripheral portions of the fundus Er is reduced, and the light amount of the other spot light Sp is not reduced.
- one of the peripheral portions of the fundus Er is illuminated darkly and the other is illuminated brightly. Therefore, uneven brightness occurs in the illumination area of the fundus oculi Er.
- the broken line in FIG. 12 indicates that the light beam is blocked by the iris Rp.
- the oblique light beam Qp passes symmetrically through the pupil Ep as shown in FIG. Therefore, it is possible to illuminate the fundus Er without uneven brightness.
- the operation mode of the optical scanning ophthalmoscope main body 1 shifts to the alignment mode.
- the control circuit unit 27 controls the MEMS mirror 23 so that the scanning locus SL of the spot light Sp is circular.
- the lighting control circuit unit 28 turns on the infrared light source 20a.
- in the alignment mode, the light reception signal from the PD sensor 21a is simultaneously input to the control circuit unit 27.
- the oblique light beam Qp ′ forming the circular locus CRSp passes through the pupil Ep as shown in FIG.
- a circular locus CRSp formed by the spot light Sp is formed on the fundus Er.
- the fundus image construction unit 29 constructs an image corresponding to the circular locus CRSp. Further, an alignment index image corresponding to the circular locus CRSp is displayed on the liquid crystal display screen 5.
- when the apparatus main body is not aligned with the eye E, for example when the alignment in the optical axis direction is inappropriate, the oblique light beam Qp′ forming the circular locus CRSp is blocked by the iris Rp, as shown in FIG. 18. Therefore, the circular locus CRSp due to the spot light Sp is not formed on the fundus Er, and the alignment index image corresponding to the circular locus CRSp is not displayed on the liquid crystal display screen 5.
- the examiner can recognize from the alignment index image displayed on the liquid crystal display screen 5 whether or not a part of the circular locus CRSp is missing, and thereby grasp the alignment state of the apparatus main body with respect to the eye E (that is, whether the alignment is appropriate or inappropriate).
- the examiner can determine the direction in which the apparatus main body should be moved with respect to the eye E by checking the direction in which the alignment index image corresponding to the circular locus CRSp is missing.
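One conceivable way to quantify "the direction in which the alignment index image is missing" is to average, on the unit circle, the angular positions of the locus samples whose received light fell below a threshold. This is a hypothetical sketch (the sampling format, angle convention, and threshold are assumptions), not the patent's determination method:

```python
import math

def missing_arc_direction(arc_samples, threshold):
    """arc_samples: list of (angle_rad, received_intensity) around the
    circular alignment locus. Returns the mean angle (rad) of the
    samples whose intensity fell below `threshold` (the missing part
    of the locus), or None if the whole circle was detected."""
    missing = [a for a, i in arc_samples if i < threshold]
    if not missing:
        return None  # alignment appropriate: no part of the circle is lost
    # Average on the unit circle so angles near 0 and 2*pi do not cancel.
    x = sum(math.cos(a) for a in missing)
    y = sum(math.sin(a) for a in missing)
    return math.atan2(y, x)
```

For instance, a locus missing only around the 0-degree side yields a mean direction near 0 rad, telling the examiner which way to move the apparatus main body.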
- Modification examples of the alignment index: FIGS. 20 to 26 are explanatory diagrams showing modifications of the alignment index.
- the configuration in which the scanning trajectory of the spot light Sp by the MEMS mirror 23 is changed between observation / shooting of the fundus and alignment adjustment has been described.
- the alignment index can also be projected by applying the same scanning method as that used at the time of observation/photographing and synchronously controlling the lighting timing of the infrared light source 20a according to the scanning position of the MEMS mirror 23.
- FIG. 20 shows control for synchronizing the scanning position by the MEMS mirror 23 and the lighting timing of the infrared light source 20a so that the spot light Sp draws a circular discontinuous locus on the fundus Er.
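The synchronization shown in FIG. 20 can be sketched as a predicate evaluated at each scan position: the infrared source is lit only while the spot lies near the desired circle, so an ordinary raster scan leaves a circular (discontinuous) locus on the fundus. The tolerance band, coordinate convention, and function name below are assumptions for illustration:

```python
import math

def source_should_be_lit(scan_x, scan_y, cx, cy, radius, tol):
    """True when the scanning spot (scan_x, scan_y) lies within `tol`
    of the circle with center (cx, cy) and the given radius, i.e.
    when the infrared light source should be lit so that the raster
    scan draws a circular discontinuous locus."""
    return abs(math.hypot(scan_x - cx, scan_y - cy) - radius) <= tol
```

Evaluated for every position of the raster, this predicate carves a circle out of an ordinary observation-style scan.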
- this can also be applied to reciprocating scanning in which the scanning trajectory SL is drawn with one stroke.
- the following configuration can be used.
- the infrared light source 20a is always turned on.
- by synchronizing the scanning position of the MEMS mirror 23 with the output timing of the PD sensor 21a, only the light reception signal of the reflected light from the scanning positions corresponding to the alignment index is sent from the PD sensor 21a to the fundus image construction unit 29.
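This receiver-side alternative — source always on, reception gated by scan position — can be sketched as a filter that forwards only the samples whose scan position belongs to the alignment index. The sample format and set-of-positions representation are assumptions for illustration:

```python
def gate_received_samples(samples, index_positions):
    """With the light source always on, forward to the image
    construction stage only the light-reception samples whose scan
    position belongs to the alignment index.
    `samples`: list of ((col, row), intensity) pairs.
    `index_positions`: set of (col, row) positions forming the index."""
    return [s for s in samples if s[0] in index_positions]

samples = [((0, 0), 5), ((1, 0), 7), ((2, 0), 9)]
print(gate_received_samples(samples, {(1, 0)}))  # [((1, 0), 7)]
```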
- the state shown in FIG. 22 may occur as an example in which the alignment of the apparatus main body with respect to the eye E is inappropriate.
- all three spot lights Sp on the upper right side are missing, and two of the three spot lights Sp on the lower right side are missing.
- the examiner can determine the direction and amount in which the apparatus main body should be moved with respect to the eye E based on such a missing state of the spot light Sp.
- the examiner can determine the direction and amount of movement of the apparatus main body with respect to the eye E by grasping the missing state of the spot light Sp.
- Modification 4: FIG. 25 shows a case where, when the scanning locus SL is written with one stroke in the horizontal direction, alignment indexes each consisting of three spot lights Sp arranged in the vertical direction are formed at the upper center position and the lower center position, respectively, and alignment indexes each consisting of three spot lights Sp arranged in the horizontal direction are formed at the left center position and the right center position, respectively. In other words, this modification is configured as if a cross-shaped alignment index is formed on the fundus Er.
- the left three spot lights Sp are lost.
- the examiner can determine the direction and amount of movement of the apparatus main body with respect to the eye E based on the lack of the spot light Sp.
- the control circuit unit 27 can be provided with an alignment index determination unit 27a.
- the alignment index determination unit 27a moves the movable case 1A in the front-rear, up-down, and left-right directions according to the formation state of the spot light Sp serving as the alignment index, so that the alignment of the apparatus main body with respect to the eye E becomes appropriate.
- the operation mode of the optical scanning ophthalmoscope main body 1 shifts to the focusing mode. It is also possible to adopt a configuration that automatically shifts to the focusing mode with the completion of alignment adjustment as a trigger; in this case, the focus button 11 is not necessary.
- the fundus image EGr becomes unclear and dark as a whole.
- a line-shaped index SLP is formed on the fundus Er as a focus index.
- the control circuit unit 27 controls the lighting control circuit unit 28 so that a line-shaped index SLP extending in the left-right direction (horizontal direction) is formed at the center position of the fundus Er. At this time, the lighting control circuit unit 28 synchronizes the control of the MEMS mirror 23 and the lighting control of the infrared light source 20a.
- the light beam QP ′′ forming the line-shaped index SLP passes through the pupil Ep and irradiates the fundus Er.
- when the spot light Sp is not focused on the fundus Er, its diameter increases and a blurred spot is formed on the fundus Er, which increases the width of the line-shaped index SLP and blurs it.
- the fundus image constructing unit 29 constructs a focus index image corresponding to the line index SLP based on the light reception signal from the PD sensor 21a.
- the construction of the focus index image is performed as follows. Since the PD sensor 21a detects only the amount of light received at each scanning locus position, information on the size of the spot light Sp cannot be detected directly. However, when the spot is out of focus, the amount of light received by the PD sensor 21a decreases. Therefore, the amounts of reflected light in the focused and unfocused states are measured in advance, and information associating the width of the line-shaped index SLP with the amount of received light is created. By referring to this information, the line-shaped index SLP can be drawn with a width corresponding to the focus state.
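The pre-measured association between received light amount and drawn index width can be sketched as linear interpolation over a small calibration table. The table values below are invented placeholders; a real table would come from the actual measurement described above:

```python
def index_width_from_light(received, calibration):
    """Map a measured received-light amount to a drawn width for the
    line-shaped focus index, by linear interpolation over a table of
    (received_light, line_width) pairs sorted by received_light,
    ascending. High received light (in focus) -> thin line."""
    lights = [l for l, _ in calibration]
    widths = [w for _, w in calibration]
    if received <= lights[0]:
        return widths[0]            # clamp below the table
    if received >= lights[-1]:
        return widths[-1]           # clamp above the table
    for i in range(1, len(lights)):
        if received <= lights[i]:
            t = (received - lights[i - 1]) / (lights[i] - lights[i - 1])
            return widths[i - 1] + t * (widths[i] - widths[i - 1])

# Placeholder calibration: dim reflection -> wide blurred line.
table = [(20, 8.0), (60, 4.0), (100, 1.0)]
print(index_width_from_light(40, table))  # 6.0
```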
- if the light receiving unit includes a two-dimensional sensor instead of the PD sensor 21a, the size of the spot light can be detected directly by the two-dimensional sensor, so the line-shaped index SLP can be displayed without such graphic rendering.
- the focus index image corresponding to the line-shaped index SLP can be made thinner, brighter, and clearer by, for example, manually moving the objective lens 25 back and forth along the optical axis (see FIG. 29). In this way, the apparatus main body can be focused on the eye E to be examined.
- the focus index is not limited to the line index SLP.
- the ring-shaped index SCP shown in FIG. 30, the square-shaped index SRP shown in FIG. 31, and the star-shaped index SSP shown in FIG. 32 can be used as the focus index.
- as described above, the fundus imaging apparatus can form the alignment index and the focus index on the fundus Er by using the scanning optical system that is also used for scanning the fundus Er.
- the index is formed by infrared light, but the present invention is not limited to this.
- the index may be formed using at least one of red light (R light), green light (G light), and blue light (B light).
- the light amount applied at this time is set to a low light amount that does not damage the eye E.
- the subject himself can perform the focus adjustment.
- the fixation target can be projected onto the fundus Er while observing the fundus Er. This process is performed by the control circuit unit 27 controlling the lighting control circuit unit 28.
- the control circuit unit 27 is provided with a fixation target presentation program.
- the optical scanning ophthalmoscope performs scanning along the scanning locus SL for fundus observation and controls the lighting timing of the red light source 20r and the like so that the cross fixation target Fix shown in FIG. is formed at the scanning locus position corresponding to the imaging region (fixation position) of the fundus Er. As a result, spot light Sp of low-intensity R light is formed on the fundus Er.
- the fixation target is formed using visible light so that the subject can recognize it.
- the subject turns his or her eye so that the cross fixation target Fix can be visually recognized. Photographing is then performed while the subject fixates the cross fixation target Fix.
- the fixation target is drawn on the fundus Er using a scanning optical system for scanning the fundus Er. Therefore, a wide variety of fixation targets can be formed.
- Step S1 Turn on the power button B.
- Step S2 An optical scanning ophthalmoscope is placed in front of the eye E, and the eyepiece tube portion 3 faces the eye E to be imaged.
- Step S3 The alignment button 10 is operated to align the optical scanning ophthalmoscope body 1 with the eye E to be examined (alignment).
- Step S4 The focus button 11 is operated to adjust the focus on the fundus Er of the eye E.
- Step S5 Upon completion of alignment and focusing, the operation mode of the optical scanning ophthalmoscope automatically shifts to the observation mode. With the shift to the observation mode, the fixation target is projected onto the fundus Er. The apparatus may be configured so that the presentation position of the fixation target can be changed manually. A fixation target program that can selectively apply a plurality of presentation positions may also be incorporated in advance.
- Step S6 In response to an operation on the photographing button 9, the optical scanning ophthalmoscope performs photographing and stores the acquired fundus image EGr in the built-in memory unit 29b.
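The mode sequence of steps S3 to S6 (alignment, then focusing, then observation with photographing) can be summarized as a small state machine; the following is an illustrative sketch in which every class and method name is an assumption:

```python
from enum import Enum, auto

class Mode(Enum):
    ALIGNMENT = auto()
    FOCUS = auto()
    OBSERVATION = auto()

class Ophthalmoscope:
    """Minimal mode controller mirroring steps S3-S6 above."""
    def __init__(self):
        self.mode = Mode.ALIGNMENT
        self.stored_images = []

    def complete_alignment(self):
        self.mode = Mode.FOCUS          # S3 done -> focusing (S4)

    def complete_focus(self):
        self.mode = Mode.OBSERVATION    # S5: automatic shift; fixation target shown

    def photograph(self, fundus_image):
        if self.mode is Mode.OBSERVATION:   # S6: shutter acts only in observation mode
            self.stored_images.append(fundus_image)
```

This only illustrates the ordering constraint; the actual apparatus performs the transitions through the control circuit unit.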
- the configuration according to the embodiment described above is applicable to a portable type fundus photographing apparatus.
- the handle member 2 is also arranged on the left side, corresponding to the arrangement of the eyepiece tube portion 3 on the left side of the optical scanning ophthalmoscope main body 1 as viewed from the examiner.
- the handle member 2 may be provided at the center position in the left-right width direction of the optical scanning ophthalmoscope main body 1.
- the eyepiece tube portion 3 may be provided at the center position in the left-right width direction of the optical scanning ophthalmoscope main body 1, and the handle member 2 may also be provided at the center position.
- alternatively, the handle member 2 may be provided at the center position in the left-right width direction of the optical scanning ophthalmoscope main body 1, with eyepiece tube portions 3 disposed symmetrically on the left and right sides of that center position.
- the fundus imaging apparatus is not limited to the optical scanning ophthalmoscope.
- an optical coherence tomography apparatus such as that described as the background art can also be applied as the fundus imaging apparatus.
- the fundus imaging apparatus according to the embodiment need only be configured to scan the fundus with light, detect return light from the fundus, and image the fundus based on the detection result and the position of the scanning locus.
- the optical coherence tomograph according to the embodiment may be of a portable type (thus, the configuration shown in FIGS. 1 to 6 in the first embodiment can be applied).
- the optical coherence tomography may be used both as a portable type and a stationary type, or dedicated to the stationary type.
- the optical coherence tomography apparatus is a device that acquires morphological information, such as cross-sectional images and three-dimensional image data of the eye to be examined, and functional information, such as the blood flow state, by using an interference optical system. Such optical measurement is called optical coherence tomography (OCT).
- Time domain OCT and Fourier domain OCT are known as OCT methods.
- a spectral domain type using a low coherence light source (broadband light source) and a spectroscope and a swept source type using a wavelength swept light source (wavelength variable light source) are known.
- a configuration of the fundus imaging apparatus according to the embodiment will be described.
- a fundus photographing apparatus 100 illustrated in FIG. 35 includes an optical unit 110, a computer 200, and a user interface (UI) 300.
- the fundus imaging apparatus 100 replaces the optical scanning ophthalmoscope main body 1 of the first embodiment.
- the optical unit 110, the computer 200, and the user interface 300 may be provided integrally (that is, in a single housing). Alternatively, these may be distributed in two or more housings. In that case, a part of the fundus imaging apparatus may be provided in another apparatus.
- part or all of the computer 200 can be provided in a personal computer or a mobile terminal (tablet computer, mobile phone, smartphone, etc.).
- part or all of the user interface 300 can be provided in a personal computer, a portable terminal, a television receiver, a smart television, or the like.
- the optical unit 110 includes an optical system for performing OCT measurement and a mechanism for driving a predetermined optical element.
- the optical system divides the light from the light source 111 into measurement light and reference light, causes the return light of the measurement light from the subject eye E to interfere with the reference light, and detects the interference light.
- This optical system has the same configuration as a conventional spectral domain type OCT apparatus. That is, this optical system is configured to divide low-coherence light (broadband light) into reference light and measurement light, to generate interference light by causing the measurement light passing through the eye E to interfere with the reference light passing through the reference optical path, and to detect spectral components of the interference light. The detection result (detection signal) of the spectral components is sent to the computer 200.
- in the swept source type, a wavelength swept light source is provided instead of a low coherence light source, and an optical member for spectrally decomposing the interference light is not provided.
- a known technique corresponding to the type of OCT can be arbitrarily applied.
- the light source 111 outputs broadband low-coherence light.
- This low-coherence light includes, for example, wavelengths in the near-infrared band (about 800 nm to 900 nm) and has a temporal coherence length of about several tens of micrometers.
- near-infrared light having a wavelength band that cannot be visually recognized by the human eye (for example, a center wavelength of about 1040 to 1060 nm) may be used as the low-coherence light.
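For orientation, the coherence length quoted above can be estimated with the standard Gaussian-source formula l_c ≈ (2 ln 2 / π) · λ² / Δλ; the formula is standard OCT textbook material, and the example wavelength and bandwidth below are merely plausible values consistent with the text, not figures from the patent:

```python
import math

def coherence_length_um(center_nm, bandwidth_nm):
    """Temporal coherence length l_c ~ (2 ln 2 / pi) * lambda^2 / dlambda
    for a Gaussian-spectrum source, returned in micrometers; this also
    sets the axial resolution of the OCT measurement."""
    lc_nm = (2.0 * math.log(2) / math.pi) * center_nm ** 2 / bandwidth_nm
    return lc_nm / 1000.0  # convert nm -> micrometers
```

For a center wavelength of 850 nm and a bandwidth of 10 nm this gives roughly 32 micrometers, i.e. the "several tens of micrometers" stated above.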
- the light source 111 is configured to include a light output device such as a super luminescent diode (SLD), LED, or SOA (Semiconductor Optical Amplifier).
- the low coherence light output from the light source 111 is converted into a parallel light flux by the collimator lens 112 and guided to the beam splitter 113.
- the beam splitter 113 is, for example, a half mirror that reflects a predetermined ratio of light and transmits the remaining light.
- the beam splitter 113 splits this parallel light beam into measurement light and reference light.
- Measurement light is light applied to the eye E (also called signal light).
- An optical element group that forms an optical path (measurement optical path) of measurement light is called a measurement arm (also called a sample arm or the like).
- the reference light is light serving as a reference for extracting information included in the return light of the measurement light as an interference signal.
- the optical element group that forms the optical path of the reference light (reference optical path) is called a reference arm.
- One end of the reference optical path is a beam splitter 113 and the other end is a reference mirror 114.
- the reference light composed of the component transmitted through the beam splitter 113 is reflected by the reference mirror 114 and returns to the beam splitter 113.
- the reference mirror 114 is moved along the traveling direction of the reference light by the reference mirror driving unit 114A shown in FIG. Thereby, the length of the reference optical path is changed.
- the reference mirror driving unit 114A functions to relatively change the length of the measurement light path and the length of the reference light path, thereby changing the depth at which the interference intensity between the measurement light and the reference light is maximized.
- a configuration for changing the length of the reference optical path is applied, but a configuration for changing the length of the measurement optical path can be provided instead of or in addition to this configuration.
- the change in the length of the measurement optical path is realized by, for example, a corner cube that reflects incident measurement light in a direction opposite to the incident direction, and a mechanism for moving the corner cube in the incident direction and the reflection direction.
- the measurement light composed of the component reflected by the beam splitter 113 is deflected by a fixed mirror 115 inclined with respect to the measurement optical path and guided to the scanner 116.
- the scanner 116 is a two-axis optical scanner, for example. That is, the scanner 116 has a configuration capable of deflecting measurement light two-dimensionally.
- the scanner 116 is, for example, a mirror scanner including two mirrors that can be deflected in directions orthogonal to each other. This mirror scanner is configured as, for example, MEMS (Micro Electro Mechanical Systems). As another example, the scanner 116 may be configured using one mirror scanner and a rotary prism.
- the measuring light output from the scanner 116 is collimated light deflected two-dimensionally.
- the measurement light is focused by the relay lens 117 and is imaged in the air on a plane (fundus conjugate plane) Pc conjugate with the fundus oculi Ef. Further, the measurement light is again focused by the objective lens 119 having a function as a focusing lens and enters the eye E to be examined.
- the optical element (dichroic mirror 118) disposed on the fundus conjugate plane Pc will be described later.
- when a switching lens 127 (described later) is disposed in the measurement optical path, the measurement light passing through the objective lens 119 is refracted by the switching lens 127 and then enters the eye E.
- the objective lens 119 and the lens barrel portion 119A are moved along the measurement optical path by the lens barrel driving portion 119B shown in FIG.
- the objective lens 119 and the lens barrel 119A are moved in the optical axis direction according to the refractive power of the eye E to be examined.
- the fundus conjugate surface Pc is arranged at a position conjugate with the fundus Ef.
- the measurement light is projected onto the fundus oculi Ef as spot light.
- a configuration in which a focusing lens is provided separately from the objective lens 119 can also be applied.
- the switching lens 127 is a lens for switching the site (depth position) of the eye E to be imaged.
- examples of the target site for imaging include the fundus oculi Ef, the anterior segment, the crystalline lens, and the vitreous body.
- the projection lens LL for anterior segment imaging similar to that in the first embodiment is used as the switching lens 127.
- the switching lens 127 is inserted into and retracted from the measurement optical path by the lens driving unit 127A shown in FIG. When three or more sites are targeted, the apparatus can be configured so that two or more switching lenses can be selectively placed in the optical path.
- an optical element whose refractive power is variable, such as an Alvarez lens, can also be used as the switching lens 127.
- the switching lens 127 is retracted from the optical path when photographing the fundus oculi Ef, and the switching lens 127 is disposed on the optical path when photographing the anterior eye segment.
- Measurement light irradiated to the fundus oculi Ef is scattered (including reflection) at various depth positions of the fundus oculi Ef.
- the backscattered light (returned light) of the measurement light from the fundus oculi Ef travels in the opposite direction on the same path as the forward path and is guided to the beam splitter 113.
- the beam splitter 113 causes the return light of the measurement light to interfere with the reference light passing through the reference optical path. At this time, only the component of the return light that has traveled a distance substantially equal to the length of the reference optical path, that is, the backscattered light from the range within the coherence distance relative to the reference optical path length, substantially interferes with the reference light.
- the interference light generated via the beam splitter 113 is guided to the spectrometer 120.
- the interference light incident on the spectroscope 120 is split (spectral decomposition) by the diffraction grating 121, and is irradiated onto the light receiving surface of the CCD image sensor 123 via the lens 122.
- although the diffraction grating 121 shown in FIG. 35 is of a transmission type, other types of spectroscopic elements, such as a reflection type diffraction grating, can also be used.
- the CCD image sensor 123 is, for example, a line sensor or an area sensor, and detects each spectral component of the split interference light and converts it into an electric charge.
- the CCD image sensor 123 accumulates this electric charge, generates a detection signal, and sends it to the computer 200.
- the dichroic mirror 118 is inclined at a position corresponding to the fundus conjugate plane Pc of the measurement optical path.
- the dichroic mirror 118 is configured to transmit measurement light in the near-infrared band and reflect light in the visible band.
- a flat panel display (FPD) 125 and a lens 126 are provided in the optical path branched from the measurement optical path via the dichroic mirror 118.
- the flat panel display 125 displays information under the control of the control unit 210.
- information displayed on the flat panel display 125 includes various indexes presented to the eye E. Examples of such an index include a fixation target for fixating the eye E to be examined. It is also possible to present information such as the contents of instructions related to the examination.
- the flat panel display 125 is arranged through a lens 126 at a position conjugate with the fundus conjugate plane Pc (and thus a position conjugate with the fundus Ef).
- a liquid crystal display (LCD) or an organic EL display (OELD) is used as the flat panel display 125.
- Visible light output from the flat panel display 125 is reflected by the dichroic mirror 118 via the lens 126. The visible light then enters the eye E through the objective lens 119 and reaches the fundus oculi Ef. Thereby, an image (for example, a fixation target) based on the visible light is projected onto the fundus oculi Ef.
- an optical element such as a half mirror may be provided instead of the dichroic mirror 118. It is also possible to provide a reflection mirror configured to be inserted / retracted with respect to the measurement optical path. When the dichroic mirror 118 and the half mirror are provided, OCT measurement and index projection can be performed simultaneously. On the other hand, when a reflection mirror is provided, OCT measurement and index projection are executed at different timings.
- a Michelson type interferometer is used, but any type of interferometer such as a Mach-Zehnder type can be appropriately used.
- instead of the CCD image sensor 123, a light receiving element of another type, for example a CMOS (Complementary Metal Oxide Semiconductor) image sensor, can be used.
- the light reflected by the beam splitter 113 is used as measurement light, and the light transmitted through the beam splitter 113 is used as reference light.
- a member for converting the characteristics of the measurement light and / or the reference light can be provided.
- an optical attenuator (attenuator) or a polarization adjuster (polarization controller) can be provided in the reference optical path.
- the optical attenuator adjusts the amount of reference light under the control of the computer 200.
- the optical attenuator includes, for example, a neutral density filter and a mechanism for inserting / retracting the filter with respect to the reference optical path.
- the polarization adjuster adjusts the polarization state of the reference light under the control of the computer 200.
- the polarization adjuster includes, for example, a polarizing plate disposed in the reference optical path and a mechanism for rotating the polarizing plate. These are used to adjust the interference intensity between the return light of the measurement light and the reference light.
- the front image acquisition optical system forms an optical path branched from the measurement optical path, and includes, for example, an illumination optical system and a photographing optical system similar to a conventional fundus camera.
- the illumination optical system irradiates the eye E with illumination light composed of (near) infrared light or visible light.
- the photographing optical system detects the return light (reflected light) of the illumination light from the eye E.
- the photographing optical system includes a focusing lens (such as the objective lens 119 and the switching lens 127) common to the measurement optical path and / or a focusing lens independent of the measurement optical path.
- Another example of the front image acquisition optical system is an optical system similar to a conventional SLO.
- an alignment optical system similar to a conventional fundus camera can be provided.
- the alignment optical system forms an optical path branched from the measurement optical path, and generates an index (alignment index) for aligning the apparatus optical system with the eye E to be examined.
- this alignment is performed in directions (referred to as the xy directions) along a plane orthogonal to the measurement optical path (the optical axis of the objective lens 119).
- the alignment optical system generates two alignment light beams from a light beam output from an alignment light source (such as an LED) by a two-hole aperture.
- the two alignment light fluxes are guided to the measurement optical path via a beam splitter that is inclined with respect to the measurement optical path, and projected onto the cornea of the eye E to be examined.
- the cornea reflected light of the alignment light beam is detected by the image sensor of the front image acquisition optical system.
- auto alignment can be executed.
- the data processing unit 230 of the computer 200 analyzes the signal input from the image sensor of the front image acquisition optical system and specifies the positions of the two alignment index images.
- based on the positions of the two specified alignment index images, the control unit 210 moves the optical unit 110 in the xy directions so that the two corneal reflection lights are projected onto a predetermined position (for example, the center position) of the light receiving surface of the image sensor.
- the optical unit 110 is moved by the unit driving unit 110A.
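The xy correction implied by this auto alignment can be sketched as a simple proportional step that drives the midpoint of the two detected corneal reflections toward the target position; the function name and gain are hypothetical:

```python
def alignment_correction(p1, p2, target, gain=0.5):
    """Return an (x, y) move for the optical unit from the two detected
    alignment-index image positions (pixel coordinates), driving their
    midpoint toward `target` (e.g. the sensor center).  A proportional-
    control sketch; the gain value is an assumption."""
    mid_x = (p1[0] + p2[0]) / 2.0
    mid_y = (p1[1] + p2[1]) / 2.0
    return (gain * (target[0] - mid_x), gain * (target[1] - mid_y))
```

Repeating this step while re-detecting the index images converges the reflections onto the predetermined position.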
- a focus optical system similar to a conventional fundus camera can be provided.
- the focus optical system forms an optical path branched from the measurement optical path, and generates an index (focus index, split index) for performing focusing (focusing) on the fundus oculi Ef.
- the focus optical system generates two focusing light beams by a split indicator plate from light beams output from a focusing light source (LED or the like).
- the two focusing light beams are guided to the measurement optical path through the reflecting member that is inclined with respect to the measurement optical path, and projected onto the fundus oculi Ef.
- the fundus reflection light of the focusing light beam is detected by the image sensor of the front image acquisition optical system.
- autofocus can be executed.
- the data processing unit 230 of the computer 200 analyzes the signal input from the image sensor of the front image acquisition optical system and specifies the positions of the two split index images.
- based on the positions of the two specified split index images, the control unit 210 controls the movement of the focus optical system and the focusing lens (for example, moves the objective lens 119) so that the two fundus reflection lights are projected on a straight line on the light receiving surface of the image sensor.
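The autofocus loop described above (measure the split-image offset, move the focusing lens, repeat until the two images are collinear) might look like the following sketch, where both callbacks stand in for the hardware and all names, tolerances, and gains are assumptions:

```python
def autofocus(measure_offset, move_lens, tol=0.5, gain=0.05, max_iter=50):
    """Iteratively move the focusing lens (e.g. the objective lens 119)
    until the signed offset between the two split-index images falls
    within `tol`, i.e. until they lie on one straight line."""
    for _ in range(max_iter):
        offset = measure_offset()      # signed split-image offset (pixels)
        if abs(offset) <= tol:
            return True                # in focus
        move_lens(-gain * offset)      # proportional correction
    return False                       # did not converge
```

With a monotonic offset-versus-lens-position relationship and a modest gain, the loop converges in a handful of iterations.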
- Auto tracking can be performed.
- Auto-tracking is to move the optical unit 110 in accordance with the movement of the eye E.
- alignment and focusing are performed in advance.
- the auto tracking is executed, for example, as follows. First, a moving image of the eye E is captured using the front image acquisition optical system.
- the data processing unit 230 monitors the movement (position change) of the eye E by sequentially analyzing frames obtained by moving image shooting.
- the control unit 210 controls the unit driving unit 110A to move the optical unit 110 in accordance with the position of the eye E to be sequentially acquired. Thereby, the optical unit 110 can be made to follow the movement of the eye E in real time, and a suitable positional relationship in which the alignment and the focus are matched can be maintained.
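One iteration of this auto-tracking loop can be sketched as follows; the callbacks, reference position, and deadband threshold are assumptions for illustration only:

```python
def track(get_eye_position, move_unit, ref, deadband=2.0):
    """One auto-tracking step: compare the eye position extracted from the
    latest frame with the reference position fixed at alignment time, and
    move the optical unit by the difference if it exceeds the deadband."""
    x, y = get_eye_position()
    dx, dy = x - ref[0], y - ref[1]
    if abs(dx) > deadband or abs(dy) > deadband:
        move_unit(dx, dy)
        return True       # unit moved to follow the eye
    return False          # eye essentially still; no correction needed
```

Calling this once per analyzed frame keeps the optical unit following the eye in real time while suppressing jitter below the deadband.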
- it is also possible to project the alignment index, the focus index, and the fixation target onto the fundus oculi Ef using the measurement light deflected by the scanner 116, without providing a dedicated optical system for projecting these indexes. This process will be described later.
- Control system / Data processing system: A control system and a data processing system of the fundus imaging apparatus 100 according to the embodiment will be described. A configuration example of the control system and the data processing system is shown in FIG.
- the control system and data processing system are configured with the computer 200 as the center.
- the computer 200 includes a microprocessor, a RAM, a ROM, a hard disk drive, a communication interface, and the like.
- a storage device such as a hard disk drive stores a computer program for causing the fundus imaging apparatus 100 to execute various processes.
- the computer 200 may have a dedicated circuit board that executes a specific process. For example, a circuit board that performs processing for forming an OCT image may be provided.
- a user interface 300 is connected to the computer 200.
- the user interface 300 includes a display unit 310 and an operation unit 320.
- the display unit 310 includes a display device such as a flat panel display.
- the operation unit 320 includes an operation device such as a button, a key, a joystick, or an operation panel provided on the housing or the outside of the fundus imaging apparatus 100.
- the operation unit 320 may include an operation device (mouse, keyboard, trackpad, button, etc.) of the personal computer.
- the display unit 310 and the operation unit 320 do not need to be configured as individual devices.
- a device in which a display function and an operation function are integrated, such as a touch panel, can be used.
- in that case, the operation unit 320 includes the touch panel and a computer program.
- the operation content for the operation unit 320 is input to the control unit 210 as an electrical signal. Further, operations and information input may be performed using a graphical user interface (GUI) displayed on the display unit 310 and the operation unit 320.
- the control unit 210 is provided in the computer 200.
- the control unit 210 includes a microprocessor, RAM, ROM, hard disk drive, and the like.
- the control unit 210 is provided with a main control unit 211 and a storage unit 212.
- the main control unit 211 controls each unit of the fundus imaging apparatus 100.
- targets controlled by the main control unit 211 include the unit driving unit 110A, the light source 111, the reference mirror driving unit 114A, the scanner 116, the lens barrel driving unit 119B, the CCD image sensor 123, the flat panel display 125, the display unit 310, the data processing unit 230, and the communication unit 240.
- the unit driving unit 110A has a mechanism for moving the optical unit 110 in the direction along the measurement optical path (the optical axis of the objective lens 119) (the z direction) and in directions along the plane orthogonal to the z direction (the xy directions).
- the reference mirror driving unit 114A moves the reference mirror 114 along the reference optical path.
- the lens barrel drive unit 119B moves the objective lens 119 and the lens barrel unit 119A along the measurement optical path.
- the lens driving unit 127A inserts and removes the switching lens 127 with respect to the measurement optical path.
- the storage unit 212 stores various data.
- the storage unit 212 stores various programs and data for operating the fundus imaging apparatus 100.
- the data stored in the storage unit 212 includes data acquired by the fundus imaging apparatus 100 and data stored in advance.
- Examples of data acquired by the fundus imaging apparatus 100 include OCT image data, examination data, and front image data.
- the examination data is data indicating the state of the eye to be examined, which is generated by processing the detection result of the interference light by the optical unit 110.
- the storage unit 212 stores setting information described below in advance.
- the setting information is information in which contents of setting of predetermined items related to the optical unit 110 and the data processing unit 230 are recorded.
- the setting information includes, for example, setting contents regarding at least one of the following items: (1) fixation position; (2) scan pattern; (3) focus position; (4) diopter correction; (5) maximum interference depth; (6) analysis processing.
- “Fixation position” indicates the direction in which the eye E is fixated, that is, the portion of the eye E on which OCT measurement is performed.
- examples of the fixation position include a fixation position for performing OCT measurement of the macula and its surroundings, and a fixation position for performing OCT measurement of the optic nerve head and its surroundings. It is also possible to set a fixation position corresponding to an arbitrary part of the eye E.
- the fixation position includes, for example, information indicating the display position (pixel position) of the fixation target on the flat panel display 125.
- “Scan pattern” indicates the pattern along which the projection position of the measurement light on the eye E is moved.
- examples of the scan pattern include one or more line scans (horizontal scan, vertical scan), one or more cross scans, radial scans, and circle scans. Further, when acquiring a three-dimensional image (three-dimensional data set), a three-dimensional scan in which the intervals between a plurality of line scans are set sufficiently narrow is applied.
- “In-focus position” indicates the focus condition applied in OCT measurement.
- the in-focus position includes information indicating the position of the objective lens 119, for example.
- “Diopter correction” indicates a condition applied in diopter correction. Specifically, it includes a value indicating the refractive power (diopter value) of the eye E, information indicating the position of the objective lens 119, and the like.
- “Maximum interference depth” indicates the depth at which the interference intensity between the measurement light and the reference light, applied in the OCT measurement, is maximized.
- the maximum interference depth includes information indicating the position of the reference mirror 114, for example.
- “Analysis process” indicates the content of the processing executed based on the data acquired by the optical unit 110, that is, the type of examination data to be acquired.
- examples of the analysis processing include fundus layer thickness analysis and optic papilla shape analysis.
- the fundus layer thickness analysis is an analysis process for obtaining the thickness of a predetermined layer tissue of the fundus (retina, retinal sub-tissue, choroid, sclera, etc.).
- the optic papilla shape analysis is an analysis process in which a cross-sectional image or a three-dimensional image of the fundus is analyzed to detect a hole portion (a cut or defect portion) of the retina and to obtain the shape of the optic nerve head. In this analysis, the inclination (shape asymmetry) of the optic nerve head can also be obtained.
- the analysis process may include a process of specifying an image area corresponding to a predetermined part of the eye E and a process of obtaining the form and distribution of the specified image area.
- examples of the predetermined site to be specified include blood vessels, the optic disc, the macula, predetermined layer tissues (retina, retinal sub-tissues, choroid, sclera, etc.), laser treatment scars, and lesions (drusen, retinal detachment sites, tissue defect sites, tissue deformation sites, etc.).
- the analysis process may include a process of calculating a distance based on data acquired by OCT measurement.
- an example of this distance measurement is measurement of an intraocular distance (such as the axial length) based on data acquired by OCT measurement of the anterior segment and data acquired by OCT measurement of the fundus. It is also possible to obtain the intraocular distance based only on an OCT image of the fundus or the anterior segment.
- Setting information for the left eye (left eye setting information) and setting information for the right eye (right eye setting information) can be provided separately.
- In addition, individual setting information may be provided for each subject.
- the setting information is created based on, for example, past examination results and diagnosis results regarding the eye to be examined.
- the image forming unit 220 forms image data based on the detection signal from the CCD image sensor 123. This process includes processes such as noise removal (noise reduction), filter processing, dispersion compensation, and FFT (Fast Fourier Transform) as in the conventional spectral domain type OCT. When another type of OCT is applied, the image forming unit 220 executes a known process corresponding to the type.
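The spectral-domain processing chain mentioned above (background removal followed by a Fourier transform of the spectral fringes) can be illustrated with a minimal pure-Python sketch. The function name and the toy single-reflector spectrum are illustrative assumptions; a real pipeline additionally performs noise reduction, resampling to linear wavenumber, windowing, and dispersion compensation.

```python
import cmath
import math

def a_line_from_spectrum(spectrum):
    """Form one OCT A-line: remove the DC background, then take the
    magnitude of the discrete Fourier transform of the spectral fringes.
    (A hedged sketch of spectral-domain OCT processing, not the
    apparatus's actual image forming unit.)"""
    n = len(spectrum)
    mean = sum(spectrum) / n
    fringes = [s - mean for s in spectrum]  # background (DC) removal
    half = n // 2                           # keep only positive depths
    a_line = []
    for depth in range(half):
        acc = sum(f * cmath.exp(-2j * math.pi * depth * k / n)
                  for k, f in enumerate(fringes))
        a_line.append(abs(acc))
    return a_line

# A reflector at depth 8 produces a cosine fringe of 8 cycles across the spectrum.
n = 64
spectrum = [1.0 + 0.5 * math.cos(2 * math.pi * 8 * k / n) for k in range(n)]
profile = a_line_from_spectrum(spectrum)
peak_depth = max(range(len(profile)), key=profile.__getitem__)
print(peak_depth)  # → 8
```

The peak of the transformed signal falls at the bin corresponding to the reflector's depth, which is why the FFT step converts the detected spectrum into a depth profile.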
- the image forming unit 220 includes, for example, a dedicated circuit board and / or a microprocessor.
- In this description, image data and an "image" based thereon may be treated as the same thing.
- the data processing unit 230 executes various data processing.
- the data processing unit 230 performs image processing on the image formed by the image forming unit 220.
- the data processing unit 230 can form image data of a three-dimensional image of the eye E based on a plurality of two-dimensional cross-sectional images having different cross-sectional positions.
- the image data of a three-dimensional image means image data in which pixel positions are defined by a three-dimensional coordinate system.
- An example of image data of a three-dimensional image is image data composed of three-dimensionally arranged voxels. Such image data is called volume data or voxel data.
- When displaying an image based on volume data, the data processing unit 230 performs rendering processing (volume rendering, MIP (Maximum Intensity Projection), etc.) on the volume data to form image data of a pseudo three-dimensional image viewed from a specific line-of-sight direction. In addition, the data processing unit 230 can image an arbitrary cross section of a three-dimensional image (MPR (Multi-Planar Reconstruction): cross-section conversion).
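As a minimal illustration of the MIP rendering mentioned above: the projection keeps the brightest voxel along each line of sight. The pure-Python function and toy volume below are illustrative sketches (a production renderer would use optimized array libraries and support arbitrary view directions).

```python
def mip(volume):
    """Maximum Intensity Projection along the depth (z) axis.
    `volume` is indexed volume[z][y][x]; the result is a 2-D image in
    which each pixel holds the brightest voxel on its line of sight."""
    depth = len(volume)
    rows, cols = len(volume[0]), len(volume[0][0])
    return [[max(volume[z][y][x] for z in range(depth))
             for x in range(cols)] for y in range(rows)]

# Tiny 2x2x2 volume: the bright voxels sit at different depths.
volume = [[[1, 7], [3, 0]],
          [[5, 2], [0, 9]]]
print(mip(volume))  # → [[5, 7], [3, 9]]
```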
- Stack data of a plurality of cross-sectional images is image data obtained by three-dimensionally arranging, based on the positional relationship of the scanning lines, a plurality of cross-sectional images obtained along those scanning lines. That is, stack data is image data obtained by expressing a plurality of cross-sectional images, originally defined by individual two-dimensional coordinate systems, in one three-dimensional coordinate system (that is, by embedding them in one three-dimensional space).
- the data processing unit 230 can perform MPR processing based on the stack data.
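The stacking-and-reslicing idea can be sketched as follows. The helper names and toy data are hypothetical, and a real MPR resamples arbitrary oblique planes with interpolation rather than merely re-indexing one axis.

```python
def build_stack(cross_sections):
    """Stack data: B-scans acquired along parallel scan lines are simply
    arranged in a single 3-D coordinate system, stack[y][z][x]."""
    return list(cross_sections)

def mpr_slice_x(stack, x):
    """A minimal MPR: re-slice the stack along a plane of constant x,
    yielding a cross-section that was never scanned directly."""
    return [[row[x] for row in b_scan] for b_scan in stack]

b_scan_0 = [[1, 2], [3, 4]]   # b_scan[z][x]
b_scan_1 = [[5, 6], [7, 8]]
stack = build_stack([b_scan_0, b_scan_1])
print(mpr_slice_x(stack, 0))  # → [[1, 3], [5, 7]]
```

Each output row comes from a different original B-scan, which is exactly what embedding the 2-D images in one 3-D coordinate system makes possible.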
- the data processing unit 230 includes, for example, a microprocessor, a RAM, a ROM, a hard disk drive, a circuit board dedicated to predetermined data processing, and the like.
- In a storage device such as the hard disk drive, a computer program for causing the microprocessor to execute the data processing described later is stored in advance.
- the communication unit 240 performs data communication with an external device.
- the method of data communication is arbitrary.
- the communication unit 240 includes a communication interface compliant with the Internet, a communication interface compliant with LAN, a communication interface compliant with near field communication, and the like.
- Data communication may be wired communication or wireless communication.
- the communication unit 240 performs data communication with the external computer 1000 via the communication line 2000.
- One or more external computers 1000 are provided.
- Examples of the external computer 1000 include a server installed in a medical institution, a terminal used by a doctor, a server of the manufacturer (or a sales company, a maintenance company, a rental company, etc.) of the fundus imaging apparatus 100, and a terminal used by a person in charge at the manufacturer.
- the external computer 1000 is not limited to one that can directly communicate with the fundus imaging apparatus 100, and may be one that can indirectly communicate with another computer.
- The external computer 1000 includes not only the portable information terminal 16 but also the portable information terminal 16 ′ and the personal computer 16A.
- the data transmitted / received by the communication unit 240 may be encrypted.
- the control unit 210 (or the data processing unit 230) includes an encryption processing unit that encrypts transmission data and a decryption processing unit that decrypts reception data.
- The fundus imaging apparatus (100) includes the following components: a scanning optical system that scans the fundus (Ef) of the eye to be examined (E) with light from the light source unit (light source 111) and receives the return light from the fundus at the light receiving unit (CCD 123); a control circuit unit (control unit 210) that controls the scanning optical system so that a scanning trajectory (scan pattern) of the light is formed on the fundus; and an image construction unit (image forming unit 220, data processing unit 230) that constructs an image of the fundus based on the position of the scanning trajectory.
- An alignment mode is provided as an operation mode of the control circuit unit.
- In the alignment mode, the control circuit unit (control unit 210) controls the scanning optical system (in particular, the scanner 116) so that an alignment index for aligning the scanning optical system with respect to the eye to be examined is projected onto the fundus based on the light (measurement light) from the light source unit. According to this configuration, alignment can be performed without providing a dedicated optical system for projecting the alignment index.
- the control mode in the alignment mode may be any of those described in the first embodiment. Specifically, the following control can be applied.
- the image construction unit (image forming unit 220 or the like) can form an image of the alignment index.
- An image of the alignment index can be acquired by configuring the light receiving unit so that the wavelength band of the measurement light is detectable.
- the control circuit unit causes the display means (display unit 310) to display an image of the alignment index constructed by the image construction unit. According to this configuration, the user can recognize the alignment state.
- the display means does not have to be configured as a part of the fundus imaging apparatus, and may be an external display.
- the fundus imaging apparatus of the embodiment may include a drive unit (unit drive unit 110A) for moving the scanning optical system.
- The control circuit unit (control unit 210) may include a determination unit that determines the alignment state based on the alignment index. This determination unit performs processing similar to that of the alignment index determination unit 27a of the first embodiment. The control circuit unit can move the scanning optical system by controlling the drive unit based on the determination result. According to this configuration, it is possible to perform auto-alignment based on the alignment index.
- the scanning optical system may include a scanner (scanner 116).
- The control circuit unit can project the alignment index by controlling the scanner so that a scan trajectory different from the scan trajectory for constructing the fundus image is formed on the fundus.
- Scanning trajectories applicable in such an alignment mode include multiple-ring and spiral trajectories in addition to the circular trajectory CRSp shown in FIG.
- Alternatively, the control circuit unit can project the alignment index onto the fundus by controlling the scanner based on the same scanning trajectory as that for constructing the fundus image and linking this with the lighting timing control of the light source unit (light source 111) (that is, by performing synchronization control between the scanner 116 and the light source 111). This process is executed, for example, in the manner shown in FIG.
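The scanner/light-source synchronization described above can be sketched abstractly: while the scanner traces its normal image-forming trajectory, the source is gated on only when the beam position coincides with the desired index points, so the index pattern (not the full raster) lands on the fundus. The function, coordinate format, and tolerance below are illustrative assumptions, not the apparatus's actual control interface.

```python
def light_gate(trajectory, index_points, tol=0.5):
    """Synchronous-control sketch: return, for each scan sample, whether
    the light source should be on (True) so that only the alignment
    index points are illuminated along the normal scan trajectory."""
    def near(p, q):
        return abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol
    return [any(near(p, q) for q in index_points) for p in trajectory]

# One scan line of five samples; the index should appear only at x = 2.
trajectory = [(x, 0.0) for x in range(5)]
gate = light_gate(trajectory, [(2.0, 0.0)])
print(gate)  # → [False, False, True, False, False]
```

The same gating idea applies to the receiver-side variant: instead of switching the source, the output timing of the light receiving signal can be gated against the trajectory.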
- Alternatively, the control circuit unit can display an image of the alignment index on the display means (display unit 310) by performing, while turning on the light source unit (light source 111), control of the scanner based on the same scanning trajectory as that for constructing the fundus image and control of the output timing of the light receiving signal from the light receiving unit (CCD 123) (that is, by synchronous control of the scanner 116 and the CCD 123). This process is executed in the same manner as in the first embodiment.
- the light (measurement light) applied to the fundus may be infrared light.
- a focus mode can be provided as an operation mode of the control circuit unit (control unit 210).
- In the focus mode, the scanning optical system is controlled such that a focus index for focusing the scanning optical system on the fundus of the eye to be examined is projected onto the fundus Ef based on the light from the light source unit (light source 111).
- According to this configuration, focusing can be performed without providing a dedicated optical system for projecting the focus index.
- The control circuit unit can cause the display means (display unit 310) to display the image of the focus index constructed by the image construction unit (image forming unit 220, data processing unit 230). According to this configuration, the user can recognize the focus state.
- the scanning optical system may include a lens (objective lens 119 or the like) for performing focusing. Furthermore, a drive unit (such as a lens barrel drive unit 119B) for moving the lens in the direction of the optical axis of the scanning optical system may be provided.
- the control circuit unit (control unit 210) includes a determination unit that determines the focus state based on the focus index. This determination unit performs the same processing as in the first embodiment. The control circuit unit can move the lens by controlling the drive unit based on the result of this determination. According to this configuration, it is possible to perform autofocus based on the focus index.
- the light (measurement light) applied to the fundus may be infrared light.
- a fundus observation mode for observing a moving image of the fundus of the eye to be examined can be provided.
- In the fundus observation mode, scanning based on the same scan pattern (for example, a line scan or a three-dimensional scan) is repeated, and a moving image representing substantially the same part of the fundus oculi Ef is displayed.
- the control circuit unit controls the light source unit (light source 111) so that a fixation target by visible light is presented to the eye to be examined.
- the light source 111 is provided with a visible light source capable of outputting visible light in addition to a near-infrared light source for OCT measurement.
- the control circuit unit performs synchronous control of the light source 111 and the scanner 116 so that visible light (and infrared light) is output when light is irradiated to the scanning locus position corresponding to the fixation target.
- FIGS. 1 to 9 of the first embodiment will be referred to as appropriate.
- the configuration of the optical system is partially different from that of the first embodiment (FIG. 7).
- FIG. 37 shows the configuration of the fundus imaging apparatus according to this embodiment.
- the difference between the configuration shown in FIG. 37 and the configuration shown in FIG. 7 of the first embodiment is that the former is provided with a projection lens LL.
- the projection lens LL will be described later.
- the subject can perform fundus photography.
- the fundus image and personal information can be associated as follows.
- The optical scanning ophthalmoscope main body 1 is connected to the portable information device 16 via USB.
- a portable information device 16 ', a personal computer (PC) 16A, and a monitor unit 16B are installed in the medical institution (examiner side).
- the portable information device 16 ′ is connected to the personal computer 16A via USB.
- the examiner-side portable information device 16 ′ receives the information transmitted from the subject-side portable information device 16.
- the personal computer 16A is provided with an analysis processing unit for analyzing fundus image information and personal information.
- a monitor unit 16B is connected to the personal computer 16A.
- a fundus image and personal information are displayed in association with each other on the screen of the monitor unit 16B as necessary.
- FIG. 39 shows a configuration example 1 of the fundus imaging system according to this embodiment.
- A handprint (palm shape) or fingerprint recognition sensor is used as the personal information acquisition means (external patient recognition unit).
- the biometric authentication information applicable in Configuration Example 1 is not limited to a handprint or a fingerprint, and for example, a palm print or a vein (blood vessel) pattern of a hand or finger can be applied.
- a fingerprint / hand sensor 30 is connected to the control circuit unit 27 of the optical scanning ophthalmoscope main body 1.
- The ROM of the built-in memory unit 29B stores a personal information acquisition control program for controlling the fingerprint / hand sensor 30 and a program serving as association means for associating the personal information with the fundus image.
- the control circuit unit 27 includes a microprocessor. The microprocessor receives a clock signal from the real time clock circuit 31.
- The personal information acquisition control program is automatically loaded into the microprocessor of the control circuit unit 27.
- the fingerprint or handprint is acquired as personal information.
- the acquired personal information is stored in the RAM of the built-in memory unit 29B.
- a configuration is adopted in which the mode automatically shifts to the observation mode in response to the acquisition of personal information.
- the subject himself operates the imaging button 9 (external switch) to perform fundus imaging without performing fundus observation.
- the acquired fundus image EGr and personal information acquired before fundus imaging are associated with each other and temporarily stored in the RAM.
- the fundus image EGr and the personal information are transmitted to the examiner-side portable information device 16 ′ shown in FIG. 38 via the portable information device 16.
- In this way, the medical institution on the examiner side obtains the personal information and the fundus image acquired by fundus photography performed by the subject himself.
- FIG. 40 shows a configuration example 2 of the fundus imaging system according to this embodiment.
- a program for recognizing a retinal pattern or iris pattern is provided in the ROM as personal information acquisition means.
- When recognizing a retinal pattern, a retinal pattern recognition program is loaded into the microprocessor in response to the power being turned on.
- the scanning optical system shown in FIG. 37 is controlled by a retinal pattern recognition program.
- the retina pattern image is acquired by the image processing unit 29a and stored in the RAM.
- a fundus image EGr is acquired, and the fundus image EGr and the retinal pattern image (personal information) are associated and stored in the RAM.
- The fundus image EGr and the personal information are transmitted to the examiner-side portable information device 16 ′ shown in FIG. 38 via the portable information device 16. Thereby, the personal information and the fundus image are acquired by the medical institution on the examiner side.
- retinal pattern images for each subject are registered in advance.
- the personal computer 16A identifies the subject by collating the retinal pattern image transmitted from the subject side with the retinal pattern image registered in advance.
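The collation step can be sketched as follows, with a crude normalized-correlation score standing in for a real biometric matcher; the registry layout, score function, and threshold are all illustrative assumptions rather than the actual processing of the personal computer 16A.

```python
import math

def similarity(a, b):
    """Crude match score between two flattened retinal-pattern images:
    a normalized dot product (a stand-in for a real biometric matcher)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def identify(pattern, registry, threshold=0.9):
    """Return the subject ID whose registered pattern best matches the
    transmitted one, or None if no score clears the threshold."""
    best_id, best_score = None, threshold
    for subject_id, registered in registry.items():
        score = similarity(pattern, registered)
        if score >= best_score:
            best_id, best_score = subject_id, score
    return best_id

# Pre-registered retinal patterns for each subject (toy binary vectors).
registry = {"P001": [1, 0, 1, 1, 0], "P002": [0, 1, 0, 0, 1]}
print(identify([1, 0, 1, 1, 0], registry))  # → P001
```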
- an iris pattern recognition program is loaded into the microprocessor.
- the control circuit unit 27 inserts the projection lens LL for anterior segment imaging shown in FIG. 37 into the optical path of the scanning optical system.
- control circuit unit 27 controls the scanning optical system so as to scan the iris Rp.
- an iris pattern image is acquired by the image processing unit 29a.
- the acquired iris pattern image is stored in the RAM.
- the projection lens LL is retracted from the optical path of the scanning optical system, and a fundus image EGr is acquired.
- the acquired fundus image EGr is associated with an iris pattern image as personal information and stored in the RAM.
- the fundus image EGr and the personal information are transmitted to the examiner-side portable information device 16 ′ shown in FIG. 38 via the portable information device 16. Thereby, the personal information and the fundus image are acquired by the medical institution on the examiner side.
- personal information can be acquired using a scanning optical system used for photographing a fundus image, so that personal information can be acquired without unnecessarily complicating the physical configuration of the personal information acquisition unit.
- FIG. 41 shows a configuration example 3 of the fundus imaging system according to this embodiment.
- a face photography camera 32 is provided as personal information acquisition means.
- the face photography camera 32 is configured to be operated by the alignment adjustment button 10.
- When the alignment adjustment button 10 is operated, the face of the subject is photographed by the face photography camera 32. Since the subsequent operation is the same as in Configuration Example 1 and Configuration Example 2, detailed description thereof is omitted.
- The face photography camera can also be used for purposes other than personal authentication, for example, for determining the brightness of the photographing location.
- the fundus imaging system in which the optical scanning ophthalmoscope is applied as the fundus imaging apparatus has been described.
- the fundus imaging apparatus is not limited to the optical scanning ophthalmoscope.
- The optical coherence tomography apparatus described as background art can be applied as the fundus imaging apparatus.
- The optical coherence tomography apparatus may be of a portable type (thus, the configuration shown in FIGS. 1 to 6 of the first embodiment can be applied).
- The optical coherence tomography apparatus may be usable both as a portable type and as a stationary type, or may be dedicated to stationary use.
- The optical coherence tomography apparatus of this embodiment has the same optical system as that of the second embodiment.
- FIG. 35 of the second embodiment will be referred to as appropriate.
- the data processing unit 230 of the optical coherence tomography of this embodiment is provided with an inspection data generation unit 231 and an authentication processing unit 232 (see FIG. 42).
- the inspection data generation unit 231 generates inspection data.
- the examination data is data indicating the state of the eye to be examined, which is generated by processing the detection result of the interference light by the optical unit 110 (details will be described later).
- the examination data is used as authentication information for authenticating the subject.
- setting information similar to that in the second embodiment and regular personal authentication information described below are stored in advance.
- the regular personal authentication information is personal authentication information of a person who is permitted to perform an examination using the fundus imaging apparatus 100 (a regular subject).
- the personal authentication information is information used for personal authentication of a person who is going to perform an examination using the fundus imaging apparatus 100.
- the personal authentication information is used as personal information in the first embodiment.
- the personal authentication information may be character string information or image information.
- Examples of the character string information include a patient ID assigned by a medical institution, personal information such as the subject's name, character string information arbitrarily designated by the subject, and randomly designated character string information.
- Examples of the image information include biometric authentication information (such as a fingerprint pattern, an iris pattern, a vein pattern, and a face pattern), a one-dimensional code, and a two-dimensional code.
- a voice pattern or a handwriting pattern can be used as the personal authentication information.
- the personal authentication information may be information acquired based on an image (two-dimensional cross-sectional image, three-dimensional image, fundus front image, etc.) of the eye E that can be acquired by the fundus imaging apparatus 100.
- Examples of such personal authentication information include the following: (1) morphological information indicating the form of a predetermined region of the fundus (blood vessel, optic nerve head, layer tissue, laser treatment mark, etc.); (2) distribution information indicating the distribution of a predetermined region of the fundus; (3) layer thickness distribution information indicating the distribution of the thickness of a predetermined layer tissue of the fundus; (4) a distance (such as the axial length) between the fundus and the anterior segment. A method for obtaining such personal authentication information will be described later.
- the blood vessel morphology information includes, for example, the blood vessel pattern, the number / thickness / length / curvature of blood vessels, the number of blood vessel branching portions, and the number of blood vessel intersections.
- the blood vessel distribution information includes, for example, the position (distribution) of the blood vessel, the curvature distribution of the blood vessel, the position of the branching portion of the blood vessel, and the position of the crossing portion of the blood vessel.
- The optic nerve head morphology information includes, for example, its shape and size (area, disc diameter, cup diameter, rim diameter, ratios of these diameters, disc depth, etc.).
- the distribution information of the optic disc includes, for example, its position.
- the morphological information of the layer structure includes, for example, its shape / size (length, thickness).
- the distribution information of the layer structure includes, for example, the position.
- the form information of the characteristic part includes, for example, its shape and size.
- the distribution information of the characteristic part includes, for example, the position.
- a person who wants to perform an examination using the fundus imaging apparatus 100 inputs personal authentication information by a predetermined method.
- the input method corresponds to the type of personal authentication information used.
- the same personal authentication information as in the first embodiment is input in the same manner as in the first embodiment.
- the personal authentication information acquired through OCT measurement is input by actually performing OCT measurement on the eye E to be examined and processing an image obtained thereby.
- the inspection data generation unit 231 generates inspection data indicating the state of the eye E by processing the detection result of the interference light by the optical unit 110.
- The "interference light detection result" is, for example, one of the following: (1) a signal output from the CCD image sensor 123; (2) image data formed by the image forming unit 220; (3) data obtained at an intermediate stage of the processing executed by the image forming unit 220 (that is, data obtained during the image data forming process); (4) data obtained by processing a signal output from the CCD image sensor 123 with components other than the image forming unit 220. Hereinafter, examples of the processing executed by the inspection data generation unit 231 will be described.
- the inspection data generation unit 231 can generate layer thickness information of the fundus oculi Ef based on the detection result of the interference light by the optical unit 110.
- the examination data generation unit 231 functions as a layer thickness information generation unit, and executes the above-described fundus layer thickness analysis (such as retinal thickness analysis and RNFL thickness analysis). Further, the examination data generation unit 231 can perform a comparative analysis between the layer thickness information acquired by the fundus layer thickness analysis and a standard layer thickness value.
- Fundus layer thickness analysis is a process for obtaining the thickness (distribution) of a predetermined layer tissue of the fundus oculi Ef based on the detection result of interference light.
- Retinal thickness analysis will be described below as an example. Similar processing is performed when the thicknesses of other layer tissues are obtained.
- In the retinal thickness analysis, for example, an OCT image (cross-sectional image, three-dimensional image) of the fundus oculi Ef is analyzed to obtain the thickness distribution of the retina in part or all of the scan range.
- There are various definitions of retinal thickness. For example, the thickness from the inner limiting membrane to the inner nuclear layer (the inner/outer segment junction of the photoreceptor cells) may be used as the retinal thickness, or the thickness from the inner limiting membrane to the retinal pigment epithelium layer may be used as the retinal thickness.
- The retinal thickness obtained by the retinal thickness analysis is based on one of such definitions.
- The retinal thickness analysis is executed as follows, for example. First, an OCT image of the fundus oculi Ef is analyzed, and image regions corresponding to predetermined boundary portions (for example, the inner limiting membrane and the retinal pigment epithelium layer) are specified. Then, the number of pixels between the specified boundary portions is counted to obtain the retinal thickness (distance in the depth direction).
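The boundary-specification and pixel-counting steps above can be sketched on a toy B-scan. The simple intensity thresholding used here to stand in for boundary segmentation is an illustrative assumption; real segmentation of the inner limiting membrane and retinal pigment epithelium is far more robust.

```python
def retinal_thickness(b_scan, threshold=0.5):
    """Per-A-line thickness sketch: take the first supra-threshold pixel
    as the inner-limiting-membrane boundary and the last as the retinal
    pigment epithelium boundary, then count pixels between them
    (depth-direction distance, in pixels)."""
    rows, cols = len(b_scan), len(b_scan[0])
    thickness = []
    for x in range(cols):
        bright = [z for z in range(rows) if b_scan[z][x] >= threshold]
        thickness.append(bright[-1] - bright[0] if bright else 0)
    return thickness

# 6-deep, 3-wide toy B-scan; the "retina" spans different depths per column.
b_scan = [[0, 0, 0],
          [1, 0, 0],
          [1, 1, 1],
          [1, 1, 1],
          [1, 0, 1],
          [0, 0, 0]]
print(retinal_thickness(b_scan))  # → [3, 1, 2]
```

Multiplying the pixel counts by the axial pixel pitch would convert the result to physical units.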
- Processing for analyzing an OCT image to determine the thickness of a fundus layer is described in, for example, Japanese Patent Application Laid-Open Nos. 2007-325831, 2008-206684, 2009-61203, and 2009-66015.
- the information acquired in this way is an example of “layer thickness distribution information”.
- Similarly, an image region corresponding to an arbitrary layer tissue of the fundus oculi Ef can be specified, and the layer thickness distribution of that layer tissue can be acquired.
- Further, morphological information indicating the form of a predetermined part (predetermined layer tissue) of the fundus oculi Ef, distribution information indicating the distribution of the predetermined part, and layer thickness distribution information of the predetermined layer tissue can be acquired.
- the comparative analysis of the retinal thickness is an analysis process that compares the retinal thickness obtained by the retinal thickness analysis with standard data (Normal data) stored in advance.
- Normal data is a standard value (standard thickness) of the retinal thickness of healthy eyes. Normal data is created by measuring the retinal thickness of many healthy eyes and obtaining statistical values (average value, standard deviation, etc.) of the measurement results.
- In the comparative analysis, it is determined whether or not the retinal thickness of the eye E is included in the normal-eye range.
- the comparative analysis may be a process of obtaining a range of retinal thickness in a diseased eye and determining whether or not the retinal thickness obtained by the retinal thickness analysis is included in the range.
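The comparative analysis reduces to a range check against the normative statistics. The mean-plus-or-minus-k-standard-deviations criterion and the numbers below are illustrative assumptions, not values from an actual normative database.

```python
def within_normal_range(thickness, mean, sd, k=2.0):
    """Comparative-analysis sketch: a measured retinal thickness is
    treated as within the normal-eye range when it lies inside
    mean ± k·SD of the normative data (k = 2 is an assumption)."""
    return abs(thickness - mean) <= k * sd

# Hypothetical normative stats: mean 270 µm, SD 15 µm.
print(within_normal_range(230.0, 270.0, 15.0))  # → False
print(within_normal_range(265.0, 270.0, 15.0))  # → True
```

The diseased-eye variant mentioned above is the same check run against a disease-specific range instead of the healthy-eye range.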
- the examination data generation unit 231 may be configured to execute an analysis process for specifying an image region corresponding to a characteristic part (laser treatment mark, lesion, etc.) of the fundus oculi Ef.
- This process is an analysis process in which, for example, by analyzing an OCT image, the morphological information and / or distribution information of a characteristic part in part or all of the scan range is obtained.
- This analysis process includes, for example, a process of specifying a pixel corresponding to a characteristic part among the pixels of the OCT image.
- a pixel value histogram can be created for the pixels constituting the OCT image, and the pixel corresponding to the characteristic part can be specified based on the distribution of the pixel values in the histogram.
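A minimal sketch of the histogram-based pixel specification just described: build the pixel-value histogram, derive a threshold from its dominant (background) bin, and flag outlying bright pixels as candidate characteristic parts. The thresholding rule is a deliberately crude stand-in; a real implementation might use Otsu's method or a similar histogram-based classifier.

```python
def histogram_threshold(pixels, bins=8):
    """Pick a threshold from the pixel-value histogram: here, simply the
    upper edge of the most populated bin, so that pixels brighter than
    the dominant background are flagged as candidates."""
    lo, hi = min(pixels), max(pixels)
    width = (hi - lo) / bins or 1
    counts = [0] * bins
    for p in pixels:
        i = min(int((p - lo) / width), bins - 1)
        counts[i] += 1
    peak = counts.index(max(counts))
    return lo + (peak + 1) * width

pixels = [10, 11, 12, 10, 11, 12, 10, 90, 95]  # mostly background, two bright pixels
t = histogram_threshold(pixels)
flagged = [p for p in pixels if p > t]
print(flagged)  # → [90, 95]
```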
- Alternatively, the OCT image can be divided into a plurality of image areas, or a predetermined image area of the OCT image can be specified, and a pixel corresponding to a characteristic part can be specified based on the form of the image area. For example, an image region corresponding to the Bruch's membrane and an image region corresponding to the retinal pigment epithelium are identified, and an image region corresponding to a small, substantially circular raised shape can be specified based on the pixel values between these image regions.
- the image region specifying process based on such a shape may include, for example, image matching using a template of the shape.
- An image region corresponding to a characteristic part can also be specified based on a photographed image of the fundus oculi Ef.
- This analysis process is executed, for example, by determining whether the pixel value of each pixel of the captured image is included in a predetermined range and specifying the pixels included in the predetermined range.
- This is because the characteristic part has a characteristic shape and color. Specifically, a laser treatment scar is depicted as a small circular white spot, and a lesion is depicted in a characteristic color (drusen, for example, are depicted as yellowish white).
- The optic disc shape analysis may include analysis processing for detecting the hole portion of the retina by analyzing the OCT image of the fundus oculi Ef and obtaining the shape of the hole portion, that is, the shape of the optic nerve head.
- The optic disc shape analysis is performed by, for example, analyzing a cross-sectional image or a three-dimensional image to identify an image region corresponding to the optic nerve head and the retinal surface in its vicinity, and analyzing the identified image region to obtain parameters (disc shape parameters) representing its global shape and local shape (unevenness).
- Examples of the disc shape parameters include the optic disc cup diameter, disc diameter, rim diameter, and disc depth.
- The optic disc shape analysis may include analysis processing for obtaining the inclination (shape asymmetry) of the optic nerve head.
- This analysis processing is executed as follows, for example. First, the examination data generation unit 231 analyzes the three-dimensional image obtained by scanning a region including the optic nerve head and identifies the disc center. Next, the examination data generation unit 231 sets a circular area centered on the disc center and divides the circular area radially to obtain a plurality of partial areas. Subsequently, the examination data generation unit 231 obtains the height position of a predetermined layer (for example, the retinal pigment epithelium layer) at each pixel position by analyzing a cross-sectional image of the circular area.
- Next, the examination data generation unit 231 calculates the average height position of the predetermined layer in each partial region. The examination data generation unit 231 then obtains the inclination of the fundus oculi Ef in each opposing direction by comparing the pair of average values obtained for a pair of partial regions located at opposing positions across the disc center. Based on the inclinations obtained in the plurality of opposing directions, the examination data generation unit 231 generates inclination distribution information indicating the distribution of the inclination of the fundus oculi Ef in the circular region. Note that evaluation information on the disease state can be generated based on the generated inclination distribution information (and information indicating a standard distribution).
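The radial-partition procedure above can be sketched as follows. The sector count, the synthetic height map, and the sign convention of the tilt are all illustrative assumptions, not the apparatus's actual implementation.

```python
import numpy as np

def sector_tilts(height_map, center, radius, n_sectors=8):
    """Divide a circular region around the disc center into radial sectors,
    average the layer height in each, and difference opposing sectors."""
    h, w = height_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dy, dx = ys - center[0], xs - center[1]
    r = np.hypot(dy, dx)
    theta = np.mod(np.arctan2(dy, dx), 2 * np.pi)
    sector = ((theta / (2 * np.pi)) * n_sectors).astype(int) % n_sectors
    inside = r <= radius
    means = np.array([height_map[inside & (sector == k)].mean()
                      for k in range(n_sectors)])
    half = n_sectors // 2
    return means[:half] - means[half:]   # one tilt per opposing direction

# synthetic surface tilted along x: the layer height equals the x coordinate
height = np.tile(np.arange(101, dtype=float), (101, 1))
tilts = sector_tilts(height, center=(50, 50), radius=40)
```

With the synthetic x-tilted surface, the opposing pair aligned with the x axis shows a clear nonzero inclination, which is the behavior the distribution information captures.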
- a fundus front image can be formed based on a three-dimensional image acquired by OCT.
- the fundus front image is an image drawn on a plane orthogonal to the incident direction (z direction) of the measurement light with respect to the fundus oculi Ef.
- the fundus front image is formed by projecting a part or all of the three-dimensional image in the depth direction (z direction) of the fundus oculi Ef. This process is executed by the data processing unit 230.
- the fundus front image is called a projection image.
- this partial region may be a part in the xy direction or a part in the z direction.
- the former is applied, for example, when obtaining a fundus front image of a predetermined part (the optic nerve head, the macula, etc.).
- the latter is applied, for example, when obtaining a fundus front image including a predetermined range of information in the z direction.
- the upper surface and/or the lower surface of this partial region may be a flat surface or a curved surface.
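A minimal sketch of the projection described above, assuming the volume is stored with the depth (z) as the first axis; the average-intensity projection and the sample z-range are illustrative choices, not the apparatus's fixed method.

```python
import numpy as np

def projection_image(volume, z_range=None):
    """Project an OCT volume along depth to form an en-face fundus image."""
    if z_range is not None:               # use only part of the z direction
        volume = volume[z_range[0]:z_range[1]]
    return volume.mean(axis=0)            # average-intensity projection

volume = np.random.rand(64, 128, 128)              # toy (z, x, y) volume
front = projection_image(volume)                    # project the whole depth
partial = projection_image(volume, z_range=(10, 30))  # partial region in z
```

Restricting the slice in the x and y directions before projecting would correspond to the former case (a predetermined part such as the optic nerve head).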
- an image (tilted fundus image) drawn on a plane that forms an arbitrary angle with respect to the z direction can be formed based on a three-dimensional image.
- the examination data generation unit 231 can generate examination data based on a fundus front image (or a tilted fundus image).
- the type of examination data based on the fundus front image may be the same as that of examination data based on another type of OCT image (two-dimensional cross-sectional image, three-dimensional image, etc.) such as the above-described form information and distribution information.
- the data processing unit 230 can render an arbitrary cross section of a three-dimensional image as an image (MPR image: multi-planar reconstruction image).
- the designation of the cross section is executed manually or automatically.
- the examination data generation unit 231 can generate examination data based on the MPR image.
- the type of examination data based on the MPR image may be the same as that of examination data based on another type of OCT image, such as the above-described form information and distribution information.
- the examination data generation unit 231 can obtain an intraocular distance as examination data.
- the examination data generation unit 231 specifies a plurality of positions in the image.
- the specified positions are points of interest in the eye E (the disc center, the fovea, a lesion, etc.), each of which is a point manually designated by the user or a point obtained by image analysis.
- the examination data generation unit 231 can specify the intraocular distance based on two or more OCT images. As an example, processing for obtaining the axial length will be described.
- the fundus imaging apparatus 100 can switch between a fundus mode for acquiring an image of the fundus oculi Ef and an anterior segment mode for forming an anterior eye image.
- the operation mode is switched by inserting / retracting the switching lens 127 with respect to the measurement optical path.
- the switching lens 127 and the lens driving unit 127A are examples of “mode switching unit”.
- the main control unit 211 causes the storage unit 212 to store the position information of the reference mirror 114 applied in the fundus mode and the position information of the reference mirror 114 applied in the anterior segment mode.
- a two-dimensional cross-sectional image or a three-dimensional image including the center position of the fundus surface is acquired.
- a two-dimensional cross-sectional image or a three-dimensional image including the corneal apex is acquired.
- Based on the OCT image acquired in the fundus mode (fundus OCT image) and the OCT image acquired in the anterior segment mode (anterior segment OCT image), the examination data generation unit 231 calculates the distance between the fundus and the anterior segment. As a specific example, the examination data generation unit 231 analyzes the fundus OCT image to identify coordinates corresponding to the fundus center (fundus center coordinates), and analyzes the anterior segment OCT image to identify coordinates corresponding to the corneal apex (corneal apex coordinates).
- the examination data generation unit 231 calculates the axial length of the eye E based on the fundus center coordinates, the corneal apex coordinates, the displacement of the focal position of the measurement light caused by switching the operation mode, and the positions of the reference mirror 114 in the two operation modes.
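Combining the quantities named above might look like the sketch below. The sign conventions and offset handling depend entirely on the device geometry, so this is an illustrative formula rather than the apparatus's actual computation; all names and values are assumptions.

```python
def axial_length(z_fundus_center, z_cornea_apex,
                 ref_mirror_fundus, ref_mirror_anterior,
                 focus_shift=0.0):
    """Relate the two OCT images on a common axis: each z coordinate is
    measured from the zero-delay of its own mode, so the reference-mirror
    shift between the modes (and any focus displacement of the measurement
    light) is added back to the coordinate difference."""
    mode_offset = ref_mirror_fundus - ref_mirror_anterior
    return (z_fundus_center - z_cornea_apex) + mode_offset + focus_shift

al = axial_length(1.0, 0.5, 24.0, 1.0)   # all values in mm, illustrative
```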
- the examination data acquired by the examination data generation unit 231 can be used as personal authentication information.
- regular personal authentication information is stored in the storage unit 212 in advance.
- the authentication processing unit 232 collates personal authentication information, such as examination data, with the regular personal authentication information.
- the authentication processing unit 232 sends a verification result (information indicating whether the verification is successful) to the control unit 210.
- when the verification is successful, the control unit 210 controls the fundus imaging apparatus 100 to execute OCT measurement.
- when the verification fails, the control unit 210 displays predetermined notification information on the display unit 310, for example.
- This notification information includes a message indicating that the verification has failed, a message prompting re-input of personal authentication information, a message prompting input of other personal authentication information, and the like.
- an allowable range (such as a threshold) is set in advance for the difference between the personal authentication information and the regular personal authentication information.
- the authentication processing unit 232 obtains a difference between the personal authentication information and the regular personal authentication information, and determines whether this difference is included in the allowable range.
- the authentication processing unit 232 obtains a result that the verification is successful. On the other hand, when it is determined that the difference is not included in the allowable range, the authentication processing unit 232 obtains a result that the verification has failed.
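A minimal sketch of the tolerance-based collation, assuming the personal authentication information can be reduced to a numeric feature vector; the vectors and the threshold are illustrative stand-ins.

```python
import math

def verify(features, registered, tolerance):
    """Collation succeeds when the difference between the acquired and the
    regular personal authentication information is within the allowance."""
    difference = math.dist(features, registered)   # Euclidean difference
    return difference <= tolerance

ok = verify([1.0, 2.0], [1.0, 2.1], tolerance=0.2)   # within the allowance
ng = verify([0.0, 0.0], [3.0, 4.0], tolerance=4.9)   # outside the allowance
```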
- the storage unit 212 stores in advance regular personal authentication information related to the first personal authentication information (first regular personal authentication information) and regular personal authentication information related to the second personal authentication information (second regular personal authentication information).
- the authentication processing unit 232 collates the acquired first personal authentication information with the first regular personal authentication information. If the collation is successful, photographing by the fundus photographing apparatus 100 is permitted. If the collation fails, the second personal authentication information is acquired. The authentication processing unit 232 then collates the acquired second personal authentication information with the second regular personal authentication information. If this collation is successful, photographing by the fundus photographing apparatus 100 is permitted. If it fails, the control unit 210 causes the display unit 310 to display notification information including a message indicating that collation has failed, a message prompting re-input of personal authentication information, a message prompting input of other personal authentication information, and the like.
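The two-stage flow above can be sketched as plain control logic; the matching rule and the credential values below are stand-ins, not the apparatus's actual data.

```python
def authenticate(first, second, reg_first, reg_second, matches):
    """Permit photographing on the first successful collation; otherwise
    try the second credential; otherwise report failure for notification."""
    if matches(first, reg_first):
        return "permitted"
    if matches(second, reg_second):
        return "permitted"
    return "collation failed"

close = lambda a, b: abs(a - b) < 0.5   # stand-in matching rule

r1 = authenticate(1.0, 9.0, 1.2, 0.0, close)   # first credential matches
r2 = authenticate(5.0, 0.2, 1.2, 0.0, close)   # falls back to the second
r3 = authenticate(5.0, 9.0, 1.2, 0.0, close)   # both collations fail
```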
- regular personal authentication information of each subject is stored in the storage unit 212.
- the control unit 210 sends all the regular personal authentication information stored in the storage unit 212 to the authentication processing unit 232.
- the authentication processing unit 232 collates the acquired personal authentication information with each regular personal authentication information. If the acquired personal authentication information is successfully verified with any of the regular personal authentication information, the subject corresponding to the regular personal authentication information is recognized as the subject of the current examination. On the other hand, when the verification by the acquired personal authentication information fails for all the regular personal authentication information, the control unit 210 causes the display unit 310 to display the notification information as described above.
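The one-to-many collation above amounts to the loop below; the registry contents and the matching rule are stand-ins.

```python
def identify(acquired, registry, matches):
    """Collate the acquired information against every registered record and
    return the matching subject, or None when all collations fail."""
    for subject_id, registered in registry.items():
        if matches(acquired, registered):
            return subject_id
    return None

registry = {"subject-A": 3.0, "subject-B": 1.0}
close = lambda a, b: abs(a - b) < 0.5   # stand-in matching rule

who = identify(1.1, registry, close)     # recognized as the current subject
nobody = identify(9.0, registry, close)  # all collations fail -> notification
```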
- the personal authentication information generated by the inspection data generation unit 231 is sent to the control unit 210 and stored in the storage unit 212.
- the OCT image formed by the image forming unit 220 (and the data processing unit 230) is sent to the control unit 210 and stored in the storage unit 212.
- the control unit 210 associates the OCT image with personal authentication information (personal information). This process is performed in the same manner as in the first embodiment.
- the control unit 210 is an example of “association means”.
- the communication unit 240 transmits the OCT image and personal authentication information (personal information) associated by the control unit 210 to the external computer 1000.
- the communication unit 240 is an example of a “transmission unit”.
- the fundus imaging system includes a fundus imaging apparatus (100), a personal information acquisition unit (examination data generation unit 231), and an association unit (control unit 210).
- the fundus imaging apparatus images the fundus of the subject's eye.
- the personal information acquisition means acquires personal information of the subject.
- the association unit associates the fundus image acquired by the fundus imaging apparatus with the personal information acquired by the personal information acquisition unit.
- According to this configuration, the acquired fundus image and personal information are automatically associated with each other, so the association can be made easily.
- the association can be performed even in an examination performed without an examiner present.
- the fundus imaging apparatus of this embodiment may include a transmission unit (communication unit 240) that transmits the image and the personal information associated by the association unit. According to this configuration, the associated image and personal information can be sent to an external device.
- the fundus imaging apparatus of this embodiment may include the following optical system and processing unit.
- the optical system scans the fundus of the subject's eye with the measurement light, and detects interference light obtained by superimposing the return light of the measurement light from the fundus and the reference light. That is, this optical system is an interference optical system for performing OCT (see FIG. 35).
- the processing unit forms a fundus image by processing the detection result of the interference light by the optical system.
- the processing unit includes at least the image forming unit 220 and may include the data processing unit 230.
- the personal information acquisition unit (examination data generation unit 231) acquires personal information (personal authentication information) based on the image formed by the processing unit.
- personal information can be acquired based on the OCT image.
- the personal information may also include functional information (blood flow information or the like) acquired by functional OCT such as Doppler OCT.
- the personal information acquisition means may be configured to acquire one or both of the following personal information by analyzing the image formed by the processing unit.
- Morphological information representing the form of a predetermined part of the fundus, where the predetermined part may include any of a fundus blood vessel, the optic disc, a predetermined layer tissue, and a treatment scar from laser treatment.
- the optical system may scan a three-dimensional region of the fundus with the measurement light, the processing unit may form a three-dimensional image of the fundus, and the personal information acquisition means may acquire the personal information based on the three-dimensional image.
- personal information can be acquired based on a three-dimensional image with a large amount of information. For example, it is possible to obtain a two-dimensional form or a three-dimensional form as well as a two-dimensional distribution or a three-dimensional distribution for a predetermined part of the fundus.
- the degree of freedom of the acquired personal information increases.
- the optical system may scan a three-dimensional region of the fundus with the measurement light, the processing unit may form a three-dimensional image of the fundus and form a fundus front image (projection image) by projecting at least a part of the three-dimensional image in the depth direction of the fundus, and the personal information acquisition means may acquire the personal information based on the fundus front image.
- the fundus front image includes information on the range subjected to the projection process. According to this configuration, the degree of freedom of the acquired personal information increases.
- the optical system may scan a three-dimensional region of the fundus with the measurement light, the processing unit may form a three-dimensional image of the fundus and form a two-dimensional cross-sectional image representing a cross-section of the three-dimensional image, and the personal information acquisition means may acquire the personal information based on the two-dimensional cross-sectional image.
- personal information can be acquired based on an image in which an arbitrary cross section of the fundus is drawn.
- the degree of freedom of the acquired personal information increases.
- the personal information acquisition means may be configured to acquire layer thickness distribution information representing the thickness distribution of a predetermined layer tissue of the fundus by analyzing the image formed by the processing unit.
- the layer thickness distribution information can be used as personal information.
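Assuming the predetermined layer has already been segmented into an upper and a lower boundary surface (given as z indices over the (x, y) grid), the thickness distribution reduces to a per-position difference; the pixel spacing and toy surfaces below are illustrative.

```python
import numpy as np

def layer_thickness_map(upper, lower, z_spacing_mm):
    """Thickness distribution of a layer bounded by two segmented surfaces,
    converted from z-index difference to physical units."""
    return (lower - upper) * z_spacing_mm

upper = np.zeros((4, 4))        # toy upper boundary (z index per pixel)
lower = np.full((4, 4), 50.0)   # toy lower boundary
thickness = layer_thickness_map(upper, lower, z_spacing_mm=0.002)
```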
- the fundus imaging apparatus may include a mode switching unit (the switching lens 127 and the lens driving unit 127A) for switching between a fundus mode for forming an image of the fundus of the eye to be examined and an anterior segment mode for forming an image of the anterior segment.
- in this case, the personal information acquisition unit can calculate the distance between the fundus and the anterior segment based on the fundus image formed in the fundus mode and the anterior segment image formed in the anterior segment mode. This distance can be used as personal information.
- the mode switching unit is not limited to a configuration in which the switching lens is inserted / retracted with respect to the optical path of the measurement light.
- the fundus mode and the anterior segment mode may be switched by applying a configuration that can change the optical path length of the reference light by a sufficient distance.
- for example, the reference mirror (114) can be configured to be movable over a range corresponding to a typical axial length plus a predetermined margin.
- the fundus imaging system of this embodiment may have the following storage means and collation means.
- the storage means (storage unit 212) stores in advance regular personal authentication information regarding an authorized subject who is permitted to perform imaging using the fundus imaging apparatus.
- the collating unit collates the personal information acquired by the personal information acquiring unit with the regular personal authentication information.
- the fundus photographing apparatus performs photographing when the collation unit succeeds in collating the personal information with the regular personal authentication information.
- This configuration can also be applied to the first embodiment. According to this configuration, the subject can be authenticated as to whether or not the use of this system is permitted.
- the storage means is provided in the fundus imaging apparatus, but the storage means may be provided in a device connected to the fundus imaging apparatus.
- the storage means may be provided in a server or NAS (Network Attached Storage).
- the verification unit may be configured to determine that the verification is successful when the difference between the personal information and the regular personal authentication information is within a preset allowable range. According to this configuration, personal authentication using biometric information can be suitably performed.
- when the collation fails, the personal information acquisition means acquires another type of personal information, which is collated with other regular personal authentication information stored in advance in the storage means.
- authentication can be performed using another type of personal information.
- According to this configuration, personal authentication using biometric information can be suitably performed.
- the fundus imaging apparatus may be a portable type and/or a stationary type. According to this configuration, the degree of freedom of the system configuration is increased.
- This configuration example describes a fundus imaging system capable of acquiring a plurality of types of personal information.
- the type of personal information may include any one or more of the personal information used in the third embodiment and the fourth embodiment.
- description will be made based on the fourth embodiment.
- FIG. 43 shows the configuration of the fundus imaging apparatus included in the fundus imaging system according to this example.
- the fundus imaging apparatus 101 has a function for inputting a medical history of a subject.
- the medical history is information indicating the content of medical treatment given to the subject in the past.
- the medical history includes disease history, treatment history, medication history, and the like.
- the medical history is input, for example, by acquiring the subject's electronic medical record via the network or manually by the user.
- the former process is executed by the control unit 210 controlling the communication unit 240, for example.
- the latter process is performed, for example, via the operation unit 320 on a predetermined input screen displayed on the display unit 310.
- the data processing unit 230 of the fundus imaging apparatus 101 is provided with a type selection unit 233.
- the type selection unit 233 selects one or more types from among a plurality of types related to personal information based on the input medical history.
- the “plurality of types” is set in advance, and is, for example, all types of personal information that can be acquired by personal information acquisition means (such as the test data generation unit 231). Further, some of all the types that can be acquired may be “plural types”. In this case, the “plurality of types” is set in advance automatically or manually.
- the type selection unit 233 stores in advance correspondence information in which items included in the medical history are associated with types of personal information.
- This item includes, for example, a disease name, a drug name, a treatment item, and the like.
- One or more types of personal information are associated with each item or a combination of two or more items.
- for example, each item is associated with a type of personal information that is only slightly affected (or not affected) by the disease indicated by that item.
- morphological information related to the optic nerve head is associated with a macular disease.
- morphological information relating to the macula is associated with a disease of the optic nerve head.
- drusen distribution information related to age-related macular degeneration is associated with glaucoma.
- the axial length is associated with a disease that does not involve a change in the axial length.
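The correspondence lookup above might be sketched as a simple table; the item names and type names below are hypothetical labels, not the apparatus's actual vocabulary.

```python
# hypothetical correspondence information: history item -> usable types
CORRESPONDENCE = {
    "macular disease": ["optic disc morphology"],
    "optic disc disease": ["macular morphology"],
    "glaucoma": ["drusen distribution"],
}

def select_types(history_items, correspondence=CORRESPONDENCE):
    """Select the personal-information types associated with the entered
    medical-history items (deduplicated, input order preserved)."""
    selected = []
    for item in history_items:
        for t in correspondence.get(item, []):
            if t not in selected:
                selected.append(t)
    return selected

chosen = select_types(["glaucoma", "macular disease"])
```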
- the examination data generation unit 231 acquires personal information of the type selected by the type selection unit 233, for example, by analyzing an OCT image.
- the personal information acquisition means in this example is configured to be able to acquire a plurality of types of personal information.
- the personal information acquisition means in this example includes a medical history input unit and a type selection unit.
- the medical history input unit operates to input the medical history of the subject.
- the medical history input unit includes, for example, the control unit 210 and the communication unit 240, or the user interface 300.
- the type selection unit (233) selects one or more types from among a plurality of types set in advance regarding personal information based on the medical history input by the medical history input unit.
- the personal information acquisition means operates to acquire personal information of the type selected by the type selection unit.
- the personal information acquisition means in this example includes, for example, the data processing unit 230 (the examination data generation unit 231 and the type selection unit 233) in addition to the above medical history input unit (the control unit 210 and the communication unit 240, or the user interface 300). Further, the configuration of the third embodiment or the fourth embodiment can be applied to the association unit and the transmission unit.
- the fundus imaging apparatus included in the fundus imaging system according to this example has the same configuration as that of the first configuration example (see FIG. 43).
- This fundus photographing apparatus has a function of acquiring an elapsed time since the last photographing of the fundus oculi Ef of the eye E.
- This function includes, for example, a first function for acquiring the subject's electronic medical record via a network, a second function for acquiring the previous photographing date (and time) from the electronic medical record, and a third function for calculating the elapsed time based on that photographing date and the current date and time.
- the first function is realized by the control unit 210 controlling the communication unit 240, for example.
- the second and third functions are realized by the control unit 210, for example. These are examples of the "imaging interval acquisition unit".
- the type selection unit 233 of this example selects one or more types from among a plurality of types related to personal information based on the elapsed time acquired by the imaging interval acquisition unit. This process is executed with reference to correspondence information similar to that in the first configuration example, for example.
- the elapsed time and the type of personal information are associated with each other.
- This correspondence is determined according to, for example, the degree to which each type of personal information changes with time. That is, personal information includes information that changes with time, such as morphological information and distribution information generated from a fundus image, and information that does not change with time, such as character string information and fingerprint patterns. On the other hand, acquiring the morphological information and distribution information does not burden the subject, whereas acquiring the character string information and the fingerprint pattern requires an action by the subject.
- personal information that changes over time can be applied when the elapsed time is short, and personal information that does not change over time can be applied when the elapsed time is long.
- various personal information that changes over time can be classified according to factors such as the speed of change over time and the accuracy of authentication, and can be arbitrarily associated with the time axis of elapsed time.
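The elapsed-time computation and the time-based type selection can be sketched together; the one-year threshold and the type labels are illustrative assumptions rather than values from the apparatus.

```python
from datetime import datetime

def elapsed_days(previous, now):
    """Elapsed time since the previous fundus photographing, in days."""
    return (now - previous).days

def select_type(elapsed, threshold_days=365):
    """Short interval: image-derived, time-varying information is usable;
    long interval: fall back to information that does not change with time."""
    if elapsed <= threshold_days:
        return "layer thickness distribution"   # changes with time
    return "character string / fingerprint"     # time-invariant

days = elapsed_days(datetime(2023, 1, 10), datetime(2023, 3, 1))
kind = select_type(days)
```

A finer classification (by speed of change or authentication accuracy) would simply add more thresholds along the elapsed-time axis.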
- the examination data generation unit 231 acquires personal information of the type selected by the type selection unit 233, for example, by analyzing an OCT image.
- the control unit 210 causes the display unit 310 to display a message that prompts an operation for acquiring personal information of the type selected by the type selection unit 233.
- the personal information acquisition means in this example is configured to be able to acquire a plurality of types of personal information.
- the personal information acquisition means in this example includes a shooting interval acquisition unit and a type selection unit.
- the imaging interval acquisition unit acquires an elapsed time from the previous imaging of the fundus.
- the imaging interval acquisition unit includes, for example, a control unit 210 and a communication unit 240.
- the type selection unit (233) selects one or more types from among a plurality of types set in advance regarding personal information based on the elapsed time acquired by the imaging interval acquisition unit.
- the personal information acquisition means operates to acquire personal information of the type selected by the type selection unit.
- the personal information acquisition means in this example includes, for example, the data processing unit 230 (the examination data generation unit 231 and the type selection unit 233) in addition to the imaging interval acquisition unit (the control unit 210 and the communication unit 240). Further, the configuration of the third embodiment or the fourth embodiment can be applied to the association unit and the transmission unit.
- the configuration of the fundus imaging apparatus included in the fundus imaging system according to this example is shown in FIG.
- the fundus imaging apparatus 102 includes a disease candidate specifying unit 234 that specifies a disease candidate of the eye E by analyzing an image formed by the image forming unit 220 or the data processing unit 230.
- a disease candidate means a disease that the eye E may possibly have.
- the image analysis performed by the disease candidate specifying unit 234 includes an arbitrary process for specifying a disease candidate (which may be a known process), and may include, for example, fundus layer thickness analysis, optic disc shape analysis, drusen analysis, and the like.
- the disease candidate specifying unit 234 may be provided in the examination data generating unit 231.
- the data processing unit 230 of the fundus imaging apparatus 102 is provided with a type selection unit 233.
- the type selection unit 233 selects one or more types from among a plurality of types related to personal information based on the disease candidates specified by the disease candidate specification unit 234. This process is executed with reference to correspondence information similar to that in the first configuration example, for example.
- the disease candidate and the type of personal information are associated with each other.
- This correspondence relationship is obtained, for example, by associating each disease candidate with the types of personal information related to a site that is not affected (or only slightly affected) by that disease candidate.
- morphological information relating to the optic nerve head is associated with a macular disease.
- morphological information relating to the macula is associated with a disease of the optic nerve head.
- drusen distribution information related to age-related macular degeneration is associated with glaucoma.
- the axial length is associated with a disease that does not involve a change in the axial length.
- the examination data generation unit 231 acquires personal information of the type selected by the type selection unit 233, for example, by analyzing an OCT image.
- the personal information acquisition means in this example is configured to be able to acquire a plurality of types of personal information.
- the personal information acquisition means in this example includes a disease candidate specifying unit (234) and a type selecting unit (233).
- the disease candidate specifying unit specifies a disease candidate for the eye to be examined by analyzing an image acquired by the fundus imaging apparatus (102).
- the type selection unit selects one or more types from among a plurality of types set in advance regarding personal information based on the disease candidate specified by the disease candidate specifying unit.
- the personal information acquisition unit operates to acquire personal information of the type selected by the type selection unit.
- the personal information acquisition means in this example includes, for example, a data processing unit 230 (examination data generation unit 231 and type selection unit 233) in addition to the disease candidate identification unit (234). Further, the configuration of the third embodiment or the fourth embodiment can be applied to the association unit and the transmission unit.
- the type of personal information is properly used according to the medical history, imaging interval, or disease candidate.
- the allowable range used in the collation processing of the fourth embodiment can be changed.
- in the above embodiments, the optical scanning ophthalmoscope and the optical coherence tomograph have been described.
- a multi-function machine to which a plurality of imaging methods can be applied can be used as the fundus imaging apparatus.
- Multifunctional devices include, for example, a device combining an optical scanning ophthalmoscope and an optical coherence tomograph, a device combining a fundus camera and an optical coherence tomograph, a device combining an optical coherence tomograph and a slit lamp microscope, and a device combining an optical scanning ophthalmoscope and a fundus camera.
- In a multifunction device, it is arbitrary which function is used to execute a target process. For example, in an apparatus combining an optical scanning ophthalmoscope and an optical coherence tomograph, the alignment index can be formed by the former and the focus index by the latter.
- light for imaging the fundus is used to form an index, but this light may be used for other purposes.
- this light can be used to provide a light stimulus to the photoreceptor cell.
- the light source unit can be controlled so that the amount of light when applying the light stimulus is smaller than that when imaging is performed.
- optical stimulation can be applied by the latter (or the former) while imaging is performed by the former (or the latter).
- the computer program for realizing the above embodiment can be stored in an arbitrary recording medium readable by a computer.
- As this recording medium, for example, a semiconductor memory, an optical disk, a magneto-optical disk (CD-ROM, DVD-RAM, DVD-ROM, MO, etc.), or a magnetic storage medium (hard disk, floppy (registered trademark) disk, ZIP, etc.) can be used.
- examinations may be carried out without an examiner present at sites such as group screenings, home medical care, remote medical care, and welfare facilities for the elderly. Under such examination conditions, the examiner cannot identify the subject, and so it is desirable that the fundus imaging and the association between the fundus image and the subject (personal information) can be easily performed by the subject himself or herself.
- The purpose of the third to fifth embodiments is to make it easy to associate a fundus image with personal information.
- A fundus imaging system for achieving this purpose has the following characteristics.
- A fundus imaging apparatus that images the fundus of a subject's eye; personal information acquisition means for acquiring the subject's personal information; and
- a fundus imaging system comprising an association unit that associates the image of the fundus acquired by the fundus imaging apparatus with the personal information acquired by the personal information acquisition unit.
- The fundus imaging apparatus includes: a scanning optical system that receives, at a light receiving unit, the return light of a spot light from the fundus while scanning the fundus of the subject's eye with the spot light; a control circuit unit that controls the scanning optical system so that a scanning locus of the spot light is formed on the fundus; and an image constructing unit that constructs an image from the return light using the light reception signal from the light receiving unit and the position of the scanning locus. The fundus imaging system according to Additional Item 3, wherein the personal information acquisition unit acquires the retinal pattern using an image constructed by the image constructing unit.
- The fundus imaging apparatus includes: a scanning optical system that receives, at a light receiving unit, the return light of a spot light from the fundus while scanning the fundus of the subject's eye with the spot light; a control circuit unit that controls the scanning optical system so that a scanning locus of the spot light is formed on the fundus; an image constructing unit that constructs an image from the return light using the light reception signal from the light receiving unit and the position of the scanning locus; and a projection lens, insertable into and removable from the optical path of the scanning optical system, for projecting the spot light onto the anterior segment of the subject's eye.
- The fundus imaging system according to Additional Item 5, wherein the personal information acquisition unit acquires the iris pattern using an image of the anterior segment constructed by the image constructing unit.
- The fundus imaging system according to Additional Item 1 or 2, wherein the personal information acquisition means includes a face photograph camera for photographing the subject's face, and acquires personal information using the face photograph obtained by the face photograph camera.
- The fundus imaging system according to Additional Item 1 or 2, wherein the personal information acquisition means acquires any one of a handprint, a fingerprint, a palm print, and a vein pattern of the subject as the personal information.
- The fundus imaging apparatus includes: an optical system that scans the fundus of the subject's eye with measurement light and detects interference light obtained by superimposing the return light of the measurement light from the fundus and reference light; and a processing unit that forms an image of the fundus by processing the detection result of the interference light obtained by the optical system. The fundus imaging system according to Additional Item 1 or 2, wherein the personal information acquisition unit acquires the personal information based on an image formed by the processing unit.
- The fundus imaging system according to Additional Item 9, wherein the personal information acquisition means acquires, as the personal information, morphological information representing the morphology of a predetermined site of the fundus and/or distribution information representing the distribution of that site, by analyzing the image formed by the processing unit.
- The optical system scans a three-dimensional region of the fundus with the measurement light,
- the processing unit forms a three-dimensional image of the fundus, and
- the personal information acquisition unit acquires the personal information based on the three-dimensional image. The fundus imaging system according to any one of Additional Items 9 to 11.
- The optical system scans a three-dimensional region of the fundus with the measurement light,
- the processing unit forms a three-dimensional image of the fundus and projects at least a part of the three-dimensional image in the depth direction of the fundus to form a fundus front image, and
- the personal information acquisition unit acquires the personal information based on the fundus front image. The fundus imaging system according to any one of Additional Items 9 to 11.
- The optical system scans a three-dimensional region of the fundus with the measurement light,
- the processing unit forms a three-dimensional image of the fundus and forms a two-dimensional cross-sectional image representing a cross section of the three-dimensional image, and
- the personal information acquisition unit acquires the personal information based on the two-dimensional cross-sectional image. The fundus imaging system according to any one of Additional Items 9 to 11.
- The fundus imaging system according to Additional Item 9, wherein the personal information acquisition means acquires, as the personal information, layer thickness distribution information representing the thickness distribution of a predetermined layer tissue of the fundus by analyzing the image formed by the processing unit.
- The fundus imaging apparatus has a mode switching unit for switching between a fundus mode for forming an image of the fundus of the subject's eye and an anterior segment mode for forming an image of the anterior segment, and
- the personal information acquisition means calculates, as the personal information, the distance between the fundus and the anterior segment based on the fundus image formed in the fundus mode and the anterior segment image formed in the anterior segment mode. The fundus imaging system according to Additional Item 9.
- The personal information acquisition means includes: a medical history input unit for inputting the subject's medical history; and a type selection unit that selects one or more of the plurality of types based on the medical history input by the medical history input unit. The fundus imaging system according to Additional Item 17, wherein personal information of the type selected by the type selection unit is acquired.
- The personal information acquisition means includes: an imaging interval acquisition unit that acquires the elapsed time since the previous imaging of the fundus; and a type selection unit that selects one or more of the plurality of types based on the elapsed time acquired by the imaging interval acquisition unit. The fundus imaging system according to Additional Item 17, wherein personal information of the type selected by the type selection unit is acquired.
- The personal information acquisition means includes: a disease candidate specifying unit that specifies disease candidates of the subject's eye by analyzing an image acquired by the fundus imaging apparatus; and a type selection unit that selects one or more of the plurality of types based on the disease candidates specified by the disease candidate specifying unit. The fundus imaging system according to Additional Item 17, wherein personal information of the type selected by the type selection unit is acquired.
- Storage means in which regular personal authentication information concerning a regular subject permitted to perform imaging using the fundus imaging apparatus is stored in advance; and collation means for collating the personal information acquired by the personal information acquisition means with the regular personal authentication information.
- The fundus imaging system according to any one of Additional Items 1 to 20, wherein the fundus imaging apparatus performs imaging when the collation unit succeeds in collating the personal information with the regular personal authentication information.
- The fundus imaging system according to Additional Item 21, wherein the collation unit determines that the collation has succeeded when the difference between the personal information and the regular personal authentication information falls within a preset allowable range.
- When the difference does not fall within the allowable range, the personal information acquisition means acquires other personal information, and the collation unit collates the other personal information with other regular personal authentication information stored in advance in the storage unit. The fundus imaging system according to Additional Item 22.
- The fundus imaging system according to any one of Additional Items 1 to 23, wherein the fundus imaging apparatus is of a portable type.
Abstract
Description
(Overview of the external configuration of the optical scanning ophthalmoscope)
FIGS. 1 to 4 are explanatory diagrams of a portable (carryable, movable) optical scanning ophthalmoscope serving as the fundus imaging apparatus according to an embodiment. In FIG. 1, reference numeral 1 denotes the optical scanning ophthalmoscope body as the apparatus body, and reference numeral 2 denotes a handle member.
FIG. 7 is a block diagram showing the scanning optical system and the control unit of the optical scanning ophthalmoscope according to this embodiment. Reference numeral 20 denotes an illumination light source unit that emits illumination light, and reference numeral 21 denotes a light receiving unit. The illumination light source unit 20 includes an infrared light source 20a that emits infrared light, a blue light source 20b that generates B light (blue light), a green light source 20g that generates G light (green light), and a red light source 20r that generates R light (red light). Each of these light sources is a light source with high spatial coherence, for example a semiconductor laser (including wavelength-swept lasers and superluminescent diodes), a solid-state laser, a gas laser, light from any of these coupled into an optical fiber, or a fiber laser.
Inside the optical scanning ophthalmoscope body 1 according to this embodiment are provided a movable case 1A, shown by a broken line in FIG. 1, and a drive mechanism (not shown) that moves the movable case 1A in the front-back, up-down, and left-right directions. The movable case 1A and the drive mechanism correspond to the "drive unit".
When the examiner operates the power button B, the optical scanning ophthalmoscope body 1 automatically enters the personal information acquisition mode and then shifts to the observation mode. The personal information acquisition mode will be described later; the observation mode is described first. In the observation mode, the lighting control circuit unit 28 turns on the infrared light source 20a. As a result, as shown in FIG. 8, a spot light Sp of infrared light is formed on the fundus Er.
In the imaging mode, the lighting control circuit unit 28 turns on the blue light source 20b, the green light source 20g, and the red light source 20r simultaneously. The lighting time is set to, for example, 100 milliseconds. While these three light sources are lit, the MEMS mirror 23 is driven so as to trace the same scanning locus SL as the scanning locus SL used when constructing the fundus image for observation.
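Although the embodiments do not specify a software implementation, the essence of the image constructing unit — placing each light-reception sample at the pixel addressed by the scanning locus — can be sketched as follows (a minimal illustration; the function and variable names are ours, not taken from the patent):

```python
import numpy as np

def build_fundus_image(scan_xy, samples, height, width):
    """Construct a 2-D en-face image by writing each photodetector
    sample to the pixel addressed by the scanning trajectory.

    scan_xy : iterable of (row, col) scan positions (the scanning locus SL)
    samples : light-reception signal values, one per scan position
    """
    image = np.zeros((height, width))
    for (row, col), value in zip(scan_xy, samples):
        image[row, col] = value
    return image

# Toy raster trajectory over a 4x4 frame with a constant signal.
trajectory = [(r, c) for r in range(4) for c in range(4)]
signal = [1.0] * len(trajectory)
img = build_fundus_image(trajectory, signal, 4, 4)
print(img.shape)  # (4, 4)
```

In the apparatus itself the same pairing of light-reception signal and scan position would be performed by the fundus image constructing unit 29 in hardware or firmware.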
First, the reason why alignment with respect to the subject's eye E is required will be described.
FIGS. 20 to 26 are explanatory diagrams showing modifications of the alignment index. The foregoing described a configuration in which the scanning locus of the spot light Sp traced by the MEMS mirror 23 is changed between fundus observation/imaging and alignment adjustment.
FIG. 20 shows control in which the scanning position of the MEMS mirror 23 and the lighting timing of the infrared light source 20a are synchronized so that the spot light Sp traces a circular, discontinuous locus on the fundus Er.
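The synchronization of scan position and lighting timing described for FIG. 20 can be sketched as follows (an illustrative model only; the coordinates, tolerance value, and function names are our assumptions):

```python
import math

def lighting_schedule(scan_positions, center, radius, tolerance=0.5):
    """Return one on/off flag per scan position: the light source is
    lit only when the raster scan passes close to the desired circle,
    so the spots that appear form a circular, discontinuous locus."""
    cx, cy = center
    schedule = []
    for x, y in scan_positions:
        dist = math.hypot(x - cx, y - cy)
        schedule.append(abs(dist - radius) <= tolerance)
    return schedule

positions = [(x, 8) for x in range(17)]  # one horizontal scan line
flags = lighting_schedule(positions, center=(8, 8), radius=5)
# The source fires only where this line crosses the circle.
print([x for (x, _), on in zip(positions, flags) if on])  # [3, 13]
```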
The foregoing described the case where the alignment index is formed by tracing a circular locus CRSp on the fundus Er. However, as shown in FIG. 21, the fundus Er may instead be irradiated with the spot light Sp so that a plurality of alignment indices are formed at different positions.
The example shown in FIG. 23 illustrates a case in which, when the scanning locus SL is drawn in a single vertical stroke, alignment indices each consisting of three spot lights Sp arranged vertically are formed at four locations, namely the upper, lower, left, and right corners of the fundus Er.
FIG. 25 shows a case in which, when the scanning locus SL is drawn in a single horizontal stroke, alignment indices each consisting of three spot lights Sp arranged vertically are formed at the upper center and lower center positions, and alignment indices each consisting of three spot lights Sp arranged horizontally are formed at the left center and right center positions. In other words, this modification is configured as if a cross-shaped alignment index were formed on the fundus Er.
After the alignment adjustment of the apparatus body with respect to the subject's eye E is completed, when the user operates the focusing button 11, the operation mode of the optical scanning ophthalmoscope body 1 shifts to the focusing mode (focus mode). A configuration may also be adopted in which completion of the alignment adjustment automatically triggers the shift to the focusing mode; in that case, the focusing button 11 is unnecessary.
The focus index is not limited to the line-shaped index SLP. For example, the ring-shaped index SCP shown in FIG. 30, the square index SRP shown in FIG. 31, or the star-shaped index SSP shown in FIG. 32 can be used as the focus index.
In this embodiment, a fixation target can be projected onto the fundus Er while the fundus Er is being observed. This processing is performed by the control circuit unit 27 controlling the lighting control circuit unit 28.
In the above description, fundus observation and imaging were described first for convenience, but an actual examination proceeds, for example, in the following flow.
(Step S1)
The power button B is turned on.
(Step S2)
The optical scanning ophthalmoscope is placed in front of the subject's eye E, and the eyepiece tube unit 3 is made to face the subject's eye E to be imaged.
(Step S3)
The alignment button 10 is operated to align the optical scanning ophthalmoscope body 1 with the subject's eye E.
(Step S4)
The focusing button 11 is operated to adjust the focus on the fundus Er of the subject's eye E.
(Step S5)
Upon completion of the alignment and focusing, the operation mode of the optical scanning ophthalmoscope automatically shifts to the observation mode. Along with the shift to the observation mode, a fixation target is projected onto the fundus Er. The presentation position of the fixation target may be made manually changeable, and a fixation target program in which a plurality of presentation positions can be selectively applied may be incorporated in advance.
(Step S6)
In response to an operation of the imaging button 9, the optical scanning ophthalmoscope performs imaging and stores the acquired fundus image EGr in the built-in memory unit 29b.
The first embodiment described the case where an optical scanning ophthalmoscope is used as the fundus imaging apparatus, but the fundus imaging apparatus is not limited to an optical scanning ophthalmoscope. For example, the optical coherence tomograph described in the background art can be used as the fundus imaging apparatus. More generally, the fundus imaging apparatus according to the embodiment need only be configured to scan the fundus with light, detect the return light from the fundus, and image the fundus based on the detection result and the position of the scanning locus.
The configuration of the fundus imaging apparatus according to the embodiment will be described. The fundus imaging apparatus 100 shown in FIG. 35 includes an optical unit 110, a computer 200, and a user interface (UI) 300. The fundus imaging apparatus 100 replaces the optical scanning ophthalmoscope body 1 of the first embodiment.
The optical unit 110 includes an optical system for performing OCT measurement and mechanisms for driving predetermined optical elements. The optical system splits the light from the light source 111 into measurement light and reference light, causes the return light of the measurement light from the subject's eye E to interfere with the reference light, and detects the resulting interference light. This optical system has the same configuration as a conventional spectral-domain OCT apparatus. That is, it splits low-coherence light (broadband light) into reference light and measurement light, generates interference light by causing the measurement light that has passed through the subject's eye E to interfere with the reference light that has traveled along the reference optical path, and detects the spectral components of this interference light. The detection result (detection signal) of the spectral components is sent to the computer 200.
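As a rough illustration of the spectral-domain principle described above — not the apparatus's actual signal chain — a depth profile (A-scan) can be recovered from the detected spectral components by a Fourier transform over wavenumber:

```python
import numpy as np

def a_scan_from_spectrum(spectral_fringe):
    """Recover a depth profile (A-scan) from a spectral interference
    signal, as in spectral-domain OCT: remove the DC term and take the
    magnitude of the Fourier transform of the fringe."""
    fringe = spectral_fringe - np.mean(spectral_fringe)
    depth_profile = np.abs(np.fft.fft(fringe))
    return depth_profile[: len(depth_profile) // 2]  # keep positive depths

# Simulated fringe: a single reflector produces one spectral modulation.
k = np.arange(1024)
fringe = 1.0 + 0.5 * np.cos(2 * np.pi * 40 * k / 1024)
profile = a_scan_from_spectrum(fringe)
print(int(np.argmax(profile)))  # peak at depth bin 40
```

A real spectrometer signal would additionally require resampling to linear wavenumber and dispersion compensation, as noted later for the image forming unit 220.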
The control system and data processing system of the fundus imaging apparatus 100 according to the embodiment will be described. A configuration example of the control system and data processing system is shown in FIG. 36.
A user interface 300 is connected to the computer 200. The user interface 300 includes a display unit 310 and an operation unit 320. The display unit 310 includes a display device such as a flat panel display. The operation unit 320 includes operation devices such as buttons, keys, a joystick, and an operation panel provided on the housing of the fundus imaging apparatus 100 or externally. When the computer 200 includes a personal computer, the operation unit 320 may include the operation devices of that personal computer (mouse, keyboard, trackpad, buttons, etc.).
The control unit 210 is provided in the computer 200 and is configured to include a microprocessor, RAM, ROM, a hard disk drive, and the like. The control unit 210 is provided with a main control unit 211 and a storage unit 212.
The main control unit 211 controls each part of the fundus imaging apparatus 100. For example, the targets of control by the main control unit 211 include the unit drive unit 110A, the light source 111, the reference mirror drive unit 114A, the scanner 116, the lens barrel drive unit 119B, the CCD (image sensor) 123, the flat panel display 125, the display unit 310, the data processing unit 230, and the communication unit 240.
The storage unit 212 stores various data, as well as various programs and data for operating the fundus imaging apparatus 100. The data stored in the storage unit 212 includes data acquired by the fundus imaging apparatus 100 and data stored in advance.
The setting information is information in which the settings of predetermined items concerning the optical unit 110 and the data processing unit 230 are recorded. The setting information includes, for example, settings concerning at least one of the following items: (1) fixation position; (2) scan pattern; (3) focus position; (4) diopter correction; (5) maximum interference depth; (6) analysis processing.
The image forming unit 220 forms image data based on the detection signal from the CCD image sensor 123. As in conventional spectral-domain OCT, this processing includes noise removal (noise reduction), filtering, dispersion compensation, FFT (Fast Fourier Transform), and the like. When another type of OCT is applied, the image forming unit 220 executes known processing corresponding to that type.
The data processing unit 230 executes various kinds of data processing. For example, the data processing unit 230 applies image processing to the image formed by the image forming unit 220. As one example, the data processing unit 230 can form image data of a three-dimensional image of the subject's eye E based on a plurality of two-dimensional cross-sectional images with different cross-sectional positions. Image data of a three-dimensional image means image data in which pixel positions are defined by a three-dimensional coordinate system; one example is image data consisting of three-dimensionally arranged voxels, called volume data or voxel data. When displaying an image based on volume data, the data processing unit 230 applies rendering processing (volume rendering, MIP (Maximum Intensity Projection), etc.) to the volume data to form image data of a pseudo three-dimensional image viewed from a specific viewing direction. The data processing unit 230 can also image an arbitrary cross section of the three-dimensional image (MPR (Multi-Planar Reconstruction): cross-section conversion).
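The MIP rendering mentioned above reduces, in essence, to taking the maximum voxel value along the projection (here, depth) axis; the same operation also yields the fundus front image formed by depth projection in the additional items. A minimal NumPy sketch (the array layout is our assumption):

```python
import numpy as np

def mip_front_image(volume, axis=0):
    """Maximum intensity projection of OCT volume data along the
    depth axis, yielding a pseudo en-face (fundus front) image."""
    return volume.max(axis=axis)

vol = np.zeros((8, 4, 4))        # (depth, rows, cols) voxel data
vol[5, 2, 3] = 7.0               # a bright voxel somewhere in depth
front = mip_front_image(vol)
print(front.shape, front[2, 3])  # (4, 4) 7.0
```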
The communication unit 240 performs data communication with external devices. Any data communication method may be used; for example, the communication unit 240 includes a communication interface conforming to the Internet, a communication interface conforming to a LAN, or a communication interface conforming to near-field communication. The data communication may be wired or wireless.
The operation and effects of the fundus imaging system according to this embodiment will be described. The manner of scanning the fundus and the manner of forming indices on the fundus may be the same as in the first embodiment.
The third embodiment describes a fundus imaging system that includes an optical scanning ophthalmoscope similar to that of the first embodiment. In the following, FIGS. 1 to 9 of the first embodiment are referred to as appropriate. The configuration of the optical system partly differs from that of the first embodiment (FIG. 7).
As schematically shown in FIG. 38, the optical scanning ophthalmoscope body 1 is connected via USB to a portable information device 16. At the medical institution (examiner side), a portable information device 16', a personal computer (PC) 16A, and a monitor unit 16B are installed. The portable information device 16' is connected to the personal computer 16A via USB, and receives the information transmitted from the portable information device 16 on the subject side.
FIG. 39 shows configuration example 1 of the fundus imaging system according to this embodiment. In configuration example 1, a handprint (palm shape) or fingerprint recognition sensor is used as the personal information acquisition means (external patient recognition unit). The biometric authentication information applicable in configuration example 1 is not limited to handprints and fingerprints; for example, palm prints or vein (blood vessel) patterns of the hand or fingers can also be applied.
FIG. 40 shows configuration example 2 of the fundus imaging system according to this embodiment. In configuration example 2, a program for recognizing a retinal pattern or an iris pattern is provided in the ROM as the personal information acquisition means.
FIG. 41 shows configuration example 3 of the fundus imaging system according to this embodiment. In configuration example 3, a face photograph camera 32 is provided as the personal information acquisition means. In this example, the face photograph camera 32 is configured to be activated by the alignment adjustment button 10; when the button is operated, the subject's face is photographed by the face photograph camera 32. The subsequent operations are the same as in configuration examples 1 and 2, and a detailed description is therefore omitted.
The third embodiment described a fundus imaging system in which an optical scanning ophthalmoscope is used as the fundus imaging apparatus, but the fundus imaging apparatus is not limited to an optical scanning ophthalmoscope. For example, the optical coherence tomograph described in the background art can be used as the fundus imaging apparatus.
The regular personal authentication information is the personal authentication information of a person (regular subject) permitted to undergo examination using the fundus imaging apparatus 100. Personal authentication information is information used to authenticate a person who is about to undergo examination using the fundus imaging apparatus 100, and is used as the personal information of the first embodiment.
The examination data generation unit 231 generates examination data indicating the state of the subject's eye E by processing the detection result of the interference light obtained by the optical unit 110. The "detection result of the interference light" is, for example, any of the following: (1) the signal output from the CCD image sensor 123; (2) the image data formed by the image forming unit 220; (3) data obtained at an intermediate stage of the processing executed by the image forming unit 220 (that is, data obtained partway through the image data forming process); (4) data obtained by processing the signal output from the CCD image sensor 123 with a component other than the image forming unit 220. Examples of the processing executed by the examination data generation unit 231 are described below.
In this embodiment, the examination data acquired by the examination data generation unit 231 can be used as personal authentication information. Regular personal authentication information is stored in advance in the storage unit 212. The authentication processing unit 232 collates personal authentication information such as the examination data with the regular personal authentication information, and sends the collation result (information indicating whether the collation has succeeded) to the control unit 210.
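A minimal sketch of collation against a preset allowable range, as performed by the authentication processing unit 232 (the numeric feature vectors and the difference metric here are illustrative assumptions, not the patent's method):

```python
def collate(personal_info, registered_info, tolerance):
    """Collation succeeds when the difference between the acquired
    personal information and the registered regular personal
    authentication information falls within a preset allowable range.
    Both arguments are numeric feature vectors; the absolute-difference
    metric is an illustrative choice."""
    difference = sum(abs(a - b) for a, b in zip(personal_info, registered_info))
    return difference <= tolerance

acquired = [0.41, 0.77, 0.12]
registered = [0.40, 0.80, 0.10]
print(collate(acquired, registered, tolerance=0.1))  # True
```

This also shows why an allowable range is needed: biometric examination data never matches the stored template exactly, so equality testing would always fail.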
The personal authentication information generated by the examination data generation unit 231 is sent to the control unit 210 and stored in the storage unit 212. Likewise, the OCT image formed by the image forming unit 220 (and the data processing unit 230) is sent to the control unit 210 and stored in the storage unit 212. The control unit 210 associates the OCT image with the personal authentication information (personal information). This processing is performed in the same manner as in the first embodiment. The control unit 210 is an example of the "association means".
The communication unit 240 transmits the OCT image and the personal authentication information (personal information) associated by the control unit 210 to the external computer 1000. The communication unit 240 is an example of the "transmission means".
The operation and effects of the fundus imaging system according to this embodiment will be described.
This embodiment describes a fundus imaging system capable of acquiring a plurality of types of personal information. The types of personal information may include any one or more of the types of personal information used in the third and fourth embodiments. The following description is based on the fourth embodiment.
FIG. 43 shows the configuration of the fundus imaging apparatus included in the fundus imaging system according to this example. The fundus imaging apparatus 101 has a function for inputting the subject's medical history. The medical history is information indicating the medical care given to the subject in the past, and includes disease history, treatment history, medication history, and the like. The medical history is input, for example, by acquiring the subject's electronic medical record via a network, or by manual input by the user. The former processing is executed, for example, by the control unit 210 controlling the communication unit 240; the latter is performed via the operation unit 320 on a predetermined input screen displayed on the display unit 310. These are examples of the "medical history input unit".
The fundus imaging apparatus included in the fundus imaging system according to this example has the same configuration as the first configuration example (see FIG. 43). This fundus imaging apparatus has a function of acquiring the elapsed time since the fundus Ef of the subject's eye E was last imaged. This function includes, for example, a first function of acquiring the subject's electronic medical record via a network, a second function of acquiring the previous imaging date (and time) from the electronic medical record, and a third function of calculating the elapsed time based on that imaging date and the current date and time. The first function is realized, for example, by the control unit 210 controlling the communication unit 240; the second and third functions are realized, for example, by the control unit 210. These are examples of the "imaging interval acquisition unit".
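The third function — calculating the elapsed time from the previous imaging date and the current date and time — amounts to a simple date difference, for example (function name and sample dates are illustrative):

```python
from datetime import datetime

def elapsed_since_last_imaging(previous, now=None):
    """Compute the elapsed time since the previous fundus imaging,
    given the imaging date read from the electronic medical record."""
    now = now or datetime.now()
    return now - previous

last = datetime(2013, 1, 15, 9, 30)
current = datetime(2013, 11, 27, 9, 30)
delta = elapsed_since_last_imaging(last, current)
print(delta.days)  # 316
```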
FIG. 44 shows the configuration of the fundus imaging apparatus included in the fundus imaging system according to this example. The fundus imaging apparatus 102 has a disease candidate specifying unit 234 that specifies disease candidates of the subject's eye E by analyzing the image formed by the image forming unit 220 or the data processing unit 230. A disease candidate means a disease that the subject's eye E may be suffering from. The image analysis executed by the disease candidate specifying unit 234 includes any processing (which may be known processing) for specifying disease candidates, and may include, for example, fundus layer thickness analysis, optic disc shape analysis, and drusen analysis. The disease candidate specifying unit 234 may be provided within the examination data generation unit 231.
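As one hypothetical illustration of how a layer-thickness analysis might suggest a disease candidate for review (the thresholding rule, the normative value, and all names are our assumptions, not a method specified in the patent):

```python
import numpy as np

def flag_thin_regions(layer_thickness_map, normative_mean, threshold):
    """Very simplified fundus layer-thickness analysis: flag pixels
    whose layer thickness falls more than `threshold` below a
    normative mean, as one way abnormal thinning might be surfaced
    when specifying disease candidates."""
    return layer_thickness_map < (normative_mean - threshold)

thickness = np.full((4, 4), 100.0)   # micrometres, illustrative values
thickness[1, 1] = 60.0               # an abnormally thin location
flags = flag_thin_regions(thickness, normative_mean=100.0, threshold=20.0)
print(int(flags.sum()))  # 1
```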
The configuration described above is merely an example for implementing this invention. Accordingly, arbitrary modifications (omissions, substitutions, additions, etc.) within the scope of the gist of this invention may be applied as appropriate.
The following additional items are noted regarding the fundus imaging systems according to the third to fifth embodiments.
A fundus imaging apparatus that images the fundus of a subject's eye;
personal information acquisition means for acquiring the subject's personal information; and
association means for associating the image of the fundus acquired by the fundus imaging apparatus with the personal information acquired by the personal information acquisition means;
a fundus imaging system comprising the foregoing.
The fundus imaging system according to Additional Item 1, characterized in that the fundus imaging apparatus includes transmission means for transmitting the image and the personal information associated by the association means.
The fundus imaging system according to Additional Item 1 or 2, characterized in that the personal information acquisition means acquires the subject's retinal pattern as the personal information.
The fundus imaging apparatus includes:
a scanning optical system that receives, at a light receiving unit, the return light of a spot light from the fundus while scanning the fundus of the subject's eye with the spot light;
a control circuit unit that controls the scanning optical system so that a scanning locus of the spot light is formed on the fundus; and
an image constructing unit that constructs an image from the return light using the light reception signal from the light receiving unit and the position of the scanning locus;
the fundus imaging system according to Additional Item 3, characterized in that the personal information acquisition means acquires the retinal pattern using an image constructed by the image constructing unit.
The fundus imaging system according to Additional Item 1 or 2, characterized in that the personal information acquisition means acquires the subject's iris pattern as the personal information.
The fundus imaging apparatus includes:
a scanning optical system that receives, at a light receiving unit, the return light of a spot light from the fundus while scanning the fundus of the subject's eye with the spot light;
a control circuit unit that controls the scanning optical system so that a scanning locus of the spot light is formed on the fundus;
an image constructing unit that constructs an image from the return light using the light reception signal from the light receiving unit and the position of the scanning locus; and
a projection lens, insertable into and removable from the optical path of the scanning optical system, for projecting the spot light onto the anterior segment of the subject's eye;
the fundus imaging system according to Additional Item 5, characterized in that the personal information acquisition means acquires the iris pattern using an image of the anterior segment constructed by the image constructing unit.
The personal information acquisition means includes a face photograph camera for photographing the subject's face;
the fundus imaging system according to Additional Item 1 or 2, characterized in that personal information is acquired using the face photograph obtained by the face photograph camera.
The fundus imaging system according to Additional Item 1 or 2, characterized in that the personal information acquisition means acquires any one of a handprint, a fingerprint, a palm print, and a vein pattern of the subject as the personal information.
The fundus imaging apparatus includes:
an optical system that scans the fundus of the subject's eye with measurement light and detects interference light obtained by superimposing the return light of the measurement light from the fundus and reference light; and
a processing unit that forms an image of the fundus by processing the detection result of the interference light obtained by the optical system;
the fundus imaging system according to Additional Item 1 or 2, characterized in that the personal information acquisition means acquires the personal information based on an image formed by the processing unit.
The fundus imaging system according to Additional Item 9, characterized in that the personal information acquisition means acquires, as the personal information, morphological information representing the morphology of a predetermined site of the fundus and/or distribution information representing the distribution of that site, by analyzing the image formed by the processing unit.
The fundus imaging system according to Additional Item 10, characterized in that the predetermined site includes any of a fundus blood vessel, the optic disc, a predetermined layer tissue, and a treatment scar from laser treatment.
The optical system scans a three-dimensional region of the fundus with the measurement light,
the processing unit forms a three-dimensional image of the fundus, and
the personal information acquisition means acquires the personal information based on the three-dimensional image;
the fundus imaging system according to any one of Additional Items 9 to 11 being characterized by the foregoing.
The optical system scans a three-dimensional region of the fundus with the measurement light,
the processing unit forms a three-dimensional image of the fundus and projects at least a part of the three-dimensional image in the depth direction of the fundus to form a fundus front image, and
the personal information acquisition means acquires the personal information based on the fundus front image;
the fundus imaging system according to any one of Additional Items 9 to 11 being characterized by the foregoing.
The optical system scans a three-dimensional region of the fundus with the measurement light,
the processing unit forms a three-dimensional image of the fundus and forms a two-dimensional cross-sectional image representing a cross section of the three-dimensional image, and
the personal information acquisition means acquires the personal information based on the two-dimensional cross-sectional image;
the fundus imaging system according to any one of Additional Items 9 to 11 being characterized by the foregoing.
The fundus imaging system according to Additional Item 9, characterized in that the personal information acquisition means acquires, as the personal information, layer thickness distribution information representing the thickness distribution of a predetermined layer tissue of the fundus by analyzing the image formed by the processing unit.
The fundus imaging apparatus has a mode switching unit for switching between a fundus mode for forming an image of the fundus of the subject's eye and an anterior segment mode for forming an image of the anterior segment, and
the personal information acquisition means calculates, as the personal information, the distance between the fundus and the anterior segment based on the fundus image formed in the fundus mode and the anterior segment image formed in the anterior segment mode;
the fundus imaging system according to Additional Item 9 being characterized by the foregoing.
The fundus imaging system according to any one of Additional Items 1 to 16, characterized in that the personal information acquisition means is capable of acquiring a plurality of types of personal information.
The personal information acquisition means includes:
a medical history input unit for inputting the subject's medical history; and
a type selection unit that selects one or more of the plurality of types based on the medical history input by the medical history input unit;
the fundus imaging system according to Additional Item 17, characterized in that personal information of the type selected by the type selection unit is acquired.
The personal information acquisition means includes:
an imaging interval acquisition unit that acquires the elapsed time since the previous imaging of the fundus; and
a type selection unit that selects one or more of the plurality of types based on the elapsed time acquired by the imaging interval acquisition unit;
the fundus imaging system according to Additional Item 17, characterized in that personal information of the type selected by the type selection unit is acquired.
The personal information acquisition means includes:
a disease candidate specifying unit that specifies disease candidates of the subject's eye by analyzing an image acquired by the fundus imaging apparatus; and
a type selection unit that selects one or more of the plurality of types based on the disease candidates specified by the disease candidate specifying unit;
the fundus imaging system according to Additional Item 17, characterized in that personal information of the type selected by the type selection unit is acquired.
Storage means in which regular personal authentication information concerning a regular subject permitted to perform imaging using the fundus imaging apparatus is stored in advance; and
collation means for collating the personal information acquired by the personal information acquisition means with the regular personal authentication information;
the fundus imaging system according to any one of Additional Items 1 to 20, characterized in that the fundus imaging apparatus performs imaging when the collation means succeeds in collating the personal information with the regular personal authentication information.
The fundus imaging system according to Additional Item 21, characterized in that the collation means determines that the collation has succeeded when the difference between the personal information and the regular personal authentication information falls within a preset allowable range.
When the difference does not fall within the allowable range, the personal information acquisition means acquires other personal information, and
the collation means collates the other personal information with other regular personal authentication information stored in advance in the storage means;
the fundus imaging system according to Additional Item 22 being characterized by the foregoing.
The fundus imaging system according to any one of Additional Items 1 to 23, characterized in that the fundus imaging apparatus is of a portable type.
27 Control circuit unit
29 Fundus image constructing unit
30 Fingerprint/handprint sensor
100, 101, 102 Fundus imaging apparatus
110 Optical unit
111 Light source
114 Reference mirror
114A Reference mirror drive unit
116 Scanner
123 CCD image sensor
127 Switching lens
127A Lens drive unit
200 Computer
210 Control unit
211 Main control unit
212 Storage unit
220 Image forming unit
230 Data processing unit
231 Examination data generation unit
232 Authentication processing unit
233 Type selection unit
234 Disease candidate specifying unit
240 Communication unit
300 User interface
310 Display unit
320 Operation unit
1000 External computer
2000 Communication line
Claims (14)
- A fundus imaging apparatus comprising:
a scanning optical system that scans the fundus of a subject's eye with light from a light source unit and receives the return light from the fundus at a light receiving unit;
a control circuit unit that controls the scanning optical system so that a scanning locus of the light is formed on the fundus; and
an image constructing unit that constructs an image of the fundus based on the light reception signal from the light receiving unit and the position of the scanning locus,
wherein the operation modes of the control circuit unit include an alignment mode in which the control circuit unit controls the scanning optical system so that an alignment index for aligning the scanning optical system with the subject's eye is projected onto the fundus based on the light from the light source unit.
- The fundus imaging apparatus according to claim 1, wherein, in the alignment mode, the control circuit unit causes display means to display the image of the alignment index constructed by the image constructing unit.
- The fundus imaging apparatus according to claim 1 or 2, further comprising a drive unit for moving the scanning optical system, wherein the control circuit unit includes a judgment unit that judges the alignment state based on the alignment index, and moves the scanning optical system by controlling the drive unit based on the result of this judgment.
- The fundus imaging apparatus according to any one of claims 1 to 3, wherein the scanning optical system includes a scanner, and in the alignment mode the control circuit unit projects the alignment index onto the fundus by controlling the scanner so that a scanning locus different from the scanning locus for constructing the image of the fundus is formed on the fundus.
- The fundus imaging apparatus according to any one of claims 1 to 3, wherein the scanning optical system includes a scanner, and in the alignment mode the control circuit unit projects the alignment index onto the fundus by coordinating control of the scanner based on the same scanning locus as the scanning locus for constructing the image of the fundus with control of the lighting timing of the light source unit.
- The fundus imaging apparatus according to claim 2, wherein the scanning optical system includes a scanner, and in the alignment mode the control circuit unit, while keeping the light source unit lit, causes the display means to display the image of the alignment index by coordinating control of the scanner based on the same scanning locus as the scanning locus for constructing the image of the fundus with control of the output timing of the light reception signal from the light receiving unit.
- The fundus imaging apparatus according to any one of claims 1 to 6, wherein the light irradiated onto the fundus in the alignment mode is infrared light.
- A fundus imaging apparatus comprising:
a scanning optical system that scans the fundus of a subject's eye with light from a light source unit and receives the return light from the fundus at a light receiving unit;
a control circuit unit that controls the scanning optical system so that a scanning locus of the light is formed on the fundus; and
an image constructing unit that constructs an image of the fundus based on the light reception signal from the light receiving unit and the position of the scanning locus,
wherein the operation modes of the control circuit unit include a focus mode in which the control circuit unit controls the scanning optical system so that a focus index for focusing the scanning optical system on the fundus of the subject's eye is projected onto the fundus based on the light from the light source unit.
- The fundus imaging apparatus according to claim 8, wherein, in the focus mode, the control circuit unit causes display means to display the image of the focus index constructed by the image constructing unit.
- The fundus imaging apparatus according to claim 8 or 9, wherein the scanning optical system includes a lens for focusing, the apparatus has a drive unit for moving the lens in the direction of the optical axis of the scanning optical system, and the control circuit unit includes a judgment unit that judges the focus state based on the focus index, and moves the lens by controlling the drive unit based on the result of this judgment.
- The fundus imaging apparatus according to any one of claims 8 to 10, wherein the light irradiated onto the fundus in the focus mode is infrared light.
- The fundus imaging apparatus according to any one of claims 1 to 11, wherein the operation modes of the control circuit unit include a fundus observation mode for observing the fundus of the subject's eye as a moving image, and in the fundus observation mode the control circuit unit controls the light source unit so that a fixation target of visible light is presented to the subject's eye.
- The fundus imaging apparatus according to any one of claims 1 to 12, wherein the scanning optical system forms a spot light from the light of the light source unit, scans the fundus with the spot light, and receives the return light of the spot light from the fundus at the light receiving unit; the control circuit unit controls the scanning optical system so that a scanning locus of the spot light is formed on the fundus; and the image constructing unit constructs a front image of the fundus based on the light reception signal from the light receiving unit and the position of the scanning locus.
- The fundus imaging apparatus according to any one of claims 1 to 12, wherein the scanning optical system includes an interference optical system that scans the fundus with measurement light and generates interference light by superimposing the return light of the measurement light from the fundus and reference light, and receives the interference light at the light receiving unit; the control circuit unit controls the scanning optical system so that a scanning locus of the measurement light is formed on the fundus; and the image constructing unit constructs a two-dimensional cross-sectional image or a three-dimensional image of the fundus based on the light reception signal from the light receiving unit and the position of the scanning locus.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP13858552.6A EP2926722A4 (en) | 2012-11-30 | 2013-11-27 | PHOTOGRAPHER FOR EYE BACKGROUND |
US14/648,575 US10149615B2 (en) | 2012-11-30 | 2013-11-27 | Fundus imaging apparatus that determines a state of alignment |
JP2014549848A JP6310859B2 (ja) | 2012-11-30 | 2013-11-27 | 眼底撮影装置 |
US15/335,861 US10226175B2 (en) | 2012-11-30 | 2016-10-27 | Fundus imaging apparatus |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012262241 | 2012-11-30 | ||
JP2012-262727 | 2012-11-30 | ||
JP2012262727 | 2012-11-30 | ||
JP2012-262241 | 2012-11-30 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/648,575 A-371-Of-International US10149615B2 (en) | 2012-11-30 | 2013-11-27 | Fundus imaging apparatus that determines a state of alignment |
US15/335,861 Division US10226175B2 (en) | 2012-11-30 | 2016-10-27 | Fundus imaging apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014084231A1 true WO2014084231A1 (ja) | 2014-06-05 |
Family
ID=50827867
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/081856 WO2014084231A1 (ja) | 2012-11-30 | 2013-11-27 | 眼底撮影装置 |
Country Status (4)
Country | Link |
---|---|
US (2) | US10149615B2 (ja) |
EP (1) | EP2926722A4 (ja) |
JP (3) | JP6310859B2 (ja) |
WO (1) | WO2014084231A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016032608A (ja) * | 2014-07-31 | 2016-03-10 | 株式会社ニデック | 光コヒーレンストモグラフィー装置、および眼底画像処理プログラム |
JP2016198506A (ja) * | 2015-04-12 | 2016-12-01 | 台医光電科技股▲ふん▼有限公司 | アライメントデバイス及びその方法 |
JP2016202900A (ja) * | 2015-04-15 | 2016-12-08 | 株式会社トプコン | 最適な信号処理によるoct血管造影法 |
JP2017164064A (ja) * | 2016-03-14 | 2017-09-21 | キヤノン株式会社 | 眼科装置及びその制御方法、並びに、プログラム |
JP2017196210A (ja) * | 2016-04-28 | 2017-11-02 | 株式会社トプコン | 眼科撮影装置 |
JP2018023563A (ja) * | 2016-08-10 | 2018-02-15 | 株式会社トプコン | 眼科撮影装置 |
JP2018094324A (ja) * | 2016-12-16 | 2018-06-21 | 株式会社トーメーコーポレーション | 眼科装置 |
CN108734157A (zh) * | 2018-08-28 | 2018-11-02 | 北京乾沛科技有限公司 | 第一指节指静脉采集装置和方法 |
WO2019069648A1 (ja) * | 2017-10-05 | 2019-04-11 | 株式会社Qdレーザ | 視覚検査装置 |
Families Citing this family (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8348429B2 (en) | 2008-03-27 | 2013-01-08 | Doheny Eye Institute | Optical coherence tomography device, method, and system |
US11839430B2 (en) | 2008-03-27 | 2023-12-12 | Doheny Eye Institute | Optical coherence tomography-based ophthalmic testing methods, devices and systems |
WO2010009450A1 (en) | 2008-07-18 | 2010-01-21 | Doheny Eye Institute | Optical coherence tomography device, method, and system |
US10314763B2 (en) * | 2013-12-31 | 2019-06-11 | Teeny Clean, Llc | Eyelid care appliance |
US9833140B2 (en) | 2015-01-09 | 2017-12-05 | Smart Vision Labs, Inc. | Portable corneal topographer |
US10383508B2 (en) * | 2015-02-20 | 2019-08-20 | Gwangju Institute Of Science And Technology | Endoscope, handpiece of endoscope, calibration method therefor, and method for using the same |
US10820824B2 (en) | 2015-05-12 | 2020-11-03 | Diagnosys LLC | Combined stimulator and bipolar electrode assembly for mouse electroretinography (ERG) |
US11357442B2 (en) | 2015-05-12 | 2022-06-14 | Diagnosys LLC | Combined stimulator and electrode assembly for mouse electroretinography (ERG) |
US11160447B2 (en) * | 2015-08-24 | 2021-11-02 | The Board Of Trustees Of The University Of Illinois | Pixelated, full-field multi-protocol stimulus source apparatus, method and system for probing visual pathway function |
EP3349642B1 (en) | 2015-09-17 | 2020-10-21 | Envision Diagnostics, Inc. | Medical interfaces and other medical devices, systems, and methods for performing eye exams |
JP6619202B2 (ja) * | 2015-10-29 | 2019-12-11 | 株式会社トプコン | 眼科撮影装置 |
US10130250B2 (en) * | 2015-11-02 | 2018-11-20 | Nidek Co., Ltd. | OCT data processing apparatus and OCT data processing program |
US20170119250A1 (en) * | 2015-11-04 | 2017-05-04 | The Charles Stark Draper Laboratory, Inc. | Portable hardware fixture for fundoscopy |
WO2017083567A1 (en) * | 2015-11-10 | 2017-05-18 | Diagnosys LLC | Method and apparatus for the assessment of electrophysiological signals |
US10820840B2 (en) * | 2016-04-28 | 2020-11-03 | Joshua Noel Hogan | Optical coherence tomography for identity verification |
WO2017190071A1 (en) * | 2016-04-30 | 2017-11-02 | Envision Diagnostics, Inc. | Medical devices, systems, and methods for performing eye exams using displays comprising mems scanning mirrors |
EP3448234A4 (en) | 2016-04-30 | 2019-05-01 | Envision Diagnostics, Inc. | MEDICAL DEVICES, SYSTEMS AND METHODS FOR OPERATING OCULAR EXAMINATIONS AND OCULOMETRY |
US10832051B1 (en) * | 2016-06-13 | 2020-11-10 | Facebook Technologies, Llc | Eye tracking using optical coherence methods |
JP2018061621A (ja) * | 2016-10-11 | 2018-04-19 | オプトス ピーエルシー | 眼底画像撮影装置、眼底画像撮影方法、及び眼底画像撮影プログラム |
KR101948674B1 (ko) | 2016-10-20 | 2019-02-18 | (주)하이모 | 포터블 스캐너 및 그 스캐닝 방법 |
DE102016121246A1 (de) * | 2016-11-07 | 2018-05-09 | Carl Zeiss Ag | Verfahren zur Selbstuntersuchung eines Auges und ophthalmologische Selbstuntersuchungsvorrichtung |
KR101942465B1 (ko) * | 2018-01-30 | 2019-01-28 | 주식회사 루티헬스 | 망막 촬영 장치 및 이를 이용한 망막 촬영 방법 |
KR101911441B1 (ko) * | 2017-02-01 | 2018-10-24 | 주식회사 루티헬스 | 휴대용 망막 촬영 장치 및 이를 이용한 망막 촬영 방법 |
WO2018143651A1 (ko) * | 2017-02-01 | 2018-08-09 | 주식회사 루티헬스 | 망막 촬영 장치 및 이를 이용한 망막 촬영 방법 |
RU2667875C2 (ru) * | 2017-02-02 | 2018-09-24 | Федеральное государственное бюджетное образовательное учреждение высшего образования "Петрозаводский государственный университет" | Микрофотовидеофиксирующее устройство |
US10529082B2 (en) * | 2017-06-20 | 2020-01-07 | Mitutoyo Corporation | Three-dimensional geometry measurement apparatus and three-dimensional geometry measurement method |
US11373450B2 (en) * | 2017-08-11 | 2022-06-28 | Tectus Corporation | Eye-mounted authentication system |
JP7027075B2 (ja) | 2017-09-06 | 2022-03-01 | キヤノン株式会社 | 光干渉断層撮影装置及びその制御方法 |
CN111542258B (zh) | 2017-11-07 | 2023-10-20 | 诺达尔视觉有限公司 | 用于眼科成像设备的对准的方法和系统 |
EP3675711A4 (en) | 2017-11-07 | 2021-06-30 | Notal Vision Ltd. | RETINAL IMAGING DEVICE AND RELATED PROCEDURES |
CN109859155A (zh) * | 2017-11-30 | 2019-06-07 | 京东方科技集团股份有限公司 | 影像畸变检测方法和系统 |
US20190290117A1 (en) * | 2018-03-22 | 2019-09-26 | Kabushiki Kaisha Topcon | Interferometric fundus imaging method |
US10963046B1 (en) | 2018-05-17 | 2021-03-30 | Facebook Technologies, Llc | Drift corrected eye tracking |
KR20190138548A (ko) * | 2018-06-05 | 2019-12-13 | 주식회사 필로포스 | Point of care 진단을 위한 일체형 핸드헬드 배터리 구동 OCT 시스템 |
US11219362B2 (en) * | 2018-07-02 | 2022-01-11 | Nidek Co., Ltd. | Fundus imaging apparatus |
US11497911B2 (en) | 2018-07-18 | 2022-11-15 | Diagnosys LLC | Electrically evoked response (EER) stimulator/amplifier combination |
CN110895824B (zh) * | 2018-09-12 | 2023-03-28 | 上海耕岩智能科技有限公司 | 确定显示屏幕厚度参数的方法、存储介质及电子设备 |
WO2020064475A1 (en) * | 2018-09-27 | 2020-04-02 | Albanna Walid | Method of retinal vessel analysis, a portable retinal vessel analysis apparatus and a non-transitory computer-readable medium |
US10595722B1 (en) | 2018-10-03 | 2020-03-24 | Notal Vision Ltd. | Automatic optical path adjustment in home OCT |
US10993613B2 (en) * | 2018-12-21 | 2021-05-04 | Welch Allyn, Inc. | Fundus image capturing |
JP7199236B2 (ja) | 2019-01-24 | 2023-01-05 | 株式会社トプコン | 眼科装置 |
JP7443400B2 (ja) * | 2019-05-31 | 2024-03-05 | 株式会社ニコン | 眼科装置及び断層画像生成装置 |
US10653311B1 (en) | 2019-06-12 | 2020-05-19 | Notal Vision Ltd. | Home OCT with automatic focus adjustment |
US11832885B2 (en) | 2019-10-24 | 2023-12-05 | Sanovas Intellectual Property, Llc | Patient home monitoring and physician alert for ocular anatomy |
JP7435961B2 (ja) | 2020-03-27 | 2024-02-21 | 興和株式会社 | 眼底撮影装置 |
US20230148862A1 (en) * | 2020-04-04 | 2023-05-18 | The Board Of Regents Of The University Of Texas System | Systems and methods to measure retinal perfusion |
US11423569B1 (en) * | 2021-04-09 | 2022-08-23 | Varjo Technologies Oy | Gaze-tracking system and method employing selective glints |
US20240099580A1 (en) * | 2022-09-27 | 2024-03-28 | Optomed Plc | Ophthalmic imaging instrument and ophthalmic imaging method |
US11864834B1 (en) * | 2023-01-25 | 2024-01-09 | SoliDDD Corp. | Multi-tiled plenoptic system for the detection and correction of ocular defects and for improved foveated rendering |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06110947A (ja) | 1992-09-30 | 1994-04-22 | Pfu Ltd | 医療データ検索システム |
JP2002369801A (ja) | 2001-06-14 | 2002-12-24 | Canon Inc | 眼科撮影装置 |
JP2005279121A (ja) | 2004-03-31 | 2005-10-13 | Nidek Co Ltd | 眼底撮影装置 |
JP2007325831A (ja) | 2006-06-09 | 2007-12-20 | Topcon Corp | 眼底観察装置、眼科画像処理装置及び眼科画像処理プログラム |
JP2008206684A (ja) | 2007-02-26 | 2008-09-11 | Topcon Corp | 眼底観察装置、眼底画像処理装置及びプログラム |
JP2009061203A (ja) | 2007-09-10 | 2009-03-26 | Univ Of Tokyo | 眼底観察装置、眼底画像処理装置及びプログラム |
JP2009066015A (ja) | 2007-09-10 | 2009-04-02 | Univ Of Tokyo | 眼底観察装置、眼科画像処理装置及びプログラム |
JP2011147609A (ja) | 2010-01-21 | 2011-08-04 | Nidek Co Ltd | 眼科撮影装置 |
WO2011122004A1 (en) * | 2010-03-31 | 2011-10-06 | Canon Kabushiki Kaisha | Imaging apparatus and imaging method |
JP2011244917A (ja) * | 2010-05-25 | 2011-12-08 | Topcon Corp | 走査型レーザ撮影装置 |
JP2012075641A (ja) * | 2010-09-30 | 2012-04-19 | Nidek Co Ltd | 眼科撮影装置 |
Family Cites Families (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS59183727A (ja) * | 1983-04-02 | 1984-10-18 | Topcon Corp | Focusing detection device for ophthalmic instruments
JPH01113025A (ja) | 1987-10-28 | 1989-05-01 | Topcon Corp | Laser scanning ophthalmic apparatus
JPH09234186A (ja) | 1996-03-01 | 1997-09-09 | Nikon Corp | Optometric apparatus
JPH10151114A (ja) | 1996-11-26 | 1998-06-09 | Nippon Telegr & Teleph Corp <Ntt> | Fundus image classification processing method
JPH119553A (ja) | 1997-06-26 | 1999-01-19 | Nikon Corp | Ophthalmic apparatus
US6299307B1 (en) * | 1997-10-10 | 2001-10-09 | Visx, Incorporated | Eye tracking device for laser eye surgery using corneal margin detection
US6027216A (en) * | 1997-10-21 | 2000-02-22 | The Johns Hopkins University School Of Medicine | Eye fixation monitor and tracker
JP3964035B2 (ja) * | 1998-03-12 | 2007-08-22 | Kowa Co | Ophthalmic apparatus
JP2000189386A (ja) | 1998-12-25 | 2000-07-11 | Nidek Co Ltd | Fundus camera
JP2000237168A (ja) | 1999-02-19 | 2000-09-05 | Canon Inc | Ophthalmic examination apparatus
JP2001161646A (ja) | 1999-12-07 | 2001-06-19 | Canon Inc | Ophthalmic imaging apparatus
CN1498385A (zh) | 2000-08-29 | 2004-05-19 | Imaging Therapeutics, Inc. | Method and device for quantitative analysis of X-ray images
US6904123B2 (en) | 2000-08-29 | 2005-06-07 | Imaging Therapeutics, Inc. | Methods and devices for quantitative analysis of x-ray images
US7050534B2 (en) | 2000-08-29 | 2006-05-23 | Imaging Therapeutics, Inc. | Methods and devices for quantitative analysis of x-ray images
US20020186818A1 (en) | 2000-08-29 | 2002-12-12 | Osteonet, Inc. | System and method for building and manipulating a centralized measurement value database
US7467892B2 (en) | 2000-08-29 | 2008-12-23 | Imaging Therapeutics, Inc. | Calibration devices and methods of use thereof
JP3742288B2 (ja) | 2000-09-05 | 2006-02-01 | Nidek Co Ltd | Optometric apparatus
JP2003000546A (ja) | 2001-06-18 | 2003-01-07 | Canon Inc | Ophthalmic apparatus
JP4126244B2 (ja) | 2003-03-27 | 2008-07-30 | Nidek Co Ltd | Fundus camera
JP4179606B2 (ja) | 2003-06-09 | 2008-11-12 | Konan Medical Inc | Photorefractor
US7766903B2 (en) * | 2003-12-24 | 2010-08-03 | The Board Of Trustees Of The Leland Stanford Junior University | Patterned laser treatment of the retina
US7452080B2 (en) | 2004-06-10 | 2008-11-18 | Optimedica Corporation | Scanning ophthalmic fixation method and apparatus
JP4510534B2 (ja) | 2004-06-22 | 2010-07-28 | Topcon Corp | Optical characteristic measuring device and fundus image observation device
US8394084B2 (en) | 2005-01-10 | 2013-03-12 | Optimedica Corporation | Apparatus for patterned plasma-mediated laser trephination of the lens capsule and three dimensional phaco-segmentation
JP2007097820A (ja) | 2005-10-04 | 2007-04-19 | Sumitomo Electric Ind Ltd | Biological examination system and biological examination method
JP4822969B2 (ja) * | 2006-07-27 | 2011-11-24 | Nidek Co Ltd | Ophthalmic imaging apparatus
JP4817184B2 (ja) | 2006-09-08 | 2011-11-16 | Gifu University | Image capturing apparatus and image analysis program
JP4996917B2 (ja) * | 2006-12-26 | 2012-08-08 | Topcon Corp | Optical image measuring device and program for controlling the same
JP4937792B2 (ja) | 2007-03-01 | 2012-05-23 | Nidek Co Ltd | Fundus camera
JP5138977B2 (ja) | 2007-05-24 | 2013-02-06 | Topcon Corp | Optical image measuring device
US10398599B2 (en) * | 2007-10-05 | 2019-09-03 | Topcon Medical Laser Systems Inc. | Semi-automated ophthalmic photocoagulation method and apparatus
JP5192250B2 (ja) | 2008-02-04 | 2013-05-08 | Topcon Corp | Fundus observation device
US7824035B2 (en) | 2008-06-02 | 2010-11-02 | Nidek Co., Ltd. | Ophthalmic photographing apparatus
JP5209377B2 (ja) * | 2008-06-02 | 2013-06-12 | Nidek Co Ltd | Fundus imaging apparatus
JP5324839B2 (ja) | 2008-06-19 | 2013-10-23 | Topcon Corp | Optical image measuring device
JP4810562B2 (ja) | 2008-10-17 | 2011-11-09 | Canon Inc | Image processing apparatus and image processing method
WO2010062883A1 (en) | 2008-11-26 | 2010-06-03 | Bioptigen, Inc. | Methods, systems and computer program products for biometric identification by tissue imaging using optical coherence tomography (oct)
JP5255514B2 (ja) | 2009-05-01 | 2013-08-07 | Nidek Co Ltd | Ophthalmic imaging apparatus
JP5361522B2 (ja) | 2009-05-08 | 2013-12-04 | Canon Inc | Fundus camera
JP5432625B2 (ja) | 2009-07-29 | 2014-03-05 | Topcon Corp | Ophthalmic observation device
EP2347701B1 (en) | 2010-01-21 | 2017-01-04 | Nidek Co., Ltd | Ophthalmic photographing apparatus
JP5701625B2 (ja) * | 2010-03-31 | 2015-04-15 | Nidek Co Ltd | Ophthalmic laser treatment apparatus
JP5762712B2 (ja) | 2010-09-30 | 2015-08-12 | Nidek Co Ltd | Ophthalmic observation system
JP2012176162A (ja) | 2011-02-28 | 2012-09-13 | Topcon Corp | Fundus observation device
JP2012200292A (ja) | 2011-03-23 | 2012-10-22 | Nidek Co Ltd | Medical information management system
TWI453523B (zh) | 2011-12-29 | 2014-09-21 | Ind Tech Res Inst | Diagnostic apparatus with automatic focusing function
- 2013
- 2013-11-27 JP JP2014549848A patent/JP6310859B2/ja active Active
- 2013-11-27 WO PCT/JP2013/081856 patent/WO2014084231A1/ja active Application Filing
- 2013-11-27 US US14/648,575 patent/US10149615B2/en active Active
- 2013-11-27 EP EP13858552.6A patent/EP2926722A4/en not_active Withdrawn
- 2016
- 2016-03-18 JP JP2016055371A patent/JP2016105945A/ja active Pending
- 2016-05-31 JP JP2016107999A patent/JP2016172041A/ja active Pending
- 2016-10-27 US US15/335,861 patent/US10226175B2/en active Active
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06110947A (ja) | 1992-09-30 | 1994-04-22 | Pfu Ltd | Medical data retrieval system
JP2002369801A (ja) | 2001-06-14 | 2002-12-24 | Canon Inc | Ophthalmic imaging apparatus
JP2005279121A (ja) | 2004-03-31 | 2005-10-13 | Nidek Co Ltd | Fundus imaging apparatus
JP2007325831A (ja) | 2006-06-09 | 2007-12-20 | Topcon Corp | Fundus observation device, ophthalmic image processing device, and ophthalmic image processing program
JP2008206684A (ja) | 2007-02-26 | 2008-09-11 | Topcon Corp | Fundus observation device, fundus image processing device, and program
JP2009061203A (ja) | 2007-09-10 | 2009-03-26 | Univ Of Tokyo | Fundus observation device, fundus image processing device, and program
JP2009066015A (ja) | 2007-09-10 | 2009-04-02 | Univ Of Tokyo | Fundus observation device, ophthalmic image processing device, and program
JP2011147609A (ja) | 2010-01-21 | 2011-08-04 | Nidek Co Ltd | Ophthalmic imaging apparatus
WO2011122004A1 (en) * | 2010-03-31 | 2011-10-06 | Canon Kabushiki Kaisha | Imaging apparatus and imaging method
JP2011212203A (ja) * | 2010-03-31 | 2011-10-27 | Canon Inc | Imaging apparatus and imaging method
CN102834046A (zh) * | 2010-03-31 | 2012-12-19 | Canon Inc | Imaging apparatus and imaging method
KR20120140672A (ko) * | 2010-03-31 | 2012-12-31 | Canon Inc | Imaging apparatus and imaging method
US20130003018A1 (en) * | 2010-03-31 | 2013-01-03 | Canon Kabushiki Kaisha | Imaging apparatus and imaging method
EP2552298A1 (en) * | 2010-03-31 | 2013-02-06 | Canon Kabushiki Kaisha | Imaging apparatus and imaging method
JP2011244917A (ja) * | 2010-05-25 | 2011-12-08 | Topcon Corp | Scanning laser imaging apparatus
JP2012075641A (ja) * | 2010-09-30 | 2012-04-19 | Nidek Co Ltd | Ophthalmic imaging apparatus
Non-Patent Citations (1)
Title |
---|
See also references of EP2926722A4 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016032608A (ja) * | 2014-07-31 | 2016-03-10 | Nidek Co Ltd | Optical coherence tomography apparatus and fundus image processing program
JP2016198506A (ja) * | 2015-04-12 | 2016-12-01 | Taiwan Biophotonic Corp | Alignment device and method therefor
US9696254B2 (en) | 2015-04-12 | 2017-07-04 | Taiwan Biophotonic Corporation | Device and method for alignment
JP2016202900A (ja) * | 2015-04-15 | 2016-12-08 | Topcon Corp | OCT angiography using optimal signal processing
JP2017164064A (ja) * | 2016-03-14 | 2017-09-21 | Canon Inc | Ophthalmic apparatus, control method therefor, and program
JP2017196210A (ja) * | 2016-04-28 | 2017-11-02 | Topcon Corp | Ophthalmic imaging apparatus
JP2018023563A (ja) * | 2016-08-10 | 2018-02-15 | Topcon Corp | Ophthalmic imaging apparatus
JP2018094324A (ja) * | 2016-12-16 | 2018-06-21 | Tomey Corp | Ophthalmic apparatus
WO2019069648A1 (ja) * | 2017-10-05 | 2019-04-11 | QD Laser Inc | Visual examination device
JPWO2019069648A1 (ja) * | 2017-10-05 | 2020-01-23 | QD Laser Inc | Visual examination device
US11717157B2 (en) | 2017-10-05 | 2023-08-08 | Qd Laser, Inc. | Visual sense examination device
CN108734157A (zh) * | 2018-08-28 | 2018-11-02 | 北京乾沛科技有限公司 | First-knuckle finger vein collection device and method
CN108734157B (zh) * | 2018-08-28 | 2024-04-05 | 北京乾沛科技有限公司 | First-knuckle finger vein collection device and method
Also Published As
Publication number | Publication date |
---|---|
JP6310859B2 (ja) | 2018-04-11 |
US20150313467A1 (en) | 2015-11-05 |
EP2926722A1 (en) | 2015-10-07 |
JPWO2014084231A1 (ja) | 2017-01-05 |
US10149615B2 (en) | 2018-12-11 |
EP2926722A4 (en) | 2016-12-21 |
US20170042422A1 (en) | 2017-02-16 |
JP2016172041A (ja) | 2016-09-29 |
JP2016105945A (ja) | 2016-06-16 |
US10226175B2 (en) | 2019-03-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6310859B2 (ja) | Fundus imaging apparatus | |
JP6338358B2 (ja) | Fundus imaging system | |
JP6141140B2 (ja) | Ophthalmic imaging apparatus | |
JP5989523B2 (ja) | Ophthalmic apparatus | |
JP2015033472A (ja) | Ophthalmic imaging apparatus | |
WO2016027589A1 (ja) | Ophthalmic imaging apparatus and control method therefor | |
JP6566541B2 (ja) | Ophthalmic apparatus | |
JP6498398B2 (ja) | Ophthalmic apparatus | |
JP2014200678A (ja) | Ophthalmic apparatus | |
JP2015029559A (ja) | Imaging apparatus and imaging method | |
JP2015029557A (ja) | Image processing apparatus and image processing method | |
JP2018198967A (ja) | Ophthalmic apparatus | |
JP6407631B2 (ja) | Ophthalmic apparatus | |
JP6392408B2 (ja) | Ophthalmic apparatus | |
JP2014039870A (ja) | Optical image measuring device and imaging apparatus | |
JP6901264B2 (ja) | Ophthalmic apparatus | |
JP6422529B2 (ja) | Program and ophthalmic system | |
JP2017164522A (ja) | Ophthalmic apparatus | |
JP2017164520A (ja) | Ophthalmic apparatus | |
JP2017164521A (ja) | Ophthalmic apparatus | |
JP2019135005A (ja) | Ophthalmic imaging apparatus | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13858552 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014549848 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14648575 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2013858552 Country of ref document: EP |