WO2023176599A1 - Fundus-imaging device and fundus-imaging method - Google Patents

Fundus-imaging device and fundus-imaging method

Info

Publication number
WO2023176599A1
Authority
WO
WIPO (PCT)
Prior art keywords
section
optical
fundus
imaging
unit
Prior art date
Application number
PCT/JP2023/008671
Other languages
French (fr)
Japanese (ja)
Inventor
Tatsumasa Imai
Original Assignee
Sony Group Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2023176599A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for looking at the eye fundus, e.g. ophthalmoscopes
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14: Arrangements specially adapted for eye photography

Definitions

  • the present disclosure relates to a fundus imaging device and a fundus imaging method.
  • Fundus imaging devices that photograph the fundus of the eye through the pupil are known.
  • Fundus imaging devices are broadly classified into stationary devices and handheld devices.
  • In a stationary device, the fundus imaging device is installed on a table or the like, the subject's head is fixed relative to the fundus imaging device, and fundus imaging is performed.
  • With a handheld device, the photographer holds the fundus imaging device and, for example, brings it into contact with the subject's head (eyeball area) to photograph the fundus.
  • In these devices, the optical axis from the tip of the photographic lens to the imaging surface is fixed. Therefore, in order to appropriately set the image frame on the imaging-surface side, the photographer needs to move the entire optical system of the fundus imaging device itself in six axes.
  • This image frame setting work is called alignment adjustment.
  • The alignment adjustment procedure differs between stationary and handheld fundus imaging devices.
  • An alignment adjustment method for a stationary fundus imaging device is, for example, as follows. (1) While viewing an image of the anterior segment of the eye with the fundus imaging device installed on a table, the photographer instructs the subject to look straight ahead along the imaging optical axis, thereby fixing the subject's line of sight. (2) The photographer uses a controller or the like to operate the alignment adjustment mechanism built into the fundus imaging device, moving the entire optical system in the direction of the imaging optical axis and along the three axes on the plane perpendicular to the imaging optical axis.
  • An alignment adjustment method for a handheld fundus imaging device is, for example, as follows. (1) While viewing an image of the anterior segment of the eye with the fundus imaging device held in hand, the photographer instructs the subject to look approximately straight ahead, thereby fixing the line of sight. (2) The photographer moves the fundus imaging device so that the subject's line of sight overlaps the imaging optical axis, and holds the device steady in that state.
  • As described above, stationary fundus imaging devices require the subject's head to be fixed to the device and the entire optical system of the device to be moved in six axes, resulting in an increase in size.
  • In addition, a stationary fundus imaging device requires skill on the photographer's part to properly fix the subject's line of sight, so a person with appropriate photography skills must be present at the time of imaging.
  • A handheld fundus imaging device, on the other hand, has no built-in alignment adjustment mechanism and consists only of an optical system, thereby achieving miniaturization.
  • As a result, however, the alignment adjustment work becomes more complicated. Therefore, like a stationary device, a handheld fundus imaging device also requires a person with appropriate skills to be present at the time of imaging.
  • An object of the present disclosure is to provide a fundus imaging device and a fundus imaging method that can be miniaturized and have easy alignment adjustment.
  • A fundus imaging device according to the present disclosure includes: a first optical section that is provided between an eye to be examined and an imaging section that images the fundus of the eye, and that changes the visual-field direction of the imaging section with respect to the eye; and a second optical section that is provided between the first optical section and the imaging section and that corrects the traveling direction of light emitted from the fundus and irradiated onto the imaging section via the first optical section.
  • FIG. 1A is a schematic diagram schematically showing the configuration of a fundus imaging device according to an existing technique.
  • FIG. 1B is a schematic diagram schematically showing the configuration of a fundus imaging device according to an existing technique.
  • FIG. 2A is a schematic diagram schematically showing the configuration of a fundus imaging device according to the present disclosure.
  • FIG. 2B is a schematic diagram schematically showing the configuration of a fundus imaging device according to the present disclosure.
  • FIG. 1 is a schematic diagram showing the configuration of an example of a fundus imaging device according to a first embodiment.
  • FIG. 3 is a schematic diagram showing a configuration example of an anterior eye observation unit applicable to the first embodiment.
  • FIG. 2 is a schematic diagram showing an example of a configuration of an annular light source applicable to the first embodiment.
  • FIG. 2 is a functional block diagram of an example for explaining the functions of the fundus imaging device according to the first embodiment.
  • A flowchart of an example for explaining the overall flow of fundus imaging by the fundus imaging device according to the first embodiment.
  • FIG. 3 is a schematic diagram for explaining an overview of visual field deflection according to the first embodiment.
  • FIG. 3 is a schematic diagram for explaining a method of specifying the position of a pupil portion that is applicable to the first embodiment.
  • FIG. 3 is a schematic diagram for explaining an overview of optical axis adjustment according to the first embodiment.
  • A schematic diagram showing the configuration of an example of the fundus imaging device according to a first modification of the first embodiment.
  • FIG. 2 is a schematic diagram showing the configuration of an example of a fundus imaging device according to a second modification of the first embodiment.
  • FIG. 7 is a schematic diagram showing the configuration of an example of an optical element according to a second modification of the first embodiment.
  • FIG. 7 is a schematic diagram for explaining a usage pattern of a fundus imaging device according to a second embodiment.
  • FIG. 2 is a functional block diagram of an example for explaining the functions of a fundus imaging device according to a second embodiment.
  • FIG. 7 is a schematic diagram showing a configuration example of an indicator when visual information is used, which is applicable to the second embodiment.
  • FIG. 7 is a schematic diagram for explaining a usage pattern of a fundus imaging device according to a third embodiment.
  • FIG. 7 is a functional block diagram of an example for explaining the functions of a fundus imaging device according to a third embodiment.
  • FIG. 7 is a functional block diagram of an example for explaining the functions of a fundus imaging device according to a third embodiment.
  • the present disclosure relates to a fundus imaging device for photographing a human fundus.
  • the fundus imaging device according to the present disclosure can be applied to medical applications.
  • the present disclosure is not limited to this, and the fundus imaging device according to the present disclosure can also be applied to security applications using fundus images.
  • There are two types of fundus imaging devices: stationary devices, which photograph the fundus with the subject's head fixed to a fundus imaging device installed on a table or the like, and handheld devices, which are placed in contact with the subject's head (eyeballs) to photograph the fundus.
  • In these devices, the optical axis from the tip of the photographic lens to the imaging surface is fixed. Therefore, in order to appropriately set the image frame on the imaging-surface side, the photographer needs to move the entire optical system of the fundus imaging device itself in six axes. This image frame setting work is called alignment.
  • The alignment procedure differs between stationary and handheld fundus imaging devices.
  • An example of an alignment method for a stationary fundus imaging device is as follows. (1) While viewing an image of the anterior segment of the eye with the fundus imaging device installed on a table, the photographer instructs the subject to look straight ahead along the imaging optical axis, thereby fixing the subject's line of sight. (2) The photographer uses a controller or the like to operate the alignment mechanism built into the fundus imaging device, moving the entire optical system in the direction of the imaging optical axis and along the three axes on the plane perpendicular to the imaging optical axis.
  • An example of an alignment method for a handheld fundus imaging device is as follows. (1) While viewing an image of the anterior segment of the eye with the fundus imaging device held in hand, the photographer instructs the subject to look approximately straight ahead, thereby fixing the line of sight. (2) The photographer moves the fundus imaging device so that the subject's line of sight overlaps the imaging optical axis, and holds the device steady in that state.
  • As described above, stationary fundus imaging devices require the subject's head to be fixed to the device and the entire optical system of the device to be moved in six axes, resulting in an increase in size.
  • In addition, a stationary fundus imaging device requires skill on the photographer's part to properly fix the subject's line of sight, so a person with appropriate photography skills must be present at the time of imaging.
  • A handheld fundus imaging device, on the other hand, has no built-in alignment mechanism and consists only of an optical system, thereby achieving miniaturization.
  • As a result, however, the alignment work becomes more complicated. Therefore, like a stationary device, a handheld fundus imaging device also requires a person with appropriate skills to be present at the time of imaging.
  • In conventional fundus imaging devices, the alignment mechanism on the device side has a low degree of freedom. If the alignment mechanism inside the device had a sufficient degree of freedom, the photographer could rely on remote adjustment or on an automatic adjustment mechanism on the device side. However, if an attempt is made to increase the degree of freedom of the alignment mechanism in a conventional fundus imaging device, the device is expected to become even larger, and there is a risk that the facilities able to introduce it will be limited.
  • Patent Document 1 describes a fundus imaging device in which the degree of freedom of the alignment mechanism is increased by arranging a concave mirror whose angle is adjustable on the optical path.
  • In this device, by fixing the device body to the face and adjusting the position of the entrance pupil with the concave mirror, alignment adjustment can be performed while the device body remains fixed.
  • the subject needs to direct his/her line of sight to the intersection of the rotation axes of the concave mirror. Therefore, if the subject's visual acuity is impaired or the subject's line of sight cannot be properly guided for some reason, accurate alignment adjustment is difficult.
  • In Patent Document 2, for example, a method has been proposed that does not require a person with appropriate skills to be present at the time of photographing.
  • Patent Document 2 describes a small fundus imaging device that uses a head-mounted-display-type housing, houses an imaging optical system, its drive system, and a control system inside the housing, and has an automatic alignment adjustment function. According to Patent Document 2, it is possible to eliminate the need for a photographer with appropriate skills to be present when photographing the fundus. However, since Patent Document 2 employs the conventional stationary alignment method in which the imaging optical system itself is moved, there is a trade-off between reducing the device volume and securing the optical-axis adjustment range. In addition, Patent Document 2 assumes that the line of sight is fixed either by the subject fixating on a target or by voice instructions given to the subject from outside, so issues similar to those of Patent Document 1 described above may occur.
  • Non-Patent Document 1 describes a method in which the subjects themselves perform alignment while viewing an index projected onto the retina. According to the method disclosed in Non-Patent Document 1, as with Patent Document 2 described above, a photographer with specialized skills is not needed at the time of photographing. However, the alignment according to Non-Patent Document 1 is not effective for subjects with visual acuity abnormalities.
  • FIGS. 1A and 1B are schematic diagrams schematically showing the configuration of a fundus imaging device according to existing technology.
  • In a fundus imaging device, the pupil is generally captured on an imaging optical axis extending perpendicularly from the imaging center of the imaging device, and the interior and fundus of the subject's eye are, for example, magnified and photographed through the pupil.
  • In FIGS. 1A and 1B, described below, and FIGS. 2A and 2B, described later, only the parts of the fundus imaging device necessary for the explanation are shown, and other optical elements and the like are omitted.
  • Section (a) of FIG. 1A shows an example in which fundus photography is appropriately performed in a configuration according to existing technology.
  • a reflecting mirror 520 that is rotatable about a rotation axis 521 is provided on an imaging optical axis 512 of an image sensor 530. Further, it is assumed that the reflecting mirror 520 is arranged so that when the subject looks in the horizontal direction 503 in the figure, the line of sight coincides with the rotation axis 521.
  • Light emitted from the fundus 502 of the eye to be examined 500 through the pupil 501 is irradiated onto the reflecting mirror 520 through the optical path 510a and reflected, and is irradiated onto the image sensor 530 through the optical path 511a, where it is photographed.
  • the subject's line of sight is in the horizontal direction 503 in the figure, and the line of sight is directed toward the center of the reflecting mirror 520 (rotation axis 521). Therefore, by adjusting the angle of the reflecting mirror 520, the optical path 511a can be overlapped with the imaging optical axis 512 at the imaging center of the imaging element 530.
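The angle adjustment described above follows the law of reflection: rotating a mirror by an angle α swings the reflected ray by 2α. The patent does not state this formula, so the helper below is only an illustrative sketch with hypothetical names:

```python
def mirror_rotation_needed(current_ray_deg: float, desired_ray_deg: float) -> float:
    """Rotation (degrees) to apply to the reflecting mirror so that the
    reflected ray swings from current_ray_deg to desired_ray_deg.

    By the law of reflection, rotating a mirror by an angle a deflects
    the reflected ray by 2a, so the required mirror rotation is half
    the angular error of the ray."""
    return (desired_ray_deg - current_ray_deg) / 2.0
```

For example, if the ray toward the image sensor is 10 degrees off the imaging optical axis, the mirror only needs to rotate by 5 degrees.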
  • Section (b) of FIG. 1A shows an example of an image captured by the image sensor 530 in the state of section (a) of FIG. 1A.
  • In the captured image, a subject's eye image 550 including a pupil image 551 and a fundus image 552 appears approximately at the center.
  • the position of the pupil image 551 substantially coincides with the position of the imaging optical axis 512 in the image sensor 530.
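The text does not specify here how the pupil-image position is determined. One minimal way to estimate it, assuming the pupil appears as a dark disc in a grayscale frame, is a threshold-and-centroid computation; the function name and threshold value below are illustrative only:

```python
import numpy as np

def pupil_center(gray: np.ndarray, threshold: int = 40) -> tuple[float, float]:
    """Estimate the pupil position as the centroid of dark pixels.

    gray: 2-D uint8 grayscale image of the anterior eye segment.
    Returns the (row, col) centroid of all pixels darker than
    threshold, under the assumption that the pupil is the dark disc
    in the frame."""
    mask = gray < threshold            # pupil pixels are dark
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        raise ValueError("no pixels below threshold; pupil not found")
    return float(ys.mean()), float(xs.mean())
```

Comparing this centroid with the pixel position of the imaging optical axis gives the misalignment to be corrected.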
  • Section (a) of FIG. 1B shows an example in which fundus photography is not performed appropriately in a configuration related to the existing technology.
  • In section (a) of FIG. 1B, the subject's line of sight is inclined with respect to the horizontal direction 503 and is aimed at a location off the center of the reflecting mirror 520 (rotation axis 521). That is, whereas in section (a) of FIG. 1A the optical path 510a of the light emitted from the fundus 502 through the pupil 501 is approximately horizontal in the diagram, in section (a) of FIG. 1B the optical path 510b is directed diagonally upward to the right in the figure. Further, it is assumed that the reflecting mirror 520 initially has the inclination 520b shown by a dashed line in the figure.
  • Even in this state, by adjusting the angle of the reflecting mirror 520, the reflected light from the reflecting mirror 520 can be irradiated onto the image sensor 530.
  • Section (b) of FIG. 1B shows an example of an image captured by the image sensor 530 in the state of section (a) of FIG. 1B. It can be seen that in the captured image 540, the positions of the pupil image 551 and the fundus image 552 are shifted with respect to the position of the imaging optical axis 512.
  • In view of this, the present disclosure introduces one or more optical axis adjustment elements on the optical axis of the fundus imaging device, so that alignment can be adjusted on the device side even when the subject is not fixating at an appropriate position.
  • FIGS. 2A and 2B are schematic diagrams schematically showing the configuration of a fundus imaging device according to the present disclosure. As shown in section (a) of each of FIGS. 2A and 2B, in the present disclosure, in contrast to the configuration described with reference to FIGS. 1A and 1B, an optical axis adjustment element 560 (second optical section) for adjusting the optical axis is added between the reflecting mirror 520 and the image sensor 530.
  • the optical axis adjustment element 560 is shown as a concave lens.
  • Such a change in the optical path by the optical axis adjustment element 560 according to the present disclosure can be realized by applying, for example, a technique used for optical image stabilization.
  • In section (a) of FIG. 2A, similarly to section (a) of FIG. 1B, the subject's line of sight is inclined with respect to the horizontal direction, and the inclination 520b of the reflecting mirror 520 is set accordingly.
  • Light from the fundus 502 is irradiated onto the reflecting mirror 520 through an optical path 510b, and the reflected light is emitted in the direction of an optical path 511c according to the inclination 520b.
  • The optical axis adjustment element 560 changes the optical path 511c of the light emitted from the reflecting mirror 520 toward the imaging device 530 into an optical path 513 that overlaps the imaging optical axis 512.
  • Section (b) of FIG. 2A shows an example of an image captured by the image sensor 530 in the state of section (a) of FIG. 2A.
  • In the captured image, a subject's eye image 550 including a pupil image 551 and a fundus image 552 appears approximately at the center.
  • the position of the pupil image 551 substantially coincides with the position of the imaging optical axis 512 in the image sensor 530.
  • the technology according to the present disclosure can be applied to photographing the peripheral fundus, which is the periphery of the fundus 502.
  • An example of photographing the peripheral fundus 502' using the technology according to the present disclosure will be schematically described using FIG. 2B.
  • First, the optical axis adjustment element 560 is adjusted as described with reference to section (a) of FIG. 2A, and the optical path 513 of the light emitted from the optical axis adjustment element 560 is made to coincide with the imaging optical axis 512.
  • the angle of the reflecting mirror 520 is changed as appropriate. In the illustrated example, the angle of the reflecting mirror 520 is changed from an inclination 520b to an inclination 520d. Thereby, it is possible to enlarge the area of the object to be imaged.
  • Section (b) of FIG. 2B shows an example of an image captured by the image sensor 530 in the state of section (a) of FIG. 2B.
  • In the captured image, a subject's eye image 550 including a pupil image 551 and a fundus image 552 appears approximately at the center.
  • the position of the pupil image 551 substantially coincides with the position of the imaging optical axis 512 in the image sensor 530.
  • Here, the peripheral fundus image 553 is captured over a wider range than the fundus image 552 shown in section (b) of FIG. 2A.
  • FIG. 3 is a schematic diagram showing the configuration of an example of the fundus imaging device according to the first embodiment.
  • the fundus imaging device 100 includes an optical section B1, an imaging processing section B2, and an information processing section B3.
  • In FIG. 3, the optical section B1 and the imaging processing section B2 are shown provided in the same housing, but this is not limited to this example; the optical section B1 and the imaging processing section B2 may be provided in different housings. Furthermore, the information processing section B3 may also be provided in the same housing as the optical section B1 and the imaging processing section B2.
  • the optical part B1 is provided with a mounting part 1 that is brought into contact with the subject's eye S, which is the eyeball of the subject 200.
  • the mounting portion 1 is configured such that the distance between the optical portion B1 and the eye S to be examined can be changed by a drive mechanism 24.
  • the drive mechanism 24 expands and contracts the mounting portion 1 in the optical axis direction to adjust the distance between the eye S to be examined and the entire device in the optical axis direction.
  • the relative position of the optical part B1 with respect to the eye S to be examined is fixed by the mounting part 1.
  • Light obtained from the eye S to be examined passes through the device interface part D and enters the inside of the optical part B1.
  • the device interface part D may be a hole, or may be provided with an optical element such as a protective glass, a lens, or a filter.
  • the optical section B1 includes optical elements 2 to 6, a device control section 23, and an annular light source 41. Note that the optical elements 3 and 6 can be omitted.
  • the optical element 2 is, for example, a concave mirror, and is provided so that its inclination (angle) can be changed by a drive mechanism 25 with respect to a predetermined point. The inclination of the optical element 2 is changed, for example, using its center as a reference point for changing the inclination.
  • By changing the inclination of the optical element 2, the pupil image and the fundus image of the eye S to be examined can be placed within a certain range of the image frame of the image sensor 7, which will be described later.
  • the optical element 2 is not limited to a concave mirror, but may be a plane mirror, a lens, or a hologram optical element.
  • By rotating around its rotation axis, the optical element 2 changes the direction of the visual field toward the subject 200 obtained via the optical element 2.
  • In other words, the optical element 2 is provided between the eye S to be examined and the imaging element 7, which is an imaging unit that images the fundus of the eye S, and functions as a first optical section that changes the visual-field direction of the imaging unit with respect to the eye S.
  • Here, the field of view refers to the field of view when looking from the image sensor 7 in the direction of the imaging optical axis. If the direction of the optical axis is changed by the optical element 2 or the like, the field of view after that change is taken as the field of view of the image sensor 7.
  • the optical element 4 corresponds to the optical axis adjustment element 560 described in FIGS. 2A and 2B, and is movable by the drive mechanism 26.
  • The drive mechanism 26 controls the optical element 4 so that the optical path 50a connecting the pupil of the eye S to the fundus, after being changed to the optical path 50b by the optical element 2, becomes the optical path 50c that overlaps the imaging optical axis of the imaging device 7, which will be described later.
  • the optical element 4 is shown as a concave lens, and is movable by a drive mechanism 26 in a direction perpendicular to the optical axis (horizontal direction in the figure).
  • a known optical image stabilization technique can be applied for the optical element 4.
  • the optical element 4 corrects the traveling direction of light that is emitted from the eye S to be examined and enters through the optical element 2, which is the first optical section.
  • In other words, the optical element 4 is provided between the first optical section (optical element 2) and the imaging section (imaging element 7), and functions as a second optical section that corrects the traveling direction of the light that is emitted from the fundus and irradiated onto the imaging section through the first optical section.
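As with optical image stabilization, the correction by the optical element 4 can be approximated by the thin-lens decentration rule: shifting a lens of focal length f laterally by d deflects the transmitted beam by roughly d/f radians (small-angle approximation). The sketch below illustrates that rule only; it is not the patent's control law, and the names and sign conventions are assumptions:

```python
def beam_deflection_rad(shift_mm: float, focal_length_mm: float) -> float:
    """Angular deflection (radians) of the transmitted beam produced by
    decentering a thin lens laterally by shift_mm, using the
    small-angle approximation: deflection ~ shift / focal length."""
    return shift_mm / focal_length_mm

def shift_for_correction_mm(angle_error_rad: float, focal_length_mm: float) -> float:
    """Inverse relation: lateral lens shift (mm) needed to steer the
    beam by angle_error_rad."""
    return angle_error_rad * focal_length_mm
```

For instance, with a 20 mm focal length, steering the beam by 10 mrad requires a shift of about 0.2 mm, well within the travel of typical image-stabilization actuators.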
  • the optical element 5 is driven by the drive mechanism 27 to adjust the focus of the image formed on the image sensor 7.
  • a known autofocus mechanism can be applied.
  • In the fundus imaging device 100, the drive mechanism 24 that adjusts the mounting portion 1, the optical element 2 and drive mechanism 25, the optical element 4 and drive mechanism 26, and the optical element 5 and drive mechanism 27 constitute an alignment adjustment mechanism for performing alignment adjustment.
  • Although the drive mechanism 24 acting on the mounting portion 1 and the drive mechanism 25 acting on the optical element 2 are described above as separate structures, this is not limited to this example.
  • the operation of the drive mechanism 24 may be combined with the operation of the drive mechanism 25 to move the optical element 2 both in the tilt direction and in the depth direction with respect to the eye S to be examined.
  • Here, a concave mirror is used as the optical element 2 for changing the field of view, but by adding an optical element 3 between the optical element 2 and the optical element 4, it is also possible to use a plane mirror as the optical element 2. For example, a convex lens can be applied as the optical element 3.
  • optical elements 2, 4, and 5 described above are the minimum necessary optical element configurations for performing fundus imaging according to the present disclosure, and do not limit the optical element configuration or arrangement.
  • the optical element 3 may be added as described above, or the optical element 6 may be further added.
  • the arrangement of optical elements 3 and 6 may be exchanged.
  • When the optical elements 3 to 6 are composed of lenses, one or more of them is not limited to a solid lens; a liquid lens or a hologram optical element may also be used. Using liquid lenses or hologram optical elements instead of solid lenses enables further miniaturization.
  • The device control section 23 includes a processor such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit), a ROM (Read Only Memory) in which programs are stored, a RAM (Random Access Memory) used as work memory by the processor, and an interface to the outside.
  • the device control unit 23 operates according to a program stored in the ROM, for example, using the RAM as a work memory, and controls the overall operation of the fundus imaging device 100.
  • the device control unit 23 can control the above-mentioned drive mechanisms 25 to 27 and the annular light source 41 via the interface. Further, the device control section 23 may control the operations of the image sensor 7 and the development processing section 8, which will be described later.
  • the UI 21 can include a user operation section that accepts user operations, and a display section using indicators and the like.
  • the user operation unit can include buttons, toggle switches, a keyboard, and pointing devices such as a mouse and a touch panel.
  • the present invention is not limited to this, and the user operation section may include other input means (such as voice input).
  • the indicator may simply indicate lighting/non-lighting, or may be a display device such as an LCD (Liquid Crystal Display).
  • the device control unit 23 can also control the operation of the fundus imaging device 100 in response to user operations on the UI 21 provided in the optical section B1.
  • control communication line 11 is connected to the port 22.
  • The device control unit 23 can communicate with, for example, the information processing unit B3 via the port 22 and the control communication line 11. Note that although the port 22 and the information processing section B3 are described here as being connected by wire through the control communication line 11, this is not limited to this example; they may be connected by wireless communication.
  • the imaging processing section B2 includes an imaging device 7, a development processing section 8, and a device control section 23.
  • the image sensor 7 includes, for example, a pixel array in which pixels that output electrical signals according to irradiated light are arranged in a matrix.
  • the image sensor 7 forms an image of the incident light that has entered through the optical elements 2 to 5 described above, and outputs a signal (image signal) corresponding to the imaged light.
  • the image sensor 7 is driven by a drive circuit (not shown) and can capture moving images at a predetermined frame rate. Further, the image sensor 7 can capture a still image according to a trigger given while capturing a moving image. Furthermore, the image sensor 7 can also capture only still images without capturing moving images.
  • The development processing unit 8 performs predetermined image processing on the image signal of the moving image or still image output from the image sensor 7. For example, the development processing unit 8 performs AD (Analog to Digital) conversion on the image signal output as an analog signal from the image sensor 7, converting it into digital image data.
  • the development processing unit 8 may further perform image processing such as development processing such as demosaic processing, noise removal processing, and white balance adjustment processing on the image data.
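As a concrete picture of what such development processing involves, the toy pipeline below performs a 2x2-block demosaic of an RGGB Bayer mosaic followed by per-channel white-balance gains. It is a simplified sketch for illustration; the actual demosaic algorithm and Bayer layout of the device are not specified in the text:

```python
import numpy as np

def develop(bayer: np.ndarray, wb_gains=(1.0, 1.0, 1.0)) -> np.ndarray:
    """Toy development pipeline for an RGGB Bayer mosaic.

    Each 2x2 cell (R, G / G, B) becomes one RGB output pixel: the two
    green samples are averaged, then per-channel white-balance gains
    are applied and the result is clipped to the 8-bit range."""
    r = bayer[0::2, 0::2].astype(np.float64)
    g = (bayer[0::2, 1::2].astype(np.float64)
         + bayer[1::2, 0::2].astype(np.float64)) / 2.0
    b = bayer[1::2, 1::2].astype(np.float64)
    rgb = np.stack([r * wb_gains[0], g * wb_gains[1], b * wb_gains[2]], axis=-1)
    return np.clip(rgb, 0.0, 255.0)
```

Note that the output has half the spatial resolution of the mosaic; real demosaic algorithms interpolate to full resolution instead.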
  • Image data subjected to image processing in the development processing section 8 is output from a port 9 and transmitted to the information processing section B3 via an image communication line 10 connected to the port 9, for example.
  • Although the port 9 and the information processing section B3 are described here as being connected by wire through the image communication line 10, this is not limited to this example, and they may be connected by wireless communication.
  • The port 9 for transmitting image data and the port 22 for transmitting control signals and the like are shown as separate ports, but this is not limited to this example; it is also possible to combine them into one port.
  • the information processing unit B3 includes a processor such as a CPU, and a memory such as a ROM and a RAM.
  • the information processing unit B3 can further include a storage device using a nonvolatile storage medium such as a hard disk drive or flash memory.
  • the information processing unit B3 can further include an input device (keyboard, touch panel, etc.) that accepts user input, and a display device that displays information.
  • the information processing unit B3 generates a device control signal for alignment adjustment in the fundus imaging device 100 based on image data based on the output of the image sensor 7 and information obtained from the anterior eye observation unit described below.
  • the information processing section B3 transmits the generated device control signal to the device control section 23 of the optical section B1 via the control communication line 11, for example, as feedback information for the operations of the optical elements 2, 4, and 5.
  • the optical section B1 includes an anterior eye observation section for observing the anterior eye segment of the eye S to be examined.
  • the anterior eye observation unit makes it possible to observe the state of the anterior eye segment of the subject's eye S by photographing or measuring the anterior eye segment when performing alignment adjustment.
  • Information obtained by the anterior eye observation section is sent to the device control section 23 and fed back to the alignment adjustment mechanism.
  • As a measuring device for measuring the anterior eye segment, the anterior eye observation unit may include an imaging optical system and an imaging device, the latter including a development processing section if necessary.
  • The configuration of the anterior eye observation unit is not limited to this example, and other configurations may be used as long as the measurement objective is achieved.
  • an auxiliary device such as a distance measurement sensor or a marker indicator projection unit may be added to the anterior eye observation unit in order to obtain anterior eye information.
  • FIG. 4 is a schematic diagram showing a configuration example of an anterior eye observation unit applicable to the first embodiment.
  • Optical section B1 includes at least one of anterior eye observation sections 28a, 28b, and 28c.
  • the anterior eye observation parts 28a, 28b, and 28c may be collectively called the anterior eye observation part 28.
  • the anterior eye observation section 28a is an example in which the anterior eye observation section 28 is provided around the attachment section 1.
  • The anterior eye observation unit 28a photographs the anterior eye segment from an oblique direction; by applying a projective transformation or the like to the captured image, it is possible to convert the captured image into an image as viewed from the optical axis direction.
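The projective transformation mentioned above maps points in the obliquely captured image to their positions in a virtual front-on view. Below is a minimal sketch of applying a 3x3 homography to an image point; the matrix values are illustrative placeholders, as in practice the homography would be calibrated from the camera geometry.

```python
def apply_homography(H, x, y):
    """Map an image point (x, y) through a 3x3 homography H
    (row-major nested lists), including the perspective division."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

# Illustrative matrix: the perspective term in the bottom row foreshortens
# points farther down the image, as an obliquely mounted camera would.
H_oblique_to_front = [[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0],
                      [0.0, 0.001, 1.0]]
```

Warping a whole image is the same operation applied per pixel (typically via an inverse mapping with interpolation).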
  • the anterior eye observation unit 28b provides a mirror 29 in the optical path of the light emitted from the optical element 2, and images the light reflected by the mirror 29, thereby making it possible to observe the anterior eye segment.
  • the mirror 29 may be a half mirror that transmits and reflects incident light.
  • the present invention is not limited to this, and the mirror 29 may be a movable mirror that performs total reflection.
  • the mirror 29 may be inserted obliquely to the optical path during anterior eye observation by the anterior eye observation unit 28b, and may be arranged so as not to obstruct the optical path when anterior eye observation is not performed.
  • a trimming filter that transmits and reflects only light of a specific wavelength may be applied to the mirror 29.
  • an anterior eye observation section 28c can be provided.
  • The anterior eye observation section 28c is arranged at a position beyond the optical element 2, on a line passing through, for example, the center of the optical element 2 from the mounting section 1 (eye S to be examined).
  • In this case, it is necessary to use, for example, a half mirror or a trimming filter that transmits only a specific wavelength for the optical element 2, so that light other than the light rays directed to the image sensor 7, which will be described later, also reaches the anterior eye observation section 28c.
  • Furthermore, a measuring device (for example, a distance sensor) and a projector 30 for projecting an index such as a marker onto the anterior eye segment may be added to the anterior eye observation unit.
  • the index projected onto the anterior eye segment by the projector 30 can also be photographed and observed by an image sensor included in the anterior eye observation unit 28a, 28b, or 28c.
  • When transmitting the output of the anterior eye observation section 28a, 28b, or 28c to an external device, it may be output from the development processing section 8, which will be described later, or from the port 22 for control communication, or an additional output port may be provided.
  • the optical section B1 includes the annular light source 41.
  • the annular light source 41 is a light source configured in an annular shape so that the center portion transmits light.
  • FIG. 5 is a schematic diagram showing an example of the configuration of the annular light source 41 applicable to the first embodiment.
  • The annular light source 41 includes an adjustment light source section 41a (first light source section) that emits light with a wavelength outside the visible light region (for example, infrared light), and a photographing light source section 41b (second light source section) that emits light with a wavelength in the visible light region.
  • In the annular light source 41, the adjustment light source section 41a and the photographing light source section 41b are arranged concentrically about a shared center, and the center portion 52 is a light transmission area.
  • the adjustment light source section 41a is used, for example, in alignment adjustment and focus adjustment.
  • the photographing light source section 41b is used when actually photographing the fundus.
  • Hereinafter, the light emitted by the adjustment light source section 41a will be referred to as adjustment light, and the light emitted by the photographing light source section 41b will be referred to as photographing light.
  • As the photographing light source section 41b, an excitation light source for fluorescence observation or a light source with a spectrum optimized for focus adjustment by the optical element 5 may also be applied.
  • The imaging device 7 can photograph the anterior eye segment over a wide imaging range, equivalent to when the anterior eye segment is photographed based on the index by the anterior eye observation unit 28a, 28b, or 28c. Furthermore, if the imaging device 7 is capable of photographing with both the adjustment light and the photographing light, the adjustment process performed using the anterior eye observation unit 28a, 28b, or 28c can also be realized with the imaging device 7.
  • FIG. 6 is an example functional block diagram for explaining the functions of the fundus imaging device 100 according to the first embodiment.
  • the optical section B1 includes a device control section 300, each mechanism 302, a UI (User Interface) 21, an anterior eye observation section 28, and an annular light source 41.
  • the device control section 300 corresponds to the device control section 23 described using FIG. 3, and includes a CPU, ROM, RAM, and various interfaces.
  • Each mechanism 302 includes the drive mechanisms 24 to 27 described using FIG. 3.
  • the operation of the drive mechanisms 24 to 27 is controlled by control signals output from the device control section 300.
  • the anterior eye observation section 28 includes at least one of the aforementioned anterior eye observation sections 28a, 28b, and 28c.
  • the photographing operation by the imaging device in the anterior eye observation section 28 is controlled by a control signal output from the device control section 300.
  • The photographing operation in the anterior eye observation unit 28 controlled by the device control unit 300 may include not only the operation of the image sensor itself, such as exposure, but also the setting of parameters related to imaging, such as exposure time (shutter speed) and gain adjustment for the image signal output from the image sensor. Note that the device control unit 300 may set these parameters using predetermined setting values, or using setting values corresponding to user operations on the UI 21 or the input unit 353, which will be described later.
  • the anterior eye observation section 28 may further include a projector 30 whose operation is controlled by control signals similarly passed from the device control section 300.
  • the annular light source 41 includes the above-mentioned adjustment light source section 41a and photographing light source section 41b. Turning on/off of the adjustment light source section 41a and the photographing light source section 41b is controlled by a control signal output from the device control section 300.
  • the device control section 300 may further control the amount of light emitted from each of the adjustment light source section 41a and the photographing light source section 41b.
  • the device control unit 300 may control the amount of light emitted according to a user operation on the UI 21 or input unit 353, which will be described later.
  • the UI 21 includes an input device for receiving operations on the fundus imaging device 100 by the subject 200.
  • the UI 21 receives signals from input devices such as buttons, toggle switches, keyboards, and pointing devices such as mice and touch panels.
  • the UI 21 passes a control signal to the device control unit 300 in response to a user's operation on the input device.
  • the device control section 300 may control the operation of each section of the fundus imaging device 100 according to this control signal.
  • the image processing section B2 includes an image sensor 7 and a development processing section 8.
  • the imaging operation of the image sensor 7 is controlled by a control signal output from the device control section 300.
  • The photographing operation of the image sensor 7 controlled by the device control unit 300 may include not only the operation of the image sensor itself, such as exposure, but also the setting of parameters related to photographing, such as exposure time (shutter speed) and gain adjustment for the image signal output from the image sensor. Note that the device control unit 300 may set these parameters using predetermined setting values, or using setting values corresponding to user operations on the UI 21 or the input unit 353, which will be described later.
  • the development processing section 8 is supplied with an image signal output from the image sensor 7.
  • the development processing section 8 performs various signal processing including development processing on the supplied imaging signal according to a control signal output from the apparatus control section 300.
  • the development processing section 8 may adjust the parameters of the signal processing to be performed on the imaging signal, according to the needs of the photographer.
  • For example, the device control unit 300 sets parameters for the signal processing that the development processing section 8 performs on the imaging signal in response to a user operation on the UI 21 or the input unit 353, which will be described later, and passes them to the development processing section 8.
  • the development processing section 8 executes signal processing on the image signal supplied from the image sensor 7 according to the parameters passed from the apparatus control section 300.
  • the imaging processing unit B2 is provided with a port 9 for communicating with the outside.
  • the port 9 shown in FIG. 6 may be one in which the port 9 and the port 22 in FIG. 3 are combined into one port, as described above.
  • Image data subjected to various signal processing in the development processing section 8 and image data output from the anterior eye observation section 28 are transmitted from the port 9 to the information processing section B3 via the image communication line 10.
  • the information processing section B3 includes a signal processing section 351, a display section 352, and an input section 353. Further, the information processing section B3 is provided with a port 350 for communicating with the outside.
  • Each image data output from the port 9 of the imaging processing section B2 is received by the port 350 via the image communication line 10, and is supplied to the signal processing section 351.
  • the signal processing unit 351 performs predetermined processing on each supplied image data. For example, the signal processing unit 351 generates an adjustment control signal for performing alignment adjustment and focus adjustment based on the image data output from the development processing unit 8 or the anterior eye observation unit 28. The generation of the adjustment control signal by the signal processing unit 351 will be described later.
  • the adjustment control signal generated by the signal processing unit 351 is output from the port 350 and transmitted to the imaging processing unit B2 via the image communication line 10.
  • the adjustment control signal is received by the port 9 in the imaging processing section B2, and is passed to the device control section 300 in the optical section B1.
  • the device control unit 300 controls, for example, the drive mechanisms 24 to 27 included in each mechanism 302 based on this adjustment control signal, and performs alignment adjustment and focus adjustment.
  • the display unit 352 displays information on a display device such as a display or an indicator.
  • the signal processing section 351 can cause the display section 352 to display an image based on the image data output from the anterior eye observation section 28 and the development processing section 8 on a display or the like.
  • the present invention is not limited to this, and the display section 352 may only display operation information using an indicator or the like, and the image data itself may be transmitted to the outside.
  • the input unit 353 receives signals from input devices such as buttons, toggle switches, keyboards, and pointing devices such as mice and touch panels.
  • the input unit 353 passes a control signal corresponding to a user's operation on the input device to the signal processing unit 351.
  • the signal processing unit 351 can control processing of image data according to this control signal.
  • FIG. 7 is an example flowchart for explaining the overall flow of fundus imaging by the fundus imaging device 100 according to the first embodiment.
  • In step S100, the subject's eye portion is brought into contact with the mounting section 1, and the fundus imaging device 100 is mounted. At this time, the subject directs his or her field of vision toward the optical element 2 via the device interface section D in the mounting section 1. More specifically, it is preferable for the subject to direct his or her field of vision toward the center of the optical element 2.
  • Hereinafter, the subject directing his or her line of sight from the mounting section 1 toward the optical element 2 will be referred to as "the subject looking to the front," and the direction of the optical element 2 as seen from the mounting section 1 will be defined as the direction in front of the subject.
  • When the fundus imaging device 100 is worn by the subject in step S100, the imaging operations of the anterior eye observation unit 28 and the image sensor 7 are started, and, for example, image data in the form of a moving image is output from the anterior eye observation unit 28 and the image sensor 7.
  • In step S101, the fundus imaging device 100 executes alignment adjustment processing based on the image data acquired from the anterior eye observation unit 28 and the image sensor 7.
  • the fundus imaging device 100 performs alignment adjustment processing to deflect the field of view of the image sensor 7 and adjust the optical axis with respect to the image sensor 7.
  • For example, the fundus imaging device 100 adjusts the tilt of the optical element 2 and the position of the optical element 4 so that the optical path 50c, through which the light from the eye S to be examined is irradiated onto the image sensor 7, overlaps the imaging optical axis of the image sensor 7.
  • Also in step S101, the fundus imaging device 100 controls, for example, the drive mechanism 24 of the mounting section 1 to appropriately adjust the distance between the optical section B1 and the eye S to be examined, thereby adjusting the angle of view so that the eye S is photographed at an appropriate size.
  • In step S102, the fundus imaging device 100 controls the drive mechanism 27 based on the image data acquired from the image sensor 7 to execute focus adjustment processing.
  • In step S103, the fundus imaging device 100 determines whether to perform fundus imaging based on the result of the alignment adjustment process in step S101 and the result of the focus adjustment process in step S102. If the result of at least one of the alignment adjustment and the focus adjustment is insufficient, the fundus imaging device 100 determines not to perform fundus imaging (step S103, "No"), and returns the process to step S101.
  • On the other hand, the fundus imaging device 100 determines to perform fundus imaging if the results of both the alignment adjustment and the focus adjustment are sufficient (step S103, "Yes"), and moves the process to step S104.
  • In step S104, the fundus imaging device 100 uses the imaging device 7 to perform fundus imaging of the eye S to be examined.
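The overall flow of steps S100 to S104 can be sketched as a simple control loop. This is an illustrative sketch only; the three callables stand in for the device's alignment, focus, and capture operations described above, and the retry limit is a placeholder.

```python
def run_capture_sequence(adjust_alignment, adjust_focus, capture_fundus,
                         max_attempts=10):
    """Repeat alignment adjustment (S101) and focus adjustment (S102) until
    both report success (S103, "Yes"), then perform fundus imaging (S104).
    Returns the captured image, or None if adjustment never converges."""
    for _ in range(max_attempts):
        alignment_ok = adjust_alignment()   # step S101
        focus_ok = adjust_focus()           # step S102
        if alignment_ok and focus_ok:       # step S103
            return capture_fundus()         # step S104
    return None
```

With stub callables, the loop retries until both adjustments succeed, mirroring the "No" branch of step S103 returning to step S101.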
  • FIG. 8 is an example flowchart showing the alignment adjustment process according to the first embodiment in more detail.
  • the series of processes shown in FIG. 8 corresponds to the process of step S101 in the flowchart of FIG.
  • In step S110, the device control section 300 turns on the adjustment light source section 41a in the annular light source 41. At this time, the photographing light source section 41b is not turned on.
  • In other words, the adjustment light source section 41a, which emits light with a wavelength outside the visible light region, is turned on, while the photographing light source section 41b, which emits light with a wavelength in the visible light region, is not turned on. This makes it possible to prevent the pupil of the eye S to be examined from constricting due to glare or the like.
  • In step S111, the device control unit 300 determines whether the amount of light from the adjustment light source unit 41a turned on in step S110 is appropriate. In other words, in step S111, the device control unit 300 determines whether imaging with appropriate brightness is possible in the anterior eye observation unit 28 using the light emitted by the adjustment light source unit 41a turned on in step S110.
  • the device control unit 300 may make the determination in step S111 depending on the deterioration of the light emission intensity of the adjustment light source unit 41a.
  • As a more specific example, the device control unit 300 may accumulate and store the time during which the adjustment light source unit 41a is turned on, and determine that the light amount is not appropriate when the accumulated time reaches a certain value.
  • Further, the device control unit 300 may make the determination in step S111 depending on the deterioration in sensitivity of the image sensor included in the anterior eye observation unit 28. As a more specific example, the device control unit 300 may accumulate and store the time during which the image sensor included in the anterior eye observation unit 28 is used, and determine that the light amount is not appropriate when the accumulated time reaches a certain value.
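The cumulative-time checks described above can be sketched as a small usage tracker. The class name and the service-life threshold below are illustrative placeholders, not values from this disclosure.

```python
class LightSourceUsageTracker:
    """Accumulates the lighting time of the adjustment light source section 41a
    (or, analogously, the usage time of an image sensor) and flags the light
    amount as no longer appropriate once a threshold is reached."""

    def __init__(self, service_life_hours=1000.0):
        self.service_life_hours = service_life_hours
        self.accumulated_hours = 0.0

    def add_usage(self, hours):
        """Record additional lighting (or sensor usage) time."""
        self.accumulated_hours += hours

    def light_amount_appropriate(self):
        """Corresponds to the determination in step S111: the light amount is
        treated as appropriate only while accumulated use is under the limit."""
        return self.accumulated_hours < self.service_life_hours
```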
  • the present invention is not limited to this, and the device control unit 300 may perform this determination by capturing an image using the image sensor of the anterior eye observation unit 28 and obtaining the amount of light from the adjustment light source unit 41a based on the image capture result. Furthermore, a measuring device for measuring the amount of light emitted by the adjustment light source section 41a may be added, and the device control section 300 may make this determination based on the amount of light measured by this measuring device.
  • The device control unit 300 may also make the determination in step S111 according to changes in the surrounding environment. For example, a measuring device that measures the environment outside the fundus imaging device 100 (for example, the amount of ambient light) may be added to the fundus imaging device 100, and the device control unit 300 may make the determination in step S111 based on the measurement results obtained by this measuring device. The external environment may also be measured by a separate device.
  • the device control unit 300 can also make the determination in step S111 according to user operations on the UI 21 or the input unit 353.
  • If the device control unit 300 determines in step S111 that the light amount is not appropriate (step S111, "No"), the process moves to step S112.
  • In step S112, the device control unit 300 adjusts the amount of light emitted by the adjustment light source unit 41a. For example, the device control unit 300 may estimate the amount of attenuation of the light according to the cumulative lighting time of the adjustment light source unit 41a and the cumulative usage time of the image sensor of the anterior eye observation unit 28, and adjust the amount of light emitted from the adjustment light source unit 41a based on this estimation result. Further, the device control unit 300 may adjust the amount of light emitted by the adjustment light source unit 41a based on the imaging result of the image sensor included in the anterior eye observation unit 28 or on the measurement result of the amount of ambient light, or in response to user operations on the UI 21 or the input unit 353.
  • the device control unit 300 may adjust the shutter time and gain settings in the imaging operation by the imaging device of the anterior eye observation unit 28 in step S112. Further, the device control unit 300 may adjust the shutter time and gain settings of the image sensor 7 in step S112. Further, the device control section 300 may adjust the gain setting in the development processing section 8 in step S112. For the adjustment in step S112, an algorithm used for automatic exposure adjustment in general cameras may be applied.
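As an example of the general automatic-exposure style of adjustment mentioned above, a single gain update toward a target image brightness can be sketched as follows. The linear-sensor assumption, the function name, and the gain limits are all illustrative, not part of this disclosure.

```python
def update_gain(measured_mean, target_mean, gain, min_gain=1.0, max_gain=16.0):
    """One automatic-exposure step: scale the current gain so the expected
    mean image brightness approaches the target, assuming brightness is
    proportional to gain (linear sensor), then clamp to the gain range."""
    if measured_mean <= 0:
        return max_gain  # frame is black: fall back to maximum gain
    new_gain = gain * (target_mean / measured_mean)
    return min(max(new_gain, min_gain), max_gain)
```

The same proportional update applies equally to shutter time; real AE algorithms add smoothing and hysteresis to avoid oscillation between frames.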
  • Upon completion of the light amount adjustment process in step S112, the device control unit 300 returns the process to step S111.
  • If the device control unit 300 determines in step S111 that the light amount from the adjustment light source unit 41a is appropriate (step S111, "Yes"), the process moves to step S113.
  • In step S113, the device control unit 300 photographs the anterior segment of the eye S to be examined using the anterior eye observation unit 28.
  • the device control unit 300 also executes imaging using the image sensor 7.
  • the optical unit B1 outputs the image data photographed and acquired by the anterior eye observation unit 28 in step S113 and the image data photographed and acquired by the image sensor 7 from the port 9. Each image data output from the port 9 is transmitted to the information processing section B3 via the image communication line 10. Each image data is received by the port 350 in the information processing section B3 and passed to the signal processing section 351.
  • In step S114, the signal processing unit 351 acquires the alignment state based on the image data acquired by the anterior eye observation unit 28 and the image data acquired by the image sensor 7. For example, the signal processing unit 351 identifies the position of the pupil of the eye S to be examined based on the image data acquired by the anterior eye observation unit 28. Further, the signal processing unit 351 specifies, for example, the position at which the image sensor 7 is irradiated with the light emitted from the pupil of the eye S, based on the image data acquired by the image sensor 7 or the anterior eye observation unit 28. Furthermore, the signal processing unit 351 specifies the size of the fundus image on the captured image based on the image data; in other words, this is a process of specifying the angle of view in imaging by the image sensor 7.
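The pupil-position identification described above can be illustrated with a toy centroid-of-dark-pixels detector. The fixed threshold and the dark-pupil assumption are illustrative; a real device would use a more robust segmentation method.

```python
def estimate_pupil_center(gray, threshold=50):
    """Estimate the pupil center as the centroid of dark pixels, relying on
    the pupil being the darkest region of the anterior eye image.
    `gray` is a 2D list of intensity values; returns (x, y) or None."""
    xs, ys = [], []
    for y, row in enumerate(gray):
        for x, value in enumerate(row):
            if value < threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # no dark region found: pupil not detected
    return sum(xs) / len(xs), sum(ys) / len(ys)
```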
  • In the next step S115, the signal processing unit 351 generates an adjustment control signal for performing alignment adjustment based on each piece of information specified in step S114.
  • the adjustment control signal here may include a control signal for deflecting the field of view of the image sensor 7 and a control signal for adjusting the angle of view of the image sensor 7.
  • The signal processing unit 351 outputs the generated adjustment control signal from the port 350.
  • the adjustment control signal output from the port 350 is received by the port 9 of the imaging processing section B2 via the image communication line 10, and is passed to the device control section 300.
  • the device control unit 300 controls each mechanism 302 based on the adjustment control signal passed thereto.
  • In step S116, the signal processing unit 351 determines whether the pupil position (for example, the position of the pupil, iris, etc.) of the eye S to be examined is appropriate based on each piece of information specified in step S114.
  • If the signal processing unit 351 determines that the pupil position is not appropriate (step S116, "No"), the process returns to step S113.
  • If the signal processing unit 351 determines in step S116 that the pupil position is appropriate (step S116, "Yes"), it ends the series of processes according to the flowchart of FIG. 8 and moves the process to step S102 of FIG. 7.
  • FIG. 9 is an example flowchart showing the focus adjustment process according to the first embodiment in more detail.
  • the series of processes shown in FIG. 9 corresponds to the process of step S102 in the flowchart of FIG.
  • In step S120, the device control section 300 turns on the adjustment light source section 41a in the annular light source 41. At this time, the photographing light source section 41b is not turned on.
  • In step S121, the device control unit 300 determines whether the amount of light from the adjustment light source unit 41a turned on in step S120 is appropriate.
  • the determination process in step S121 is the same as the determination process described in step S111 in the flowchart of FIG. 8, so the description here will be omitted.
  • If the device control unit 300 determines in step S121 that the light amount is not appropriate (step S121, "No"), the process moves to step S122.
  • In step S122, the device control section 300 adjusts the amount of light emitted by the adjustment light source section 41a.
  • the light amount adjustment process in step S122 is the same as the light amount adjustment process described in step S112 in the flowchart of FIG. 8, so the description here will be omitted.
  • Upon completion of the light amount adjustment process in step S122, the device control unit 300 returns the process to step S121.
  • If the device control unit 300 determines in step S121 that the amount of light from the adjustment light source unit 41a is appropriate (step S121, "Yes"), the process moves to step S123.
  • In step S123, the device control unit 300 executes imaging using the image sensor 7.
  • The photographing in step S123 is provisional photographing performed before the actual photographing of the fundus of the eye S to be examined using the image sensor 7, which is the original purpose.
  • the image data photographed in step S123 is output from the port 9 and transmitted to the information processing section B3 via the image communication line 10.
  • the image data is received by the port 350 in the information processing section B3 and passed to the signal processing section 351.
  • Next, the signal processing unit 351 confirms the image data of the provisional photographing performed in step S123.
  • In the next step S125, the signal processing unit 351 determines whether the focus adjustment is appropriate based on the confirmed image data. For example, the signal processing unit 351 may determine whether the focus adjustment is appropriate based on edge information of the image data.
  • If the signal processing unit 351 determines that the focus adjustment is not appropriate (step S125, "No"), the process moves to step S126.
  • In step S126, the signal processing unit 351 generates an adjustment control signal that instructs focus adjustment, and outputs the generated adjustment control signal from the port 350.
  • the adjustment control signal output from the port 350 is received by the port 9 of the imaging processing section B2 via the image communication line 10, and is passed to the device control section 300.
  • the device control unit 300 controls the drive mechanism 27 based on the adjustment control signal passed thereto.
  • On the other hand, if the signal processing unit 351 determines that the focus adjustment is appropriate (step S125, "Yes"), it ends the series of processes according to the flowchart of FIG. 9 and moves the process to step S103 of FIG. 7.
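The edge-information check used in the focus determination can be illustrated with a simple gradient-energy focus measure. The metric and the threshold comparison below are illustrative, not the patented method.

```python
def focus_measure(gray):
    """Mean squared difference between adjacent pixels (horizontal and
    vertical). A sharply focused image has strong edges and a high score;
    a defocused image is smoother and scores lower."""
    height, width = len(gray), len(gray[0])
    total, count = 0.0, 0
    for y in range(height):
        for x in range(width):
            if x + 1 < width:
                total += (gray[y][x + 1] - gray[y][x]) ** 2
                count += 1
            if y + 1 < height:
                total += (gray[y + 1][x] - gray[y][x]) ** 2
                count += 1
    return total / count if count else 0.0

def focus_is_appropriate(gray, threshold):
    """Sketch of the focus decision: compare the edge score to a threshold."""
    return focus_measure(gray) >= threshold
```

In practice such a measure is evaluated over a sequence of provisional shots while the drive mechanism 27 sweeps focus, and the peak is taken as best focus.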
  • FIG. 10 is an example flowchart showing the photographing process according to the first embodiment in more detail.
  • The series of processes shown in FIG. 10 corresponds to the process of step S104 in the flowchart of FIG. 7. Note that the photographing according to the flowchart of FIG. 10 is called the actual photographing, and is distinguished from the provisional photographing in step S123 of FIG. 9.
  • In step S140, the device control section 300 performs offset adjustment for the difference in optical path length between the adjustment light from the adjustment light source section 41a and the photographing light from the photographing light source section 41b.
  • For example, the representative wavelength when infrared light is used as the adjustment light is 850 nm, while the representative wavelength of the photographing light is, for example, 550 nm. The alignment adjustment performed using the adjustment light therefore has a known deviation, according to this wavelength difference, from the conditions at the time of photographing using the photographing light. Based on this deviation, the device control unit 300 determines an offset value for the alignment adjustment made using the adjustment light, corrects the control value resulting from the alignment adjustment, and thereby performs the offset adjustment.
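The offset correction in step S140 can be sketched as adding a calibrated, wavelength-pair-dependent offset to the control value obtained with the adjustment light. The function name and the table values below are illustrative placeholders; real offsets would come from per-device calibration.

```python
def apply_chromatic_offset(control_value, offset_table,
                           adjustment_nm=850, photographing_nm=550):
    """Correct an alignment/focus control value for the known deviation
    between the adjustment wavelength and the photographing wavelength.
    `offset_table` maps (adjustment_nm, photographing_nm) to an offset
    expressed in the same units as the control value."""
    offset = offset_table[(adjustment_nm, photographing_nm)]
    return control_value + offset

# Illustrative calibration table: correcting from 850 nm to 550 nm pulls the
# control value back by 0.12 units (placeholder value, not from the disclosure).
CALIBRATION = {(850, 550): -0.12}
```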
  • In step S141, the device control unit 300 lights the photographing light source unit 41b with a flash.
  • In step S142, the device control unit 300 uses the image sensor 7 to perform the actual photographing of the fundus of the eye S to be examined. Since the photographing light source unit 41b is lit with a flash in step S141, constriction of the pupil due to the light can be suppressed.
  • When the actual photographing is performed in step S142, the series of processes according to the flowchart of FIG. 10 and the above-mentioned flowchart of FIG. 7 is completed.
  • The alignment adjustment process according to the first embodiment includes the following three steps: (1) adjustment of the angle of view, (2) deflection of the field of view, and (3) adjustment of the optical axis.
  • the angle of view of the image sensor 7 is adjusted by changing the optical path length from the eye S to the image sensor 7.
  • This adjustment can be realized by adjusting the depth length of the mounting portion 1 (the length in the direction from the eye S to the optical element 2) using the drive mechanism 24.
  • The adjustment is not limited to this; it can also be realized by adding a movement mechanism in the depth direction (the direction from the eye S to be examined toward the optical element 2) to the drive mechanism 25 that drives the optical element 2, and translating the optical element 2 in the depth direction.
  • The signal processing unit 351 calculates the adjustment amount for the angle-of-view adjustment based on the image information obtained from the image sensor 7 or the information obtained from the anterior eye observation unit 28. For example, when calculating the adjustment amount based on the image information acquired by the anterior eye observation unit 28, the signal processing unit 351 adjusts the angle of view so that a target object displayed within the angle of view has a specified size.
  • The target object may be an ocular structure such as the pupil diameter or the iris diameter, or a predetermined pattern projected from the projector 30 onto the area around the anterior segment of the eye.
  • The method is not limited to this; by adding a distance measuring sensor or the like to the anterior eye observation section 28, the distance between the optical section B1 and the eye S to be examined may be measured directly.
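The angle-of-view adjustment described above can be sketched as a simple feedback loop: the depth length is driven until the target object appears at the specified on-image size. All names, the gain, the target value, and the simulated measurement below are hypothetical; the real adjustment amount would be computed by the signal processing unit 351 from actual image data.

```python
TARGET_DIAMETER_PX = 400   # specified on-image size of the target object (assumed)
GAIN_MM_PER_PX = 0.01      # assumed gain of the depth drive (mechanism 24)
TOLERANCE_PX = 2           # assumed convergence tolerance

def adjust_depth(measure_diameter_px, depth_mm, max_iter=50):
    """Drive the depth length until the measured diameter matches the target."""
    for _ in range(max_iter):
        error_px = measure_diameter_px(depth_mm) - TARGET_DIAMETER_PX
        if abs(error_px) <= TOLERANCE_PX:
            break
        depth_mm += GAIN_MM_PER_PX * error_px  # object too large -> back away
    return depth_mm

# Simulated measurement: the on-image diameter shrinks as the depth grows.
final_depth = adjust_depth(lambda d: 4000.0 / d, depth_mm=8.0)
```

With the simulated measurement, the loop converges within a few iterations to a depth at which the measured diameter lies inside the tolerance.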
  • FIG. 11 is a schematic diagram for explaining an overview of visual field deflection according to the first embodiment.
  • In FIG. 11, the eye S to be examined, the optical element 2 and its drive mechanism 25 in the optical section B1, the optical element 4, and the image sensor 7 are extracted from the configuration of the fundus imaging device 100, and the other parts are omitted.
  • The fundus of the eye S to be examined is denoted Pf, the center of the pupil is denoted the pupil portion Pp, and the straight line connecting the fundus Pf and the pupil portion Pp is defined as the chief ray Af.
  • The chief ray Af strikes the optical element 2 for visual field deflection at a point Pa, and then reaches a position Pb' on the principal plane of the optical element 4 for optical axis adjustment.
  • The center of the imaging area of the image sensor 7 is defined as a position Pi, and the axis of the light beam incident perpendicularly at the position Pi is defined as the imaging optical axis Ai.
  • The imaging optical axis Ai passes through a position Pb on the principal plane of the optical element 4 and exits toward the optical element 2, the device interface section D, and the eye S to be examined. Note that in the initial state, in which neither field deflection nor optical axis adjustment has been performed, the imaging optical axis passes through the center of the device interface section D perpendicularly to the device interface.
  • The field-of-view deflection refers to adjusting the inclination of the optical element 2 using the drive mechanism 25 so that the positions Pb and Pb' coincide or come sufficiently close to each other.
  • the center of rotation of the optical element 2 is assumed to be a point Pc.
  • The position of the point Pa at which the distance between Pb and Pb' is minimized can be calculated as an optimization problem under the following constraints.
  • Constraint (1): the point Pa is on the chief ray Af.
  • Constraint (2): the point Pa is on the optical element 2, whose center of rotation is the point Pc.
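The two constraints above can be illustrated with a simplified two-dimensional ray calculation: the mirror tilt about Pc is scanned, the chief ray is intersected with the tilted mirror (which enforces both constraints simultaneously) and reflected, and the tilt that brings Pb' closest to Pb is selected. All geometry below (positions, chief-ray direction, plane location) is hypothetical, and a grid search stands in for whatever solver the device would actually use.

```python
import math

# Two-dimensional section of the world coordinate system W B1 (units: mm, assumed).
Pp = (0.0, 2.0)                 # pupil portion Pp
Pc = (50.0, 0.0)                # rotation centre of optical element 2
PLANE_Y = -30.0                 # principal plane of optical element 4
PB_X = 50.0                     # x where the imaging optical axis crosses it (Pb)

# Chief ray Af: direction from Pp towards the mirror (assumed, normalised).
_n0 = math.hypot(1.0, -0.03)
U = (1.0 / _n0, -0.03 / _n0)

def landing_x(theta_deg):
    """Tilt the mirror by theta about Pc, intersect Af with it (point Pa,
    satisfying constraints (1) and (2)), reflect, and return the x at which
    the reflected ray meets the principal plane of optical element 4 (Pb')."""
    n = (math.cos(math.radians(theta_deg)), math.sin(math.radians(theta_deg)))
    d = U[0] * n[0] + U[1] * n[1]
    t = ((Pc[0] - Pp[0]) * n[0] + (Pc[1] - Pp[1]) * n[1]) / d
    pa = (Pp[0] + t * U[0], Pp[1] + t * U[1])
    r = (U[0] - 2.0 * d * n[0], U[1] - 2.0 * d * n[1])   # reflected direction
    t2 = (PLANE_Y - pa[1]) / r[1]
    return pa[0] + t2 * r[0]

# Grid-search the tilt that brings Pb' closest to Pb.
best = min((abs(landing_x(30.0 + 0.01 * k) - PB_X), 30.0 + 0.01 * k)
           for k in range(3001))
```

For this assumed geometry, the residual distance at the best tilt is small (well under a millimetre), illustrating that one rotational degree of freedom suffices to bring Pb' onto Pb in the plane of the section.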
  • A method for specifying the relative positions of the four points mentioned above (the fundus Pf, pupil portion Pp, point Pc, and position Pb) will now be described.
  • In the following, the fundus Pf, pupil portion Pp, point Pc, and position Pb will be referred to as the points Pf, Pp, Pc, and Pb, respectively, as appropriate.
  • The point Pf is made to coincide with the imaging optical axis, prior to field deflection and optical axis adjustment, at the time of mounting in step S100 of FIG. 7.
  • This can be realized by arranging the device interface section D at a position sufficiently close to the eye S to be examined. Alternatively, it can also be realized by restricting the mounting position of the mounting section 1 on the subject 200.
  • As a method of restricting the mounting position of the mounting section 1 on the subject 200, for example, the mounting section 1 can be shaped to follow a specific part of the face.
  • The method is not limited to this; by specifying the part of the face of the subject 200 that is to be brought into contact with the mounting section 1, the mounting position of the mounting section 1 on the subject 200 can also be restricted.
  • a reference marker may be projected onto the face of the subject 200 from the mounting portion 1 or the like, and alignment may be performed using the reference marker as an index.
  • The relative positional relationship between the points Pb and Pc is determined by the structure of the optical section B1. Further, it is known that the axial length of the eye in adults (about 15 years old or older) is about 23 to 25 mm, with little individual variation. Therefore, if the point Pf is placed on the initial imaging optical axis, the distance between Pf and Pc will not vary significantly from one subject 200 to another once one appropriate initial value is set. Furthermore, the variation in the distance between the device interface section D and the eye S to be examined, caused by individual differences in the axial length and in the degree of pressure applied to the mounting section 1 in step S100, is absorbed during the angle-of-view adjustment. Therefore, when the mounting of the mounting section 1 in step S100 is completed, the relative positional relationship of the points Pf, Pb, and Pc can be said to be specified in the world coordinate system W B1 of the optical section B1.
  • The method is not limited to this; a distance sensor that measures the distance to the surface of the eye S to be examined may be added to the anterior eye observation unit 28, and the relative positional relationship of the points Pf, Pb, and Pc may be specified using the measured distance.
  • Note that the axial length of a newborn baby is approximately 17 mm, somewhat smaller than that of an adult. Therefore, in order to realize fundus imaging of subjects 200 across a wide range of age groups, it is advisable, for example, to prepare several patterns of initial values of the Pf-Pb distance in the fundus imaging device 100 so that the photographer can switch between them at the time of imaging.
  • FIG. 12 is a schematic diagram for explaining a method for specifying the position of the pupil portion Pp that is applicable to the first embodiment.
  • Section (a) of FIG. 12 shows how an image of the anterior segment of the eye is captured on the imaging surface of the anterior eye observation unit 28.
  • the coordinate system W a of the image sensor 31 of the anterior eye observation section 28 can be transformed from the world coordinate system W B1 of the entire optical section B1 using a specific affine transformation matrix M 44 .
  • The coordinate system W a has its origin at the center of the imaging surface of the image sensor 31; the X a axis and the Y a axis coincide with the horizontal and vertical directions of the image sensor, respectively, and the Z a axis coincides with the optical axis direction. The plane that intersects this Z a axis perpendicularly and in which the pupil portion Pp lies is defined as the anterior segment plane F. Furthermore, let f be the focal length of the imaging optical system of the anterior eye observation section 28, and let L be the distance along the Z a axis from the imaging surface of the anterior eye observation section 28 to the anterior segment plane F.
  • Assume that the pupil portion Pp has a diameter φ i and is projected with a diameter φ i ' on the imaging surface of the image sensor 31.
  • The focal length f is known because it is uniquely determined by the device configuration of the optical section B1. The distance L also converges to a constant value by the time the above-described angle-of-view adjustment process is completed, so it is known when the visual field deflection starts. Then, with reference to section (b) of FIG. 12, assume that the pupil portion Pp is imaged at the coordinates I p (x, y) on the imaging surface of the image sensor of the anterior eye observation section 28. Using the similarity ratio of the focal length f to the distance L, the coordinates of the pupil portion Pp in the coordinate system W a can be determined from the imaging-position coordinates I p (x, y).
  • In this way, the position of the pupil portion Pp in the coordinate system W a , and in turn in the world coordinate system W B1 , is specified.
  • Through the above, the positions of the points Pf, Pp, Pb, and Pc in the world coordinate system W B1 have been specified.
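The similar-triangles relation described above can be written as a one-line back-projection: ignoring image inversion, a point imaged at I p (x, y) lies on the anterior segment plane F at coordinates scaled by L/f. The function name and the example values of f and L below are hypothetical.

```python
def pupil_in_Wa(x_img_mm, y_img_mm, f_mm, L_mm):
    """Map sensor-plane coordinates to the anterior segment plane F (z = L)
    using the similarity ratio f : L (image inversion ignored)."""
    scale = L_mm / f_mm
    return (x_img_mm * scale, y_img_mm * scale, L_mm)

# e.g. f = 25 mm, L = 50 mm, pupil imaged 1 mm off-centre horizontally:
print(pupil_in_Wa(1.0, 0.0, 25.0, 50.0))  # prints (2.0, 0.0, 50.0)
```

The same scaling recovers the physical pupil diameter from the imaged one, φ i = φ i ' · L/f, consistent with the description of FIG. 12.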
  • FIG. 13 is a schematic diagram for explaining an overview of optical axis adjustment according to the first embodiment.
  • In FIG. 13, the eye S to be examined, the optical element 2 and its drive mechanism 25 in the optical section B1, the optical element 4, and the image sensor 7 are extracted from the configuration of the fundus imaging device 100, and the other parts are omitted.
  • In the optical axis adjustment, the optical element 4 for optical axis adjustment is moved so that the chief ray traveling from the eye S to be examined to the optical element 4 coincides with the imaging optical axis of the image sensor 7.
  • This adjustment is realized by adjusting the position and orientation of the optical element 4 using the drive mechanism 26 that drives the optical element 4.
  • FIG. 13 shows a state in which point Pb and point Pb' are sufficiently close to each other due to the above-described visual field deflection process.
  • Through the visual field deflection described above, the direction of the chief ray A f and the positions of the points Pa and Pb' in the world coordinate system W B1 are specified. Therefore, the angle θ i that the chief ray A f incident on the optical element 4 makes with the imaging optical axis of the image sensor 7 can be calculated.
  • The technique of superimposing an obliquely incident light ray onto a desired optical axis by adjusting the attitude and position of part of an optical system is well known and is also used in lens-shift image stabilization. Once the angle θ i has been calculated, the position and orientation of the optical element 4 can be determined by back-calculation from the calculated angle θ i .
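The angle calculation itself reduces to the angle between two direction vectors. The vectors below are hypothetical stand-ins for the chief ray A f after field deflection and the imaging optical axis A i of the image sensor 7; the subsequent back-calculation depends on the type of the optical element 4 and is only indicated in the comment.

```python
import math

def angle_between_deg(u, v):
    """Angle between two direction vectors, in degrees."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / norm))

chief_ray = (0.1, 0.0, 1.0)      # direction of A f after field deflection (assumed)
imaging_axis = (0.0, 0.0, 1.0)   # imaging optical axis A i (assumed)

theta_i = angle_between_deg(chief_ray, imaging_axis)
# theta_i is about 5.71 degrees here; the position and orientation of the
# optical element 4 would then be back-calculated from it (for instance, a
# mirror-like element would need a tilt of roughly theta_i / 2).
```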
  • As described above, in the first embodiment, the inclination of the optical element 2 for visual field deflection and the position and orientation of the optical element 4 for optical axis adjustment are controlled based on the information acquired by the anterior eye observation unit 28 and the image data acquired by the image sensor 7.
  • Thereby, the degree of freedom in alignment adjustment can be increased compared with existing techniques, and it is possible to provide a fundus imaging device that can be miniaturized and whose alignment adjustment is easy.
  • Furthermore, the fundus imaging device 100 uses light with a wavelength outside the visible range for illumination during alignment adjustment and focus adjustment, and lights the flash only during the actual imaging, so that the pupil of the eye S to be examined can be prevented from constricting due to glare or the like. This eliminates the need for mydriatic drugs, allows non-medical personnel to perform imaging, and further allows imaging to be directed from a remote location without medical personnel present.
  • FIG. 14 is a schematic diagram showing the configuration of an example of a fundus imaging device according to a first modification of the first embodiment.
  • A fundus imaging device 100a according to the first modification connects the optical section B1 and the imaging processing section B2 by an attachment 71, making the imaging processing section B2 removable.
  • The attachment 71 physically connects the optical section B1 and the imaging device 72 so that the incident light from the optical section B1 forms an image on the image sensor within the imaging device 72.
  • the imaging device 72 may be a lens-mounted still image camera or a video camera, or may be an electronic device with a camera function such as a smartphone.
  • A communication terminal for exchanging control signals and the like between the imaging device 72 and the device control unit 300 may be added to the attachment 71, or a separate communication line 73 for wired or wireless communication may be provided.
  • By making the imaging device 72 that performs fundus photography removable, the user can upgrade the imaging device 72 or select a different imaging device 72 depending on the shooting situation, as with a general interchangeable-lens camera. Furthermore, by making it possible to use existing electronic devices with an imaging function, such as a smartphone with a built-in camera or a single-lens reflex camera with interchangeable lenses, as the imaging device 72, the image sensor 7 and the development processing unit 8 are no longer necessary on the fundus imaging device 100a side, making it possible to reduce size and cost.
  • FIG. 15 is a schematic diagram showing the configuration of an example of a fundus imaging device according to a second modification of the first embodiment. Note that in FIG. 15, the information processing section B3 is omitted.
  • In the first embodiment described above, the optical element 2, consisting of a concave mirror or a plane mirror, was used for field deflection.
  • In the second modification, an optical element 2a including an objective lens 60 is used for field deflection. A convex lens can be applied as the objective lens 60. Alternatively, a compound lens combining a plurality of lenses may be applied as the objective lens 60.
  • FIG. 16 is a schematic diagram showing the configuration of an example of an optical element 2a according to a second modification of the first embodiment.
  • the optical element 2a includes an objective lens 60 and an objective lens holder 60' that holds the objective lens 60.
  • the objective lens holder 60' is connected to the outer frames 61a and 61a' via respective drive mechanisms 61b and 61b'.
  • the drive mechanism 61b' is connected to the outer frame 61a' on the outer periphery and the outer frame 61a on the inner periphery, and rotates the outer frame 61a on the inner periphery with respect to the fixed outer frame 61a'.
  • A drive mechanism 61b is connected to the outer frame 61a on the inner periphery and to the objective lens holder 60', and rotates the objective lens holder 60' with respect to the outer frame 61a on the inner periphery.
  • Thereby, the objective lens 60 has two rotation axes that perpendicularly intersect its optical axis.
  • the drive mechanisms 61b and 61b' each rotate at an angle according to an instruction amount indicated by a control signal supplied from the device control section 300.
  • the device control section 300 generates a control signal instructing the rotation angle based on the device control signal transmitted from the information processing section B3 through the above-described visual field deflection process, and passes it to the drive mechanisms 61b and 61b'.
  • the drive mechanisms 61b and 61b' rotate the outer frame 61a and the objective lens holder 60' according to the control signals. Thereby, the field of view deflection for the image sensor 7 is realized.
  • Since the objective lens 60 is used as the optical element 2a for visual field deflection in the fundus imaging device 100b, there is no need to bend the optical path within the fundus imaging device 100b, making it possible to make the device more compact.
  • the second embodiment is an example of a fundus imaging device that enables automatic imaging.
  • The fundus imaging device according to the second embodiment can automatically execute the alignment adjustment process, the focus adjustment process, and the imaging process described with the flowchart of FIG. 7, for example in response to a trigger operation by the subject 200. That is, by using the fundus imaging device according to the second embodiment, the subject 200 can perform fundus imaging directly, without the presence of a person with appropriate imaging skills.
  • In general, the subject 200 does not have specialized photographic skills for intraocular and fundus photography. Furthermore, with the existing technology, the line of sight of the subject 200 must be fixed during imaging, making it difficult for the subject 200 to operate the device while being imaged. Therefore, when the subject 200 directly performs fundus imaging, the alignment adjustment and control of the fundus imaging device must all be performed automatically within the device, essentially eliminating the need for the subject 200 to operate the device or check the imaging status during imaging.
  • FIG. 17 is a schematic diagram for explaining how the fundus imaging device according to the second embodiment is used.
  • A fundus imaging device 100c according to the second embodiment includes a main body 110a, which includes an optical section B1 and an imaging processing section B2 (neither of which is shown), and a subject terminal 120.
  • the main body portion 110a is provided with a UI 21 for the subject 200 to instruct the start of fundus imaging processing.
  • the subject 200 holds the main body part 110a with, for example, his/her own hand 210, and brings a part of his/her face (eye area) into contact with the mounting part 1 of the main body part 110a to keep it in a fixed state.
  • the main body portion 110a may be of a hand-held type provided with a grip portion for easy holding by the hand 210, or may be of a stationary type installed on a pedestal or the like.
  • the UI 21 has a simple function, such as a button only for starting automatic shooting. Note that if the photographer is someone other than the subject 200, the UI 21 may be an input device that requires visual observation during operation, such as a keyboard, mouse, or touch panel.
  • the subject terminal 120 is capable of monitoring the operating status of the main body 110a, checking the fundus image taken by the main body 110a, and transmitting it to the outside. It may be configured such that the fundus photographing operation in the main body section 110a is controlled from the subject terminal 120.
  • As the subject terminal 120, a general-purpose information processing device such as a smartphone, a tablet computer, or a personal computer can be applied.
  • the subject terminal 120 can be configured by installing an application program for executing fundus imaging processing according to the second embodiment into the information processing device.
  • the present invention is not limited to this, and the subject terminal 120 may be a terminal device dedicated to this fundus imaging device 100c.
  • FIG. 18 is an example functional block diagram for explaining the functions of the fundus imaging device 100c according to the second embodiment.
  • The configurations of the optical section B1 and the imaging processing section B2 are the same as those of the optical section B1 and the imaging processing section B2 in the fundus imaging device 100 according to the first embodiment described above, so their explanation is omitted here.
  • In FIG. 18, the information processing section B3 includes the subject terminal 120, an automatic adjustment processing section 130, and an indicator 131. Of these, the automatic adjustment processing section 130 and the indicator 131 are provided in the main body section 110a, whereas the subject terminal 120 is configured separately from the main body section 110a. The configuration is not limited to this; the functions of the subject terminal 120 may also be incorporated into the main body section 110a.
  • the subject terminal 120 includes a display section 121, a storage section 122, and a communication section 123.
  • the display unit 121 causes a display device included in the subject terminal 120 to display an image.
  • the storage unit 122 stores data in a storage medium included in the subject terminal 120 and reads data stored in the storage medium.
  • the communication section 123 communicates with the main body section 110a via the image communication line 10. Further, the communication unit 123 can also communicate via an external network by wired or wireless communication.
  • The imaging processing section B2 transmits image data captured and acquired by the image sensor 7 to the subject terminal 120 via the image communication line 10.
  • the subject terminal 120 can receive the image data transmitted from the main body section 110a through the communication section 123, and cause the display section 121 to display an image based on the received image data on the display device.
  • the subject 200 can confirm the result of fundus photography using the main body portion 110a.
  • the subject terminal 120 can store the image data transmitted from the main body section 110a and received by the communication section 123 in a storage medium using the storage section 122. Furthermore, the subject terminal 120 can also read the image data stored in the storage medium using the storage unit 122 and further transmit it to an external device using the communication unit 123.
  • the automatic adjustment processing unit 130 automatically executes the processing related to fundus photography described using FIGS. 7 to 10. For example, in response to an operation by the subject 200 to instruct the UI 21 to start fundus imaging, a control signal instructing to start fundus imaging is passed from the device control unit 300 to the automatic adjustment processing unit 130. In response to this control signal, the automatic adjustment processing unit 130 starts processing from step S101 according to the flowchart of FIG.
  • More specifically, in response to the operation on the UI 21 instructing the start of imaging, the device control unit 300 lights the adjustment light source unit 41a in the annular light source 41, and then controls the anterior eye observation unit 28 and, as necessary, the image sensor 7 and the development processing section 8 to obtain the initial alignment state and focus state.
  • the device control unit 300 passes information indicating the initial alignment state and focus state to the automatic adjustment processing unit 130.
  • The automatic adjustment processing unit 130 determines the current alignment state based on the information acquired by the anterior eye observation unit 28 and the image data acquired by the image sensor 7, and calculates the amount of adjustment required to reach the optimal alignment state.
  • the automatic adjustment processing section 130 sends information indicating the calculated adjustment amount to the optimal alignment state to the device control section 300, and feeds back the adjustment amount.
  • the device control unit 300 that has received the feedback sends the adjustment amount to the drive mechanisms 24 to 26 to perform alignment adjustment.
  • The automatic adjustment processing unit 130 then obtains the post-adjustment alignment state, and thereafter repeats the cycle of checking the alignment state, calculating the adjustment amount, and controlling the adjustment mechanisms until the alignment state is determined to be appropriate, in other words, until the alignment adjustment has converged.
  • When the alignment adjustment has converged, the fundus imaging device 100c proceeds to the focus adjustment process (step S102 in FIG. 7) and the imaging process (step S104 in FIG. 7).
  • In the focus adjustment process, as in the alignment adjustment process, the fundus imaging device 100c repeatedly performs feedback control of the focus among the device control unit 300, the drive mechanism 27 that controls the focus adjustment, and the automatic adjustment processing unit 130.
  • the fundus imaging device 100c may include an indicator 131 on the main body portion 110a for assisting the subject 200 who performs imaging in operating the device.
  • the indicator 131 may present the internal state of the main body 110a, including the operation method of the fundus imaging device 100c and the alignment information obtained from the automatic adjustment processing unit 130, to the photographer (for example, the subject 200).
  • the indicator 131 may present information to the photographer using at least one of visual information and audio information.
  • the indicator 131 is not limited to this, and the indicator 131 may present information to the photographer using tactile information such as vibration. Further, the indicator 131 may present information to the photographer using a combination of these.
  • When the subject 200 operates the main body portion 110a and information is presented as visual information by the indicator 131, the subject 200 checks the indicator 131 with the eye that is not being photographed. In this case, if the subject 200 tries to look directly at the indicator 131, the line of sight of the eye S to be examined also moves. Therefore, when presenting information as visual information, it is preferable to adopt a configuration that does not require the indicator 131 to be viewed directly.
  • FIG. 19 is a schematic diagram showing a configuration example of the indicator 131 when visual information is used, which is applicable to the second embodiment.
  • In the example of FIG. 19, the indicator 131a, implemented as a lighting device, is provided on both side surfaces of the main body 110a, that is, on surfaces parallel to the front direction of the face of the subject 200 when the subject 200 is correctly holding the main body 110a.
  • the indicator 131a can use a light emitting element such as an LED (Light Emitting Diode). It is preferable that the indicator 131a is configured so that the emitted light is sufficiently diffused so that the subject 200 can confirm the light emission of the indicator 131a without looking directly at the indicator 131a. Further, the indicator 131a can selectively emit light in a plurality of colors, and may emit light in a color depending on the state of the fundus imaging device 100c. By doing so, the subject 200 can confirm the state of the fundus imaging device 100c from the color of the light emitted from the indicator 131a.
  • An indicator 131b may be provided on the side surface.
  • By using a mirror, the subject 200 can confirm whether the indicator 131b is emitting light, and the color of the emitted light, through the mirror.
  • FIG. 20 is an example flowchart showing more specifically the alignment adjustment process according to the second embodiment.
  • the process according to the flowchart of FIG. 20 corresponds to the process of step S101 in the flowchart of FIG.
  • In step S200, the fundus imaging device 100c causes the device control unit 300 to turn on the adjustment light source unit 41a.
  • the fundus imaging device 100c performs provisional imaging using the image sensor 7 under the control of the device control unit 300.
  • Alternatively, the anterior eye observation unit 28 may photograph the anterior segment. The image data acquired by photographing with the image sensor 7 or the anterior eye observation section 28 is passed to the automatic adjustment processing section 130 in the information processing section B3.
  • In the provisional imaging, the image sensor 7 or the anterior eye observation unit 28 captures a moving image. The capture of the moving image may be continued, for example, until the actual imaging is started.
  • In the next step S202, the fundus imaging device 100c uses the automatic adjustment processing unit 130 to detect the iris image of the eye S to be examined from the image data acquired by imaging with the image sensor 7 or the anterior eye observation unit 28, and determines the iris diameter based on the iris image. That is, since it is known that individual differences in iris diameter are small, it is possible to determine whether the current angle of view is within the appropriate range based on the size of the photographed iris.
  • In step S203, the automatic adjustment processing unit 130 determines whether the acquired iris diameter is an appropriate size.
  • If the automatic adjustment processing unit 130 determines that the acquired iris diameter is not an appropriate size (step S203, "No"), it moves the process to step S204, where angle-of-view information generated based on the acquired iris diameter is passed to the device control unit 300 and fed back.
  • the device control unit 300 controls, for example, the drive mechanism 24 based on the angle of view information passed from the automatic adjustment processing unit 130 to adjust the distance from the optical section B1 to the eye S to be examined. After step S204, the process returns to step S202.
  • If the automatic adjustment processing unit 130 determines in step S203 that the acquired iris diameter is an appropriate size (step S203, "Yes"), it moves the process to step S205.
  • the method of adjusting the angle of view is not limited to the method based on image data.
  • a distance measurement sensor or the like may be incorporated into the anterior eye observation unit 28 to directly measure the distance to the eye S to be examined.
  • a projector 30 may be provided in the anterior eye observation unit 28 to project a predetermined marker onto the eye S to be examined.
  • In this case, the automatic adjustment processing unit 130 may calculate the distance between the optical section B1 and the anterior segment based on the size of the marker image in the image based on the image data acquired by the anterior eye observation unit 28 or the image sensor 7. The automatic adjustment processing unit 130 adjusts the angle of view based on these measured or calculated distances.
  • In step S205, the automatic adjustment processing unit 130 detects the pupil image of the eye S to be examined from the image data acquired by photographing with the image sensor 7 or the anterior eye observation unit 28, and acquires the position of the pupil. That is, the current states of the visual field deflection and the optical axis position can be determined from whether the center of the pupil of the eye S to be examined is imaged at a position sufficiently close to the center of the image sensor 7.
  • In step S206, the automatic adjustment processing unit 130 determines whether the pupil position acquired in step S205 is appropriate. If it determines that the acquired pupil position is not appropriate, it moves the process to step S207, passes position information indicating the acquired pupil position to the device control unit 300, and feeds back the position information.
  • In step S207, the device control unit 300 adjusts, for example, the drive mechanisms 25 and 26 based on the position information passed from the automatic adjustment processing unit 130 to correct the pupil position on the image captured by the image sensor 7. If the pupil image is formed at a position shifted by a predetermined amount or more from the imaging center of the image sensor 7, the device control unit 300 re-executes the visual field deflection and optical axis adjustment in a direction that corrects the shift. After step S207, the process returns to step S205.
  • In steps S205 and S206, if necessary, the drive mechanism 27 can be controlled to drive the optical element 5 for focus adjustment so that the pupil of the eye S to be examined is brought into focus on the image sensor 7.
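The pupil-position feedback of steps S205 to S207 can be sketched as follows. This is a minimal illustration assuming an infrared anterior-segment image in which the pupil is the darkest region; the threshold, the tolerance, and the function names are assumptions, not the method claimed in the disclosure:

```python
import numpy as np

def pupil_offset(image: np.ndarray, threshold: int = 60):
    """Locate the dark pupil region by thresholding and return its centroid
    offset (dx, dy) from the image center, in pixels."""
    mask = image < threshold              # pupil is darker than iris/sclera
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                       # no pupil candidate found
    h, w = image.shape
    return xs.mean() - w / 2.0, ys.mean() - h / 2.0

def needs_realignment(offset, tolerance_px: float = 5.0) -> bool:
    """Step S206-style check: is the pupil centered within tolerance?"""
    if offset is None:
        return True
    dx, dy = offset
    return (dx * dx + dy * dy) ** 0.5 > tolerance_px
```

When `needs_realignment` returns True, the offset would be fed back (step S207) to the drive mechanisms in a direction that cancels the shift, and the loop repeats.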
  • The alignment condition can also be confirmed from the distortion of the imaged pupil shape.
  • The automatic adjustment processing unit 130 acquires the shape of the pupil image of the eye S to be examined from the image data acquired by the image sensor 7 or the anterior eye observation unit 28. That is, the current state of the optical axis position can be determined by detecting distortion in the shape of the pupil image of the eye S to be examined.
  • When the pupil including the principal ray Af is captured directly in front of the imaging optical axis Ai of the image sensor 7, the pupil is imaged as a circular image.
  • When the optical axis adjustment is insufficient and the pupil is photographed obliquely with respect to the imaging optical axis Ai, the pupil image on the image sensor 7 becomes an elliptical shape distorted in correlation with the tilt. Therefore, feedback control of the optical axis adjustment can be performed based on the shape of the pupil image.
  • Note that the method of optical axis adjustment is not limited to the above.
  • In the focus adjustment process (step S102 in FIG. 7) preceding fundus imaging, it is also possible to determine the optical axis based on whether the macula is photographed near the center of the fundus image finally obtained by the image sensor 7, or whether the optic nerve head is accommodated within the image frame.
  • In step S209, the automatic adjustment processing unit 130 determines whether there is any distortion in the shape of the pupil image acquired in step S208 (for example, whether the ratio of the major axis to the minor axis of the pupil image is equal to or greater than a predetermined value). If the automatic adjustment processing unit 130 determines that the acquired pupil shape is distorted (step S209, "No"), it moves the process to step S210 and passes shape information indicating the acquired pupil shape to the device control unit 300 as feedback.
  • In step S210, the device control unit 300 controls, for example, the drive mechanism 26 to drive the optical element 4 based on the shape information passed from the automatic adjustment processing unit 130, and adjusts the angle of the light irradiated onto the image sensor 7 from the eye S through the optical element 2. After step S210, the process returns to step S208.
  • If the automatic adjustment processing unit 130 determines in step S209 that there is no distortion in the acquired pupil shape (step S209, "Yes"), the fundus imaging device 100c ends the series of processes in this flowchart. The fundus imaging device 100c then moves the process to, for example, step S102 in the flowchart of FIG. 7 and executes the focus adjustment process.
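The pupil-shape check of steps S208 to S210 can be sketched by estimating the major-to-minor axis ratio of the pupil region from its second-order moments. This is one common way to quantify the distortion; the 1.2 threshold and the helper names are assumptions, not values from the disclosure:

```python
import numpy as np

def pupil_ellipticity(mask: np.ndarray) -> float:
    """Major-to-minor axis ratio of the pupil region (1.0 for a circle),
    estimated from the eigenvalues of the point-cloud covariance."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs - xs.mean(), ys - ys.mean()]).astype(float)
    evals = np.linalg.eigvalsh(np.cov(pts))   # ascending: minor, major
    return float(np.sqrt(evals[1] / evals[0]))

def is_distorted(mask: np.ndarray, limit: float = 1.2) -> bool:
    """Step S209-style check: distorted if the axis ratio meets the limit."""
    return pupil_ellipticity(mask) >= limit
```

A distorted (elliptical) pupil would trigger the step S210 feedback that re-angles the light via the optical element 4.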
  • A general autofocus function can be applied to the focus adjustment process. For example, many blood vessels run through the fundus of the eye, and the focus can be adjusted automatically using these blood vessels as indicators.
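A contrast-maximization autofocus of the kind mentioned here can be sketched as follows: retinal blood vessels produce strong local gradients, so a simple squared-gradient score peaks near the best focus position. The scoring function and the dictionary interface are illustrative assumptions:

```python
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Mean squared gradient; rises as vessel edges come into focus."""
    g = image.astype(float)
    gx = np.diff(g, axis=1)
    gy = np.diff(g, axis=0)
    return float((gx ** 2).mean() + (gy ** 2).mean())

def best_focus(frames: dict) -> int:
    """Pick the focus-drive position whose frame maximizes the score."""
    return max(frames, key=lambda pos: sharpness(frames[pos]))
```

In practice the focus drive (e.g. the mechanism moving the optical element 5) would sweep a range of positions and settle on the one returned by `best_focus`.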
  • In this way, fundus photography, including the alignment adjustment processing and the focus adjustment processing, is executed automatically when the subject 200 simply performs a simple operation on the UI 21 provided on the main body 110a of the fundus imaging device 100c. It thus becomes possible to provide a fundus imaging device whose alignment is easy to adjust.
  • During adjustment, the adjustment light source section 41a illuminates the eye S to be examined with light of a wavelength outside the visible light range, and flash lighting is used only during actual photography. Therefore, there is no need to use mydriatic drugs or the like, imaging can be performed by non-medical personnel, and furthermore, fundus imaging can be performed without medical personnel present.
  • The fundus imaging device according to the third embodiment is an example that allows operation from a remote location by a photographer who has specialized skills in fundus photography.
  • Since the photographer has specialized skills regarding fundus photography, it is assumed that the photographer operates the equipment to set the photography conditions, including alignment adjustment, as necessary. Therefore, in addition to the fully automatic imaging performed within the fundus imaging device, a mechanism needs to be added that allows manual adjustment of the device while viewing the outputs of the image sensor 7 and the anterior eye observation unit 28.
  • FIG. 21 is a schematic diagram for explaining how the fundus imaging device according to the third embodiment is used.
  • As shown in FIG. 21, a fundus imaging device 100d includes a main body 110b including an optical section B1 and an imaging processing section B2 (neither of which is shown), communication devices 401 and 402, a photographer terminal 410, and a display device 420. The communication devices 401 and 402, the photographer terminal 410, and the display device 420 constitute an information processing section B3.
  • As the photographer terminal 410, a general-purpose information processing device such as a personal computer, a tablet computer, or a smartphone can be used, for example.
  • the photographer terminal 410 can be configured by installing an application program for executing fundus imaging processing according to the third embodiment into the information processing device.
  • The photographer terminal 410 is not limited to this, and may be a terminal device dedicated to the fundus imaging device 100d.
  • the communication device 401 is provided, for example, at a location where the subject 200 uses the main body portion 110b. Further, the communication device 402 is provided, for example, at a location remote from the location where the main body portion 110b is used, and where the photographer 430 remotely operates the main body portion 110b.
  • the communication devices 401 and 402 are capable of communicating with each other via a network 400 such as the Internet.
  • the main body portion 110b and the communication device 401 are connected by the image communication line 10 and the control communication line 11.
  • Image data acquired by the anterior eye observation section 28 and the image sensor 7 in the main body section 110b is output from the port 9 and passed to the communication device 401 via the image communication line 10.
  • Control information transmitted from the communication device 402 and received by the communication device 401 via the network 400 is input from the communication device 401 to the port 22 via the control communication line 11, and is passed to the main body section 110b.
  • the display device 420 and the photographer terminal 410 are used by the photographer 430.
  • the photographer 430 is assumed to be someone who has specialized skills regarding fundus photography.
  • the communication device 402 passes image data received from the communication device 401 via the network 400 to the display device 420 and the photographer terminal 410. For example, control information output by the photographer terminal 410 in response to an operation by the photographer 430 is transmitted to the network 400 by the communication device 402 and received by the communication device 401.
  • FIG. 22 is an example of a functional block diagram for explaining the functions of the fundus imaging device 100d according to the third embodiment.
  • The configurations of the optical section B1 and the imaging processing section B2 are the same as those of the optical section B1 and the imaging processing section B2 in the fundus imaging apparatus 100 according to the first embodiment described above, so their explanation will be omitted here.
  • the information processing unit B3 includes communication devices 401 and 402 that are communicably connected via a network 400, a photographer terminal 410, and a display device 420. Further, the photographer terminal 410 includes a UI 411, a storage section 412, and an operation assistance section 413.
  • the communication device 402 passes the image data received from the communication device 401 to the display device 420 and the photographer terminal 410.
  • the UI 411 includes an input device that outputs a control signal according to an input operation by the photographer 430, and a display device that displays information such as images.
  • Input devices can include buttons, toggle switches, keyboards, and pointing devices such as mice and touch panels. The input device is not limited to this, and may have other input means (voice input, etc.).
  • the storage unit 412 stores data in a storage medium included in the photographer terminal 410 and reads data stored in the storage medium.
  • the storage unit 412 stores the image data obtained by photographing the fundus, which is received by the communication device 402, in the storage medium.
  • the storage unit 412 reads image data obtained by fundus photography stored in the storage medium.
  • the read image data is supplied to, for example, the display device 420 and displayed as an image.
  • The operation assisting unit 413 has the same function as the automatic adjustment processing unit 130 described above, and generates a control signal for controlling the alignment adjustment processing and the focus adjustment processing in the main body portion 110b. Further, the operation assisting unit 413 generates such a control signal in response to an operation on the UI 411 by the photographer 430.
  • The operation assisting unit 413 may also transmit, as necessary, control signals for controlling the image sensor 7, the development processing unit 8, the anterior eye observation unit 28, and the annular light source 41, as well as setting values of their internal parameters. By transmitting these control signals and internal parameter setting values, the operation assisting unit 413 can adjust the appearance of the image displayed on the display device 420 and the image of the final output image data of the fundus imaging device 100d.
  • The main body section 110b performs the alignment adjustment process in step S101, the focus adjustment process in step S102, and the photographing process in step S104 in the flowchart of FIG. 7. During these processes, the image data obtained by photographing the anterior segment with the anterior eye observation unit 28 and the image data of the intraocular and fundus images obtained by the image sensor 7 are transmitted from the main body section 110b.
  • an image based on the image data transmitted from the main body section 110b and received by the communication device 402 is displayed on the display device 420.
  • The display device 420 may further display other information regarding the image (referred to as auxiliary information).
  • the auxiliary information may be information indicating the focused state of the image, information indicating the light amount saturation state, or the like.
  • the operation assistance unit 413 can acquire assistance information based on, for example, image data transmitted from the main unit 110b and received by the communication device 402.
  • the operation assistance unit 413 passes the acquired assistance information to the display device 420.
  • the display device 420 may display the auxiliary information passed from the operation assistance unit 413 superimposed on the image data, or may display the auxiliary information in a window separate from the image data. Furthermore, the auxiliary information may be displayed on another display device different from the display device 420.
  • The display device 420 may display the auxiliary information as image information (for example, displaying a light-amount-saturated area with diagonal lines on the image) or as text-based numerical information (for example, the degree of light amount saturation expressed as a percentage).
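The percentage-style auxiliary information mentioned above could be produced as in the following sketch. The function names and the 8-bit full-scale assumption are illustrative; the disclosure only says that the saturation state may be displayed as a percentage:

```python
import numpy as np

def saturation_ratio(image: np.ndarray, full_scale: int = 255) -> float:
    """Fraction of pixels at or above full scale (light amount saturation)."""
    return float((image >= full_scale).mean())

def saturation_label(image: np.ndarray) -> str:
    """Text-based auxiliary information, e.g. shown next to the live image."""
    return f"light amount saturation: {100.0 * saturation_ratio(image):.1f} %"
```

The same mask `image >= full_scale` could also drive the image-information variant, e.g. hatching the saturated region with diagonal lines.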
  • Based on the image and auxiliary information displayed on the display device 420, the photographer 430 checks the state of the alignment adjustment and focus adjustment in the main body section 110b located at the remote site. Depending on the confirmation result, the photographer 430 operates the input device of the UI 411 on the photographer terminal 410 to control the main body 110b.
  • the photographer terminal 410 generates a control signal for controlling the operation of the main body 110b in response to the operation of the UI 411 by the photographer 430.
  • the control signal generated here includes information regarding alignment adjustment control, focus adjustment control, imaging control, image processing control, and the like.
  • a control signal generated by the photographer terminal 410 is transmitted from the communication device 402 via the network 400 and received by the communication device 401.
  • the communication device 401 inputs the received control information to the port 22 via the control communication line 11.
  • Control information input to the port 22 is passed to the device control section 300.
  • The device control unit 300 controls the drive mechanisms 24 to 27, the anterior eye observation unit 28, the image sensor 7, the development processing unit 8, and so on based on this control information, and executes the alignment adjustment processing, focus adjustment processing, and imaging processing.
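The flow of control information from the photographer terminal 410 through the communication devices to the device control unit 300 can be sketched as a simple serialize-and-dispatch scheme. The JSON message format below is a hypothetical stand-in; the disclosure does not specify the wire protocol:

```python
import json

# Hypothetical control-message kinds, mirroring the kinds of control
# information named in the description (alignment, focus, imaging, etc.).
VALID_KINDS = {"alignment", "focus", "capture", "image_processing"}

def encode_control(kind: str, **params) -> bytes:
    """Terminal side: serialize a control message for transmission."""
    if kind not in VALID_KINDS:
        raise ValueError(f"unknown control kind: {kind}")
    return json.dumps({"kind": kind, "params": params}).encode("utf-8")

def dispatch_control(payload: bytes, handlers: dict) -> None:
    """Device side (port 22): route control information to the handler
    registered for its kind in the device control unit."""
    msg = json.loads(payload.decode("utf-8"))
    handlers[msg["kind"]](**msg["params"])
```

Here `handlers` would map each kind to the corresponding device control routine (drive mechanisms, image sensor settings, and so on).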
  • In this way, the main body 110b can be remotely and manually controlled by the photographer 430 from the photographer terminal 410.
  • The photographer terminal 410 can also automatically execute the alignment adjustment process, focus adjustment process, and photographing process of the main body 110b, based on the image data acquired by photographing the anterior eye with the anterior eye observation unit 28 and the image data of the intraocular and fundus images acquired by the image sensor 7.
  • the operation assisting section 413 calculates adjustment values for each of the alignment adjustment and the focus adjustment in the same manner as the automatic adjustment processing section 130 described in the above-mentioned second embodiment.
  • the photographer terminal 410 sends each calculated adjustment value to the main body section 110b through communication using the communication devices 401 and 402. This enables feedback control of alignment adjustment and focus adjustment.
  • operation assisting unit 413 can also individually control alignment adjustment, focus adjustment, and photographing processing in accordance with the operation of the photographer 430 on the UI 411.
  • In this way, the photographer 430 can control the alignment adjustment, focus adjustment, and photographing process for fundus imaging from a location remote from the place where the subject 200 uses the main body 110b to perform fundus imaging. This makes it possible to provide a fundus imaging device with easy alignment adjustment.
  • Furthermore, in accordance with operations by the photographer 430 on the UI 411, the operation assisting section 413 can generate control signals and internal parameter settings for the image sensor 7, the development processing section 8, the anterior eye observation section 28, and the annular light source 41, and transmit them to the main body section 110b. This makes it possible to provide a fundus imaging device that can easily perform more detailed alignment adjustments from a remote location.
  • Note that the present technology can also have the following configurations.
  • (1) A fundus imaging device comprising: a first optical section that is provided between an eye to be examined and an imaging section that images the fundus of the eye to be examined, and that changes a visual field direction of the imaging section with respect to the eye to be examined; and a second optical section that is provided between the first optical section and the imaging section and that corrects the traveling direction of light emitted from the fundus and reaching the imaging section via the first optical section.
  • (2) The fundus imaging device according to (1) above, wherein the first optical section includes: a first optical element made of any one of a concave mirror, a plane mirror, a lens, and a hologram optical element; and a first drive mechanism that controls the direction in which light is emitted by the first optical element.
  • (3) The fundus imaging device according to (1) or (2) above, wherein the second optical section includes: a second optical element consisting of either a lens or a hologram optical element; and a second drive mechanism that controls the direction in which light is emitted by the second optical element.
  • (5) The fundus imaging device according to any one of (1) to (4) above, further comprising a control unit that controls the first optical unit, the second optical unit, and the imaging by the imaging unit based on an image of the eye to be examined.
  • (6) The fundus imaging device according to (5) above, further comprising an anterior eye observation unit for observing the anterior ocular segment of the subject's eye, wherein the control unit controls the first optical section and the second optical section based on the image acquired by observation by the anterior eye observation unit.
  • (7) The fundus imaging device according to (6) above, wherein the anterior eye observation unit includes: a first light source unit that irradiates the anterior segment with light having a wavelength in the visible light range; and a second light source unit that irradiates the anterior segment with light having a wavelength outside the visible light range.
  • (8) The fundus imaging device according to (7) above, wherein, when the visual field direction is changed by the first optical section and when the traveling direction of the light is corrected by the second optical section, the anterior eye observation unit irradiates light from the second light source unit and does not irradiate light from the first light source unit.
  • (9) The fundus imaging device according to (7) or (8) above, wherein the first light source unit and the second light source unit have concentric shapes sharing a mutual center.
  • (10) The fundus imaging device according to any one of (7) to (9) above, wherein the anterior eye observation unit further includes a third light source unit that irradiates the anterior segment with light of a specific wavelength for performing fluorescence observation.
  • (11) The fundus imaging device according to any one of (5) to (10) above, wherein the control unit controls the imaging unit, the first optical unit, and the second optical unit based on the image captured by the imaging unit in response to a user operation.
  • (12) The fundus imaging device according to (11) above, wherein the control unit controls the first optical unit and the second optical unit based on the position of the pupil of the eye to be examined in the image captured by the imaging unit.
  • (13) The fundus imaging device according to (11) or (12) above, wherein the control unit controls the first optical unit and the second optical unit based on the shape of the pupil of the eye to be examined in the image captured by the imaging unit.
  • (14) The fundus imaging device according to any one of (5) to (13) above, further comprising a fourth optical section that changes the distance of the optical path from the imaging section to the eye to be examined, wherein the control unit controls the fourth optical section based on the size of the eye to be examined in the image.
  • (16) A fundus imaging method including a control step of controlling, in accordance with a user operation: a first optical section that is provided between an eye to be examined and an imaging section that images the fundus of the eye to be examined, and that changes a visual field direction of the imaging section with respect to the eye to be examined; a second optical section that is provided between the first optical section and the imaging section and corrects the traveling direction of light that is emitted from the fundus and reaches the imaging section via the first optical section; and imaging by the imaging section.
  • (17) The fundus imaging method according to (16) above, wherein the control step controls the imaging unit, the first optical unit, and the second optical unit based on the image captured by the imaging unit in response to the user operation.
  • (20) The fundus imaging method according to any one of (17) to (19) above, wherein the fundus imaging device further comprises a fourth optical section that changes the distance of the optical path from the imaging section to the eye to be examined, and the control step further controls the fourth optical section based on the size of the eye to be examined in the image.


Abstract

A fundus-imaging device according to the present disclosure comprises: a first optical unit (2) provided between an eye to be examined and an imaging unit for imaging the fundus of the eye to be examined, the first optical unit (2) changing the field-of-view direction of the imaging unit with respect to the eye to be examined; and a second optical unit (4) provided between the first optical unit and the imaging unit, the second optical unit (4) correcting the direction of travel of light emitted from the fundus and beamed into the imaging unit via the first optical unit.

Description

Fundus imaging device and fundus imaging method
The present disclosure relates to a fundus imaging device and a fundus imaging method.
Fundus imaging devices that photograph the fundus of the eye through the pupil are known. Fundus imaging devices are broadly classified into stationary devices and handheld devices. In a stationary device, the fundus imaging device is installed on a table or the like, the subject's head is fixed relative to the device, and fundus imaging is performed. In a hand-held device, a person holds the fundus imaging device and, for example, brings it into contact with the subject's head (eyeball area) to photograph the fundus.
In conventional fundus imaging devices, the optical axis from the tip of the photographic lens to the imaging surface is fixed. Therefore, in order to set an appropriate image frame on the imaging surface side, the photographer needs to move the entire optical system of the fundus imaging device itself in six axial directions. This image-frame setting work is called alignment adjustment. The alignment adjustment procedure differs between a stationary fundus imaging device and a handheld fundus imaging device.
An alignment adjustment method for a stationary fundus imaging device is, for example, as follows.
(1) While viewing an image of the anterior segment on the fundus imaging device installed on a table or the like, the photographer instructs the subject to look straight ahead along the imaging optical axis, fixing the subject's line of sight.
(2) The photographer uses a controller or the like to control the alignment adjustment mechanism built into the fundus imaging device, moving the entire optical system with respect to the imaging optical axis direction and the three axes in the plane perpendicular to the imaging optical axis.
An alignment adjustment method for a hand-held fundus imaging device is, for example, as follows.
(1) While viewing an image of the anterior segment on the fundus imaging device held in the photographer's own hand, the photographer instructs the subject to look approximately straight ahead, fixing the line of sight.
(2) The photographer moves the fundus imaging device so that the subject's line of sight overlaps the imaging optical axis, and holds the device by hand in that state.
Japanese Patent Application Publication No. 2008-000155
Japanese Translation of PCT International Application, Publication No. 2020-507386
A conventional stationary fundus imaging device requires the subject's head to be fixed to the device and the entire optical system of the device itself to be moved in six axial directions, which makes the device large. In addition, a stationary fundus imaging device requires the photographer's skill to properly fix the subject's line of sight, so a person with appropriate imaging skills must be present at the time of photography.
On the other hand, a handheld fundus imaging device achieves miniaturization by having a structure consisting only of an optical system, without a built-in alignment adjustment mechanism. However, since the photographer must carry out the alignment adjustment while holding the fundus imaging device by hand, the alignment adjustment work becomes more complicated. Therefore, like a stationary device, a handheld fundus imaging device also requires the presence of a person with appropriate skills at the time of imaging.
An object of the present disclosure is to provide a fundus imaging device and a fundus imaging method that can be miniaturized and whose alignment is easy to adjust.
A fundus imaging device according to the present disclosure includes: a first optical section that is provided between an eye to be examined and an imaging section that images the fundus of the eye to be examined, and that changes the visual field direction of the imaging section with respect to the eye to be examined; and a second optical section that is provided between the first optical section and the imaging section, and that corrects the traveling direction of light emitted from the fundus and irradiated onto the imaging section via the first optical section.
The attached drawings are as follows:
A schematic diagram schematically showing the configuration of a fundus imaging device according to an existing technique.
A schematic diagram schematically showing the configuration of a fundus imaging device according to an existing technique.
A schematic diagram schematically showing the configuration of the fundus imaging device according to the present disclosure.
A schematic diagram schematically showing the configuration of the fundus imaging device according to the present disclosure.
A schematic diagram showing the configuration of an example of the fundus imaging device according to the first embodiment.
A schematic diagram showing a configuration example of an anterior eye observation unit applicable to the first embodiment.
A schematic diagram showing an example of the configuration of an annular light source applicable to the first embodiment.
A functional block diagram of an example for explaining the functions of the fundus imaging device according to the first embodiment.
A flowchart of an example for explaining the overall flow of fundus imaging by the fundus imaging device according to the first embodiment.
A flowchart of an example showing the alignment adjustment processing according to the first embodiment in more detail.
A flowchart of an example showing the focus adjustment processing according to the first embodiment in more detail.
A flowchart of an example showing the photographing processing according to the first embodiment in more detail.
A schematic diagram for explaining an overview of visual field deflection according to the first embodiment.
A schematic diagram for explaining a method of specifying the position of the pupil, applicable to the first embodiment.
A schematic diagram for explaining an overview of optical axis adjustment according to the first embodiment.
A schematic diagram showing the configuration of an example of a fundus imaging device according to a first modification of the first embodiment.
A schematic diagram showing the configuration of an example of a fundus imaging device according to a second modification of the first embodiment.
A schematic diagram showing the configuration of an example of an optical element according to the second modification of the first embodiment.
A schematic diagram for explaining a usage pattern of a fundus imaging device according to a second embodiment.
A functional block diagram of an example for explaining the functions of the fundus imaging device according to the second embodiment.
A schematic diagram showing a configuration example of an indicator using visual information, applicable to the second embodiment.
A schematic diagram for explaining a usage pattern of a fundus imaging device according to a third embodiment.
A functional block diagram of an example for explaining the functions of the fundus imaging device according to the third embodiment.
A functional block diagram of an example for explaining the functions of the fundus imaging device according to the third embodiment.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. Note that, in the following embodiments, the same parts are given the same reference numerals, and redundant explanations are omitted.
Hereinafter, the embodiments of the present disclosure will be described in the following order.
1. Overview of the technology according to the present disclosure
 1-1. Regarding existing technology
 1-2. Technology according to the present disclosure
2. First embodiment
 2-1-1. Configuration according to the first embodiment
 2-1-2. Outline of processing according to the first embodiment
 2-1-3. Details of the alignment adjustment processing according to the first embodiment
 2-2. First modification of the first embodiment
 2-3. Second modification of the first embodiment
3. Second embodiment
4. Third embodiment
(1. Overview of the technology according to the present disclosure)
The present disclosure relates to a fundus imaging device for photographing the fundus of a human eye. The fundus imaging device according to the present disclosure can be applied to medical uses. It is not limited thereto; the device can also be applied to security uses based on fundus images.
(1-1. Existing technology)
Prior to describing the present disclosure, existing technology related to the technology of the present disclosure will be described to facilitate understanding.
Fundus imaging devices are broadly classified into stationary devices, in which the subject's head is fixed relative to a device installed on a table or the like and the fundus is photographed, and hand-held devices, in which a person holds the device, brings it into contact with the subject's head (around the eye), and photographs the fundus.
In conventional fundus imaging devices, the optical axis from the tip of the imaging lens to the imaging surface is fixed. Therefore, to set an appropriate image frame on the imaging-surface side, the photographer must move the entire optical system of the device along six axes. This image-frame setting work is called alignment. The alignment procedure differs between stationary and hand-held fundus imaging devices.
An example alignment method for a stationary fundus imaging device is as follows.
(1) While viewing an image of the anterior segment of the eye on the fundus imaging device installed on a table or the like, the photographer instructs the subject to look straight ahead along the imaging optical axis, thereby fixing the subject's line of sight.
(2) Using a controller or the like, the photographer controls the alignment mechanism built into the fundus imaging device and moves the entire optical system along three axes: the direction of the imaging optical axis and the two directions in the plane perpendicular to it.
An example alignment method for a hand-held fundus imaging device is as follows.
(1) While viewing an image of the anterior segment of the eye on the fundus imaging device held in his or her hand, the photographer instructs the subject to look roughly straight ahead, thereby fixing the subject's line of sight.
(2) The photographer moves the fundus imaging device so that the subject's line of sight coincides with the imaging optical axis, and then keeps holding the device by hand in that position.
A conventional stationary fundus imaging device requires the subject's head to be fixed to the device and the entire optical system to be movable along six axes, which makes the device large. In addition, because proper fixation of the subject's line of sight requires the photographer's skill, a person with appropriate imaging skills must be present at the time of imaging.
A hand-held fundus imaging device, on the other hand, achieves miniaturization by omitting a built-in alignment mechanism and containing only the optical system. However, because the photographer must perform alignment while holding the device by hand, the alignment work becomes more cumbersome. Therefore, like a stationary device, a hand-held fundus imaging device also requires the presence of a person with appropriate skills at the time of imaging.
One reason a skilled person must be present at imaging time is the low degree of freedom of the alignment mechanism on the device side. If the alignment mechanism inside the device had sufficient degrees of freedom, the photographer could rely on remote-operation adjustment or an automatic adjustment mechanism on the device side. However, increasing the adjustment freedom of the alignment mechanism in a conventional fundus imaging device would be expected to make the device even larger, potentially limiting the facilities into which it can be introduced.
In response, structures of fundus imaging devices that can increase the degree of freedom of the alignment mechanism have been proposed. For example, Patent Document 1 describes a fundus imaging device in which a concave mirror with an adjustable angle is placed on the optical path to increase the freedom of the alignment mechanism. In Patent Document 1, the device body is fixed to the face and the position of the entrance pupil is adjusted by the concave mirror, so alignment adjustment can be performed while the device body remains fixed. However, to achieve alignment with this method, the subject must direct his or her line of sight to the intersection of the rotation axes of the concave mirror. Therefore, accurate alignment adjustment is difficult when the subject's visual acuity is impaired or the subject's line of sight cannot be properly guided for some reason.
Methods that eliminate the need for a person with appropriate skills to be present at imaging time have also been proposed, for example in Patent Document 2 and Non-Patent Document 1.
Patent Document 2 describes a compact fundus imaging device that uses a head-mounted-display-type housing, houses the imaging optical system together with its drive and control systems inside the housing, and has an automatic alignment adjustment function. According to Patent Document 2, it is possible to photograph the fundus without a photographer with appropriate skills being present. However, because Patent Document 2 adopts the conventional stationary alignment method of moving the imaging optical system itself, there is a trade-off between device miniaturization and the optical axis adjustment range. Moreover, Patent Document 2 also assumes that line-of-sight fixation is achieved either by the subject fixating on a target or by voice instructions given to the subject from outside, so problems similar to those of Patent Document 1 can occur.
Non-Patent Document 1 describes a method in which the subject performs alignment while viewing an index projected onto his or her own retina. Like Patent Document 2, this method can eliminate the need for a photographer with specialized skills at imaging time. However, the alignment of Non-Patent Document 1 is not effective for patients with visual acuity abnormalities.
(1-2. Technology according to the present disclosure)
The technology according to the present disclosure will be outlined. In the following, from the viewpoint of increasing the degree of freedom of the alignment mechanism, the technology according to the present disclosure will be described taking the configuration described in Patent Document 1 as an example of the existing technology.
FIGS. 1A and 1B are schematic diagrams schematically showing the configuration of a fundus imaging device according to the existing technology. In fundus photography, the pupil is generally captured on the imaging optical axis extending perpendicularly from the imaging center of the image sensor, and the interior and fundus of the subject's eye are photographed through the pupil, for example under magnification.
Note that FIGS. 1A and 1B described below, as well as FIGS. 2A and 2B described later, show only the parts of the fundus imaging device necessary for the explanation, with other optical elements and the like omitted.
Section (a) of FIG. 1A shows an example in which fundus photography is performed properly with the configuration of the existing technology. In this example, a reflecting mirror 520 rotatable about a rotation axis 521 is provided on the imaging optical axis 512 of an image sensor 530. The reflecting mirror 520 is arranged so that, when the subject directs his or her line of sight in the horizontal direction 503 in the figure, that line of sight coincides with the rotation axis 521. Light emitted from the fundus 502 of the subject's eye 500 through the pupil 501 travels along an optical path 510a to the reflecting mirror 520, is reflected there, and travels along an optical path 511a to the image sensor 530, where it is captured.
In the example of section (a) of FIG. 1A, the subject's line of sight is in the horizontal direction 503 in the figure and is directed toward the center of the reflecting mirror 520 (the rotation axis 521). Therefore, by adjusting the angle of the reflecting mirror 520, the optical path 511a can be made to coincide with the imaging optical axis 512 at the imaging center of the image sensor 530.
Section (b) of FIG. 1A shows an example of the image captured by the image sensor 530 in the state of section (a) of FIG. 1A. The captured image 540 contains, approximately at its center, a subject-eye image 550 that includes a pupil image 551 and a fundus image 552. The position of the pupil image 551 substantially coincides with the position of the imaging optical axis 512 on the image sensor 530.
Section (a) of FIG. 1B shows an example in which fundus photography is not performed properly with the configuration of the existing technology. In section (a) of FIG. 1B, the subject's line of sight is inclined with respect to the horizontal direction 503 and, unlike the configuration of section (a) of FIG. 1A, is directed to a point offset from the center of the reflecting mirror 520 (the rotation axis 521). That is, while in section (a) of FIG. 1A the optical path 510a of the light emitted from the fundus 502 through the pupil 501 is approximately horizontal in the figure, in section (a) of FIG. 1B the corresponding optical path 510b points diagonally up and to the right. It is also assumed that the reflecting mirror 520 initially has the inclination 520a shown by the dash-dot line in the figure.
In the example of section (a) of FIG. 1B, the light emitted from the fundus 502 of the subject's eye 500 through the pupil 501 travels along the optical path 510b to the reflecting mirror 520, is reflected, and exits in the direction of an optical path 511b. In this state, the light reflected by the mirror 520 does not reach the image sensor 530, and the captured image 540 therefore contains no image of the fundus 502.
In this case, by adjusting the inclination 520a of the reflecting mirror 520, the light reflected by the mirror 520 can be directed onto the image sensor 530. In the example of section (b) of FIG. 1B, changing the inclination of the mirror 520 by an angle θ to the inclination 520b brings the optical path 511c of the reflected light closer to the imaging optical axis 512. As a result, the reflected light from the mirror 520 reaches the image sensor 530.
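The geometry behind this adjustment is the law of reflection: tilting the mirror by an angle θ rotates the reflected ray by 2θ. The following is a minimal sketch of that relationship, using the vector reflection formula r = d − 2(d·n)n; the 45° reference orientation and 5° tilt are illustrative values, not parameters from the disclosure.

```python
import math

def reflect(d, n):
    """Reflect direction vector d off a mirror with unit normal n: r = d - 2(d.n)n."""
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

def mirror_normal(tilt_deg):
    """Unit normal of a mirror whose normal makes tilt_deg with the x axis."""
    a = math.radians(tilt_deg)
    return (math.cos(a), math.sin(a))

# Incoming ray travelling in the +x direction (toward the mirror).
d = (1.0, 0.0)

r1 = reflect(d, mirror_normal(45.0))        # reference mirror orientation
r2 = reflect(d, mirror_normal(45.0 + 5.0))  # mirror tilted by a further 5 degrees

ang1 = math.degrees(math.atan2(r1[1], r1[0]))
ang2 = math.degrees(math.atan2(r2[1], r2[0]))

# Tilting the mirror by 5 degrees rotates the reflected ray by 10 degrees.
print(round(abs(ang2 - ang1), 6))  # -> 10.0
```

This doubling is why a small mirror adjustment θ produces a comparatively large displacement of the optical path 511c at the sensor, and why a second corrective element (described below as element 560) is useful for the final fine adjustment onto the imaging optical axis.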
Even in the case shown in section (a) of FIG. 1B, however, it is difficult to make the optical path 511c coincide with the imaging optical axis 512. Therefore, even under magnification on the imaging optical axis 512, it is difficult to obtain a proper fundus image 552 such as that shown in section (b) of FIG. 1A. Section (b) of FIG. 1B shows an example of the image captured by the image sensor 530 in the state of section (a) of FIG. 1B. In the captured image 540, the positions of the pupil image 551 and the fundus image 552 are shifted from the position of the imaging optical axis 512.
To solve the above problems of the existing technology, the present disclosure introduces one or more optical axis adjustment elements on the optical axis of the fundus imaging device, enabling alignment adjustment on the device side even when the subject cannot fixate at an appropriate position.
FIGS. 2A and 2B are schematic diagrams schematically showing the configuration of a fundus imaging device according to the present disclosure. As shown in section (a) of each of FIGS. 2A and 2B, the present disclosure adds, to the configuration described with reference to FIGS. 1A and 1B, an optical axis adjustment element 560 (second optical section) between the reflecting mirror 520 (first optical section) and the image sensor 530 for adjusting the optical axis.
In the examples of FIGS. 2A and 2B, the optical axis adjustment element 560 is shown as a concave lens. Such a change of the optical path by the optical axis adjustment element 560 according to the present disclosure can be realized by applying, for example, techniques used in optical image stabilization.
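As a rough model of this image-stabilization-style correction, laterally decentering a lens of focal length f by δ deflects a beam passing through it by approximately atan(δ/f), so the shift needed to cancel a residual beam tilt can be estimated as below. This is a sketch under a thin-lens assumption; the focal length and angle are illustrative numbers, not values from the disclosure.

```python
import math

def required_shift(f_mm, beam_angle_deg):
    """Thin-lens decentration model: the lateral lens shift (mm) that
    deflects a beam arriving at beam_angle_deg back parallel to the
    imaging optical axis (deflection ~ atan(shift / f))."""
    return f_mm * math.tan(math.radians(beam_angle_deg))

# Example: a residual 2-degree tilt after the mirror, corrected by a
# corrector lens with |f| = 30 mm (both numbers hypothetical).
delta = required_shift(30.0, 2.0)
print(round(delta, 3))  # -> 1.048 (mm of lens decenter)
```

The practical point of the model is that millimeter-scale lens movement suffices to correct degree-scale beam tilts, which is why an image-stabilization-type actuator is a plausible mechanism for element 560.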
In the example of section (a) of FIG. 2A, as in section (a) of FIG. 1B described above, the subject's line of sight is inclined with respect to the horizontal direction 503, and the reflecting mirror is set to the inclination 520b in accordance with this line-of-sight direction. Light from the fundus 502 travels along the optical path 510b to the reflecting mirror 520, and the reflected light exits in the direction of the optical path 511c corresponding to the inclination 520b.
In this case, the optical axis adjustment element 560 changes the optical path 511c of the light exiting the reflecting mirror 520 toward the image sensor 530 into an optical path 513 that coincides with the imaging optical axis 512. In this way, the optical path 511c shown in section (a) of FIG. 1B, that of the light emitted from the fundus 502 through the pupil 501 at an angle to the horizontal direction 503 and reflected by the mirror 520, can be changed into the optical path 513 coinciding with the imaging optical axis 512.
Section (b) of FIG. 2A shows an example of the image captured by the image sensor 530 in the state of section (a) of FIG. 2A. The captured image 540 contains, approximately at its center, a subject-eye image 550 that includes a pupil image 551 and a fundus image 552. The position of the pupil image 551 substantially coincides with the position of the imaging optical axis 512 on the image sensor 530.
The technology according to the present disclosure can also be applied to photographing the peripheral fundus, the region surrounding the fundus 502. An example of photographing the peripheral fundus 502' using the technology according to the present disclosure will be outlined with reference to FIG. 2B. In this case, the optical axis adjustment element 560 is first adjusted as described with reference to section (a) of FIG. 2A so that the optical path 513 of the light exiting the element 560 coincides with the imaging optical axis 512. The angle of the reflecting mirror 520 is then changed as appropriate; in the illustrated example, from the inclination 520b to an inclination 520d. This makes it possible to enlarge the area to be imaged.
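A back-of-the-envelope illustration of this enlargement, with hypothetical numbers not taken from the disclosure: since a mirror tilt of t redirects the line of sight by 2t, scanning the tilt over ±5° around the aligned position widens a 45° base field of view to about 65°.

```python
# Illustrative arithmetic only: a mirror tilt of t deflects the line of
# sight by 2*t, so a +/- tilt scan extends the covered field on both
# sides of the base field of view. All values are hypothetical.
base_fov_deg = 45.0    # assumed base field of view of the optics
tilt_scan_deg = 5.0    # assumed mirror tilt scan range (+/-)

covered_fov_deg = base_fov_deg + 2 * (2 * tilt_scan_deg)
print(covered_fov_deg)  # -> 65.0
```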
Section (b) of FIG. 2B shows an example of the image captured by the image sensor 530 in the state of section (a) of FIG. 2B. The captured image 540 contains, approximately at its center, a subject-eye image 550 that includes a pupil image 551 and a fundus image 552, and the position of the pupil image 551 substantially coincides with the position of the imaging optical axis 512 on the image sensor 530. Furthermore, a peripheral fundus image 553 is captured over a wider range than the fundus image 552 shown in section (b) of FIG. 2A.
As described above, according to the technology of the present disclosure, using the optical axis adjustment element 560 in addition to changing the angle of the reflecting mirror 520 increases the degree of freedom of alignment adjustment in fundus photography. This in turn makes it possible to dispense with the presence of a person with appropriate imaging skills at the time of imaging.
(2. First embodiment)
A first embodiment of the present disclosure will be described.
(2-1-1. Configuration according to the first embodiment)
FIG. 3 is a schematic diagram showing an example configuration of the fundus imaging device according to the first embodiment. In FIG. 3, the fundus imaging device 100 includes an optical section B1, an imaging processing section B2, and an information processing section B3.
In the example of FIG. 3, the optical section B1 and the imaging processing section B2 are shown as being provided in the same housing, but this is not limiting; they may be provided in different housings. Furthermore, the information processing section B3 may also be provided in the same housing as the optical section B1 and the imaging processing section B2.
The optical section B1 is provided with a mounting portion 1 that is brought into contact with the subject's eye S, the eyeball of the subject 200. The mounting portion 1 allows the distance between the optical section B1 and the eye S to be changed by a drive mechanism 24. For example, the drive mechanism 24 extends and retracts the mounting portion 1 along the optical axis to adjust the distance between the eye S and the device as a whole in the optical axis direction. The relative position of the optical section B1 with respect to the eye S is fixed by the mounting portion 1. Light obtained from the eye S passes through a device interface section D into the interior of the optical section B1. The device interface section D may be an open aperture, or an optical element such as a protective glass, a lens, or a filter may be arranged there.
The optical section B1 includes optical elements 2 to 6, a device control section 23, and an annular light source 41. Note that the optical elements 3 and 6 may be omitted. The optical element 2 is, for example, a concave mirror, and is provided so that its inclination (angle) can be changed by a drive mechanism 25 with respect to a predetermined point. For example, the inclination of the optical element 2 is changed about its center as the reference point. By adjusting the inclination of the optical element 2 with the drive mechanism 25, the pupil image and fundus image of the eye S can be made to fall within a certain range of the image frame of the image sensor 7, described later.
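One way such a drive-mechanism adjustment could be automated is a simple closed loop that measures the pupil-image offset from the frame center and commands a mirror tilt to reduce it. The sketch below is a hypothetical illustration, not the disclosure's control scheme: `detect()` and `tilt()` stand in for unspecified pupil-detection and drive-mechanism interfaces, and the proportional response of the optics to a tilt command is an assumed toy model.

```python
def align(detect_pupil_offset, apply_tilt, gain=0.5, tol=1.0, max_iter=50):
    """Proportional alignment loop: returns True once the pupil image
    lies within tol pixels of the frame center, False if it never
    converges within max_iter iterations."""
    for _ in range(max_iter):
        dx, dy = detect_pupil_offset()
        if dx * dx + dy * dy <= tol * tol:
            return True
        apply_tilt(-gain * dx, -gain * dy)  # tilt command opposes the offset
    return False

# Toy stand-in for the optics: each tilt command shifts the pupil
# image on the sensor by the commanded amount (in pixels).
offset = [120.0, -45.0]  # initial pupil offset from frame center

def detect():
    return offset[0], offset[1]

def tilt(tx, ty):
    offset[0] += tx
    offset[1] += ty

ok = align(detect, tilt)
print(ok)  # -> True
```

With the proportional model, each iteration halves the offset, so the loop converges in a handful of steps; a real implementation would also have to account for the 2θ reflection gain and actuator limits of the drive mechanism 25.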
The optical element 2 is not limited to a concave mirror and may be a plane mirror, a lens, or a hologram optical element. By rotating about its rotation axis, the optical element 2 changes the direction of the field of view, through the optical element 2, with respect to the subject 200.
That is, the optical element 2 is provided between the eye S and the image sensor 7, the imaging unit that images the fundus of the eye S, and functions as a first optical section that changes the viewing direction of the imaging unit with respect to the eye S.
Here, the field of view refers to the view seen from the image sensor 7 in the direction of the imaging optical axis. If the direction of the optical axis is changed, for example by the optical element 2, the view in the redirected direction is taken as the field of view of the image sensor 7.
The optical element 4 corresponds to the optical axis adjustment element 560 described with reference to FIGS. 2A and 2B, and is made movable by a drive mechanism 26. The drive mechanism 26 controls the optical element 4 so that the optical path 50a from the pupil to the fundus of the eye S, which is redirected by the optical element 2 into an optical path 50b, becomes an optical path 50c that coincides with the imaging optical axis of the image sensor 7, described later.
In the example of FIG. 3, the optical element 4 is shown as a concave lens and is movable by the drive mechanism 26 in the direction perpendicular to the optical axis (the horizontal direction in the figure). Known optical image stabilization techniques can be applied to the optical element 4. The optical element 4 corrects the traveling direction of light that exits the eye S and enters through the optical element 2, the first optical section.
That is, the optical element 4 is provided between the first optical section (the optical element 2) and the imaging unit (the image sensor 7), and functions as a second optical section that corrects the traveling direction of the light emitted from the fundus and directed to the imaging unit through the first optical section.
The optical element 5 is driven by a drive mechanism 27 to adjust the focus of the image formed on the image sensor 7. A known autofocus mechanism can be applied to the optical element 5.
The alignment adjustment mechanism that performs alignment adjustment according to the first embodiment comprises the drive mechanism 24 that adjusts the mounting portion 1, the optical element 2 and the drive mechanism 25, the optical element 4 and the drive mechanism 26, and the optical element 5 and the drive mechanism 27.
Although the drive mechanism 24 acting on the mounting portion 1 and the drive mechanism 25 acting on the optical element 2 have been described above as separate components, this is not limiting. For example, the operation of the drive mechanism 24 may be merged into that of the drive mechanism 25 so that the optical element 2 is moved both in tilt and in the depth direction with respect to the eye S.
In the example of FIG. 3, a concave mirror is used as the optical element 2 for changing the field of view; however, by adding an optical element 3 between the optical elements 2 and 4, a plane mirror can also be used as the optical element 2. For example, a convex lens can be applied as the optical element 3.
Note that the optical elements 2, 4, and 5 described above constitute the minimum optical element configuration required for fundus imaging according to the present disclosure, and neither the configuration nor the arrangement of the optical elements is limited thereto. For example, the optical element 3 may be added as described above, and an optical element 6 may be added as well. The positions of the optical elements 3 and 6, for example, may also be exchanged. When the optical elements 3 to 6 are configured as lenses, one or more of them need not be solid lenses; liquid lenses or hologram optical elements may also be used. Using liquid lenses or hologram optical elements instead of solid lenses enables further miniaturization.
In the optical section B1, the device control section 23 includes a processor such as a CPU (Central Processing Unit) or MPU (Micro Processing Unit), a ROM (Read Only Memory) storing programs and the like, a RAM (Random Access Memory) used as the processor's work memory, and an interface to the outside. The device control section 23 operates according to a program stored in, for example, the ROM, using the RAM as work memory, and controls the overall operation of the fundus imaging device 100. For example, the device control section 23 can control the drive mechanisms 25 to 27 described above and the annular light source 41 via the interface. The device control section 23 may also control the operations of the image sensor 7 and a development processing section 8, described later.
 The UI (User Interface) 21 and the port 22 included in the optical section B1 will now be described. The UI 21 can include a user operation section that accepts user operations and a display section such as an indicator. The user operation section can include buttons, toggle switches, a keyboard, and pointing devices such as a mouse or a touch panel. The user operation section is not limited to these and may include other input means (such as voice input). The indicator may simply indicate an on/off state, or it may be a display device such as an LCD (Liquid Crystal Display). The device control section 23 can also control the operation of the fundus imaging device 100 in response to user operations on the UI 21 provided in the optical section B1.
 The control communication line 11, for example, is connected to the port 22. The device control section 23 can communicate with, for example, the information processing section B3 via the port 22 and the control communication line 11. Although the port 22 and the information processing section B3 are described here as being connected by wire through the control communication line 11, this is not limiting; the port 22 and the information processing section B3 may instead be connected by wireless communication.
 In FIG. 3, the imaging processing section B2 includes an image sensor 7, a development processing section 8, and the device control section 23. The image sensor 7 includes, for example, a pixel array in which pixels that output electrical signals according to the received light are arranged in a matrix. The incident light that has passed through the optical elements 2 to 5 described above forms an image on the image sensor 7, which outputs a signal (image signal) corresponding to the imaged light.
 The image sensor 7 is driven by a drive circuit (not shown) and can capture moving images at a predetermined frame rate. The image sensor 7 can also capture a still image in response to a trigger given while a moving image is being captured. Furthermore, the image sensor 7 can capture still images alone, without capturing moving images.
 The development processing section 8 performs predetermined image processing on the moving-image or still-image signal output from the image sensor 7. For example, the development processing section 8 performs AD (Analog to Digital) conversion on the image signal output as an analog signal from the image sensor 7, converting it into image data, that is, a digital image signal. The development processing section 8 may further perform image processing on the image data, such as development processing including demosaic processing, noise removal processing, and white balance adjustment processing.
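The processing order described here (AD conversion followed by development steps) can be sketched as follows; the function names, bit depth, and unity gains are assumptions chosen for the example, and real demosaicing and noise removal are far more involved than this outline suggests.

```python
import numpy as np

def ad_convert(analog_signal, bits=8):
    # AD conversion: quantize a normalized analog signal (0.0..1.0)
    # into digital levels (0 .. 2**bits - 1).
    levels = 2 ** bits - 1
    return np.clip(np.round(analog_signal * levels), 0, levels).astype(np.uint16)

def white_balance(image_data, gains):
    # White balance adjustment: per-channel gain on an H x W x 3 array.
    return np.clip(image_data * np.asarray(gains), 0, 255)

def develop(analog_frame):
    # Simplified stand-in for the development processing section 8:
    # demosaicing and noise removal would run between these two steps.
    digital = ad_convert(analog_frame)
    return white_balance(digital, gains=[1.0, 1.0, 1.0])
```

The point of the sketch is only the ordering: digitization first, then the development-stage corrections applied to the digital image data.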
 The image data processed by the development processing section 8 is output from a port 9 and transmitted to the information processing section B3 via, for example, an image communication line 10 connected to the port 9. Although the port 9 is described here as being connected by wire through the image communication line 10, this is not limiting; the port 9 and the information processing section B3 may instead be connected by wireless communication. Furthermore, although FIG. 3 shows the port 9 for transmitting image data and the port 22 for transmitting control signals and the like as separate ports, this is not limiting; the port 9 and the port 22 may be combined into a single port.
 The information processing section B3 includes a processor such as a CPU and memories such as a ROM and a RAM. The information processing section B3 can further include a storage device using a nonvolatile storage medium such as a hard disk drive or flash memory. The information processing section B3 can also include an input device (keyboard, touch panel, etc.) that accepts user input and a display device that displays information.
 The information processing section B3 generates a device control signal for alignment adjustment in the fundus imaging device 100, based on image data derived from the output of the image sensor 7 and on information obtained from the anterior eye observation section described later. The information processing section B3 transmits the generated device control signal to the device control section 23 of the optical section B1 via the control communication line 11, for example as feedback information for the operation of the optical elements 2, 4, and 5.
(Anterior eye observation section)
 Although omitted from FIG. 3, the optical section B1 includes an anterior eye observation section for observing the anterior segment of the eye S under examination. When alignment adjustment is performed, the anterior eye observation section photographs or measures the anterior segment of the eye S, making it possible to observe its state. The information obtained by the anterior eye observation section is sent to the device control section 23 and fed back to the alignment adjustment mechanism.
 The anterior eye observation section may include an imaging optical system as a measuring instrument for measuring the anterior segment. An imaging device (including a development processing section if necessary) may be used as the anterior eye observation section. The configuration of the anterior eye observation section is not limited to this example; other configurations may be used as long as the measurement objective is achieved. In addition to the imaging device, auxiliary devices for acquiring anterior segment information, such as a distance measurement sensor or a marker index projection section, may be added to the anterior eye observation section.
 FIG. 4 is a schematic diagram showing a configuration example of the anterior eye observation section applicable to the first embodiment. The optical section B1 includes at least one of anterior eye observation sections 28a, 28b, and 28c. In the following, the anterior eye observation sections 28a, 28b, and 28c may be collectively referred to as the anterior eye observation section 28.
 The anterior eye observation section 28a is an example in which the anterior eye observation section 28 is provided around the attachment section 1. Using the anterior eye observation section 28a makes it possible to observe the anterior segment of the eye S directly. In this case, the anterior eye observation section 28a photographs the anterior segment from an oblique direction, but by applying a projective transformation or the like to the captured image, the image can be converted into one viewed from the optical axis direction.
 The anterior eye observation section 28b enables observation of the anterior segment by providing a mirror 29 in the optical path of the light emitted from the optical element 2 and imaging the light reflected by the mirror 29. A half mirror that both transmits and reflects incident light may be used as the mirror 29. Alternatively, the mirror 29 may be a movable, totally reflecting mirror. In that case, the mirror 29 may be inserted obliquely into the optical path during anterior eye observation by the anterior eye observation section 28b, and positioned so as not to obstruct the optical path when anterior eye observation is not being performed. A trimming filter that transmits and reflects only light of specific wavelengths may also be used as the mirror 29.
 When a concave mirror or a plane mirror is used as the optical element 2, the anterior eye observation section 28c can be provided. The anterior eye observation section 28c is arranged beyond the optical element 2, on a line running from the attachment section 1 (the eye S under examination) through, for example, the center of the optical element 2. In this case, it is necessary to ensure that light other than the rays directed toward the image sensor 7, described later, also reaches the anterior eye observation section 28c, for example by using a half mirror as the optical element 2 or a trimming filter that transmits only specific wavelengths.
 If necessary, a measuring instrument (for example, a distance measurement sensor) added to the anterior eye observation section 28a, 28b, or 28c, or a projector 30 that projects an index (such as a marker) observable by the image sensor 7, may be installed. The index projected onto the anterior segment by the projector 30 can also be photographed and observed by the image sensor included in the anterior eye observation section 28a, 28b, or 28c. Projecting the index onto the anterior segment with the projector 30 can facilitate the adjustment processing (described later) performed using the anterior eye observation section 28a, 28b, or 28c. When the output of the anterior eye observation section 28a, 28b, or 28c is transmitted to an external device, it may be output via the development processing section 8 described later or via the port 22 used for control communication, or an additional output port may be provided.
 As described above, the optical section B1 includes the annular light source 41. The annular light source 41 is a light source configured in a ring shape so that its central portion transmits light. FIG. 5 is a schematic diagram showing an example of the configuration of the annular light source 41 applicable to the first embodiment. In FIG. 5, the annular light source 41 includes an adjustment light source section 41a (first light source section) that emits light at wavelengths outside the visible region (for example, infrared light), and a photographing light source section 41b (second light source section) that emits light at wavelengths in the visible region. In the example of FIG. 5, the adjustment light source section 41a and the photographing light source section 41b of the annular light source 41 form concentric circles sharing a common center, and the central portion 52 is a light transmission region.
 The adjustment light source section 41a is used, for example, during alignment adjustment and focus adjustment. The photographing light source section 41b is used when the fundus is actually photographed. Using light at wavelengths outside the visible region during adjustment makes it possible to prevent the pupil of the eye S from constricting due to glare or the like. Hereinafter, where appropriate, the light emitted by the adjustment light source section 41a is referred to as adjustment light, and the light emitted by the photographing light source section 41b is referred to as photographing light.
 Note that as illumination for the anterior eye observation section 28a, 28b, or 28c, an excitation light source for fluorescence observation or a light source with a spectrum optimized for focus adjustment by the optical element 5, for example, may be used.
 Note that when the anterior segment is photographed and observed by the image sensor 7 based on the index projected onto the anterior segment by the projector 30, it is preferable to adjust the optical elements 2 to 6 appropriately, for example. With the optical elements 2 to 6 appropriately adjusted, the image sensor 7 can photograph the anterior segment over an imaging range as wide as when the anterior eye observation section 28a, 28b, or 28c photographs the anterior segment based on the index. Furthermore, if the image sensor 7 can photograph both with the adjustment light and with the photographing light, the adjustment processing performed using the anterior eye observation section 28a, 28b, or 28c can also be realized by photographing with the image sensor 7.
 FIG. 6 is a functional block diagram of an example for explaining the functions of the fundus imaging device 100 according to the first embodiment.
 In FIG. 6, the optical section B1 includes a device control section 300, mechanisms 302, a UI (User Interface) 21, an anterior eye observation section 28, and an annular light source 41. The device control section 300 corresponds to the device control section 23 described with reference to FIG. 3 and includes a CPU, a ROM, a RAM, and various interfaces.
 The mechanisms 302 include the drive mechanisms 24 to 27 described with reference to FIG. 3. The operation of the drive mechanisms 24 to 27 is controlled by control signals output from the device control section 300. The anterior eye observation section 28 includes at least one of the aforementioned anterior eye observation sections 28a, 28b, and 28c. The photographing operation of the imaging device in the anterior eye observation section 28 is controlled by a control signal output from the device control section 300.
 The photographing operation of the anterior eye observation section 28 controlled by the device control section 300 may include not only operations of the image sensor itself, such as exposure, but also operations for setting photographing-related parameters of the image sensor, such as exposure time (shutter speed) and gain adjustment for the image signal output from the image sensor. The device control section 300 may set these parameters using predetermined setting values, or using setting values corresponding to user operations on, for example, the UI 21 or the input section 353 described later.
 The anterior eye observation section 28 may further include the projector 30, whose operation is likewise controlled by control signals passed from the device control section 300. The annular light source 41 includes the adjustment light source section 41a and the photographing light source section 41b described above. Turning each of the adjustment light source section 41a and the photographing light source section 41b on and off is controlled by control signals output from the device control section 300. The device control section 300 may further control the light emission amount of each of the adjustment light source section 41a and the photographing light source section 41b. For example, the device control section 300 may control the emission amount in response to user operations on, for example, the UI 21 or the input section 353 described later.
 The UI 21 includes an input device for receiving operations on the fundus imaging device 100 by the subject 200. The UI 21 receives signals from input devices such as buttons, toggle switches, a keyboard, and pointing devices such as a mouse or a touch panel. The UI 21 passes a control signal corresponding to a user operation on the input device to the device control section 300. The device control section 300 may control the operation of each section of the fundus imaging device 100 in response to this control signal.
 The imaging processing section B2 includes the image sensor 7 and the development processing section 8. The photographing operation of the image sensor 7 is controlled by a control signal output from the device control section 300. The photographing operation of the image sensor 7 controlled by the device control section 300 may include not only operations of the image sensor itself, such as exposure, but also operations for setting photographing-related parameters of the image sensor, such as exposure time (shutter speed) and gain adjustment for the image signal output from the image sensor. The device control section 300 may set these parameters using predetermined setting values, or using setting values corresponding to user operations on, for example, the UI 21 or the input section 353 described later.
 The development processing section 8 is supplied with the image signal output from the image sensor 7. In accordance with a control signal output from the device control section 300, the development processing section 8 performs various kinds of signal processing, including development processing, on the supplied image signal.
 The development processing section 8 may adjust the parameters of the signal processing performed on the image signal according to the photographer's needs. For example, the device control section 300 sets the parameters for the signal processing that the development processing section 8 performs on the image signal in response to user operations on, for example, the UI 21 or the input section 353 described later, and passes the set parameters to the development processing section 8. The development processing section 8 performs the signal processing on the image signal supplied from the image sensor 7 according to the parameters passed from the device control section 300.
 In the example of FIG. 6, the imaging processing section B2 is provided with a port 9 for communicating with the outside. The port 9 shown in FIG. 6 may be one in which the port 9 and the port 22 in FIG. 3 are combined into a single port, as described above. The image data subjected to various kinds of signal processing in the development processing section 8 and the image data output from the anterior eye observation section 28 are transmitted from the port 9 to the information processing section B3 via the image communication line 10.
 The information processing section B3 includes a signal processing section 351, a display section 352, and an input section 353. The information processing section B3 is also provided with a port 350 for communicating with the outside.
 Each piece of image data output from the port 9 of the imaging processing section B2 is received at the port 350 via the image communication line 10 and supplied to the signal processing section 351. The signal processing section 351 performs predetermined processing on each piece of supplied image data. For example, the signal processing section 351 generates adjustment control signals for performing alignment adjustment and focus adjustment based on the image data output from the development processing section 8 or the anterior eye observation section 28. The generation of the adjustment control signals by the signal processing section 351 will be described later.
 The adjustment control signal generated by the signal processing section 351 is output from the port 350 and transmitted to the imaging processing section B2 via the image communication line 10. The adjustment control signal is received at the port 9 in the imaging processing section B2 and passed to the device control section 300 in the optical section B1. Based on this adjustment control signal, the device control section 300 controls, for example, the drive mechanisms 24 to 27 included in the mechanisms 302 to perform alignment adjustment and focus adjustment.
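One iteration of this feedback loop can be sketched as simple proportional control; the function name and the gain value are illustrative assumptions, since the disclosure does not specify a particular control law for generating the adjustment control signal.

```python
def alignment_feedback_step(pupil_center, optical_axis_center, gain=0.5):
    # Compare the observed pupil position (e.g. from the anterior eye
    # observation section) with the imaging optical axis, and emit an
    # adjustment command proportional to the remaining offset.
    dx = optical_axis_center[0] - pupil_center[0]
    dy = optical_axis_center[1] - pupil_center[1]
    return (gain * dx, gain * dy)  # command forwarded to the drive mechanisms
```

Repeating this step on successive frames drives the observed offset toward zero, which is the effect the adjustment control signal has on the drive mechanisms in the loop described above.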
 The display section 352 displays information on a display device such as a display or an indicator. The signal processing section 351 can, for example, cause the display section 352 to show an image based on the image data output from the anterior eye observation section 28 or the development processing section 8 on a display or the like. Alternatively, the display section 352 may display only operation information using an indicator or the like, and the image data itself may be transmitted to the outside.
 The input section 353 receives signals from input devices such as buttons, toggle switches, a keyboard, and pointing devices such as a mouse or a touch panel. The input section 353 passes a control signal corresponding to a user operation on the input device to the signal processing section 351. The signal processing section 351 can control the processing of the image data in response to this control signal.
(2-1-2. Outline of processing according to the first embodiment)
 Next, the processing in the fundus imaging device 100 according to the first embodiment will be outlined with reference to the flowcharts of FIGS. 7 to 10.
 FIG. 7 is an example flowchart for explaining the overall flow of fundus photography by the fundus imaging device 100 according to the first embodiment.
 In FIG. 7, in step S100, the area around the subject's eye is brought into contact with the attachment section 1, and the fundus imaging device 100 is thereby put on. At this time, the subject directs his or her gaze from the attachment section 1 toward the optical element 2 through the device interface section D. More specifically, it is preferable for the subject to direct his or her gaze toward the center of the optical element 2.
 Hereinafter, the subject looking from the attachment section 1 in the direction of the optical element 2 is referred to as "the subject looking straight ahead," and the direction of the optical element 2 as seen from the attachment section 1 is defined as the subject's forward direction.
 When the subject puts on the fundus imaging device 100 in step S100, the photographing operation by the anterior eye observation section 28 and the image sensor 7 starts, and image data, for example as a moving image, is output from the anterior eye observation section 28 and the image sensor 7.
 In the next step S101, the fundus imaging device 100 performs alignment adjustment processing based on the image data acquired from the anterior eye observation section 28 and the image sensor 7. That is, with the subject's eye simply held against the attachment section 1 as in step S100, the field of view of the image sensor 7 may not coincide with the direction in which the subject is facing. Therefore, in step S101, the fundus imaging device 100 uses the alignment adjustment processing to deflect the field of view of the image sensor 7 and to adjust the optical axis with respect to the image sensor 7. More specifically, the fundus imaging device 100 adjusts, for example, the tilt of the optical element 2 and the position of the optical element 4 so that the optical path 50c, along which the light from the eye S reaches the image sensor 7, is superimposed on the imaging optical axis of the image sensor 7.
 In step S101, the fundus imaging device 100 also appropriately controls the distance between the optical section B1 and the eye S, for example by means of the drive mechanism 24 of the attachment section 1, and adjusts the angle of view so that the eye S is photographed at an appropriate size.
 In the next step S102, the fundus imaging device 100 controls the drive mechanism 27 based on the image data acquired from the image sensor 7 and performs focus adjustment processing.
 In the next step S103, the fundus imaging device 100 determines whether to perform fundus photography based on the result of the alignment adjustment processing in step S101 and the result of the focus adjustment processing in step S102. If at least one of the alignment adjustment and the focus adjustment has produced an insufficient result, the fundus imaging device 100 determines that fundus photography is not to be performed (step S103, "No") and returns the processing to step S101.
 On the other hand, if both the alignment adjustment and the focus adjustment have produced sufficient results, the fundus imaging device 100 determines that fundus photography is to be performed (step S103, "Yes") and moves the processing to step S104. In step S104, the fundus imaging device 100 performs fundus photography of the eye S using the image sensor 7.
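The retry loop of steps S101 to S104 can be summarized in a short sketch; the three callables and the attempt limit are hypothetical stand-ins for the device operations, since FIG. 7 describes a flow rather than code.

```python
def fundus_capture_flow(adjust_alignment, adjust_focus, photograph,
                        max_attempts=10):
    # S101/S102: run alignment and focus adjustment; each callable returns
    # True when its result is sufficient.
    # S103: if either result is insufficient, return to S101 and retry.
    # S104: once both are sufficient, photograph the fundus.
    for _ in range(max_attempts):
        alignment_ok = adjust_alignment()
        focus_ok = adjust_focus()
        if alignment_ok and focus_ok:
            return photograph()
    raise RuntimeError("alignment/focus adjustment did not converge")
```

The `max_attempts` guard is an addition for the sketch; the flowchart itself loops until both adjustments succeed.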
 FIG. 8 is an example flowchart showing the alignment adjustment processing according to the first embodiment in more detail. The series of processes shown in FIG. 8 corresponds to the processing of step S101 in the flowchart of FIG. 7.
 In FIG. 8, in step S110, the device control section 300 turns on the adjustment light source section 41a of the annular light source 41. At this time, the photographing light source section 41b is not turned on. Thus, in the first embodiment, during alignment adjustment, the adjustment light source section 41a, which emits light at wavelengths outside the visible region, is turned on, while the photographing light source section 41b, which emits light at wavelengths in the visible region, is left off. This prevents the pupil of the eye S from constricting due to glare or the like.
 次のステップS111で、装置制御部300は、ステップS110で点灯された調整用光源部41aによる光量が適切か否かを判定する。換言すれば、ステップS111で、装置制御部300は、ステップS110で点灯された調整用光源部41aにより発光された光で、前眼観察部28において適切な明るさによる撮像が可能か否かを判定する。 In the next step S111, the device control unit 300 determines whether the amount of light from the adjustment light source unit 41a turned on in step S110 is appropriate. In other words, in step S111, the device control unit 300 determines whether imaging with appropriate brightness is possible in the anterior eye observation unit 28 using the light emitted by the adjustment light source unit 41a turned on in step S110. judge.
 For example, the device control unit 300 may make the determination in step S111 according to the degradation of the emission intensity of the adjustment light source unit 41a. As a more specific example, the device control unit 300 may accumulate and store the time for which the adjustment light source unit 41a has been lit, and determine that the light amount is no longer appropriate once the accumulated time reaches a certain value.
 The device control unit 300 may also make the determination in step S111 according to the sensitivity degradation of the image sensor of the anterior eye observation unit 28. As a more specific example, the device control unit 300 may accumulate and store the time for which the image sensor of the anterior eye observation unit 28 has been used, and determine that the light amount is no longer appropriate once the accumulated time reaches a certain value.
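The accumulated-time criteria above can be sketched as follows. This is only an illustrative sketch of the described logic, not the patent's implementation; the function name and both limit values are assumptions.

```python
# Hypothetical sketch of the step S111 adequacy check: the light amount is
# treated as inappropriate once either the accumulated on-time of the
# adjustment light source 41a or the accumulated use time of the
# anterior-eye image sensor reaches a preset limit. Both limits are assumed.
LED_LIMIT_HOURS = 10_000.0     # assumed service limit of the adjustment LED
SENSOR_LIMIT_HOURS = 20_000.0  # assumed service limit of the image sensor

def light_amount_appropriate(led_on_hours: float, sensor_use_hours: float) -> bool:
    """Return True while both accumulated times are below their limits."""
    return (led_on_hours < LED_LIMIT_HOURS
            and sensor_use_hours < SENSOR_LIMIT_HOURS)
```

In practice the accumulated times would be persisted across power cycles, since the degradation they stand in for is cumulative over the device's life.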
 Alternatively, the device control unit 300 may capture an image with the image sensor of the anterior eye observation unit 28, obtain the light amount from the adjustment light source unit 41a based on the captured image, and make the determination accordingly. A measuring instrument for measuring the light amount from the adjustment light source unit 41a may also be added, in which case the device control unit 300 makes the determination based on the light amount measured by this instrument.
 Furthermore, the device control unit 300 may make the determination in step S111 according to changes in the surrounding environment. For example, a measuring instrument that measures the environment outside the fundus imaging device 100 (for example, the amount of ambient light) may be added to the fundus imaging device 100, and the device control unit 300 may make the determination in step S111 based on the measurement result from this instrument. The external environment may also be measured separately.
 Furthermore, the device control unit 300 can also make the determination in step S111 according to a user operation on the UI 21 or the input unit 353.
 If the device control unit 300 determines in step S111 that the light amount is not appropriate (step S111, "No"), it moves the process to step S112.
 In step S112, the device control unit 300 adjusts the amount of light emitted by the adjustment light source unit 41a. For example, the device control unit 300 may estimate the attenuation of the light amount according to the accumulated lighting time of the adjustment light source unit 41a or the accumulated usage time of the image sensor of the anterior eye observation unit 28, and adjust the emitted light amount of the adjustment light source unit 41a based on this estimate. The device control unit 300 may also adjust the emitted light amount of the adjustment light source unit 41a based on an image captured by the image sensor of the anterior eye observation unit 28 or on a measurement of the ambient light amount. Furthermore, the device control unit 300 may adjust the emitted light amount of the adjustment light source unit 41a in response to a user operation on the UI 21 or the input unit 353.
 Not limited to this, in step S112 the device control unit 300 may adjust the shutter time and gain settings used in imaging by the image sensor of the anterior eye observation unit 28. The device control unit 300 may also adjust the shutter time and gain settings of the image sensor 7 in step S112, and may further adjust the gain setting of the development processing unit 8. For the adjustment in step S112, an algorithm used for automatic exposure adjustment in general cameras may be applied.
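As a rough illustration of the kind of automatic-exposure algorithm mentioned above, one iteration of a simple AE loop might scale the shutter time toward a target mean image level and convert any excess into gain once the shutter hits its limit. The target level, shutter limit, and function name are assumptions for illustration, not part of the patent.

```python
import math

MAX_SHUTTER_MS = 33.3  # assumed longest usable shutter time

def ae_step(mean_level, shutter_ms, gain_db, target=118.0):
    """One iteration of a simple AE loop: scale the shutter time so the
    mean image level approaches the target, then fall back to gain (in dB)
    when the required shutter exceeds the limit."""
    shutter_ms *= target / max(mean_level, 1e-6)
    if shutter_ms > MAX_SHUTTER_MS:
        gain_db += 20.0 * math.log10(shutter_ms / MAX_SHUTTER_MS)
        shutter_ms = MAX_SHUTTER_MS
    return shutter_ms, gain_db
```

Running this repeatedly on successive provisional frames converges the exposure, which is the behaviour a general-purpose camera AE algorithm would provide here.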
 When the light amount adjustment process in step S112 is complete, the device control unit 300 returns the process to step S111.
 On the other hand, if the device control unit 300 determines in step S111 that the light amount from the adjustment light source unit 41a is appropriate (step S111, "Yes"), it moves the process to step S113.
 In step S113, the device control unit 300 photographs the anterior segment of the eye S to be examined using the anterior eye observation unit 28. The device control unit 300 also performs imaging with the image sensor 7.
 The optical unit B1 outputs, from the port 9, the image data captured and acquired by the anterior eye observation unit 28 in step S113 and the image data captured and acquired by the image sensor 7. Each piece of image data output from the port 9 is transmitted to the information processing unit B3 via the image communication line 10, where it is received by the port 350 and passed to the signal processing unit 351.
 In the next step S114, the signal processing unit 351 obtains the alignment state based on the image data acquired by the anterior eye observation unit 28 and the image data acquired by the image sensor 7. For example, the signal processing unit 351 identifies the position of the pupil of the eye S based on the image data acquired by the anterior eye observation unit 28. The signal processing unit 351 also identifies, for example, the position at which the light emitted from the pupil of the eye S strikes the image sensor 7, based on the image data acquired by the image sensor 7 or the anterior eye observation unit 28. Furthermore, based on that image data, the signal processing unit 351 identifies the size, on the captured image, of the fundus image formed by that light. In other words, this is a process of identifying the angle of view in imaging by the image sensor 7.
 In the next step S115, the signal processing unit 351 generates an adjustment control signal for performing alignment adjustment based on the information identified in step S114. The adjustment control signal here may include a control signal for deflecting the field of view of the image sensor 7 and a control signal for adjusting the angle of view of the image sensor 7.
 The signal processing unit 351 outputs the generated adjustment control signal from the port 350. The adjustment control signal output from the port 350 is received by the port 9 of the imaging processing unit B2 via the image communication line 10 and passed to the device control unit 300. The device control unit 300 controls each mechanism 302 based on the adjustment control signal passed to it.
 In the next step S116, the signal processing unit 351 determines whether the pupil position of the eye S (for example, the position of the pupil, iris, or the like) is appropriate based on the information identified in step S114. If the signal processing unit 351 determines that the pupil position is not appropriate (step S116, "No"), it returns the process to step S113.
 On the other hand, if the signal processing unit 351 determines that the pupil position is appropriate (step S116, "Yes"), it ends the series of processes in the flowchart of FIG. 8 and moves the process to step S102 of FIG. 7.
 FIG. 9 is an example flowchart showing the focus adjustment process according to the first embodiment in more detail. The series of processes shown in FIG. 9 corresponds to the process of step S102 in the flowchart of FIG. 7.
 In FIG. 9, in step S120, the device control unit 300 turns on the adjustment light source unit 41a of the annular light source 41. At this time, the photographing light source unit 41b is not turned on.
 In the next step S121, the device control unit 300 determines whether the amount of light from the adjustment light source unit 41a turned on in step S120 is appropriate. The determination process in step S121 is the same as that described for step S111 in the flowchart of FIG. 8, so its description is omitted here.
 If the device control unit 300 determines in step S121 that the light amount is not appropriate (step S121, "No"), it moves the process to step S122. In step S122, the device control unit 300 adjusts the amount of light emitted by the adjustment light source unit 41a. The light amount adjustment process in step S122 is the same as that described for step S112 in the flowchart of FIG. 8, so its description is omitted here.
 When the light amount adjustment process in step S122 is complete, the device control unit 300 returns the process to step S121.
 On the other hand, if the device control unit 300 determines in step S121 that the light amount from the adjustment light source unit 41a is appropriate (step S121, "Yes"), it moves the process to step S123.
 In step S123, the device control unit 300 performs imaging with the image sensor 7. The imaging in step S123 is provisional imaging, carried out as a preliminary step before the fundus imaging of the eye S by the image sensor 7 that is the original purpose.
 The image data captured in step S123 is output from the port 9 and transmitted to the information processing unit B3 via the image communication line 10, where it is received by the port 350 and passed to the signal processing unit 351.
 In the next step S124, the signal processing unit 351 checks the image data of the provisional imaging performed in step S123. In the next step S125, the signal processing unit 351 determines whether the focus adjustment is appropriate based on the image data checked in step S124. For example, the signal processing unit 351 may make this determination based on edge information of the image data.
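One common edge-information criterion of the kind mentioned above is a variance-of-Laplacian focus measure: a sharply focused image has strong edge responses and hence a large variance. The sketch below is only an illustrative stand-in for whatever metric the device actually uses; it operates on a grayscale image given as a list of rows.

```python
def focus_measure(img):
    """Variance of a 4-neighbour Laplacian over the interior pixels of a
    grayscale image; larger values indicate stronger edges, i.e. better
    focus."""
    vals = [img[y - 1][x] + img[y + 1][x] + img[y][x - 1] + img[y][x + 1]
            - 4 * img[y][x]
            for y in range(1, len(img) - 1)
            for x in range(1, len(img[0]) - 1)]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)
```

Step S125 could then compare this measure against a threshold, or a focus sweep could pick the lens position that maximises it.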
 If the signal processing unit 351 determines that the focus adjustment is not appropriate (step S125, "No"), it moves the process to step S126.
 In step S126, the signal processing unit 351 generates an adjustment control signal instructing focus adjustment and outputs the generated adjustment control signal from the port 350. The adjustment control signal output from the port 350 is received by the port 9 of the imaging processing unit B2 via the image communication line 10 and passed to the device control unit 300. The device control unit 300 controls the drive mechanism 27 based on the adjustment control signal passed to it. After the process of step S126, the process returns to step S124.
 On the other hand, if the signal processing unit 351 determines in step S125 that the focus adjustment has been made appropriately (step S125, "Yes"), it ends the series of processes in the flowchart of FIG. 9 and moves the process to step S103 of FIG. 7.
 FIG. 10 is an example flowchart showing the photographing process according to the first embodiment in more detail. The series of processes shown in FIG. 10 corresponds to the process of step S104 in the flowchart of FIG. 7. Note that the photographing in the flowchart of FIG. 10 is called the actual photographing, to distinguish it from the provisional imaging in step S123 of FIG. 9.
 In FIG. 10, in step S140, the device control unit 300, for example, performs an offset adjustment for the difference in optical path length between the adjustment light from the adjustment light source unit 41a and the photographing light from the photographing light source unit 41b. For example, if infrared light with a wavelength of 850 nm is used as the adjustment light and the wavelength of the photographing light is taken as a representative value of 550 nm, there is a difference of 300 nm between them. The alignment adjustment made using the adjustment light therefore has a known deviation, corresponding to this difference, from the conditions at the time of photographing with the photographing light. Based on this deviation, the device control unit 300 obtains an offset value for the alignment adjustment made with the adjustment light, corrects the control values of that alignment adjustment, and thereby performs the offset adjustment.
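Since the deviation between the 850 nm adjustment light and the 550 nm photographing light is described as known, the step S140 correction could, to first order, be modeled as proportional to the wavelength difference. The sketch below is a heavily simplified illustration; the per-100-nm coefficient is a hypothetical, device-specific calibration constant, not a value from the patent.

```python
def alignment_offset(lambda_adjust_nm, lambda_photo_nm, shift_per_100nm):
    """First-order offset to apply to control values obtained with the
    adjustment light, in the same units as shift_per_100nm (e.g. mm of
    focus travel). Assumes a linear chromatic model, which is a
    simplification of the real optics."""
    return (lambda_photo_nm - lambda_adjust_nm) / 100.0 * shift_per_100nm
```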
 In the next step S141, the device control unit 300 flashes the photographing light source unit 41b. In the next step S142, the device control unit 300 performs the actual photographing of the fundus of the eye S with the image sensor 7. Because the photographing light source unit 41b is lit as a flash in step S141, pupil constriction due to the light can be suppressed.
 When the actual photographing has been performed in step S142, the series of processes in the flowcharts of FIG. 10 and of FIG. 7 described above ends.
(2-1-3. Details of the alignment adjustment process according to the first embodiment)
 Next, the alignment adjustment process according to the first embodiment will be described in more detail.
 The alignment adjustment process according to the first embodiment consists of the following three steps.
(1) Angle-of-view adjustment
(2) Field-of-view deflection
(3) Optical axis adjustment
(1) Angle-of-view adjustment
 The angle-of-view adjustment process in step S101 of FIG. 7 and step S112 of FIG. 8 will now be described.
 In the angle-of-view adjustment, the angle of view of the image sensor 7 is adjusted by changing the optical path length from the eye S to the image sensor 7. This adjustment can be realized by adjusting the depth length of the mounting unit 1 (its length in the direction from the eye S toward the optical element 2) with the drive mechanism 24. Alternatively, it can be realized by adding to the drive mechanism 25, which drives the optical element 2, a movement mechanism in the depth direction (the direction from the eye S toward the optical element 2) and translating the optical element 2 in that depth direction.
 For example, the signal processing unit 351 calculates the amount of adjustment in the angle-of-view adjustment based on image information obtained from the image sensor 7 or information obtained from the anterior eye observation unit 28. When calculating the adjustment amount based on image information acquired by the anterior eye observation unit 28, for example, the signal processing unit 351 adjusts the angle of view so that the object shown within the angle of view matches a prescribed size. The object may be an ocular structure such as the pupil diameter or iris diameter, or a predetermined pattern projected from the projector 30 onto the area around the anterior segment. Alternatively, a distance measuring sensor or the like may be added to the anterior eye observation unit 28 to measure the distance between the optical unit B1 and the eye S directly.
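If the apparent size of the target (an iris diameter, say, or a projected pattern) scales roughly inversely with the optical path length, the required path-length correction can be estimated from the ratio of measured size to prescribed size. This thin-lens-style approximation is purely illustrative; the real relationship depends on the actual optics, and the names and units are assumptions.

```python
def corrected_path_length(measured_px, target_px, current_path_mm):
    """New optical path length that would bring the object's apparent size
    from measured_px to the prescribed target_px, under the simplifying
    assumption that apparent size is inversely proportional to path
    length."""
    return current_path_mm * measured_px / target_px
```

An object imaged twice as large as prescribed would thus call for roughly doubling the path length, which the drive mechanism 24 (or an added depth movement of the optical element 2) would then realize iteratively.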
(2) Field-of-view deflection
 The field-of-view deflection process in step S101 of FIG. 7 and step S113 of FIG. 8 will be described with reference to FIGS. 11 and 12.
 FIG. 11 is a schematic diagram for explaining an overview of the field-of-view deflection according to the first embodiment. In FIG. 11, only the eye S to be examined and, from the configuration of the fundus imaging device 100, the optical element 2 of the optical unit B1 with its drive mechanism 25, the optical element 4, and the image sensor 7 are shown; the other parts are omitted.
 In FIG. 11, let Pf be the fundus of the eye S and Pp be the pupil (the pupil center). The straight line connecting the fundus Pf and the pupil Pp is defined as the chief ray Af. The chief ray Af meets the field-of-view deflection optical element 2 at a point Pa and then reaches a position Pb' on the principal plane of the optical axis adjustment optical element 4.
 Further, let Pi be the center of the imaging area of the image sensor 7, and let the ray axis incident perpendicularly at the position Pi be the imaging optical axis Ai. The imaging optical axis Ai passes through a position Pb on the principal plane of the optical element 4 and continues through the optical element 2 and the device interface D toward the eye S. Note that in the initial state, before field-of-view deflection and optical axis adjustment are performed, the imaging optical axis passes through the center of the device interface D perpendicularly to the device interface.
 Field-of-view deflection means adjusting the tilt of the optical element 2 with the drive mechanism 25 so that the position Pb and the position Pb' coincide or come sufficiently close to each other. Let the center of rotation of the optical element 2 be a point Pc. Once the relative positions of the fundus Pf, the pupil Pp, the point Pc, and the position Pb are identified, the position of the point Pa that minimizes the distance between Pb and Pb' can be calculated as an optimization problem under the following constraints.
Constraint (1): the point Pa lies on the chief ray Af.
Constraint (2): the point Pa lies on the optical element 2, whose center of rotation is the point Pc.
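The constrained minimisation above can be illustrated with a two-dimensional toy model: tilting a flat deflection mirror by θ deviates the reflected ray by 2θ, so the lateral miss distance between Pb and Pb' at the principal plane of the optical element 4 can be evaluated over candidate tilts and the minimiser taken. The geometry here is deliberately simplified (the rotation about Pc is reduced to a pure angular deviation at a fixed Pa), so this only shows the shape of the optimisation, not the patent's actual computation; all names and values are assumptions.

```python
import math

def miss_distance(tilt_rad, chief_angle_rad, lever_mm):
    """Lateral distance between Pb and Pb' at a plane lever_mm downstream
    of the mirror, for a chief ray arriving at chief_angle_rad and a mirror
    tilted by tilt_rad (2-D toy model: a tilt of t deviates the ray by 2t)."""
    return abs(lever_mm * math.tan(chief_angle_rad - 2.0 * tilt_rad))

def best_tilt(chief_angle_rad, lever_mm,
              span_rad=math.radians(5.0), steps=2001):
    """Brute-force scan over mirror tilts in [-span, +span], returning the
    tilt that minimises the Pb-Pb' miss distance."""
    candidates = (span_rad * (2.0 * i / (steps - 1) - 1.0) for i in range(steps))
    return min(candidates, key=lambda t: miss_distance(t, chief_angle_rad, lever_mm))
```

In this model a chief ray arriving 2° off axis is best cancelled by a 1° mirror tilt, matching the factor-of-two reflection rule; a real implementation would solve the constrained geometry in closed form or with a proper optimiser rather than a scan.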
 A method of identifying the relative positions of the four points mentioned above (the fundus Pf, the pupil Pp, the point Pc, and the position Pb) will now be described. In the following, for convenience of explanation, the fundus Pf, the pupil Pp, the point Pc, and the position Pb are referred to as the points Pf, Pp, Pc, and Pb, respectively, as appropriate.
 First, at the time of mounting in step S100 of FIG. 7, the point Pf is made to coincide with the imaging optical axis as it is before field-of-view deflection and optical axis adjustment. This can be realized by placing the device interface D at a position sufficiently close to the eye S. It can also be realized by restricting the position at which the mounting unit 1 is worn on the subject 200. One way to restrict the wearing position of the mounting unit 1 on the subject 200 is, for example, to shape the mounting unit 1 so that it follows a specific part of the face. Alternatively, the wearing position of the mounting unit 1 on the subject 200 can be restricted by specifying the part of the subject 200's face that is to be brought into contact with the mounting unit 1. Furthermore, a reference marker may be projected onto the face of the subject 200 from the mounting unit 1 or the like, and alignment may be performed using that reference marker as an index.
 The relative positional relationship between the points Pb and Pc is determined by the structure of the optical unit B1. Furthermore, the axial length of an adult eye (about 15 years old or older) is known to be about 23 to 25 mm, with little individual variation. Therefore, provided that the point Pf lies on the initial imaging optical axis, the Pf-Pc distance will not deviate greatly from one subject 200 to another once a single appropriate initial value has been set. Moreover, the distance between the device interface D and the eye S, which arises from individual differences in axial length and from how firmly the mounting unit 1 is pressed on during mounting in step S100, is absorbed during the angle-of-view adjustment. Therefore, once the mounting of the mounting unit 1 in step S100 is complete, the relative positional relationship of the points Pf, Pb, and Pc can be said to be identified in the world coordinate system WB1 of the optical unit B1.
 Alternatively, a distance sensor that measures the distance to the surface of the eye S may be added to the anterior eye observation unit 28, and the relative positional relationship of the points Pf, Pb, and Pc may be identified based on the distance measured by this added sensor.
 Note that the axial length of a newborn's eye is known to be about 17 mm, somewhat smaller than that of an adult. Therefore, to enable fundus photography of subjects 200 over a wide range of ages, it is advisable, for example, to prepare several patterns of initial values for the Pf-Pb distance in the fundus imaging device 100 so that the photographer can switch between them at the time of photographing.
 Next, the position of the pupil Pp of the eye S is identified. For example, the signal processing unit 351 calculates the position of the pupil Pp based on the pupil position information obtained from the anterior eye observation unit 28. FIG. 12 is a schematic diagram for explaining a method of identifying the position of the pupil Pp that is applicable to the first embodiment.
 Section (a) of FIG. 12 shows an image of the anterior segment formed on the imaging surface of the anterior eye observation unit 28. As shown in this figure, the coordinate system Wa of the image sensor 31 of the anterior eye observation unit 28 can be obtained from the world coordinate system WB1 of the entire optical unit B1 by a specific affine transformation matrix M44.
 The coordinate system Wa has its origin at the center of the imaging surface of the image sensor 31; its Xa and Ya axes coincide with the horizontal and vertical directions of the image sensor, respectively, and its Za axis coincides with the optical axis direction. The plane that intersects this Za axis perpendicularly and contains the pupil Pp is defined as the anterior segment plane F. Furthermore, let f be the focal length of the imaging optical system of the anterior eye observation unit 28, and let L be the distance along the Za axis from the imaging surface of the anterior eye observation unit 28 to the anterior segment plane F.
 In the anterior segment plane F, the pupil Pp is assumed to have a diameter Φi. On the imaging surface of the image sensor 31, the pupil Pp is projected with a diameter Φi'.
 The focal length f is known, since it is uniquely determined by the device configuration of the optical unit B1. The distance L is also known at the start of the field-of-view deflection, since it has converged to a fixed value by the time the angle-of-view adjustment described above is complete. Now, referring to section (b) of FIG. 12, suppose the pupil Pp is imaged at coordinates Ip(x, y) on the imaging surface of the image sensor of the anterior eye observation unit 28. Using the similar-triangle relationship of focal length f to distance L, the coordinates of the pupil Pp in the coordinate system Wa can be determined from the image-position coordinates Ip(x, y). The position of the pupil Pp in the coordinate system Wa, and hence in the world coordinate system WB1, is thereby identified. With this, the positions of the points Pf, Pp, Pb, and Pc in the world coordinate system WB1 have all been identified.
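The f : L similar-triangle relation can be written directly: image coordinates on the sensor scale by L / f when back-projected to the anterior segment plane F, and the same ratio recovers the actual pupil diameter Φi from its imaged diameter Φi'. A minimal sketch, in which the millimetre units and function names are assumptions:

```python
def pupil_position(ix_mm, iy_mm, f_mm, L_mm):
    """Back-project the image coordinates Ip(x, y) on the sensor to the
    anterior segment plane F (at z = L on the Za axis) in the sensor
    coordinate system Wa, using the f : L similar-triangle relation."""
    scale = L_mm / f_mm
    return (ix_mm * scale, iy_mm * scale, L_mm)

def pupil_diameter(phi_image_mm, f_mm, L_mm):
    """Actual pupil diameter Phi_i from the imaged diameter Phi_i'."""
    return phi_image_mm * L_mm / f_mm
```

The resulting Wa coordinates would then be mapped into WB1 by the affine transformation M44 mentioned above.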
(3) Optical axis adjustment
 The optical axis adjustment process in step S101 of FIG. 7 and step S113 of FIG. 8 will be described with reference to FIG. 13.
 図13は、第1の実施形態に係る光軸調整の概要について説明するための模式図である。図13では、上述の図11と同様に、眼底撮像装置100の構成から被検眼Sと、光学部B1における光学素子2とその駆動機構25、光学素子4、ならびに、撮像素子7を抜粋して示し、その他の部分については省略している。 FIG. 13 is a schematic diagram for explaining an overview of optical axis adjustment according to the first embodiment. In FIG. 13, as in FIG. 11 described above, the eye S to be examined, the optical element 2 and its drive mechanism 25 in the optical section B1, the optical element 4, and the image sensor 7 are extracted from the configuration of the fundus imaging device 100. The other parts are omitted.
In the optical axis adjustment, the optical element 4 for optical axis adjustment is moved so that the chief ray traveling from the eye S to the optical element 4 coincides with the imaging optical axis of the image sensor 7. This adjustment is realized by adjusting the position and orientation of the optical element 4 using the drive mechanism 26 that drives the optical element 4.
FIG. 13 shows a state in which the points Pb and Pb' have been brought sufficiently close together by the visual field deflection process described above. Through the process up to the visual field deflection, the direction of the chief ray Af and the positions of the points Pa and Pb' in the world coordinate system WB1 are specified. Therefore, the angle θi that the chief ray Af incident on the optical element 4 makes with the imaging optical axis of the image sensor 7 can be calculated.
The technique of superimposing an obliquely incident ray onto the proper optical axis by adjusting the position and attitude of part of an optical system is a known technique, also used in lens-shift image stabilization, for example. Once the angle θi has been calculated, the position and orientation of the optical element 4 can be determined by back calculation from θi.
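The calculation of θi can be sketched minimally as the angle between two direction vectors, assuming the chief ray Af and the imaging optical axis are available as 3-D vectors in the world frame WB1 (the function and variable names are illustrative, not from the disclosure):

```python
import math

def angle_between(af, axis):
    """Angle θi between the incident chief ray Af and the imaging optical
    axis, both given as 3-D direction vectors in the world frame WB1."""
    dot = sum(a * b for a, b in zip(af, axis))
    norm = math.sqrt(sum(a * a for a in af)) * math.sqrt(sum(b * b for b in axis))
    # clamp to [-1, 1] to guard against floating-point round-off
    return math.acos(max(-1.0, min(1.0, dot / norm)))

# A ray tilted 10 degrees from the axis in the x-z plane:
theta_i = angle_between(
    (math.sin(math.radians(10)), 0.0, math.cos(math.radians(10))),
    (0.0, 0.0, 1.0),
)
```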
As described above, in the first embodiment of the present disclosure, the inclination of the optical element 2 for visual field deflection and the position and orientation of the optical element 4 for optical axis adjustment are controlled based on the image data acquired by the anterior eye observation section 28 and the image data acquired by the image sensor 7. This increases the degree of freedom in alignment adjustment compared with existing techniques, making it possible to provide a fundus imaging device that can be miniaturized and whose alignment is easy to adjust.
Furthermore, the fundus imaging device 100 according to the first embodiment uses light with a wavelength outside the visible range for illumination during alignment adjustment and focus adjustment, and fires a flash only at the time of actual imaging. This makes it possible to prevent the pupil of the subject's eye S from constricting due to glare or the like. As a result, there is no need to use mydriatic drugs, imaging can be performed by non-medical personnel, and imaging can even be executed remotely from a distant location without a medical professional present.
(2-2. First modification of the first embodiment)
Next, a first modification of the first embodiment will be described. FIG. 14 is a schematic diagram showing an example configuration of a fundus imaging device according to the first modification of the first embodiment. In FIG. 14, the fundus imaging device 100a according to the first modification couples the optical section B1 and the imaging processing section B2 via an attachment 71, making the imaging processing section B2 detachable.
The attachment 71 physically connects the optical section B1 and an imaging device 72 so that light incident from the optical section B1 forms an image on the image sensor inside the imaging device 72. The imaging device 72 may be a lens-mount still camera or video camera, or an electronic device with a camera function such as a smartphone. A communication terminal for exchanging control signals and the like between the imaging device 72 and the device control section 300 may be added to the attachment 71, or a separate communication line 73 for wired or wireless communication may be provided.
Making the imaging device 72 that performs fundus photography detachable in this way allows the user to update the imaging device 72 or select an imaging device 72 suited to the shooting situation, as with a general interchangeable-lens camera. Furthermore, since existing electronic devices with an imaging function, such as a smartphone with a built-in camera or an interchangeable-lens single-lens reflex camera, can be used as the imaging device 72, there is no need to provide the image sensor 7 or the development processing section 8 on the fundus imaging device 100a side, enabling a smaller and less expensive device.
(2-3. Second modification of the first embodiment)
Next, a second modification of the first embodiment will be described. FIG. 15 is a schematic diagram showing an example configuration of a fundus imaging device according to the second modification of the first embodiment. Note that the information processing section B3 is omitted in FIG. 15.
In the first embodiment described above, the optical element 2, implemented as a concave mirror or a plane mirror, was used for visual field deflection. In contrast, in the second modification of the first embodiment, an optical element 2a including an objective lens 60 is used for visual field deflection, as shown in FIG. 15. A convex lens can be used as the objective lens 60; a compound lens combining a plurality of lenses may also be used.
FIG. 16 is a schematic diagram showing an example configuration of the optical element 2a according to the second modification of the first embodiment. In FIG. 16, the optical element 2a includes the objective lens 60 and an objective lens holder 60' that holds the objective lens 60. The objective lens holder 60' is connected to outer frames 61a and 61a' via drive mechanisms 61b and 61b', respectively.
In the example of FIG. 16, the drive mechanism 61b' is connected to the outer frame 61a' on the outer periphery and the outer frame 61a on the inner periphery, and rotates the inner outer frame 61a with respect to the fixed outer frame 61a'. The drive mechanism 61b is connected to the inner outer frame 61a and the objective lens holder 60', and rotates the objective lens holder 60' with respect to the outer frame 61a' on the outer periphery. As a result, the objective lens 60 has two rotation axes that intersect perpendicularly to the optical axis.
The drive mechanisms 61b and 61b' each rotate by an angle corresponding to the command value indicated in a control signal supplied from the device control section 300. The device control section 300 generates a control signal specifying the rotation angles based on, for example, the device control signal transmitted from the information processing section B3 in the visual field deflection process described above, and passes it to the drive mechanisms 61b and 61b'. The drive mechanisms 61b and 61b' rotate the outer frame 61a and the objective lens holder 60' according to this control signal. The visual field deflection for the image sensor 7 is realized in this way.
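The two-axis command above can be sketched as decomposing a desired field-of-view shift into one rotation angle per perpendicular axis. The small-angle, per-axis proportional decomposition and the gain value below are assumptions for illustration, not the device's actual control law:

```python
def deflection_to_gimbal_angles(dx: float, dy: float, gain: float = 0.5):
    """Split a desired field-of-view shift (dx, dy) into rotation commands
    for the two perpendicular axes (drive mechanisms 61b and 61b').
    For small tilts the beam deviation is roughly proportional to the
    tilt angle, hence the simple per-axis gain."""
    return gain * dx, gain * dy

# A shift of (0.10, -0.04) rad of field angle maps to two axis commands.
angle_inner, angle_outer = deflection_to_gimbal_angles(0.10, -0.04)
```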
In the second modification of the first embodiment, since the objective lens 60 is used as the optical element 2a for visual field deflection in the fundus imaging device 100b, there is no need to bend the optical path of the fundus imaging device 100b, allowing the device to be made even smaller.
(3. Second embodiment)
A second embodiment of the present disclosure will now be described. The second embodiment is an example of a fundus imaging device capable of automatic imaging. The fundus imaging device according to the second embodiment can automatically execute the alignment adjustment processing, focus adjustment processing, and imaging processing described in the flowchart of FIG. 7, for example in response to a trigger operation by the subject 200. That is, by using the fundus imaging device according to the second embodiment, the subject 200 can perform fundus photography directly, without a person skilled in imaging techniques being present.
In many cases, the subject 200 has no specialized technique for intraocular or fundus photography. Moreover, with existing technology, the line of sight of the subject 200 must be fixed during imaging, making it difficult for the subject 200 to operate the device while being photographed. Therefore, when the subject 200 performs fundus photography directly, all alignment adjustment and control of the fundus imaging device must be executed automatically within the device, essentially eliminating the need for the subject 200 to operate the device or check the imaging status during imaging.
FIG. 17 is a schematic diagram for explaining how the fundus imaging device according to the second embodiment is used. In FIG. 17, the fundus imaging device 100c according to the second embodiment includes a main body 110a containing the optical section B1 and the imaging processing section B2 (neither shown), and a subject terminal 120. The main body 110a is provided with a UI 21 through which the subject 200 instructs the start of fundus imaging processing.
The subject 200 holds the main body 110a, for example with his or her own hand 210, and brings part of his or her face (the eye area) into contact with the mounting section 1 of the main body 110a to hold it in a fixed state. The main body 110a may be a handheld type provided with a grip for easy holding by the hand 210, or a stationary type installed on a pedestal or the like.
At this time, because the subject 200 must keep his or her line of sight fixed, it is difficult for the subject 200 to view and operate the UI 21 directly. The UI 21 is therefore preferably given a simple function, such as a single button for starting automatic imaging. Note that when someone other than the subject 200 who is present acts as the photographer, an input device that requires visual confirmation during operation, such as a keyboard, mouse, or touch panel, can also be used as the UI 21.
The subject terminal 120 can monitor the operating status of the main body 110a, check the fundus images captured by the main body 110a, and transmit them to the outside. The fundus imaging operation in the main body 110a may also be configured to be controlled from the subject terminal 120.
A general-purpose information processing device such as a smartphone, tablet computer, or personal computer can be used as the subject terminal 120. In this case, the subject terminal 120 can be configured by installing, on the information processing device, an application program for executing the fundus imaging processing according to the second embodiment. The subject terminal 120 is not limited to this, and may instead be a terminal device dedicated to the fundus imaging device 100c.
FIG. 18 is an example functional block diagram for explaining the functions of the fundus imaging device 100c according to the second embodiment. In FIG. 18, the configurations of the optical section B1 and the imaging processing section B2 are the same as those in the fundus imaging device 100 according to the first embodiment described with reference to FIG. 6, so their description is omitted here.
In the fundus imaging device 100c according to the second embodiment, the information processing section B3 includes the subject terminal 120, an automatic adjustment processing section 130, and an indicator 131. Within the information processing section B3, the automatic adjustment processing section 130 and the indicator 131 are provided in the main body 110a, whereas the subject terminal 120 is configured as a body separate from the main body 110a. The configuration is not limited to this, and the functions of the subject terminal 120 may also be incorporated into the main body 110a.
The subject terminal 120 includes a display section 121, a storage section 122, and a communication section 123. The display section 121 displays images on a display device included in the subject terminal 120. The storage section 122 stores data in a storage medium included in the subject terminal 120 and reads data stored in that storage medium. The communication section 123 communicates with the main body 110a via the image communication line 10, and can also communicate over an external network by wired or wireless communication.
For example, in the main body 110a, the imaging processing section B2 transmits the image data captured and acquired by the image sensor 7 to the subject terminal 120 via the image communication line 10. The subject terminal 120 receives the image data transmitted from the main body 110a with the communication section 123, and can display an image based on the received image data on the display device using the display section 121. This allows the subject 200, for example, to check the result of fundus photography performed by the main body 110a.
The subject terminal 120 can also store the image data transmitted from the main body 110a and received by the communication section 123 in the storage medium using the storage section 122. The subject terminal 120 can further read the image data stored in the storage medium with the storage section 122 and transmit it to an external device via the communication section 123.
In the information processing section B3, the automatic adjustment processing section 130 automatically executes the fundus photography processing described with reference to FIGS. 7 to 10. For example, in response to an operation by the subject 200 on the UI 21 instructing the start of fundus imaging, a control signal instructing the start of fundus imaging is passed from the device control section 300 to the automatic adjustment processing section 130. In response to this control signal, the automatic adjustment processing section 130 starts the processing from step S101 of the flowchart in FIG. 7.
That is, in the processing of step S101, the device control section 300 turns on the adjustment light source section 41a of the annular light source 41 in response to the operation on the UI 21 instructing the start of imaging, and then controls the anterior eye observation section 28 and, as needed, the image sensor 7 and the development processing section 8 to acquire the initial alignment state and focus state. The device control section 300 passes information indicating the initial alignment state and focus state to the automatic adjustment processing section 130.
The automatic adjustment processing section 130 determines the current alignment state based on the information acquired by the anterior eye observation section 28 and the image data acquired by the image sensor 7, and calculates the adjustment amount needed to reach the optimal alignment state. The automatic adjustment processing section 130 sends information indicating the calculated adjustment amount to the device control section 300, thereby feeding the adjustment amount back. Having received this feedback, the device control section 300 sends the adjustment amount to the drive mechanisms 24 to 26 to perform the alignment adjustment.
After the alignment adjustment, the automatic adjustment processing section 130 determines the post-adjustment alignment state, and thereafter repeats the cycle of alignment state confirmation, adjustment amount calculation, and adjustment mechanism control until it is determined that an appropriate alignment state has been reached, in other words, until the alignment adjustment converges.
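The confirm, calculate, and control cycle above can be sketched as a simple feedback loop. The helper callables stand in for the anterior eye observation, the adjustment amount calculation, and the drive mechanism commands; their names, the scalar error model, and the convergence threshold are assumptions for illustration only.

```python
def run_alignment_loop(measure_state, compute_adjustment, apply_adjustment,
                       tolerance=1e-3, max_iterations=50):
    """Repeat state confirmation, adjustment calculation, and mechanism
    control until the alignment adjustment converges.

    measure_state():       returns the current alignment error (scalar here).
    compute_adjustment(e): returns the correction to feed back.
    apply_adjustment(c):   drives the mechanisms (24 to 26 in the text).
    """
    for _ in range(max_iterations):
        error = measure_state()
        if abs(error) < tolerance:   # appropriate alignment state reached
            return True
        apply_adjustment(compute_adjustment(error))
    return False                     # did not converge

# Toy example: a proportional controller closing a 1.0 mm initial offset.
state = {"offset": 1.0}
converged = run_alignment_loop(
    lambda: state["offset"],
    lambda e: -0.5 * e,
    lambda c: state.__setitem__("offset", state["offset"] + c),
)
```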
When the automatic adjustment processing section 130 determines that the alignment adjustment has converged, the fundus imaging device 100c proceeds to the focus adjustment processing (step S102 in FIG. 7) and then the imaging processing (step S104 in FIG. 7). In the focus adjustment processing as well, as in the alignment adjustment processing, the fundus imaging device 100c repeats feedback control of the focus adjustment among the device control section 300, the drive mechanism 27 that controls the focus adjustment, and the automatic adjustment processing section 130.
Here, the fundus imaging device 100c may be provided, on the main body 110a, with an indicator 131 for assisting the subject 200 who performs imaging in operating the device.
The indicator 131 may present the photographer (for example, the subject 200) with the internal state of the main body 110a, including the operating method of the fundus imaging device 100c and the alignment information obtained from the automatic adjustment processing section 130. The indicator 131 may present information to the photographer using at least one of visual and audio information. It is not limited to these and may also present information using tactile information such as vibration, or using a combination of the above.
When the subject 200 operates the main body 110a and information is presented visually by the indicator 131, the subject 200 checks the indicator 131 with the eye that is not being photographed. In this case, if the subject 200 tries to look directly at the indicator 131, the line of sight of the eye S under examination also moves. Therefore, when presenting information visually, a structure that does not require direct viewing of the indicator 131 is preferable.
FIG. 19 is a schematic diagram showing a configuration example of the indicator 131 using visual information, applicable to the second embodiment. In the example of FIG. 19, indicators 131a implemented as lighting devices are provided on both side surfaces of the main body 110a, that is, on surfaces parallel to the frontal direction of the face of the subject 200 when the subject 200 is holding the main body 110a correctly.
A light-emitting element such as an LED (Light Emitting Diode) can be used for the indicator 131a. The indicator 131a is preferably configured so that the emitted light diffuses sufficiently, since the subject 200 can then perceive the light without looking directly at the indicator 131a. The indicator 131a may also be capable of selectively emitting light in a plurality of colors, emitting a color corresponding to the state of the fundus imaging device 100c. In this way, the subject 200 can confirm the state of the fundus imaging device 100c from the emission color of the indicator 131a.
The arrangement is not limited to this example; as further shown in FIG. 19, an indicator 131b may be provided on the rear surface of the main body 110a, that is, on the surface opposite the one facing the face of the subject 200 when the subject 200 is holding the main body 110a correctly. In this case, by placing a mirror or the like at a sufficiently distant position facing the subject 200, the subject 200 can confirm whether the indicator 131b is lit, and its emission color, via the mirror.
(Specific example of alignment adjustment processing)
The alignment adjustment processing according to the second embodiment will now be described in more detail. In the second embodiment, automatic execution of the alignment adjustment is realized by feeding back the results of the angle-of-view adjustment, visual field deflection, and optical axis adjustment processes to the device control section 300.
FIG. 20 is an example flowchart showing the alignment adjustment processing according to the second embodiment in more detail. The processing in the flowchart of FIG. 20 corresponds to the processing of step S101 in the flowchart of FIG. 7.
In FIG. 20, in step S200, the fundus imaging device 100c turns on the adjustment light source section 41a by means of the device control section 300. In the next step S201, the fundus imaging device 100c performs provisional imaging with the image sensor 7 under the control of the device control section 300. This is not limiting; in step S201, the anterior eye segment may instead be photographed by the anterior eye observation section 28. The image data acquired by the image sensor 7 or the anterior eye observation section 28 is passed to the automatic adjustment processing section 130 in the information processing section B3.
Here, the image sensor 7 or the anterior eye observation section 28 is assumed to capture a moving image. Moving image capture may be continued, for example, until the actual imaging starts.
In the next step S202, in the fundus imaging device 100c, the automatic adjustment processing section 130 detects the iris image of the eye S to be examined from the image data acquired by the image sensor 7 or the anterior eye observation section 28, and determines the iris diameter from the detected iris image. Because the iris diameter is known to vary little between individuals, whether the current angle of view is within the intended range can be determined from the size of the photographed iris image.
In the next step S203, the automatic adjustment processing section 130 determines whether the obtained iris diameter is an appropriate size. If the automatic adjustment processing section 130 determines that the obtained iris diameter is not an appropriate size (step S203, "No"), it moves the processing to step S204, passes angle-of-view information based on the obtained iris diameter to the device control section 300, and thereby feeds back the angle-of-view information.
Based on the angle-of-view information passed from the automatic adjustment processing section 130, the device control section 300 controls, for example, the drive mechanism 24 to adjust the distance from the optical section B1 to the eye S to be examined. After step S204, the processing returns to step S202.
On the other hand, if the automatic adjustment processing section 130 determines that the obtained iris diameter is an appropriate size (step S203, "Yes"), it moves the processing to step S205.
The method of adjusting the angle of view is not limited to one based on image data. For example, a ranging sensor or the like may be incorporated into the anterior eye observation section 28 to measure the distance to the eye S directly. Alternatively, a projector 30 may be provided in the anterior eye observation section 28 to project a predetermined marker onto the eye S. In that case, the automatic adjustment processing section 130 may calculate the distance between the optical section B1 and the anterior eye segment from the size of the marker image in the image based on the image data acquired by the anterior eye observation section 28 or the image sensor 7. The automatic adjustment processing section 130 adjusts the angle of view based on these measured or calculated distances.
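The image-based angle-of-view check of steps S202 to S204 can be sketched as below. The target iris diameter in pixels, the tolerance, and the form of the fed-back correction are illustrative assumptions; the text states only that the iris diameter varies little between individuals and is compared against an acceptable range.

```python
def angle_of_view_feedback(iris_px: float, target_px: float = 400.0,
                           tol_px: float = 20.0):
    """Return None when the imaged iris diameter is within range, otherwise
    a signed relative distance correction to feed back to the drive
    mechanism (positive = move the optics closer).

    The imaged diameter shrinks roughly in proportion to the working
    distance, so a too-small iris image means the optics are too far away.
    """
    if abs(iris_px - target_px) <= tol_px:
        return None                          # step S203 "Yes": go to S205
    return (target_px - iris_px) / target_px  # step S204: feed back

in_range = angle_of_view_feedback(410.0)      # within tolerance
correction = angle_of_view_feedback(300.0)    # iris too small: move closer
```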
In step S205, the automatic adjustment processing section 130 detects the pupil image of the eye S to be examined from the image data acquired by the image sensor 7 or the anterior eye observation section 28, and acquires the position of the pupil. That is, the current state of the visual field deflection and the optical axis position can be judged by whether the pupil center of the eye S is imaged at a position sufficiently close to the center of the image sensor 7.
In the next step S206, the automatic adjustment processing section 130 determines whether the pupil position acquired in step S205 is appropriate. If the automatic adjustment processing section 130 determines that the acquired pupil position is not appropriate, it moves the processing to step S207, passes position information indicating the acquired pupil position to the device control section 300, and thereby feeds back the position information.
 装置制御部300は、自動調整処理部130から渡された位置情報に基づき、例えば駆動機構25および26を調整して、撮像素子7に撮影される画像上での瞳孔位置を調整する。装置制御部300は、瞳孔像が撮像素子7の撮影中心に対して所定以上ずれた位置で結像している場合は、そのずれ量を補正する方向に、視野偏向と光軸調整を再実行する。処理は、ステップS206の後、ステップS205に戻される。 The device control unit 300 adjusts, for example, the drive mechanisms 25 and 26 based on the position information passed from the automatic adjustment processing unit 130, and adjusts the pupil position on the image captured by the image sensor 7. If the pupil image is formed at a position shifted by a predetermined amount or more with respect to the imaging center of the image sensor 7, the device control unit 300 re-executes the field of view deflection and optical axis adjustment in a direction to correct the amount of shift. do. After step S206, the process returns to step S205.
 なお、ステップS205およびステップS206において、必要に応じて、駆動機構27を制御してフォーカス調整用の光学素子5を駆動し、撮像素子7上に被検眼Sの瞳孔部を合焦することもできる。 Note that in step S205 and step S206, if necessary, the drive mechanism 27 can be controlled to drive the optical element 5 for focus adjustment to focus the pupil of the eye S to be examined on the image sensor 7. .
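The position check of steps S205 and S206 reduces to comparing the detected pupil centroid with the imaging center against a tolerance. A minimal NumPy sketch, assuming the pupil has already been segmented into a binary mask; the tolerance value and function names are illustrative, not from the disclosure:

```python
import numpy as np

def pupil_offset(mask):
    """Return the (dx, dy) offset of the pupil centroid from the image
    center, in pixels. `mask` is a binary image where pupil pixels are 1."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    h, w = mask.shape
    return cx - (w - 1) / 2.0, cy - (h - 1) / 2.0

def position_ok(mask, tol_px=5.0):
    """Step S206 analogue: the pupil position is 'appropriate' when the
    centroid lies within tol_px pixels of the imaging center."""
    dx, dy = pupil_offset(mask)
    return (dx * dx + dy * dy) ** 0.5 <= tol_px

# Example: a pupil blob centered in a 101x101 frame passes the check.
m = np.zeros((101, 101), dtype=np.uint8)
m[45:56, 45:56] = 1
print(position_ok(m))  # -> True
```

When the check fails, the signed offset (dx, dy) is exactly the feedback quantity the device control unit would use to drive the mechanisms in the correcting direction.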
 As for the optical axis adjustment, its state can also be confirmed from the distorted shape of the imaged pupil. In the next step S208, the automatic adjustment processing unit 130 acquires the shape of the pupil image of the eye S to be examined from the image data acquired by photographing with the image sensor 7 or the anterior eye observation unit 28. That is, by detecting distortion in the shape of the pupil image of the eye S, the current state of the optical axis position can be determined.
 For example, referring to FIG. 11 described above, when the pupil containing the principal ray Af is captured squarely on the imaging optical axis Ai of the image sensor 7, the pupil is imaged as a circle. However, if the optical axis adjustment is insufficient and the pupil is photographed tilted obliquely with respect to the imaging optical axis Ai, the pupil image on the image sensor 7 becomes an ellipse distorted in correlation with the tilt. Therefore, feedback control of the optical axis adjustment can be performed based on the shape of the pupil image.
 The method of optical axis adjustment is not limited to the above. For example, in the focus adjustment process for fundus imaging (step S102 in FIG. 7), the optical axis can be judged based on whether the macula is photographed near the center of the fundus image finally obtained by the image sensor 7, or on whether the optic nerve head is contained within the image frame.
 After step S208, the process moves to step S209. In step S209, the automatic adjustment processing unit 130 determines whether the shape of the pupil image acquired in step S208 is free of distortion (for example, whether the ratio of the major axis to the minor axis of the pupil image is equal to or greater than a predetermined value). If the automatic adjustment processing unit 130 determines that the acquired pupil shape is distorted (step S209, "No"), it moves the process to step S210 and feeds back shape information indicating the acquired pupil shape to the device control unit 300.
 Based on the shape information passed from the automatic adjustment processing unit 130, the device control unit 300 controls, for example, the drive mechanism 26 to drive the optical element 4, thereby adjusting the angle of the light that travels from the eye S through the optical element 2 to the image sensor 7. After step S210, the process returns to step S208.
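The distortion test and the tilt it implies can be sketched from the ellipse axes: under a simple projection model, a circular pupil viewed at tilt angle θ foreshortens so that minor/major ≈ cos θ. This model and the threshold below are illustrative assumptions, not values from the disclosure:

```python
import math

def shape_distorted(major_px, minor_px, max_ratio=1.1):
    """Step S209 analogue: flag distortion when the major/minor axis
    ratio of the pupil image exceeds a predetermined value (illustrative)."""
    return (major_px / minor_px) > max_ratio

def tilt_deg(major_px, minor_px):
    """A circular pupil viewed at tilt angle theta images as an ellipse
    with minor/major = cos(theta); invert that to estimate the tilt."""
    return math.degrees(math.acos(minor_px / major_px))

print(shape_distorted(100.0, 100.0))      # circular pupil -> False
print(round(tilt_deg(100.0, 86.6), 1))    # cos(30 deg) ~ 0.866 -> 30.0
```

The estimated tilt angle is a natural candidate for the shape feedback passed to the device control unit, since it indicates how far the optical axis must be corrected.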
 When the automatic adjustment processing unit 130 determines that the acquired pupil shape is free of distortion (step S209, "Yes"), the fundus imaging device 100c ends the series of processes in the flowchart of FIG. 20. The fundus imaging device 100c then moves the process to, for example, step S102 in the flowchart of FIG. 7 and executes the focus adjustment process.
 Note that a general autofocus function can be applied to the focus adjustment process. For example, many blood vessels run through the fundus, and focus can be adjusted automatically using these blood vessels as an index.
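One common way to realize such a vessel-based autofocus is a contrast metric, for example the variance of a Laplacian-filtered image, maximized over candidate focus positions: in-focus vessel edges produce strong second derivatives. The metric choice is an assumption on our part, not specified in the disclosure; a pure-NumPy sketch:

```python
import numpy as np

def sharpness(img):
    """Variance-of-Laplacian focus metric: high-contrast vessel edges in
    an in-focus fundus image yield a large value."""
    img = img.astype(np.float64)
    # 4-neighbour discrete Laplacian on the interior pixels.
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

def best_focus(images):
    """Pick the focus position whose image maximizes the metric
    (stand-in for a hill-climb over the focus drive)."""
    return max(range(len(images)), key=lambda i: sharpness(images[i]))

# A sharp step edge (vessel-like) scores higher than its blurred copy.
sharp = np.zeros((32, 32)); sharp[:, 16:] = 255.0
blurred = (sharp + np.roll(sharp, 1, axis=1) + np.roll(sharp, -1, axis=1) +
           np.roll(sharp, 2, axis=1) + np.roll(sharp, -2, axis=1)) / 5.0
print(best_focus([blurred, sharp]))  # -> 1
```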
 As described above, according to the second embodiment, fundus photography including the alignment adjustment process and the focus adjustment process is executed automatically when, for example, the subject 200 merely performs a simple operation on the UI 21 provided on the main body 110a of the fundus imaging device 100c. It thus becomes possible to provide a fundus imaging device in which alignment adjustment is easy.
 Furthermore, during the alignment adjustment process and the focus adjustment process, the adjustment light source unit 41a illuminates the eye S to be examined with light of a wavelength outside the visible range, and flash lighting is used for the actual photographing. Therefore, there is no need to use a mydriatic drug or the like, photography by non-medical personnel becomes possible, and, furthermore, fundus photography can be performed without a medical professional present.
(4. Third embodiment)
 Next, a third embodiment of the present disclosure will be described. The fundus imaging device according to the third embodiment is an example that allows operation from a remote location by a photographer who has specialized skills in fundus photography. In this case, since the photographer has specialized skills in fundus photography, it is assumed that the photographer operates the equipment himself or herself as necessary to set the photographing conditions, beginning with the alignment adjustment. Therefore, in addition to fully automatic photographing within the fundus imaging device, it is necessary to add a mechanism that allows the device to be adjusted manually while viewing the outputs of the image sensor 7 and the anterior eye observation unit 28.
 FIG. 21 is a schematic diagram for explaining how the fundus imaging device according to the third embodiment is used.
 In FIG. 21, the fundus imaging device 100d according to the third embodiment includes a main body 110b including an optical unit B1 and an imaging processing unit B2 (neither of which is shown), communication devices 401 and 402, a photographer terminal 410, and a display device 420. The communication devices 401 and 402, the photographer terminal 410, and the display device 420 constitute an information processing unit B3.
 As the photographer terminal 410, a general-purpose information processing device such as a personal computer, tablet computer, or smartphone can be used, for example. In this case, the photographer terminal 410 can be configured by installing, on the information processing device, an application program for executing the fundus photographing process according to the third embodiment. The photographer terminal 410 is not limited to this, and may be a terminal device dedicated to the fundus imaging device 100d.
 The communication device 401 is provided, for example, at the location where the subject 200 uses the main body 110b. The communication device 402 is provided at a location remote from where the main body 110b is used, for example where the photographer 430 remotely operates the main body 110b. The communication devices 401 and 402 can communicate with each other via a network 400 such as the Internet.
 The main body 110b and the communication device 401 are connected by an image communication line 10 and a control communication line 11. Image data acquired in the main body 110b by the anterior eye observation unit 28 and the image sensor 7 is output from the port 9 and passed to the communication device 401 via the image communication line 10. Control information and the like transmitted from the communication device 402 and received by the communication device 401 via the network 400 is input from the communication device 401 to the port 22 via the control communication line 11 and passed to the main body 110b.
 The display device 420 and the photographer terminal 410 are used by the photographer 430, who is assumed to have specialized skills in fundus photography. The communication device 402 passes the image data received from the communication device 401 via the network 400 to the display device 420 and the photographer terminal 410. Control information output by the photographer terminal 410 in response to, for example, an operation by the photographer 430 is transmitted to the network 400 by the communication device 402 and received by the communication device 401.
 FIG. 22 is an example functional block diagram for explaining the functions of the fundus imaging device 100d according to the third embodiment. In FIG. 22, the configurations of the optical unit B1 and the imaging processing unit B2 are the same as those in the fundus imaging device 100 according to the first embodiment described with reference to FIG. 6, so their description is omitted here.
 In FIG. 22, the information processing unit B3 includes the communication devices 401 and 402, connected so as to be able to communicate via the network 400, the photographer terminal 410, and the display device 420. The photographer terminal 410 includes a UI 411, a storage unit 412, and an operation assistance unit 413.
 The communication device 402 passes the image data received from the communication device 401 to the display device 420 and the photographer terminal 410. In the photographer terminal 410, the UI 411 includes an input device that outputs control signals in response to input operations by the photographer 430, and a display device for displaying information such as images. The input device can include buttons, toggle switches, a keyboard, and pointing devices such as a mouse or touch panel. The input device is not limited to these and may have other input means (such as voice input).
 The storage unit 412 stores data on a storage medium included in the photographer terminal 410 and reads data stored on that storage medium. For example, the storage unit 412 stores image data of fundus photographs received by the communication device 402 on the storage medium. The storage unit 412 also reads fundus image data stored on the storage medium. The read image data is supplied to, for example, the display device 420 and displayed as an image.
 The operation assistance unit 413 has functions equivalent to those of the automatic adjustment processing unit 130 described with reference to FIG. 18 and elsewhere, and generates control signals for controlling the alignment adjustment process and the focus adjustment process in the main body 110b based on the image data acquired in the main body 110b by the anterior eye observation unit 28 and the image sensor 7. The operation assistance unit 413 also generates control signals for controlling the alignment adjustment process and the focus adjustment process in the main body 110b in response to operations on the UI 411 by the photographer 430.
 Furthermore, the operation assistance unit 413 may transmit, as necessary, control signals for controlling the image sensor 7, the development processing unit 8, the anterior eye observation unit 28, and the annular light source 41, as well as setting values of internal parameters. By transmitting these control signals and internal parameter setting values, the operation assistance unit 413 can adjust how the image displayed on the display device 420 appears, and can adjust the image of the output image data finally obtained by the fundus imaging device 100d.
 With this configuration, in the alignment adjustment process of step S101, the focus adjustment process of step S102, and the photographing process of step S104 in the flowchart of FIG. 7, the main body 110b transmits, to the photographer 430 at the remote location, the image data acquired by photographing the anterior segment with the anterior eye observation unit 28 and the image data of the intraocular and fundus images acquired by photographing with the image sensor 7.
 In the information processing unit B3, an image based on the image data transmitted from the main body 110b and received by the communication device 402 is displayed on the display device 420. The display device 420 may further display other auxiliary information (referred to as auxiliary information) relating to the image. The auxiliary information may be, for example, information indicating the focus state of the image or information indicating light-amount saturation.
 The operation assistance unit 413 can acquire the auxiliary information based on, for example, the image data transmitted from the main body 110b and received by the communication device 402. The operation assistance unit 413 passes the acquired auxiliary information to the display device 420. The display device 420 may display the auxiliary information superimposed on the image data, or in a window separate from the image data. Furthermore, the auxiliary information may be displayed on a display device different from the display device 420.
 The display device 420 may display the auxiliary information as image information (for example, displaying a light-amount-saturated region with hatching on the image) or as text-based numerical information (for example, displaying the degree of focus as a percentage).
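Both presentation forms of the saturation information can be derived directly from the received image data: a binary mask for hatching over the image, and a percentage for the text-based display. A minimal sketch, assuming 8-bit image data and an illustrative saturation level:

```python
import numpy as np

def saturated_mask(img, level=255):
    """Binary mask of light-amount-saturated pixels; the display side
    can hatch this region over the displayed image."""
    return img >= level

def saturation_percent(img, level=255):
    """The same information in text-based numerical form."""
    return 100.0 * float((img >= level).mean())

img = np.zeros((10, 10), dtype=np.uint8)
img[:2, :] = 255  # top two rows blown out
print(saturation_percent(img))  # -> 20.0
```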
 Based on the image and auxiliary information displayed on the display device 420, the photographer 430 checks the state of the alignment adjustment and the focus adjustment in the remotely located main body 110b. Depending on the result, the photographer 430 operates the input device of the UI 411 on the photographer terminal 410 to control the main body 110b.
 The photographer terminal 410 generates control signals for controlling the operation of the main body 110b in response to operations on the UI 411 by the photographer 430. The control signals generated here include information on alignment adjustment control, focus adjustment control, imaging control, image processing control, and the like.
 The control signal generated by the photographer terminal 410 is transmitted from the communication device 402 via the network 400 and received by the communication device 401. The communication device 401 inputs the received control information to the port 22 via the control communication line 11. The control information input to the port 22 is passed to the device control unit 300. Based on this control information, the device control unit 300 controls the drive mechanisms 24 to 27, the anterior eye observation unit 28, the image sensor 7, the development processing unit 8, and so on, and executes the alignment adjustment process, the focus adjustment process, and the imaging process. In this way, in the fundus imaging device 100d according to the third embodiment, the main body 110b can be operated remotely from the photographer terminal 410 by manual operation by the photographer 430.
 The photographer terminal 410 can also cause the alignment adjustment process, the focus adjustment process, and the photographing process in the main body 110b to be executed automatically, based on the image data acquired by photographing the anterior segment with the anterior eye observation unit 28 and the image data of the intraocular and fundus images acquired by photographing with the image sensor 7.
 That is, in the photographer terminal 410, the operation assistance unit 413 calculates adjustment values for the alignment adjustment and the focus adjustment in the same manner as the automatic adjustment processing unit 130 described in the second embodiment above. The photographer terminal 410 sends each calculated adjustment value to the main body 110b via communication between the communication devices 401 and 402. This enables feedback control of the alignment adjustment and the focus adjustment.
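The adjustment values sent from the photographer terminal to the main body could be carried in a small structured message. The sketch below assumes JSON transport between the communication devices 402 and 401; the message type and all field names are hypothetical, not taken from the disclosure:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class AdjustmentMessage:
    """Hypothetical control message carrying calculated adjustment values
    from the photographer terminal 410 to the main body 110b."""
    alignment_dx_px: float  # pupil-position correction, horizontal
    alignment_dy_px: float  # pupil-position correction, vertical
    axis_tilt_deg: float    # optical-axis correction from pupil-shape distortion
    focus_steps: int        # focus-drive correction (e.g. for mechanism 27)

def encode(msg):
    """Serialize for transmission over the network (402 -> 401)."""
    return json.dumps(asdict(msg)).encode("utf-8")

def decode(payload):
    """Reconstruct the message on the receiving side."""
    return AdjustmentMessage(**json.loads(payload.decode("utf-8")))

m = AdjustmentMessage(1.5, -0.5, 2.0, 12)
print(decode(encode(m)) == m)  # round-trips -> True
```

A self-describing format like this also makes it easy to extend the message with the internal parameter setting values mentioned above without breaking older receivers.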
 Note that the operation assistance unit 413 can also control the alignment adjustment, the focus adjustment, and the photographing process individually in response to operations on the UI 411 by the photographer 430.
 As described above, according to the third embodiment, the photographer 430 can control the alignment adjustment, the focus adjustment, and the photographing process for fundus photography from a location remote from where the subject 200 performs fundus photography using the main body 110b. This makes it possible to provide a fundus imaging device in which alignment adjustment is easy.
 Furthermore, according to the third embodiment, the operation assistance unit 413 can generate control signals for controlling the image sensor 7, the development processing unit 8, the anterior eye observation unit 28, and the annular light source 41, as well as internal parameter setting values, in response to operations on the UI 411 by the photographer 430, and transmit them to the main body 110b. This makes it possible to provide a fundus imaging device in which more detailed alignment adjustment can easily be performed from a remote location.
 In addition, light of a wavelength outside the visible range is used for illumination during the alignment adjustment and the focus adjustment, and flash lighting is used for the actual photographing, so constriction of the pupil of the eye S to be examined due to glare or the like can be prevented. Therefore, there is no need to use a mydriatic drug or the like, and photography can be performed remotely from a distant location without a medical professional or the like present.
 Note that the effects described in this specification are merely examples and are not limiting, and other effects may also exist.
Note that the present technology can also have the following configuration.
(1)
A fundus imaging device comprising:
a first optical section that is provided between an eye to be examined and an imaging section that images the fundus of the eye to be examined, and that changes a visual field direction of the imaging section with respect to the eye to be examined; and
a second optical section that is provided between the first optical section and the imaging section and corrects the traveling direction of light emitted from the fundus and irradiated onto the imaging section via the first optical section.
(2)
The first optical section is
a first optical element made of any one of a concave mirror, a plane mirror, a lens, and a hologram optical element;
a first drive mechanism that controls the direction in which light is emitted by the first optical element;
including,
The fundus imaging device according to (1) above.
(3)
The second optical section is
a second optical element consisting of either a lens or a hologram optical element;
a second drive mechanism that controls the direction in which light is emitted by the second optical element;
including,
The fundus imaging device according to (1) or (2) above.
(4)
a third optical element into which the light emitted from the second optical section is incident;
a third drive mechanism that drives the third optical element to adjust the focus of the light irradiated to the imaging section via the second optical section;
a third optical section,
further comprising,
The fundus imaging device according to any one of (1) to (3) above.
(5)
a control unit that controls the first optical unit, the second optical unit, and the imaging by the imaging unit based on an image of the eye to be examined;
further comprising,
The fundus imaging device according to any one of (1) to (4) above.
(6)
an anterior ocular observation unit for observing the anterior ocular segment of the subject's eye;
Furthermore,
The control unit includes:
The fundus imaging device according to (5), wherein the first optical section and the second optical section are controlled based on the image acquired by observation by the anterior eye observation section.
(7)
The anterior eye observation part is
a first light source unit that irradiates the anterior segment with light having a wavelength in the visible light range;
a second light source unit that irradiates the anterior segment with light having a wavelength outside the visible light range;
including,
The fundus imaging device according to (6) above.
(8)
wherein the anterior eye observation section irradiates the anterior segment with light from the second light source section, and does not irradiate it with light from the first light source section, when the visual field direction is changed by the first optical section and when the traveling direction of the light is corrected by the second optical section,
The fundus imaging device according to (7) above.
(9)
wherein the first light source section and the second light source section have concentric shapes sharing a common center,
The fundus imaging device according to (7) or (8) above.
(10)
The anterior eye observation part is
a third light source unit that irradiates the anterior segment with light of a specific wavelength for performing fluorescence observation;
further including,
The fundus imaging device according to any one of (7) to (9) above.
(11)
The control unit includes:
controlling the imaging unit, the first optical unit, and the second optical unit based on the image captured by the imaging unit in response to a user operation;
The fundus imaging device according to any one of (5) to (10) above.
(12)
The control unit includes:
controlling the first optical unit and the second optical unit based on the position of the pupil of the eye to be examined in the image captured by the imaging unit;
The fundus imaging device according to (11) above.
(13)
The control unit includes:
controlling the first optical unit and the second optical unit based on the shape of the pupil of the eye to be examined in the image captured by the imaging unit;
The fundus imaging device according to (11) or (12) above.
(14)
a fourth optical section that changes the distance of the optical path from the imaging section to the eye to be examined;
Furthermore,
The control unit includes:
controlling the fourth optical section based on the size of the eye to be examined in the image;
The fundus imaging device according to any one of (5) to (13) above.
(15)
an attachment that allows the imaging unit to be attached and detached;
The fundus imaging device according to any one of (1) to (14) above, further comprising:
(16)
A fundus imaging method including
a control step, executed by a processor, of controlling, in response to a user operation:
a first optical section that is provided in a fundus imaging device between an eye to be examined and an imaging section that images the fundus of the eye to be examined, and that changes a visual field direction of the imaging section with respect to the eye to be examined;
a second optical section that is provided between the first optical section and the imaging section and corrects the traveling direction of light that is emitted from the fundus and reaches the imaging section via the first optical section; and
imaging by the imaging section.
(17)
According to the control step,
controlling the imaging unit, the first optical unit, and the second optical unit based on the image captured by the imaging unit in response to the user operation;
The fundus imaging method according to (16) above.
(18)
The fundus imaging method according to (17) above, wherein the control step controls the first optical section and the second optical section based on the position of the pupil of the eye to be examined in the image.
(19)
The fundus imaging method according to (17) or (18) above, wherein the control step controls the first optical section and the second optical section based on the shape of the pupil of the eye to be examined in the image.
(20)
The fundus imaging method according to any one of (17) to (19) above, wherein the fundus imaging device further comprises a fourth optical section that changes the optical path length from the imaging section to the eye to be examined, and the control step further controls the fourth optical section based on the size of the eye to be examined in the image.
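The disclosure itself contains no source code; the fragment below is only an illustrative sketch of the kind of pupil-position-based control described in (12), (13), (18), and (19). It locates a dark pupil in an anterior-eye image by simple thresholding and converts the pupil's offset from the image center into complementary tilt commands for the first and second optical sections. The function names, the threshold-based detector, and the gain value are all assumptions for illustration, not part of the patent.

```python
import numpy as np

def find_pupil_center(gray: np.ndarray, threshold: int = 50):
    """Return the (row, col) centroid of dark pixels assumed to be the pupil.

    Assumption: the pupil is the only region darker than `threshold`.
    """
    mask = gray < threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

def drive_commands(center, image_shape, gain: float = 0.1):
    """Map the pupil's offset from the image center to tilt commands.

    The first optical section steers the field of view toward the pupil;
    the second applies the complementary correction so the returning light
    still lands on the image sensor (gain is a placeholder scale factor).
    """
    cy, cx = image_shape[0] / 2.0, image_shape[1] / 2.0
    dy, dx = center[0] - cy, center[1] - cx
    first = (gain * dy, gain * dx)
    second = (-gain * dy, -gain * dx)
    return first, second

# Synthetic anterior-eye image: bright background with a dark "pupil" disk.
img = np.full((200, 200), 200, dtype=np.uint8)
yy, xx = np.mgrid[:200, :200]
img[(yy - 120) ** 2 + (xx - 80) ** 2 < 15 ** 2] = 10

center = find_pupil_center(img)
first, second = drive_commands(center, img.shape)
```

In this toy setup the detected center is (120, 80), i.e. 20 pixels below and 20 pixels left of the image center, so the two sections receive equal and opposite commands.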
1 Mounting section
2, 3, 4, 5, 6 Optical element
7, 31, 530 Image sensor
8 Development processing section
9, 22, 350 Port
10 Image communication line
11 Control communication line
21, 411 UI
23, 300 Device control section
24, 25, 26, 27, 61b, 61b' Drive mechanism
28, 28a, 28b, 28c Anterior eye observation section
30 Projector
41 Annular light source
60 Objective lens
71 Attachment
72 Imaging device
41a Adjustment light source section
41b Photographing light source section
100, 100a, 100b, 100c, 100d Fundus imaging device
110a, 110b Main body section
120 Subject terminal
130 Automatic adjustment processing section
131, 131a, 131b Indicator
200 Subject
351 Signal processing section
352 Display section
353 Input section
400 Network
401, 402 Communication device
410 Photographer terminal
413 Operation assistance section
420 Display device
430 Photographer
B1 Optical section
B2 Imaging processing section
B3 Information processing section
S, 500 Eye to be examined

Claims (20)

  1.  A fundus imaging device comprising:
     a first optical section that is provided between an eye to be examined and an imaging section that images the fundus of the eye to be examined, and that changes a visual-field direction of the imaging section with respect to the eye to be examined; and
     a second optical section that is provided between the first optical section and the imaging section and that corrects the traveling direction of light emitted from the fundus and irradiated onto the imaging section via the first optical section.
  2.  The fundus imaging device according to claim 1, wherein the first optical section includes:
     a first optical element that is any one of a concave mirror, a plane mirror, a lens, and a hologram optical element; and
     a first drive mechanism that controls the direction in which light is emitted by the first optical element.
  3.  The fundus imaging device according to claim 1, wherein the second optical section includes:
     a second optical element that is either a lens or a hologram optical element; and
     a second drive mechanism that controls the direction in which light is emitted by the second optical element.
  4.  The fundus imaging device according to claim 1, further comprising a third optical section that includes:
     a third optical element on which the light emitted from the second optical section is incident; and
     a third drive mechanism that drives the third optical element to adjust the focus of the light irradiated onto the imaging section via the second optical section.
  5.  The fundus imaging device according to claim 1, further comprising a control section that controls the first optical section, the second optical section, and imaging by the imaging section, based on an image of the eye to be examined.
  6.  The fundus imaging device according to claim 5, further comprising an anterior-eye observation section that observes an anterior segment of the eye to be examined,
     wherein the control section controls the first optical section and the second optical section based on the image acquired by observation by the anterior-eye observation section.
  7.  The fundus imaging device according to claim 6, wherein the anterior-eye observation section includes:
     a first light source section that irradiates the anterior segment with light having a wavelength in the visible range; and
     a second light source section that irradiates the anterior segment with light having a wavelength outside the visible range.
  8.  The fundus imaging device according to claim 7, wherein, when the visual-field direction is changed by the first optical section and when the traveling direction of the light is corrected by the second optical section, the anterior-eye observation section irradiates the anterior segment with light from the second light source section and does not irradiate the anterior segment with light from the first light source section.
  9.  The fundus imaging device according to claim 7, wherein the first light source section and the second light source section have concentric shapes sharing a common center.
  10.  The fundus imaging device according to claim 7, wherein the anterior-eye observation section further includes a third light source section that irradiates the anterior segment with light of a specific wavelength for fluorescence observation.
  11.  The fundus imaging device according to claim 5, wherein the control section controls the imaging section, the first optical section, and the second optical section in response to a user operation, based on the image captured by the imaging section.
  12.  The fundus imaging device according to claim 11, wherein the control section controls the first optical section and the second optical section based on a position of a pupil of the eye to be examined in the image captured by the imaging section.
  13.  The fundus imaging device according to claim 11, wherein the control section controls the first optical section and the second optical section based on a shape of a pupil of the eye to be examined in the image captured by the imaging section.
  14.  The fundus imaging device according to claim 5, further comprising a fourth optical section that changes an optical path length from the imaging section to the eye to be examined,
     wherein the control section controls the fourth optical section based on a size of the eye to be examined in the image.
  15.  The fundus imaging device according to claim 1, further comprising an attachment to which the imaging section is detachably mounted.
  16.  A fundus imaging method comprising a control step, executed by a processor, of controlling, in response to a user operation:
     a first optical section that is provided between an eye to be examined and an imaging section that images the fundus of the eye to be examined, and that changes a visual-field direction of the imaging section with respect to the eye to be examined;
     a second optical section that is provided between the first optical section and the imaging section and that corrects the traveling direction of light emitted from the fundus and arriving at the imaging section via the first optical section; and
     imaging by the imaging section.
  17.  The fundus imaging method according to claim 16, wherein the control step controls the imaging section, the first optical section, and the second optical section in response to the user operation, based on an image captured by the imaging section.
  18.  The fundus imaging method according to claim 17, wherein the control step controls the first optical section and the second optical section based on a position of a pupil of the eye to be examined in the image.
  19.  The fundus imaging method according to claim 17, wherein the control step controls the first optical section and the second optical section based on a shape of a pupil of the eye to be examined in the image.
  20.  The fundus imaging method according to claim 17, wherein the fundus imaging device further comprises a fourth optical section that changes an optical path length from the imaging section to the eye to be examined, and the control step further controls the fourth optical section based on the size of the eye to be examined in the image.
PCT/JP2023/008671 2022-03-16 2023-03-07 Fundus-imaging device and fundus-imaging method WO2023176599A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022041866 2022-03-16
JP2022-041866 2022-03-16

Publications (1)

Publication Number Publication Date
WO2023176599A1 true WO2023176599A1 (en) 2023-09-21

Family ID: 88023095

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/008671 WO2023176599A1 (en) 2022-03-16 2023-03-07 Fundus-imaging device and fundus-imaging method

Country Status (1)

Country Link
WO (1) WO2023176599A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013098981A1 (en) * 2011-12-27 2013-07-04 株式会社ニデック Hand-held ophthalmological device
JP2021157069A (en) * 2020-03-27 2021-10-07 キヤノン株式会社 Image tremor correction control device, imaging device and imaging device control method
JP2021175419A (en) * 2020-05-01 2021-11-04 株式会社ニデック Ophthalmologic imaging apparatus and ophthalmologic imaging program


Similar Documents

Publication Publication Date Title
JP4937792B2 (en) Fundus camera
KR101640536B1 (en) Image-processor-controlled misalignment-reduction for ophthalmic systems
KR101900907B1 (en) Electronically controlled fixation light for ophthalmic imaging systems
US9004684B2 (en) Fundus camera
JPH0566133B2 (en)
ES2655703T3 (en) Intuitive techniques and apparatus for obtaining ophthalmological images
KR20110086004A (en) Apparatus and method for imaging the eye
US20140002795A1 (en) Fundus photographing apparatus
KR20220073756A (en) Miniature retina scanning device for tracking pupil movement and application accordingly
JP7221587B2 (en) ophthalmic equipment
JP4733511B2 (en) Fundus camera
JP6090977B2 (en) Ophthalmic examination equipment
JP2008006104A (en) Fundus camera
JP5745864B2 (en) Fundus photographing device
WO2023176599A1 (en) Fundus-imaging device and fundus-imaging method
JP7266375B2 (en) Ophthalmic device and method of operation thereof
JP6379639B2 (en) Glasses wearing parameter measurement imaging device, glasses wearing parameter measuring imaging program
JP5328517B2 (en) Fundus photographing device
JP2008006105A (en) Fundus camera
US20230014952A1 (en) Retinal imaging system
JP7145708B2 (en) Optometry equipment
JP2018033055A (en) Imaging apparatus
JP5677501B2 (en) Ophthalmic equipment
JP5927689B2 (en) Ophthalmic examination equipment
JP7171162B2 (en) ophthalmic camera

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23770555

Country of ref document: EP

Kind code of ref document: A1