JP6465930B2 - Information processing apparatus, information processing apparatus control method, and computer program - Google Patents


Info

Publication number
JP6465930B2
Authority
JP
Japan
Prior art keywords
image, imaging, magnification, images, acquisition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2017134948A
Other languages
Japanese (ja)
Other versions
JP2017170266A (en)
Inventor
裕之 今村
Original Assignee
キヤノン株式会社 (Canon Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by キヤノン株式会社 (Canon Inc.)
Priority to JP2017134948A
Publication of JP2017170266A
Application granted
Publication of JP6465930B2
Legal status: Active (current)
Anticipated expiration


Description

The present invention relates to an information processing apparatus used for ophthalmic medical care, a control method for the information processing apparatus, and a computer program.

  Eye examinations are widely performed for the early treatment of lifestyle-related diseases and of the diseases that are the leading causes of blindness. A scanning laser ophthalmoscope (SLO) is known as an ophthalmologic apparatus that uses the principle of the confocal laser microscope: it raster-scans the fundus with a laser beam as measurement light and obtains a planar image at high resolution and high speed from the intensity of the return light. Hereinafter, an apparatus that captures such a planar image is referred to as an SLO apparatus, and the captured planar image as an SLO image.

  In recent years, it has become possible to acquire SLO images of the retina with improved lateral resolution by increasing the beam diameter of the measurement light in the SLO apparatus. However, as the beam diameter of the measurement light increases, aberrations in the eye to be examined cause problems such as a decrease in the S/N ratio and resolution of the SLO image. To solve this problem, adaptive optics SLO apparatuses have been developed that include an adaptive optical system which measures the aberrations of the eye in real time with a wavefront sensor and corrects, with a wavefront correction device, the aberrations that the eye imparts to the measurement light and its return light. This makes it possible to acquire SLO images with high lateral resolution.

  Such high-lateral-resolution SLO images can be acquired as a moving image, and various kinds of biological information can be measured from it. For example, to observe blood flow dynamics non-invasively, retinal blood vessels are extracted from each frame and the moving speed of blood cells in the capillaries is then measured. Further, to evaluate the relationship with visual function using SLO images, the photoreceptor cells P are detected and their density distribution and arrangement are then measured.

  However, since the angle of view of a single high-lateral-resolution SLO image that can be captured by the adaptive optics SLO apparatus is generally small, when the imaging target region is larger than that angle of view, the problem arises of how to set the imaging regions within the imaging target region. This will be described with reference to FIG. 7. FIG. 7A schematically shows a cross-sectional view of the eye to be examined, and FIGS. 7B to 7G illustrate examples of SLO images or imaging target regions.

  FIG. 7B illustrates an example of an SLO image with high lateral resolution. In FIG. 7B, the photoreceptor cells P, low-luminance regions Q corresponding to the positions of capillaries, and high-luminance regions W corresponding to the positions of white blood cells can be observed. When observing the photoreceptor cells P or measuring their distribution, the focus position is set near the outer retina (B5 in FIG. 7A) and an SLO image such as that in FIG. 7B is captured.

  On the other hand, retinal blood vessels and the capillaries that branch from them run through the inner retina (B2 to B4 in FIG. 7A), and particularly in affected eyes the imaging target region is often larger than the angle of view of a single SLO image that the SLO apparatus can capture. FIGS. 7C and 7D illustrate examples in which the imaging target region is larger than the angle of view of the SLO image: FIG. 7C shows a site where capillary lesions frequently occur (the annular region surrounded by a broken line), and FIG. 7D shows a wide photoreceptor cell defect region (the black closed region). In such cases, if the entire imaging target region is acquired at high magnification, setting the imaging conditions for the large number of SLO images is complicated, the imaging time becomes long, and the burden on the subject increases. Since the imaging target region includes both areas where a high-magnification image is highly necessary for medical care and areas where it is less necessary, the imaging regions must be set appropriately so that all of the highly necessary areas can be imaged within an examination time that does not burden the subject.

  In relation to this, Japanese Patent Laid-Open No. 2004-228561 describes, as a technique related to setting parameters for acquiring a plurality of high-magnification images, a configuration in which the imaging positions of a plurality of adaptive optics SLO images are changed and the images are displayed as a panoramic image.

JP 2012-213513 A

However, when the cell group, tissue, or lesion area to be observed or measured in detail is distributed more widely than the area covered by a single image with high lateral resolution (high-magnification image D H), the conventional configuration has the following problems in imaging such an area efficiently. That is:
i) The operator had to individually specify the acquisition parameter values (for example, acquisition position, angle of view, pixel size, number of frames, and frame rate) of each of the plurality of high-magnification images D Hj, which hindered efficient image acquisition.
ii) When an observation target area wider than the high-magnification image D H is imaged with the same high-magnification acquisition parameters, the number of high-magnification images to be acquired (total number of frames) becomes enormous (thousands to tens of thousands), making efficient image acquisition difficult.

  Also in the configuration of Patent Document 1, the acquisition parameters for a large number of high-magnification images must be determined manually for each image, forcing the operator to perform complicated setting operations.

  The present invention has been made in view of the above-described problems, and its object is to provide a technology that enables efficient and appropriate imaging, over a range wider than the angle of view of a high-magnification image, of tissue, cell groups, or lesion candidates whose distribution differs depending on the eye to be examined.

In order to achieve the above object, an information processing apparatus according to the present invention comprises the following arrangement. That is, an information processing apparatus that controls the capturing of a plurality of images in a region of an eye to be examined that is wider than the region covered by any single captured image, comprising:
storage means for storing a plurality of basic patterns, each indicating an arrangement of a plurality of imaging positions at which the images are to be captured;
selection means for selecting one basic pattern from the plurality of basic patterns stored in the storage means;
adjustment means for adjusting the imaging positions of the plurality of images associated in advance with the selected basic pattern; and
control means for causing an imaging apparatus to capture a plurality of images in the region in accordance with the plurality of imaging positions included in the basic pattern selected by the selection means, wherein the control means causes the imaging apparatus to capture the plurality of images in accordance with the adjusted imaging positions.
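The claimed arrangement (storage, selection, adjustment, and control means) can be pictured as a minimal Python sketch. All class, method, and field names below are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass


@dataclass
class BasicPattern:
    # A named arrangement of imaging positions (hypothetical structure).
    name: str
    positions: list  # (x, y) centers of the individual high-magnification images


class AcquisitionController:
    def __init__(self, patterns):
        # Storage means: hold the prepared basic patterns.
        self.patterns = {p.name: p for p in patterns}

    def select(self, name):
        # Selection means: pick one basic pattern.
        return self.patterns[name]

    def adjust(self, pattern, offsets):
        # Adjustment means: shift the imaging positions associated in
        # advance with the selected pattern.
        moved = [(x + dx, y + dy)
                 for (x, y), (dx, dy) in zip(pattern.positions, offsets)]
        return BasicPattern(pattern.name, moved)

    def capture_all(self, device, pattern):
        # Control means: have the imaging device capture one image per
        # (possibly adjusted) imaging position.
        return [device.capture(pos) for pos in pattern.positions]
```

Any object with a `capture(position)` method can stand in for the imaging apparatus here; the patent itself does not prescribe this interface.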

  According to the present invention, it is possible to provide a technique that enables efficient and appropriate imaging, over a range wider than the angle of view of a high-magnification image, of tissue, cell groups, or lesion candidates whose range of distribution differs from eye to eye.

FIG. 1 is a block diagram illustrating a configuration example of a system including the ophthalmologic apparatus 10.
FIG. 2 is a block diagram illustrating an example of the hardware configuration of the ophthalmologic apparatus 10.
FIG. 3 is a block diagram illustrating an example of the functional configuration of the ophthalmologic apparatus 10.
FIG. 4 is a diagram explaining the overall configuration of the SLO image capturing apparatus 20.
FIG. 5 is a flowchart of the processing performed by the ophthalmologic apparatus 10.
FIG. 6 is a diagram explaining image acquisition patterns.
FIG. 7 is a diagram explaining the image processing content.
FIG. 8 is a flowchart showing the details of the high-magnification image acquisition processing.
FIG. 9 is a flowchart showing the details of the image display processing.
FIG. 10 is a block diagram illustrating an example of the functional configuration of the ophthalmologic apparatus 10.
FIG. 11 is a flowchart showing the details of the high-magnification image acquisition processing.
FIG. 12 is a diagram explaining an image acquisition pattern and exceptional frames contained in a high-magnification moving image.
FIG. 13 is a block diagram illustrating an example of the functional configuration of the ophthalmologic apparatus 10.
FIG. 14 is a diagram explaining the overall structure of the tomogram imaging apparatus 60.
FIG. 15 is a diagram explaining an image acquisition pattern.
FIG. 16 is a flowchart showing the details of the image display processing.

  Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.

<< Embodiment 1 >>
When the ophthalmologic apparatus serving as the information processing apparatus according to the present embodiment acquires a plurality of high-magnification adaptive optics SLO images, it presents to the operator (user) basic patterns, prepared in advance, of the parameters relating to the imaging conditions for acquiring the plurality of high-magnification images, and lets the operator select one. Next, the operator adjusts the image acquisition parameters as necessary in accordance with the lesion shape, and the acquisition parameter values for the plurality of high-magnification images are determined according to that adjustment. Hereinafter, a case will be described in which the operator selects a disc-shaped basic pattern for acquiring a plurality of images covering a wide photoreceptor cell defect in the macular region, and the acquisition position, angle of view, pixel size, number of frames, frame rate, and in-focus position of each high-magnification image are determined.

(Overall structure)
FIG. 1A is a configuration diagram of a system including the ophthalmologic apparatus 10 according to the present embodiment. As shown in FIG. 1A, the ophthalmologic apparatus 10 is connected to an SLO image capturing apparatus 20 serving as an imaging apparatus and to a data server 40 through a local area network (LAN) 30 composed of optical fiber, USB, IEEE 1394, or the like. The connection form of these apparatuses is not restricted to the example of FIG. 1A. For example, these devices may be connected via an external network such as the Internet, or the ophthalmologic apparatus 10 may be connected directly to the SLO image capturing apparatus 20.

The SLO image capturing apparatus 20 is a device that captures a wide-angle image D L and high-magnification images D H of the fundus. The SLO image capturing apparatus 20 transmits the wide-angle image D L, the high-magnification images D H, and the fixation target positions F L and F H used at the time of imaging to the ophthalmologic apparatus 10 and the data server 40.

In addition, when images of each magnification are acquired at different imaging positions, the acquired images are denoted D Li and D Hj. That is, i and j are variables indicating imaging position numbers, with i = 1, 2, ..., imax and j = 1, 2, ..., jmax. When high-magnification images are acquired at a plurality of different magnifications, they are denoted D 1j, D 2k, ..., in descending order of magnification; D 1j, with the highest magnification, is referred to as the high-magnification image, and D 2k, ... are referred to as intermediate-magnification images.
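The indexing convention above can be mirrored with a simple nested mapping. The container and helper below are purely illustrative, not structures from the patent:

```python
# Hypothetical container mirroring the notation: images[m][j] is the j-th
# image acquired at magnification level m, where level 1 is the highest
# magnification (the high-magnification images D1j) and levels 2 and above
# hold intermediate-magnification images (D2k, ...).
images = {
    1: {1: "D11", 2: "D12"},   # high-magnification images
    2: {1: "D21"},             # intermediate-magnification images
}


def highest_magnification_images(images):
    # By the convention above, the smallest level number is the
    # high-magnification set.
    return list(images[min(images)].values())
```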

The data server 40 holds imaging condition data such as the fixation target positions F L and F H output by the SLO image capturing apparatus 20, the wide-angle image D L and high-magnification images D H of the eye, image features of the eye output by the ophthalmologic apparatus 10, and normal values related to the distribution of eye features. In the present embodiment, image features relating to the photoreceptor cells P, capillaries Q, blood cells W, retinal blood vessels, and retinal layer boundaries are handled as image features of the eye. In response to a request from the ophthalmologic apparatus 10, the data server 40 transmits the wide-angle image D L, the high-magnification images D H, the image features of the eye, and normal-value data for those image features to the ophthalmologic apparatus 10.

(Ophthalmologic apparatus)
The ophthalmologic apparatus 10 is realized by an information processing apparatus such as an embedded system, a personal computer (PC), or a tablet terminal. The hardware configuration of the ophthalmologic apparatus 10 will be described with reference to FIG. 2. In FIG. 2, a CPU 301 is a central processing unit that controls the operation of the entire ophthalmologic apparatus in cooperation with the other components, based on computer programs such as an operating system (OS) and application programs. A RAM 302 is a writable memory that functions as a work area for the CPU 301. A ROM 303 is a read-only memory that stores programs such as the basic I/O programs and data used for basic processing. The external storage device 304 functions as a large-capacity memory and is realized by a hard disk drive or a semiconductor memory. The monitor 305 is a display device that displays commands input from the keyboard 306 or the pointing device 307, the responses of the ophthalmologic apparatus 10 to those commands, and the like. The keyboard 306 and the pointing device 307 accept instructions and command input from the operator. The interface 308 relays the exchange of data with external devices.

  A control program for realizing the image processing function according to the present embodiment and data used when the control program is executed are stored in the external storage device 304. These control programs and data are appropriately fetched into the RAM 302 through the bus 309 under the control of the CPU 301, executed by the CPU 301, and function as each unit described below.

  Next, the functional configuration of the ophthalmologic apparatus 10 according to the present embodiment will be described with reference to FIG. FIG. 3 is a block diagram showing a functional configuration of the ophthalmologic apparatus 10. As illustrated in FIG. 3, the ophthalmologic apparatus 10 includes a data acquisition unit 110, a storage unit 120, an image processing unit 130, and an instruction acquisition unit 140.

  The data acquisition unit 110 is a functional block that acquires data such as image data and imaging condition data. The data acquisition unit 110 includes a wide-angle image acquisition unit 111 that acquires a wide-angle image and a high-magnification image acquisition unit 112 that acquires a high-magnification image. The storage unit 120 is a functional block that holds the data acquired by the data acquisition unit 110 and the image acquisition pattern set 121. The image acquisition pattern set 121 is a set of basic setting patterns (hereinafter referred to as “image acquisition patterns”) regarding parameters when acquiring a plurality of high-magnification images.

  The image processing unit 130 is a functional block that performs processing such as determining imaging conditions, setting those imaging conditions, and displaying captured images. The image processing unit 130 includes a display control unit 131 that controls the display of captured images, a determination unit 132 that determines the imaging conditions, and an alignment unit 133 that aligns imaging regions based on the imaging conditions. The display control unit 131 includes an image acquisition pattern presentation unit 1311 that displays image acquisition patterns on the monitor and presents them to the operator. The determination unit 132 includes a magnification determination unit 1321 that determines the imaging magnification, a position determination unit 1322 that determines the imaging position, a time determination unit 1323 that determines the imaging timing and related parameters, and an order determination unit 1324 that determines the imaging order.

(SLO image capturing apparatus)
Next, a configuration example of the SLO image capturing apparatus 20 including the adaptive optical system will be described with reference to FIG. Note that the configuration of the SLO image capturing apparatus described below is merely an example, and any image capturing apparatus can be used as long as a high-magnification image can be acquired.

  Reference numeral 201 denotes a light source; in FIG. 4 it is realized by a super luminescent diode (SLD) light source. In the present embodiment, a single light source 201 serves both fundus imaging and wavefront measurement, but separate light sources may be used and their beams combined partway along the optical path.

  The light emitted from the light source 201 passes through the single mode optical fiber 202 and is irradiated as parallel measurement light 205 by the collimator 203. The irradiated measurement light 205 passes through the light splitting unit 204 formed of a beam splitter and is guided to the adaptive optics optical system.

  The adaptive optical system includes a light splitting unit 206, a wavefront sensor 215, a wavefront correction device 208, and reflection mirrors 207-1 to 207-4 for guiding the light to them. The reflection mirrors 207-1 to 207-4 are installed so that at least the pupil of the eye, the wavefront sensor 215, and the wavefront correction device 208 are optically conjugate. In this embodiment, a beam splitter is used as the light splitting unit 206, and a spatial phase modulator using a liquid crystal element is used as the wavefront correction device 208; a deformable mirror may be used instead. The light that has passed through the adaptive optical system is scanned one-dimensionally or two-dimensionally by the scanning optical system 209.

  As the scanning optical system 209, in this embodiment, two galvano scanners are used for main scanning (fundus horizontal direction) and sub-scanning (fundus vertical direction). However, a resonant scanner may be used on the main scanning side of the scanning optical system 209 for faster imaging.

  The measurement light 205 scanned by the scanning optical system 209 is irradiated to the eyeball 211 through the eyepiece lenses 210-1 and 210-2. The measurement light 205 irradiated to the eyeball 211 is reflected or scattered by the fundus. By adjusting the positions of the eyepieces 210-1 and 210-2, optimal irradiation can be performed according to the diopter of the eyeball 211. Although a lens is used for the eyepiece here, a spherical mirror or the like may be used.

  The reflected and scattered light (return light) from the retina of the eyeball 211 travels in the opposite direction along the same path as the incident light, and part of it is reflected by the light splitting unit 206 onto the wavefront sensor 215 and used to measure the wavefront of the beam. The wavefront sensor 215 is connected to the adaptive optics control unit 216 and transmits the measured wavefront to it. The wavefront correction device 208 is also connected to the adaptive optics control unit 216 and performs the modulation it instructs. Based on the wavefront measured by the wavefront sensor 215, the adaptive optics control unit 216 calculates a modulation amount (correction amount) that corrects the wavefront to one having no aberration, and instructs the wavefront correction device 208 to apply it. The wavefront measurement and the instruction to the wavefront correction device 208 are repeated, and feedback control is performed so that the wavefront is always optimal.
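The repeated measure-then-correct cycle described above can be sketched as a simple integrator-type control loop. The gain value, iteration count, and function names are illustrative assumptions; a real adaptive optics loop operates on two-dimensional wavefront maps and corrector actuator commands rather than plain vectors:

```python
import numpy as np


def ao_feedback(measure_wavefront, apply_correction, gain=0.3, n_iter=50):
    # Minimal integrator AO loop (sketch): each iteration measures the
    # residual wavefront error and subtracts a fraction of it from the
    # accumulated correction, driving the residual toward zero.
    correction = None
    for _ in range(n_iter):
        residual = measure_wavefront()       # wavefront sensor reading
        if correction is None:
            correction = np.zeros_like(residual, dtype=float)
        correction -= gain * residual        # integrate the error
        apply_correction(correction)         # command the wavefront corrector
    return correction
```

With a static aberration and gain g, the residual shrinks by a factor (1 - g) per iteration, which is why the loop converges.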

Part of the reflected and scattered light transmitted through the light splitting unit 206 is reflected by the light splitting unit 204 and guided to the light intensity sensor 214 through the collimator 212 and the optical fiber 213. The light is converted into an electrical signal by the light intensity sensor 214, formed into an image of the eye by the control unit 217, and displayed on the display 218. In the configuration of FIG. 4, by increasing the swing angle of the scanning optical system and instructing the adaptive optics control unit 216 not to perform aberration correction, the SLO image capturing apparatus 20 operates as a normal SLO apparatus and can capture a wide-angle SLO image (wide-angle image D L).

(Processing procedure)
Specific processing contents executed by the ophthalmologic apparatus 10 will be described in detail in association with the role of each functional block. FIG. 5 is a flowchart showing a processing procedure executed by the ophthalmologic apparatus 10. The following steps are executed based on the control of the CPU 301.

<Step S510>
The wide-angle image acquisition unit 111 requests the SLO image capturing apparatus 20 to acquire the wide-angle image D L and the fixation target position F L. In the present embodiment, the fixation target position F L is set at the fovea of the macula to acquire the wide-angle image D L. Note that the method of setting the imaging position is not limited to this, and any other position may be used.

In response to the acquisition request from the wide-angle image acquisition unit 111, the SLO image capturing apparatus 20 acquires the wide-angle image D L and the fixation target position F L and transmits them to the wide-angle image acquisition unit 111. The wide-angle image acquisition unit 111 receives the wide-angle image D L and the fixation target position F L from the SLO image capturing apparatus 20 through the LAN 30 and stores them in the storage unit 120. In the present embodiment, the wide-angle image D L is a moving image whose frames have already been registered (aligned) with one another.

<Step S520>
The image acquisition pattern presentation unit 1311 acquires at least one type of image acquisition pattern (a basic setting pattern for the parameters used to acquire a plurality of high-magnification images) from the storage unit 120 and displays it on the monitor 305 in a selectable manner. Although an arbitrary pattern can be presented as the image acquisition pattern, the present embodiment describes a case where basic patterns as shown in FIGS. 6(a) to 6(f) are presented: FIG. 6(a) shows a linear pattern, FIG. 6(b) a cruciform pattern, FIG. 6(c) a radial pattern, FIG. 6(d) a rectangular pattern, FIG. 6(e) a disc-shaped pattern, and FIG. 6(f) an annular pattern.

  Next, the instruction acquisition unit 140 acquires from the outside an instruction concerning the operator's selection of an image acquisition pattern. This instruction is input by the operator via, for example, the keyboard 306 or the pointing device 307. Alternatively, when the monitor 305 is a touch-panel liquid crystal display, the input may be performed via the touch panel. In this embodiment, the observation target is a disc-shaped photoreceptor cell defect region as shown in FIG. 7D, and therefore the disc-shaped image acquisition pattern shown in FIG. 6(e) is selected.

Note that an image acquisition pattern need not consist only of high-magnification images of a single magnification as in FIGS. 6(a) to 6(f); a combination of images of a plurality of magnifications can also be presented as an image acquisition pattern. For example, as shown in FIG. 6(g), not only the high-magnification images D 1j but also intermediate-magnification images D 2k may be included in the defined acquisition pattern. Hereinafter, such a pattern including intermediate-magnification images as well as high-magnification images is referred to as a "multi-magnification image acquisition pattern". Such a pattern is suitable when the number of acquired images is to be reduced, or when alignment with the wide-angle image D L must be performed more accurately. In a multi-magnification image acquisition pattern, the shape of the pattern formed by the images at each magnification may be the same or different for each magnification; for example, the lower-magnification intermediate images D 2k may be acquired in a rectangular arrangement and the high-magnification images D 1j in a disc-shaped arrangement. When the pattern shapes differ between magnifications, information on the type of pattern at each magnification is also acquired in this step. FIG. 6(g) shows a case where the pattern formed by the high-magnification images D 1j and the pattern formed by the intermediate-magnification images D 2k are both cruciform.

  In addition, as shown in FIG. 6(h), a pattern in which a plurality of basic patterns are arranged at different positions (hereinafter referred to as a "multi-arrangement image acquisition pattern") may be presented. FIG. 6(h) shows an arrangement of a plurality of rectangular image acquisition patterns. This is suitable when there are a plurality of lesions, or when the morphology and dynamics of the observation target are to be compared region by region. The multi-arrangement image acquisition pattern also covers the case where images are acquired with a different in-focus position for each basic pattern. Furthermore, as shown in FIG. 6(i), an image acquisition pattern defined as a combination of basic patterns (hereinafter referred to as a "composite image acquisition pattern") may be presented. The composite image acquisition pattern is suitable for efficiently acquiring images for different purposes in one examination. For example, FIG. 6(i) combines measuring the shape of the avascular region (the closed white-line region in FIG. 6(i)) around the fovea (the center of gravity of the avascular region, indicated by the black dot), using the disc-shaped part, with measuring the photoreceptor cell density at a fixed distance from the fovea, using the cruciform part.
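As a concrete illustration, the centers of the high-magnification images in a disc-shaped basic pattern such as that of FIG. 6(e) could be generated by laying overlapping fields of view on a grid and clipping to a disc. The function name, parameters, and overlap fraction are hypothetical, not values from the patent:

```python
import math


def disc_pattern(radius, fov, overlap=0.1):
    # Lay square fields of view of side `fov` on a grid whose pitch leaves
    # the requested fractional overlap between neighbors, and keep only the
    # centers that fall inside a disc of the given radius.
    step = fov * (1.0 - overlap)
    n = int(radius / step) + 1
    centers = []
    for ix in range(-n, n + 1):
        for iy in range(-n, n + 1):
            x, y = ix * step, iy * step
            if math.hypot(x, y) <= radius:
                centers.append((x, y))
    return centers
```

A cruciform, annular, or radial pattern would differ only in the predicate that keeps or discards each grid center.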

<Step S530>
In the determination unit 132, the acquisition parameters of the plurality of images included in the image acquisition pattern selected in S520 are set as initial values, and the operator adjusts the image acquisition parameters as necessary; the acquisition parameters of the plurality of high-magnification images are thereby determined, and the images are acquired. The processing of this step (hereinafter referred to as "high-magnification image acquisition processing") will be described in detail later using the flowchart shown in FIG. 8.

<Step S540>
The alignment unit 133 performs alignment between the wide-angle image D L and the high-magnification images D H, and obtains the relative position of each high-magnification image D H on the wide-angle image D L. Here, alignment means automatically determining the positional relationship between the wide-angle image D L and a high-magnification image D H and placing D H at the corresponding position on D L. The wide-angle image D L is a previously given image, of lower magnification than the high-magnification images, that shows the entire imaging region. When high-magnification images D Hj overlap one another, the inter-image similarity is first calculated over each overlapping area, and the positions of the images D Hj are aligned at the position where that similarity is maximized. Next, when high-magnification images of different magnifications were acquired in S530, alignment is performed in order from the lower-magnification images. For example, when the high-magnification images D 1j and the intermediate-magnification images D 2k are acquired as the high-magnification images D H, alignment is first performed between the wide-angle image D L and the intermediate-magnification images D 2k, and then between the intermediate-magnification images D 2k and the high-magnification images D 1j. Needless to say, when there is only one magnification of high-magnification image, only the alignment between the wide-angle image D L and the high-magnification images D H is performed.

The alignment unit 133 acquires from the storage unit 120 the fixation target position F H used when capturing the high-magnification image D H, and sets it as the initial search point for the alignment parameters in the alignment between the wide-angle image D L and the high-magnification image D H. Any known method can be used for the inter-image similarity and the coordinate transformation; in this embodiment, a correlation coefficient is used as the inter-image similarity and an affine transformation is used as the coordinate transformation method.
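A minimal sketch of this alignment step, using the correlation coefficient as the inter-image similarity: for brevity it searches only over translations around the initial point (the fixation target position), whereas the patent allows a full affine transformation, and all names here are illustrative:

```python
import numpy as np


def correlation(a, b):
    # Pearson correlation coefficient between two same-sized patches.
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0


def align_translation(wide, patch, search=5, init=(0, 0)):
    # Brute-force search around the initial point for the translation that
    # maximizes the correlation coefficient between the high-magnification
    # patch and the corresponding window of the wide-angle image.
    h, w = patch.shape
    best, best_pos = -2.0, init
    y0, x0 = init
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + h > wide.shape[0] or x + w > wide.shape[1]:
                continue
            c = correlation(wide[y:y + h, x:x + w], patch)
            if c > best:
                best, best_pos = c, (y, x)
    return best_pos, best
```

Starting the search at the fixation target position keeps the search window small, which is the point of using it as the initial search point.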

<Step S550>
The display control unit 131 displays the high-magnification images D H on the wide-angle image D L based on the alignment parameter values obtained in S540. The processing in this step (hereinafter referred to as "image display processing") will be described in detail later using the flowchart shown in FIG. 9.

<Step S560>
The instruction acquisition unit 140 acquires from the outside an instruction as to whether to save the wide-angle image D L, the high-magnification images D H, the fixation target positions F L and F H, and the alignment parameter values obtained in S540 to the data server 40. This instruction is input by the operator via, for example, the keyboard 306 or the pointing device 307. If saving is instructed (YES in S560), the process proceeds to S570; if not (NO in S560), the process proceeds to S580.

<Step S570>
The image processing unit 130 associates the examination date and time, information identifying the eye to be examined, the wide-angle image D L, the high-magnification image D H, the fixation target positions F L and F H, and the alignment parameter values with one another, and transmits them to the data server 40.

<Step S580>
The instruction acquisition unit 140 acquires an instruction from the outside as to whether or not to end the processing on the wide-angle image D L and the high-magnification image D H by the ophthalmologic apparatus 10. This instruction is input by the operator via the keyboard 306 or the pointing device 307. If an instruction to end the process is acquired (YES in S580), the process ends. On the other hand, if an instruction to continue the process is acquired (NO in S580), the process returns to S510, and processing for the next eye to be examined, or reprocessing for the same eye, is performed.

(High magnification image acquisition processing)
Next, details of the high-magnification image acquisition process executed in S530 will be described with reference to the flowchart shown in FIG.

<Step S810>
The determination unit 132 acquires from the storage unit 120 the type of the image acquisition pattern selected through the instruction acquisition unit 140 and the acquisition parameter values of each high-magnification image D H belonging to that pattern. Specifically, among the acquisition parameters of each high-magnification image held by the selected image acquisition pattern, the following values are input as initial values: the number of magnifications, each angle of view, and the pixel size to the magnification determination unit 1321; the acquisition position and the in-focus position to the position determination unit 1322; the number of frames, the frame rate, and the number of repeated acquisitions to the time determination unit 1323; and the acquisition order to the order determination unit 1324.

<Step S820>
The determination unit 132 acquires, through the instruction acquisition unit 140, the constraint condition related to the setting value of the acquisition parameter of each high-magnification image D Hj constituting the selected image acquisition pattern. The constraint condition is a condition that determines a range that the imaging condition can take. The constraint condition can be instructed and set by an operator regarding an arbitrary image acquisition parameter. In this embodiment, a case where the following four constraint conditions can be set will be described.
a) Total image acquisition time
b) Type of magnification (number of magnifications, angle of view, and pixel size)
c) In-focus position
d) Overlapping region between adjacent high-magnification images

Here, a) is the permissible time that the eye can withstand per examination, b) depends on the size of the image feature expected to be obtained at the imaging position, c) is the position in the depth (z-axis) direction where the observation object exists, and d) is a constraint related to the amount of fixation disparity of the eye to be examined. In this embodiment, an example will be described in which these are set as a) 15 minutes, b) one magnification with 300 [μm] × 300 [μm] and 1 [μm/pixel] × 1 [μm/pixel], c) the photoreceptor layer, and d) 20% of the high-magnification image area.
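Constraint a), for instance, can be verified by summing the imaging time implied by the time-related parameters of each high-magnification image. The helper below is a hypothetical sketch, not part of the apparatus:

```python
def total_acquisition_time_s(frames, frame_rates, repeats):
    """Sum of (number of frames / frame rate) x number of repeated
    acquisitions over all high-magnification images, in seconds."""
    return sum(f / r * n for f, r, n in zip(frames, frame_rates, repeats))

def satisfies_time_constraint(frames, frame_rates, repeats, limit_s=15 * 60):
    # Constraint a): total image acquisition time must stay within the
    # permissible time the eye can withstand per examination (15 minutes here).
    return total_acquisition_time_s(frames, frame_rates, repeats) <= limit_s
```

For example, nine images of 256 frames at 32 frames/sec acquired once each take 72 seconds and satisfy the 15-minute limit.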

<Step S830>
A magnification determination unit 1321 determines the type of magnification (number of magnifications, field angle, and pixel size) of the high-magnification image DH . Further, the position determination unit 1322 determines the acquisition position and the focus position of each high-magnification image D Hj .

In the present embodiment, the angle of view, the pixel size, and the in-focus position are fixed values because of the constraint conditions acquired in S820, and the acquisition position of each high-magnification image D Hj is the variable parameter. The operator therefore first designates the position on the fundus of the point (representative point) on which the image acquisition pattern is based. In the present embodiment, the representative point is the center point C in FIG. 6E and is set at the central fovea of the eye to be examined. Next, when the operator enlarges or reduces the size of the entire image acquisition pattern, the number of high-magnification image acquisition positions increases or decreases while the overlap area size between adjacent high-magnification images is maintained, and the acquisition position of each high-magnification image D Hj is thereby determined. In the present embodiment, the operator enlarges the size of the entire image acquisition pattern by moving the position of one high-magnification image located at the edge of the disc-shaped pattern outward, and the acquisition positions of the high-magnification images D Hj are determined as shown by the white-line rectangular regions in FIG. 7(g).

  The representative point is not limited to the center point of the image acquisition pattern. For example, the position of a specific high-magnification image constituting the image acquisition pattern may be used. In the case of a multi-magnification type image acquisition pattern as shown in FIG. 6G, the sizes of the image acquisition patterns of all magnifications may be enlarged or reduced collectively, or image acquisition is performed for each magnification. The pattern size may be changed.

  Also in the case of the multi-arrangement type image acquisition pattern shown in FIG. 6H, the image acquisition pattern sizes or the arrangement intervals of all the arrangements may be changed together. Further, the size or arrangement interval of the image acquisition pattern may be different for each arrangement of the basic pattern. Further, in the case of a composite image acquisition pattern as shown in FIG. 6 (i), the sizes of all types of patterns may be changed together, or the size of the image acquisition pattern for each type of image acquisition pattern. May be changed.

<Step S840>
The time determination unit 1323 determines the number of frames, the frame rate, and the number of repeated acquisitions of each high-magnification image. In the example of the present embodiment, the frame rate and the number of repeated acquisitions are fixed values (the frame rate is 32 [frames/sec]), and the number of frames is the variable parameter. The variable parameter value can be changed efficiently by the operator using any known user interface (hereinafter abbreviated as "UI"); in this embodiment, the parameter value (weight) changing UI shown in FIG. 6J is used. This UI adjusts weights in the radial direction with respect to the number of frames of each high-magnification image D Hj. In FIG. 6(j), Wc represents the weight on the center side of the disc-type image acquisition pattern (FIG. 7(g)) and Wo represents the outside weight. By reducing the outer weight Wo, the number of frames of each high-magnification image D Hj is automatically determined so as to decrease stepwise from the center of the pattern toward the outside.
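The stepwise decrease of the frame count under the radial weights Wc and Wo could be realized, for example, by the following hypothetical linear interpolation (the actual UI behaviour is not specified at this level of detail):

```python
import math

def frames_from_radial_weights(positions, center, wc, wo, base_frames):
    """Interpolate a weight between the center weight Wc and the outside
    weight Wo according to each image's normalized distance from the
    pattern center, and scale the base frame count accordingly."""
    dists = [math.dist(p, center) for p in positions]
    rmax = max(dists) or 1.0  # avoid division by zero for a single position
    return [round(base_frames * (wc + (wo - wc) * d / rmax)) for d in dists]
```

With Wc = 1.0 and Wo = 0.5, an image at the pattern center keeps the full frame count while the outermost images get half of it, with intermediate positions graded in between.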

In the case of a multi-magnification pattern as shown in FIG. 6G, a parameter value (or weight) adjustment UI as shown in FIG. 6K can be used. Adjustment of the parameter value is performed according to the following procedures i) to iv).
i) A variable parameter to be adjusted is selected from the variable parameter list V.
ii) Select the target magnification and target image for adjusting the parameter value from the adjustment map.
iii) Select a parameter value change (weighting) method R between multiple images at the selected magnification.
iv) A parameter value (weight) for the selected image is determined on the parameter value changing UI (B).
Note that in i), the adjustment map shown in FIG. 6K is displayed for each type of the selected variable parameter. With regard to ii), FIG. 6K shows a case where the target magnification is D 1 and the central D 1 image D 1c is selected as the adjustment target image.
Regarding iii), FIG. 6(k) illustrates a case where the parameter value setting (weighting) method R between images is selected from among the following:
・setting the same value for all images of the same magnification (uniform);
・changing the value in stages (gradual);
・changing the value for specified individual images (individual).
In the example of FIG. 6(k), the stepwise change (gradual) is selected.
Regarding iv), FIG. 6(k) shows a case where, by designating the maximum value (white) on the color bar B, the parameter values of the images at the same magnification as the selected target image D 1c are automatically changed so as to increase stepwise toward the selected image D 1c.

  In FIG. 6 (k), the parameter value changing UI is the color bar B and the parameter values are expressed in grayscale. However, the present invention is not limited to this. For example, the parameter value changing UI may be a slider or a (numeric) list box. The parameter value may be displayed as a numerical value (parameter value itself or weight) or may be displayed in color. Alternatively, the parameter value may be displayed using both a numerical value and a gray scale (color).

In the case of a multi-arrangement pattern as shown in FIG. 6(h), both the weights within each pattern (for example, Wm1dhi in FIG. 6(l)) and the weights between patterns (Wm1 and Wm2) are adjusted, as shown in FIG. 6(l). In the case of a composite image acquisition pattern as shown in FIG. 6(m), the parameter values within each pattern (Wai or Wci) and the parameter value for the images common to both patterns (Wbi) are set. The adjustment procedure is almost the same as for the multi-magnification type image acquisition pattern of FIG. 6(g); the only difference is that procedure iii) above is replaced by selecting the parameter value changing (weighting) method R between the plurality of images in the selected pattern or common area, rather than at the selected magnification.

<Step S850>
The order determination unit 1324 determines the acquisition order of the high-magnification images D Hj. In the present embodiment, the following operations i) to iv) are nested as loops, with i) as the innermost loop (highest priority), ii) second, iii) third, and iv) as the outermost loop (lowest priority). Specifically, the following procedure is executed after setting the acquisition start position to the position of highest importance for observation (the fovea in the present embodiment) and the acquisition magnification to the minimum magnification.
i) Acquire images of the same arrangement pattern, the same acquisition magnification, and the same image acquisition position as many times as the number of acquisitions.
ii) Images in the same arrangement pattern and the same acquisition magnification are acquired in the same manner as in i) after being moved to an adjacent image acquisition position.
iii) When ii) is completed, increase the value of the acquisition magnification and execute the operation of ii) again, and repeat the same operation for the number of magnifications.
iv) When iii) is completed, the operation of iii) is executed in another arrangement and repeated until images are acquired in all arrangements.

  In this embodiment, images in i) are not repeatedly acquired (the number of acquisitions is one), and the image acquisition pattern is not a multi-arrangement type as shown in FIG. 6(h), so the processing of iv) is omitted. The movement to the adjacent image in ii) may be in any direction; in the present embodiment, because positions closer to the fovea have a stronger influence on visual function and higher importance for observation, the acquisition position is moved outward in a spiral from the fovea.
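The nesting of loops i) to iv) can be sketched generically as follows (a hypothetical sketch; position ordering, e.g. the spiral from the fovea, is assumed to be given in the `positions` list):

```python
def acquisition_order(arrangements, magnifications, positions, n_repeats):
    """Enumerate acquisitions with iv) arrangements as the outermost loop,
    iii) magnifications (low to high), ii) adjacent positions, and
    i) repeated acquisitions at the same position as the innermost loop."""
    seq = []
    for a in arrangements:                   # iv) outermost (lowest priority)
        for m in magnifications:             # iii) increase magnification
            for p in positions:              # ii) move to adjacent position
                for r in range(n_repeats):   # i) innermost (highest priority)
                    seq.append((a, m, p, r))
    return seq
```

With a single arrangement and one repeat, as in this embodiment, the sequence simply visits every position at the lowest magnification first and then repeats the sweep at each higher magnification.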

  Through the processing of S830 to S850, the operator can easily change the image acquisition parameter indicating the imaging condition of the high-magnification image.

<Step S860>
The high-magnification image acquisition unit 112 requests the SLO image capturing apparatus 20 to acquire a plurality of high-magnification images D Hj and the fixation target positions F Hj using the image acquisition parameters instructed by the determination unit 132. The SLO image capturing apparatus 20 acquires and transmits the high-magnification images D Hj and the fixation target positions F Hj in response to this request, and the high-magnification image acquisition unit 112 receives them from the SLO image capturing apparatus 20 via the LAN 30 and stores them in the storage unit 120. In the present embodiment, each high-magnification image D Hj is a moving image that has undergone inter-frame alignment.

(Image display processing)
Next, details of the image display processing executed in S550 will be described with reference to the flowchart shown in FIG.

<Step S910>
A representative image is generated from each moving image acquired by the wide angle image acquisition unit 111 and the high magnification image acquisition unit 112. In this embodiment, a superimposed image relating to each moving image is generated, and this superimposed image is used as a representative image. In addition, the method of generating the representative image is not limited to this, and for example, a reference frame that is set when the moving image is aligned between frames may be set as the representative image. Any known setting method can be used as the reference frame setting method. For example, the first frame number can be set as the reference frame.

<Step S920>
The display control unit 131 corrects the density difference between the high-magnification images when a plurality of high-magnification images D Hj are acquired. Any known luminance correction method can be applied. In the present embodiment, a histogram Hj is generated for each high-magnification image D Hj, and the density difference is corrected by linearly transforming the luminance values of each high-magnification image D Hj so that the mean and variance of the histograms Hj become values common to all the high-magnification images D Hj.
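The linear luminance transform described here can be sketched as follows. This is an illustrative helper (not the apparatus implementation) that matches an image's mean and standard deviation to common target values; matching the standard deviation is equivalent to matching the variance.

```python
import numpy as np

def match_mean_std(img, target_mean, target_std):
    """Linearly transform luminance so that the image's histogram mean and
    standard deviation equal the common target values."""
    img = img.astype(float)
    m, s = img.mean(), img.std()
    if s == 0:
        # A flat image carries no contrast to rescale; map it to the target mean.
        return np.full_like(img, target_mean)
    return (img - m) * (target_std / s) + target_mean
```

Applying this to every D Hj with the same target values removes the visible density steps between adjacent tiles in the montage.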

<Step S930>
When displaying the high-magnification image D H on the wide-angle image D L as a moving image, the display control unit 131 sets the reproduction speed of the high-magnification image D H. A playback speed adjustment slider and a frame advance button are arranged in the image display area, and the playback speed is adjusted by the operator designating a value through the instruction acquisition unit 140.

  In the present embodiment, the still image (superimposed image) generated in S910 is pasted and displayed, so this process is omitted.

<Step S940>
The display control unit 131 controls display/non-display of each high-magnification image D Hj and the display magnification. For displaying or hiding images, a list of acquired images is displayed on the monitor 305, a UI (a check box in this embodiment) is arranged next to each image name in the list, and the operator sets the UI ON or OFF through the instruction acquisition unit 140. A UI (check box) for designating all images at once and a batch designation UI (check box) for each acquisition magnification are also provided to facilitate switching display/non-display for a large number of images.

In this step, not only the display/non-display of images is set, but also the overlapping (stacking) order when there is an overlapping area between adjacent high-magnification images D Hj or when images are captured multiple times at the same fixation target position. Any setting method for the overlapping order of the moving images can be used, including manual setting. In this embodiment, an image quality index and a fixation disparity amount are calculated for each image, and, using a linear sum of the image quality index value and the fixation disparity amount as an evaluation function, the image with the highest evaluation value is set as the foremost layer and displayed. Any known index can be used as the image quality index value; in the present embodiment, the average luminance value of the image histogram is used. As the fixation disparity amount, a value obtained by summing the absolute values of the translational movement distances between adjacent frames over all frames is used; any index capable of evaluating fixation disparity may be used. As for the display magnification, the high-magnification image designated by the operator through the instruction acquisition unit 140 is enlarged and displayed on the monitor 305.
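The stacking-order evaluation can be sketched as below. The sign convention (quality rewarded, fixation disparity penalized) is an assumption for illustration, since the text states only that a linear sum is used as the evaluation function:

```python
def stacking_order(quality, disparity, wq=1.0, wd=-1.0):
    """Evaluation = wq * image-quality index + wd * fixation disparity.
    Returns image indices ordered front-most first (highest score first)."""
    scores = [wq * q + wd * d for q, d in zip(quality, disparity)]
    return sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
```

For overlapping tiles, the first index in the returned order is the image drawn as the foremost layer.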

Although the example above dealt with the case where the wide-angle image D L is a single wide-angle SLO image, the present invention is not limited to this. For example, a composite image obtained by aligning wide-angle images D Li acquired at different positions may be used as the wide-angle image D L.

  As described above, in the ophthalmologic apparatus 10 according to the present embodiment, the operator selects from a plurality of basic patterns of high-magnification image acquisition parameters (imaging conditions), the parameter values are adjusted according to the lesion shape, and a high-magnification image is acquired based on the adjusted parameter values. That is, the ophthalmologic apparatus 10 presents to the operator, in a selectable manner, a plurality of basic patterns indicating distributions of the positions at which high-magnification images are captured. The imaging conditions for capturing the high-magnification images, associated in advance with the basic pattern selected by the operator, are then adjusted according to the operator's instructions. Furthermore, the imaging apparatus is caused to capture a plurality of high-magnification images in the imaging region according to the adjusted imaging conditions. According to the present embodiment, it is therefore possible to easily set appropriate imaging conditions for acquiring, within a given imaging region, a plurality of high-magnification images whose angle of view is smaller than that of the imaging region. Tissues, cell groups, or lesion candidates whose distribution differs from eye to eye can thus be imaged efficiently over a range wider than a single high-magnification image.

In the present embodiment, the imaging conditions are adjusted based on at least one of the position of the representative point of the selected basic pattern in the imaging region, the constraint conditions designated by the operator, and the change amounts of the imaging conditions. Appropriate imaging conditions can therefore be set easily in accordance with the imaging target. For example, cell groups, tissues, and lesions differ between individuals in shape and density, and the area to be observed or measured in detail varies depending on the eye to be examined. According to the configuration of this embodiment, after designating the cell group, tissue, or lesion area to be observed for each eye, the acquisition parameters of the plurality of high-magnification images D Hj can be set automatically according to the shape and density of that area.

  In the present embodiment, as imaging conditions, the position at which a high-magnification image is captured in a wide-angle image, the imaging order, the number of images to be imaged at the same position, the field angle and pixel size of the high-magnification image, and the imaging frame Although the number, the frame rate, and the in-focus position are exemplified, the present invention is not limited to this.

<< Embodiment 2 >>
The ophthalmologic apparatus according to the present embodiment determines, when acquiring a plurality of high-magnification adaptive optics SLO images, the acquisition parameter values based on image features extracted from an image with a wider angle of view than the high-magnification images. Specifically, when a plurality of high-magnification moving images D 1j are acquired in an annular arrangement over the capillary region of the parafovea, the acquisition position, angle of view, pixel size, number of frames, frame rate, and number of repeated acquisitions of each high-magnification image D 1j are determined based on these image features. Furthermore, it is determined whether an acquired high-magnification image contains exceptional frames caused by fixation disparity or blinking, and when reacquisition is judged necessary based on this determination, a high-magnification image is reacquired with the same acquisition parameter values.

(overall structure)
FIG. 1B shows the configuration of devices connected to the ophthalmologic apparatus 10 according to the present embodiment. The present embodiment differs from the configuration of the first embodiment in that a time phase data acquisition device 50 is connected to the ophthalmologic apparatus 10 in addition to the SLO image capturing device 20 and the data server 40. The time phase data acquisition device 50 is a device that acquires biological signal data that changes autonomously and periodically (referred to as "time phase data"), and comprises, for example, a pulse wave meter or an electrocardiograph. The time phase data acquisition device 50 acquires time phase data Sj simultaneously with the acquisition of the high-magnification images D Hj in response to an operation by an operator (not shown). The obtained time phase data Sj is transmitted to the ophthalmologic apparatus 10 and the data server 40. In the present embodiment, a high-magnification image is acquired and displayed on the monitor 305 in synchronization with the periodic timing indicated by the time phase data measured from the subject. For this reason, a high-magnification image can be acquired or reproduced at an appropriate timing according to changes in the living body.

In addition to the wide-angle image D L and high-magnification image D H of the eye and acquisition condition data such as the fixation target positions F L and F H used at the time of acquisition, the data server 40 holds the time phase data Sj and normal values relating to image features of the eye and their distributions. In this embodiment, retinal blood vessels, capillaries Q, and blood cells W are held as image features of the eye, but the present invention is not limited to these. The time phase data Sj output from the time phase data acquisition device 50 and the image features of the eye output from the ophthalmologic apparatus 10 are stored in the data server 40. Further, in response to a request from the ophthalmologic apparatus 10, the time phase data Sj and the normal value data relating to the image features of the eye and their distributions are transmitted to the ophthalmologic apparatus 10.

(Ophthalmic equipment)
Next, the functional blocks of the ophthalmologic apparatus 10 according to the present embodiment are shown in FIG. In addition to the configuration of the first embodiment, in the ophthalmologic apparatus 10 of the present embodiment the data acquisition unit 110 includes a time phase data acquisition unit 113, the image processing unit 130 includes an image feature acquisition unit 134, the determination unit 132 includes a reacquisition necessity determination unit 1325, and the alignment unit 133 includes an exceptional frame determination unit 1331. The time phase data acquisition unit 113 is a functional block that acquires time phase data of the subject. The image feature acquisition unit 134 is a functional block that analyzes a wide-angle image and acquires information indicating its features. The reacquisition necessity determination unit 1325 is a functional block that determines whether or not to acquire a high-magnification image again. The exceptional frame determination unit 1331 is a functional block that detects, as an "exceptional frame", a frame unsuitable for examination, such as a frame with a large positional shift due to fixation failure. When an "exceptional frame" is detected, the high-magnification image is reacquired.

(Processing procedure)
The image processing flow in this embodiment is the same as that in FIG. 5, and the processing in S510, S560, S570, and S580 is the same as that in the first embodiment. Further, step S540 is omitted. Therefore, in the present embodiment, the processing of S520, S530, and S550 will be described.

<Step S520>
The image acquisition pattern presentation unit 1311 acquires at least one type of image acquisition pattern for acquiring a plurality of high-magnification images from the storage unit 120 and displays the acquired image acquisition pattern on the monitor 305. In the present embodiment, the image acquisition pattern presentation unit 1311 presents basic patterns of a linear shape, a cross shape, a radial shape, a rectangular shape, a disc shape, an annular shape, a multi-magnification type, a multi-placement type, and a composite type.

Next, the instruction acquisition unit 140 acquires an instruction on which image acquisition pattern to select from the outside. In the present embodiment, the case where the observation target is an annular parafoveal capillary region as shown in FIG. 7C will be described as an example. In this case, since it is necessary to determine the inner boundary of this annular region based on the avascular region, the operator selects a multi-magnification type image acquisition pattern. Here, in the multi-magnification type image acquisition pattern, D 1j is annular and D 2k is rectangular.

  In this embodiment, the image acquisition pattern selection is not an essential process; the processing of this step may be omitted by setting the high-magnification image acquisition target region as an annular region and the intermediate-magnification image acquisition target region as a rectangular region from the beginning.

<Step S530>
The determination unit 132 requests the high-magnification image acquisition unit 112 to acquire the intermediate magnification images D 2k, and the high-magnification image acquisition unit 112 acquires them. The image feature acquisition unit 134 then acquires image features on the wide-angle image D L and the intermediate magnification images D 2k, the acquisition parameters of the plurality of high-magnification images are determined based on these image features, and the high-magnification images D 1j are acquired. Inter-frame alignment and exceptional frame determination are performed on the acquired high-magnification images D 1j, and if re-imaging is judged necessary based on the exceptional frame determination result, the same high-magnification image D 1j is captured again. Furthermore, the intermediate magnification images D 2k and the high-magnification images D 1j are aligned on the wide-angle image D L. The processing in this step (high-magnification image acquisition processing) will be described in detail later using the flowchart shown in FIG.

<Step S550>
Based on the alignment parameter values obtained in S1270 (described later), the display control unit 131 displays the high-magnification images D H superimposed on the wide-angle image D L as shown in FIG. In the present embodiment, a capillary image as shown in FIG. 7F is also displayed next to the superimposed image, as an image in which the distribution of the capillaries in the parafovea can be observed in more detail. As described above, in the present embodiment, display control is performed so that the plurality of captured high-magnification images are displayed on the monitor 305 superimposed on an image representing the entire imaging region, so that only the necessary portion of the wide-angle image is observed as a precise image. The processing in this step (image display processing) will be described in detail later using the flowchart shown in FIG.

(High magnification image acquisition processing)
Next, details of the process executed in S530 will be described with reference to the flowchart shown in FIG.

<Step S1210>
The determination unit 132 acquires the intermediate magnification image D 2k based on the D 2k image acquisition pattern selected in S520.

In this embodiment, a rectangular image acquisition pattern such as D 2k in FIG. 12A is set, and the fixation target is presented so that the center point C of the image acquisition pattern lies near the fovea. The angle of view, pixel size, number of frames, and frame rate are set to 600 [μm] × 600 [μm], 2 [μm/pixel] × 2 [μm/pixel], 256, and 64 [frames/sec], respectively. The overlapping area between adjacent intermediate magnification images is 10% of the intermediate magnification image area. As for the acquisition order, in this embodiment the intermediate magnification image D 25 at the center of the image acquisition pattern is the first acquisition position; acquisition then moves to the image on its right and proceeds counterclockwise through the adjacent images. The alignment unit 133 performs inter-frame alignment of the acquired intermediate magnification images D 2k and aligns the intermediate magnification images D 2k on the wide-angle image D L (image pasting). The coordinate conversion method and the similarity evaluation function used for alignment are the same as in the first embodiment, so detailed description is omitted.

<Step S1220>
The image feature acquisition unit 134 detects the capillary vessel from the wide-angle image D L or the intermediate magnification image D 2k acquired in S1210, and detects the boundary of the avascular region from the detected capillary region. Further, an annular (donut-shaped) region that is equidistant from the boundary position of the avascular region is detected in order to set a region near the avascular region as an acquisition target region of the high magnification image.

In the present embodiment, capillaries are first identified as the range of movement of blood cell components from the intermediate magnification images D 2k by the following procedure.
(a) Difference processing is performed between adjacent frames of the inter-frame-aligned intermediate magnification image D 2k; that is, a difference moving image is generated.
(b) A luminance statistic (for example, the variance) in the frame direction is calculated at each x-y position of the difference moving image generated in (a).
(c) At each x-y position of the difference moving image, a region where the luminance variance is equal to or greater than a predetermined threshold Tv is identified as a region where blood cells have moved, that is, as a capillary region.
The capillary detection processing is not limited to this method, and any known method may be used. For example, blood vessels may be detected by applying a filter that emphasizes linear structures to a particular frame of the wide-angle image D L or the intermediate magnification image D 2k.
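Steps (a) to (c) can be sketched with NumPy as follows. This is a simplified sketch; real data would first require the inter-frame alignment described above, and the threshold Tv would be tuned to the imaging conditions.

```python
import numpy as np

def capillary_mask(frames, tv):
    """frames: (T, H, W) inter-frame-aligned moving image.
    (a) difference moving image between adjacent frames,
    (b) per-pixel luminance variance along the frame direction,
    (c) threshold at Tv -> region where blood cells moved (capillaries)."""
    diff = np.diff(frames.astype(float), axis=0)  # (a) difference video
    var = diff.var(axis=0)                        # (b) variance over frames
    return var >= tv                              # (c) binary capillary mask
```

Static background pixels produce near-zero differences and are excluded, while pixels whose luminance fluctuates as blood cells pass exceed the threshold.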

  Next, the image feature acquisition unit 134 detects the boundary of the avascular region from the obtained capillary region. In the vicinity of the central fovea of the retina, there is a region where no retinal blood vessel exists (referred to as “avascular region”), as in the inner broken line region in FIG. The shape of the border of the avascular region varies greatly between individuals, and initial retinal vascular lesions are likely to occur around the border of the avascular region. Therefore, the avascular region boundary is important as an object of observation and analysis.

In the present embodiment, a deformable shape model (solid line portion in FIG. 12(a)) having a radius Tr smaller than that of the circle connecting the image centers of the high-magnification images D1j in the image acquisition pattern (broken line portion in FIG. 12(a)) is placed on the composite image of the intermediate magnification images D2k that were aligned on the wide-angle image in S1210. In the present embodiment, the model is arranged so that its center coincides with the center C of the intermediate magnification image D25 in FIG. The position of the deformable shape model after it has been deformed according to the image features on the composite image of the intermediate magnification images D2k (Bi in FIG. 12(b)) is defined as the avascular region boundary, and the center of gravity C' of this boundary is determined. Further, using a distance image obtained by applying a Euclidean distance transform to the avascular region boundary (an image whose pixel values are distances from the boundary), the positions at distances To and To/2 outward from the boundary (Bo and Bm in FIG. 12(b)) are determined, where To is a predetermined threshold. Although an arbitrary value can be set as the threshold To, it is generally about 150 [μm] for a healthy eye, so this value is also used in this embodiment. An annular (doughnut-shaped) high-magnification target region is determined using the identified inner boundary Bi and outer boundary Bo, and the broken-line portion Bm serves as the locus of candidates for the acquisition positions (image centers) of the high-magnification images D1j.
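A minimal sketch of the distance-image step (pure NumPy, with a brute-force nearest-boundary distance standing in for a true Euclidean distance transform; the grid size, circle radius, and threshold To are illustrative values, not those of the apparatus):

```python
import numpy as np

def distance_image(boundary_mask):
    """Distance from each pixel to the nearest boundary pixel
    (brute force; a distance-transform routine would be used in practice)."""
    ys, xs = np.nonzero(boundary_mask)
    pts = np.stack([ys, xs], axis=1).astype(float)
    h, w = boundary_mask.shape
    yy, xx = np.mgrid[0:h, 0:w]
    grid = np.stack([yy.ravel(), xx.ravel()], axis=1).astype(float)
    d = np.sqrt(((grid[:, None, :] - pts[None, :, :]) ** 2).sum(-1)).min(1)
    return d.reshape(h, w)

# Toy avascular-region boundary Bi: a circle of radius 5 on a 32x32 grid.
h = w = 32
yy, xx = np.mgrid[0:h, 0:w]
r = np.sqrt((yy - 16) ** 2 + (xx - 16) ** 2)
boundary = np.abs(r - 5) < 0.5

dist = distance_image(boundary)
to = 6.0                                         # threshold To
outside = r > 5                                  # outward side of Bi only
ring = outside & (dist <= to)                    # annular region Bi..Bo
centers = outside & (np.abs(dist - to / 2) < 0.5)  # locus near Bm
```

`ring` corresponds to the annular high-magnification target region and `centers` to the candidate locus Bm at distance To/2 from the boundary.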

  In the present embodiment, the distance from the avascular region boundary (the thickness of the annular region) is fixed by the threshold To, but the present invention is not limited to this. For example, in diseases in which lesions occur in the parafoveal retinal capillaries, such as diabetic retinopathy, capillaries become occluded and the avascular region enlarges as the disease progresses. When the avascular region enlarges, vascular lesions may arise over a wider area around it. Therefore, a value obtained by multiplying the threshold To by a value proportional to the area of the avascular region may be used as the distance from the boundary. In this case, the angle of view is treated as a variable parameter in S1230 and is set sufficiently large compared with the distance from the avascular region boundary, that is, the thickness of the annular region.

<Step S1230>
The magnification determining unit 1321 determines the number of magnifications, the angle of view, and the pixel size of the high-magnification images D1j. Further, the position determination unit 1322 determines the acquisition position and in-focus position of each high-magnification image D1j. In this embodiment, the number of magnifications, the angle of view, the pixel size, and the in-focus position are fixed parameters (2, 200 [μm] × 200 [μm], 1 [μm/pixel] × 1 [μm/pixel], and the retinal blood vessels, respectively), and the acquisition positions of the high-magnification images D1j are determined as variable parameters as follows.

First, the points obtained by sampling the boundary Bm determined in S1220 at equal intervals Td (Bms in FIG. 12(b)) are used as candidates for the acquisition positions of the high-magnification images D1j, and the acquisition positions are determined in order starting from a specific candidate point Bm0.
In this embodiment, the interval is given by
Td = angle of view of the high-magnification image D1j × (100 − standard value of the overlap ratio [%] between high-magnification images D1j) / 100 (1)
and the candidate point directly above the center of gravity of the avascular region is taken as Bm0. The detailed acquisition positions of the high-magnification images D1j are determined so as to satisfy both of the following conditions:
a) With respect to the radial direction of the annular region determined in S1220 (the direction of the line connecting the center of gravity C' of the avascular region and the acquisition position candidate point Bms), the total number of pixels of the high-magnification images D1j protruding outside the annular region is minimized, under the condition that no gap is left in the annular region.
b) With respect to the tangential direction at the boundary position Bm, the overlap matches the ratio of the overlapping area between high-magnification images described below.
Here, the ratio [%] of the overlapping region between high-magnification images D1j is set using the circularity Cr of the avascular region boundary specified in S1220:
Cr = 4πS / L² (2)
(where S is the area of the avascular region and L is its perimeter).
A value obtained by multiplying a standard set value (20% in this embodiment, as an example) by a value inversely proportional to the circularity Cr is set as the ratio of the overlapping region between the high-magnification images D1j. Therefore, the lower the circularity, that is, the larger the unevenness of the boundary, the larger the overlap ratio that is set.
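Equations (1) and (2) and the circularity-based overlap rule can be illustrated as follows (Python sketch; reading "inversely proportional" simply as division by Cr is an assumption made for the illustration):

```python
import math

def circularity(area, perimeter):
    """Cr = 4*pi*S / L^2 (equation (2)); equals 1.0 for a perfect circle."""
    return 4.0 * math.pi * area / (perimeter * perimeter)

def overlap_ratio(area, perimeter, standard=20.0):
    """Overlap ratio [%] between adjacent high-magnification images D1j:
    the standard value scaled inversely with circularity, so lower
    circularity (more unevenness) gives more overlap."""
    return standard / circularity(area, perimeter)

def sampling_interval(field_of_view, overlap_pct):
    """Equation (1): Td = angle of view x (100 - overlap ratio) / 100."""
    return field_of_view * (100.0 - overlap_pct) / 100.0

# A perfectly circular boundary of radius 100 um: Cr == 1,
# so the standard 20% overlap is used unchanged.
s = math.pi * 100.0 ** 2
l = 2.0 * math.pi * 100.0
cr = circularity(s, l)
ratio = overlap_ratio(s, l)
td = sampling_interval(200.0, ratio)   # for a 200 um field of view
```

For an irregular (low-Cr) boundary, `overlap_ratio` grows and `sampling_interval` shrinks, placing the image centers Bms closer together.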

  The method of determining the overlapping region between high-magnification images is not restricted to this; any known method may be used. For example, for each pair of high-magnification images adjacent along the avascular region boundary, the absolute value of the curvature is calculated within a certain distance of the intersection Bis between the avascular region boundary and the line connecting its center of gravity C' with the acquisition position candidate point Bms, and the average value Ch of the obtained absolute curvatures is computed. The standard set value of the overlap ratio is then weighted by multiplying it by a value proportional to the average curvature Ch; when Ch is 0, the standard set value is used as it is, without weighting. With this setting method, a large overlap ratio between high-magnification images can be set in the vicinity of positions where the absolute curvature of the avascular region boundary is large.

<Step S1240>
The time determination unit 1323 determines the number of frames, the frame rate, and the number of repeated acquisitions of the high-magnification images D1j. In this embodiment, the number of frames, the frame rate, and the number of repeated acquisitions at the same acquisition position are 256, 64 [frames/sec], and 2, respectively. However, the present invention is not limited to this, and any setting method may be used. For example, thinning processing may be applied to the capillary region in each high-magnification image D1j determined in S1220 and S1230, and the blood vessel diameter calculated in the direction orthogonal to the central axis of the obtained vessel. Then, only when there is a region where the blood vessel diameter shows an abnormal value, the number of frames and the number of repeated acquisitions of that high-magnification image D1j may be increased by the threshold values Tf and Tc, respectively.
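One rough way to realize this adaptive setting might look like the following (a pure-NumPy toy in which a row-wise run length stands in for the diameter measured perpendicular to a thinned centre line; the masks and the normal range are invented for illustration):

```python
import numpy as np

def vessel_widths(mask):
    """Width of a roughly vertical vessel at each row of a binary mask,
    standing in for diameters measured perpendicular to the centre line."""
    return mask.sum(axis=1)

def needs_more_frames(mask, lo, hi):
    """True when any measured diameter falls outside the normal range
    [lo, hi] pixels, i.e. when the frame count and repeat count for
    this D1j should be raised."""
    widths = vessel_widths(mask)
    widths = widths[widths > 0]     # rows the vessel actually crosses
    return bool(((widths < lo) | (widths > hi)).any())

mask = np.zeros((10, 12), dtype=int)
mask[:, 5:8] = 1                    # uniform vessel, 3 pixels wide
flag_before = needs_more_frames(mask, lo=2, hi=4)
mask[4, 3:10] = 1                   # local dilation (aneurysm-like bulge)
flag_after = needs_more_frames(mask, lo=2, hi=4)
```

Only after the bulge appears does the check report an abnormal diameter, which would trigger the increased frame and repeat counts.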

<Step S1250>
The order determination unit 1324 determines the acquisition order of the high-magnification images DHj. As in S850 of the first embodiment, the following steps i) to iii) are repeated as nested loops, with i) innermost (highest priority), ii) second, and iii) outermost (lowest priority). That is, after setting the acquisition start position to the temporal (ear) side and the acquisition magnification to the lowest magnification, the following procedure is executed:
i) Images are acquired repeatedly at the same acquisition magnification and the same image acquisition position, for the specified number of repeated acquisitions.
ii) An image at the same acquisition magnification is acquired in the same manner as i) from adjacent image acquisition positions (counterclockwise in this embodiment).
iii) When ii) is completed, increase the value of the acquisition magnification and execute the operation of ii) again, and repeat the same operation for the number of magnifications.
Note that the method for determining the order of high-magnification images is not limited to the above procedure, and an arbitrary order setting method may be used.
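The nested loop structure i) to iii) can be sketched as follows (Python, with generic indices standing in for the actual magnifications and fixation positions):

```python
def acquisition_order(n_magnifications, n_positions, n_repeats):
    """Enumerate acquisitions as (magnification, position, repeat) tuples:
    i)   repeats at the same magnification and position (innermost loop),
    ii)  adjacent positions (e.g. counterclockwise) at the same magnification,
    iii) increasing magnification (outermost loop), lowest magnification first.
    """
    order = []
    for mag in range(n_magnifications):      # iii) lowest magnification first
        for pos in range(n_positions):       # ii) adjacent positions
            for rep in range(n_repeats):     # i) repeated acquisition
                order.append((mag, pos, rep))
    return order

order = acquisition_order(n_magnifications=2, n_positions=3, n_repeats=2)
```

With 2 magnifications, 3 positions, and 2 repeats, 12 acquisitions are scheduled; both repeats at one position complete before the scan moves to the next position, and all positions at one magnification complete before the magnification increases.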

<Step S1260>
The high-magnification image acquisition unit 112 acquires a high-magnification image and time phase data according to the high-magnification image acquisition parameters determined in steps S1210 to S1250. The time phase data acquisition unit 113 requests the time phase data acquisition device 50 to acquire the time phase data Sj related to the biological signal. In this embodiment, a pulse wave meter is used as the time phase data acquisition device, and the pulse wave data Sj is acquired from the ear lobe (earlobe) of the subject. Here, the pulse wave data Sj is expressed as a periodic point sequence having an acquisition time on one axis and a pulse wave signal value measured by a pulse wave meter on the other axis. Since the time phase data acquisition device 50 acquires and transmits the corresponding time phase data Sj in response to the acquisition request, the time phase data acquisition unit 113 receives the pulse wave data Sj from the time phase data acquisition device 50 via the LAN 30. Receive. The time phase data acquisition unit 113 stores the received time phase data Sj in the storage unit 120.

The data acquisition unit 110 requests the SLO image capturing apparatus 20 to acquire the wide-angle image DL, a plurality of high-magnification images DHj captured at different fixation target positions Fj, and the fixation target position data Fj. Note that the data acquisition unit 110 may instead start acquiring the high-magnification images DHj in synchronization with a certain phase of the time phase data Sj acquired by the time phase data acquisition apparatus 50. In the present embodiment, acquisition of the time phase data Sj and of the high-magnification image DHj is started immediately after the acquisition request for the high-magnification image DHj, so the two acquisitions can be regarded as starting at the same time.

<Step S1270>
The alignment unit 133 performs inter-frame alignment on the acquired high-magnification images D1j, aligns the high-magnification images D1j on the wide-angle image DL, and displays the result on the monitor 305. In the present embodiment, when the frames are aligned, exception frame determination is performed to judge whether each frame of a high-magnification moving image D1j is an exceptional frame as described below. Any known method may be used for the inter-frame alignment within each moving image and for the alignment (image pasting) between images of different magnifications; in this embodiment, both use affine transformation and a correlation coefficient.

Here, as shown in FIG. 12C, exceptional frames are, among the frames of the high-magnification moving image DH, frames Es with a large positional shift due to fixation failure, low-luminance frames Eb due to blinking, and low-image-quality frames due to aberration correction failure (not shown). An exceptional frame can be determined based on whether the degree of luminance abnormality, the magnitude of distortion, the magnitude of noise relative to the signal, the amount of displacement relative to the reference frame, and the like are equal to or greater than certain values. Specifically, a frame is determined to be exceptional when:
a) the translation among the inter-frame alignment parameter values is equal to or greater than a threshold;
b) the average luminance value of the frame is less than a threshold; or
c) the S/N ratio of the frame is less than a threshold.
As a result of the exceptional frame determination, if the maximum interval between exceptional frames in a high-magnification moving image D1j is equal to or smaller than the threshold Te, or if the total number of exceptional frames is equal to or larger than the threshold Ts, the reacquisition necessity determining unit 1325 determines that the high-magnification image D1j needs to be reacquired. In that case, the reacquisition necessity determining unit 1325 requests the high-magnification image acquisition unit 112 to reacquire the high-magnification image D1j, and the high-magnification image acquisition unit 112 reacquires the high-magnification image D1j according to this request.
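The three exception criteria can be sketched as follows (a NumPy toy; the mean/std definition of the S/N ratio, the thresholds, and the frame data are assumptions made for the illustration):

```python
import numpy as np

def exceptional_frames(frames, shifts, t_shift, t_lum, t_snr):
    """Flag a frame as exceptional when
    a) its inter-frame translation is at or above t_shift (fixation failure),
    b) its mean luminance is below t_lum (blink), or
    c) its S/N ratio (mean/std here, one simple definition) is below t_snr."""
    flags = []
    for frame, (dy, dx) in zip(frames, shifts):
        shift = float(np.hypot(dy, dx))
        mean = float(frame.mean())
        std = float(frame.std())
        snr = mean / std if std > 0 else float("inf")
        flags.append(shift >= t_shift or mean < t_lum or snr < t_snr)
    return flags

rng = np.random.default_rng(1)
good = 100.0 + rng.normal(0.0, 5.0, (16, 16))    # steady, bright frame
blink = np.full((16, 16), 5.0)                   # low-luminance blink frame
frames = [good, blink, good]
shifts = [(0.0, 1.0), (0.0, 0.0), (12.0, 9.0)]   # last frame: fixation slip
flags = exceptional_frames(frames, shifts, t_shift=10.0, t_lum=50.0, t_snr=2.0)
```

The resulting flag list would then feed the Te/Ts reacquisition test described above.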

  Note that it is not necessary to perform the inter-frame alignment, exceptional frame determination, reacquisition necessity determination, and reacquisition only after all high-magnification images have been acquired; an image may be reacquired immediately as soon as reacquisition is determined to be necessary. Alternatively, the exceptional frame determination and reacquisition necessity determination may be executed during the inter-frame alignment of the intermediate magnification images D2k in S1210, and an intermediate magnification image D2k may be reacquired as soon as reacquisition is determined to be necessary. Further, the exceptional frame determination is not limited to the inter-frame alignment processing of the SLO moving image; for example, an anterior eye camera may be connected to the ophthalmologic apparatus 10, and the determination may be made using image processing of the anterior eye camera, such as low-luminance frame detection or pupil position detection.

(Image display processing)
Next, details of the processing executed in S550 will be described with reference to the flowchart shown in FIG. Since the processing is the same as in Embodiment 1 except for S910 and S930, only the processing in S910 and S930 is described in this embodiment.

<Step S910>
The display control unit 131 performs processing for generating an image in which the high-magnification images DH are superimposed on the wide-angle image DL, as shown in FIG. 7E, based on the alignment parameter values obtained in S1270. In this embodiment, since the high-magnification images DH are displayed as moving images rather than still images, no representative image is generated, as described in S930. However, in images after inter-frame alignment, areas with a pixel value of 0 may arise at the image edges and interfere with the display; therefore, throughout the frames, only pixels whose pixel value is greater than 0 are displayed.
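The nonzero-pixel pasting rule can be sketched as follows (a single-frame NumPy illustration; the image sizes, position, and values are invented for the example):

```python
import numpy as np

def paste_frame(wide, high, top, left):
    """Overlay one high-magnification frame on the wide-angle image,
    copying only pixels whose value is greater than 0 so that the
    zero-valued margins created by inter-frame alignment stay transparent."""
    out = wide.astype(float).copy()
    h, w = high.shape
    region = out[top:top + h, left:left + w]
    mask = high > 0
    region[mask] = high[mask]       # boolean assignment on a view of out
    return out

wide = np.full((10, 10), 50.0)
high = np.full((4, 4), 200.0)
high[:, 0] = 0.0                    # zero margin left by alignment
composite = paste_frame(wide, high, top=3, left=3)
```

The zero column of the pasted frame leaves the underlying wide-angle pixels visible, while all nonzero high-magnification pixels replace them.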

  Further, in the present embodiment, a capillary image as shown in FIG. 7F is pasted and displayed next to the above-described composite moving image, as an image that allows the distribution of the parafoveal capillaries to be observed in more detail. To obtain the capillary image, the processing for specifying the capillary region performed on the intermediate magnification images D2k in S1220 is executed in the same manner on the high-magnification images D1j as well, binary images are generated, and the images are pasted and displayed based on the alignment parameters obtained in S1270. As in the case of pasting moving images, only pixels with a pixel value greater than 0 are displayed, throughout the frames other than the exceptional frames.

<Step S930>
When displaying a plurality of intermediate magnification images D2k and high-magnification images D1j on the wide-angle image DL, the playback timing of each intermediate magnification image D2k and high-magnification image D1j is synchronized based on the time phase data (periodic data based on a biological signal such as a pulse wave). Specifically, the display control unit 131 acquires the time phase data Sj and Sk corresponding to each moving image (that is, to each high-magnification image D1j and intermediate magnification image D2k) from the time phase data acquisition unit 113, and calculates the pulsation period by detecting the extreme values of the time phase data. Next, the exceptional frame number sequence in each high-magnification image D1j and intermediate magnification image D2k is acquired, and a continuous frame sequence containing no exceptional frames is selected as the display target. Further, when the pulsation period in the selected frames differs between moving images (high-magnification images D1j, intermediate magnification images D2k), display frame interval adjustment processing ("frame interpolation processing") is performed between the moving images. Finally, the frames corresponding to an integer number of pulsation periods are played back, with the playback start time of each moving image adjusted so that the playback timings of the frames corresponding to the extreme values of the time phase data of each moving image coincide, and the pasted moving image is displayed.

  Note that the display method of the present invention is not limited to this; if no time phase data is acquired, this step may be omitted and the moving images may be pasted and displayed without adjusting the playback times.
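The extremum detection and pulsation-period calculation used for synchronization can be sketched roughly as follows (pure NumPy, with a synthetic sine wave standing in for pulse-wave data; the function names and sampling rate are illustrative):

```python
import numpy as np

def peak_indices(signal):
    """Indices of local maxima (extreme values) of a pulse-wave sequence."""
    s = np.asarray(signal, dtype=float)
    return [i for i in range(1, len(s) - 1) if s[i - 1] < s[i] >= s[i + 1]]

def pulsation_period(times, signal):
    """Mean interval between successive pulse-wave maxima."""
    peaks = peak_indices(signal)
    t = np.asarray(times, dtype=float)[peaks]
    return float(np.diff(t).mean())

def playback_start(times, signal):
    """Start playback at the first detected maximum; moving images whose
    starts are aligned this way beat in phase with each other."""
    return float(np.asarray(times, dtype=float)[peak_indices(signal)[0]])

# Synthetic pulse wave sampled at 100 Hz with a 1 s pulsation period.
t = np.arange(0.0, 3.0, 0.01)
s = np.sin(2.0 * np.pi * t)
period = pulsation_period(t, s)
start = playback_start(t, s)
```

In the apparatus, each moving image's start frame would be shifted so that all `playback_start` times coincide, and frames covering an integer number of `period`s would then be looped.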

  As described above, when acquiring a plurality of high-magnification adaptive optics SLO images, the ophthalmologic apparatus 10 according to the present embodiment determines the parameter values relating to the acquisition of the high-magnification images based on image features extracted from an image with a wider angle of view than the high-magnification images. This makes it possible to efficiently image tissues, cell groups, or lesion candidates whose distribution differs from eye to eye, over a wider range than a single high-magnification image.

  Further, in the present embodiment, at least one blood vessel image generated from the captured high-magnification images is displayed on the monitor 305 based on image features relating to the regions where blood vessels or blood cells have moved. This makes it possible to appropriately extract, from the wide-angle image, only the portions that require particularly careful observation, and to perform precise imaging and display automatically.

<< Embodiment 3 >>
When the ophthalmologic apparatus according to the present embodiment acquires a plurality of high-magnification adaptive optics OCT tomographic images, it is configured to determine the parameter values relating to the acquisition of the plurality of high-magnification images based on image features extracted from an OCT tomographic image with a wider angle of view than the high-magnification images. Specifically, the operator selects a basic pattern that acquires a plurality of high-magnification images in a disc shape for the photoreceptor layer near the fovea, where the outer retina has been deformed by serous retinal detachment RD, and this sets the initial values of the image acquisition parameters. A case will then be described in which the acquisition parameters of the plurality of high-magnification images (acquisition position, angle of view, pixel size, coherence gate) are changed based on image features of the layer shape acquired from the wide-angle OCT tomographic image, and imaging is performed.

(overall structure)
FIG. 1C shows the configuration of a device connected to the ophthalmologic apparatus 10 according to this embodiment. In this embodiment, the ophthalmologic apparatus 10 is different from the first embodiment in that the ophthalmologic apparatus 10 is connected to a tomographic image capturing apparatus 60 having an adaptive optical system instead of the SLO image capturing apparatus 20. The tomographic image capturing device 60 is a device that captures a tomographic image of the eye. The tomographic imaging apparatus 60 is configured as, for example, Spectral Domain OCT (SD-OCT: Spectral Domain Optical Coherence Tomography). The tomographic image capturing device 60 for an eye captures a tomographic image of the eye to be examined three-dimensionally in response to an operation by an operator (not shown). The captured tomographic image is transmitted to the ophthalmologic apparatus 10.

(Ophthalmic equipment)
Next, functional blocks of the ophthalmologic apparatus 10 according to the present embodiment are shown in FIG. The image processing unit 130 is different from the configuration of the first embodiment in that the image processing unit 130 includes an image feature acquisition unit 134 that acquires features of a wide-angle image. Further, the data server 40 holds normal value data relating to the image feature of the eye part and the distribution of the image feature of the eye part. Here, a case where normal value data relating to the retinal layer boundary, its shape, and thickness is held as these data will be described.

(Tomographic imaging device)
Next, the configuration of the tomographic image capturing apparatus 60 including the adaptive optics system will be described with reference to FIG. 14. In FIG. 14, reference numeral 201 denotes a light source; in this embodiment, an SLD light source with a wavelength of 840 nm is used. The light source 201 need only have low coherence, and an SLD with a wavelength width of 30 nm or more is preferably used. An ultrashort pulse laser such as a titanium-sapphire laser can also be used as the light source. The light emitted from the light source 201 is guided to the fiber coupler 520 through the single-mode optical fiber 202. The fiber coupler 520 splits the light into the measurement light path 521 and the reference light path 522. Here, a fiber coupler with a branching ratio of 10:90 is used, configured so that 10% of the input light quantity reaches the measurement light path 521. The light that has passed through the measurement light path 521 is emitted as parallel measurement light by the collimator 203.

  The configuration after the collimator 203 is the same as that of the SLO image capturing apparatus 20 described in the first embodiment. That is, the eyeball 211 is irradiated through the compensation optical system and the scanning optical system, and the reflected / scattered light from the eyeball 211 is guided again to the optical fiber 521 through the same path and reaches the fiber coupler 520. On the other hand, the reference light that has passed through the reference light path 522 is emitted by the collimator 523, reflected by the optical path length variable unit 524, and returns to the fiber coupler 520 again. The measurement light and the reference light that have reached the fiber coupler 520 are combined and guided to the spectroscope 526 through the optical fiber 525. Based on the interference light information split by the spectroscope 526, a tomographic image of the eye is constructed by the control unit 217. The control unit 217 can control the optical path length variable unit 524 to acquire an image at a desired depth position.

The tomographic image capturing apparatus 60 can also operate as a normal tomographic image capturing apparatus and capture a wide-angle tomographic image (wide-angle image DL) by increasing the swing angle of the scanning optical system in the configuration of FIG. 14 and instructing the adaptive optics control unit 216 not to perform aberration correction. In the present embodiment, the tomographic image capturing apparatus 60 provided with the adaptive optics system is configured as SD-OCT, but SD-OCT is not an essential requirement; for example, it may be configured as time-domain OCT or SS-OCT (Swept Source Optical Coherence Tomography). In SS-OCT, a light source that generates light of different wavelengths at different times is used, so a spectroscopic element for acquiring spectral information becomes unnecessary, and deep images including not only the retina but also the choroid can be acquired.

(Processing procedure)
FIG. 5 shows an image processing flow of the ophthalmologic apparatus 10 according to the present embodiment. The processing contents other than S510, S520, S530, S540, and S550 are the same as the processing of the first embodiment described with reference to FIG. Therefore, in this embodiment, the processing of S510, S520, S530, S540, and S550 will be described.

<Step 510>
The wide-angle image acquisition unit 111 requests the tomographic image capturing apparatus 60 to acquire the wide-angle image DL and the fixation target position FL. In the present embodiment, the wide-angle image DL is acquired with the fixation target position FL set at the fovea of the macula. Note that the method of setting the imaging position is not limited to this, and any other position may be set.

In response to the acquisition request from the wide-angle image acquisition unit 111, the tomographic image capturing apparatus 60 acquires the wide-angle image DL and the fixation target position FL and transmits them to the wide-angle image acquisition unit 111. The wide-angle image acquisition unit 111 receives the wide-angle image DL and the fixation target position FL from the tomographic image capturing apparatus 60 via the LAN 30 and stores them in the storage unit 120.

<Step S520>
The image acquisition pattern presentation unit 1311 acquires from the storage unit 120 at least one type of basic setting pattern (image acquisition pattern) relating to the parameters used when acquiring a plurality of high-magnification images, and displays them on the monitor 305. Although any pattern can be set as an image acquisition pattern, the basic patterns shown in FIGS. 15(a) to 15(f) are presented in this embodiment: FIG. 15(a) is linear, FIG. 15(b) cruciform, FIG. 15(c) radial, FIG. 15(d) rectangular, FIG. 15(e) disc-shaped, and FIG. 15(f) circular.

  Next, the instruction acquisition unit 140 acquires an instruction from the outside as to which image acquisition pattern to select. In this embodiment, since the observation target is a region where the outer layer of the retina has been deformed by the serous retinal detachment RD, as shown in FIG. 15(i), and the photoreceptor cells are damaged, the disc-shaped image acquisition pattern of FIG. 15(e) is selected.

As in the first embodiment, multi-magnification, multi-arrangement, and composite image acquisition patterns may also be presented for a three-dimensional tomographic image. For example, in the multi-magnification type with three magnifications, the acquisition pattern of the intermediate magnification images D3m as shown in FIG. 15(h), that of the intermediate magnification images D2k as shown in FIG. 15(g), and that of the high-magnification images D1j as shown in FIG. are presented. Further, in the case of a multi-arrangement image acquisition pattern, a plurality of image acquisition patterns may be arranged and presented in the depth direction (the z-axis direction in the figure).

<Step S530>
The determination unit 132 sets a plurality of high magnifications by adjusting the image acquisition parameters based on the image features acquired by the image feature acquisition unit 134 using the acquisition parameters of the plurality of images included in the image acquisition pattern selected in S520 as initial values. Determine image acquisition parameters. The processing in this step (high-magnification image acquisition processing) will be described in detail later using the flowchart shown in FIG.

<Step S540>
The alignment unit 133 performs alignment between the wide-angle image DL and the high-magnification images DHj, and determines the positions of the high-magnification images DHj on the wide-angle image DL. First, the alignment unit 133 acquires from the storage unit 120 the fixation target position FHj used when capturing the high-magnification image DHj, and uses it as the initial search point of the alignment parameters in the alignment between the wide-angle image DL and the high-magnification image DHj. If there are overlapping areas between the high-magnification images DHj, the inter-image similarity is first calculated for each overlapping area, and the high-magnification images DHj are aligned to the position where the inter-image similarity is maximized. Next, if high-magnification images of different resolutions were acquired in S530, alignment is performed in order from the lower-magnification images, as in the first embodiment. In the present embodiment, there is only one resolution of high-magnification image, so only the alignment of the wide-angle image DL and the high-magnification images DH is performed.

  Note that any known method can be used as the similarity between images and the coordinate conversion method. In the present embodiment, a three-dimensional correlation coefficient is used as the similarity between images, and a three-dimensional Affine transformation is used as the coordinate conversion method. Position.

<Step S550>
The display control unit 131 displays the high-magnification image D Hj on the wide-angle image D L based on the value of the alignment parameter obtained in S540. In the present embodiment, both the wide-angle image D L and the high-magnification image D Hj are three-dimensional tomographic images, so the following two types of display are performed.
i) Projection images of the wide-angle image DL and the high-magnification images DHj are generated along the z-axis direction, and the projection images of the high-magnification images DH are superimposed on the projection image of the wide-angle image DL.
ii) A combined wide-angle three-dimensional tomographic image DL'' is generated in which the pixel values of the wide-angle three-dimensional tomographic image DL are displayed at positions where only the wide-angle image DL was acquired, and the pixel values of the high-magnification three-dimensional tomographic images DHj are displayed at positions where both the wide-angle image DL and a high-magnification image DHj were acquired. Further, a specific scan position on DL'' is indicated by an arrow on the superimposed image of i), and the two-dimensional tomographic image of DL'' cut out at the position of this arrow is displayed side by side with the superimposed image of i). In this display, not only the two-dimensional tomographic image of the wide-angle image DL but also the two-dimensional tomographic images of the high-magnification images DHj are superimposed and displayed.
In the display of ii), the operator can move the arrow indicating the display position on the wide-angle tomographic image DL'' (up and down or left and right) through the instruction acquisition unit 140; in conjunction with this operation, the displayed slices of the wide-angle image DL and the high-magnification images DHj cut out at the arrow also change.
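Display ii), combining the two volumes so that high-magnification voxels take precedence where both were acquired, together with the z-axis projection of display i), can be sketched as follows (NumPy toy volumes with illustrative sizes and values):

```python
import numpy as np

def composite_volume(wide, high, z0, y0, x0):
    """Build the combined volume D_L'': voxels keep the wide-angle values
    where only D_L was acquired and take the high-magnification values
    where D_Hj was also acquired (high placed at offset z0, y0, x0)."""
    out = wide.astype(float).copy()
    d, h, w = high.shape
    out[z0:z0 + d, y0:y0 + h, x0:x0 + w] = high
    return out

def projection(volume):
    """Average-intensity projection along the z axis (display i))."""
    return volume.mean(axis=0)

wide = np.full((8, 8, 8), 10.0)
high = np.full((8, 4, 4), 90.0)      # same depth range, smaller x-y extent
combined = composite_volume(wide, high, z0=0, y0=2, x0=2)
proj = projection(combined)
```

Slicing `combined` at a fixed scan position yields the two-dimensional tomographic image of DL'' that the arrow in display ii) selects.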

Further, when a plurality of high-magnification images DHj with different acquisition positions are acquired as in the present embodiment, the luminance characteristics of the high-magnification images DHj are adjusted to be similar using the same method as in the first embodiment. Further, when the imaging positions of the high-magnification images DHj are close to each other (including the case where they are the same), the display method for the overlapping area is set to one of the following: an image quality index value is calculated for each image and the image with the highest evaluation value is displayed, or the luminance of each high-magnification image DHj is blended with transparency weighted by the image quality index value described above. Any known index can be used as the image quality index value; in this embodiment, the average luminance value of the image histogram is used.
The method for generating the projection image is not limited to average-value projection, and any projection method may be used. The high-magnification image D Hj is not limited to a still image, and may be a moving image.
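As a minimal sketch of the projection step, the following shows average-value projection alongside one alternative (maximum-intensity projection), assuming the tomogram is held as a (z, y, x) array. The function name and axis convention are illustrative assumptions.

```python
import numpy as np

def project(volume, method="mean"):
    """Project a 3-D tomogram stored as a (z, y, x) array onto the xy plane.
    Average-value projection is used in the embodiment, but any projection
    method may be substituted."""
    if method == "mean":
        return volume.mean(axis=0)   # average-value projection
    if method == "max":
        return volume.max(axis=0)    # maximum-intensity projection
    raise ValueError(f"unknown projection method: {method}")
```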

(High magnification image acquisition processing)
Next, details of the processing (high-magnification image acquisition processing) executed in S530 will be described with reference to the flowchart shown in FIG. Note that S1510 is the same as S810 in the first embodiment, and a description thereof will be omitted.

<Step S1520>
The image feature acquisition unit 134 extracts, as image features from the wide-angle image D L stored in the storage unit 120 (that is, the three-dimensional tomographic image of the eye), the boundary positions of the inner limiting membrane B1, the nerve fiber layer boundary B2, the inner plexiform layer boundary B4, the photoreceptor inner/outer segment boundary B5, and the retinal pigment epithelium boundary B6. FIG. 7A, FIG. 15I, and FIG. 15J schematically show the boundary positions B1 to B6. Each extracted image feature is stored in the storage unit 120.

Here, the feature extraction procedure for the wide-angle image D L is described concretely, taking the extraction of layer boundaries as an example. The three-dimensional tomographic image to be processed is treated as a set of two-dimensional tomographic images (B-scan images), and the following processing is performed on each two-dimensional tomographic image. First, smoothing processing is applied to the two-dimensional tomographic image of interest to remove noise components. Next, edge components are detected from the two-dimensional tomographic image, and several line segments are extracted as layer boundary candidates based on their connectivity. From these candidates, the topmost line segment is extracted as the inner limiting membrane B1, the second line segment from the top as the nerve fiber layer boundary B2, and the third as the inner plexiform layer boundary B4. The line segment having the maximum contrast on the outer layer side of the inner limiting membrane B1 (the side with the larger z coordinate in FIG. 7A) is extracted as the photoreceptor inner/outer segment boundary B5, and the lowest line segment in the layer boundary candidate group is extracted as the retinal pigment epithelium boundary B6.
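The smoothing-and-edge-detection stage of this procedure can be sketched as follows. This is a toy illustration under stated assumptions: the kernel size, gradient-based edge measure, and candidate count are arbitrary choices for the sketch, and the connectivity analysis and the assignment of candidates to B1 through B6 are not implemented here.

```python
import numpy as np

def extract_layer_candidates(bscan, n_candidates=5):
    """For one B-scan (a 2-D array indexed [z, x]): smooth to remove noise,
    detect edge components along depth, and keep the strongest edge
    positions in each A-scan as layer-boundary candidates, sorted top-down."""
    # 1) smoothing: 3-tap moving average along the depth (z) axis
    kernel = np.ones(3) / 3.0
    smoothed = np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, bscan)
    # 2) edge detection: magnitude of the intensity gradient along z
    grad = np.abs(np.diff(smoothed, axis=0))
    # 3) per A-scan, keep the z positions of the strongest edges
    candidates = []
    for x in range(grad.shape[1]):
        zs = np.argsort(grad[:, x])[::-1][:n_candidates]
        candidates.append(sorted(zs.tolist()))
    return candidates
```

In the actual procedure, the topmost surviving candidate would correspond to the inner limiting membrane B1, the next ones to B2 and B4, and so on.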

  Note that a deformable shape model such as Snakes or the level set method may be applied using these line segments as initial values to perform more precise extraction. The layer boundaries may also be extracted by a graph cut method. Boundary extraction using a deformable shape model or graph cuts may be performed three-dimensionally on the three-dimensional tomographic image or two-dimensionally on each two-dimensional tomographic image. Needless to say, any method may be used to extract the layer boundaries, as long as it can extract them from the tomographic image of the eye.

<Step S1530>
The magnification determining unit 1321 determines the magnification-related parameters (number of magnifications, angle of view, and pixel size) of the high-magnification images D Hj . In the present embodiment, the number of magnifications and the pixel size are fixed (1, and 1 [μm] × 1 [μm] × 1 [μm], respectively), so a detailed description thereof is omitted. Note that the angle of view and pixel size differ from those of the first embodiment in that they include parameters in the z-axis direction. The angle of view is a variable parameter: it is enlarged by the threshold Ta [%] only for high-magnification images in which the distance between the photoreceptor inner/outer segment boundary B5 and the retinal pigment epithelium boundary B6 obtained in S1520 is at or above the threshold Trd. The angle of view is enlarged because this is an important region for observation: enlarging it prevents the region from being missed due to fixation disparity, and because the outer segments of the photoreceptor cells above the retinal detachment region may extend in icicle shapes toward the retinal pigment epithelium boundary, the whole photoreceptor cells should be captured in the high-magnification image.
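The angle-of-view rule just described reduces to a simple threshold test. The sketch below states it explicitly; the function name, argument names, and units are illustrative assumptions, with Trd compared against the B5-to-B6 distance and Ta interpreted as a percentage enlargement.

```python
def adjust_view_angle(base_angle, b5_b6_distance, trd, ta_percent):
    """If the distance between the photoreceptor inner/outer segment
    boundary B5 and the retinal pigment epithelium boundary B6 is at or
    above the threshold Trd, enlarge the angle of view by Ta [%];
    otherwise leave it unchanged."""
    if b5_b6_distance >= trd:
        return base_angle * (1.0 + ta_percent / 100.0)
    return base_angle
```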

Next, the position determination unit 1322 determines the acquisition position and the coherence gate position of each high-magnification image D Hj . In this embodiment, all are variable parameters, and the acquisition position of the high-magnification image D Hj is determined by the following procedure.
a) Determination of the arrangement of the representative positions of the image acquisition pattern.
b) Determination of the arrangement of the image acquisition pattern in the xy plane direction.
c) Determination of the arrangement of the image acquisition pattern in the z-axis direction.

Regarding a), the representative position of the image acquisition pattern is set to the center of the pattern, and the center is arranged so as to coincide with the position of the center of gravity of the retinal detachment region. Here, the retinal detachment region refers to the region obtained by projecting onto the xy plane the region where the distance between the photoreceptor inner/outer segment boundary B5 and the retinal pigment epithelium boundary B6 is greater than or equal to the threshold Trd.
For b), in order to include the retinal detachment region within the region of the image acquisition pattern, the arrangement of the high-magnification images in the xy direction is determined by the following procedure. That is, the circle connecting the image centers of the outermost high-magnification images is obtained, that circle is enlarged to the position where it becomes a circumscribed circle of the retinal detachment region, and the xy positions of the high-magnification images are determined so that the circular area is filled at regular intervals.
For c), the acquisition position in the z-axis direction is determined so that the photoreceptor inner/outer segment boundary B5 acquired in S1520 coincides with the image center of the high-magnification image. Furthermore, the coherence gate of each high-magnification image D Hj is set to the settable position closest to the photoreceptor inner/outer segment boundary B5 detected in S1520.
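The geometric quantities used in steps a) and b) can be sketched as follows, assuming the layer boundaries B5 and B6 are given as depth maps over the xy plane. Function names and the grid representation are illustrative assumptions; the regular-interval filling of the circumscribed circle is not shown.

```python
import numpy as np

def detachment_region(b5, b6, trd):
    """xy mask of the retinal detachment region: points where the distance
    between B5 and B6 (given as depth maps) is >= the threshold Trd,
    projected onto the xy plane."""
    return np.abs(b6 - b5) >= trd

def pattern_center(mask):
    """Step a): the representative position of the image acquisition pattern,
    placed at the center of gravity of the retinal detachment region."""
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

def circumscribed_radius(mask, center):
    """Step b): radius of the circle, centered on the centroid, enlarged to
    circumscribe the retinal detachment region."""
    ys, xs = np.nonzero(mask)
    cx, cy = center
    return float(np.max(np.hypot(xs - cx, ys - cy)))
```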

  FIG. 15(i) shows the initial acquisition positions of the image acquisition pattern in this embodiment, and FIG. 15(j) shows the acquisition positions determined in this step. In both figures, for ease of understanding the acquisition pattern, the enlargement of the angle of view on the retinal detachment region and the overlap between the high-magnification images are shown only for the acquisition positions in the two central rows of the image acquisition pattern, and are omitted elsewhere. Note that the types of variable parameters are not limited to the above, and any image acquisition parameter may be a variable parameter.

<Step S1540>
The order determination unit 1324 determines the acquisition order of the high-magnification images D Hj . In the present embodiment, the following i) to iv) form nested loops, with i) being the innermost (highest priority), ii) the second innermost, iii) the third, and iv) the outermost (lowest priority). That is, after the acquisition start position is set (in this embodiment, the upper end of the image acquisition pattern) and the acquisition magnification is set to the minimum magnification, the following i) to iv) are executed.
i) Images with the same arrangement pattern, the same acquisition magnification, and the same image acquisition position are acquired as many times as the number of acquisitions.
ii) After moving to an adjacent image acquisition position, images with the same arrangement pattern and the same acquisition magnification are acquired again in the same manner as in i).
iii) When ii) is completed, the acquisition magnification is increased and ii) is executed again; this is repeated for the number of magnifications.
iv) When iii) is completed, iii) is executed for another arrangement, and this is repeated until images have been acquired in all arrangements.
In the above example, i) is not repeated (the number of acquisitions is one), and since the image acquisition pattern is not a multi-arrangement type, the processing of iv) is omitted. The adjacent image acquisition position in ii) may be in any direction; in the present embodiment, the acquisition position is moved to the horizontally adjacent position, or diagonally below when no horizontally adjacent position remains. That is, among the high-magnification image acquisition positions, the high-magnification images are acquired sequentially in the first row from right to left, the second row from left to right, and the third row from right to left. The order determination method of the present invention is not limited to the above procedure, and any known order setting method may be used.
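The serpentine traversal just described (first row right to left, second row left to right, and so on) can be sketched as follows. The function name and the (row, column) index representation are illustrative assumptions.

```python
def acquisition_order(n_rows, n_cols):
    """Return (row, col) acquisition positions in the order described above:
    row 0 is traversed right to left, row 1 left to right, alternating,
    moving diagonally down at the end of each row."""
    order = []
    for r in range(n_rows):
        cols = range(n_cols - 1, -1, -1) if r % 2 == 0 else range(n_cols)
        order.extend((r, c) for c in cols)
    return order
```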

<Step S1550>
The high-magnification image acquisition unit 112 requests the tomographic imaging apparatus 60 to acquire a plurality of high-magnification images D Hj and fixation target positions F Hj using the image acquisition parameters determined by the determination unit 132. The tomographic imaging apparatus 60 acquires and transmits the high-magnification images D Hj and the fixation target positions F Hj in response to this acquisition request. The high-magnification image acquisition unit 112 stores the received high-magnification images D Hj and fixation target positions F Hj in the storage unit 120.

In the present embodiment, the acquisition positions of the high-magnification images D Hj are determined using the image features relating to the photoreceptor layer boundaries, but the present invention is not limited to this. For example, as in the first embodiment, the operator may manipulate (move, enlarge, or reduce) the position of the image acquisition pattern of the high-magnification images and thereby adjust the acquisition positions collectively.

  As described above, when acquiring a plurality of high-magnification adaptive optics OCT tomographic images, the ophthalmologic apparatus 10 determines the parameter values relating to the acquisition of the plurality of high-magnification images based on the image features relating to the layer shape extracted from an OCT tomographic image having a wider angle of view than the high-magnification images. This makes it possible to efficiently image, over a range wider than a single high-magnification image, tissues, cell groups, or lesion candidates whose distribution differs depending on the eye to be examined.

<< Other Embodiments >>
In the above-described embodiments, the alignment target images are realized as SLO images or tomographic images of the eye, but the present invention is not limited to this. For example, the wide-angle image D L may be realized as a fundus camera image and the high-magnification image D H as an adaptive optics fundus camera image. Alternatively, the images may be realized with different modalities, such as a wide-angle SLO image for the wide-angle image D L and a projection image of an adaptive optics tomographic image for the high-magnification image D H . Furthermore, a configuration may be realized in which a multifunction machine combining the adaptive optics SLO imaging apparatus 20 and the tomographic imaging apparatus 60 is directly connected to the ophthalmologic apparatus 10.

  The present invention can also be realized by executing the following processing. That is, software (a program) that realizes the functions of the above-described embodiments is supplied to a system or apparatus via a network or various storage media, and a computer (a CPU, an MPU, or the like) of the system or apparatus reads and executes the program.

10: ophthalmic apparatus, 20: SLO image capturing apparatus, 60: tomographic image capturing apparatus,
130: Image processing unit, 134: Image feature acquisition unit

Claims (10)

  1. An information processing apparatus for controlling imaging of a plurality of images in a region of an eye to be examined that is wider than a region of one image to be captured,
    Storage means for storing a plurality of basic patterns indicating the arrangement of a plurality of imaging positions for capturing the images;
    Selecting means for selecting one basic pattern from the plurality of basic patterns stored in the storage means;
    Adjusting means for adjusting the imaging positions of the plurality of images associated in advance with the selected basic pattern;
    Control means for causing an imaging apparatus to capture the plurality of images in the region in accordance with the plurality of imaging positions included in the basic pattern selected by the selecting means,
    wherein the control means causes the imaging apparatus to capture the plurality of images in accordance with the adjusted imaging positions.
  2. The imaging device
    Wavefront measuring means for measuring the wavefront of the return light obtained by irradiating the eye to be examined with measurement light emitted from a light source;
    Wavefront correcting means for correcting the wavefront of the return light based on the wavefront measured by the wavefront measuring means;
    The information processing apparatus according to claim 1, further comprising: an image generation unit that generates an image of the eye to be examined using the return light whose wavefront is corrected by the wavefront correction unit.
  3. Image acquisition means for acquiring a wide-angle image having a wider angle of view than the image obtained by imaging the region;
    Alignment means for aligning the wide-angle image acquired by the image acquisition means and the image generated by the image generation means;
    The information processing apparatus according to claim 2, further comprising: display control means for displaying the image on the wide-angle image based on the alignment result by the alignment means.
  4. The information processing apparatus according to any one of claims 1 to 3, wherein the basic pattern further includes at least one of an angle of view, a pixel size, a number of frames, a frame rate, and an in-focus position of the image.
  5. The information processing apparatus according to claim 3, wherein the resolution of the image is higher than the resolution of the wide-angle image.
  6. An information processing apparatus for controlling imaging of a plurality of images in a region of an eye to be examined that is wider than a region of one image to be captured,
    Storage means for storing a plurality of basic patterns indicating the arrangement of a plurality of positions for capturing the images;
    Selecting means for selecting one basic pattern from the plurality of basic patterns stored in the storage means;
    Adjusting means for adjusting an imaging condition pre-associated with the selected basic pattern;
    Control means for causing an imaging apparatus to capture the plurality of images in the region in accordance with imaging conditions relating to the imaging of the plurality of images associated in advance with the basic pattern selected by the selecting means,
    wherein the control means causes the imaging apparatus to capture the plurality of images in accordance with the adjusted imaging condition.
  7. The information processing apparatus according to claim 6 , wherein the imaging condition includes at least one of the acquisition position, the angle of view, the pixel size, the number of frames, the frame rate, and the focus position of the image.
  8. A control method for an information processing apparatus for controlling imaging of a plurality of images in a region of an eye to be examined that is wider than a region of one image to be captured,
    A selection step in which the selection unit selects one basic pattern from a plurality of basic patterns indicating the arrangement of a plurality of imaging positions for capturing the images stored in the storage unit;
    An adjusting step for adjusting the imaging positions of the plurality of images associated in advance with the selected basic pattern;
    A control step in which control means causes an imaging apparatus to capture the plurality of images in the region in accordance with the plurality of imaging positions included in the basic pattern selected in the selection step,
    wherein in the control step, the imaging apparatus is caused to capture the plurality of images in accordance with the adjusted imaging positions.
  9. A control method for an information processing apparatus for controlling imaging of a plurality of images in a region of an eye to be examined that is wider than a region of one image to be captured,
    A selection step in which the selection means selects one basic pattern from a plurality of basic patterns indicating an arrangement of a plurality of positions at which the images stored in the storage means are captured;
    An adjustment step in which the adjustment means adjusts an imaging condition previously associated with the selected basic pattern;
    A control step in which control means causes an imaging apparatus to capture the plurality of images in the region in accordance with imaging conditions relating to the imaging of the plurality of images associated in advance with the basic pattern selected in the selection step,
    wherein in the control step, the imaging apparatus is caused to capture the plurality of images in accordance with the adjusted imaging condition.
  10. A computer program for causing a computer to function as each means of the information processing apparatus according to any one of claims 1 to 7.
JP2017134948A 2017-07-10 2017-07-10 Information processing apparatus, information processing apparatus control method, and computer program Active JP6465930B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017134948A JP6465930B2 (en) 2017-07-10 2017-07-10 Information processing apparatus, information processing apparatus control method, and computer program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2013111332 Division 2013-05-27

Publications (2)

Publication Number Publication Date
JP2017170266A JP2017170266A (en) 2017-09-28
JP6465930B2 true JP6465930B2 (en) 2019-02-06

Family

ID=59969926

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2017134948A Active JP6465930B2 (en) 2017-07-10 2017-07-10 Information processing apparatus, information processing apparatus control method, and computer program

Country Status (1)

Country Link
JP (1) JP6465930B2 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4819851B2 (en) * 2008-07-31 2011-11-24 キヤノン株式会社 Diagnosis support apparatus and method, program, and recording medium
JP5704879B2 (en) * 2009-09-30 2015-04-22 株式会社ニデック Fundus observation device
JP5735790B2 (en) * 2010-12-02 2015-06-17 株式会社ニデック Ophthalmic imaging equipment
JP5845608B2 (en) * 2011-03-31 2016-01-20 株式会社ニデック Ophthalmic imaging equipment
JP2013070941A (en) * 2011-09-29 2013-04-22 Nidek Co Ltd Ophthalmologic photographing apparatus

Similar Documents

Publication Publication Date Title
US10092178B2 (en) Systems and methods for efficiently obtaining measurements of the human eye using tracking
Zhang et al. Wide-field imaging of retinal vasculature using optical coherence tomography-based microangiography provided by motion tracking
JP5771259B2 (en) Apparatus and method for multifunction adaptive optical retinal imaging
JP2018171453A (en) Procedural optical coherence tomography for surgery, and related systems and methods
US8556428B2 (en) Method and apparatus for imaging in an eye
JP5432625B2 (en) Ophthalmic observation device
JP6460618B2 (en) Optical coherence tomography apparatus and control method thereof
JP6321430B2 (en) Ophthalmic equipment
JP5339934B2 (en) Optical tomographic imaging apparatus and optical tomographic imaging method
US8899749B2 (en) Image processing apparatus, image processing method, image processing system, SLO apparatus, and program
JP5085086B2 (en) Fundus observation apparatus, fundus image display apparatus, and program
EP2926722A1 (en) Fundus photographing device
US20160345822A1 (en) Ophthalmologic apparatus
JP4157839B2 (en) Retinal region imaging method and system for living eye
US8096658B2 (en) Fundus oculi observation device and program for controlling the same
US8939583B2 (en) Ophthalmic apparatus, method of controlling ophthalmic apparatus and storage medium
KR101506526B1 (en) Ophthalmologic apparatus and control method therefor
JP5236089B1 (en) Optical coherence tomography apparatus, control method of optical coherence tomography apparatus, and program
JP6586196B2 (en) Ophthalmic photographing apparatus and ophthalmic image display apparatus
US20190191987A1 (en) Image processing apparatus, ophthalmologic imaging apparatus, image processing method, and storage medium
JP5192250B2 (en) Fundus observation device
ES2745501T3 (en) Photorefraction Ocular Projection Device
JP5220208B2 (en) Control device, imaging control method, and program
JP5818458B2 (en) Image processing apparatus, photographing system, image processing method, and program
JP4823693B2 (en) Optical image measuring device

Legal Events

Date Code Title Description
A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20170804

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20170804

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20180613

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20180619

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20181210

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20190108

R151 Written notification of patent or utility model registration

Ref document number: 6465930

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151