WO2021199772A1 - Device for estimating state of eyeball internal tissue and method therefor - Google Patents

Device for estimating state of eyeball internal tissue and method therefor

Info

Publication number
WO2021199772A1
Authority
WO
WIPO (PCT)
Prior art keywords
eyeball
shadow
light
unit
image
Prior art date
Application number
PCT/JP2021/006219
Other languages
French (fr)
Japanese (ja)
Inventor
ニーラム コーシック
佳那 竹山
羽根 一博
敬 佐々木
中澤 徹
Original Assignee
国立大学法人東北大学
Priority date
Filing date
Publication date
Application filed by 国立大学法人東北大学
Priority to JP2021538069A (granted as JP7214270B2)
Publication of WO2021199772A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes

Definitions

  • The present invention relates to an apparatus for estimating the state of the internal tissue of the eyeball, and to a method therefor.
  • A method of irradiating the internal tissue of the eyeball with light to image the target tissue is known.
  • An imaging system for eye tissue has been proposed in which irradiation beams are emitted from different illumination points toward the eye tissue, and light backscattered from at least one of the retina and the iris is collected from the pupil to acquire an image of the fundus.
  • With this system, it is possible to process an image of the fundus to obtain a phase contrast image and to extract information on eyeball tissue from the phase contrast image (see Patent Document 1).
  • In that system, however, a phase contrast image is calculated from the intensities of two images, and this phase contrast image must be reconstructed using an algorithm based on a predetermined function, so it is necessary to acquire images containing both the phase and absorption information of the light.
  • One of the objects of the present invention is to estimate the state of the internal tissue of the eyeball easily and with high accuracy.
  • The state estimation device for the internal tissue of the eyeball includes a first light irradiation unit that is arranged at a first position outside the eyeball of a living body and irradiates light toward a target tissue inside the eyeball, an imaging unit that captures the light reflected inside the eyeball, and a processing unit that acquires shadow information about the target tissue from first imaging information obtained by the imaging unit and estimates the state of the target tissue from the shadow information.
  • The method of estimating the state of the internal tissue of the eyeball irradiates light from a first position outside the eyeball of a living body toward a target tissue inside the eyeball and images the light reflected inside the eyeball. Shadow information about the target tissue is then acquired from the captured first imaging information, and the state of the target tissue is estimated from the shadow information.
  • The state of the internal tissue of the eyeball can thus be estimated easily and with high accuracy.
  • FIG. 1 is a diagram showing a configuration of a state estimation device for an internal tissue of the eyeball according to the first embodiment.
  • FIG. 2 is a cross-sectional view schematically showing the internal structure of the main body of FIG.
  • FIG. 3 is a diagram showing an optical system used by the light irradiation unit and the imaging unit of FIG.
  • FIG. 4 is a block diagram showing the configuration of the control device of FIG. 1 and the function of each device.
  • FIG. 5 is a diagram showing an image captured in the first embodiment, and FIG. 5A shows an example of a marked image with a shadow extracted.
  • FIG. 5B shows a method of acquiring the size of the shadow from the image of FIG. 5A.
  • FIG. 6 is a cross-sectional view schematically showing the internal structure of the main body in the second embodiment.
  • FIG. 7 is a diagram showing an optical system used by the light irradiation unit and the imaging unit of the second embodiment.
  • FIG. 8 is a diagram showing an image captured in the second embodiment.
  • FIG. 9 is a diagram showing the verification results using the model eye.
  • FIG. 9A shows an example of a marked image in which a shadow is extracted.
  • FIG. 9B shows a method of obtaining the size of a shadow from the image of FIG. 9A.
  • FIG. 10 shows the result of comparing the measurement results by the present state estimation device and the optical coherence tomography (OCT).
  • FIG. 10A shows an image of an enucleated pig eye taken using a fundus camera.
  • FIG. 10B shows an intensity-normalized image of the image of FIG. 10A.
  • FIG. 10C shows an image obtained by applying a clustering algorithm using the K-means method to the image of FIG. 10B.
  • FIG. 10D shows an image obtained by differentiating the image of FIG. 10C.
  • In the following, the horizontal directions are described as the front-back direction (the front is indicated by "F" and the rear by "B" in the figures) and the left-right direction (the left is indicated by "L" and the right by "R" in the figures).
  • In the left-right direction, left and right are determined as seen when facing from the rear toward the front.
  • the direction of action of gravity is downward (indicated by "D” in the figure), and the opposite direction of downward is upward (indicated by "U” in the figure).
  • Configuration of the state estimation device for the internal tissue of the eyeball (FIG. 1)
  • the state estimation device 1 of the internal tissue of the eyeball including the main body 2 and the control device 3 will be described.
  • the main body 2 transmits the captured image to the control device 3.
  • an image of the light reflected inside the eyeball is transmitted.
  • the main body 2 includes a light irradiation unit 10 and an imaging unit 20.
  • the light irradiation unit 10 and the image pickup unit 20 have an integral structure.
  • the main body 2 has a size that can be carried by the subject by hand, and can be used, for example, by attaching it to a camera of a portable device.
  • the control device 3 receives an image from the main body 2 and estimates the state of the internal tissue of the eyeball from the image.
  • the light irradiation unit 10 has a housing 11.
  • The housing 11 is provided with a hole 12 on its rear surface. Since the left and right eyeballs are measured separately, one hole 12 is provided here; if both eyeballs are to be measured simultaneously, two holes 12 may be provided side by side in the left-right direction.
  • the holes 12 have a size sufficient to surround the periphery of the eye 4 of the living body.
  • A flexible member such as rubber may be provided around the hole 12, on the surface that comes into contact with the subject's face.
  • The flexible member contacts the face around the eye 4 to reduce stray light from the outside, and its deformation makes it easier to align the optical axis of the imaging unit 20 with the subject's eye 4.
  • Lighting is provided inside the housing 11.
  • the lighting is an infrared LED.
  • The wavelength of the illumination may be visible light or the like, but infrared light is preferable because it does not dazzle the subject and because of its greater penetration depth when irradiated through the skin or the like.
  • the illumination light is not incident from the cornea through the lens as is generally performed, but is directly irradiated through the sclera of the eyeball 40 or the skin around the eyeball 40. Therefore, there is no need for a mydriatic drug for pupil dilation as used in eye examinations.
  • an LED having an 850 nm wavelength is used.
  • The illumination is referred to as the "upper illumination L U" (first light irradiation unit).
  • the installation position of the upper illumination L U is referred to as “upper position P1" (first position).
  • the upper position P1 will be described later.
  • The illumination position is not limited to the upper position P1; as long as the light can enter through the sclera or the skin around the eyeball, any position may be used, such as a lower, left, or right position.
  • oblique / off-axis illumination is mainly used through the sclera and the skin around the eyeball 40.
  • The lighting is not limited to a fixed installation; for example, the illumination may be installed on a rotating member around the outer periphery of the hole 12 in the housing and moved so that the irradiation position becomes appropriate.
  • the imaging unit 20 includes a tubular body 21. Inside the tubular body 21, a first lens R1, a second lens R2, a camera 22, and an image sensor 23 are provided.
  • the first lens R1 and the second lens R2 are existing ophthalmic objective lenses.
  • the first lens R1 is provided at a boundary where the front surface of the housing 11 and the rear surface of the tubular body 21 are connected.
  • the second lens R2 is provided in the central region of the tubular body 21 in the front-rear direction.
  • the first lens R1 and the second lens R2 are for collecting the light reflected inside the eyeball 40 and relaying it to the camera 22.
  • a camera 22 is provided in front of the second lens R2.
  • the camera 22 captures the light from the second lens R2 and acquires the captured image.
  • the camera 22 includes an image sensor 23.
  • the image sensor 23 forms an image of an image pickup target and converts the light and darkness of the imaged light into an electric signal.
  • the image converted into an electric signal is transmitted to the control device 3.
  • The image pickup unit 20 is an example of an imaging unit.
  • the optical system used by the light irradiation unit 10 and the image pickup unit 20 will be described with reference to FIGS. 2 and 3.
  • the eyeball 40 is protected by a plurality of membranes.
  • the membranes are arranged in the order of sclera, choroid, and retina 41 from the outside of the eyeball.
  • the light that enters the eyeball 40 from the pupil 42 is sensed by the photoreceptor cells of the retina 41, transmitted to the brain through the optic nerve 43 stretched around the retina 41, and becomes an image.
  • The optic nerve 43, which spreads over the retina, converges into a single thick bundle at the back of the eyeball.
  • the place where the optic nerve 43 converges into one bundle is called the optic nerve head 44.
  • the optic disc 44 is often recessed. In other words, the shape of the optic nerve head 44 can be regarded as crater-like.
  • The upper illumination L U is installed in the housing 11 so that the optical axis O L of the illumination is inclined upward at a predetermined inclination angle θ1 with respect to the optical axis (eyeball optical axis) O E passing horizontally through the pupil 42 of the eyeball 40.
  • The predetermined inclination angle θ1 is set so that the light from the upper illumination L U enters a portion of the eyeball 40 other than the pupil 42 and illuminates the fundus 45; it is, for example, 45°.
  • The upper position P1, at which the upper illumination L U is installed, is a position inclined 45° upward with respect to the hole 12 of the housing 11.
  • The upper illumination L U emits light from the upper position P1 toward the interior of the eyeball 40.
  • The light emitted from the upper illumination L U passes directly through the sclera of the eyeball 40 and/or through the skin around the eyeball 40, reaches the retina 41, and irradiates the fundus 45 of the eyeball 40, including the optic nerve head 44.
  • the light incident on the eyeball 40 is scattered to some extent inside the eyeball 40, but most of the light is reflected by the fundus 45 and emitted from the pupil 42 to the outside of the eyeball 40.
  • the first lens R1 (aberration correction lens) of the image pickup unit 20 collects the light emitted from the pupil 42 to the outside of the eyeball 40.
  • the second lens R2 (aberration correction lens) relays the light from the first lens R1 to the image sensor 23.
  • The imaging range of the fundus 45 is determined by the angle of view θ2 and the focal length F L.
  • The angle of view θ2 represents the width of the imaging range in terms of angles.
  • The focal length F L here refers to the distance between the focal point F P, where the light reflected from the fundus 45 converges, and the center of the first lens R1.
  • The focal length F L is set so that the angle of view θ2 is 60°.
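As a rough aside on how the 60° angle of view and the focal length are linked: in a thin-lens sketch, an aperture of diameter a at distance f from the lens center subtends a full angle of 2·atan(a / 2f). The aperture value below is an assumption, chosen only to be consistent with the ~8 mm focal length quoted later in the verification section; it is not a value from the document.

```python
import math

def focal_length_for_fov(aperture_mm: float, fov_deg: float) -> float:
    """Focal length (mm) at which an aperture of diameter `aperture_mm`
    subtends a full angle of view of `fov_deg` (thin-lens geometry sketch)."""
    return (aperture_mm / 2.0) / math.tan(math.radians(fov_deg) / 2.0)

# Hypothetical: an aperture of ~9.2 mm subtends 60 degrees at f of ~8 mm,
# roughly consistent with the 78D lens parameters quoted in the text.
f_mm = focal_length_for_fov(9.24, 60.0)
```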
  • the central axis of the optical axis of the first lens R1 and the second lens R2 coincides with the central axis of the tubular body 21.
  • the camera 22 collects the light relayed from the second lens R2 by a lens (not shown).
  • the focused light is received by the image sensor 23 and is imaged.
  • the captured image is referred to as a "single illumination image I S" (first imaging information).
  • the configuration of the control device 3 will be described with reference to FIG.
  • the control device 3 includes a processing device 30 and a storage device 34.
  • the processing device 30 realizes a function described later by executing a program stored in the storage device 34.
  • the processing device 30 is a CPU (Central Processing Unit).
  • the processing device 30 may be configured by a DSP (Digital Signal Processor) or a programmable logic circuit device (PLD; Programmable Logic Device).
  • the processing device 30 is an example of a processing unit.
  • The storage device 34 stores information in a readable and writable manner.
  • the storage device 34 includes at least one of a RAM (Random Access Memory), a ROM (Read Only Memory), an HDD (Hard Disk Drive), an SSD (Solid State Disk), a semiconductor memory, and an organic memory.
  • the storage device 34 may include a recording medium such as a flexible disk, an optical disk, a magneto-optical disk, and a semiconductor memory, and a reading device capable of reading information from the recording medium.
  • the control device 3 may be realized by an integrated circuit (for example, LSI (Large Scale Integration) or the like).
  • the function of the processing device 30 includes a shadow information acquisition unit 31, a parameter acquisition unit 32, and a state estimation unit 33.
  • The shadow information acquisition unit 31 acquires the contour of the optic nerve head 44 and its shadow S (shadow information) from the single illumination image I S acquired from the imaging unit 20.
  • A shadow refers to a dark portion generated where light rays are blocked by an object or the like.
  • Light irradiated from the upper illumination L U strikes the optic nerve head 44 obliquely.
  • Since the optic nerve head 44 is recessed, a shadow S is formed in the recess, on the side from which the light is irradiated, according to the depth of the recess. Therefore, the single illumination image I S, which captures the light reflected from the fundus 45, includes the shadow S of the recess of the optic nerve head 44.
  • Since the single illumination image I S including the shadow S can be said to capture the fundus 45 three-dimensionally, it is also referred to as a quasi-3D image.
  • FIG. 5A shows an example of a single illumination image I S.
  • FIG. 5A is an actual measurement of pig eyes using the device 1 of the present embodiment.
  • The shadow information acquisition unit 31 extracts (specifies and acquires) a low-lightness region within the single illumination image I S as the shadow S.
  • the contour of the optic nerve head 44 is extracted.
  • The shadow information acquisition unit 31 marks a range including the shadow S and its peripheral region. As shown in FIG. 5A, the marking here is made with a broken line.
  • the parameter acquisition unit 32 acquires the shadow size L (first parameter) from the shadow S acquired by the shadow information acquisition unit 31.
  • The size L of the shadow is, specifically, the length of the shadow formed in the recess of the optic nerve head 44, extending from the edge of the recess in the direction in which the light from the upper illumination L U is irradiated (that is, the longest extent of the shadow in the direction in which the light advances).
  • The length L of the shadow may be obtained directly from the acquired image I S. However, when the shadow edge is unclear, it may be difficult to obtain the length accurately by this direct method. In the present embodiment, therefore, a method of calculating (acquiring) the length L of the shadow S will be described with reference to FIG. 5B.
  • the length L of the shadow S is acquired by using two circles or ellipses (hereinafter, referred to as "first circle C1" and “second circle C2").
  • the size of the circles C1 and C2 is preferably large enough to include the shadow S.
  • The shadow S is surrounded by the first circle C1 so that one arc of the shadow S (a part of its periphery) is inscribed in the first circle C1.
  • It is desirable to enclose the shadow S in the first circle C1 along the contour of the recess of the optic nerve head 44 (preferably the inner contour, that is, the inner edge of the crater-shaped recess).
  • a part of the arc of the first circle C1 follows a part of the contour (periphery) of the shadow S forming a part of the contour of the recess of the optic nerve head 44.
  • the first circle C1 is placed on the shadow S.
  • The shadow S is surrounded by the second circle C2 so that the other arc of the shadow S (the other part of its periphery) is inscribed in the second circle C2.
  • The second circle C2 is placed on the shadow S so that a part of its arc follows the part of the contour (periphery) of the shadow S facing the inner contour of the recess of the optic nerve head 44.
  • That is, the arc of the second circle C2 follows the part of the contour of the shadow S that faces the part followed by the arc of the first circle C1.
  • the second circle C2 is placed on the shadow S.
  • a part of the contour of the shadow S on which the second circle C2 is overlapped is different from a part of the contour of the shadow S on which the first circle C1 is overlapped. Therefore, the arc of the second circle C2 intersects the arc of the first circle C1 at two points.
  • the length of the shadow S surrounded by the arc of the first circle C1 and the arc of the second circle C2 is measured.
  • In the virtual coordinates (FIGS. 2 and 3), in which the upward direction U is the Y axis and the left direction L is the X axis, the circles are overlaid on the single illumination image I S so that the origin coincides or substantially coincides with the center of one circle and the line connecting the two intersection points of the arcs of the circles C1 and C2 is parallel to the Y axis.
  • The width of the shadow S in the X-axis direction is then measured as the distance between the points where the arc of the first circle C1 and the arc of the second circle C2 intersect the X axis.
  • That is, the length of the shadow S is the width of the region surrounded by the first circle C1 and the second circle C2: the length of the line connecting the vertices of the intersecting arcs of C1 and C2, which is substantially perpendicular to the line connecting their intersection points. As a result, the shadow length L is accurately acquired.
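The two-circle measurement above can be sketched geometrically: if each fitted circle is known by its center and radius, the width of their lens-shaped overlap along the line joining the centers (perpendicular to the chord through the two intersection points) approximates the shadow length L. This is a simplified reading of the construction, not the patent's exact procedure.

```python
import math

def shadow_length(c1, r1, c2, r2):
    """Width of the lens-shaped overlap of two circles, measured along the
    line joining their centers. c1, c2 are (x, y) centers; r1, r2 are radii.
    The circles must intersect at two points, as in FIG. 5B."""
    d = math.dist(c1, c2)
    if not (abs(r1 - r2) < d < r1 + r2):
        raise ValueError("the arcs must intersect at two points")
    # Along the center line, each circle extends a radius past its center,
    # so the overlap width is the sum of radii minus the center distance.
    return r1 + r2 - d

# Hypothetical example: two unit circles whose centers are 1.5 apart.
L = shadow_length((0.0, 0.0), 1.0, (1.5, 0.0), 1.0)  # overlap width 0.5
```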
  • Based on the length L of the shadow and the size D (diameter; second parameter) of the optic nerve head, the parameter acquisition unit 32 acquires (calculates) a value d (depth; third parameter) that represents the size of the optic nerve head 44 in a dimension different from D.
  • the shape of the recess of the optic nerve head 44 can be modeled as a crater-shaped parabolic surface, an ellipse, or a spherical surface.
  • In a two-dimensional planar projection, the depth d of the recess can be obtained when the shadow is exactly half the diameter D of the recess.
  • In that case, the shadow length L and the inclination angle θ1 are related by the following simple equation (Equation 1).
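Equation 1 itself did not survive extraction here. From the geometry described (a recess of depth d whose rim casts a shadow of length L under light inclined at θ1), one plausible reconstruction, offered only as an assumption, is:

```latex
% Assumed reconstruction of Equation 1 (the original equation is missing
% from this extraction): a recess of depth d, lit at inclination angle
% \theta_1 to the eyeball optical axis, casts a rim shadow of length L, so
d = L \tan \theta_1
% e.g. for \theta_1 = 45^\circ, this gives d = L.
```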
  • The above equation cannot be used for very deep or very shallow recesses. Furthermore, it is very difficult to obtain a shadow of exactly half the diameter D in the recess of the optic nerve head 44, which has a complicated elliptical (circular) shape. Shadows shorter or longer than half the diameter D can be corrected using an arbitrary correction factor, but the result is not accurate. Therefore, in the present embodiment, the recess of the optic nerve head 44 is modeled as a parabolic curved surface, an ellipse, or a spherical surface having the first circle C1 as its diameter D, the shape formed by the shadow is expressed as an equation of the second circle C2, and the result derived from this calculation is used.
  • the size D of the optic nerve head 44 is the diameter D of the optic nerve head 44, and here, a predetermined literature value (in the case of a human, a general diameter is 1.5 mm) is used.
  • the diameter of the optic nerve head 44 may be estimated from the fundus photograph and used.
  • the value d is, in detail, the depth d of the depression of the optic nerve head 44.
  • The state estimation unit 33 acquires the depth d of the depression of the optic nerve head 44 obtained by the parameter acquisition unit 32 and estimates the state of the optic nerve head 44. For example, the state estimation unit 33 compares the acquired depth d with a general value (for example, a clinically obtained average value) or with the value obtained for the subject's eye 4 at a previous examination, and thereby estimates whether the condition of the optic nerve head 44 is normal or abnormal.
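The comparison step above can be sketched as follows. The function name, the reference value, and the 50 μm tolerance are all hypothetical placeholders chosen for illustration; the document does not specify a clinical threshold.

```python
def estimate_state(depth_um: float, reference_um: float,
                   tolerance_um: float = 50.0) -> str:
    """Compare the acquired cup depth d against a reference value (a
    clinical average, or the subject's previous measurement). The 50 um
    tolerance is a placeholder, not a clinical threshold."""
    return "normal" if abs(depth_um - reference_um) <= tolerance_um else "abnormal"

# Hypothetical usage with depths in micrometers:
estimate_state(300.0, 280.0)   # within tolerance
estimate_state(500.0, 280.0)   # deviates from the reference
```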
  • the function of the storage device 34 includes a storage unit 35.
  • the storage unit 35 stores various information used for estimating the state of the optic nerve head 44.
  • The storage unit 35 stores, for example, the single illumination image I S captured by the imaging unit 20, the single illumination image I S marked by the shadow information acquisition unit 31, the parameters D, d, and L acquired by the parameter acquisition unit 32, the estimation result of the state estimation unit 33, and the like. It also stores information such as the results detected by the irradiation adjusting unit 37, described later.
  • the emitted light is collected by the first lens R1 and the second lens R2 of the image pickup unit 20, and is imaged by the camera 22.
  • The single illumination image I S captured by the camera 22 is converted into an electric signal by the image sensor 23 of the imaging unit 20 and transmitted to the processing device 30. If the captured fundus image deviates significantly from the desired position, the subject can easily correct it by shifting the direction of the main body 2.
  • The shadow information acquisition unit 31 of the processing device 30 acquires the shadow S of the optic nerve head 44 from the acquired single illumination image I S.
  • the parameter acquisition unit 32 of the processing unit 30 acquires the length L of the shadow S formed in the recess of the optic nerve head 44 acquired by the shadow information acquisition unit 31.
  • the length L of the shadow S is obtained using the two circles C1 and C2.
  • the state estimation unit 33 of the processing unit 30 estimates the state of the optic nerve head 44 based on the depth d.
  • the number of parameters required to estimate the state of the optic nerve head 44 is small.
  • the length L of the shadow S can be extracted using two circles, and does not require specialized knowledge (particularly knowledge of mathematics). Therefore, the parameters necessary for estimating the state of the optic nerve head 44 can be obtained by a simple method.
  • the state estimation device for the internal tissue of the eyeball and the method thereof according to the second embodiment will be described with reference to FIGS. 4, 6 and 7.
  • the state estimation device according to the present embodiment is different from the state estimation device according to the first embodiment in that a plurality of lights are provided.
  • the same reference numerals are given to the configurations common to those of the first embodiment, and the description thereof will be omitted.
  • The housing 11 includes, in its interior, in addition to the upper illumination L U, a lower illumination L D (second light irradiation unit) installed at a lower position.
  • the illuminations L U and L D are both infrared LEDs.
  • Like the upper illumination L U, the lower illumination L D is installed so that the optical axis O L of the illumination is inclined downward at a predetermined inclination angle θ1 with respect to the optical axis (eyeball optical axis) O E passing horizontally through the pupil 42 of the eyeball 40.
  • The predetermined inclination angle θ1 is set similarly to that of the upper illumination L U, for example to 45°.
  • The optical system used by the light irradiation unit 10 and the image pickup unit 20 will be described with reference to FIG. 7.
  • The upper illumination L U irradiates light from the upper position P1, and the lower illumination L D irradiates light from the lower position P2 (second position).
  • The light irradiated from the upper illumination L U and the lower illumination L D passes directly through the sclera of the eyeball 40 and/or through the skin around the eyeball 40, reaches the retina 41 inside the eyeball 40 including the optic nerve head 44, is reflected by the fundus 45, and is emitted from the pupil 42 to the outside of the eyeball 40.
  • The emitted light is collected by the imaging unit 20 and imaged. Since the image captured by the imaging unit 20 captures the two lights, it is referred to as a "plural illumination image I P" (second imaging information).
  • the plurality of illumination images IP are transmitted from the imaging unit 20 to the processing device 30.
  • Configuration of the control device: As shown in FIG. 4, the control device 3 further includes a lighting control device 36 (adjustment unit).
  • the function of the lighting control device 36 includes an irradiation adjusting unit 37.
  • The irradiation adjusting unit 37 acquires the plural illumination image I P from the imaging unit 20 and adjusts the illuminations L U and L D based on it.
  • FIG. 8 shows an example of the plural illumination image I P.
  • The irradiation adjusting unit 37 detects the irradiation state of the illuminations L U and L D from the lightness of the plural illumination image I P.
  • In the plural illumination image I P, it can be seen that the light strikes strongly on the left side of the image.
  • Where the optic nerve head 44 is imaged (the region surrounded by a broken line), there are several low-brightness points in the plural illumination image I P.
  • The irradiation adjusting unit 37 adjusts the direction and irradiation intensity of the illuminations L U and L D based on their irradiation states. The inclination angle θ1 of the illuminations L U and L D described in the first embodiment may therefore be changed from 45°. By changing the tilt angle θ1, the brightness of the image can be further emphasized. Furthermore, the irradiation adjusting unit 37 may determine which of the upper illumination L U and the lower illumination L D is used to acquire a single illumination image I S.
  • The two illuminations L U and L D may irradiate the eyeball 40 at different times, and a single illumination image I S may be acquired with each.
  • The irradiation adjusting unit 37 may adjust the illuminations L U and L D by comparing the single illumination images I S acquired in this way, and may also determine which illumination L is used to acquire further single illumination images I S.
  • By determining the length L of the shadow from each of the single illumination images I S obtained with the two illuminations L U and L D, and comparing them or, for example, taking their average, the measurement precision can be increased further.
  • the number of lights may be three or more. In this case, different images can be obtained for the entire fundus 45. Further, the position of the illumination is not limited to the upper side and the lower side, and may be the left side and the right side.
  • the lighting may be a visible light LED instead of an infrared LED. Visible light can be used to estimate the state of arteries in the fundus 45, including the retina 41. If a shadow is visible in the lower part of the retina 41, an abnormality of the artery in the fundus 45 is presumed. The length of the shadow can be used to determine the presence or absence of arterial bleeding or blood clots.
  • The shadows formed by an object irradiated with light from a light source are mainly of two kinds: shadows formed where the object completely blocks the light rays (main shadow, or umbra) and shadows formed where the object partially blocks the light rays (penumbra).
  • The penumbra is not as distinct as the main shadow. Since the surface of the recess of the optic nerve head 44 is not flat, a penumbra is often formed in the recess depending on its surface shape. Therefore, to accurately detect shadows from an image containing penumbra, it is preferable to apply a clustering-based algorithm that divides the pixels of the image into a specific number of similar or dissimilar groups.
  • K-means clustering, the most common clustering algorithm, is used here.
  • The image processed by a method such as the K-means method may be further differentiated. By processing the image in this way, the shadow region can be specified more clearly.
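The clustering-plus-differentiation step can be sketched as below. This is a minimal illustration, not the patent's implementation: a real fundus image and the 1000 iterations mentioned in the verification section would replace the toy values, and a 1-D K-means on pixel intensities stands in for whatever feature space the authors used.

```python
import numpy as np

def kmeans_shadow_mask(img: np.ndarray, k: int = 2, iters: int = 20) -> np.ndarray:
    """Cluster pixel intensities with a minimal 1-D K-means and return a
    boolean mask of the darkest cluster (the candidate shadow region)."""
    x = img.astype(float).ravel()
    # Deterministic initialization: spread centers over the intensity range.
    centers = np.quantile(x, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    darkest = int(np.argmin(centers))
    return (labels == darkest).reshape(img.shape)

def edge_strength(img: np.ndarray) -> np.ndarray:
    """Differentiate the image (gradient magnitude) to sharpen shadow edges."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

# Toy stand-in for a normalized fundus image: a dark "shadow" patch
# on a bright background.
img = np.full((32, 32), 200.0)
img[10:20, 8:18] = 40.0
mask = kmeans_shadow_mask(img, k=2)   # True where the dark patch is
edges = edge_strength(img)            # strong response along the patch border
```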
  • the state estimation device 1 in this demonstration has the same structure as that shown in FIGS. 6 and 7. Specifically, two IR-LEDs (wavelength 850 nm) were used as light sources, and each was arranged so as to be at an angle of 45 ° with respect to the optical axis. The illumination light from the LED has an intensity that does not cause heat damage, and when it passes through the sclera, it diffuses and can uniformly illuminate the internal region of the eyeball 40.
  • a 78D ophthalmic lens (outer angle field of view 60 °, focal length 8 mm) (Righton, Japan) was used as the first lens R1 (objective lens).
  • An aberration correction lens was used as the second lens R2.
  • For IR imaging, the color filter of the camera sensor of the Web camera (Logitech HD Pro Webcam C920) was replaced with an IR filter.
  • the designed state estimation device was a palm-sized, portable and lightweight size, and had excellent ability to capture a fundus image as shown in FIGS. 9A and 9B.
  • FIG. 10A is a monochrome image of a pig's eyeball obtained using the state estimation device 1 of the present embodiment.
  • FIG. 10B is an intensity image in which each pixel is normalized to an 8-bit intensity (256 levels).
  • FIG. 10C is the image obtained after 1000 iterations of the K-means method, which clearly separated the shadowed part from the non-shadowed part. The image of FIG. 10C was then differentiated in order to obtain the shadow length more accurately.
  • FIG. 10D is the image obtained by applying this differential processing to the K-means result (the differential K-means method).
  • The shadow area identified by the differential processing is the area surrounded by the broken line.
  • The diameter of the optic nerve head was about 2.1 mm.
  • The angle of oblique illumination in this image was 45°.
  • When the depth of the optic disc at various shadow cross sections was calculated using Equation 3, it varied from 179 μm to 350 μm.
  • This range agreed well with (overlapped) the range of optic disc depth values obtained for the porcine eyeball by OCT. Thus, with the state estimation device 1 of the present embodiment, an optic nerve head depth equivalent to that measured by OCT can be obtained easily without using an expensive instrument such as an OCT system, demonstrating that the state estimation device 1 is excellent as a simple screening instrument.
  • The volume and area of the optic nerve head can also be calculated from the values obtained by the state estimation device 1 of the present embodiment.
  • The shape information obtained from the shadow can be used to reconstruct the 3D shape of the optic nerve head, and the progression of various eye diseases, including glaucoma, could potentially be detected easily through routine examination.
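A minimal sketch of the clustering-based shadow segmentation and differentiation steps described above, using a pure-Python two-cluster K-means over pixel intensities. This is a stand-in for the actual implementation, which would operate on the full normalized fundus image; the image values in the usage note are illustrative.

```python
# Sketch: 1-D K-means (k = 2) over pixel intensities separates shadow from
# non-shadow, and differentiating the label image marks the shadow edges.

def kmeans_1d(values, iters=100):
    """Two-cluster K-means on scalar intensities; returns both centroids."""
    cents = [float(min(values)), float(max(values))]  # spread initialization
    for _ in range(iters):
        groups = ([], [])
        for v in values:
            groups[0 if abs(v - cents[0]) <= abs(v - cents[1]) else 1].append(v)
        new = [sum(g) / len(g) if g else cents[i] for i, g in enumerate(groups)]
        if new == cents:  # converged
            break
        cents = new
    return cents

def shadow_labels(image):
    """Label each pixel 1 if nearer the darker centroid (shadow), else 0."""
    flat = [p for row in image for p in row]
    c_dark, c_bright = sorted(kmeans_1d(flat))
    return [[1 if abs(p - c_dark) <= abs(p - c_bright) else 0 for p in row]
            for row in image]

def horizontal_edges(labels):
    """Differentiate along rows: nonzero where the shadow label changes."""
    return [[abs(row[x + 1] - row[x]) for x in range(len(row) - 1)]
            for row in labels]
```

For an 8-bit image, the two centroids settle near the mean shadow and background intensities; the nonzero entries of `horizontal_edges` give the pixel columns at which the shadow begins and ends, from which the shadow length can be read off.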

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present invention estimates the state of an eyeball internal tissue simply and with high accuracy. A device 1 for estimating the state of an eyeball internal tissue includes a first light irradiation unit L_U, an imaging unit 20, and a processing unit. The first light irradiation unit L_U is disposed at a first position P1 outside an eyeball 40 of a living body and irradiates light toward a target tissue 44 in the interior 45 of the eyeball 40. The imaging unit 20 captures an image of the light reflected in the interior 45 of the eyeball 40. The processing unit acquires shadow information about the target tissue 44 from first imaging information obtained by the imaging unit 20, and estimates the state of the target tissue 44 from the shadow information.

Description

Device for estimating the state of eyeball internal tissue and method therefor
 The present invention relates to a device for estimating the state of the internal tissue of the eyeball, and a method therefor.
 A method of irradiating the internal tissue of the eyeball with light and imaging a target tissue is known. For example, an imaging system for eye tissue has been proposed in which irradiation beams are emitted from different illumination points toward the eye tissue, and light backscattered from at least one of the retina and the iris is collected through the pupil to acquire an image of the fundus. According to this system, the fundus image can be processed to obtain a phase-contrast image, and information on the eye tissue can be extracted from the phase-contrast image (see Patent Document 1).
Japanese Translation of PCT International Publication No. 2019-518511
 However, the imaging system for eye tissue described above requires at least two images captured from two different illumination points in order to obtain a phase-contrast image. Furthermore, to extract information on the eye tissue, a phase-contrast image must be calculated from the intensities of the two images, and this phase-contrast image must be reconstructed using an algorithm with a predetermined function to obtain an image containing only the phase and absorption information of the light. Extracting eye-tissue information thus requires applying multiple processing steps to multiple fundus images, which is complicated.
 One object of the present invention is to estimate the state of the internal tissue of the eyeball simply and with high accuracy.
 The present invention is not limited to this object; achieving actions and effects that are derived from the configurations shown in the embodiments described below and that cannot be obtained with the conventional technique can also be positioned as another object of the present invention.
 In one aspect, a device for estimating the state of eyeball internal tissue includes a first light irradiation unit that is disposed at a first position outside an eyeball of a living body and irradiates light toward a target tissue inside the eyeball, and an imaging unit that captures the light reflected inside the eyeball. The device further includes a processing unit that acquires shadow information about the target tissue from first imaging information obtained by the imaging unit, and estimates the state of the target tissue from the shadow information.
 In another aspect, a method for estimating the state of eyeball internal tissue irradiates light from a first position outside an eyeball of a living body toward a target tissue inside the eyeball, and captures the light reflected inside the eyeball. Further, shadow information about the target tissue is acquired from the captured first imaging information, and the state of the target tissue is estimated from the shadow information.
 According to the present invention, the state of the internal tissue of the eyeball can be estimated simply and with high accuracy.
FIG. 1 is a diagram showing the configuration of the device for estimating the state of eyeball internal tissue according to the first embodiment.
FIG. 2 is a cross-sectional view schematically showing the internal structure of the main body of FIG. 1.
FIG. 3 is a diagram showing the optical system used by the light irradiation unit and the imaging unit of FIG. 2.
FIG. 4 is a block diagram showing the configuration of the control device of FIG. 1 and the functions of each device.
FIG. 5 shows images captured in the first embodiment: FIG. 5A shows an example of a marked image in which the shadow has been extracted, and FIG. 5B shows a method of obtaining the size of the shadow from the image of FIG. 5A.
FIG. 6 is a cross-sectional view schematically showing the internal structure of the main body in the second embodiment.
FIG. 7 is a diagram showing the optical system used by the light irradiation unit and the imaging unit of the second embodiment.
FIG. 8 is a diagram showing an image captured in the second embodiment.
FIG. 9 shows verification results using a model eye: FIG. 9A shows an example of a marked image in which the shadow has been extracted, and FIG. 9B shows a method of obtaining the size of the shadow from the image of FIG. 9A.
FIG. 10 compares measurement results obtained by the present state estimation device and by optical coherence tomography (OCT): FIG. 10A shows an image of an enucleated pig eye captured using the fundus camera, FIG. 10B shows an intensity-normalized version of the image of FIG. 10A, FIG. 10C shows the image of FIG. 10B after applying a clustering algorithm using the K-means method, and FIG. 10D shows the image of FIG. 10C after differential processing.
Hereinafter, embodiments of the present invention relating to the device for estimating the state of eyeball internal tissue and the method thereof will be described.
Under the influence of increased intraocular pressure and other factors, glaucoma causes progressive optic nerve atrophy accompanied by a change in shape near the optic nerve head; as it progresses, the optic nerve fibers are damaged and visual field loss occurs. Because the optic nerve fibers disappear as glaucoma progresses, the recess of the optic nerve head 44 shown in FIG. 2, where the optic nerve converges, often deepens before visual field loss occurs. Glaucoma can therefore be detected by observing the three-dimensional shape of the optic nerve head and its change over time.
In particular, glaucoma becomes more likely to develop with aging. Once glaucoma develops, the optic nerve is gradually damaged and the visual field narrows. In the early stages, however, the symptoms of glaucoma are difficult for patients to notice. For this reason, the start of glaucoma treatment tends to be delayed.
Three-dimensional fundus image analysis using optical coherence tomography (OCT) is known as a method for measuring the optic nerve head, but OCT systems are expensive and at present are available only at specialized hospitals and the like, so people cannot use them for routine examination. As noted above, because glaucoma is hard to notice, by the time subjective symptoms appear and the patient visits a specialized hospital it may already be too late, with outcomes such as blindness. The development of equipment with which many people can easily screen for the onset of glaucoma on a routine basis is therefore important.
In this embodiment, in order to examine the onset of glaucoma early and simply, a method of accurately estimating the state of the optic nerve head, as an example of internal tissue of the eyeball and in particular of a target tissue, will be described.
In this embodiment, the directions in the drawings used in the description are defined as follows.
The horizontal directions are subdivided into the front-rear direction (the front is indicated by "F" and the rear by "B" in the figures) and the left-right direction (the left is indicated by "L" and the right by "R"). Left and right are determined with reference to a person facing from the rear toward the front. Of the vertical directions, the direction in which gravity acts is defined as downward (indicated by "D" in the figures), and the opposite direction as upward (indicated by "U").
[1. First Embodiment]
[1-1. Configuration of the device for estimating the state of eyeball internal tissue]
As shown in FIG. 1, the present embodiment describes a device 1 for estimating the state of eyeball internal tissue, which includes a main body 2 and a control device 3.
Here, an example in which the main body 2 and the control device 3 are connected wirelessly is shown, but they may instead be connected by wire.
The main body 2 transmits captured images to the control device 3. In this example, an image capturing the light reflected inside the eyeball is transmitted. The main body 2 includes a light irradiation unit 10 and an imaging unit 20, which are built as an integral structure.
The main body 2 is small enough for the subject to carry by hand and can also be used, for example, by attaching it to the camera of a portable device.
The control device 3 receives the image from the main body 2 and estimates the state of the internal tissue of the eyeball from the image.
[1-2. Internal structure of the main body 2]
As shown in FIG. 2, the light irradiation unit 10 has a housing 11.
Here, the housing 11 is provided with a hole 12 in its rear surface. Since the left and right eyeballs are measured separately, a single hole 12 is provided here; however, if both eyeballs are to be measured at the same time, two holes may be provided side by side in the left-right direction. The hole 12 is large enough to surround the periphery of the eye 4 of the living body. When the rear surface of the housing 11 is brought into contact with the subject's face so that the hole 12 is aligned with the position of the subject's eye 4, a closed space is formed between the periphery of the eye 4 and the housing 11.
A flexible member such as rubber is preferably provided around the hole 12, between the hole 12 and the surface that contacts the subject's face. The flexible member contacting the face around the eye 4 reduces stray light from the outside, and the deformation of the flexible member makes it easy to align the optical axis of the imaging unit 20 with the subject's eye 4.
Illumination is provided inside the housing 11. The illumination is an infrared LED. Visible light or the like may also be used, but infrared light is preferable because it does not dazzle the subject and because of its penetration depth when irradiating through the skin or the like. In the present embodiment, the illumination light is not made to enter through the cornea via a lens, as is generally done, but is irradiated directly through the sclera of the eyeball 40 or the skin around the eyeball 40. A mydriatic drug for pupil dilation, as used in ordinary eye examinations, is therefore unnecessary. Here, for example, an LED with a wavelength of 850 nm is used. Because the illumination is installed at an upper position inside the housing 11, it is referred to as the "upper illumination L_U" (first light irradiation unit), and its installation position as the "upper position P1" (first position). The upper position P1 will be described later. Although the description here uses the upper position P1, the illumination position is not limited to it as long as light can enter through the sclera or the skin around the eyeball; a lower, right, or left position is also possible. In this embodiment, oblique/off-axis illumination through the sclera and the skin around the eyeball 40 is mainly used. The illumination need not be fixed; for example, it may be mounted on a rotating member around the outer periphery of the hole 12 inside the housing so that it can be moved to an appropriate irradiation position.
The imaging unit 20 includes a tubular body 21. Inside the tubular body 21, a first lens R1, a second lens R2, a camera 22, and an image sensor 23 are provided.
Here, the first lens R1 and the second lens R2 are existing ophthalmic objective lenses. The first lens R1 is provided at the boundary where the front surface of the housing 11 connects to the rear surface of the tubular body 21. The second lens R2 is provided in the central region of the tubular body 21 in the front-rear direction. The first lens R1 and the second lens R2 collect the light reflected inside the eyeball 40 and relay it to the camera 22.
The camera 22 is provided in front of the second lens R2. The camera 22 takes in the light from the second lens R2 and acquires the captured image. The camera 22 includes an image sensor 23, which forms an image of the imaging target and converts the light and dark of the imaged light into an electric signal. The image converted into an electric signal is transmitted to the control device 3. The imaging unit 20 is an example of an imaging section.
[1-3. Optical system]
The optical system used by the light irradiation unit 10 and the imaging unit 20 will be described with reference to FIGS. 2 and 3.
First, the structure of the eye 4 will be briefly described with reference to FIG. 2. The eyeball 40 is protected by a plurality of membranes, arranged from the outside of the eyeball inward in the order of the sclera, the choroid, and the retina 41. Light entering the eyeball 40 through the pupil 42 is sensed by the photoreceptor cells of the retina 41 and transmitted to the brain through the optic nerve 43 spread over the retina 41, producing an image. The optic nerve 43 covering the retina converges into a single thick bundle at the back of the eyeball. The place where the optic nerve 43 converges into one bundle is called the optic nerve head 44. The optic nerve head 44 is often recessed; in other words, its shape can be regarded as crater-like.
The upper illumination L_U is installed inside the housing 11 such that its optical axis O_L is inclined upward at a predetermined inclination angle θ1 with respect to the optical axis O_E that passes horizontally through the pupil 42 of the eyeball 40 (the optical axis of the eyeball). The predetermined inclination angle θ1 is set, for example, to 45° so that the light from the upper illumination L_U enters the eyeball 40 at a site other than the pupil 42 and illuminates the fundus 45. In other words, the upper position P1 at which the upper illumination L_U is installed is a position inclined upward by 45° with respect to the hole 12 of the housing 11.
As shown in FIG. 3, the upper illumination L_U irradiates light from the upper position P1 toward the interior of the eyeball 40. Here, the light emitted from the upper illumination L_U passes directly through the sclera of the eyeball 40 and/or the skin around the eyeball 40, reaches the retina 41, and illuminates the fundus 45, including the optic nerve head 44, inside the eyeball 40. The light entering the eyeball 40 is scattered to some extent inside it, but most of the light is reflected by the fundus 45 and exits the eyeball 40 through the pupil 42.
The first lens R1 (aberration correction lens) of the imaging unit 20 collects the light emitted from the pupil 42 to the outside of the eyeball 40. The second lens R2 (aberration correction lens) relays the light from the first lens R1 to the image sensor 23.
The imaging range of the fundus 45 is determined by the angle of view θ2 and the focal length F_L. The angle of view θ2 expresses the width of the imaging range as an angle. As shown in FIG. 2, the focal length F_L here refers to the distance between the focal point F_P, at which the light reflected from the fundus 45 converges, and the center of the first lens R1. To image the fundus 45 over a wide range, the focal length F_L must be set short. In the present embodiment, the focal length F_L is set so that the angle of view θ2 is 60°.
The central axes of the optical axes of the first lens R1 and the second lens R2 coincide with the central axis of the tubular body 21.
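The relation between the angle of view θ2 and the focal length F_L can be illustrated with the standard thin-lens field-of-view formula. This formula is not stated in the description, and the image-field height used below is an illustrative assumption.

```python
import math

# Standard thin-lens relation (an assumption, not from the description):
# theta2 = 2 * atan(h / (2 * F_L)), where h is the image-field height.

def angle_of_view_deg(field_height_mm, focal_mm):
    """Angle of view theta2 (degrees) for a given field height and focal length."""
    return math.degrees(2 * math.atan(field_height_mm / (2 * focal_mm)))

def focal_length_mm(field_height_mm, angle_deg):
    """Focal length F_L needed to obtain a given angle of view theta2."""
    return field_height_mm / (2 * math.tan(math.radians(angle_deg) / 2))
```

With an assumed field height of about 9.2 mm, a 60° angle of view corresponds to a focal length of roughly 8 mm, which is consistent with the 8 mm focal length of the 78D ophthalmic lens used in the demonstration described later.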
As shown in FIG. 3, the camera 22 collects the light relayed from the second lens R2 with a lens (not shown). The collected light is received by the image sensor 23 and imaged. In the present embodiment, since the fundus 45 is imaged while irradiated with light from the upper illumination L_U alone, the captured image is referred to as the "single illumination image I_S" (first imaging information).
[1-4. Configuration of the control device]
The configuration of the control device 3 will be described with reference to FIG. 4.
The control device 3 includes a processing device 30 and a storage device 34. The processing device 30 realizes the functions described later by executing a program stored in the storage device 34. In this example, the processing device 30 is a CPU (Central Processing Unit). The processing device 30 may instead be configured as a DSP (Digital Signal Processor) or a programmable logic device (PLD). The processing device 30 is an example of a processing section.
The storage device 34 stores information readably and writably. For example, the storage device 34 includes at least one of a RAM (Random Access Memory), a ROM (Read Only Memory), an HDD (Hard Disk Drive), an SSD (Solid State Disk), a semiconductor memory, and an organic memory. The storage device 34 may also include a recording medium such as a flexible disk, an optical disc, a magneto-optical disk, or a semiconductor memory, and a reading device capable of reading information from the recording medium.
The control device 3 may be realized by an integrated circuit (for example, an LSI (Large Scale Integration)).
[1-5. Functions of the processing device]
As shown in FIG. 4, the functions of the processing device 30 comprise a shadow information acquisition unit 31, a parameter acquisition unit 32, and a state estimation unit 33.
The shadow information acquisition unit 31 acquires the contour shape of the optic nerve head 44 and the shadow S (shadow information) from the single illumination image I_S acquired from the imaging unit 20. (In the present invention, a "shadow" refers to a dark region produced when light rays are blocked by an object or the like.) The light emitted from the upper illumination L_U strikes the optic nerve head 44 obliquely. As described above, since the optic nerve head 44 is recessed, a shadow S is formed in the recess, on the side from which the light is irradiated, according to the depth of the recess. The single illumination image I_S, which captures the light reflected from the fundus 45, therefore contains the shadow S of the recess of the optic nerve head 44. Since the single illumination image I_S containing the shadow S can be regarded as a stereoscopic image of the fundus 45, it is also called a quasi-3D image.
FIG. 5A shows an example of the single illumination image I_S; it was actually measured from a pig eye using the device 1 of the present embodiment.
The shadow information acquisition unit 31 extracts (identifies, acquires) the low-brightness region in the single illumination image I_S as the shadow S. At the same time, it extracts the contour of the optic nerve head 44. After extracting the shadow S using a predetermined threshold or the like, the shadow information acquisition unit 31 marks a range including the shadow S and its peripheral region. As shown in FIG. 5A, the marking here is done with a broken line.
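The threshold-based shadow extraction and marking performed by the shadow information acquisition unit 31 can be sketched as follows. The threshold value and the bounding-box style of marking are illustrative assumptions; the description says only that a predetermined threshold or the like is used.

```python
# Sketch: pixels darker than a predetermined threshold are taken as the
# shadow S, and a bounding box covering the shadow plus a margin is
# returned for marking. Threshold and margin values are assumptions.

def extract_shadow(image, threshold=50):
    """Return the set of (x, y) pixel coordinates belonging to the shadow S."""
    return {(x, y)
            for y, row in enumerate(image)
            for x, p in enumerate(row)
            if p < threshold}

def mark_region(shadow_pixels, margin=1):
    """Bounding box (x_min, y_min, x_max, y_max) around the shadow S."""
    if not shadow_pixels:
        return None
    xs = [x for x, _ in shadow_pixels]
    ys = [y for _, y in shadow_pixels]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)
```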
The parameter acquisition unit 32 acquires the size L of the shadow (first parameter) from the shadow S acquired by the shadow information acquisition unit 31. Specifically, the size L is the length of the shadow formed so as to extend into the recess of the optic nerve head 44 from the direction in which the light of the upper illumination L_U is irradiated (the longest length, from the edge of the recess, of the shadow extending in the direction in which the light travels). This shadow length L may be obtained directly from the obtained image I_S. With the direct method, however, it can be difficult to obtain the length accurately when the edge of the shadow is unclear. In the present embodiment, a method of calculating (acquiring) the length L of the shadow S will be described with reference to FIG. 5B.
(0) As a prerequisite, the parameter acquisition unit 32 enlarges in advance the part of the single illumination image I_S that contains the shadow S acquired by the shadow information acquisition unit 31 (the region surrounded by the broken line; see FIG. 5A) and enhances the shadow S using existing software.
(1) The length L of the shadow S is obtained using two circles or ellipses (hereinafter referred to as the "first circle C1" and the "second circle C2"). The circles C1 and C2 are preferably large enough to contain the shadow S.
First, the shadow S is enclosed by the first circle C1 so that one arc of the shadow S (part of its circumferential arc) is inscribed in the first circle C1. Specifically, it is desirable to enclose the shadow S with the first circle C1 along the contour of the recess of the optic nerve head 44 (preferably the inner contour; here, the inner contour refers in particular to the inner edge of the crater-shaped recess). If the shape is unclear, however, the first circle C1 is placed on the shadow S so that part of its arc follows the part of the contour (periphery) of the shadow S that forms part of the contour of the recess of the optic nerve head 44.
(2) Next, the shadow S is enclosed by the second circle C2 so that the other arc of the shadow S (the other part of its circumferential arc) is inscribed in the second circle C2. Specifically, the second circle C2 is placed on the shadow S so that part of its arc follows the part of the contour (periphery) of the shadow S that faces the inner contour of the recess of the optic nerve head 44. If the shape is unclear, the second circle C2 is placed on the shadow S so that its arc follows the contour of the shadow S opposite the part of the contour followed by the arc of the first circle C1. The part of the contour of the shadow S on which the second circle C2 is superimposed differs from the part on which the first circle C1 is superimposed; the arc of the second circle C2 therefore intersects the arc of the first circle C1 at two points.
(3) Then, the length of the shadow S enclosed between the arc of the first circle C1 and the arc of the second circle C2 is measured. Here, virtual coordinates, with the upward direction U (in FIGS. 2 and 3) as the Y axis and the left direction L as the X axis, are superimposed on the single illumination image I_S so that the origin o coincides, or substantially coincides, with the center of one circle, and the line connecting the two points where the arcs of the circles C1 and C2 intersect is parallel to the Y axis. Then, the width of the shadow S in the X-axis direction is measured, that is, the distance between the point where the arc of the second circle C2 crosses the X axis and the point where the arc of the first circle C1 crosses it. In other words, the length of the shadow S is the width of the region enclosed by the first circle C1 and the second circle C2, measured along a line approximately perpendicular to the line connecting the two intersection points of the arcs of circles C1 and C2. In this way, the shadow length L is acquired accurately.
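Once the two circles have been fitted, the width of the lens-shaped overlap region between their arcs can be computed directly from the circle parameters. The sketch below is illustrative and not taken from the patent: it assumes the circles have already been fitted to the two opposite edges of the shadow and that their arcs cross at two points, in which case the overlap width, measured along the line of centers (perpendicular to the chord joining the intersection points), is r1 + r2 minus the distance between the centers.

```python
import math

def shadow_length(c1, r1, c2, r2):
    """Width of the overlap (lens) region of two intersecting circles.

    c1, c2: (x, y) centers of the circles fitted to the two shadow edges;
    r1, r2: their radii. The arcs must intersect at two points, i.e.
    |r1 - r2| < dist(c1, c2) < r1 + r2.
    """
    dist = math.hypot(c2[0] - c1[0], c2[1] - c1[1])
    if not abs(r1 - r2) < dist < r1 + r2:
        raise ValueError("arcs do not intersect at two points")
    # Along the line of centers the overlap extends from (dist - r2)
    # to r1, so its width is r1 + r2 - dist.
    return r1 + r2 - dist

# Example: two unit-radius circles whose centers are 1.5 apart
# overlap over a width of 0.5.
L = shadow_length((0.0, 0.0), 1.0, (1.5, 0.0), 1.0)
```

This reduces the shadow-length measurement to the geometry of the two fitted circles, so no further pixel-level measurement is needed once the fit is done.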
Next, the parameter acquisition unit 32 acquires (calculates) a value d (depth; third parameter), which represents a size of the optic disc 44 different from the size D, based on the shadow length L and the size D of the optic disc (diameter; second parameter).
Here, it is assumed that the shape of the recess of the optic disc 44 can be modeled as a crater-shaped paraboloid, an ellipsoid, or a sphere. When oblique light is incident on such a shape and a shadow of exactly half the diameter D of the recess is obtained, the depth d of the recess is expressed by the following simple Equation 1 in terms of the shadow length L obtained in the two-dimensional plane projection and the inclination angle θ1.

[Equation 1: shown as an image in the original publication]
However, the above equation cannot be used for very deep or very shallow recesses. Moreover, for the recess of the optic disc 44, which has a complicated elliptical (circular) shape, it is very difficult to obtain a shadow of exactly half the diameter D. A shadow shorter or longer than half the diameter D could be corrected with an arbitrary correction factor, but the result would not be accurate.
Therefore, in the present embodiment, the recess of the optic disc 44 is modeled as a paraboloid, ellipsoid, or sphere whose diameter D is given by the first circle C1, the shape formed by the shadow is modeled as the second circle C2, and results derived from calculations expressing each as an equation are used.
Here, the size D of the optic disc 44 is its diameter; a predetermined literature value is used (for humans, a typical diameter is 1.5 mm). Alternatively, the diameter of the optic disc 44 may be estimated from a fundus photograph and that value used. The value d is, specifically, the depth of the depression of the optic disc 44. When the recess of the optic disc 44 is modeled as a paraboloid or sphere whose diameter D is given by the first circle C1, and the shape formed by the shadow is modeled as the second circle C2, the relationship among the shadow length L, the diameter D of the optic disc 44, and the depth d of the depression of the optic disc 44 is expressed by Equation 2.

[Equation 2: shown as an image in the original publication]

When the recess of the optic disc 44 is elliptical, the relationship among the shadow length L, the diameter D of the optic disc 44, and the depth d of the depression of the optic disc 44 is expressed by Equation 3.

[Equation 3: shown as an image in the original publication]
As shown in FIG. 2, the optical axis O_L of the upper illumination L_U is incident on the eyeball 40 at an inclination angle θ1 of 45°, so tan θ1 = 1. Therefore, the depth d of the depression of the optic disc 44 can be obtained by substituting the shadow length L, the diameter D of the optic disc 44, and the angle θ1 into the above formulas. Since the diameter D of the optic disc 44 and the angle θ1 are preset values, the depth d of the depression of the optic disc 44 can in practice be obtained simply by acquiring the shadow length L.
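As a concrete illustration of this substitution, the sketch below computes the depth with the elementary shadow relation only. It assumes L = d·tan θ1, i.e. d = L / tan θ1 (the simple model of Equation 1); the refined circle-based formulas of Equations 2 and 3 appear only as images in the original publication and are therefore not reproduced here. At θ1 = 45°, the relation reduces to d = L.

```python
import math

def cup_depth(shadow_len_mm, tilt_deg=45.0):
    """Depth of the optic disc depression from the shadow length.

    Assumes the elementary shadow relation L = d * tan(theta1), so that
    d = L / tan(theta1); at theta1 = 45 degrees this reduces to d = L.
    The patent's refined circle-based formulas (Equations 2 and 3) are
    not reproduced here.
    """
    tan_t = math.tan(math.radians(tilt_deg))
    if tan_t <= 0:
        raise ValueError("tilt angle must lie in (0, 90) degrees")
    return shadow_len_mm / tan_t

# At 45 degrees the depth approximately equals the shadow length.
d = cup_depth(0.35)
```

Because D and θ1 are fixed in advance, a function like this needs only the measured shadow length L, mirroring the observation in the text.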
The state estimation unit 33 receives the depth d of the depression of the optic disc 44 acquired by the parameter acquisition unit 32 and estimates the state of the optic disc 44. For example, the state estimation unit 33 estimates whether the state of the optic disc 44 is normal or abnormal by comparing the acquired depth d with a general value (for example, a clinically obtained average) or with the value obtained for the same subject's eye 4 at a previous examination.
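The comparison performed by the state estimation unit 33 can be sketched as follows. The tolerance band and the returned labels are illustrative assumptions for the sketch, not values specified in the patent.

```python
def estimate_state(depth_mm, reference_mm, tolerance_mm=0.1):
    """Classify a measured depression depth against a reference value.

    reference_mm may be a population average or the value measured for
    the same eye at a previous examination; tolerance_mm is an
    illustrative band, not a clinically validated threshold.
    """
    deviation = depth_mm - reference_mm
    if abs(deviation) <= tolerance_mm:
        return "normal"
    return "deeper than reference" if deviation > 0 else "shallower than reference"
```

In practice the reference and tolerance would be chosen clinically; the sketch only shows the shape of the decision.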
[1-6. Storage device functions]
As shown in FIG. 4, the storage device 34 functionally comprises a storage unit 35.
The storage unit 35 stores various information used for estimating the state of the optic disc 44. For example, the storage unit 35 stores the single illumination image I_S captured by the imaging unit 20, the single illumination image I_S marked by the shadow information acquisition unit 31, the parameters D, d, and L used and acquired by the parameter acquisition unit 32, and the estimation results of the state estimation unit 33. It further stores information such as the detection results of the irradiation adjustment unit 37, described later.
[1-7. Operation]
The method of estimating the state of the internal tissue of the eyeball will be described.
First, the hole 12 of the main body 2 is aligned with the position of the eyeball 40, and the rear surface of the housing 11 of the main body 2 is brought into contact with the subject's face.
Next, light is emitted from the upper illumination L_U inside the housing 11 toward the optic disc 44 inside the eyeball 40. The optical axis O_L of the upper illumination L_U is inclined upward by 45° with respect to the optical axis O_E of the eyeball 40.
The light emitted from the upper illumination L_U is reflected by the fundus 45 of the eyeball 40 and exits through the pupil 42 of the eyeball 40. The exiting light is collected by the first lens R1 and the second lens R2 of the imaging unit 20 and captured by the camera 22.
The single illumination image I_S captured by the camera 22 is converted into an electrical signal by the image sensor 23 of the imaging unit 20 and transmitted to the processing device 30.
If the captured fundus image deviates significantly from the intended position, the subject can easily correct it by shifting the orientation of the main body 2.
The shadow information acquisition unit 31 of the processing device 30 acquires the shadow S of the optic disc 44 from the acquired single illumination image I_S.
The parameter acquisition unit 32 of the processing device 30 acquires the length L of the shadow S formed in the recess of the optic disc 44 acquired by the shadow information acquisition unit 31. The length L of the shadow S is obtained using the two circles C1 and C2.
The parameter acquisition unit 32 acquires the depth d of the depression of the optic disc 44 based on the shadow length L, the diameter D of the optic disc 44, and the inclination angle θ1 of the upper illumination L_U.
The state estimation unit 33 of the processing device 30 estimates the state of the optic disc 44 based on the depth d.
[1-8. Actions and effects]
Since the state estimation device for the internal tissue of the eyeball of the present embodiment has the configuration described above, the following actions and effects are obtained.
(1) While the subject is looking straight ahead, a single illumination image I_S containing the shadow S formed in the recess of the optic disc 44 by the irradiated light can be acquired. As a result, the state of the optic disc 44 can be estimated easily and with high accuracy.
(2) Further, the depth d of the depression of the optic disc 44 can be obtained from the length L of the shadow S and the diameter D of the optic disc 44. In particular, since the present embodiment adopts a literature value for the diameter D of the optic disc 44, only the shadow length L needs to be acquired. Therefore, few parameters are required to estimate the state of the optic disc 44.
(3) The length L of the shadow S can be extracted using two circles and requires no specialized knowledge (in particular, no mathematical expertise). Therefore, the parameters necessary for estimating the state of the optic disc 44 can be obtained by a simple method.
(4) By irradiating the eyeball 40 obliquely from the upper illumination L_U, the fundus 45 of the eyeball 40 can be imaged without forcibly dilating the pupil 42. Therefore, the burden on the subject is small.
[2. Second embodiment]
The state estimation device for the internal tissue of the eyeball and the method according to the second embodiment will be described with reference to FIGS. 4, 6, and 7. The state estimation device according to the present embodiment differs from that of the first embodiment in that it includes a plurality of illuminations. Hereinafter, configurations common to the first embodiment are given the same reference numerals, and their description is omitted.
[2-1. Internal structure of main body 2]
As shown in FIG. 6, the housing 11 includes, in addition to the upper illumination L_U, a lower illumination L_D (second light irradiation unit) installed at a lower position. Both illuminations L_U and L_D are infrared LEDs.
Like the upper illumination L_U, the lower illumination L_D is installed so that its optical axis O_L is inclined downward at a predetermined inclination angle θ1 with respect to the optical axis (the optical axis of the eyeball) O_E that passes horizontally through the pupil 42 of the eyeball 40. As with the upper illumination L_U, the predetermined inclination angle θ1 is set to, for example, 45°.
[2-2. Optical system]
The optical system used by the light irradiation unit 10 and the imaging unit 20 will be described with reference to FIG. 7.
The upper illumination L_U emits light from the upper position P1 and, at the same time, the lower illumination L_D emits light from the lower position P2 (second position). The light emitted from the upper illumination L_U and the lower illumination L_D toward the optic disc 44 inside the eyeball 40 passes directly through the sclera of the eyeball 40 and/or the skin around the eyeball 40, reaches the retina 41, is reflected by the fundus 45, and exits the eyeball 40 through the pupil 42. The exiting light is collected and captured by the imaging unit 20. Since the image captured by the imaging unit 20 records both lights, it is referred to as the "multiple illumination image I_P" (second imaging information). The multiple illumination image I_P is transmitted from the imaging unit 20 to the processing device 30.
[2-3. Control device configuration]
As shown in FIG. 4, the control device 3 further includes an illumination control device 36 (adjustment unit). The illumination control device 36 functionally comprises an irradiation adjustment unit 37. The irradiation adjustment unit 37 acquires the multiple illumination image I_P from the imaging unit 20 and adjusts the illuminations L_U and L_D based on this multiple illumination image I_P.
FIG. 8 shows an example of the multiple illumination image I_P.
The irradiation adjustment unit 37 detects the irradiation state of the illuminations L_U and L_D from the brightness in this multiple illumination image I_P. Here, it can be seen that light strikes the left side of the multiple illumination image I_P strongly. Although the optic disc 44 is imaged (the region surrounded by the broken line), there are several low-brightness areas in the multiple illumination image I_P.
The irradiation adjustment unit 37 adjusts the orientation and irradiation intensity of the illuminations L_U and L_D according to their irradiation state. Accordingly, the inclination angle θ1 of the illuminations L_U and L_D described in the first embodiment may be changed from 45°. Changing the inclination angle θ1 can further enhance the brightness of the image.
Furthermore, the irradiation adjustment unit 37 may determine which of the upper illumination L_U and the lower illumination L_D is used to acquire the single illumination image I_S.
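A minimal sketch of the kind of brightness check the irradiation adjustment unit 37 could perform is shown below. Splitting the frame into upper and lower halves and comparing mean intensity is an assumption made for illustration, not the algorithm specified in the patent.

```python
def illumination_balance(image):
    """Mean brightness of the upper and lower halves of a grayscale image.

    image: list of rows, each a list of pixel intensities (0-255).
    Returns (upper_mean, lower_mean); a large gap between the two
    suggests one LED should be re-aimed or its intensity changed.
    """
    half = len(image) // 2

    def mean(rows):
        values = [p for row in rows for p in row]
        return sum(values) / len(values)

    return mean(image[:half]), mean(image[half:])

# Toy 4x2 image: bright at the top, dark at the bottom, so the lower
# illumination would be the candidate for adjustment.
upper, lower = illumination_balance([[200, 210], [190, 200], [40, 50], [30, 40]])
```

A real implementation would work on the camera frame and could compare left/right halves as well, but the decision structure is the same.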
[2-4. Actions and effects]
(5) By irradiating with the upper illumination L_U and the lower illumination L_D simultaneously, a wide-angle multiple illumination image I_P of the fundus 45 can be acquired. Therefore, the position, angle, and so on of the optic disc 44 to be imaged can be grasped accurately.
(6) Since the upper illumination L_U and the lower illumination L_D are adjusted based on the multiple illumination image I_P, a single illumination image I_S containing a clearer shadow S can be obtained. Therefore, the state of the optic disc 44 inside the eyeball 40 can be estimated with even higher accuracy.
[3. Modifications]
[3-1. Modification 1]
The above embodiments are merely examples, and there is no intention to exclude various modifications or applications of techniques not explicitly described therein. Each configuration of the present embodiments can be implemented with various modifications without departing from its spirit, and configurations can be selected and combined as needed.
For example, light may be emitted toward the eyeball 40 from the two illuminations L_U and L_D at different times, and a single illumination image I_S acquired for each. In this case, the irradiation adjustment unit 37 may adjust each illumination L_U, L_D by comparing the acquired single illumination images I_S with each other. Furthermore, the irradiation adjustment unit 37 may determine which illumination is used to acquire a further single illumination image I_S. The shadow length L may also be obtained from each of the single illumination images I_S obtained with the two illuminations L_U and L_D, and the results compared or averaged, for example, to further improve measurement accuracy.
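The compare-or-average step for the two single illumination images can be sketched as follows; the consistency threshold is an illustrative assumption, not a value from the patent.

```python
def combined_shadow_length(l_upper, l_lower, max_rel_diff=0.2):
    """Combine shadow lengths measured under the two illuminations.

    Returns (mean, consistent). If the two measurements disagree by more
    than max_rel_diff of their mean (an illustrative threshold), the
    pair is flagged so the images can be re-acquired rather than
    silently averaged.
    """
    mean = (l_upper + l_lower) / 2.0
    consistent = abs(l_upper - l_lower) <= max_rel_diff * mean
    return mean, consistent
```

Averaging two independent measurements reduces random error, while the consistency flag catches cases where one image gave an unreliable shadow.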
Alternatively, the number of illuminations may be three or more. In this case, different images can be acquired over the entire fundus 45. The positions of the illuminations are not limited to above and below; they may also be to the left and right.
Visible-light LEDs may be used for the illumination instead of infrared LEDs. Using visible light, the state of the arteries in the fundus 45, including the retina 41, can be estimated. If a shadow is visible in the lower part of the retina 41, an arterial abnormality in the fundus 45 is inferred. From the shadow length, the presence or absence of arterial bleeding or thrombus can be judged.
[3-2. Modification 2]
It may be difficult to accurately identify the shadow in a monochrome image such as those shown in FIGS. 5A, 5B, and 8. Possible causes include the position and the shape of the recess. The position of the recess is the cause when, for example, there is almost no recess near the center of the optic disc and the recess lies elsewhere; in this case, it is difficult to determine the position (contour) of the recess accurately from the shading at the inner edge of the optic disc alone. The shape of the recess is the cause when the shadow region formed inside the recess is not distinct. The shadow formed by an object illuminated by a light source consists mainly of the shadow cast where the object completely blocks the rays (umbra) and the shadow cast where it only partially blocks them (penumbra), and the penumbra is less distinct than the umbra. Since the surface of the recess of the optic disc 44 is not flat, a penumbra often arises inside the recess depending on its surface shape. Therefore, to detect the shadow accurately from an image containing a penumbra, it is preferable to apply a clustering-based algorithm that divides the pixels of the image into a specified number of similar or dissimilar groups. For example, it is preferable to use K-means clustering, the most common clustering algorithm. The image processed by K-means or a similar method may further be differentiated. Processing the image in this way allows the shadow region to be identified more clearly.
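A minimal 1-D K-means sketch of the kind of clustering described above is given below. It operates on a flat list of pixel intensities rather than a full image, and the initialization and iteration count are illustrative choices (the verification in section 4-2 reports 1000 iterations on real images).

```python
def kmeans_1d(values, k=2, iterations=100):
    """Minimal 1-D K-means on pixel intensities (assumes k >= 2).

    Returns the cluster centers sorted ascending; the cluster with the
    lowest center can be taken as the shadow region. Initial centers
    are spread evenly over the observed intensity range.
    """
    lo, hi = min(values), max(values)
    centers = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

def shadow_mask(values, centers):
    """Mark as shadow the pixels closest to the darkest center."""
    dark = centers[0]
    return [min(centers, key=lambda c: abs(v - c)) == dark for v in values]
```

A production version would run on the 2-D image (e.g. with a library implementation) and follow up with the differentiation step described above to sharpen the shadow boundary.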
[4. Verification results]
[4-1. Verification 1]
It was verified whether the depth is accurately calculated from the length of the shadow of the recess of the optic disc. The analysis results are evaluated with reference to FIGS. 9A and 9B.
[Optical design of the state estimation device for the internal tissue of the eyeball]
The state estimation device 1 in this verification has the same structure as that shown in FIGS. 6 and 7. Specifically, two IR LEDs (wavelength 850 nm) were used as light sources, each arranged at an angle of 45° to the optical axis. The illumination light from the LEDs has an intensity that poses no risk of thermal damage; after passing through the sclera it diffuses and can uniformly illuminate the internal region of the eyeball 40. A 78D ophthalmic lens (external field of view 60°, focal length 8 mm) (Righton, Japan) was used as the first lens R1 (objective lens), and an aberration-correcting lens was used as the second lens R2. As the camera 22, a webcam (Logitech HD Pro Webcam C920) was converted for IR imaging by replacing the color filter of its camera sensor with an IR filter. The designed state estimation device is palm-sized, portable, and lightweight, and exhibited excellent fundus imaging capability, as shown in FIGS. 9A and 9B.
[Calculation of depth from shadow length]
For the verification, a model eye was created using a 3D printer. The dimensions of the optic disc of the model eye were 3 mm in diameter and 1.5 mm in depth. An aberration-correcting lens with a focal length of 24 mm was used as the eye lens functioning as the pupil, and the model eye was filled with water. The model eye was measured using the state estimation device 1.
As shown in FIGS. 9A and 9B, the shadow of the light could be captured in the model-eye image, as in a fundus image. The obtained shadow image was analyzed using the method described in this embodiment. Since the model eye is spherical, Equation 2 was used. The depth calculated by the analysis was 1.50 mm, matching the designed depth.
Therefore, it was confirmed that the actual depth of the optic disc can be obtained from the shadow length using the method described in this embodiment, and that the state of the optic disc can be estimated from this depth.
[4-2. Verification 2]
[Comparison of measurement results for the depth of the optic disc]
For the depth of the optic disc, measurement results obtained with the state estimation device 1 of the present embodiment were compared with measurement results obtained with an existing OCT system.
As samples, three excised porcine eyeballs were prepared.
Analysis of the cross-sectional images of the optic disc measured by OCT showed that the depth at the measured cross-sectional points of these samples varied in the range of 183 μm to 490 μm.
Next, the porcine eyeballs were measured using the state estimation device 1 of the present embodiment. FIG. 10A is a monochrome image of a porcine eyeball obtained with the state estimation device 1; the shadow of the optic disc was obtained by oblique illumination. The optic disc of the porcine eyeball is known to be elliptical in shape, so Equation 3 was used to obtain its depth.
Further, image processing was performed to obtain an accurate shadow. FIG. 10B is an intensity image in which each pixel is normalized to 8-bit (256-level) intensity. FIG. 10C is the image obtained by iterating the K-means method 1000 times; the K-means method clearly separated the shadow region from the rest.
To obtain the shadow length more accurately, the image of FIG. 10C was differentiated. FIG. 10D is the differential K-means image obtained by this processing; the shadow region identified by the differentiation is the region surrounded by the broken line. The diameter of the optic disc was about 2.1 mm, and the angle of the oblique illumination in this image was 45°. When the depth of the optic disc was calculated at various shadow cross sections using Equation 3, it varied from 179 μm to 350 μm. This range agreed well with (overlapped) the range of optic disc depths obtained for the porcine eyeballs by OCT.
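The agreement between the two measurements amounts to an interval-overlap check on the reported depth ranges, which can be sketched as:

```python
def ranges_overlap(a, b):
    """True if the closed intervals a = (lo, hi) and b = (lo, hi) overlap."""
    return max(a[0], b[0]) <= min(a[1], b[1])

# Depth ranges (micrometres) reported in the text: OCT cross sections
# vs. the shadow-based estimates of the state estimation device 1.
oct_range = (183, 490)
device_range = (179, 350)
agree = ranges_overlap(oct_range, device_range)  # shared sub-range 183-350
```

The shared sub-range 183 μm to 350 μm is what the text refers to as the overlap between the two sets of results.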
From the above, by using the state estimation device 1 of the present embodiment, an optic disc depth equivalent to that obtained by OCT can be determined simply, without an expensive instrument such as an OCT system, demonstrating that the state estimation device 1 of the present embodiment is excellent as a simple examination device.
The volume and area of the optic disc cup can also be calculated from the values obtained by the state estimation device 1 of the present embodiment. Furthermore, the shape obtained from the shadow allows a 3D reconstruction of the optic disc shape, which may make it possible to detect the progression of various eye diseases, including glaucoma, simply and as part of routine examinations.
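As an illustration of how volume and area could follow from D and d, the sketch below assumes the parabolic cup model mentioned in the embodiments. For a paraboloid of revolution, the enclosed volume is half that of the circumscribing cylinder; this closed form is a standard geometric result used here for illustration, not a formula stated in the patent.

```python
import math

def cup_area_volume(diameter, depth):
    """Aperture area and volume of a cup modeled as a paraboloid of
    revolution with the given rim diameter and depth.

    A paraboloid of revolution encloses half the volume of its
    circumscribing cylinder, so V = (1/2) * pi * (D/2)**2 * d.
    """
    radius = diameter / 2.0
    area = math.pi * radius ** 2          # aperture (rim) area
    volume = 0.5 * area * depth           # paraboloid volume
    return area, volume

# Disc of diameter 1.5 mm with a 0.3 mm-deep cup (illustrative values).
area_mm2, volume_mm3 = cup_area_volume(1.5, 0.3)
```

Under a spherical or ellipsoidal model the corresponding cap formulas would be used instead; the point is only that D and d together determine these derived quantities.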
1 State estimation device for eyeball internal tissue
2 Main body
3 Control device
4 Eye
10 Light irradiation unit
11 Housing
12 Hole
20 Imaging unit (imaging section)
21 Cylinder
22 Camera
23 Image sensor
30 Processing device (processing unit)
31 Shadow information acquisition unit
32 Parameter acquisition unit
33 State estimation unit
34 Storage device
35 Storage unit
36 Illumination control device (adjustment unit)
37 Irradiation adjustment unit
40 Eyeball
41 Retina
42 Pupil
43 Optic nerve
44 Optic disc
45 Fundus
P1 Upper position (first position)
P2 Lower position (second position)
L_U Upper illumination (first light irradiation unit)
L_D Lower illumination (second light irradiation unit)
O_L Optical axis of illumination
O_E Optical axis of eyeball
F_P Focal point
F_L Focal length
θ1 Inclination angle
θ2 Angle of view
R1 First lens
R2 Second lens
I_S Single illumination image (first imaging information)
I_P Multiple illumination image (second imaging information)
C1 First circle
C2 Second circle
S Shadow (shadow information)
L Shadow length (first parameter)
D Diameter of optic disc (second parameter)
d Depth of optic disc depression (third parameter)
Claims (9)

1. A device for estimating the state of an internal tissue of an eyeball, comprising:
    a first light irradiation unit that is arranged at a first position outside the eyeball of a living body and irradiates light toward a target tissue inside the eyeball;
    an imaging unit that captures the light reflected inside the eyeball; and
    a processing unit that acquires shadow information about the target tissue from first imaging information obtained by the imaging unit and estimates the state of the target tissue from the shadow information.
  2.  The device for estimating the state of an internal tissue of the eyeball according to claim 1, wherein the processing unit:
     acquires, from the first imaging information, a first parameter representing the size of the shadow information;
     acquires, based on the first parameter and a second parameter representing a size of the target tissue, a third parameter representing a size of the target tissue different from the second parameter; and
     estimates the state of the target tissue from the third parameter.
  3.  The device for estimating the state of an internal tissue of the eyeball according to claim 2, wherein
     acquiring the first parameter includes obtaining the size of the shadow information using at least two circles.
  4.  The device for estimating the state of an internal tissue of the eyeball according to any one of claims 1 to 3, wherein
     the first position is a position at which the optical axis of the light irradiated by the first light irradiation unit is inclined at a predetermined angle with respect to the optical axis of the eyeball.
  5.  The device for estimating the state of an internal tissue of the eyeball according to any one of claims 1 to 4, further comprising, in addition to the first light irradiation unit:
     a second light irradiation unit that is arranged at a second position different from the first position and irradiates light toward the target tissue inside the eyeball; and
     an adjustment unit that adjusts the irradiation of the first light irradiation unit and the second light irradiation unit.
  6.  The device for estimating the state of an internal tissue of the eyeball according to claim 5, wherein
     the imaging unit acquires second imaging information by capturing light that is irradiated simultaneously from the first light irradiation unit and the second light irradiation unit toward the target tissue inside the eyeball and is reflected inside the eyeball, and
     the adjustment unit adjusts the irradiation of the first light irradiation unit and the second light irradiation unit based on the second imaging information.
  7.  The device for estimating the state of an internal tissue of the eyeball according to any one of claims 1 to 6, wherein
     the shadow information is shadow information within a depression of the optic nerve head.
  8.  The device for estimating the state of an internal tissue of the eyeball according to any one of claims 1 to 7, wherein
     the shadow information is obtained by image processing means that separates a shadow portion from a non-shadow portion using a clustering algorithm.
  9.  A method for estimating the state of an internal tissue of an eyeball, comprising:
     irradiating light from a first position outside an eyeball of a living body toward a target tissue inside the eyeball;
     capturing light reflected inside the eyeball;
     acquiring shadow information on the target tissue from captured first imaging information; and
     estimating the state of the target tissue from the shadow information.
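Claims 2 and 4 together suggest how a depth parameter can follow from a shadow: illumination inclined at angle θ1 to the eyeball's optical axis makes a depression of depth d cast a shadow of length L. Under a simple geometric model (an assumption here; the claims state no formula), d ≈ L / tan(θ1). A minimal sketch of that relation, with `estimate_cup_depth` as a hypothetical helper name:

```python
import math

def estimate_cup_depth(shadow_length_mm: float, tilt_angle_deg: float) -> float:
    """Estimate depression depth from shadow length under oblique illumination.

    Assumes the simple model d = L / tan(theta1), where theta1 is the
    inclination of the illumination axis from the eyeball's optical axis.
    This formula is a hypothetical reading of claims 2 and 4, not a
    relation stated in the patent.
    """
    if not 0.0 < tilt_angle_deg < 90.0:
        raise ValueError("tilt angle must be strictly between 0 and 90 degrees")
    theta = math.radians(tilt_angle_deg)
    return shadow_length_mm / math.tan(theta)

# Example: under 45-degree illumination, shadow length equals depth.
depth = estimate_cup_depth(0.3, 45.0)
```

At 45° the model degenerates to d = L; shallower illumination angles magnify the shadow, which is presumably why an oblique first position is claimed at all.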
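Claim 3 measures the shadow size using at least two circles (C1 and C2 in the reference-numeral list). One plausible reading, assumed here and not spelled out in the claims, is that an outer circle approximates the optic disc rim and an inner circle approximates the cup rim, with the shadow spanning the largest radial gap between them. A sketch under that assumption; `shadow_extent` is a hypothetical helper:

```python
import math

def shadow_extent(c1, r1, c2, r2):
    """Largest radial gap between an outer circle (center c1, radius r1,
    e.g. the disc rim) and an inner circle (center c2, radius r2, e.g.
    the cup rim), used here as a proxy for shadow length.

    Assumes the inner circle lies entirely inside the outer one; this
    two-circle construction is a hypothetical reading of claim 3.
    """
    offset = math.hypot(c2[0] - c1[0], c2[1] - c1[1])
    if offset + r2 > r1:
        raise ValueError("inner circle must lie inside the outer circle")
    # The gap is widest on the side the inner circle is shifted away from.
    return (r1 - r2) + offset

# Concentric circles: the gap is simply the difference of radii.
gap = shadow_extent((0.0, 0.0), 5.0, (0.0, 0.0), 3.0)
```

With an off-center inner circle the widest gap grows by the center offset, which matches the intuition that an eccentric cup throws a longer shadow on one side.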
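Claim 8 separates shadow from non-shadow pixels with a clustering algorithm but does not name one. A two-cluster 1-D k-means on pixel intensity is a common generic choice and serves as a stand-in sketch here; `split_shadow` is a hypothetical name:

```python
def split_shadow(pixels, iters=20):
    """Separate shadow (dark) from non-shadow (bright) pixels using a
    two-cluster 1-D k-means on intensity.

    A generic stand-in for the unnamed "clustering algorithm" of
    claim 8. Returns (threshold, shadow_pixels, lit_pixels).
    """
    # Initialize the centroids at the intensity extremes.
    c_dark, c_bright = float(min(pixels)), float(max(pixels))
    for _ in range(iters):
        # Assign each pixel to the nearer centroid.
        dark = [p for p in pixels if abs(p - c_dark) <= abs(p - c_bright)]
        bright = [p for p in pixels if abs(p - c_dark) > abs(p - c_bright)]
        # Recompute centroids from the current assignment.
        if dark:
            c_dark = sum(dark) / len(dark)
        if bright:
            c_bright = sum(bright) / len(bright)
    threshold = (c_dark + c_bright) / 2.0
    shadow = [p for p in pixels if p <= threshold]
    lit = [p for p in pixels if p > threshold]
    return threshold, shadow, lit

# Example: a clearly bimodal intensity list splits cleanly.
t, shadow, lit = split_shadow([10, 12, 11, 200, 210, 205])
```

In a real pipeline the shadow pixel set would then feed the first parameter (shadow length L) of claim 2; any robust two-class separation of intensities would serve the same role.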
PCT/JP2021/006219 2020-03-31 2021-02-18 Device for estimating state of eyeball internal tissue and method therefor WO2021199772A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021538069A JP7214270B2 (en) 2020-03-31 2021-02-18 STATE ESTIMATING DEVICE AND METHOD OF THE INTERNAL TISSUE OF EYE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020064949 2020-03-31
JP2020-064949 2020-03-31

Publications (1)

Publication Number Publication Date
WO2021199772A1 true WO2021199772A1 (en) 2021-10-07

Family

ID=77928315

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/006219 WO2021199772A1 (en) 2020-03-31 2021-02-18 Device for estimating state of eyeball internal tissue and method therefor

Country Status (2)

Country Link
JP (1) JP7214270B2 (en)
WO (1) WO2021199772A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04295329A (en) * 1991-03-25 1992-10-20 Sony Corp Inspecting device for pupil defect
JP2000083902A (en) * 1998-09-16 2000-03-28 Koonan:Kk Eyeball photographing device
JP2002269539A (en) * 2000-12-01 2002-09-20 Shigehiro Masui Image processor, image processing method, and computer- readable storage medium with image processing program stored therein, and diagnosis support system using them
JP2002542863A * 1999-04-29 2002-12-17 Torsana Diabetes Diagnostics A/S Analysis of fundus images
JP2009022506A (en) * 2007-07-19 2009-02-05 Gifu Univ System for analyzing fundus examination image and program for analyzing fundus examination image
JP2010187746A (en) * 2009-02-16 2010-09-02 Canon Inc Fundus camera
JP2014036207A (en) * 2012-08-10 2014-02-24 Fujitsu Ltd Depth detecting method, etching method, depth detecting device, and etching device
JP2018054296A (en) * 2016-09-26 2018-04-05 株式会社村田製作所 Road surface inspecting apparatus, road surface inspecting method and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109414162A (en) 2016-05-13 2019-03-01 洛桑联邦理工学院 For retinal absorption phase under oblique illumination and the system of dark-field imaging, method and apparatus


Also Published As

Publication number Publication date
JPWO2021199772A1 (en) 2021-10-07
JP7214270B2 (en) 2023-01-30

Similar Documents

Publication Publication Date Title
JP7250653B2 (en) Image processing device, image processing method and program
AU2016204944B2 (en) Photorefraction ocular screening device and methods
US9427152B2 (en) Adaptive infrared retinoscopic device for detecting ocular aberrations
JP6040224B2 (en) System and method for improving eye imaging
JP5026741B2 (en) Operation method of ophthalmic examination apparatus
JP4621496B2 (en) Line scan ophthalmoscope
JP5582772B2 (en) Image processing apparatus and image processing method
JP6940349B2 (en) Ophthalmic equipment
WO2017094243A1 (en) Image processing apparatus and image processing method
JP7348374B2 (en) Ophthalmology information processing device, ophthalmology imaging device, ophthalmology information processing method, and program
WO2017101222A1 (en) Method and system for calculating corneal refractive power
US10575987B2 (en) Ophthalmic treatment device and control method therefor
Gairola et al. SmartKC: smartphone-based corneal topographer for keratoconus detection
JP6637743B2 (en) Ophthalmic equipment
WO2021199772A1 (en) Device for estimating state of eyeball internal tissue and method therefor
Hasan et al. An algorithm to differentiate astigmatism from Keratoconus in Axial Topgraphic images
WO2022097621A1 (en) Ophthalmic information processing device, ophthalmic device, ophthalmic information processing method, and program
WO2023238729A1 (en) Ophthalmologic device, method for controlling ophthalmologic device, program, and recording medium
Schramm et al. A modified Hartmann–Shack aberrometer for measuring stray light in the anterior segment of the human eye
Hasan et al. Automatic diagnosis of astigmatism for Pentacam sagittal maps
Almeida-Galárraga Diagnosis and Degree of Evolution in a Keratoconus-Type Corneal Ectasia from Image Processing
ES2688769B2 (en) Method for measuring intraocular diffusion that affects different ocular media of the eye and computer program products thereof
Singh et al. Optical coherence tomography in current glaucoma practice: Pearls and Pitfalls
Fülep et al. Far-field infrared system for the high-accuracy in-situ measurement of ocular pupil diameter
TWM539329U (en) Cataract classification determination device

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2021538069

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21781582

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21781582

Country of ref document: EP

Kind code of ref document: A1