JP3548473B2 - Method and apparatus for identifying arteriovenous of fundus image, recording medium and apparatus - Google Patents


Info

Publication number
JP3548473B2
Authority
JP
Japan
Prior art keywords
blood vessel
value
fundus image
blood
processing unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP32992699A
Other languages
Japanese (ja)
Other versions
JP2001145604A (en)
Inventor
聡 佐久間
作一 大塚
啓之 新井
裕子 高橋
Original Assignee
日本電信電話株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電信電話株式会社
Priority to JP32992699A
Publication of JP2001145604A
Application granted
Publication of JP3548473B2
Anticipated expiration
Legal status: Expired - Fee Related

Description

[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a method and an apparatus for discriminating the arteries and veins in a fundus image, and more particularly to an image processing technique that enables labor saving and speedup of diagnosis and examination using fundus images, which are effective for grasping the state of ophthalmologic and systemic diseases.
[0002]
[Prior art]
The fundus image is the only means of observing a blood vessel image non-invasively, and is effective for grasping the state of systemic diseases, such as diabetes, that cause changes in the blood vessels of the whole body. In particular, it is known that the widths of the arteries and veins in the fundus image represent the progress of these diseases well. For this reason, measurement of the widths of arteries and veins in fundus images has been commonly performed not only in ophthalmology but also in internal medicine and health examinations. Until now, the blood vessel width has been measured manually by a doctor while directly observing the fundus image.
[0003]
However, these operations require specialized knowledge, are laborious, and entail high examination costs. To solve this problem, methods for automatically measuring the blood vessel width have been proposed (Japanese Patent Application No. 8-228807: a method for measuring the arteriovenous diameter ratio of the fundus; and Japanese Patent Application No. 10-352933: a method for automatically measuring the temporal change in the arteriovenous blood vessel diameter ratio in a fundus image, a system therefor, and a recording medium recording the program).
[0004]
Further, Japanese Patent Application Laid-Open No. Hei 8-71045 discloses a fundus examination apparatus in which, in a wavelength region of 600 nm or more, the light is split into beams of at least two different wavelength regions that are imaged independently by a fundus imaging unit, and the images are assigned to different colors and combined into a single color image on a display means, so that the arteries and veins of the choroidal blood vessels can be distinguished and observed.
[0005]
[Problems to be solved by the invention]
In the above-mentioned conventional methods, arteries and veins are identified on the assumption that an artery and a vein run adjacent to each other. However, since this assumption often does not hold, there is a problem that arteries and veins are erroneously identified. For this reason, the conventional methods have not yet been put to practical use.
[0006]
Thus, a technique for automatically and accurately identifying the arteries and veins in a fundus image has been awaited, but has not yet been realized.
[0007]
In addition, the conventional fundus examination apparatus has the problems that its configuration is complicated, the apparatus becomes large-scale, and the cost of fundus image examination increases.
[0008]
An object of the present invention is to provide a technique that enables a computer to automatically and accurately identify arteries and veins in a fundus image.
[0009]
It is another object of the present invention to provide a technique capable of realizing labor-saving and speeding up examination of blood vessels in a fundus image.
[0010]
The above and other objects and novel features of the present invention will become apparent from the description of the present specification and the accompanying drawings.
[0011]
[Means for Solving the Problems]
The following is a brief description of an outline of typical inventions disclosed in the present application.
[0012]
(1) A method of processing digitized fundus image data photographed in color by a fundus image processing apparatus including a processing unit area setting means, a blood vessel separating means, a feature amount calculating means, and an artery/vein identifying means, the method comprising:
a processing unit area setting step in which the processing unit area setting means sets, for the fundus image data, a processing unit area including one blood vessel and a background portion;
a blood vessel separating step in which the blood vessel separating means separates the blood vessel portion and the background portion in the processing unit area;
a feature amount calculating step in which the feature amount calculating means obtains, from the pixel values of the background portion and the blood vessel portion in the fundus image data for each processing unit area, the value I1(R) of the red component and the value I1(G) of the green component of the background portion and the value I2(R) of the red component and the value I2(G) of the green component of the blood vessel portion, and calculates, as a blood color feature quantity indicating the color tone of blood, the value {log I1(R) − log I2(R)} / {log I1(G) − log I2(G)}; and
an artery/vein identifying step in which the artery/vein identifying means identifies, based on the blood color feature quantity, whether the blood vessel in the processing unit area is an artery or a vein.
[0013]
(2) A method of processing digitized fundus image data photographed in color by a fundus image processing apparatus including a processing unit area setting means, a blood vessel separating means, a feature amount calculating means, and an artery/vein identifying means, the method comprising:
a processing unit area setting step in which the processing unit area setting means sets, for the fundus image data, a processing unit area including one blood vessel and a background portion;
a blood vessel separating step in which the blood vessel separating means separates the background portion and the blood vessel portion in the processing unit area;
a feature amount calculating step in which the feature amount calculating means obtains, from the pixel values of the background portion and the blood vessel portion in the fundus image data for each processing unit area, gamma-corrected so that γ < 1, the value I1(R) of the red component and the value I1(G) of the green component of the background portion and the value I2(R) of the red component and the value I2(G) of the green component of the blood vessel portion, and calculates, as a blood color feature quantity indicating the color tone of blood, the value {I1(R) − I2(R)} / {I1(G) − I2(G)}; and
an artery/vein identifying step in which the artery/vein identifying means determines, based on the blood color feature quantity, whether the blood vessel in the processing unit area is an artery or a vein.
[0014]
(3) The method for identifying arteries and veins of a fundus image according to (1) or (2), wherein the fundus image processing apparatus further includes a blood vessel image creating means, an arteriovenous intersection detecting means, a learning area setting means, a feature amount pair calculating means, and an evaluation function setting means, the method further comprising:
a blood vessel image creating step in which the blood vessel image creating means creates, from the fundus image data, blood vessel image data that is a binary image in which the blood vessel portion and the background portion are separated;
an arteriovenous intersection detecting step in which the arteriovenous intersection detecting means, taking the direction in which each blood vessel of the fundus image narrows as the positive direction of the blood vessel, detects, based on the blood vessel image data, an intersection where two blood vessels cross in the positive direction as an arteriovenous intersection;
a learning area setting step in which the learning area setting means sets, for each of the two blood vessels that cross at the arteriovenous intersection, a learning area including the blood vessel and a background portion;
a feature amount pair calculating step in which the feature amount pair calculating means calculates, for each of the two blood vessels, the blood color feature quantity from the pixel values of the background portion and the blood vessel portion in the fundus image data for each learning area, and takes the two values as a feature quantity pair; and
an evaluation function setting step in which the evaluation function setting means sets a threshold such that, for each of the feature quantity pairs corresponding to the intersections of a plurality of blood vessels extracted from the entire fundus image, one of the paired blood color feature quantities falls into the artery category and the other into the vein category,
wherein the artery/vein identifying step identifies whether the blood vessel in the processing unit area is an artery or a vein based on the blood color feature quantity and the threshold.
[0015]
(4) A computer-readable recording medium on which is recorded a program for causing a computer to execute the method for identifying arteries and veins of a fundus image according to any one of (1) to (3).
[0016]
(5) A fundus image processing apparatus for processing digitized fundus image data photographed in color, comprising:
processing unit area setting means for setting, for the fundus image data, a processing unit area including one blood vessel and a background portion;
blood vessel separating means for separating the blood vessel portion and the background portion in the processing unit area;
feature amount calculating means for obtaining, from the pixel values of the background portion and the blood vessel portion in the fundus image data for each processing unit area, the value I1(R) of the red component and the value I1(G) of the green component of the background portion and the value I2(R) of the red component and the value I2(G) of the green component of the blood vessel portion, and calculating, as a blood color feature quantity indicating the color tone of blood, the value {log I1(R) − log I2(R)} / {log I1(G) − log I2(G)}; and
artery/vein identifying means for identifying, based on the blood color feature quantity, whether the blood vessel in the processing unit area is an artery or a vein.
[0017]
(6) A fundus image processing apparatus for processing digitized fundus image data photographed in color, comprising:
processing unit area setting means for setting, for the fundus image data, a processing unit area including one blood vessel and a background portion;
blood vessel separating means for separating the blood vessel portion and the background portion in the processing unit area;
feature amount calculating means for obtaining, for each processing unit area, from the pixel values of the background portion and the blood vessel portion in the fundus image data, gamma-corrected so that γ < 1, the value I1(R) of the red component and the value I1(G) of the green component of the background portion and the value I2(R) of the red component and the value I2(G) of the green component of the blood vessel portion, and calculating, as a blood color feature quantity indicating the color tone of blood, the value {I1(R) − I2(R)} / {I1(G) − I2(G)}; and
artery/vein identifying means for identifying, based on the blood color feature quantity, whether the blood vessel in the processing unit area is an artery or a vein.
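As a minimal sketch of the gamma-corrected variant above (all parameter values here are illustrative assumptions; the claims only require γ < 1), the simple differences of gamma-corrected values stand in for the log differences of the other variant, presumably because a power law with γ < 1 is, like the logarithm, a concave mapping:

```python
def gamma_correct(v, gamma=0.45, vmax=255.0):
    # gamma < 1 as required above; 0.45 and the 8-bit range are illustrative
    return vmax * (v / vmax) ** gamma

def blood_color_feature_gamma(i1r, i2r, i1g, i2g):
    # {I1(R) - I2(R)} / {I1(G) - I2(G)} computed on gamma-corrected values
    return ((gamma_correct(i1r) - gamma_correct(i2r)) /
            (gamma_correct(i1g) - gamma_correct(i2g)))
```

Because gamma correction is monotonic, a redder (less red-absorbing) vessel still yields a smaller feature value than a darker one, preserving the artery/vein ordering of the log-based feature.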
[0022]
According to the present invention, an artery and a vein in a fundus image can be automatically and accurately identified. Thereby, the work of measuring the blood vessel width in the fundus image can be made more efficient, and the examination cost of the fundus image examination can be reduced.
[0023]
Hereinafter, the present invention will be described in detail with reference to the drawings together with embodiments (examples) according to the present invention.
[0024]
BEST MODE FOR CARRYING OUT THE INVENTION
FIG. 1 is a block diagram showing the schematic configuration of a fundus image processing apparatus according to an embodiment (example) of the present invention, in which 100 is the fundus image processing apparatus, 101 is a fundus image obtained from a film or a camera, 102 is an image input means, 103 is a data storage means for storing data (information) such as fundus image data and character/symbol data, 104 is an information input means for inputting information such as character data and symbol data, 105 is a processing unit area setting means, 106 is a blood vessel separation (blood vessel image creation) means, 107 is a feature amount calculating means, 108 is an arteriovenous identification means, 109 is a control section, and 110 is a result display/output means.
[0025]
The fundus image processing apparatus 100 of the present embodiment uses, for example, a general-purpose computer (a personal computer or the like) and, as illustrated in FIG. 1, comprises a processing unit area setting means 105 for setting, as a processing unit area, a blood vessel cross section or a small area including both a single blood vessel and its background portion in the fundus image; a blood vessel separating means 106 for separating the blood vessel portion and the background portion in the processing unit area; a feature amount calculating means 107 for calculating, as feature quantities, the features of the color distribution of the blood vessel portion and the background portion separated by the blood vessel separating means; an arteriovenous identifying means 108 for identifying, using the calculated feature quantities, whether the blood vessel in the processing unit area is an artery or a vein; and a control unit 109 for controlling each of these means.
[0026]
In this embodiment, a general-purpose computer (a personal computer or the like) is used as the fundus image processing apparatus 100, but the apparatus is not limited to a general-purpose computer; a dedicated image processing device may be created and used. The image input means 102 reads fundus image data into the fundus image processing apparatus 100 by one of the following methods.
[0027]
(a) Reading the data from a recording medium on which the fundus image data is recorded.
(b) Capturing the data directly from a digital camera or film scanner.
[0028]
As shown in FIG. 1, the fundus image processing apparatus 100 of the present embodiment stores all fundus image data captured by the image input means 102, together with intermediate data generated during processing, in the data storage means 103; the data are retrieved as needed and deleted when no longer needed. The processing unit area setting means 105 may either set the processing unit area automatically by computer, or have the user designate it using the result display/output means 110 such as a monitor and the information input means 104 such as a mouse and keyboard. In the former case, the information input means 104 in the figure becomes unnecessary.
[0029]
The operations of the processing unit area setting unit 105, the blood vessel separating unit 106, the feature amount calculating unit 107, and the arteriovenous identification unit 108 will be described later in detail.
[0030]
The result display/output means 110 displays/outputs the result, for example, as a color-coded illustration on the fundus image. Alternatively, coordinate data indicating the range of the processing unit area and data indicating whether the blood vessel is an artery or a vein are displayed on a screen, or output to a recording medium and stored.
[0031]
The fundus image data has a red component and a green component. For example, as shown in FIG. 2, the feature amount calculating means 107 is composed of an absorption contribution calculating means 107A that uses the color information of the blood vessel portion and the background portion in the processing unit area to calculate, for each band (or wavelength) of the image, an absorption contribution corresponding to the effect of light absorption by the blood vessel wall and blood; a blood color feature quantity calculating means 107B that uses the absorption contribution calculated for each band to calculate a blood color feature quantity indicating the color tone of the blood itself, independent of the thickness of the blood vessel wall and the inner diameter of the blood vessel; and a sub-control unit 109A.
[0032]
The sub-control unit 109A operates together with the control unit 109 as a single control unit.
[0033]
Alternatively, as shown in FIG. 3, for example, the feature amount calculating means 107 comprises a blood color feature quantity calculating means 107B for calculating, as color feature quantities, the R value (red component) and G value (green component) of the blood vessel portion and the R value and G value of the background portion in the processing unit area; a blood vessel width feature quantity calculating means 107C for calculating the width of the blood vessel in the processing unit area as a feature quantity; and a sub-control unit 109B.
[0034]
The sub-control unit 109B operates together with the control unit 109 as a single control unit.
[0035]
Further, as shown in FIG. 4, a configuration can also be adopted in which a retinal nerve papilla detecting means 111, an arteriovenous intersection detecting means 112, a learning area setting means 113, a feature quantity pair calculating means 114, an evaluation function setting means 115, and a sub-control unit 109C are added. With these additional means, an evaluation function for arteriovenous identification is set, and the set evaluation function is used in the arteriovenous identification means 108.
[0036]
The sub-control unit 109C operates together with the control unit 109 as a single control unit. The configurations of FIGS. 2 to 4 will be described in detail later.
[0037]
Hereinafter, a method for discriminating an artery and a vein of a fundus image using the fundus image processing apparatus of the present embodiment will be described.
[0038]
Conventionally, fundus images have been photographed as silver halide photographs, but with the recent spread of digital cameras, photographing with digital cameras has been increasing. In the case of a digital camera, the photographed digital image data can be input directly to a computer; a silver halide photograph can also easily be input to a computer as digital image data by using a device such as a scanner. It is assumed here that the most common RGB format is used as the data format of the image input to the computer. A color image in a format other than RGB can easily be converted to RGB, and is therefore also applicable. The present invention is, however, not limited to a specific RGB format, and may use any image composed of a red component (R in a broad sense), a green component (G in a broad sense), and a blue component (B in a broad sense).
[0039]
First, the outline and properties of a fundus image will be described. As shown in FIG. 5, the bright region near the center left of the fundus image is called the retinal nerve papilla (hereinafter simply the papilla), and all major blood vessels have the property of extending from the papilla. Both arteries and veins are mixed among these blood vessels, and since the network configuration varies from person to person, arteries and veins cannot be distinguished from the network structure or position alone.
[0040]
On the other hand, the color of the blood flowing in a blood vessel changes depending on the oxidation state of its hemoglobin. Broadly speaking, oxidized hemoglobin is redder, while reduced hemoglobin has a darker, duller hue. Owing to this difference in the color tone of hemoglobin and to subtle differences in the blood vessel wall, arteries and veins appear subtly different on the image. By capturing this subtle color difference, an artery can be distinguished from a vein.
[0041]
Each of the means may be realized as a dedicated device, or, obviously, may easily be realized by executing a program on a general-purpose computer having a data input unit, a data operation unit, a temporary data storage unit, a data holding unit, a data display unit, and the like.
[0043]
FIGS. 6 to 9 are flowcharts showing the processing procedure of an arteriovenous identification method of a fundus image according to an embodiment of the present invention. FIG. 6 shows an outline of the processing procedure; FIG. 7 shows a detailed process of the feature amount calculating step shown in FIG. 6; FIG. 8 shows another detailed process of the feature amount calculating step shown in FIG. 6; and FIG. 9 shows a further detailed process.
[0044]
As shown in FIGS. 6 to 9, the processing procedure of the arteriovenous identification method of a fundus image according to the present embodiment consists of: a processing unit area setting step [1] of setting, as a processing unit area, a part including one blood vessel and its background; a blood vessel separation step [2] of separating the blood vessel portion and the background portion in the processing unit area; a feature amount calculating step [3] of calculating, as feature quantities, the features of the color distribution of the blood vessel portion and the background portion separated in the blood vessel separation step [2]; and an arteriovenous identification step [4] of identifying, using the calculated feature quantities, whether the blood vessel in the processing unit area is an artery or a vein.
[0045]
In the processing unit area setting step [1], a part of the fundus image in which an artery or a vein is to be identified is first set as a processing unit area. The part is specified as a cross section or a small region including the blood vessel portion and its background portion for one blood vessel, as shown in FIG. The cross section or small region may be designated manually by a person viewing the image, or detected automatically by the computer; the method is not limited here.
[0046]
An example of a method for a computer to automatically detect a cross section or a small area is described below.
First, a blood vessel image, which is a binary image in which the blood vessel portion and the non-blood-vessel portion (background portion) are separated, is created by the method described later (see "Method of creating a blood vessel image" in the blood vessel separation step [2]). The blood vessel portion of this image is thinned; a segment of fixed length perpendicular to each thin line and centered on it can then be detected as the cross section, or a local region (circle, rectangle, etc.) including that cross-sectional line segment can be detected as the small region. The thinning process is described in Reference 1 (Image Analysis Handbook, University of Tokyo Press, pp. 577-578; more detailed literature is cited there), and is not described here.
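As a minimal sketch of cutting out such a small region (the window size is an illustrative assumption, not a value from this description), a square processing unit area around a thinned-vessel pixel can be obtained as follows:

```python
import numpy as np

def processing_unit_region(vessel_mask, center, half=8):
    """Return bounds (r0, r1, c0, c1) of a square window around a
    skeleton pixel, clipped to the image, so that the window contains
    one blood vessel together with background pixels."""
    h, w = vessel_mask.shape
    r, c = center
    r0, r1 = max(0, r - half), min(h, r + half + 1)
    c0, c1 = max(0, c - half), min(w, c + half + 1)
    return r0, r1, c0, c1
```

The slice `vessel_mask[r0:r1, c0:c1]` then serves as the processing unit area for the subsequent separation and feature steps.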
[0047]
The extraction and analysis of a cross section based on thinning is described in Japanese Patent Application No. 10-352933 (a method for automatically measuring the temporal change in the arteriovenous blood vessel diameter ratio in a fundus image, a system therefor, and a recording medium recording the program), and may be performed with reference to it. Other methods of automatically detecting a cross section or small area are also conceivable; the method is not limited here.
[0048]
In the blood vessel separation step [2], the blood vessel and its background portion are separated within the set processing unit area. By applying the "method of creating a blood vessel image" described below, a blood vessel image, which is a binary image separating the blood vessel and background portions of the fundus image, can be obtained.
[0049]
Here, the blood vessel image may be created for the entire fundus image in advance and the corresponding processing unit area then referred to within it, or the blood vessel image may be created only for the designated processing unit area.
[0050]
A method of creating a blood vessel image, which takes the fundus image as input and creates a binary image in which the blood vessel portion and the non-blood-vessel portion are separated, will now be described. There are various such methods; here, a method using a smoothing filter is described as an example.
[0051]
FIG. 11 is a flowchart showing the procedure for creating a blood vessel image when a smoothing filter is used, and FIG. 12 shows how an actual blood vessel image is created. First, a smoothing filter is applied to the G (green component) image of the fundus image to create a smoothed G' image (FIG. 12(3)). There are various smoothing filters, such as a mean filter that takes the average of the pixel values in a local area as the value of its central pixel, a Gaussian filter, and the like. Details of smoothing filters are described in Reference 1, pp. 538-550, and are omitted here.
[0052]
Next, the difference (G'n − Gn) between the value G'n of the n-th pixel of the G' image obtained by the smoothing process and the value Gn of the pixel at the same position in the original G image is calculated; when this value is larger than a predetermined threshold Thl, pixel n is set as a blood vessel pixel, and when it is smaller than Thl, as a non-blood-vessel pixel. Performing this process on all pixels in the image yields a blood vessel image with the two values of blood vessel pixel and non-blood-vessel pixel (FIG. 12(5)). The noise component in the blood vessel image can also be reduced by applying a labeling process to the obtained image and finding and removing small isolated pixel groups.
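The smoothing-and-difference binarization above can be sketched as follows (the window size and threshold are illustrative values, and a plain mean filter stands in for whatever smoothing filter is chosen; vessels are darker than their surroundings in the G image, so the smoothed value exceeds the original there):

```python
import numpy as np

def vessel_image(G, win=5, thl=10.0):
    """Binary vessel image from the green channel: mean-smooth G,
    then mark pixels where (smoothed - original) > thl."""
    G = G.astype(float)
    pad = win // 2
    Gp = np.pad(G, pad, mode="edge")
    h, w = G.shape
    smooth = np.zeros((h, w))
    for i in range(h):           # naive mean filter, fine for a sketch
        for j in range(w):
            smooth[i, j] = Gp[i:i + win, j:j + win].mean()
    return (smooth - G) > thl
```

A labeling pass to drop small connected components, as the text suggests, would follow this step in a fuller implementation.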
[0053]
It should be noted that the above-described method of creating a blood vessel image can be performed on the entire image, or can be performed on a part (local region) of the image.
[0054]
In the present embodiment, the blood vessel image is created using the G image; however, any image other than the G image in which the blood vessel portion is clearly visible can be used in the same way.
[0055]
In the feature amount calculating step, feature quantities related to the color distribution are calculated in the set processing unit area. As in the example shown in FIG. 13 (color distribution of a cross section), the color distributions of the blood vessel and the background differ between arteries and veins. This color difference can be modeled by considering the reflection of light at the blood vessel background and the absorption of light by the blood vessel wall and blood at the blood vessel portion. Setting the feature quantity based on this optical model enables accurate identification.
[0056]
The optical model of the fundus image will be briefly described with reference to FIG. 14 (optical model of the fundus image). Let the intensity of the incident light at wavelength λ (λ may be an arbitrary wavelength or a wavelength band such as R, G, or B) be I0(λ), the intensity of the observation light where there is no blood vessel (that is, in the background) be I1(λ), and the intensity of the observation light at the blood vessel be I2(λ). Compared with I1(λ), I2(λ) is attenuated by the media it penetrates at the blood vessel (blood vessel wall and blood). In the case of FIG. 14,
[0057]
(Equation 1)
I2(λ) = I1(λ)·exp(−μd)·exp(−μ′d′)
is derived from Lambert-Beer's law (see Reference 2: Dictionary of Color Science, Asakura Shoten, item 500 on pp. 250). Here, μ is the absorption coefficient of blood, d is the thickness of the blood part, μ′ is the absorption coefficient of the blood vessel wall, and d′ is the thickness of the blood vessel wall; μ and μ′ are values specific to the media and generally depend on λ. Among these, the value of the blood absorption coefficient μ changes with the oxidation state of hemoglobin in the blood, and is therefore effective information for identifying arteries and veins. However, since the values of d and d′ are also unknown, the correspondence between the observed values I1(λ), I2(λ) and μ cannot be determined directly.
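Equation 1 can be checked numerically with a small helper (all numbers below are illustrative, not measured values):

```python
import math

def observed_intensity(i1, mu, d, mu_wall, d_wall):
    """Equation 1: background light i1 attenuated by blood
    (absorption coefficient mu, thickness d) and by the vessel
    wall (mu_wall, thickness d_wall), per Lambert-Beer's law."""
    return i1 * math.exp(-mu * d) * math.exp(-mu_wall * d_wall)
```

With both absorption coefficients zero the light passes unattenuated; any positive coefficient strictly reduces the observed intensity.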
[0058]
Therefore, if the relationship between the observed values I1(λ), I2(λ) and μ can be described, through some approximation or additional information, without depending on the unknowns d and d′, it can be used as a feature quantity for discriminating arteries and veins. An example of a feature quantity based on this idea is shown below.
[0059]
(Example 1 of feature amount)
Rewriting Equation 1 for each of the G and R bands gives:
[0060]
(Equation 2)
I2(G) = I1(G)·exp(−μg·d)·exp(−μg′·d′)
[0061]
(Equation 3)
I2(R) = I1(R)·exp(−μr·d)·exp(−μr′·d′)
Here, the suffixes g and r represent the G and R bands, and values with a dash (′) relate to the blood vessel wall. Transforming Equations 2 and 3 gives:
[0062]
(Equation 4)
−μg·d − μg′·d′ = log{I2(G)/I1(G)}
[0063]
(Equation 5)
−μr·d − μr′·d′ = log{I2(R)/I1(R)}
Here, the right-hand sides of Equations 4 and 5 are referred to as absorption contributions (an example).
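The absorption contribution of a band is directly observable as log(I2/I1); a sketch with illustrative coefficients (consistent with Equation 1, it recovers −(μd + μ′d′)):

```python
import math

def absorption_contribution(i2_band, i1_band):
    """Right-hand side of Equations 4 and 5: log(I2/I1) for one band.
    Under the model this equals -(mu*d + mu_wall*d_wall)."""
    return math.log(i2_band / i1_band)
```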
[0064]
Note that the absorption contribution in the present invention need not strictly follow Equations 4 and 5; it refers generally to a quantity that indicates the difference in color between the blood vessel portion and the background portion in a given band.
[0065]
Here, the following properties are assumed for the thickness d′ of the blood vessel wall and the inner diameter d of the blood vessel. First, the thickness of the blood vessel wall is assumed to be proportional to the inner diameter of the blood vessel (d′ = ad, a: constant). Also, μg can be treated as a constant, since it changes little even when the oxidation state of hemoglobin in the blood changes. This follows from the fact that the values of the hemoglobin spectral transmittance in the G region (approximately 500 nm to 600 nm) given in Reference 3 (New Color Science Handbook, edited by the Color Science Association of Japan, pp. 1293) are almost the same (more precisely, the values integrated with the G sensitivity curve of the device). Eliminating d using Equations 4 and 5 then gives:
[0066]
(Equation 6)
μr ∝ {log I1(R) − log I2(R)} / {log I1(G) − log I2(G)} + const.
Here, log I1(R), log I2(R), log I1(G), and log I2(G) are quantities observable from the image (details are described later).
[0067]
The right side of Equation 6 takes a small value for an artery and a large value for a vein, making it an effective feature for arteriovenous discrimination. When used as a feature, the constant term in Equation 6 can be ignored; the value obtained by removing the constant term from the right side of Equation 6 is referred to here as the blood vessel color feature (an example). Note that the blood vessel color feature in the present invention need not strictly follow Equation 6; it refers generally to any feature expressing the color tone of the blood itself, computed from the absorption contributions of the bands so as not to depend on the blood vessel wall thickness or the blood vessel inner diameter.
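As an illustrative sketch only (not the patent's reference implementation), the blood vessel color feature of Equation 6 with the constant term dropped can be computed as follows. The function name and all intensity values are invented for demonstration.

```python
import numpy as np

def vessel_color_feature(I1_R, I2_R, I1_G, I2_G):
    """Right side of Equation 6 without the constant term:
    {log I1(R) - log I2(R)} / {log I1(G) - log I2(G)}.
    I1_* are background intensities, I2_* are blood vessel intensities.
    """
    return float((np.log(I1_R) - np.log(I2_R)) /
                 (np.log(I1_G) - np.log(I2_G)))

# Invented example values: an artery absorbs little red light, so its
# I2(R) stays close to the background I1(R); a vein darkens R much more.
artery = vessel_color_feature(200.0, 180.0, 150.0, 90.0)
vein = vessel_color_feature(200.0, 120.0, 150.0, 90.0)
# Consistent with the text: a small value for an artery, large for a vein.
```

Because the feature is a ratio of log differences, any common scaling of the image intensities cancels, which is what makes it insensitive to overall illumination.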
[0068]
Even when the relationship between the wall thickness d′ and the inner diameter d is not the proportional one of Assumption 2, if the relationship between d′ and d can be obtained in some way, eliminating d and d′ with that relationship in the same manner yields a corresponding feature. The relationship between the inner diameter d and the wall thickness d′ may be given by any mathematical formula, or, essentially equivalently, as a correspondence table of measured values.
[0069]
Here, the process of calculating the right-hand sides of Equations 4 and 5 is an example of the absorption contribution calculation process shown in FIG. 2, and the process of calculating the right side of Equation 6 (excluding the constant term) is an example of the blood feature value calculation process shown in FIG.
[0070]
Next, a specific method of calculating I1(R), I2(R), I1(G), and I2(G) is described. In the following, I1(R) = Rs, I2(R) = Rv, I1(G) = Gs, and I2(G) = Gv (suffix s: background portion; suffix v: blood vessel portion).
[0071]
FIG. 15 is a flowchart showing the procedure for calculating the color feature according to the present embodiment. As shown in FIG. 15, first the blood vessel portion and the background portion in the processing unit area are separated (a blood vessel image is created). Referring to the processing unit region of the blood vessel image and the original image, the average R and G values of the blood vessel portion and of the background portion are calculated: the average of the blood vessel portion in the G image is Gv, the average of the background portion is Gs, the average of the blood vessel portion in the R image is Rv, and the average of the background portion is Rs. FIG. 15 shows this simple-average procedure. Alternatively, instead of a simple average, an RGB histogram may be computed separately for the blood vessel pixels and the background pixels in the processing unit area. The color of the blood vessel portion is then averaged over only the histogram range containing a fixed percentage of pixels counted from the low-luminance side, while the color of the background portion is averaged over only the range containing a fixed percentage of pixels counted from the high-luminance side. Details of this processing are shown in FIGS. 16 and 17.
[0072]
The threshold K in FIG. 17 is the parameter (K%) that determines the fixed percentage adopted from the histogram. This method eliminates the influence of pixels with intermediate values near the boundary between the blood vessel portion and the background portion, with the advantage that the difference between the two becomes clearer. For the same purpose, pixels around the boundary may instead be detected directly and pixels within a certain range of them excluded. The boundary between the blood vessel portion and the background portion can be detected as pixels on the blood vessel image where a blood vessel pixel and a background pixel are adjacent.
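A minimal sketch of the K% histogram trimming described above (the function name `trimmed_mean` and the pixel values are our own invention for illustration):

```python
import numpy as np

def trimmed_mean(values, k_percent, side):
    """Average over only the K% of pixels nearest one end of the histogram.

    side='low'  : for blood vessel pixels (keep the darkest K%),
    side='high' : for background pixels  (keep the brightest K%).
    This discards intermediate values near the vessel/background boundary.
    """
    v = np.sort(np.asarray(values, dtype=float))
    n = max(1, int(round(len(v) * k_percent / 100.0)))
    kept = v[:n] if side == 'low' else v[-n:]
    return float(kept.mean())

# Vessel pixels contaminated by bright boundary pixels (invented values):
pixels = [10, 12, 14, 100, 200, 210, 220]
vessel_g = trimmed_mean(pixels, 40, 'low')       # darkest 40% only
background_g = trimmed_mean(pixels, 40, 'high')  # brightest 40% only
```

The intermediate value 100 near the boundary contributes to neither average, which is the stated purpose of the threshold K.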
[0073]
If accurate focusing is possible at the time of imaging, taking the pixel values Gc and Rc at the center of the blood vessel as the blood vessel colors Gv and Rv yields a feature that more accurately reflects the color of the blood itself.
[0074]
The above description uses RGB as the observed image, but the method of constructing features based on Equation 1 is essentially the same for arbitrary wavelengths or wavelength bands other than RGB, and can easily be extended to such observations.
[0075]
(Example 2 of feature amount)
In Example 1 of the feature, suppose the observed value is a gamma-corrected value of the physical luminance of each band (RGB), i.e., the input value raised to the power γ to give the output value. The effect of the γ-th power cancels in Equation 6, so Equation 6 applies as-is regardless of whether γ correction is present. Furthermore, when the observed value has been gamma-corrected with γ < 1 relative to the physical luminance, the feature can be approximated in a simpler form: both the log operation and the γ correction (γ < 1) have a positive first derivative and a negative second derivative, so the two behave similarly. In that case, Equation 6 can also be approximated as
[0076]
(Equation 7)
μr ∝ {I1(R) − I2(R)} / {I1(G) − I2(G)} + const
The right side of Equation 7 (excluding the constant term) may be used as the feature value.
[0077]
(Example 3 of feature amount)
Example 3 of the feature amount shown below corresponds to FIG.
The blood vessel and background colors I1(R), I2(R), I1(G), I2(G) include, as described above, fluctuations due to the blood vessel wall thickness and the blood vessel inner diameter, and are not sufficient as features by themselves. On the other hand, the blood vessel width D, which includes both the wall thickness and the inner diameter, can be measured from the fundus image (the measuring method is described later), and D is considered to have some correspondence with the wall thickness d′ and the inner diameter d. Therefore, the blood vessel width D is calculated as one feature (FIG. 8, blood vessel width calculation process), the colors I1(R), I2(R), I1(G), I2(G) of the blood vessel portion and the background portion are calculated (FIG. 3, color feature value calculation process), and the set of these values is used as the feature. With such a feature, identification in this feature space is possible even when the relationship between the blood vessel width D, the wall thickness d′, and the inner diameter d is not explicitly obtained.
[0078]
Here, a method of detecting the blood vessel width is described. When a cross section is designated as the processing unit area, the width may be obtained by measuring the distance between the pixels at both ends of the blood vessel portion on the cross section. When a small area is designated, as shown in FIG. 18, the average run lengths Lx and Ly of the blood vessel pixels in the horizontal and vertical directions are obtained, and the width is approximated as W = Lx · sin θ with θ = arctan(Ly / Lx). Other methods of measuring blood vessel width, such as that of Japanese Patent Application No. 10-079034 ("Blood vessel measurement method"), are also conceivable; the present invention does not limit the method.
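The run-length approximation of FIG. 18 can be sketched as follows. The helper names and the synthetic mask are our own illustration; only the formulas W = Lx·sin θ, θ = arctan(Ly/Lx) come from the text.

```python
import math
import numpy as np

def mean_run_length(lines):
    """Mean length of consecutive runs of True pixels along each line."""
    runs = []
    for line in lines:
        n = 0
        for px in line:
            if px:
                n += 1
            elif n:
                runs.append(n)
                n = 0
        if n:
            runs.append(n)
    return float(np.mean(runs)) if runs else 0.0

def vessel_width(mask):
    """W = Lx * sin(theta), theta = arctan(Ly / Lx), from the mean
    horizontal (Lx) and vertical (Ly) run lengths of vessel pixels."""
    lx = mean_run_length(mask)    # iterate rows    -> horizontal runs
    ly = mean_run_length(mask.T)  # iterate columns -> vertical runs
    theta = math.atan2(ly, lx)
    return lx * math.sin(theta)

# A vertical vessel 3 pixels wide in a 10x10 binary region (invented):
mask = np.zeros((10, 10), dtype=bool)
mask[:, 4:7] = True
w = vessel_width(mask)  # close to the true width of 3 pixels
```

For a vessel inclined at angle θ, the horizontal run Lx = W / sin θ, so the trigonometric correction recovers W regardless of the vessel's orientation within the small area.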
[0079]
Note that the B image is not used in the three examples of features above. This is because, in a fundus image, the B value generally behaves the same as the G value, and because the B value is small and its SN ratio low, it is not effective as a feature. However, when the B image contains information absent from R and G, or when the B value is stable with a good SN ratio, it can of course be used.
[0080]
In addition to the examples shown here, if new knowledge on the color of the blood vessel portion, the blood vessel wall, or the blood vessel inner diameter is available, features can easily be constructed in the same manner as in Example 1, by eliminating the unknowns d and d′ from the basic model of Equation 1 using that knowledge.
[0081]
In the arteriovenous discrimination step [4], whether the cross section or small region is an artery or a vein is determined in the space of the features calculated as described above. As a determination method, case data combining the feature of a blood vessel cross section with attribute data indicating whether it is an artery or a vein may be collected in advance from a plurality of fundus images, and by analyzing the case data, a threshold (separating hyperplane) in the feature space and an evaluation function for separation may be set.
[0082]
An example of setting an evaluation function using case data will be described. First, from the collected case data, statistical data representing the distribution of artery cases and the distribution of vein cases in the feature space are calculated. The center (centroid) of each distribution and its spread (variance) along each feature axis are the most basic statistics, and can roughly express the distribution of the case data. Next, an evaluation function using these statistics is described. A relatively simple way of evaluating the likelihood that the feature of the cross section of interest belongs to the artery distribution or to the vein distribution is to compute a distance from the center of each distribution, weighted by the width of that distribution (a weighted Euclidean distance), and to use this distance as the likelihood (probability) of belonging to each distribution. In an N-dimensional feature space, given
a feature vector C = (c1, c2, ..., cN),
the mean of the distribution of category j, Mj = (m1, m2, ..., mN), and
the variance of the distribution of category j, Vj = (v1, v2, ..., vN),
the weighted Euclidean distance is dj = SQRT{Σk (ck − mk)² / vk} (SQRT: square root; the sum is over k = 1, ..., N). The cross section is judged to belong to the category with the shorter distance.
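The variance-weighted distance rule can be sketched as below. The category statistics are invented numbers for illustration, not values from actual case data.

```python
import numpy as np

def weighted_distance(c, mean, var):
    """Weighted Euclidean distance d_j = sqrt(sum_k (c_k - m_k)^2 / v_k)."""
    c, mean, var = (np.asarray(a, dtype=float) for a in (c, mean, var))
    return float(np.sqrt(np.sum((c - mean) ** 2 / var)))

def classify(c, stats):
    """Assign the feature vector c to the category with the shorter
    variance-weighted distance.  `stats` maps a category name to the
    (mean, variance) of its case-data distribution."""
    return min(stats, key=lambda j: weighted_distance(c, *stats[j]))

# Invented artery/vein statistics in a 2-D feature space,
# e.g. (blood colour feature, vessel width):
stats = {'artery': ([0.3, 4.0], [0.01, 1.0]),
         'vein':   ([0.9, 6.0], [0.04, 2.0])}
category = classify([0.35, 4.5], stats)
```

Dividing each axis by the category's variance means that an axis on which the case data spread widely contributes less to the distance, which is what "in consideration of the width of each distribution" refers to.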
[0083]
Alternatively, a new coordinate axis most suitable for separating the artery and vein case data in the feature space can be found, and distances computed along that axis in consideration of each distribution (discriminant analysis; Reference 4: Kenichiro Ishii et al., "Easy-to-understand Pattern Recognition", Ohmsha, pp. 114-123), enabling more accurate discrimination. Various other statistical discrimination methods exist; they are described in detail in Reference 4 and are not covered here.
[0084]
The separating hyperplane may be set so as to be orthogonal to the new coordinate axis most suitable for separation found by the discriminant analysis described above, at an equal distance from both categories.
[0085]
As described above, the evaluation function or separating hyperplane for identification in the feature space can be obtained from case data. However, because of individual differences and variations in imaging conditions, the color of a fundus image is not always constant; it is therefore desirable to set an optimal evaluation function or separating hyperplane for each image. The method is described below. An outline of the processing is shown in FIG.
[0086]
Blood vessels in the fundus have the property that vessels of the same kind do not intersect: arteries do not cross arteries, nor veins veins. In a normal fundus image, however, an artery and a vein intersect at several places; therefore, of the two blood vessels forming an intersection, one is an artery and the other is a vein. Taking as the positive direction the direction in which a blood vessel approaches the periphery and becomes thinner (which coincides with the direction away from the papilla of the fundus), a point where two blood vessels merge when viewed in the forward direction may be detected as an intersection between an artery and a vein (FIG. 19).
[0087]
Various methods are conceivable for detecting these intersections. As an example, an outline of a method using line segment tracking is described. First, the brightest part of the fundus image is detected as the papilla (FIG. 9, retinal nerve disc detection process). Next, a blood vessel image in which the blood vessels and background are separated is created by the method described above (FIG. 4, blood vessel image creating process). An example of detecting intersections between arteries and veins using this blood vessel image follows.
[0088]
The blood vessel image created as described above is thinned, and on the thinned image the blood vessels are traced one by one from near the papilla outward (in the direction away from the papilla) while searching for points where a line segment branches. If, at a found branch point, there are two line segments returning in the papilla direction, including the segment being traced, the branch point may be detected as an intersection (FIG. 9, arteriovenous intersection detection process). The method of detecting intersections is not limited here.
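A partial sketch of the branch search on the thinned image: candidate branch points are skeleton pixels with three or more skeleton neighbours. This is only the local part of the procedure; the patent's directional test (two segments returning toward the papilla) is deliberately omitted, and the synthetic skeleton is invented.

```python
import numpy as np

def branch_points(skel):
    """Candidate branch points on a thinned (one-pixel-wide) vessel image:
    skeleton pixels with three or more 8-connected skeleton neighbours.
    The full procedure in the text additionally tracks outward from the
    papilla and accepts a branch only when two of its segments return in
    the papilla direction; that directional test is not implemented here."""
    skel = np.asarray(skel, dtype=bool)
    pts = []
    h, w = skel.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # neighbour count = 3x3 window sum minus the centre pixel
            if skel[y, x] and skel[y-1:y+2, x-1:x+2].sum() - 1 >= 3:
                pts.append((y, x))
    return pts

# Two thinned vessels crossing at (2, 2) in a small region (invented):
skel = np.zeros((5, 5), dtype=bool)
skel[2, :] = True   # one vessel running horizontally
skel[:, 2] = True   # the other running vertically
pts = branch_points(skel)
```

In practice the neighbour test also fires on pixels adjacent to a true crossing, so a real implementation would cluster nearby candidates before the directional check.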
[0089]
Next, as in the example shown in FIG. 20, at locations just before the two blood vessels merge at a detected intersection (locations closer to the papilla than the intersection), a cross section or small region containing each blood vessel portion and the background portion is automatically set as a learning region (FIG. 9, learning region setting step). The blood vessel and background portions are then separated in each cross section or small region (FIG. 9, blood vessel separation process), and the feature described above is calculated for each, giving a pair of features (FIG. 4, feature amount pair calculation process). Performing this at a plurality of intersections in the fundus image yields data for a plurality of feature pairs.
[0090]
Which member of each feature pair is the artery and which is the vein is determined as follows. For several fundus images, case data are collected in advance in which the features of blood vessel cross sections are recorded together with whether each vessel is an artery or a vein, and the axis that best separates the artery and vein categories is calculated empirically or by the discriminant analysis described above (Reference 4). Of each feature pair obtained from the fundus image of interest, the member closer to the artery category along this axis is taken as the artery and the other as the vein. In this way, it is determined for each pair which member is the artery and which the vein, and case data in the feature space are obtained from the image of interest itself. Using these, an optimal separating hyperplane or evaluation function for the image of interest can be obtained (FIG. 9, evaluation function setting process). The methods of setting the separating hyperplane and the evaluation function described above apply here as well.
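The pair-labelling step can be sketched as follows. The discriminant axis and category centres would come from offline case data; the numbers here are invented, and the function name `label_pair` is our own.

```python
import numpy as np

def label_pair(pair, axis, artery_center, vein_center):
    """Project both members of a feature pair onto the discriminant axis
    and label the assignment (artery/vein or vein/artery) that places
    each member nearer its own category reference on that axis."""
    axis = np.asarray(axis, dtype=float)
    a_ref = float(np.dot(axis, artery_center))
    v_ref = float(np.dot(axis, vein_center))
    p0, p1 = (float(np.dot(axis, f)) for f in pair)
    # Choose the assignment with the smaller total on-axis mismatch.
    if abs(p0 - a_ref) + abs(p1 - v_ref) <= abs(p0 - v_ref) + abs(p1 - a_ref):
        return ('artery', 'vein')
    return ('vein', 'artery')

# One crossing yields a pair of blood colour features (invented values):
pair = (np.array([0.35, 4.2]), np.array([0.85, 5.8]))
labels = label_pair(pair, axis=[1.0, 0.0],
                    artery_center=[0.3, 4.0], vein_center=[0.9, 6.0])
```

Because each crossing is guaranteed to contain exactly one artery and one vein, only the relative position of the two members along the axis matters, which makes the labelling robust to per-image colour shifts.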
[0091]
As described above, the invention made by the inventors has been described concretely based on an embodiment. The present invention is, of course, not limited to this embodiment and can be variously modified without departing from its gist.
[0092]
【The invention's effect】
As described above, according to the present invention, arteries and veins in a fundus image can be identified automatically and accurately. This makes the work of measuring blood vessel width in fundus images more efficient and reduces the cost of fundus image examination.
[Brief description of the drawings]
FIG. 1 is a block diagram showing a schematic configuration of a fundus image processing apparatus according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating a schematic configuration of another fundus image processing apparatus according to the embodiment;
FIG. 3 is a block diagram illustrating a schematic configuration of another fundus image processing apparatus according to the embodiment;
FIG. 4 is a block diagram showing a schematic configuration of another fundus image processing apparatus according to the embodiment;
FIG. 5 is a diagram for explaining the outline and the properties of an example of a fundus image according to the embodiment.
FIG. 6 is a flowchart showing the processing procedure of the method for identifying arteries and veins of a fundus image by the fundus image processing apparatus shown in FIG. 1.
FIG. 7 is a flowchart showing the processing procedure of the method for identifying arteries and veins of a fundus image by the fundus image processing apparatus shown in FIG. 2.
FIG. 8 is a flowchart showing the processing procedure of the method for identifying arteries and veins of a fundus image by the fundus image processing apparatus shown in FIG. 3.
FIG. 9 is a flowchart showing the processing procedure of the method for identifying arteries and veins of a fundus image by the fundus image processing apparatus shown in FIG. 4.
FIG. 10 is a diagram illustrating examples (a cross section and a small area) of the processing unit area according to the present embodiment.
FIG. 11 is a flowchart showing the procedure for creating a blood vessel image according to the present embodiment.
FIG. 12 is a diagram illustrating the creation of a blood vessel image according to the present embodiment.
FIG. 13 is a diagram illustrating the color distribution of a cross section according to the present embodiment.
FIG. 14 is a diagram illustrating an optical model of a fundus image according to the present embodiment.
FIG. 15 is a diagram illustrating the flow of calculating the blood vessel color and the background color (simple average case) according to the present embodiment.
FIG. 16 is a flowchart illustrating the procedure for calculating the blood vessel color and the background color (histogram case) according to the present embodiment.
FIG. 17 is a flowchart showing the procedure for calculating the blood vessel color and the background color (histogram case) according to the present embodiment.
FIG. 18 is a diagram illustrating the blood vessel width detection method of the present embodiment.
FIG. 19 is a diagram illustrating an example of an arteriovenous intersection according to the present embodiment.
FIG. 20 is a diagram illustrating a setting (in the case of a cross section) of a processing unit area near an arteriovenous intersection according to the present embodiment.

Claims (6)

  1. A method for identifying arteries and veins of a fundus image, performed by a fundus image processing apparatus comprising processing unit area setting means, blood vessel separation means, feature value calculation means, and arteriovenous identification means, the method processing digitized fundus image data captured in color and comprising:
    a processing unit area setting step in which the processing unit area setting means sets, for the fundus image data, a processing unit area including one blood vessel and a background portion;
    a blood vessel separation step in which the blood vessel separation means separates the blood vessel portion and the background portion of the processing unit area;
    a feature value calculation step in which the feature value calculation means obtains, for each processing unit area, from the pixel values of the background portion and the blood vessel portion in the fundus image data, the red component value I1(R) and the green component value I1(G) of the background portion and the red component value I2(R) and the green component value I2(G) of the blood vessel portion, and calculates, as a blood color feature quantity indicating the color tone of the blood, the value of {log I1(R) − log I2(R)} / {log I1(G) − log I2(G)}; and
    an arteriovenous identification step in which the arteriovenous identification means identifies, based on the blood color feature quantity, whether the blood vessel in the processing unit area is an artery or a vein.
  2. A method for identifying arteries and veins of a fundus image, performed by a fundus image processing apparatus comprising processing unit area setting means, blood vessel separation means, feature value calculation means, and arteriovenous identification means, the method processing digitized fundus image data captured in color and comprising:
    a processing unit area setting step in which the processing unit area setting means sets, for the fundus image data, a processing unit area including one blood vessel and a background portion;
    a blood vessel separation step in which the blood vessel separation means separates the blood vessel portion and the background portion of the processing unit area;
    a feature value calculation step in which the feature value calculation means obtains, for each processing unit area, from gamma-corrected pixel values satisfying γ < 1 of the background portion and the blood vessel portion in the fundus image data, the red component value I1(R) and the green component value I1(G) of the background portion and the red component value I2(R) and the green component value I2(G) of the blood vessel portion, and calculates, as a blood color feature quantity indicating the color tone of the blood, the value of {I1(R) − I2(R)} / {I1(G) − I2(G)}; and
    an arteriovenous identification step in which the arteriovenous identification means identifies, based on the blood color feature quantity, whether the blood vessel in the processing unit area is an artery or a vein.
  3. The method for identifying arteries and veins of a fundus image according to claim 1, wherein the fundus image processing apparatus further comprises blood vessel image creating means, arteriovenous intersection detecting means, learning area setting means, feature value pair calculating means, and evaluation function setting means, the method further comprising:
    a blood vessel image creating step in which the blood vessel image creating means creates blood vessel image data, a binary image in which the blood vessel portion and the background portion of the fundus image data are separated;
    an arteriovenous intersection detecting step in which the arteriovenous intersection detecting means, taking the direction in which each blood vessel of the fundus image becomes thinner as the positive direction of the blood vessel based on the blood vessel image data, detects a point where two blood vessels intersect in the forward direction as an artery-vein intersection;
    a learning area setting step in which the learning area setting means sets, for each of the two blood vessels intersecting at a detected artery-vein intersection, a learning area including the blood vessel and a background portion;
    a feature value pair calculating step in which the feature value pair calculating means calculates, for each of the two intersecting blood vessels, the blood color feature quantity from the pixel values of the background portion and the blood vessel portion in the fundus image data for each learning area, and takes the two values as a feature value pair; and
    an arteriovenous identification evaluation function setting step in which the evaluation function setting means sets a threshold such that, in each of the feature value pairs corresponding to the intersections of a plurality of blood vessels extracted from the entire fundus image, one member of the pair falls into the artery category and the other into the vein category,
    wherein the arteriovenous identification step identifies whether the blood vessel in the processing unit area is an artery or a vein based on the blood color feature quantity and the threshold.
  4. A computer-readable recording medium that stores a program for causing a computer to execute the processing steps of the method for identifying an artery and vein of a fundus image according to claim 1.
  5. A fundus image processing apparatus that processes digitized fundus image data captured in color, comprising:
    processing unit area setting means for setting, for the fundus image data, a processing unit area including one blood vessel and a background portion;
    blood vessel separation means for separating the blood vessel portion and the background portion in the processing unit area;
    feature value calculating means for obtaining, for each processing unit area, from the pixel values of the background portion and the blood vessel portion in the fundus image data, the red component value I1(R) and the green component value I1(G) of the background portion and the red component value I2(R) and the green component value I2(G) of the blood vessel portion, and calculating, as a blood color feature quantity indicating the color tone of the blood, the value of {log I1(R) − log I2(R)} / {log I1(G) − log I2(G)}; and
    arteriovenous identification means for identifying, based on the blood color feature quantity, whether the blood vessel in the processing unit area is an artery or a vein.
  6. A fundus image processing apparatus that processes digitized fundus image data captured in color, comprising:
    processing unit area setting means for setting, for the fundus image data, a processing unit area including one blood vessel and a background portion;
    blood vessel separation means for separating the blood vessel portion and the background portion in the processing unit area;
    feature value calculating means for obtaining, for each processing unit area, from gamma-corrected pixel values satisfying γ < 1 of the background portion and the blood vessel portion in the fundus image data, the red component value I1(R) and the green component value I1(G) of the background portion and the red component value I2(R) and the green component value I2(G) of the blood vessel portion, and calculating, as a blood color feature quantity indicating the color tone of the blood, the value of {I1(R) − I2(R)} / {I1(G) − I2(G)}; and
    arteriovenous identification means for identifying, based on the blood color feature quantity, whether the blood vessel in the processing unit area is an artery or a vein.
JP32992699A 1999-11-19 1999-11-19 Method and apparatus for identifying arteriovenous of fundus image, recording medium and apparatus Expired - Fee Related JP3548473B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP32992699A JP3548473B2 (en) 1999-11-19 1999-11-19 Method and apparatus for identifying arteriovenous of fundus image, recording medium and apparatus


Publications (2)

Publication Number Publication Date
JP2001145604A JP2001145604A (en) 2001-05-29
JP3548473B2 true JP3548473B2 (en) 2004-07-28

Family

ID=18226825

Family Applications (1)

Application Number Title Priority Date Filing Date
JP32992699A Expired - Fee Related JP3548473B2 (en) 1999-11-19 1999-11-19 Method and apparatus for identifying arteriovenous of fundus image, recording medium and apparatus

Country Status (1)

Country Link
JP (1) JP3548473B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101999885A (en) * 2010-12-21 2011-04-06 中国人民解放军国防科学技术大学 Endogenous optical imaging method for automatically separating arteries and veins

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006032261A1 (en) * 2004-09-21 2006-03-30 Imedos Gmbh Method and device for analysing the retinal vessels by means of digital images
JP4958254B2 (en) * 2005-09-30 2012-06-20 興和株式会社 Image analysis system and image analysis program
JP4854390B2 (en) 2006-06-15 2012-01-18 株式会社トプコン Spectral fundus measuring apparatus and measuring method thereof
JP5182689B2 (en) * 2008-02-14 2013-04-17 日本電気株式会社 Fundus image analysis method, apparatus and program thereof
JP4772839B2 (en) 2008-08-13 2011-09-14 株式会社エヌ・ティ・ティ・ドコモ Image identification method and imaging apparatus
JP5740403B2 (en) * 2009-10-15 2015-06-24 ザ・チャールズ・スターク・ドレイパ・ラボラトリー・インコーポレイテッド System and method for detecting retinal abnormalities
JP6038438B2 (en) * 2011-10-14 2016-12-07 国立大学法人 東京医科歯科大学 Fundus image analysis apparatus, fundus image analysis method and program




Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20031212

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20031224

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20040223

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20040413

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20040416

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090423

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100423

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110423

Year of fee payment: 7

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120423

Year of fee payment: 8

LAPS Cancellation because of no payment of annual fees