WO2003030075A1 - Detection de papille optique dans une image de fond d'oeil - Google Patents


Info

Publication number
WO2003030075A1
Authority
WO
WIPO (PCT)
Prior art keywords
optic nerve
image
nerve head
candidate
head area
Prior art date
Application number
PCT/DK2002/000663
Other languages
English (en)
Inventor
Per Rønsholt ANDRESEN
Johan Doré HANSEN
Michael Grunkin
Niels Vaever Hartvig
Jannik Godt
Ebbe Sørensen
Soffia Björk Smith
Original Assignee
Retinalyze Danmark A/S
Priority date
Filing date
Publication date
Application filed by Retinalyze Danmark A/S filed Critical Retinalyze Danmark A/S
Publication of WO2003030075A1 publication Critical patent/WO2003030075A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection

Definitions

  • the present invention relates to a method for assessing the presence or absence of the optic nerve head in images of the ocular fundus (the back of the eye), hereinafter referred to as the fundus.
  • Diabetes is the leading cause of blindness in working age adults. It is a disease that, among its many symptoms, includes a progressive impairment of the peripheral vascular system. These changes in the vasculature of the retina cause progressive vision impairment and eventually complete loss of sight. The tragedy of diabetic retinopathy is that in the vast majority of cases, blindness is preventable by early diagnosis and treatment, but screening programs that could provide early detection are not widespread.
  • the present invention relates to a method for assessing the presence or absence of the optic nerve head in fundus images.
  • Current methods may be able to detect the optic nerve head in many normal images, but the methods are not reliable when applied to images not containing the optic nerve.
  • the method should be robust in the sense that it should be applicable to a wide variety of images independent of illumination, presence of symptoms of diseases and/or artefacts of the image.
  • the present invention provides a method, comprising
  • n candidate optic nerve head area(s), wherein n is an integer ≥ 1,
  • step d) classifying the candidate optic nerve head area selected in step c) with respect to a threshold as the optic nerve head area or not.
  • the present invention provides a method, comprising
  • n is an integer ≥ 1,
  • step e) selecting the highest ranking candidate optic nerve head area fulfilling the validating criteria, f) classifying the candidate optic nerve head area selected in step e) with respect to a threshold as the optic nerve head area or not.
  • n is a small number, such as less than 15, more preferably less than 10, more preferably less than 5. Normally n is 2, 3, or 4, most preferably 4.
  • the invention relates to a system capable of conducting the method, such as a system for assessing the presence or absence of the optic nerve head in a fundus image, comprising
  • the method according to the invention may be applied in several procedures for identifying structures or indications of diseases or abnormal conditions.
  • the invention relates to the use of the method of assessing the optic nerve head for establishing a coordinate system on an image of the ocular fundus, comprising
  • assessing the presence of the fovea region, and arranging a coordinate system having one axis parallel with an axis through the optic nerve head and the fovea.
  • Such a coordinate system allows a precise location of other structures in the fundus image, thereby providing more exact diagnosis, for example by establishing a coordinate system by the method, and grading the lesions with respect to distance to the fovea region.
  • the localisation of for example lesions with respect to the fovea region may also be accomplished by another method according to the invention for establishing a coordinate system on an image of the ocular fundus, comprising
  • the optic nerve head may be used for registering the images, and accordingly, the present invention relates to a method for registering at least two different fundus images of the same fundus, comprising assessing the presence or absence of the optic nerve head in said images by a method as defined above, and orienting the images with respect to the optic nerve head.
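As a minimal illustration of orienting two images with respect to the optic nerve head, the sketch below maps coordinates from one image into the other by a pure translation that brings the two detected ONH centres into coincidence. The function name and the translation-only model are assumptions for illustration; a full registration would typically also account for rotation and scale.

```python
import numpy as np

def register_by_onh(points_b, onh_a, onh_b):
    """Translate (x, y) coordinates from image B into image A's frame so
    that the detected optic nerve head (ONH) centres coincide.
    points_b: (N, 2) array-like of coordinates in image B."""
    offset = np.asarray(onh_a, dtype=float) - np.asarray(onh_b, dtype=float)
    return np.asarray(points_b, dtype=float) + offset
```

For example, if the ONH sits at (90, 110) in image B and at (100, 120) in image A, every point of image B is shifted by (+10, +10) before comparison.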
  • the detection of optic nerve head may be used for detecting vessels in the image, optionally as an iterative method.
  • the present invention relates to a method for detecting vessels in a fundus image, comprising a) estimating the localisation of the optic nerve head region by a method as defined above,
  • steps a) and b) optionally repeating steps a) and b) at least once.
  • An important aspect of the present invention is the application of the method in a method for assessing the presence or absence of lesions in a fundus image.
  • the detection of the optic nerve head is used to for example mask the optic nerve head area in order to avoid false positive lesions very likely detected in the neighbourhood of the optic nerve head.
  • the invention also relates to a method for assessing the presence or absence of lesions in a fundus image, comprising
  • the invention relates to a method for detecting indicators of glaucoma in a fundus image, comprising
  • Fig. 1 is a fundus image.
  • Fig. 2 is an unsharp filtered image of the fundus of Fig. 1.
  • Fig. 3 is an image of scoring of branching points in the vessel tree of the fundus of Fig. 1. Starting points are established at highest scoring branching points.
  • Fig. 4 is the fundus image of Fig. 1 wherein a filter enhancing sagital structures is applied.
  • the vessels have all been denoted 1 pixel in width (Medial Axis Transformation - MAT). Starting points are established in maxima of the image.
  • Fig. 5 is the fundus image of Fig. 1 wherein a filter enhancing sagital structures is applied. Starting points are established in maxima of the image.
  • Fig. 6 is an image of the fundus in Fig. 1, wherein an intensity maximum has given rise to a starting point.
  • Fig. 7 is an image of the fundus in Fig. 1, wherein variance maximum of the image has given rise to a starting point.
  • Fig. 8 is an image of the fundus in Fig. 1, wherein the vessels have been masked.
  • Fig. 9 is an image of the fundus in Fig. 1, wherein the position of all starting points detected by the various methods is shown.
  • Fig. 10 is an image showing excluding areas of the fundus of Fig. 1 based on tangential vessel exclusion.
  • Fig. 11 shows the optimal circles found from the starting points, and the power assigned to the circles.
  • Fig. 12 shows the circle being accepted as the optic nerve head.
  • Fig. 13 shows 4 different fundus images, wherein the optic nerve head is present in two of the images, and absent in the remaining images.
  • Figure 14 The gradient orthogonal to a circle is used as cost function to find the border of the ONH.
  • the cost function is computed as the sum of individual gradients, where the gradient is computed as the difference between the intensity value at the end- and start-point.
  • the cost is computed as the difference between the sum of the intensity at outer- and inner-circle-points, (a) and (b) are equal (as long as the computations are linear).
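The equivalence of (a) and (b) follows directly from the linearity of summation: summing the pointwise differences equals differencing the two sums. A small numeric check with arbitrary sample intensities (illustrative only, not data from the patent):

```python
import numpy as np

rng = np.random.default_rng(0)
outer = rng.random(64)  # intensities sampled at the outer circle points
inner = rng.random(64)  # intensities sampled at the inner circle points

# (a) sum of the individual radial gradients (end-point minus start-point)
cost_a = np.sum(outer - inner)
# (b) difference between the summed intensities on the two circles
cost_b = np.sum(outer) - np.sum(inner)

assert np.isclose(cost_a, cost_b)  # equal, since summation is linear
```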
  • Figure 15 The figure shows the correspondence between the circle power (sums of the intensities at the 'nCircleResolution' circle-points) and the radial distance from the centre of the ONH. The optimal radius is found at the highest gradient.
  • Figure 16 The circle power, seen visually in Figure 11, here plotted as a graph.
  • Figure 18 The schematic drawing shows the tangential vessel exclusion algorithm. An exclusion area is drawn around all significant vessels in the fundus image. The ONH is more likely to be found outside the tangential exclusion areas.
  • Figure 19 The figure depicts how the neighborhood around an initial vessel segment is expanded inside the confidence band.
  • the endpoints of the straight vessel segment are the expanded node points which are farthest away.
  • Figure 20 The figure shows an ONH candidate and the crossing of an arcade (vessel segment).
  • Image The term image is used to describe a representation of the region to be examined, i.e. the term image includes 1-dimensional, 2-dimensional and 3-dimensional representations as well as n-dimensional representations. Thus, the term image includes a volume of the region, a matrix of the region, as well as an array of information of the region.
  • the term representative means that the starting point may represent a point or an area of the optic nerve head.
  • ROI Region of interest. Visibility: The term visibility is used in the normal meaning of the word, i.e. how visible a lesion or a structure of the fundus region is compared to background and other structures/lesions.
  • Optic nerve head The term is used in its normal anatomical meaning, i.e. the area in the fundus of the eye where the optic nerve enters the retina. Synonyms for the area are, for example, the "blind" spot, the papilla, or the optic disk.
  • Fovea The term is used in its normal anatomical meaning, i.e. the spot in the retina having a great concentration of cones, giving rise to vision. Fovea and the term macula lutea are used as synonyms. The fovea also has increased pigmentation.
  • Red-green-blue image The term relates to the image having the red channel, the green channel and the blue channel, also called the RGB image.
  • Starting point The term describes a point or area for starting the search for candidate optic nerve head areas.
  • the term starting point is thus not limited to a mathematical point, such as not limited to a pixel, but merely denotes a localisation for starting a search.
  • the images of the present invention may be any sort of images and presentations of the fundus.
  • the image is presented on a medium selected from slides, paper photos or digital photos.
  • the image may be any other kind of representation, such as a presentation on an array of elements, for example a CCD.
  • the image may be a grey-toned image or a colour image; in a preferred embodiment the image is a colour image.
  • the green and/or the red channel is used for assaying the presence or absence of the optic nerve head area, and more preferred an average of the green and the red channel is used.
  • the candidate optic nerve head area(s) may be detected by any suitable method, for example by filtering, by template matching, by establishing starting points, and from said starting points grow regions and/or by other methods search for candidate areas, and/or combinations thereof.
  • the candidate optic nerve head area(s) are detected by establishing starting points, and from the starting points searching for candidate areas.
  • the starting points may be established by a variety of suitable methods and of combinations of such methods.
  • the establishment of starting points is conducted by applying the same approach as the present inventors have experienced that the human eye uses for identifying the optic nerve head when examining the fundus image manually. This approach is principally a combination of following the vessels in the image and looking for a bright area.
  • the image may be filtered and/or blurred before establishing or as a part of establishing starting points for the method.
  • the low frequencies of the image may be removed before establishing starting points.
  • the image may be unsharp filtered, for example by median or mean filtering the image and subtracting the filtered result from the image.
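A minimal sketch of such unsharp filtering, here using a mean (box) filter implemented directly in NumPy; the window size and edge-padding choices are illustrative, not taken from the patent:

```python
import numpy as np

def unsharp(image, size=5):
    """Subtract a mean-filtered (blurred) copy from the image, which
    removes the low spatial frequencies (a simple unsharp filter)."""
    img = np.asarray(image, dtype=float)
    pad = size // 2
    padded = np.pad(img, pad, mode='edge')
    # box (mean) filter via a sliding-window sum
    acc = np.zeros_like(img)
    for dy in range(size):
        for dx in range(size):
            acc += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    blurred = acc / (size * size)
    return img - blurred
```

On a constant image the blurred copy equals the input, so the result is zero everywhere; structure smaller than the window survives the subtraction.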
  • the starting points may be established as extrema of the image.
  • the image is however a filtered image, wherein the filtering may be linear and/or non-linear.
  • the filtering method is a filtering method using templates, wherein the template may exhibit any suitable geometry for identifying the optic nerve head.
  • templates are circular templates having a diameter of the expected optic nerve head ± some percent, for example ± 30 %. It is within the scope of the invention that the image may be filtered with one or more filters before establishing starting points, or as a part of the step of establishing starting points. Thus, in one embodiment of the invention starting points are established by combining two or more filters.
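A crude sketch of such circular template filtering: every pixel is scored by the mean intensity inside a disk of the expected ONH radius centred at that pixel, so that bright blobs of roughly the right size score highest. The brute-force loop and the scoring rule are illustrative simplifications, not the patent's implementation:

```python
import numpy as np

def circle_template_scores(image, radius):
    """Score each interior pixel by the mean intensity inside a circular
    template of the given radius centred there (a simple circular
    matched filter for bright, ONH-sized blobs)."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    scores = np.full((h, w), -np.inf)
    for cy in range(radius, h - radius):
        for cx in range(radius, w - radius):
            mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
            scores[cy, cx] = img[mask].mean()
    return scores
```

The pixel with the highest score is a natural starting point; in practice several radii around the expected diameter (e.g. ± 30 %) would be tried.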
  • the extrema may thus be identified individually by one or more of several methods, such as the following:
  • Searching for the part of the vessels where no further branching points are detected, that is, establishing at least one extremum in the image based on vessel branching points, preferably establishing at least one maximum in the image based on vessel branching points.
  • the blood vessels branching perpendicularly at the optic nerve head area may be designated longitudinal blood vessels or sagitally oriented vessels, and the optic nerve head may be detected by searching for the longitudinal vessels.
  • one method may be establishing at least one extremum in the image based on a filter enhancing sagital structures, preferably establishing at least one maximum in the image based on a filter enhancing sagital structures.
  • the sagital filtering may be conducted on the vessels present in the image without any transformation. However, the sagital filtering may additionally, or alternatively, be conducted on a thinned vessel image, for example wherein the vessels are all thinned to one pixel.
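A toy version of a filter enhancing sagittally (vertically) oriented structures: each pixel responds when the vertical line of pixels through it is brighter than the corresponding horizontal line. The filter length and the vertical-minus-horizontal form are illustrative assumptions; the patent does not prescribe this particular kernel:

```python
import numpy as np

def sagittal_enhance(image, length=7):
    """Respond positively where a vertical (sagittal) line through the
    pixel is brighter than the horizontal line through the same pixel."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    half = length // 2
    out = np.zeros_like(img)
    for y in range(half, h - half):
        for x in range(half, w - half):
            vert = img[y - half:y + half + 1, x].mean()
            horiz = img[y, x - half:x + half + 1].mean()
            out[y, x] = vert - horiz
    return out
```

Applied to a (thinned) vessel image, maxima of this response mark sagittally oriented vessels of the kind that flank the optic nerve head.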
  • the vascular system may be isolated from the rest of the image context and skeletonized, i.e. the network of the vessels identified.
  • One method for tracking vessels may be to extract linear components that may be regarded as those of the blood vessels. That is, groups of pixels forming bright lines on the dark background are considered to be images of the blood vessels.
  • Another method for tracking vessels is a method wherein use is made of the fact that the vessels are linear in a local neighbourhood, wherein different filter matrices have different orientations. The localisation and orientation of such line elements may be determined using a template matching approach (sometimes referred to as matched filters).
  • a preferred method for tracking vessels is by tracking individual vessels from starting points representative for vessels, and iteratively grow the vessel network of the retina.
  • a preferred embodiment hereof is described in a co-pending PCT patent application entitled "Vessel tracking" of RETINALYZE A/S.
  • the estimation of candidate optic nerve head areas is adjusted with respect to vessels appearing in the image.
  • adjusted is meant either that an iterative estimation of optic nerve head and vessels is conducted, wherein for each iteration, the significance of the localisation of both increases towards a maximum, or that knowledge of the anatomical localisation of vessels adjacent the optic nerve head is used for locating and/or validating the position of the optic nerve head.
  • the estimation of candidate optic nerve head areas is preceded by detection of vessels in the image. Having identified the blood vessels in the image, it is desirable to be able to distinguish between veins and arteries among the blood vessels. This can be important, for example in the diagnosis of venous beading and focal arteriolar narrowing.
  • the vascular system observed in the ocular fundus images is by nature a 2-dimensional projection of a 3-dimensional structure. It is quite difficult in principle to distinguish veins from arteries solely by looking at isolated vessel segments. However, it has been discovered that effective separation can be achieved by making use of the fact that, individually, the artery structure and the vein structure are each a perfect tree (i.e., there is one unique path along the vessels from the heart to each capillary and back).
  • the artery and vein structures are each surface filling, so that all tissue is either supplied or drained by specific arteries or veins, respectively.
  • a method for distinguishing veins from arteries is described in WO 00/65982 to Torsana Diabetes Diagnostic A/S and is based on the realisation that crossings of vessel segments are, for practical purposes, always between a vein and an artery (i.e. crossings between arteries and arteries or between veins and veins are, for practical purposes, non-existent).
  • the optic nerve head will normally be one of the brightest areas in the image, or at least locally the brightest area.
  • a method may be establishing at least one intensity extremum in the image, preferably at least one intensity maximum.
  • Causes of failure include, for example, the very normal shadow in the region of the optic nerve head, where the shadow is cast by the nose of the person having his or her fundus examined. Therefore, in a preferred embodiment at least one local intensity maximum is established.
  • the extrema may be established on any image function, such as wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel or any combinations thereof.
  • the method may include establishing at least one variance extremum in the image, preferably establishing at least one variance maximum in the image. For the same reasons as described with respect to the intensity, at least one local variance maximum is established.
  • the extrema may be established on any image function, such as wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel or any combinations thereof.
  • the variance extremum is a weighted variance maximum, or a local variance maximum, more preferably a local weighted variance maximum.
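The intensity- and variance-based starting points above can be sketched as a search for strict local maxima of an image function; shown here for intensity, under the assumption that the same routine applied to a local-variance image yields the variance-based starting points. Window size is an illustrative choice:

```python
import numpy as np

def local_maxima(image, win=3):
    """Return (y, x) positions that are strict local maxima of the image
    within a (2*win+1)^2 neighbourhood; such extrema serve as starting
    points for the optic nerve head search."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    points = []
    for y in range(win, h - win):
        for x in range(win, w - win):
            patch = img[y - win:y + win + 1, x - win:x + win + 1]
            peak = patch.max()
            # strict maximum: the centre holds the unique peak value
            if img[y, x] == peak and (patch == peak).sum() == 1:
                points.append((y, x))
    return points
```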
  • Another method for establishing starting points may be random establishment of starting points, wherein the ultimate random establishment is establishing a starting point in substantially each pixel of the image.
  • a random establishment may be combined with any of the methods discussed above.
  • the starting points may be established as grid points, such as evenly distributed or unevenly distributed grid points. Again this method may be combined with any of the methods of establishing extrema in the image and/or random establishment.
  • starting points are established by more than one of the methods described in order to increase the probability of assessing the correct localisation of the optic nerve head when present in the image, also with respect to images having less optimal illumination or presenting other forms of less optimal image quality.
  • starting points are established by at least two of the steps or methods described above, such as by at least three of the steps or methods described above, such as by at least four of the steps or methods described above, such as at least five of the steps or methods described above.

Search for best candidate
  • the search may be for a geometric shape representative for the optic nerve head.
  • the best candidate optic nerve head area may be represented by the periphery of the area, such as a closed curve, such as a circle, an ellipse, a snake, a polygon, wherein the latter represent irregular geometrical forms.
  • the best candidate optic nerve head area is represented as an open curve, such as a circle, an ellipse, a snake, a polygon.
  • the best candidate optic nerve head area may be represented by the area as such, for example represented as a closed area, such as a circle, an ellipse, a snake, a polygon.
  • the search initiated in step b) is a search for a centre of a best matching circle, wherein the centre is positioned in a search region of a predetermined size established around each starting point.
  • the centre of the best matching circle is not necessarily the starting point, but rather that the centre is positioned within the search region around the starting point.
  • the search for best matching circles is conducted by searching over a variety of radii in the search region.
  • the radius of the best matching circle is in the range of ± 100 % of the expected diameter of the optic nerve head, such as in the range of ± 90 % of the expected diameter of the optic nerve head, such as in the range of ± 80 %, such as ± 70 %, such as ± 60 %, such as ± 50 %, such as ± 40 %, such as ± 35 %, such as ± 30 %, such as ± 25 %, such as ± 20 %, such as in the range of ± 15 % of the expected diameter of the optic nerve head.
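The circle search described above, combined with the gradient cost of Figure 14, can be sketched as follows: the power of a candidate circle is the summed radial gradient across its edge, and the best circle is found by scanning centres in a small search region around a starting point and a range of radii. Sampling density, search-window size and the use of the absolute power are illustrative assumptions:

```python
import numpy as np

def circle_power(image, cy, cx, radius, n_points=32, dr=1.0):
    """Power of a candidate circle: summed radial gradient across the
    edge, i.e. intensity just outside minus just inside, sampled at
    n_points positions on the circle (cf. the cost of Figure 14)."""
    img = np.asarray(image, dtype=float)
    angles = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    power = 0.0
    for a in angles:
        oy = int(round(cy + (radius + dr) * np.sin(a)))
        ox = int(round(cx + (radius + dr) * np.cos(a)))
        iy = int(round(cy + (radius - dr) * np.sin(a)))
        ix = int(round(cx + (radius - dr) * np.cos(a)))
        power += img[oy, ox] - img[iy, ix]
    return power

def best_circle(image, start, search=2, radii=(3, 4, 5)):
    """Scan a (2*search+1)^2 window of centres around a starting point,
    over a range of radii, for the circle of largest absolute power."""
    sy, sx = start
    best = None
    for cy in range(sy - search, sy + search + 1):
        for cx in range(sx - search, sx + search + 1):
            for r in radii:
                p = abs(circle_power(image, cy, cx, r))
                if best is None or p > best[0]:
                    best = (p, cy, cx, r)
    return best
```

Note that the centre of the winning circle need not be the starting point itself, only lie within the search window, matching the text above.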
  • the expected optic nerve head diameter may be predetermined as an absolute figure; however, in order to apply the method to a variety of images taken with different resolutions etc., it is more appropriate to estimate the expected optic nerve head diameter in relation to other structures in the image.
  • the expected optic nerve head diameter is estimated from the caliber of vessels in the image or simply from the size of the image.
  • the expected nerve head diameter may be estimated from the camera magnification, or from the height of the image.
  • the optic nerve head diameter may be established as a standard for the specific camera setup, by measuring the optic nerve head diameter in a number of different images.
  • the search region used for finding the best candidates may have any geometric form, such as a circular region, or a rectangular region.
  • the power is preferably a value representative for the visibility of the candidate in the image.
  • the power is calculated as a measure of the candidate optic nerve head area edge, such as wherein said measure is selected from the summarized gradient, the summarized variance and/or the mean of the summarized variance, Laplace filtering, the curvature, the intensity, the skewness, the kurtosis, derived measure from Fourier transformation, derived measure from co-occurrence matrix, and derived measure from fractal dimension.
  • the power is calculated as the gradient of the candidate optic nerve head area edge.
  • the power calculated is weighted with respect to other known structures present in the image in order to reduce the risk of inadvertently assigning a too high or too low power to the candidate optic nerve head area.
  • known structures in the image may for example be the vessels present in the image, in particular the vessel in the local region comprising the candidate optic nerve head area.
  • a known structure is the edge of the image, since the border between the area outside the image and the area inside the image (the region of interest (ROI)) may represent a high gradient.
  • the selection step is a step for selecting the most probable optic nerve head area(s) among the various candidates for further validation. Therefore, the n best candidate optic nerve head area(s) are selected with respect to the power assigned as described above, i.e. the n candidate nerve head areas are the areas having the n highest powers.
  • n is an integer ≥ 1, such as an integer in the range of from 1-100, such as an integer in the range of from 1-50, such as an integer in the range of from 1-25, such as an integer in the range of from 1-10, such as an integer in the range of from 2-25, such as an integer in the range of from 2-10, such as an integer in the range of from 3-25, such as an integer in the range of from 3-10, such as n being 1, 2, 3, 4, 5 or 6.
  • the selected candidate optic nerve head areas are then ranked with respect to at least one validating criterion; said validating criterion may be related to the anatomical structures of the fundus region, i.e. related to the vessels as well as to the brightness of the optic nerve head area.
  • the validation step is conducted in order to increase the probability of the candidate optic nerve head area being the "true" optic nerve head. Although a power has been assigned previously, the power may be biased by local factors in the image, and therefore may not be able to rank the candidates properly.
  • the candidate optic nerve head area is preferably ranked with respect to the presence or absence of at least one of the following validating criteria:
  • any substantial sagital vessels detected extending out superior and/or inferior from the candidate optic nerve head area, establishing at least one intensity extremum in the image, preferably at least one intensity maximum, more preferably at least one local intensity maximum, on the image function, wherein the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and/or
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof,
  • a candidate cannot be ranked as a candidate unless a minimum of the criteria is fulfilled. Therefore, it may be preferred that the candidate optic nerve head area is ranked so that at least one of the criteria related to the vessels is fulfilled, and at least one of the other criteria is fulfilled, for example as ranking with respect to the presence or absence of the following validating criteria:
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and/or
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof.
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof.
  • a further criterion may be added to the criteria discussed above or substituted for one of the criteria discussed above.
  • An example of such a further criterion may be
  • a candidate optic nerve head area not fulfilling at least one of the criteria is rejected as a candidate optic nerve head area, whereby it is not ranked but simply rejected.
  • the candidate optic nerve head area should preferably fulfil at least two of the criteria, otherwise the candidate is rejected as a candidate optic nerve head area.
  • the criteria mentioned above may be applied to the candidate optic nerve head area by weighting the power of the candidate optic nerve head areas with the criteria fulfilled by each candidate. Thereby a candidate area having an extraordinarily high power may be ranked highest, although some of the other criteria are not fulfilled.
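One simple way to realise this weighting, offered here as an illustrative sketch rather than the patent's formula, is to scale each candidate's power by the fraction of validating criteria it fulfils and rank by the result:

```python
def rank_candidates(candidates):
    """Rank candidate ONH areas by their power weighted with the
    fraction of validating criteria each fulfils.
    candidates: list of (power, criteria_fulfilled, n_criteria)."""
    def weighted(c):
        power, fulfilled, total = c
        return power * (fulfilled / total)
    return sorted(candidates, key=weighted, reverse=True)
```

With this rule a candidate fulfilling all criteria can outrank one with a somewhat higher raw power, while a candidate with an extraordinarily high power can still win despite missing some criteria.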
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, or
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof,
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof, and
  • the image function is the unsharped image, the red channel image, the green channel image, an average of the green and the red channel, or any combinations thereof.
  • By the ranking it may be possible to obtain a candidate having the highest ranking although its power is not the highest.
  • the candidate optic nerve head areas being ranked and not rejected are subjected to a selection step, wherein the highest ranking candidate optic nerve head area fulfilling the validating criteria is selected for evaluating the probability that the candidate in fact represents the "true" optic nerve head area or not.
  • the threshold may be a predetermined absolute threshold, but in a preferred embodiment the threshold is a dynamic threshold relating to the image and the structures therein. In a preferred embodiment the threshold is the power of one of the other candidates multiplied by a constant.
  • the candidate optic nerve head may be classified as the optic nerve head area if the, optionally weighted, power of the optic nerve head area is at least k times higher than the, optionally weighted, power of at least one of the other candidate optic nerve head areas, such as k being in the range of from 1.01 to 10, such as k being in the range of from 1.1 to 8, such as k being in the range of from 1.2 to 6, such as k being in the range of from 1.3 to 5, such as k being in the range of from 1.4 to 4, such as k being in the range of from 1.5 to 3, such as k being in the range of from 1.6 to 2.5, such as k being in the range of from 1.7 to 2.0; otherwise the absence of the optic nerve head in the image is assessed.
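The dynamic-threshold classification reduces to a ratio test between the top-ranked candidate and a runner-up; a minimal sketch, with the comparison against the second-ranked candidate and the default k chosen for illustration:

```python
def classify_onh(powers, k=1.7):
    """Accept the top candidate as the optic nerve head only if its
    power is at least k times the power of the runner-up; otherwise
    conclude that the optic nerve head is absent from the image."""
    ranked = sorted(powers, reverse=True)
    if len(ranked) == 1:
        return True
    return ranked[0] >= k * ranked[1]
```

A clearly dominant candidate (e.g. powers 10 and 4 with k = 1.7) is accepted; two near-equal candidates (e.g. 10 and 9) lead to assessing the absence of the optic nerve head.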
  • the at least one of the other candidate optic nerve head areas may be selected as the candidate optic nerve head area being ranked as No. 2, such as the candidate optic nerve head area being ranked as No. 3, such as No. 4, such as No. 5, such as No. 6, such as No. 7, such as No. 8, such as No. 9, such as No. 10, such as No. 11, such as No. 12, such as No. 13, such as No. 14, such as No. 15, or combinations of these.
  • Otherwise, the absence of the optic nerve head in the image is assessed, whereby it is concluded that the image shows a region of the fundus not including the optic nerve head, or an optic nerve head that is not visible, so that it does not disturb other processing of the image.
  • the automatic fundus coordinate system setting procedure includes three procedures, namely an optic disc detecting procedure, a fovea detecting procedure and a fundus coordinate system setting procedure.
  • fovea is used synonymously with the term macula lutea.
  • the present invention relates to a method for establishing a coordinate system on an image of the ocular fundus, comprising
  • assessing the presence of the fovea region, and arranging a coordinate system having one axis parallel with an axis through the optic nerve head and the fovea.
  • the optic nerve head area is preferably assessed by the method described above, since this method is capable of assessing not only the presence of the optic nerve head area when present, but also the absence of the optic nerve head area when absent. Thus, no false positive detection of the optic nerve head areas is conducted which would lead to wrongly applied coordinate systems.
  • the macula lutea is a region having a radius approximately equal to twice the diameter of the optic disc around the center of the central portion of the fundus.
  • the macula lutea is made up of cones arranged in a close arrangement and constitutes a region of visual acuity and colour sense.
  • the region corresponding to the macula lutea is darker than the region surrounding the macula lutea.
  • the macula lutea has an area of a certain size and a substantially circular shape.
  • Coordinate axes are determined on the basis of the results of detection of the optic disc and the macula lutea.
  • the abscissa axis passes through the centers of the optic disc and the macula lutea
  • the ordinate axis is perpendicular to the abscissa axis and passes through the center of the optic disc.
  • the distance between the macula lutea and the optic disc may be calculated in units of disc diameter.
  • the axes of the orthogonal coordinate system are inclined relative to the image. The procedure includes image data transformation to make the orthogonal coordinate system coincide with the image.
  • a curvilinear coordinate system established by combining these coordinate axes and a nerve fiber bundle distribution pattern corresponds to a fundus coordinate system.
  • coordinate axes are set such that the abscissa having reverse direction to the macula lutea is located at 0°, the upper part of the ordinate is located at 90°, the macula lutea side abscissa is located at 180°, and the lower part of the ordinate is located at 270°.
  • the angle of this coordinate axis is called the optic disc inner angle.
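The conventions above (abscissa through the disc and macula centers, distances in disc-diameter units, 0° pointing away from the macula, 90° up) can be combined into a single transform. This is a sketch under the assumption of a mathematical frame with y increasing upward; with image coordinates (y down) the ordinate sign would flip.

```python
import math

def fundus_coords(point, disc_center, macula_center, disc_diameter):
    """Express 'point' in the fundus coordinate system described above.

    Returns (distance_in_disc_diameters, inner_angle_degrees), where the
    angle is 0 deg away from the macula, 90 deg up, 180 deg toward the
    macula and 270 deg down.
    """
    # Unit vector of the abscissa at 0 deg: the direction having reverse
    # direction to the macula (from the macula through the disc center).
    ux = disc_center[0] - macula_center[0]
    uy = disc_center[1] - macula_center[1]
    n = math.hypot(ux, uy)
    ux, uy = ux / n, uy / n
    # Ordinate unit vector, chosen so "up" (positive y here) is 90 deg.
    vx, vy = uy, -ux
    dx = point[0] - disc_center[0]
    dy = point[1] - disc_center[1]
    dist_dd = math.hypot(dx, dy) / disc_diameter
    ang = math.degrees(math.atan2(dx * vx + dy * vy, dx * ux + dy * uy)) % 360.0
    return dist_dd, ang
```

With the disc at the origin, the macula at (10, 0) and a disc diameter of 5, the point (-5, 0) lies at 1.0 disc diameters and 0°, while (5, 0), toward the macula, lies at 180°.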
  • the coordinate system may be used for locating various structures and pathological conditions in relation to other structures, for example for locating lesions in relation to the fovea region, since this region represents the specific vision region; lesions close to the fovea may affect the vision, whereas lesions distant from the fovea may be of less importance prognostically.
  • the present invention further relates to a method for establishing a coordinate system on an image of the ocular fundus, comprising
  • the present invention further relates to a method for grading lesions in a fundus image, comprising establishing a coordinate system by the method as described above, and grading the lesions with respect to distance to the fovea region.
  • the optic nerve head may be used for registering the images, and accordingly, the present invention relates to a method for registering at least two different fundus images of the same fundus, comprising assessing the presence or absence of the optic nerve head in said images by a method as defined above, and orienting the images with respect to the optic nerve head.
  • the identification of the optic nerve head area may also aid in detecting the vessels, since all vessels in retina either start from the optic nerve head or end at the optic nerve head, that is both the arteriolar network and the venolar network of vessels appear as trees having their roots at the optic nerve head, and may be tracked starting at the optic nerve head.
  • detection of the optic nerve head and detection for the vessels may also be an iterative process, wherein the optic nerve head is detected giving rise to a detection of the vessels, and the detection of the vessels thereby gives rise to a reiteration of the detection of the optic nerve head and so forth, until a maximum of significance for both the optic nerve head and the vessels has been met.
  • the present invention further relates to a method for detecting vessels in a fundus image, comprising
  • a very important aspect of the invention is the detection of the optional presence of the optic nerve head area in a fundus image before detection of any lesions of the fundus.
  • Lesions of the retina normally embrace microaneurysms and exudates, which show up on fundus images as generally "dot shaped" (i.e. substantially circular) areas. It is of interest to distinguish between such microaneurysms and exudates, and further to distinguish them from other pathologies in the image, such as "cotton wool spots" and hemorrhages. If the optic nerve head area is present in the image it may give rise to errors when detecting lesions in the image. Therefore, the present invention further relates to a method for assessing the presence or absence of lesions in a fundus image, comprising
  • a region around the detected optic nerve head area is masked, such as a region having a dimension corresponding to at least 1.1 times the diameter of the estimated optic nerve head area, such as at least 1.3 times the diameter of the estimated optic nerve head area, such as at least 1.5 times the diameter of the estimated optic nerve head area is masked, such as a region corresponding to at least 1.7 times the diameter of the estimated optic nerve head area is masked, such as a region corresponding to at least 2.0 times the diameter of the estimated optic nerve head area is masked.
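Masking a region around the detected optic nerve head before lesion detection can be sketched as follows; the factor 1.5 is one of the example values listed above, and the function name is illustrative.

```python
import numpy as np

def onh_mask(shape, center, diameter, factor=1.5):
    """Boolean mask that is True where lesion detection may proceed.

    Pixels within factor * diameter / 2 of the detected ONH center are
    masked out (False), so the bright disc cannot be mistaken for
    dot-shaped lesions such as exudates.
    """
    rows, cols = np.ogrid[:shape[0], :shape[1]]
    d2 = (rows - center[0]) ** 2 + (cols - center[1]) ** 2
    radius = factor * diameter / 2.0
    return d2 > radius ** 2
```

A lesion detector would then simply ignore pixels where the mask is False.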
  • the lesions may be detected by any suitable method known to the person skilled in the art.
  • Glaucoma causes the exfoliation of the nerve fiber layer, entailing the expansion of the recess of the optic disc on which the nerve fiber layer converges. It is known that, in most cases, a portion of the optic disc blood vessel extending in the vicinity of the optic disc edge is bent severely as the recess of the optic disc expands.
  • indicators for glaucoma may be detected by studying the nerve fiber layer and/or the vessels in the vicinity of the optic nerve head edge. Preferably both methods are applied.
  • the reflectance of the nerve fiber layer decreases with progressive exfoliation due to glaucoma or the like, and the tone of the image of the nerve fiber layer darkens.
  • although specialists in glaucoma analysis are capable of identifying very delicate glaucomatous changes in the nerve fiber layer, such changes are generally difficult to detect.
  • exfoliation of the nerve fiber layer propagates along the fiber bundles, such exfoliation is called optic nerve fiber bundle defect.
  • this embodiment is capable of introducing, into image processing, medical information including the probability of occurrence of glaucomatous scotomas in the visual field.
  • the information about the nerve fiber layer can be contrasted with information about the optic disc by projecting the information about the nerve fiber layer on the information about the optic disc inner angle θ. Accordingly, a comprehensive glaucoma analyzing system can be built, for example as described in US Patent 5,868,134, Sugiyama, et al.
  • the image may be converted into a blue and green image having sixty-four gradations.
  • the blue and green image corresponds to a common red-free projection photograph. Defects in the nerve fiber bundles are set off by contrast, and the influence of the choroid blood vessels is insignificant in the blue and green image. Details of the image that would adversely affect the analysis are excluded.
  • the optic disc portion, the blood vessel portion and the periphery of the blood vessels may be extracted and excluded, so that a retina portion is used for the following analysis.
  • the analysis of blood vessel curvature on the optic disc edge can be carried out by the embodiment in US Patent 5,868,134, Sugiyama, et al., wherein a blood vessel curvature VC(θ) with respect to the direction of the optic disc inner angle (θ), as described above for the co-ordinate system, is detected for each of the optic disc edge blood vessels.
  • the blood vessel curvature VC(θ) is a curvature with respect to a direction of the midpoint C.
  • the blood vessel curvatures VC(θ) determined are stored for the comprehensive analysis of glaucoma.
  • the blood vessel curvature may be determined by measuring the ratio of the length of the straight line between two arbitrary points on a blood vessel intersecting the optic nerve head edge to the length between the same points measured along the course of the vessel. If the ratio is close to 1, the vessel is straight; the closer the ratio approaches 0, the more curved the vessel.
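The chord-to-arc length ratio described above can be sketched as a small helper operating on a sampled vessel polyline:

```python
import math

def curvature_ratio(points):
    """Chord-to-arc length ratio of a vessel polyline.

    points: sequence of (x, y) samples along the vessel between the two
    points where it intersects the optic nerve head edge.
    Returns a value in ]0;1]: 1.0 for a perfectly straight vessel,
    smaller the more the vessel bends.
    """
    chord = math.dist(points[0], points[-1])            # straight-line length
    arc = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    return chord / arc
```

A straight polyline yields exactly 1.0; a vessel bent through an intermediate point yields a ratio below 1.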
  • the indicators are determined by assessing the inner and outer edge of the optic nerve head.
  • the present invention relates to a method for detecting indicators of glaucoma in a fundus image, comprising
  • the perimeter may be a generally known automatic perimeter, a flicker perimeter for glaucoma detection, a blue pyramid perimeter or a perimeter using contrast sensitivity measurement.
  • the invention further relates to a system for assessing the presence or absence of the optic nerve head in a fundus image.
  • the system according to the invention may be any system capable of conducting the method as described above, as well as any combinations thereof within the scope of the invention. Accordingly, the system comprises
  • n is an integer > 1 ,
  • a graphical user interface module may operate in conjunction with a display screen of a display monitor.
  • the graphical user interface may be implemented as part of the processing system to receive input data and commands from a conventional keyboard and mouse through an interface and display results on a display monitor.
  • many components of a conventional computer system, such as address buffers, memory buffers, and other standard control circuits, have not been discussed because these elements are well known in the art and a detailed description thereof is not necessary for understanding the present invention.
  • Pre-acquired image data can be fed directly into the processing system through a network interface and stored locally on a mass storage device and/or in a memory. Furthermore, image data may also be supplied over a network, through a portable mass storage medium such as a removable hard disk, optical disks, tape drives, or any other type of data transfer and/or storage devices which are known in the art.
  • a parallel computer platform having multiple processors is also a suitable hardware platform for use with a system according to the present invention.
  • Such a configuration may include, but not be limited to, parallel machines and workstations with multiple processors.
  • the processing system can be a single computer, or several computers can be connected through a communications network to create a logical processing system.
  • the present system allows the grader, that is, the person normally grading the images, to identify the optic nerve head area more rapidly and securely, if it is present in the image. Also, the present system allows an automatic detection of lesions and other pathologies of the retina without interference from the optic nerve head area, again as an aiding tool for the traditional grader.
  • the network may carry data signals including control or image adjustment signals by which the expert examining the images at the examining unit directly controls the image acquisition occurring at the recordation localisation, i.e. the acquisition unit.
  • command signals such as zoom magnification, steering adjustments, and wavelength of field illumination may be selectively varied remotely to achieve the desired imaging effect.
  • questionable tissue structures requiring greater magnification or a different perspective for their elucidation may be quickly resolved without ambiguity by varying such control parameters.
  • by switching illumination wavelengths views may be selectively taken to represent different layers of tissue, or to accentuate imaging of the vasculature and blood flow characteristics.
  • the control signals may include time varying signals to initiate stimulation with certain wavelengths of light, to initiate imaging at certain times after stimulation or delivery of dye or drugs, or other such precisely controlled imaging protocols.
  • the digital data signals for these operations may be interfaced to the ophthalmic equipment in a relatively straightforward fashion, provided such equipment already has initiating switches or internal digital circuitry for controlling the particular parameters involved, or is capable of readily adapting electric controls to such control parameters as system focus, illumination and the like.
  • the imaging and ophthalmic treatment instrumentation in this case will generally include a steering and stabilization system which maintains both instruments in alignment and stabilized on the structures appearing in the field of view.
  • the invention contemplates that the system control further includes image identification and correlation software which allows the ophthalmologist at site to identify particular positions in the retinal field of view, such as pinpointing particular vessels or tissue structures, and the image acquisition computer includes image recognition software which enables it to identify patterns in the video frames and correlate the identified position with each image frame as it is acquired at the acquisition site.
  • the image recognition software may lock onto a pattern of retinal vessels.
  • the invention further contemplates that the images provided by acquisition unit are processed for photogrammetric analysis of tissue features and optionally blood flow characteristics. This may be accomplished as follows. An image acquired at the recordation unit is sent to an examination unit, where it is displayed on the screen. As indicated schematically in the figure, such image may include a network of blood vessels having various diameters and lengths. These vessels include both arterial and venous capillaries constituting the blood supply and return network.
  • the workstation is equipped with a photogrammetric measurement program which for example may enable the technician to place a cursor on an imaged vessel, and moving the cursor along the vessel while clicking, have the software automatically determine the width of the vessel and the subvessels to which it is connected, as well as the coordinates thereof.
  • the software for noting coordinates from the pixel positions and linking displayed features in a record, as well as submodules which determine vessel capacities and the like, are straightforward and readily built up from photogrammetric program techniques.
  • Work station protocols may also be implemented to automatically map the vasculature as described above, or to compare two images taken at historically different times and identify or annotate the changes which have occurred, highlighting for the operator features such as vessel erosion, tissue which has changed colour, or other differences.
  • a user graphical interface allows the specialist to type in diagnostic indications linked to the image, or to a particular feature appearing at a location in the image, so that the image or a processed version of it becomes more useful.
  • the relative health of the vessel, its blood carrying capacity and the like may also be visually observed and noted.
  • this photogrammetric analysis allows a road map of the vasculature and its capacity to be compiled, together with annotations as to the extent of tissue health or disease apparent upon such inspection.
  • a very precise and well-annotated medical record may be readily compiled and may be compared to a previously taken view for detailed evidence of changes over a period of time, or may be compared, for example, to immediately preceding angiographic views in order to assess the actual degree of blood flow occurring therein.
  • the measurement entries at the examination unit become an annotated image record and are stored in the central library as part of the patient's record.
  • the present invention changes the dynamics of patient access to care, and the efficiency of delivery of ophthalmic expertise in a manner that solves an enormous current health care dilemma, namely, the obstacle to proper universal screening for diabetic retinopathy.
  • Possible initial locations or starting points (seed points) of the optic nerve head (ONH) are found by determining the local maxima in feature images.
  • the feature images are produced by considering the flux in the vessel tree, vertical filtering of the vessel tree, and the intensity and variance of the un-sharpened fundus image.
  • the ONH is of a circular shape.
  • a three dimensional exhaustive search is performed around the seed points in order to determine the optimal position and diameter of a circle fitting the (possible) ONH. Utilizing rules regarding the structure of the vessel tree and the intensity and variance maxima of the un-sharpened image allows accepting or rejecting a possible ONH. If accepted, the origin and diameter of the ONH are already known from the exhaustive search.
  • the fundus image in Figure 1 serves to illustrate the individual steps in the ONH detection throughout the current research report.
  • the C/C++ header that defines the input/output from the ONH detection library is shown below.
  • ROI detection is performed in the mask image; for full images the mask might be NULL.
  • the image might be an RGB image or a multi-frame image (in which case it is assumed that the first frame, frame 0, contains the fundus image).
  • Uncertain vessels have values ]0;100[, and certain vessels have values > 100 (the actual width is given as 'pixel value minus 100').
  • EDD expectedDiskDiameter Double value of the expected disk diameter (notice it is the diameter, NOT the radius).
  • EDD Expected Disk Diameter
  • the EDD is found as the mean ONH diameter measured in a number of images. The measure is done in pixels.
  • the system has predefined values for 30 and 45 degrees of fundus coverage.
  • the optic disc area is independent of age beyond an age of about 3 to 10 years.
  • the optic disc measurements vary according to the method applied.
  • Mean optic disc area of non-highly myopic Caucasians examined in various studies ranged between 2.1 mm² and about 2.8 mm².
  • the optic disc has a slightly vertically oval form with the vertical diameter being about 7% to 10% larger than the horizontal one.
  • the area interval between 2.1 and 2.8 mm² gives a horizontal diameter of 1570 - 1813 µm.
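The quoted horizontal diameters follow from the area of an ellipse, A = π/4 · d_h · d_v. Assuming the vertical diameter is about 8.5% larger than the horizontal one (the midpoint of the 7% to 10% range above; an assumption for this check) reproduces the stated interval:

```python
import math

def horizontal_diameter_um(area_mm2, vertical_excess=0.085):
    """Horizontal disc diameter in micrometers from the disc area in mm^2,
    modelling the disc as an ellipse with d_v = (1 + vertical_excess) * d_h.
    """
    # A = pi/4 * d_h * d_v  =>  d_h = sqrt(4 A / (pi * (1 + excess)))
    d_h_mm = math.sqrt(4.0 * area_mm2 / (math.pi * (1.0 + vertical_excess)))
    return 1000.0 * d_h_mm

# round(horizontal_diameter_um(2.1))  ->  1570
# round(horizontal_diameter_um(2.8))  ->  1813
```

Both endpoints of the 1570 - 1813 µm interval fall out of the area bounds 2.1 and 2.8 mm² under this assumption.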
  • if the image is an RGB image, the gray scale intensity is calculated as the mean of the red and green channels; otherwise the image is assumed to be a gray scale image.
  • the first step is applied to suppress noise.
  • the size of the median filter equals 'medianKernedSize' (Table 1 ).
  • the ONH is normally more pronounced in the red channel when having poor image quality. Therefore it is also used when converting to a gray scale image in the second step.
  • the image is reduced in size, mainly since the ONH is a large feature and experiments show no significant performance decrease.
  • the image is reduced so the EDD becomes 'RescaledEDD' pixels (Table 1 ).
  • the rescaled rows and columns become 'rr' and 'cc' (Table 1), respectively.
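The rescaling step above can be sketched as follows; the function name is illustrative:

```python
def rescale_geometry(rows, cols, edd_pixels, rescaled_edd):
    """Scale factor and new image size such that the expected disk
    diameter becomes 'rescaled_edd' pixels ('RescaledEDD' in Table 1)."""
    scale = rescaled_edd / float(edd_pixels)
    rr = int(round(rows * scale))   # rescaled number of rows
    cc = int(round(cols * scale))   # rescaled number of columns
    return scale, rr, cc
```

For example, an image with an EDD of 200 pixels rescaled to an EDD of 50 pixels shrinks by a factor of 4 in each dimension.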
  • minFluxWidth = VesselThickness('meanWidthFractile'), where 'meanWidthFractile' is found in Table 1.
  • the endpoints of the thickest vessels initialize a search where the vessels are followed until the thickness drops below 'minFluxWidth' or the direction of the vessel changes too abruptly.
  • a counter at each node point in the vessel tree is increased every time the node is visited.
  • the flux seed points are defined as all the node points having a counter larger than or equal to the value of the largest count minus one ( Figure 3).
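The seed-point selection from the per-node visit counters can be sketched as:

```python
def flux_seed_points(visit_counts):
    """Select flux seed points from per-node visit counters.

    visit_counts: dict mapping node id -> number of times the vessel
    following passed through that node.
    Returns the nodes whose counter is >= (largest count - 1).
    """
    if not visit_counts:
        return set()
    threshold = max(visit_counts.values()) - 1
    return {node for node, count in visit_counts.items() if count >= threshold}
```

So with counts {a: 5, b: 4, c: 2}, nodes a and b become seed points.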
  • the large vessels near the ONH are characterized by being almost vertical. Therefore two feature images emphasizing the vertical vessels are used to guide the search for the ONH, figure 5.
  • the two images that are filtered are described in the two following sub-sections.
  • the two images do not generate identical seed points.
  • the MAT helps to minimize this problem, as the width has no influence after the MAT has been applied.
  • the result from the vessel-tracking algorithm is part of the input to the ONH algorithm.
  • the input is converted to a Boolean image where the certain vessels form the foreground.
  • the medial axis transform algorithm finds the mid-line of a structure.
  • the midpoints are defined by the center of the largest circle touching more than one point of the border (of the object).
  • the MAT is calculated from the Boolean vessel image, figure 4.
  • the ONH is often the brightest area in the image. Gaussian filtering of the pre-filtered image is used in order to exploit this observation.
  • the kernel size is given by 'blurKernelSize' (Table 1 ), figure 6.
  • the variance feature is undoubtedly one of the best features if the ONH is present in the FOV. But as any other single feature, this feature may also fail, especially when the contrast of the image is poor.
  • a variance filter or as in this case, a standard deviation filter gives a "blocky" feature image.
  • a weighted filter solves this.
  • a low pass filter, such as the Gaussian filter, can be applied afterwards. This could also have been achieved by using a weighted variance filter, defined as the convolution of the original variance filter and the Gaussian filter.
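A weighted (Gaussian) variance filter as suggested above can be sketched with plain NumPy, using the identity Var[x] = E[x²] − E[x]². The zero-padded 'same' convolution is a simplifying assumption, so only the image interior is meaningful:

```python
import numpy as np

def _gauss_kernel(sigma):
    # 1D Gaussian kernel truncated at 3 sigma, normalized to sum 1.
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x ** 2 / (2.0 * sigma ** 2))
    return k / k.sum()

def _smooth(img, kernel):
    # Separable zero-padded convolution: along columns of each row, then rows.
    tmp = np.apply_along_axis(np.convolve, 1, img, kernel, 'same')
    return np.apply_along_axis(np.convolve, 0, tmp, kernel, 'same')

def local_std(img, sigma=2.0):
    """Gaussian-weighted local standard deviation: sqrt(E[x^2] - E[x]^2)."""
    k = _gauss_kernel(sigma)
    m = _smooth(img, k)
    m2 = _smooth(img * img, k)
    return np.sqrt(np.clip(m2 - m * m, 0.0, None))  # clip guards round-off
```

Unlike a "blocky" box-window standard deviation filter, this weighted version is smooth by construction: a flat region yields (near) zero, while an intensity step such as the ONH rim yields a strong response.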
  • 'max' and 'min' are the global maximum and global minimum, respectively.
  • 'val' is the value of the local maximum being tested.
  • Some areas should not be included when searching for suitable ONH candidates.
  • the following section uses the gradient in order to find the optimal placement of the ONH candidate. Obviously, high gradients between the brighter ROI and the darker surrounding area should be avoided, and pixels near the ROI border are therefore excluded from the calculation.
  • the ONH is assumed to have a circular shape leaving the center and the radius as the only unknowns, in total, three degrees of freedom.
  • the gradient orthogonal to a circle is used as cost function to find the border of the ONH ( Figure 14).
  • Figure 14b is chosen in the current implementation.
  • the number of points on the circle is 'nCircleResolution' (Table 2).
  • a 3D exhaustive search is performed in order to find the maximal gradient.
  • the size of the search area around the seed point is '2*boxHalfSize+1' with step size 'boxStep' (Table 2).
  • the last search dimension is the radius of the circle.
  • the radius of the outer circle is 'gradientCircleEnlargement' (Table 2) times the radius of the inner circle ( Figure 14).
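The circle power and the 3D exhaustive search described above can be sketched as follows; nearest-pixel sampling and the parameter defaults are assumptions, and the mask-based gradient exclusion is omitted for brevity:

```python
import numpy as np

def circle_power(img, cy, cx, r, enlarge=1.3, n_points=32):
    """Sum of radial gradients between an inner circle of radius r and an
    outer circle of radius enlarge * r ('gradientCircleEnlargement')."""
    total = 0.0
    for t in np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False):
        s, c = np.sin(t), np.cos(t)
        inner = img[int(round(cy + r * s)), int(round(cx + r * c))]
        outer = img[int(round(cy + enlarge * r * s)),
                    int(round(cx + enlarge * r * c))]
        total += inner - outer   # bright disc on a darker background
    return total

def search_onh(img, seed, radii, box_half=2, box_step=1):
    """3D exhaustive search (center y, center x, radius) around a seed."""
    sy, sx = seed
    best_power, best = -np.inf, None
    for dy in range(-box_half, box_half + 1, box_step):
        for dx in range(-box_half, box_half + 1, box_step):
            for r in radii:
                p = circle_power(img, sy + dy, sx + dx, r)
                if p > best_power:
                    best_power, best = p, (sy + dy, sx + dx, r)
    return best
```

On a synthetic bright disc the search recovers a circle close to the true center and radius even when started from a slightly misplaced seed point.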
  • the avoidance mask partially avoids the (very) high false gradients: a gradient is "excluded" if its end point is placed outside the mask, where "excluded" means that the gradient value is set to zero. This is necessary in order to avoid false high gradients and "wrong" circle powers. If the mean were used instead, the optimization would have a tendency to favor placements outside the mask.
  • Seed points may be placed so that their search areas overlap, which may result in the same optimal position (and therefore also the same optimal radius). These identical results are removed, thereby producing an array of unique locally optimal ONH candidates.
  • a descending sort with respect to circle power is performed on the array. (Figure 11 shows the optimized ONH candidates. The optimization was initialized from the seed points seen. The intensities of the circles are equivalent to the circle power assigned to each ONH candidate.)

Choose the right ONH among candidates
  • the vessel graph is used to define SVSs. Let two connected node points define a line. A confidence band is placed around this line. The width of the band is '2*confidenceLimit' (Table 3). Recursively, connected node points are included as long as they are inside the confidence band.
  • the included node points are searched to find the two node points farthest away.
  • the resulting two node points define the SVS.
  • the Euclidian distance between them defines the length of the SVS.
  • the SVS is only regarded as being a significant vessel segment if it fulfills one of two rules: if the caliber of the SVS is below or above a given threshold, the segment must be longer than 'longLineLength' or 'minLineLength' (Table 3), respectively (it must either be thin and long or thick and short).
  • the vicinity of the ONH candidate is searched for vessel crossings. It is a good assumption that large vessels (arcades) vertically cross the top and bottom of the periphery of the ONH. However, this restriction is relaxed since the vessel tracking may not be capable of finding the arcades, e.g. if the image quality is degraded. It is also a common problem that the vessel tracking is not capable of tracking all the way into the ONH but stops a distance away from it. The projection of the 3D fundus to a 2D image may also introduce "mysterious" changes in the direction of the vessels.
  • an enlarged ONH radius is used when searching for crossings between the ONH periphery and the vertical vessels.
  • the search radius is enlarged by a factor 'circleEnlargeFactor' (Table 3).
  • Arcades are assumed to have a caliber larger than or equal to 'minWidth', which is defined as
  • minWidth = VesselWidthDistribution('vesselFractile').
  • a "vertical" arcade is defined to have a caliber ≥ 'minWidth' and must cross the top or bottom quadrant with an angle less than 'maxVesselAngle' (Table 3). The angle is calculated as the angle between the vessel segment and the vector going from the center of the ONH candidate to the point on the periphery.
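The crossing test above can be sketched as follows; the default angle threshold is an assumption, not a value from the text:

```python
import math

def is_vertical_arcade(seg_dir, crossing_point, onh_center,
                       caliber, min_width, max_vessel_angle=35.0):
    """Decide whether a vessel segment crossing the (enlarged) ONH
    periphery qualifies as a vertical arcade.

    seg_dir: (dx, dy) direction of the vessel segment at the crossing.
    crossing_point / onh_center: image coordinates.
    The angle is measured between the segment and the radial vector from
    the ONH center to the crossing point; the segment's sign is ignored.
    """
    rx = crossing_point[0] - onh_center[0]
    ry = crossing_point[1] - onh_center[1]
    dot = abs(seg_dir[0] * rx + seg_dir[1] * ry)
    denom = math.hypot(*seg_dir) * math.hypot(rx, ry)
    angle = math.degrees(math.acos(min(1.0, dot / denom)))  # in [0, 90]
    return caliber >= min_width and angle < max_vessel_angle
```

A vessel running parallel to the radial direction at the top or bottom quadrant passes the angle test, while a tangential (horizontal) vessel is rejected, as is any vessel thinner than 'minWidth'.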
  • the variance feature is also a very good feature. This means that the likelihood of a correctly identified ONH containing no seed point from either of these features is negligible.
  • the ONH candidates are regarded as being samples from the fundus image and are thereby used to detect the normal background "variation".
  • the ratio of circle powers between the best ONH candidate (see below for the best candidate) and the 'onhCandidate2CompareWith'-th candidate should be above (or equal to) 'gradientPowerFactor' (Table 3):
  • 'onhCandidate2CompareWith' is incremented by one if the center of the 'onhCandidate2CompareWith'-th ONH candidate is inside the best candidate. This is done until the center is outside the best candidate or there are no more candidates.
  • Figure 16 shows the circle power for the example fundus image.
  • topB and bottomB are two Booleans that are true if an arcade is crossing the periphery of the ONH through the top and bottom quadrant, respectively.
  • the Booleans intMaxB and varMaxB are true if an intensity and variance seed point is present inside the ONH candidate, respectively.
  • adjusted circle power is calculated as:
  • two candidates are flagged, namely 1) the candidate having the highest adjusted power which does not have the center point masked out in the tangential vessel image, and 2) the candidate having the highest adjusted power which has the center point masked out in the tangential vessel image.
  • the two flags are called 'optimalGradNr' and 'optimalMaskNr' (initially the two flags equal the number of the last candidate).
  • if optimalMaskNr < optimalGradNr, which means that a more powerful adjusted ONH candidate exists, then it is tested whether the tangentially masked candidate should be chosen instead: the center of the masked candidate should be inside the periphery of the 'optimalGradNr' candidate, and satisfy a visibility criterion.
  • the ONH detection library returns with 'foundONH' true and the center and radius of the ONH; otherwise 'foundONH' is false.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present invention relates to a method for assessing the presence or absence of the optic nerve head in images of the ocular fundus (the back of the eye), by means of an efficient method and a system capable of performing a validated detection of the optic nerve head when it is present in an image, and of detecting no optic nerve head when it is absent from an image, independently of the illumination and of the presence of diseases or artefacts in the image. The invention comprises detecting candidate optic nerve head areas, assigning a power to each area, ranking the candidate areas, selecting the highest-ranked area, and classifying the highest-ranked area relative to a threshold for the presence of an optic nerve head area.
PCT/DK2002/000663 2001-10-03 2002-10-03 Detection de papille optique dans une image de fond d'oeil WO2003030075A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
DKPA200101449 2001-10-03
DKPA200101449 2001-10-03
DKPA200200632 2002-04-25
DKPA200200632 2002-04-25
US37623202P 2002-04-30 2002-04-30
US60/376,232 2002-04-30

Publications (1)

Publication Number Publication Date
WO2003030075A1 true WO2003030075A1 (fr) 2003-04-10

Family

ID=27222544

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DK2002/000663 WO2003030075A1 (fr) 2001-10-03 2002-10-03 Detection de papille optique dans une image de fond d'oeil

Country Status (1)

Country Link
WO (1) WO2003030075A1 (fr)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006067226A2 (fr) * 2004-12-22 2006-06-29 Dilab I Lund Ab Identification biometrique d'animaux de laboratoire
DE102005058217A1 (de) * 2005-12-06 2007-06-28 Siemens Ag Verfahren und System zur computergestützten Erkennung von Hochkontrastobjekten in tomographischen Aufnahmen
CN100367310C (zh) * 2004-04-08 2008-02-06 复旦大学 视网膜神经节细胞感受野尺度可变层次网络模型及其算法
US7583827B2 (en) 2001-10-03 2009-09-01 Retinalyze Danmark A/S Assessment of lesions in an image
GB2470727A (en) * 2009-06-02 2010-12-08 Univ Aberdeen Processing retinal images using mask data from reference images
EP2779095A3 (fr) * 2013-03-15 2016-03-02 Kabushiki Kaisha TOPCON Procédé de segmentation de l'image d'une papille optique
CN110543802A (zh) * 2018-05-29 2019-12-06 北京大恒普信医疗技术有限公司 眼底图像中左右眼识别方法与装置
CN112712521A (zh) * 2021-01-18 2021-04-27 佛山科学技术学院 一种基于全局梯度搜索的眼底视盘自动定位方法

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6053865A (en) * 1993-09-21 2000-04-25 Kabushiki Kaisha Topcon Retinal disease analyzer

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6053865A (en) * 1993-09-21 2000-04-25 Kabushiki Kaisha Topcon Retinal disease analyzer

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
AKITA K ET AL: "A COMPUTER METHOD OF UNDERSTANDING OCULAR FUNDUS IMAGES", PATTERN RECOGNITION, PERGAMON PRESS INC. ELMSFORD, N.Y, US, vol. 15, no. 6, 1982, pages 431 - 443, XP000877036, ISSN: 0031-3203 *
GOLDBAUM M ET AL: "AUTOMATED DIAGNOSIS AND IMAGE UNDERSTANDING WITH OBJECT EXTRACTION, OBJECT CLASSIFICATION, AND INFERENCING IN RETINAL IMAGES", PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP) LAUSANNE, SEPT. 16 - 19, 1996, NEW YORK, IEEE, US, vol. 3, 16 September 1996 (1996-09-16), pages 695 - 698, XP000704110, ISBN: 0-7803-3259-8 *
WOOD S L ET AL: "Estimation Of Nerve Fiber Loss From Digitized Retinal Images", ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY, 1991. VOL.13: 1991., PROCEEDINGS OF THE ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ORLANDO, FL, USA 31 OCT.-3 NOV. 1991, NEW YORK, NY, USA,IEEE, US, 31 October 1991 (1991-10-31), pages 269 - 270, XP010101608, ISBN: 0-7803-0216-8 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7583827B2 (en) 2001-10-03 2009-09-01 Retinalyze Danmark A/S Assessment of lesions in an image
CN100367310C (zh) * 2004-04-08 2008-02-06 复旦大学 Scale-variable hierarchical network model of retinal ganglion cell receptive fields and its algorithm
WO2006067226A2 * 2004-12-22 2006-06-29 Dilab I Lund Ab Biometric identification of laboratory animals
WO2006067226A3 * 2004-12-22 2006-09-21 Dilab I Lund Ab Biometric identification of laboratory animals
DE102005058217A1 * 2005-12-06 2007-06-28 Siemens Ag Method and system for computer-aided detection of high-contrast objects in tomographic images
DE102005058217B4 * 2005-12-06 2013-06-06 Siemens Aktiengesellschaft Method and system for computer-aided detection of high-contrast objects in tomographic images
GB2470727A (en) * 2009-06-02 2010-12-08 Univ Aberdeen Processing retinal images using mask data from reference images
EP2779095A3 (fr) * 2013-03-15 2016-03-02 Kabushiki Kaisha TOPCON Procédé de segmentation de l'image d'une papille optique
US10497124B2 (en) 2013-03-15 2019-12-03 Kabushiki Kaisha Topcon Optic disc image segmentation method and apparatus
CN110543802A (zh) * 2018-05-29 2019-12-06 北京大恒普信医疗技术有限公司 Method and device for identifying left and right eyes in fundus images
CN112712521A (zh) * 2021-01-18 2021-04-27 佛山科学技术学院 Automatic optic disc localization method for fundus images based on global gradient search
CN112712521B (zh) * 2021-01-18 2023-12-12 佛山科学技术学院 Automatic optic disc localization method for fundus images based on global gradient search, and storage medium therefor

Similar Documents

Publication Publication Date Title
US7583827B2 (en) Assessment of lesions in an image
Hoover et al. Locating the optic nerve in a retinal image using the fuzzy convergence of the blood vessels
US6996260B1 (en) Analysis of fundus images
US20220151568A1 (en) Supervised machine learning based multi-task artificial intelligence classification of retinopathies
Kipli et al. A review on the extraction of quantitative retinal microvascular image feature
WO2018116321A2 (fr) Retinal fundus image processing method
Zhu et al. Digital image processing for ophthalmology: Detection of the optic nerve head
WO2003030075A1 (fr) Detection of the optic nerve head in a fundus image
Giancardo Automated fundus images analysis techniques to screen retinal diseases in diabetic patients
WO2003030073A1 (fr) Quality measure
Mangrulkar Retinal image classification technique for diabetes identification
Zhou et al. Computer aided diagnosis for diabetic retinopathy based on fundus image
Noronha et al. A review of fundus image analysis for the automated detection of diabetic retinopathy
WO2004082453A2 (fr) Assessment of lesions in an image
WO2003030101A2 (fr) Detection of vessels in an image
Niemeijer Automatic detection of diabetic retinopathy in digital fundus photographs
Khatter et al. Retinal vessel segmentation using Robinson compass mask and fuzzy c-means
Mohammadi et al. The computer based method to diabetic retinopathy assessment in retinal images: a review.
DK1444635T3 (en) Assessment of lesions in an image
Patil et al. Screening and detection of diabetic retinopathy by using engineering concepts
de Moura et al. Artery/vein vessel tree identification in near-infrared reflectance retinographies
Lin et al. Vascular tree construction with anatomical realism for retinal images
Kayte Design and Development of Non-Proliferative Diabetic Retinopathy Detection Technique using Image Features Extraction Techniques
Raju DETECTION OF DIABETIC RETINOPATHY USING IMAGE PROCESSING
Ahmed College of Graduate Studies

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MK MN MW MX MZ NO NZ OM PH PT RO RU SD SE SG SI SK SL TJ TM TN TR TZ UA UG US UZ VC VN YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZM ZW AM AZ BY KG KZ RU TJ TM AT BE BG CH CY CZ DK EE ES FI FR GB GR IE IT LU MC PT SE SK TR BF BJ CF CG CI GA GN GQ GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP