CN113887677B - Method, device, equipment and medium for classifying capillary vessel images in epithelial papilla

Method, device, equipment and medium for classifying capillary vessel images in epithelial papilla

Publication number: CN113887677B
Authority: CN (China)
Prior art keywords: blood vessel, characteristic, image, diameter, vessel
Legal status: Active (granted)
Application number: CN202111479461.8A
Other languages: Chinese (zh)
Other versions: CN113887677A
Inventors: 于红刚, 张丽辉, 姚理文, 卢姿桦, 罗任权
Current Assignee: Wuhan University (WHU)
Original Assignee: Wuhan University (WHU)
Application filed by Wuhan University (WHU); published as application CN113887677A and granted as CN113887677B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 — Pattern recognition
    • G06F 18/20 — Analysing
    • G06F 18/24 — Classification techniques
    • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 — Computing arrangements based on biological models
    • G06N 3/02 — Neural networks
    • G06N 3/04 — Architecture, e.g. interconnection topology
    • G06N 3/045 — Combinations of networks
    • G06N 3/08 — Learning methods


Abstract

The embodiment of the invention discloses a method, a device, equipment and a medium for classifying capillary vessel images in epithelial papillae. The method comprises the following steps: inputting an endoscopic image of capillary vessels in the epithelial papillae into a pre-trained neural network model to obtain an effective area of the endoscopic image; acquiring a target blood vessel region in the endoscopic image from the effective area according to a connected domain algorithm; and acquiring the characteristic diameter, the characteristic distortion quantitative value, the characteristic area ratio, the centroid eccentricity and the whole-map density of the target blood vessel in the target blood vessel region, then inputting these five indexes into a pre-trained classification model to obtain a classification result of the endoscopic image. The method extracts the target blood vessel from the endoscopic image based on neural network technology, acquires the five indexes for evaluating the target blood vessel, and judges the esophageal cancer infiltration depth accurately according to those indexes, thereby improving both the efficiency and the accuracy of judging the esophageal cancer infiltration depth.

Description

Method, device, equipment and medium for classifying capillary vessel images in epithelial papilla
Technical Field
The invention relates to the technical field of medical assistance, in particular to a method, a device, equipment and a medium for classifying capillary vessel images in epithelial papilla.
Background
The intrapapillary capillary loop (IPCL) is a microvessel that develops perpendicular to the branch vessels under the squamous epithelium. The degree of morphological change of the capillary loops in the epithelial papillae is a key index for diagnosing the infiltration depth of esophageal cancer under an endoscope: damage to the IPCL structure increases as the tumor infiltrates, so identifying IPCL changes plays an important role in evaluating esophageal and pharyngeal lesions. When squamous cell carcinoma invades the lamina propria, irregular looped capillaries with thickened caliber are formed; when the tumor invades the muscularis mucosae or superficially invades the submucosa, the abnormal microvascular loop structure disappears and the vessels are obviously elongated and deformed; when the tumor invades deep into the submucosa, the IPCL structure is completely destroyed and the vessel diameter is at least 3 times that of superficially infiltrating vessels, accompanied by increased caliber and various abnormal tumor angiogenesis. Thus, the IPCL vessels of esophageal squamous cell carcinoma exhibit irregularities such as dilation, tortuosity, caliber change, and shape abnormality.
At present, the nature of esophageal lesions is generally judged by observing the intrapapillary capillary loops with narrow-band imaging (NBI): the blood vessels on the mucosal surface appear brown, in high contrast with the background tissue, so that the capillary changes can be examined in detail, and typing standards based on the degree of IPCL abnormality are then used to predict the infiltration depth of esophageal cancer. However, the evaluation standard for the degree of vessel abnormality under staining and magnification is difficult to homogenize and is strongly influenced by the subjectivity of the evaluator, so homogeneous, high-precision evaluation of the degree of esophageal IPCL abnormality under a staining magnification endoscope is difficult to achieve, which leads to poor accuracy and low efficiency when judging the esophageal cancer infiltration depth through a staining endoscope.
Disclosure of Invention
The embodiment of the invention provides a method, a device, equipment and a medium for classifying capillary vessel images in epithelial papillae, and solves the technical problem that the esophageal cancer infiltration depth cannot be accurately identified under a dyeing endoscope in the prior art.
In a first aspect, an embodiment of the present invention provides a method for classifying an image of capillary vessels in an epithelial papilla, including:
inputting an endoscopic image of capillary vessels in an epithelial papilla into a pre-trained neural network model to obtain an effective area of the endoscopic image;
acquiring a target blood vessel region in the endoscope image from the effective region according to a connected domain algorithm;
acquiring the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole map density of the target blood vessel in the target blood vessel region;
and inputting the characteristic diameter, the characteristic distortion quantitative value, the characteristic area ratio, the centroid eccentricity and the whole image density into a pre-trained classification model to obtain a classification result of the endoscope image.
In a second aspect, an embodiment of the present invention provides a device for classifying an image of capillary vessels in an epithelial papilla, including:
the first input unit is used for inputting an endoscopic image of capillary vessels in an epithelial papilla into a pre-trained neural network model to obtain an effective area of the endoscopic image;
the first acquisition unit is used for acquiring a target blood vessel region in the endoscope image from the effective region according to a connected domain algorithm;
the second acquisition unit is used for acquiring the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole map density of the target blood vessel in the target blood vessel region;
and the second input unit is used for inputting the characteristic diameter, the characteristic distortion quantitative value, the characteristic area ratio, the centroid eccentricity and the whole image density into a pre-trained classification model to obtain a classification result of the endoscope image.
In a third aspect, an embodiment of the present invention further provides a computer device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor executes the computer program to implement the method for classifying an image of capillary vessels in an epithelial papilla according to the first aspect.
In a fourth aspect, the present invention further provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and the computer program, when executed by a processor, causes the processor to execute the method for classifying an image of capillary vessels in an epithelial papilla according to the first aspect.
The embodiment of the invention provides a method, a device, equipment and a medium for classifying capillary vessel images in epithelial papilla, wherein the method adopts a neural network technology to extract a target vessel from an endoscopic image so as to obtain five indexes of characteristic diameter, characteristic distortion quantitative value, characteristic area ratio, centroid eccentricity and whole image density of the evaluated target vessel, then classifies the target vessel according to the five indexes, and can determine the current form of capillary vessel loops in the epithelial papilla according to the classification result of the target vessel, so that the accurate judgment of the esophageal cancer infiltration depth is realized, and the efficiency and the accuracy of the judgment of the esophageal cancer infiltration depth are improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic flowchart of a method for classifying an image of capillary vessels in an epithelial papilla according to an embodiment of the present invention;
FIG. 2 is a sub-flowchart of a method for classifying an image of capillary vessels in an epithelial papilla according to an embodiment of the present invention;
FIG. 3 is another schematic flow chart of a method for classifying an image of capillary vessels in an epithelial papilla according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart of another method for classifying an image of capillary vessels in an epithelial papilla according to an embodiment of the present invention;
FIG. 5 is another flowchart of a method for classifying an image of capillary vessels in an epithelial papilla according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart of another method for classifying an image of capillary vessels in an epithelial papilla according to an embodiment of the present invention;
FIG. 7 is a schematic flow chart of another method for classifying an image of capillary vessels in an epithelial papilla according to an embodiment of the present invention;
fig. 8 is a schematic block diagram of a classification apparatus for capillary vessel images within epithelial papillae according to an embodiment of the present invention;
FIG. 9 is a schematic block diagram of a computer apparatus provided by an embodiment of the present invention;
FIG. 10 is a flowchart illustrating an exemplary implementation of the present invention;
FIG. 11 is a flowchart of acquiring an effective region in an endoscopic image according to an embodiment of the present invention;
fig. 12 is a coordinate diagram of determining whether each pixel point of the target blood vessel is located in the target blood vessel region by using an area method according to the embodiment of the present invention;
FIG. 13 is a diagram illustrating the effect of obtaining a target blood vessel in an endoscopic image according to an embodiment of the present invention;
FIG. 14 is a schematic diagram of a method for measuring a diameter of a target blood vessel using Halcon for endoscopic images according to an embodiment of the present invention;
FIG. 15 is a schematic diagram illustrating the quantification of the tortuosity of a target vessel in an endoscopic image according to an embodiment of the present invention;
fig. 16 is a schematic diagram of a geometric center of an endoscopic image and an equivalent centroid of an active area according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1, fig. 1 is a flowchart illustrating a method for classifying an image of capillary vessels in an epithelial papilla according to an embodiment of the present invention. The method for classifying the capillary vessel images in the epithelial papilla, which is disclosed by the embodiment of the invention, is applied to the terminal equipment, and is executed by application software installed in the terminal equipment. The terminal device is a terminal device with an internet access function, such as a desktop computer, a notebook computer, a tablet computer, or a mobile phone.
The method for classifying the capillary vessel image in the epithelial papilla will be described in detail below.
As shown in FIG. 1, the method includes the following steps S110 to S140.
S110, inputting an endoscope image of the capillary vessel in the epithelial papilla into a pre-trained neural network model to obtain an effective area of the endoscope image.
The endoscopic image is obtained through the NBI technique, and capillary vessels in the epithelial papillae are present in it; the neural network model is trained in advance and used for extracting the characteristic information containing the capillary vessels in the epithelial papillae from the endoscopic image, namely the effective area.
Specifically, the neural network model is constructed from a neural network with an image segmentation function, such as a U-Net neural network or a Mask R-CNN neural network.
It should be noted that, in the process of constructing the neural network model, the specific neural network adopted by the neural network model may be selected according to actual conditions, and the present invention is not limited specifically.
In another embodiment of the present invention, as shown in fig. 2, before step S110, the method further includes: s210 and S220.
S210, inputting a training sample into the neural network model to obtain the mean square error loss of the neural network model;
and S220, updating the network parameters of the neural network model according to the mean square error loss.
In this embodiment, as shown in fig. 11, fig. 11 is a flowchart for acquiring an effective region in an endoscopic image according to an embodiment of the present invention. The neural network model is constructed from the U-Net neural network, which improves on the fully convolutional network: by strengthening the connections between layers and adding an up-sampling (up-convolution) path to the down-sampling convolution path, U-Net extracts features fully and achieves accurate segmentation even with few training samples. By adopting the U-Net neural network, the neural network model can effectively extract the characteristic information containing the capillary vessels in the epithelial papillae from the endoscopic image.
Specifically, the training sample is obtained by NBI technology, and the training sample has characteristic information of capillary vessels in epithelial papillae, and after the training sample is input into the neural network model, the network parameters of the neural network model can be updated through the mean square error loss output by the neural network model, so as to train the neural network model.
The mean square error loss is generated as:

$$L_{\mathrm{MSE}} = \frac{1}{m} \sum_{i=1}^{m} \left( \hat{y}_i - y_i \right)^2$$

where $m$ is the number of input training samples, $\hat{y}_i$ is the predicted value of the neural network model, and $y_i$ is the true value.
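The mean square error loss and its gradient, as used to update the network parameters, can be sketched in pure Python (an illustrative sketch, not the patent's implementation; the function names are our own):

```python
def mse_loss(y_pred, y_true):
    """Mean square error over m training samples:
    L = (1/m) * sum_i (y_pred_i - y_true_i) ** 2."""
    m = len(y_pred)
    return sum((p - t) ** 2 for p, t in zip(y_pred, y_true)) / m

def mse_grad(y_pred, y_true):
    """Gradient of the loss w.r.t. each prediction, the quantity
    backpropagated to update the network parameters."""
    m = len(y_pred)
    return [2.0 * (p - t) / m for p, t in zip(y_pred, y_true)]
```

In a training loop the gradient is propagated backwards through the network and the parameters are updated, e.g. by gradient descent.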
In another embodiment of the present invention, as shown in fig. 3, before step S210, the method further includes: s310 and S320.
S310, carrying out video decoding on the endoscope video of the capillary vessel in the epithelial papilla to obtain a video decoding image;
and S320, labeling the video decoding image to obtain the training sample.
In this embodiment, the training samples are obtained from esophageal endoscope videos shot with the NBI technique. After the terminal device obtains the endoscope video, it decodes the video to obtain video decoded images, and then labels the video decoded images by outlining the blood vessel contours in them, forming the training samples for training the neural network model. After a training sample is input into the neural network model, the mean square error loss of the model can be obtained, thereby implementing the training of the neural network model.
And S120, acquiring a target blood vessel region in the endoscope image from the effective region according to a connected domain algorithm.
Specifically, the connected domain algorithm, also called the connected domain labeling algorithm, completes the labeling of the target blood vessel within the effective region. After the target blood vessel region is obtained from the effective region through the connected domain algorithm, the five indexes for judging the esophageal cancer infiltration depth in the endoscopic image can be acquired; the current form of the capillary loops in the epithelial papillae can then be classified by a linear weighted calculation over the five indexes, thereby judging the esophageal cancer infiltration depth in the endoscopic image.
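A hypothetical sketch of a linear weighted calculation over the five indexes: the weights, thresholds, and class labels below are illustrative assumptions, not values from the patent, which feeds the five indexes into a pre-trained classification model.

```python
def classify_ipcl(diameter, tortuosity, area_ratio, eccentricity, density,
                  weights=(0.3, 0.25, 0.2, 0.15, 0.1),
                  thresholds=(0.33, 0.66)):
    """Hypothetical linear weighted scoring of the five vessel indexes.
    Assumes all indexes are pre-normalized to [0, 1]; weights,
    thresholds and class labels are placeholders for illustration."""
    features = (diameter, tortuosity, area_ratio, eccentricity, density)
    score = sum(w * f for w, f in zip(weights, features))
    if score < thresholds[0]:
        return "mild"      # placeholder class label
    if score < thresholds[1]:
        return "moderate"  # placeholder class label
    return "severe"        # placeholder class label
```

In practice a trained classifier learns this decision boundary from labeled data rather than using fixed weights.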
In other embodiments of the invention, as shown in fig. 4, step S120 includes sub-steps S121 and S122.
S121, traversing the effective region to obtain the connected domain area and the minimum circumscribed horizontal rectangle of the target blood vessel;
and S122, determining the target blood vessel region according to the connected domain area and the minimum circumscribed horizontal rectangle.
Specifically, the connected domain areas and minimum circumscribed horizontal rectangles of all blood vessels in the effective region are obtained by traversing the effective region, from which the connected domain area and minimum circumscribed horizontal rectangle of the target blood vessel are then determined: the blood vessel with the largest connected domain, excluding the background region, is taken as the target blood vessel, and its minimum circumscribed horizontal rectangle is the minimum circumscribed horizontal rectangle of the target blood vessel.
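The largest-connected-domain selection can be sketched in pure Python (a minimal illustration on a binary grid; the patent does not specify an implementation, and the names here are our own):

```python
from collections import deque

def largest_component_bbox(grid):
    """Find the largest 4-connected foreground component in a binary
    grid (assumed to contain at least one foreground pixel) and return
    (component_pixels, (top, left, bottom, right)) — the component
    standing in for the target vessel, and its bounding box standing in
    for the minimum circumscribed horizontal rectangle."""
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for r in range(h):
        for c in range(w):
            if grid[r][c] and not seen[r][c]:
                # Breadth-first flood fill of one connected domain.
                comp, q = [], deque([(r, c)])
                seen[r][c] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and grid[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if len(comp) > len(best):
                    best = comp
    ys = [p[0] for p in best]
    xs = [p[1] for p in best]
    return best, (min(ys), min(xs), max(ys), max(xs))
```

Production code would typically use a library routine (e.g. a connected-components operator) rather than hand-rolled flood fill.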
In addition, after determining the position of the target blood vessel, the target blood vessel region needs to be further determined. Specifically, the target blood vessel region shown in fig. 13 can be determined by traversing all pixel points in the minimum circumscribed horizontal rectangle of the target blood vessel and then judging, with the area method shown in fig. 12, whether each pixel point is in the connected domain of the target blood vessel. The specific process is as follows: assume the vertex coordinates of the connected domain (polygon) of the target blood vessel are $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$, and the coordinates of a pixel point in the minimum circumscribed horizontal rectangle are $(x, y)$. If the pixel point is inside the connected domain, the areas of the triangles formed by the pixel point and each pair of adjacent vertices sum to the polygon area, i.e. the following equation must be satisfied (with $(x_{n+1}, y_{n+1}) = (x_1, y_1)$):

$$\sum_{i=1}^{n} S_{\triangle}\big((x, y), (x_i, y_i), (x_{i+1}, y_{i+1})\big) = S_{\mathrm{polygon}}$$

The pixel values of the pixel points inside the minimum circumscribed horizontal rectangle that do not satisfy this equation are set to the background pixel value of the endoscopic image.
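The area method can be sketched as follows (pure Python, using the shoelace formula for triangle and polygon areas; exact equality is replaced by a small tolerance, and the test as stated holds for convex connected domains):

```python
def tri_area(a, b, c):
    """Unsigned area of triangle abc via the shoelace formula."""
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def polygon_area(poly):
    """Unsigned area of a simple polygon via the shoelace formula."""
    s = 0.0
    for i in range(len(poly)):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % len(poly)]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def inside_by_area(point, poly, eps=1e-9):
    """Area method: a point is inside a convex polygon iff the triangle
    areas it forms with each adjacent vertex pair sum to the polygon
    area (a strictly larger sum means the point lies outside)."""
    total = sum(tri_area(point, poly[i], poly[(i + 1) % len(poly)])
                for i in range(len(poly)))
    return abs(total - polygon_area(poly)) <= eps
```

Pixels failing `inside_by_area` would then be set to the background value, as the passage above describes.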
S130, acquiring the characteristic diameter, the characteristic tortuosity quantization value, the characteristic area ratio, the centroid eccentricity and the whole-map density of the target blood vessel in the target blood vessel region.
Specifically, the characteristic diameter, the characteristic tortuosity quantization value, the characteristic area ratio, the centroid eccentricity and the whole-map density are all indexes used for judging the current form of the capillary loops in the epithelial papillae in the endoscopic image; the current form of the capillary loops can be identified through these five indexes, so that the esophageal cancer infiltration depth in the endoscopic image can be judged.
In other embodiments of the invention, as shown in fig. 5, step S130 includes sub-steps S131, S132, S133, S134, and S135.
S131, generating the characteristic diameter according to the maximum class average diameter and the minimum class average diameter of the target blood vessel.
Specifically, the maximum class average diameter is an average of diameters of the portions of the target blood vessels belonging to the maximum class, and the minimum class average diameter is an average of diameters of the portions of the target blood vessels belonging to the minimum class, and the characteristic diameter can be calculated from the maximum class average diameter and the minimum class average diameter.
In this embodiment, as shown in fig. 14, fig. 14 is a schematic diagram of measuring the diameter of a target blood vessel using Halcon according to an embodiment of the present invention. Diameters $d_1, d_2, \ldots$ are measured at multiple sites on the target blood vessel by invoking a diameter measurement toolkit in Halcon; all maximum-class diameters and all minimum-class diameters are then obtained from the measurements, the maximum class average diameter $\bar{d}_{\max}$ and the minimum class average diameter $\bar{d}_{\min}$ are calculated, and the characteristic diameter $d$ is finally calculated from $\bar{d}_{\max}$ and $\bar{d}_{\min}$ (the specific calculation formula appears only as an image in the source).

In addition, since Halcon's measurement is affected by the color difference between the target blood vessel and the background and by its curve-fitting ability, a blood vessel with a complex shape needs to be divided into several segments and the diameter of each segment measured, so that the diameters at multiple sites on the target blood vessel can be obtained.
And S132, respectively acquiring a plurality of tortuosity quantization values and a plurality of area ratios of the target blood vessel.
Specifically, the plurality of tortuosity quantization values and area ratios are those at different positions in the target blood vessel; the tortuosity quantization value represents how tortuous the vessel is, and the area ratio represents the proportion of the minimum circumscribed horizontal rectangle occupied by the vessel.
And S133, acquiring the characteristic tortuosity quantization value and the characteristic area ratio from the plurality of tortuosity quantization values and the plurality of area ratios.
In this embodiment, after the plurality of tortuosity quantization values and the plurality of area ratios are obtained, the characteristic tortuosity quantization value is calculated through a preset blood vessel tortuosity quantization formula, the area ratios $S_i$ at the various positions in the target blood vessel are calculated through a preset area ratio formula, and the characteristic area ratio is then obtained with a geometric mean calculation formula.
Fig. 15 is a schematic diagram of quantification of the tortuosity of a target blood vessel in an endoscopic image according to an embodiment of the present invention. As shown in fig. 15, the principle of quantifying the tortuosity of the target vessel is as follows: lines parallel to the width and height of the vessel's minimum circumscribed rectangle are intersected with the target vessel, and the more tortuous the target vessel is, the more intersections these parallel lines have with it. Accordingly, the more tortuous the target vessel is, the more pixel points of the vessel project onto the width and the height of its minimum circumscribed rectangle, and the greater the density of points falling on those lines. The blood vessel tortuosity quantization formula (given only as images in the source) computes the tortuosity quantization value from $n$, the total number of pixel points on the inner and outer sides of the capillary wall, and from $L$ and $W$, respectively the length and width of the minimum circumscribed rectangle of the vessel.
The area ratio formula is:

$$S = \frac{\sum_{i} d_i}{L \cdot W}$$

where $d_i$ is the vessel diameter at each pixel point on the vessel centerline (summing the diameters along the centerline approximates the vessel area), and $L$ and $W$ are respectively the length and width of the minimum circumscribed rectangle of the vessel.

The geometric mean calculation formula used to obtain the characteristic area ratio from the $k$ per-position area ratios $S_1, \ldots, S_k$ is:

$$\bar{S} = \left( \prod_{i=1}^{k} S_i \right)^{1/k}$$
in addition, before the characteristic distortion quantitative value and the characteristic area ratio are respectively calculated through the blood vessel distortion quantitative formula and the area ratio formula, the plurality of distortion quantitative values and the plurality of area ratio are screened, so that the finally generated characteristic distortion quantitative value and the characteristic area ratio are more accurate, and the accuracy of judging the infiltration depth of the esophageal cancer is improved. The process of screening may be performed with reference to step S131b, and is not limited in particular.
And S134, generating the centroid eccentricity according to the equivalent centroid of the effective area and the geometric center of the endoscope image.
In this embodiment, the equivalent centroid treats each blood vessel in the effective region as a particle whose mass equals its area; the equivalent centroid is the center of mass of this particle system, and the geometric center is the exact center of the endoscopic image. Fig. 16 is a schematic diagram of the geometric center of an endoscopic image and the equivalent centroid of an effective region according to an embodiment of the present invention. After the effective area is obtained, if the capillary vessels in the epithelial papillae in the endoscopic image are not abnormal, their distribution is theoretically uniform over the whole visual field, and the equivalent centroid of all the vessels should be at, or close to, the geometric center of the visual field; when the capillary vessels in the epithelial papillae are abnormal, vessels in the visual field disappear or are squeezed by the focus so that their distribution changes, and the equivalent centroid of the vessels in the effective area will most probably deviate far from the geometric center. After the equivalent centroid of the effective area and the geometric center of the endoscopic image are obtained, the centroid eccentricity of the target blood vessel can be calculated through a preset centroid eccentricity calculation formula. The centroid eccentricity calculation formula is as follows:
x̄ = Σ_i S_i x_i / Σ_i S_i

ȳ = Σ_i S_i y_i / Σ_i S_i

e = sqrt( (x̄ − x_c)² + (ȳ − y_c)² )

wherein e is the centroid eccentricity, (x̄, ȳ) is the equivalent centroid of the effective area, (x_i, y_i) is the centroid of each blood vessel in the endoscopic image, S_i is the area of each blood vessel in the endoscopic image, (x_i, y_i) and S_i can both be obtained on the basis of the connected domain, and (x_c, y_c) = (W/2, H/2) is the geometric center of the endoscopic image, W and H being the width and height of the endoscopic image.
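As a minimal sketch of this computation, the equivalent centroid can be formed as the area-weighted mean of the per-vessel centroids and compared with the image center; the function name and the input layout (per-vessel areas and centroids taken from the connected domains) are illustrative assumptions, not part of the patent.

```python
import numpy as np

def centroid_eccentricity(areas, centroids, width, height):
    """Distance between the area-weighted equivalent centroid of all
    vessels and the geometric center of the endoscopic image.

    areas     -- per-vessel pixel areas S_i (from the connected domains)
    centroids -- per-vessel centroids (x_i, y_i), shape (n, 2)
    """
    areas = np.asarray(areas, dtype=float)
    centroids = np.asarray(centroids, dtype=float)
    # Equivalent centroid: each vessel is treated as a particle of mass S_i.
    eq = (areas[:, None] * centroids).sum(axis=0) / areas.sum()
    geo = np.array([width / 2.0, height / 2.0])  # geometric center of the image
    return float(np.linalg.norm(eq - geo))

# Two equal vessels placed symmetrically about the center: eccentricity 0.
e = centroid_eccentricity([10, 10], [(20, 50), (80, 50)], 100, 100)
```

A single vessel whose centroid sits away from the image center yields a correspondingly large eccentricity.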
and S135, acquiring the whole image density according to a preset blood vessel whole image density formula.
In this embodiment, the vessel map density formula is a density formula pre-constructed according to the structure of the blood vessel in the endoscopic image, and the vessel map density formula is:
ρ = Σ_i S_i / (W × H)

wherein S_i is the area of each blood vessel in the blood vessel map, which can be obtained through the connected domain, and W and H are respectively the width and height of the endoscopic image.
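The whole-map density therefore reduces to the total vessel pixel area over the image area; the one-line sketch below uses assumed names for illustration only.

```python
def whole_map_density(vessel_areas, width, height):
    """Whole-map density: sum of per-vessel pixel areas S_i (taken from
    the connected domains) divided by the image area W * H."""
    return sum(vessel_areas) / (width * height)

rho = whole_map_density([200, 300], 100, 100)  # 500 vessel pixels / 10000 pixels
```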
In another embodiment of the present invention, as shown in fig. 6, before step S131, the method further includes: S131a, S131b, S131c and S131d.
S131a, acquiring a plurality of diameters of each blood vessel in the target blood vessel;
S131b, screening the plurality of diameters of each blood vessel to obtain screened diameters;
S131c, clustering the screened diameters according to a K-means algorithm to obtain a clustering result of each screened diameter;
S131d, generating the maximum class average diameter and the minimum class average diameter according to the clustering result.
Specifically, the target blood vessel in the endoscopic image is composed of a main blood vessel and a plurality of branch blood vessels. A plurality of diameters of each blood vessel in the target blood vessel can be obtained by measuring each blood vessel in the target blood vessel a plurality of times. Then, after the abnormal values are screened out so that only normal values remain, the screened diameters are clustered into 3 classes by the K-means algorithm, which completes the classification of each screened diameter. After the clustering of the screened diameters is completed, the average diameter of each class is calculated, and the maximum class average diameter and the minimum class average diameter can then be obtained from these average diameters.
The K-means algorithm is also called a fast clustering method. Its principle is as follows: select K points in the data set according to a certain strategy as the initial center of each cluster; assign each remaining data point to the cluster whose center is closest, thereby dividing the data into K clusters and completing one partition. Since the clusters so formed are not necessarily the optimal partition, the center point of each new cluster is recalculated and the data are partitioned again, and this is repeated until the partition result no longer changes between iterations.
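Steps S131c–S131d can be sketched with a plain one-dimensional K-means over the screened diameters; the quantile initialisation and the function names are implementation choices assumed here, not specified by the patent.

```python
import numpy as np

def class_average_diameters(diameters, k=3, iters=100):
    """Cluster the screened diameters into k classes with a simple 1-D
    K-means, then return the largest and smallest class-average diameter."""
    d = np.asarray(diameters, dtype=float)
    # Deterministic initialisation: spread initial centers over the data range.
    centers = np.quantile(d, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        # Assign every diameter to its nearest cluster center.
        labels = np.argmin(np.abs(d[:, None] - centers[None, :]), axis=1)
        new = np.array([d[labels == j].mean() if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):  # partition no longer changes
            break
        centers = new
    means = [d[labels == j].mean() for j in range(k) if np.any(labels == j)]
    return max(means), min(means)

# Three well-separated diameter groups around 1.0, 5.1 and 9.95.
d_max, d_min = class_average_diameters([1.0, 1.1, 0.9, 5.0, 5.2, 9.8, 10.1])
```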
In other embodiments of the present invention, as shown in fig. 7, step S131b includes sub-steps S131b1 and S131b2.
S131b1, acquiring the average diameter of each blood vessel and the variance of the diameter of each blood vessel;
S131b2, screening the diameters of each blood vessel according to the average diameter of each blood vessel and the variance of the diameter of each blood vessel to obtain the screened diameters.
In this embodiment, the average diameter of each blood vessel in the target blood vessel and the variance of the diameter of each blood vessel are calculated, and then the screening of the diameters of each blood vessel can be completed according to the average diameter and the variance.
The average diameter is calculated as:

d̄ = (1/n) Σ_i d_i

The variance of each vessel diameter is calculated as:

σ² = (1/n) Σ_i (d_i − d̄)²

When d_i > d̄ + kσ or d_i < d̄ − kσ, where k is a preset multiple of the standard deviation σ, the diameter d_i is rejected, which completes the screening of the diameters of each blood vessel in the target blood vessel.
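The screening of S131b1–S131b2 can be sketched as follows; the multiple k of the standard deviation used as the rejection threshold is an assumed parameter, since the concrete threshold in the source is given only as an image.

```python
import numpy as np

def screen_diameters(diameters, k=1.5):
    """Reject diameter measurements farther than k standard deviations
    from the mean diameter; k = 1.5 is an assumed value."""
    d = np.asarray(diameters, dtype=float)
    mu = d.mean()    # average diameter
    sigma = d.std()  # square root of the diameter variance
    keep = np.abs(d - mu) <= k * sigma
    return d[keep]

# The outlying measurement 9.0 is rejected; the consistent ones remain.
kept = screen_diameters([2.0, 2.1, 1.9, 2.0, 9.0])
```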
S140, inputting the characteristic diameter, the characteristic distortion quantitative value, the characteristic area ratio, the centroid eccentricity and the whole image density into a pre-trained classification model to complete classification of target blood vessels in the endoscope image, and further obtaining a classification result of the endoscope image.
The classification model is trained in advance and is used for classifying the target blood vessel in the endoscopic image. The classification model can be constructed by any one of logistic regression, support vector machine, extreme gradient boosting, decision tree, random forest and BP neural network, and the calculation formula of the linear weighted fitting is as follows:
ζ = λ_1·D + λ_2·C + λ_3·Δ + λ_4·e + λ_5·ρ

wherein D, C, Δ, e and ρ are respectively the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole map density; ζ is the blood vessel abnormality degree coefficient; and λ_1, λ_2, λ_3, λ_4 and λ_5 are the respective weights. λ_1 to λ_5 may be determined by a grid search method, a greedy search method, or the like, which is not particularly limited herein.
Specifically, after five index values of a target blood vessel in the endoscopic image are input into the classification model, an abnormal degree coefficient of the target blood vessel can be obtained, and then the abnormal grade of the target blood vessel is judged according to the numerical range of the abnormal degree coefficient of the target blood vessel, so that the endoscopic image can be classified.
For example, when the coefficient of the degree of abnormality of the target blood vessel is less than or equal to 0.35, the target blood vessel in the endoscopic image can be determined to be normal; when the abnormal degree coefficient of the target blood vessel is between 0.35 and 0.89, the target blood vessel in the endoscopic image can be judged to be in general abnormality; when the coefficient of the degree of abnormality of the target blood vessel is between 0.89 and 1.25, the target blood vessel in the endoscopic image can be judged to be abnormal; when the coefficient of the degree of abnormality of the target blood vessel is larger than 1.25, the target blood vessel in the endoscopic image can be judged to be very abnormal.
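For illustration, the linear weighted fit and the grading thresholds described above can be combined into one sketch; the equal weights are placeholders (the patent leaves λ_1 to λ_5 to be tuned by grid or greedy search), and the boundary handling at the threshold values is an assumption.

```python
def abnormality_grade(diameter, distortion, area_ratio, eccentricity, density,
                      weights=(0.2, 0.2, 0.2, 0.2, 0.2)):
    """Linear weighted fit of the five indices into an abnormality degree
    coefficient zeta, then grading by the thresholds quoted in the text.
    The equal weights are placeholders, not tuned values."""
    features = (diameter, distortion, area_ratio, eccentricity, density)
    zeta = sum(w * x for w, x in zip(weights, features))
    if zeta <= 0.35:
        grade = "normal"
    elif zeta <= 0.89:
        grade = "generally abnormal"
    elif zeta <= 1.25:
        grade = "abnormal"
    else:
        grade = "very abnormal"
    return zeta, grade

zeta, grade = abnormality_grade(1.0, 1.0, 1.0, 1.0, 1.0)  # zeta = 1.0
```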
In the method for classifying the capillary vessel image in the epithelial papilla provided by the embodiment of the invention, an endoscopic image of the capillary vessel in the epithelial papilla is input into a pre-trained neural network model to obtain an effective area of the endoscopic image; acquiring a target blood vessel region in the endoscope image from the effective region according to a connected domain algorithm; acquiring the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole map density of the target blood vessel in the target blood vessel region; and inputting the characteristic diameter, the characteristic distortion quantitative value, the characteristic area ratio, the centroid eccentricity and the whole image density into a pre-trained classification model to obtain a classification result of the endoscope image. According to the method, the accurate judgment of the esophageal cancer infiltration depth is realized through the quantitative evaluation of the abnormal degree of the capillary vessels in the esophageal epithelial papilla, the efficiency and the accuracy of the judgment of the esophageal cancer infiltration depth are improved, and meanwhile, a reliable basis is provided for clinical treatment decisions of patients with esophageal cancer.
Fig. 10 is a flowchart illustrating a specific application of the method for classifying an image of capillary vessels in an epithelial papilla according to an embodiment of the present invention. As can be seen from fig. 10, an esophageal staining magnified image of capillary vessels in an esophageal epithelial papilla, i.e., the endoscopic image, is obtained by an electronic staining endoscope; a vessel segmentation process is then performed on the endoscopic image to obtain a vessel segmentation map, i.e., the effective area of the endoscopic image; a single vessel, i.e., the target vessel, is then extracted from the effective area, while whole-map vessel distribution quantification and whole-map vessel density quantification, i.e., the two indexes of centroid eccentricity and whole map density of the target vessel, are obtained from the vessel segmentation map; vessel diameter quantification, vessel tortuosity quantification and vessel area proportion quantification are then performed on the target vessel and undesirable singular values are removed, so that the characteristic diameter, the characteristic tortuosity quantization value and the characteristic area ratio of the target vessel are obtained; finally, the five indexes for evaluating the target vessel are subjected to fitting classification by a machine learning method to obtain the classification of the capillary vessel image in the epithelial papilla, from which the degree of abnormality of the capillary vessels in the epithelial papilla can further be judged.
The embodiment of the invention also provides a device 100 for classifying the capillary vessel images in the epithelial papilla, which is used for executing any embodiment of the method for classifying the capillary vessel images in the epithelial papilla.
Specifically, referring to fig. 8, fig. 8 is a schematic block diagram of a classification apparatus 100 for an image of capillary vessels in epithelial papilla according to an embodiment of the present invention.
As shown in fig. 8, the apparatus 100 for classifying an image of capillary vessels in an epithelial papilla includes: a first input unit 110, a first acquisition unit 120, a second acquisition unit 130, and a second input unit 140.
The first input unit 110 is configured to input an endoscopic image of a capillary vessel in an epithelial papilla into a pre-trained neural network model, so as to obtain an effective region of the endoscopic image.
In another embodiment, the apparatus 100 for classifying an image of capillary vessels in an epithelial papilla further comprises: a third input unit and an update unit.
The third input unit is used for inputting the training samples into the neural network model to obtain the mean square error loss of the neural network model; and the updating unit is used for updating the network parameters of the neural network model according to the mean square error loss.
In another embodiment, the apparatus 100 for classifying an image of capillary vessels in an epithelial papilla further comprises: a decoding unit and an annotation unit.
The decoding unit is used for carrying out video decoding on the endoscope video of the capillary vessel in the epithelial papilla to obtain a video decoding image; and the marking unit is used for marking the video decoding image to obtain the training sample.
A first obtaining unit 120, configured to obtain a target blood vessel region in the endoscopic image from the effective region according to a connected domain algorithm.
In another embodiment, the first obtaining unit 120 includes: a traversing unit and a determining unit.
The traversal unit is used for traversing the effective region to obtain the connected domain area and the minimum circumscribed horizontal rectangle of the target blood vessel; and the determining unit is used for determining the target blood vessel region according to the connected domain area and the minimum circumscribed horizontal rectangle.
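The traversal the unit performs can be sketched as labeling the 4-connected domains of a binary vessel mask and recording each domain's area and minimum circumscribed horizontal rectangle; the BFS approach and the names below are illustrative assumptions (an equivalent routine from an image-processing library could equally be used).

```python
from collections import deque
import numpy as np

def connected_regions(mask):
    """Traverse a binary vessel mask, returning for each 4-connected
    domain its pixel area and its minimum circumscribed horizontal
    rectangle as (x, y, width, height)."""
    mask = np.asarray(mask, dtype=bool)
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    regions = []
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                # Breadth-first flood fill of one connected domain.
                q = deque([(sy, sx)])
                seen[sy, sx] = True
                pix = []
                while q:
                    y, x = q.popleft()
                    pix.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                ys = [p[0] for p in pix]
                xs = [p[1] for p in pix]
                regions.append({"area": len(pix),
                                "rect": (min(xs), min(ys),
                                         max(xs) - min(xs) + 1,
                                         max(ys) - min(ys) + 1)})
    return regions

regs = connected_regions([[1, 1, 0], [0, 1, 0], [0, 0, 1]])
```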
A second obtaining unit 130, configured to obtain a characteristic diameter, a characteristic tortuosity quantization value, a characteristic area ratio, a centroid eccentricity, and a whole map density of the target blood vessel in the target blood vessel region.
In another embodiment, the second obtaining unit 130 includes: the device comprises a first generation unit, a third acquisition unit, a fourth acquisition unit, a second generation unit and a fifth acquisition unit.
A first generating unit, configured to generate the characteristic diameter according to the maximum class average diameter and the minimum class average diameter of the target blood vessel; a third acquisition unit configured to acquire a plurality of tortuosity quantization values and a plurality of area ratio amounts of the target blood vessel, respectively; a fourth obtaining unit configured to obtain the characteristic distortion quantization value and the characteristic area ratio amount from the plurality of distortion quantization values and the plurality of area ratio amounts; the second generating unit is used for generating the centroid eccentricity according to the equivalent centroid of the effective area and the geometric center of the endoscope image; and the fifth acquisition unit is used for acquiring the whole image density according to a preset blood vessel whole image density formula.
In another embodiment, the apparatus 100 for classifying an image of capillary vessels in an epithelial papilla further comprises: the device comprises a sixth acquisition unit, a first screening unit, a clustering unit and a third generation unit.
A sixth acquiring unit configured to acquire a plurality of diameters of each blood vessel in the target blood vessel; the first screening unit is used for screening the plurality of diameters of each blood vessel to obtain screened diameters; the clustering unit is used for clustering the screened diameters according to a K-means algorithm to obtain a clustering result of each screened diameter; a third generating unit, configured to generate the maximum class average diameter and the minimum class average diameter according to the clustering result.
In another embodiment, the first screening unit includes: a seventh obtaining unit and a second screening unit.
A seventh acquiring unit configured to acquire the average diameter of each of the blood vessels and a variance of the diameter of each of the blood vessels; and the second screening unit is used for screening the diameters of each blood vessel according to the average diameter of each blood vessel and the variance of the diameter of each blood vessel to obtain the screened diameter.
The second input unit 140 is configured to input the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity, and the whole image density into a pre-trained classification model, so as to obtain a classification result of the endoscope image.
The classification apparatus 100 for the capillary vessel image in the epithelial papilla provided by the embodiment of the present invention is used for inputting the endoscopic image of capillary vessels in the epithelial papilla into a pre-trained neural network model to obtain an effective region of the endoscopic image; acquiring a target blood vessel region in the endoscope image from the effective region according to a connected domain algorithm; acquiring the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole map density of the target blood vessel in the target blood vessel region; and inputting the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole image density into a pre-trained classification model to obtain a classification result of the endoscope image.
Referring to fig. 9, fig. 9 is a schematic block diagram of a computer device according to an embodiment of the present invention.
Referring to fig. 9, the device 500 includes a processor 502, memory, and a network interface 505 connected by a system bus 501, where the memory may include a storage medium 503 and an internal memory 504.
The storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032, when executed, may cause the processor 502 to perform a method of classifying an image of capillary vessels within an epithelial papilla.
The processor 502 is used to provide computing and control capabilities that support the operation of the overall device 500.
The internal memory 504 provides an environment for the execution of the computer program 5032 in the non-volatile storage medium 503, and when the computer program 5032 is executed by the processor 502, the processor 502 can be enabled to execute the method for classifying the capillary vessel image in the epithelial papilla.
The network interface 505 is used for network communication, such as providing transmission of data information. Those skilled in the art will appreciate that the configuration shown in fig. 9 is a block diagram of only a portion of the configuration associated with aspects of the present invention and does not constitute a limitation of the apparatus 500 to which aspects of the present invention may be applied, and that a particular apparatus 500 may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
Wherein the processor 502 is configured to run the computer program 5032 stored in the memory to implement the following functions: inputting an endoscopic image of capillary vessels in an epithelial papilla into a pre-trained neural network model to obtain an effective area of the endoscopic image; acquiring a target blood vessel region in the endoscope image from the effective region according to a connected domain algorithm; acquiring the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole map density of the target blood vessel in the target blood vessel region; and inputting the characteristic diameter, the characteristic distortion quantitative value, the characteristic area ratio, the centroid eccentricity and the whole image density into a pre-trained classification model to obtain a classification result of the endoscope image.
Those skilled in the art will appreciate that the embodiment of the apparatus 500 shown in fig. 9 does not constitute a limitation on the specific construction of the apparatus 500, and in other embodiments, the apparatus 500 may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. For example, in some embodiments, the apparatus 500 may only include the memory and the processor 502, and in such embodiments, the structure and function of the memory and the processor 502 are the same as those of the embodiment shown in fig. 8, and are not repeated herein.
It should be understood that in the present embodiment, the processor 502 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
In another embodiment of the present invention, a computer storage medium is provided. The storage medium may be a nonvolatile computer-readable storage medium or a volatile storage medium. The storage medium stores a computer program 5032, wherein the computer program 5032 when executed by the processor 502 performs the steps of: inputting an endoscopic image of capillary vessels in an epithelial papilla into a pre-trained neural network model to obtain an effective area of the endoscopic image; acquiring a target blood vessel region in the endoscope image from the effective region according to a connected domain algorithm; acquiring the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole map density of the target blood vessel in the target blood vessel region; and inputting the characteristic diameter, the characteristic distortion quantitative value, the characteristic area ratio, the centroid eccentricity and the whole image density into a pre-trained classification model to obtain a classification result of the endoscope image.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses, devices and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. Those of ordinary skill in the art will appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be embodied in electronic hardware, computer software, or combinations of both, and that the components and steps of the examples have been described in a functional general in the foregoing description for the purpose of illustrating clearly the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus, device and method can be implemented in other ways. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only a logical division, and there may be other divisions when the actual implementation is performed, or units having the same function may be grouped into one unit, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may also be an electric, mechanical or other form of connection.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a storage medium. Based on such understanding, the technical solution of the present invention essentially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product stored in a storage medium and including instructions for causing a device 500 (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, or an optical disk.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (9)

1. A method of classifying an image of capillary vessels within an epithelial papilla, comprising:
inputting an endoscopic image of capillary vessels in an epithelial papilla into a pre-trained neural network model to obtain an effective area of the endoscopic image;
acquiring a target blood vessel region in the endoscope image from the effective region according to a connected domain algorithm;
acquiring the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole map density of the target blood vessel in the target blood vessel region;
inputting the characteristic diameter, the characteristic distortion quantitative value, the characteristic area ratio, the centroid eccentricity and the whole image density into a pre-trained classification model to obtain a classification result of the endoscope image;
wherein, the obtaining of the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole map density of the target blood vessel in the target blood vessel region comprises:
generating the characteristic diameter according to the maximum class average diameter and the minimum class average diameter of the target blood vessel;
respectively acquiring a plurality of tortuosity quantification values and a plurality of area occupation quantities of the target blood vessel; wherein, each area ratio quantity is calculated according to a preset area ratio formula;
calculating a plurality of distortion quantitative values according to a preset blood vessel distortion quantitative formula to obtain the characteristic distortion quantitative value;
calculating the area occupation ratios according to a preset geometric mean calculation formula to obtain the characteristic area occupation ratio;
generating the centroid eccentricity according to the equivalent centroid of the effective area and the geometric center of the endoscope image;
obtaining the whole image density according to a preset blood vessel whole image density formula;
the calculation formula of the characteristic diameter is as follows:
[characteristic diameter calculation formula]

wherein d_max is the maximum class average diameter and d_min is the minimum class average diameter;
the characteristic distortion quantization formula is:
[blood vessel distortion quantization formulas]

wherein C is the blood vessel distortion quantization value, n is the total number of pixel points on the inner and outer sides of the capillary wall, and l and w are respectively the length and width of the minimum circumscribed rectangle of the blood vessel;
the area ratio formula is as follows:
Δ_i = Σ_j d_j / (l × w)

wherein d_j is the vessel diameter at each pixel point on the vessel centerline, l and w are respectively the length and width of the minimum circumscribed rectangle of the blood vessel, and Δ_i is the area ratio of each part in the target blood vessel;
the geometric mean calculation formula is as follows:
Δ = (Δ_1 × Δ_2 × ⋯ × Δ_m)^(1/m)

wherein Δ is the characteristic area ratio and m is the number of area ratios;
the centroid eccentricity calculation formula is as follows:
x̄ = Σ_i S_i x_i / Σ_i S_i,  ȳ = Σ_i S_i y_i / Σ_i S_i,  e = sqrt( (x̄ − x_c)² + (ȳ − y_c)² )

wherein e is the centroid eccentricity, (x̄, ȳ) is the equivalent centroid of the effective area, (x_i, y_i) is the centroid of each blood vessel in the endoscopic image, S_i is the area of each blood vessel in the endoscopic image, (x_i, y_i) and S_i are obtained on the basis of the connected domain, and (x_c, y_c) = (W/2, H/2) is the geometric center of the endoscopic image;
the blood vessel whole map density formula is as follows:
ρ = Σ_i S_i / (W × H)

wherein S_i is the area of each blood vessel in the blood vessel map, which can be obtained through the connected domain, and W and H are respectively the width and height of the endoscopic image.
2. The method for classifying an image of capillary vessels in epithelial papilla according to claim 1, wherein before inputting the endoscopic image of capillary vessels in epithelial papilla into a pre-trained neural network model to obtain an effective region of the endoscopic image, the method further comprises:
inputting a training sample into the neural network model to obtain the mean square error loss of the neural network model;
and updating the network parameters of the neural network model according to the mean square error loss.
3. The method for classifying capillary vessel images within epithelial papilla according to claim 2, further comprising, before the inputting training samples into the neural network model to obtain the mean square error loss of the neural network model:
performing video decoding on the endoscope video of the capillary vessels in the epithelial papilla to obtain a video decoding image;
and labeling the video decoding image to obtain the training sample.
4. The method for classifying capillary vessel images within epithelial papilla according to claim 1, wherein the acquiring a target blood vessel region in the endoscopic image from the effective region according to a connected domain algorithm comprises:
traversing the effective region to obtain the connected domain area and the minimum circumscribed horizontal rectangle of the target blood vessel;
and determining the target blood vessel region according to the connected domain area and the minimum circumscribed horizontal rectangle.
5. The method for classifying capillary vessel images within an epithelial papilla according to claim 1, further comprising, before the generating the characteristic diameter from the maximum class average diameter and the minimum class average diameter of the target vessel:
obtaining a plurality of diameters of each of the target vessels;
screening the plurality of diameters of each blood vessel to obtain screened diameters;
clustering the screened diameters according to a K-means algorithm to obtain a clustering result of each screened diameter;
and generating the maximum class average diameter and the minimum class average diameter according to the clustering result.
6. The method for classifying capillary vessel images within an epithelial papilla according to claim 5, wherein the screening the plurality of diameters of each blood vessel to obtain the screened diameters comprises:
obtaining the average diameter of each blood vessel and the variance of the diameters of each blood vessel;
and screening the diameters of each blood vessel according to the average diameter and the variance of the diameters of each blood vessel to obtain the screened diameters.
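The screening step of claim 6 can be sketched as keeping only the diameters that lie within a fixed multiple of the standard deviation around the per-vessel mean. Illustrative only; the threshold k is an assumed parameter, not specified by the claim:

```python
import statistics

def screen_diameters(diams, k=1.5):
    """Keep only the diameters within k standard deviations of the
    per-vessel mean; outlier measurements are discarded."""
    mu = statistics.mean(diams)
    sigma = statistics.pstdev(diams)  # population standard deviation
    return [d for d in diams if abs(d - mu) <= k * sigma]
```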
7. A classification device for an image of capillary vessels in an epithelial papilla, comprising:
the first input unit is used for inputting an endoscopic image of capillary vessels in an epithelial papilla into a pre-trained neural network model to obtain an effective area of the endoscopic image;
the first acquisition unit is used for acquiring a target blood vessel region in the endoscope image from the effective region according to a connected domain algorithm;
the second acquisition unit is used for acquiring the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole map density of the target blood vessel in the target blood vessel region;
the second input unit is used for inputting the characteristic diameter, the characteristic distortion quantitative value, the characteristic area ratio, the centroid eccentricity and the whole image density into a pre-trained classification model to obtain a classification result of the endoscope image;
wherein acquiring the characteristic diameter, the characteristic distortion quantization value, the characteristic area ratio, the centroid eccentricity and the whole map density of the target blood vessel in the target blood vessel region comprises:
generating the characteristic diameter according to the maximum class average diameter and the minimum class average diameter of the target blood vessel;
respectively acquiring a plurality of distortion quantization values and a plurality of area ratios of the target blood vessel; wherein each area ratio is calculated according to a preset area ratio formula;
calculating the plurality of distortion quantization values according to a preset blood vessel distortion quantization formula to obtain the characteristic distortion quantization value;
calculating the plurality of area ratios according to a preset geometric mean calculation formula to obtain the characteristic area ratio;
generating the centroid eccentricity according to the equivalent centroid of the effective area and the geometric center of the endoscope image;
obtaining the whole image density according to a preset blood vessel whole image density formula;
the calculation formula of the characteristic diameter is:
[formula rendered as an image in the source: the characteristic diameter expressed in terms of the maximum class average diameter and the minimum class average diameter];
the characteristic distortion quantization formula is:
[formulas rendered as images in the source: the blood vessel distortion quantization value, where n is the total number of pixel points on the inner and outer sides of the capillary wall, and the two remaining symbols are the length and the width, respectively, of the minimum circumscribed rectangle of the blood vessel];
the area ratio formula is:
[formula rendered as an image in the source: the area ratio Δi of each part in the target blood vessel, expressed in terms of the blood vessel diameter at each pixel point on the blood vessel centerline and the length and the width of the minimum circumscribed rectangle of the blood vessel];
the geometric mean calculation formula is:
[formula rendered as an image in the source: the geometric mean of the area ratios Δi, yielding the characteristic area ratio Δ];
the centroid eccentricity calculation formula is:
[formulas rendered as images in the source]
wherein e is the centroid eccentricity, computed from the equivalent centroid of the effective area and the geometric center of the endoscopic image; the equivalent centroid is obtained from the centroid and the area of each blood vessel in the endoscopic image, both of which are obtained on the basis of the connected domains;
the blood vessel whole map density formula is:
[formula rendered as an image in the source]
wherein the blood vessel areas in the formula are the areas of the individual blood vessels in the blood vessel map, obtainable through the connected domains, and W, H are the width and the height, respectively, of the endoscopic image.
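The whole map density and centroid eccentricity quantities described above can be sketched as follows. The exact patent formulas appear only as images in the source, so this sketch assumes density = total blood vessel area / (W × H) and eccentricity = distance from the area-weighted equivalent centroid of all vessels to the geometric image center; both assumptions follow the surrounding prose, not the claim text itself:

```python
import math

def whole_map_density(vessel_areas, W, H):
    """Assumed form: total blood vessel area over the image area W*H."""
    return sum(vessel_areas) / (W * H)

def centroid_eccentricity(centroids, areas, W, H):
    """Assumed form: distance from the area-weighted equivalent centroid
    of all vessels (centroids/areas from the connected domains) to the
    geometric center of the W-by-H image."""
    total = sum(areas)
    cx = sum(a * x for (x, _), a in zip(centroids, areas)) / total
    cy = sum(a * y for (_, y), a in zip(centroids, areas)) / total
    return math.hypot(cx - W / 2, cy - H / 2)
```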
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of classifying an image of capillaries within an epithelial papilla according to any one of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to execute the method of classifying an image of capillary vessels within an epithelial papilla according to any one of claims 1 to 6.
CN202111479461.8A 2021-12-07 2021-12-07 Method, device, equipment and medium for classifying capillary vessel images in epithelial papilla Active CN113887677B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111479461.8A CN113887677B (en) 2021-12-07 2021-12-07 Method, device, equipment and medium for classifying capillary vessel images in epithelial papilla

Publications (2)

Publication Number Publication Date
CN113887677A CN113887677A (en) 2022-01-04
CN113887677B true CN113887677B (en) 2022-03-01

Family

ID=79015690


Country Status (1)

Country Link
CN (1) CN113887677B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114078128B (en) * 2022-01-20 2022-04-12 武汉大学 Medical image processing method, device, terminal and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017099804A1 (en) * 2015-12-11 2017-06-15 Hewlett-Packard Development Company, L.P. Density classifiers based on plane regions
CN109616195A (en) * 2018-11-28 2019-04-12 武汉大学人民医院(湖北省人民医院) The real-time assistant diagnosis system of mediastinum endoscopic ultrasonography image and method based on deep learning
CN113192064A (en) * 2021-05-27 2021-07-30 武汉楚精灵医疗科技有限公司 Esophageal cancer B3 type blood vessel identification method based on coefficient of variation method
CN113205492A (en) * 2021-04-26 2021-08-03 武汉大学 Microvessel distortion degree quantification method for gastric mucosa staining amplification imaging
CN113344859A (en) * 2021-05-17 2021-09-03 武汉大学 Method for quantifying capillary surrounding degree of gastric mucosa staining amplification imaging
CN113706533A (en) * 2021-10-28 2021-11-26 武汉大学 Image processing method, image processing device, computer equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190388060A1 (en) * 2018-06-22 2019-12-26 General Electric Company Imaging system and method with live examination completeness monitor

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"EFFECTUAL HUMAN AUTHENTICATION FOR CRITICAL SECURITY";L. Latha.et al;《ICTACT JOURNAL ON IMAGE AND VIDEO PROCESSING》;20101130;全文 *
"肠梗阻患者并发急性肾损伤的临床特点及影响因素分析";邓庆铃等;《胃肠病和肝病学杂志》;20210831;全文 *


Similar Documents

Publication Publication Date Title
JP7113916B2 (en) Methods and systems for utilizing quantitative imaging
CN106815481B (en) Lifetime prediction method and device based on image omics
US10176408B2 (en) Systems and methods for analyzing pathologies utilizing quantitative imaging
CN101996329B (en) Device and method for detecting blood vessel deformation area
CN108961273B (en) Method and system for segmenting pulmonary artery and pulmonary vein from CT image
CN102324109A (en) Method for three-dimensionally segmenting insubstantial pulmonary nodule based on fuzzy membership model
CN115965750B (en) Vascular reconstruction method, vascular reconstruction device, vascular reconstruction computer device, and vascular reconstruction program
CN110163872A (en) A kind of method and electronic equipment of HRMR image segmentation and three-dimensional reconstruction
CN113066583A (en) Aneurysm rupture risk prediction method, aneurysm rupture risk prediction device and storage medium
WO2015040990A1 (en) Disease analysis device, control method, and program
KR101135205B1 (en) A pulmonary vessel extraction method for automatical disease detection using chest ct images
Yadav et al. [Retracted] FVC‐NET: An Automated Diagnosis of Pulmonary Fibrosis Progression Prediction Using Honeycombing and Deep Learning
CN113887677B (en) Method, device, equipment and medium for classifying capillary vessel images in epithelial papilla
Purnama et al. Follicle detection on the usg images to support determination of polycystic ovary syndrome
Qian et al. Segmentation of the common carotid intima-media complex in ultrasound images using 2-D continuous max-flow and stacked sparse auto-encoder
CN115760858A (en) Kidney pathological section cell identification method and system based on deep learning
CN102509273B (en) Tumor segmentation method based on homogeneous pieces and fuzzy measure of breast ultrasound image
Sun et al. Kidney tumor segmentation based on FR2PAttU-Net model
Song et al. New morphological features for grading pancreatic ductal adenocarcinomas
CN113192067A (en) Intelligent prediction method, device, equipment and medium based on image detection
Kim et al. Image biomarkers for quantitative analysis of idiopathic interstitial pneumonia
CN111292309B (en) Method and device for judging degree of dissimilarity of lung tissue
Weber et al. Automatic identification of crossovers in cryo‐EM images of murine amyloid protein A fibrils with machine learning
Fiori et al. Automatic colon polyp flagging via geometric and texture features
Liu et al. Novel superpixel‐based algorithm for segmenting lung images via convolutional neural network and random forest

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant