CN113487613B - Pig intestine epithelium damage repair image identification method and system - Google Patents

Pig intestine epithelium damage repair image identification method and system

Info

Publication number
CN113487613B
CN113487613B (application CN202111046243.5A)
Authority
CN
China
Prior art keywords
edge
point
pixel point
schlieren
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111046243.5A
Other languages
Chinese (zh)
Other versions
CN113487613A (en)
Inventor
鲁慧杰
马现永
余苗
田志梅
容庭
邓盾
崔艺燕
刘志昌
李贞明
宋敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Animal Science of Guangdong Academy of Agricultural Sciences
Original Assignee
Institute of Animal Science of Guangdong Academy of Agricultural Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Animal Science of Guangdong Academy of Agricultural Sciences filed Critical Institute of Animal Science of Guangdong Academy of Agricultural Sciences
Priority to CN202111046243.5A priority Critical patent/CN113487613B/en
Publication of CN113487613A publication Critical patent/CN113487613A/en
Application granted granted Critical
Publication of CN113487613B publication Critical patent/CN113487613B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30028Colon; Small intestine

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a method and a system for recognizing porcine intestinal epithelial injury repair images. A TEM image of a pig small intestine section is obtained; the schlieren edges among the edge lines in the TEM image are identified and marked to obtain a schlieren edge set; unclear schlieren edges within the schlieren edge set are marked; the schlieren edges marked as unclear are repaired to obtain a repaired schlieren edge set; and the schlieren regions on the TEM image are marked according to the position of each repaired schlieren edge in the repaired schlieren edge set. The method addresses unclear boundaries between porcine intestinal epithelial cells and overlapping or adhering schlieren edges, improves the recognition rate and clarity of the schlieren edges, reduces errors in the subsequent statistical analysis of the schlieren edges and in the intestinal epithelial injury repair model constructed from them for animal experiments, and improves the accuracy of the statistical results.

Description

Pig intestine epithelium damage repair image identification method and system
Technical Field
The disclosure belongs to the technical fields of computer vision, image processing and data analysis, and particularly relates to a porcine intestinal epithelial injury repair image identification method and system.
Background
At present, the intestinal system presents the largest mucosal surface of the organism. It is formed by a single layer of continuously self-renewing intestinal epithelial cells, acts as a barrier between the intestinal microenvironment and the mucosal immune system, and to a certain extent protects the organism from toxic substances in the intestine and the external environment. The general workflow for sampling images is as follows: a section of pig small intestine is fixed with Bouin's fluid, embedded in paraffin, stained with hematoxylin-eosin (HE), and the section image is collected under a microscope. Finger-shaped protrusions formed by longitudinal sections of the small-intestinal villi are visible on the luminal surface of the small intestine, and the surface of these protrusions is covered by a layer of columnar epithelial cells. On the free surface of these cells (the surface facing the intestinal lumen), a thin red-stained layer is visible under the microscope; this special structure is called the striated border, referred to below as the schlieren edge. The schlieren edge is the appearance of the free surface of the small-intestinal absorptive epithelial cells under the light microscope, and under the electron microscope it is seen to be formed by densely and regularly arranged microvilli. The basal surface of the cell is connected to the basement membrane, below which lies connective tissue; the columnar cells are closely arranged and taller than they are wide, so the cell boundaries are not clear. Owing to illumination conditions, staining conditions and other factors, the collected images often suffer from overlapping and adhering schlieren edges. Overlap of multiple schlieren edges seriously affects subsequent statistical analysis and causes serious errors in the intestinal injury repair model constructed from the schlieren edges for animal experiments, so this is a problem that needs to be solved in the processing of pig intestine images.
Disclosure of Invention
The invention aims to provide a method and a system for recognizing porcine intestinal epithelial injury repair images, so as to solve one or more technical problems in the prior art and to provide at least one advantageous alternative or enabling condition.
In order to achieve the above object, according to an aspect of the present disclosure, there is provided a porcine intestinal epithelial injury repair image recognition method, including the steps of:
S100, acquiring a TEM image of a pig small intestine section;
S200, converting the TEM image to grayscale to obtain a grayscale image;
S300, performing boundary detection on the grayscale image with a watershed algorithm to obtain a plurality of edge lines;
S400, identifying and marking the schlieren edges among the edge lines to obtain a schlieren edge set;
S500, marking the unclear schlieren edges among the schlieren edges in the schlieren edge set;
S600, repairing the schlieren edges marked as unclear in the schlieren edge set into repaired schlieren edges, the repaired schlieren edges forming a repaired schlieren edge set;
and S700, marking the schlieren regions on the TEM image according to the position of each repaired schlieren edge in the repaired schlieren edge set.
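The patent gives no source for the graying and watershed steps S200 and S300. The following is only an illustrative sketch, assuming the OpenCV library (not named in the patent) and marker-based watershed segmentation; the function name detectEdgeLines and the marker-generation recipe (Otsu threshold plus distance transform) are assumptions for illustration, not the patent's method.

#include <opencv2/opencv.hpp>

// Sketch of S200-S300: grayscale conversion followed by watershed boundary
// detection. The returned mask is non-zero on the watershed boundary pixels,
// which serve as the candidate edge lines for S400.
cv::Mat detectEdgeLines(const cv::Mat& temImage)
{
    CV_Assert(temImage.type() == CV_8UC3);
    // S200: grayscale image
    cv::Mat gray;
    cv::cvtColor(temImage, gray, cv::COLOR_BGR2GRAY);
    // Rough foreground markers: Otsu threshold, then distance-transform peaks
    cv::Mat bin, dist, peaks;
    cv::threshold(gray, bin, 0, 255, cv::THRESH_BINARY | cv::THRESH_OTSU);
    cv::distanceTransform(bin, dist, cv::DIST_L2, 3);
    cv::normalize(dist, dist, 0.0, 1.0, cv::NORM_MINMAX);
    cv::threshold(dist, peaks, 0.4, 1.0, cv::THRESH_BINARY);
    peaks.convertTo(peaks, CV_8U, 255);
    // Label the markers and flood with the watershed; boundary pixels become -1
    cv::Mat markers;
    cv::connectedComponents(peaks, markers);
    cv::watershed(temImage, markers);
    // S300: edge lines are the watershed boundary pixels
    cv::Mat edges = cv::Mat::zeros(gray.size(), CV_8U);
    edges.setTo(255, markers == -1);
    return edges;
}

The resulting boundary mask could then be traced into the individual edge lines of the set B1 used in S401, for example with cv::findContours.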
Further, in S100, the TEM image of the pig small intestine section is acquired as follows: the pig small intestine section is fixed with Bouin's solution, embedded in paraffin, stained with hematoxylin-eosin, and imaged under a microscope to collect the section image as the TEM image, where the microscope is any microscope with a bright-field photographing function, such as an ordinary optical microscope, a total internal reflection fluorescence microscope or an ultraviolet microscope.
Further, in S400, the method for identifying and marking the schlieren edges among the edge lines to obtain the schlieren edge set is as follows:
Because a schlieren edge is a continuous edge line in the image and its gray values differ markedly from those of the surrounding pixel points, the schlieren edges are screened out by the following method;
S401, let the set of all edge lines be B1, B1 = {b1_i1}, where b1_i1 is the i1-th edge line in the set B1; set the initial value of i1 to 1, i1 ∈ [1, N1], where N1 is the number of elements in the set B1; and initialize the schlieren edge set as an empty set;
S402, let N2 be the number of pixel points on edge line b1_i1, and set the variable j1 to an initial value of 1, j1 ∈ [1, N2]; compute the average gray value HAVG1 of all pixel points on edge line b1_i1;
S403, let (x1, y1) be the coordinates of the j1-th pixel point on edge line b1_i1 and h(x1, y1) be the gray value of the pixel point at (x1, y1); search the eight-neighborhood of the pixel point at (x1, y1) for the point with the maximum gray value, denoted MAXNE;
S404, obtain a straight line L by connecting the pixel point at (x1, y1) with the point MAXNE; let H1 be the gray value of the pixel point one step beyond MAXNE along L in the direction from (x1, y1) to MAXNE, and let H2 be the gray value of the pixel point one step beyond (x1, y1) along L in the direction from MAXNE to (x1, y1); if H1 < H2, take the direction from the pixel point at (x1, y1) to MAXNE as the forward direction of L and the direction from MAXNE to the pixel point at (x1, y1) as the reverse direction of L; otherwise, take the direction from the pixel point at (x1, y1) to MAXNE as the reverse direction of L and the direction from MAXNE to the pixel point at (x1, y1) as the forward direction of L; (the gray values inside the schlieren edge are larger than those of the outer pixels);
S405, starting from the pixel point at (x1, y1), take the first pixel point whose gray value H3 satisfies the distinguishing condition when searching along L in the forward direction as point P1, and the first pixel point whose gray value H3 satisfies the distinguishing condition when searching along L in the reverse direction as point P2; if no pixel point satisfying the distinguishing condition exists in the forward or reverse direction of L, take the pixel points at the two intersections of L with the image boundary of the grayscale image (the image boundary being the outermost ring of pixels of the image) as points P1 and P2; let the Euclidean distance between the coordinates of P1 and (x1, y1) be the P1 distance and the Euclidean distance between the coordinates of P2 and (x1, y1) be the P2 distance; if the P1 distance is smaller than the P2 distance, take pixel point P1 as the comparison pixel point P3, otherwise take pixel point P2 as the comparison pixel point P3; compute the average gray value HAVG2 of all pixel points on the segment of L from the pixel point at (x1, y1) to P3; here the distinguishing condition is MinH(b1_i1) ≤ H3 ≤ MaxH(b1_i1), where MaxH(b1_i1) is the maximum and MinH(b1_i1) the minimum among the gray values of all pixel points of edge line b1_i1;
S406, if the gray value h(x1, y1) of the j1-th pixel point on edge line b1_i1 satisfies condition 1 in the forward direction of L or condition 2 in the reverse direction of L, mark the j1-th pixel point on edge line b1_i1 as a schlieren edge point; that is, the gray value h(x1, y1) needs to satisfy condition 1 or condition 2; (because the gray values of the pixels on the two sides of a schlieren edge differ greatly, points on the schlieren edge are identified by condition 1 or condition 2);
The condition 1 is:
[Condition 1: formula image in the original patent, not reproduced here.]
where m1 is the row index of point P3 in the grayscale image matrix and n1 is the column index of point P3 in the grayscale image matrix; U1 = min(x1, m1); V1 = min(y1, n1); u = min(x1, m1) + |x1 - m1|; v = min(y1, n1) + |y1 - n1|; min(x1, m1) is the smaller of x1 and m1; min(y1, n1) is the smaller of y1 and n1; h(i2, j2) is the gray value of the pixel point at coordinates (i2, j2), with i2 and j2 as variables;
The condition 2 is:
[Condition 2: formula image in the original patent, not reproduced here.]
S407, if j1 ≤ N2, increment j1 by 1 and go to step S403; otherwise, determine whether the number of pixel points on edge line b1_i1 marked as schlieren edge points exceeds the preset schlieren edge point threshold; if it does, mark edge line b1_i1 as a schlieren edge, add it to the schlieren edge set, and go to step S408; the schlieren edge point threshold is set to a value in [0.4, 0.8] times N2;
S408, if i1 ≤ N1, increment i1 by 1 and go to step S402; otherwise go to step S409;
and S409, the schlieren edge set is obtained.
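The per-pixel test of S406 depends on conditions 1 and 2, which are only available as formula images, so the sketch below treats that test as an abstract predicate. The names selectSchlierenEdges, isSchlierenPoint and ratio are illustrative; ratio stands for the threshold factor chosen from [0.4, 0.8] in S407, and the sketch only illustrates the bookkeeping of the S401-S409 loop.

#include <cstddef>
#include <functional>
#include <vector>

struct Pixel { int x; int y; };
using EdgeLine = std::vector<Pixel>;

// Sketch of S401-S409: keep an edge line as a schlieren edge when enough of
// its pixel points satisfy the (unreproduced) schlieren-point test, passed in
// here as an abstract predicate.
std::vector<EdgeLine> selectSchlierenEdges(
        const std::vector<EdgeLine>& edgeLines,                 // the set B1
        const std::function<bool(const Pixel&)>& isSchlierenPoint,
        double ratio = 0.6)                                      // factor in [0.4, 0.8]
{
    std::vector<EdgeLine> schlierenEdges;                        // schlieren edge set
    for (const EdgeLine& line : edgeLines) {                     // S401/S408: loop over b1_i1
        std::size_t hits = 0;
        for (const Pixel& p : line) {                            // S402/S407: loop over j1
            if (isSchlierenPoint(p)) ++hits;                     // condition 1 or condition 2
        }
        const double threshold = ratio * static_cast<double>(line.size());  // N2-based threshold
        if (static_cast<double>(hits) > threshold) {
            schlierenEdges.push_back(line);                      // S407: mark as schlieren edge
        }
    }
    return schlierenEdges;                                       // S409
}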
Further, in S500, the method for marking the unclear schlieren edges among the schlieren edges in the schlieren edge set is as follows:
Compute the clarity value CLE of each schlieren edge in the schlieren edge set, where CLE is calculated as:
[The CLE formula is given as an image in the original patent and is not reproduced here.]
where R(k2) is the gray guide value of the k2-th pixel point on the schlieren edge, k2 ∈ [1, N4], and N4 is the number of pixel points on the schlieren edge; the gray guide value R(k2) of the k2-th pixel point on the schlieren edge is calculated as:
[The R(k2) formula is given as an image in the original patent and is not reproduced here.]
where (x2, y2) are the coordinates of the k2-th pixel point on the schlieren edge, h(x2, y2) is the gray value of the pixel point at (x2, y2), i.e. the gray value of the k2-th pixel point on the schlieren edge, and h(x2-1, y2), h(x2+1, y2), h(x2, y2+1) and h(x2, y2-1) are the gray values of the pixel points at the coordinates (x2-1, y2), (x2+1, y2), (x2, y2+1) and (x2, y2-1) respectively;
Let CLE_ave be the arithmetic mean of the clarity values CLE of all schlieren edges, and mark every schlieren edge whose clarity value CLE < CLE_ave as an unclear schlieren edge in the schlieren edge set.
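The CLE formula and the gray guide value R(k2) appear only as images in the patent text. The sketch below is therefore built on two explicit assumptions: that R(k2) is the sum of absolute central differences of the gray values in the horizontal and vertical directions (consistent with the four neighbours listed above), and that CLE is the mean of R(k2) over the N4 pixel points of the edge. The GrayImage and Pixel types and the function names are illustrative only.

#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Illustrative grayscale image: h(x, y) is the gray value at column x, row y.
struct GrayImage {
    int width = 0;
    int height = 0;
    std::vector<std::uint8_t> data;
    int h(int x, int y) const {
        x = std::clamp(x, 0, width - 1);   // clamp so border pixels stay in range
        y = std::clamp(y, 0, height - 1);
        return data[static_cast<std::size_t>(y) * width + x];
    }
};

struct Pixel { int x; int y; };

// Assumed form of the gray guide value R(k2): sum of the absolute central
// differences around the pixel (the exact formula is an image in the patent).
static double grayGuideValue(const GrayImage& img, const Pixel& p)
{
    return std::abs(img.h(p.x - 1, p.y) - img.h(p.x + 1, p.y)) +
           std::abs(img.h(p.x, p.y - 1) - img.h(p.x, p.y + 1));
}

// Assumed form of the clarity value CLE: mean of R(k2) over the N4 pixel
// points of the schlieren edge. Edges whose CLE is below CLE_ave are then
// marked as unclear, as described in S500.
double clarityValue(const GrayImage& img, const std::vector<Pixel>& schlierenEdge)
{
    if (schlierenEdge.empty()) return 0.0;
    double sum = 0.0;
    for (const Pixel& p : schlierenEdge) sum += grayGuideValue(img, p);
    return sum / static_cast<double>(schlierenEdge.size());
}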
Further, in S600, the method for repairing the schlieren edges marked as unclear in the schlieren edge set into repaired schlieren edges and forming the repaired schlieren edge set from the repaired schlieren edges is as follows:
S601, let GMax be the maximum and GMin the minimum among the gray values of all pixel points of the unclear schlieren edge; compute the average gray value Gmean of all pixel points on the unclear schlieren edge;
S602, traverse each pixel point of the unclear schlieren edge in turn; when the gray value h(x3, y3) of the traversed pixel point at coordinates (x3, y3) lies in the range GMin ≤ h(x3, y3) < Gmean, then apply
[an enhancement formula given as an image in the original patent]
or
[an alternative enhancement formula given as an image in the original patent]
to convert the gray value of the pixel point at coordinates (x3, y3) on the unclear schlieren edge from h(x3, y3) to CH(x3, y3), where CH(x3, y3) is the enhanced gray value of the pixel point at (x3, y3);
when the gray value h(x4, y4) of the traversed pixel point at coordinates (x4, y4) lies in the range Gmean ≤ h(x4, y4) ≤ GMax, apply
[an enhancement formula given as an image in the original patent]
or
[an alternative enhancement formula given as an image in the original patent]
to convert the gray value of the pixel point at coordinates (x4, y4) on the unclear schlieren edge from h(x4, y4) to CH(x4, y4), where CH(x4, y4) is the enhanced value of h(x4, y4);
After every such (x3, y3) pixel point of the unclear schlieren edge has been processed into CH(x3, y3) and every such (x4, y4) pixel point into CH(x4, y4), a repaired schlieren edge is obtained, and the set formed by all repaired schlieren edges is taken as the repaired schlieren edge set.
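The two CH(·) enhancement formulas above are likewise present only as images. As a stand-in with the same piecewise structure — gray values below Gmean are pushed toward GMin and gray values at or above Gmean are pushed toward GMax, widening the contrast across the edge — the following sketch applies a simple quadratic stretch in each band; the function name and the exact stretch are assumptions, not the patent's formulas.

#include <cmath>

// Illustrative piecewise contrast stretch for one pixel of an unclear
// schlieren edge. gMin, gMean and gMax correspond to GMin, Gmean and GMax.
double enhanceGray(double h, double gMin, double gMean, double gMax)
{
    const double eps = 1e-9;
    if (gMax - gMin < eps) return h;                          // degenerate edge: leave unchanged
    if (h < gMean) {
        if (gMean - gMin < eps) return h;
        double t = (h - gMin) / (gMean - gMin);               // 0..1 within the lower band
        return gMin + (gMean - gMin) * t * t;                 // darken toward GMin (assumed form)
    }
    if (gMax - gMean < eps) return h;
    double t = (h - gMean) / (gMax - gMean);                  // 0..1 within the upper band
    return gMax - (gMax - gMean) * (1.0 - t) * (1.0 - t);     // brighten toward GMax (assumed form)
}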
Preferably, in S600, the method for repairing the schlieren edges marked as unclear in the schlieren edge set into repaired schlieren edges and forming the repaired schlieren edge set from the repaired schlieren edges may instead comprise the following steps:
Let GMax be the maximum and GMin the minimum among the gray values of all pixel points of the unclear schlieren edge; compute the average gray value Gmean of all pixel points on the unclear schlieren edge;
Mark every pixel point on the unclear schlieren edge whose gray value is greater than or equal to Gmean and less than or equal to GMax as an adhesion pixel point, and treat the remaining pixel points of the unclear schlieren edge as ordinary pixel points; traverse the ordinary pixel points of the unclear schlieren edge in turn, take each curve segment formed by consecutive ordinary pixel points separated by adhesion pixel points as a first curve segment, and mark every first curve segment whose two end points are more than 3 pixels apart in Euclidean distance as an ordinary curve segment; take the ordinary curve segment with the smallest Euclidean distance between its two end points as the threshold curve segment, and take the Euclidean distance between the two end points of the threshold curve segment as the screening threshold TH;
Traverse the adhesion pixel points of the unclear schlieren edge in turn, take each curve segment formed by consecutive adhesion pixel points separated by ordinary pixel points as a second curve segment, and mark every second curve segment whose two end points are more than TH apart in Euclidean distance as an adhesion curve segment;
Compute the similarity between the adhesion curve segment and each ordinary curve segment with a curve similarity algorithm; obtain and copy the ordinary curve segment with the highest similarity as the substitute line segment for the adhesion curve segment; let DS1 be the Euclidean distance between the two end points of the adhesion curve segment and DS2 the Euclidean distance between the two end points of the substitute line segment; take the straight line connecting the two end points of the adhesion curve segment as the first reference line LV1, the straight line connecting the two end points of the substitute line segment as the second reference line LV2, and the included angle between LV1 and LV2 as the angle AG1;
When DS1 equals DS2, rotate the substitute line segment as a whole by the angle AG1 in the direction from LV2 to LV1, translate the rotated substitute line segment so that its two end points coincide with the two end points of the adhesion curve segment, delete the adhesion curve segment, and let the substitute line segment replace the adhesion curve segment to obtain a repaired schlieren edge;
Rotating the substitute line segment as a whole by the angle AG1 in the direction from LV2 to LV1 specifically means: if rotating LV2 clockwise by the angle AG1 makes it coincide with LV1, rotate the substitute line segment clockwise by the angle AG1; if rotating LV2 counterclockwise by the angle AG1 makes it coincide with LV1, rotate the substitute line segment counterclockwise by the angle AG1;
When DS1 is smaller than DS2, take either of the two end points of the substitute line segment as the reference end point V1; search the substitute line segment for a pixel point V2 whose Euclidean distance from V1 is DS1 to obtain point V2; cut out the curve segment between end point V1 and point V2 on the substitute line segment as the first supplementary line segment; take the straight line connecting the two end points of the first supplementary line segment as the third reference line LV3 and the included angle between LV1 and LV3 as the angle AG2; rotate the first supplementary line segment as a whole by the angle AG2 in the direction from LV3 to LV1, translate it so that its two end points coincide with the two end points of the adhesion curve segment, and delete the adhesion curve segment, so that the first supplementary line segment replaces the adhesion curve segment and a repaired schlieren edge is obtained;
Rotating the first supplementary line segment as a whole by the angle AG2 in the direction from LV3 to LV1 specifically means: if rotating LV3 clockwise by the angle AG2 makes it coincide with LV1, rotate the first supplementary line segment clockwise by the angle AG2; if rotating LV3 counterclockwise by the angle AG2 makes it coincide with LV1, rotate the first supplementary line segment counterclockwise by the angle AG2;
When DS1 is greater than DS2, rotate the substitute line segment as a whole by the angle AG1 in the direction from LV2 to LV1, take either end point of the substitute line segment as the connecting end point, and translate the substitute line segment so that the connecting end point coincides with one end point of the adhesion curve segment while the Euclidean distance between the other end point of the substitute line segment and the adhesion curve segment is smaller than DS1, obtaining a spliced curve segment formed by the substitute line segment and the adhesion curve segment; at this point, because the substitute line segment is not long enough, its end point other than the connecting end point cannot coincide with the end point of the adhesion curve segment; take the end point of the substitute line segment other than the connecting end point within the spliced curve segment as the extension end point V3; take the end point of the adhesion curve segment within the spliced curve segment that is not connected to the substitute line segment as the end point to be connected V4; let DS3 be the Euclidean distance between the extension end point V3 and the end point to be connected V4; search the spliced curve segment for a pixel point V5 whose Euclidean distance from the extension end point V3 is DS3 to obtain point V5; copy the curve segment between point V3 and point V5 on the spliced curve segment as the second supplementary line segment; take the straight line connecting the two end points of the second supplementary line segment as the fourth reference line LV4, the straight line connecting the extension end point V3 and the end point to be connected V4 as the fifth reference line LV5, and the included angle between LV4 and LV5 as the angle AG3; rotate the second supplementary line segment as a whole by the angle AG3 in the direction from LV4 to LV5, and translate it so that one of its two end points coincides with the extension end point V3 and the other coincides with the end point to be connected V4, obtaining a repaired schlieren edge;
Rotating the second supplementary line segment as a whole by the angle AG3 in the direction from LV4 to LV5 specifically means: if rotating LV4 clockwise by the angle AG3 makes it coincide with LV5, rotate the second supplementary line segment clockwise by the angle AG3; if rotating LV4 counterclockwise by the angle AG3 makes it coincide with LV5, rotate the second supplementary line segment counterclockwise by the angle AG3;
preferably, the adhesion curve segment is also deleted after the repaired striated edge is obtained.
When the line segment is not replaced, connecting two end points of the adhesion curve segment on the textured edge to obtain a third supplemented line segment, deleting the adhesion curve segment, and replacing the adhesion curve segment with the third supplemented line segment to obtain a repaired textured edge;
and taking the set formed by all the repaired striae as a repaired striae set.
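Placing the substitute (or supplementary) line segment onto the adhesion curve segment amounts to a rigid two-dimensional transform: rotate the copied segment by the angle between its own chord and the chord of the adhesion curve segment, then translate it so that its end points land on the target end points. The sketch below covers the DS1 = DS2 case; the DS1 < DS2 and DS1 > DS2 cases apply the same transform to a truncated or spliced segment. The PointD type and the function name are illustrative.

#include <cmath>
#include <vector>

struct PointD { double x; double y; };

// Rotate 'segment' about its first end point by the signed angle between its
// own chord and the target chord (the angle AG above), then translate it so
// that its first end point coincides with targetStart.
std::vector<PointD> alignSubstituteSegment(std::vector<PointD> segment,
                                           PointD targetStart, PointD targetEnd)
{
    if (segment.size() < 2) return segment;
    const PointD a = segment.front();
    const PointD b = segment.back();
    // Signed angle from the substitute chord (a -> b) to the target chord.
    const double srcAngle = std::atan2(b.y - a.y, b.x - a.x);
    const double dstAngle = std::atan2(targetEnd.y - targetStart.y,
                                       targetEnd.x - targetStart.x);
    const double ag = dstAngle - srcAngle;            // angle AG with its rotation direction
    const double c = std::cos(ag), s = std::sin(ag);
    for (PointD& p : segment) {
        const double dx = p.x - a.x, dy = p.y - a.y;  // rotate about end point a
        p.x = targetStart.x + c * dx - s * dy;        // ...then translate onto targetStart
        p.y = targetStart.y + s * dx + c * dy;
    }
    return segment;
}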
The curve similarity algorithm comprises at least any one of the Fréchet distance algorithm and the Hausdorff distance matching algorithm.
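Of the two similarity measures named above, the Hausdorff distance is the simpler to sketch: the larger of the two directed distances, each being the maximum over one curve of the distance to the nearest point of the other, so a smaller distance means a higher similarity. The brute-force O(n·m) version below is sufficient for short curve segments; the names are illustrative and the patent does not prescribe this particular implementation.

#include <algorithm>
#include <cmath>
#include <limits>
#include <vector>

struct CurvePoint { double x; double y; };

static double pointDistance(const CurvePoint& a, const CurvePoint& b)
{
    return std::hypot(a.x - b.x, a.y - b.y);
}

// Directed Hausdorff distance: maximum over 'from' of the distance to the
// nearest point of 'to'.
static double directedHausdorff(const std::vector<CurvePoint>& from,
                                const std::vector<CurvePoint>& to)
{
    double worst = 0.0;
    for (const CurvePoint& p : from) {
        double best = std::numeric_limits<double>::max();
        for (const CurvePoint& q : to) best = std::min(best, pointDistance(p, q));
        worst = std::max(worst, best);
    }
    return worst;
}

// Symmetric Hausdorff distance between an adhesion curve segment and an
// ordinary curve segment; the ordinary curve segment with the smallest
// distance (highest similarity) is chosen as the substitute line segment.
double hausdorffDistance(const std::vector<CurvePoint>& curveA,
                         const std::vector<CurvePoint>& curveB)
{
    if (curveA.empty() || curveB.empty()) return std::numeric_limits<double>::max();
    return std::max(directedHausdorff(curveA, curveB),
                    directedHausdorff(curveB, curveA));
}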
The invention also provides a porcine intestinal epithelial injury repair image recognition system, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor executing the computer program to operate the following units of the system:
the TEM image acquisition unit is used for acquiring a TEM image of a pig small intestine section;
the image graying unit is used for graying the TEM image to obtain a grayscale image;
the boundary detection unit is used for carrying out boundary detection on the gray level image through a watershed algorithm to obtain a plurality of edge lines;
the schlieren edge identification unit is used for identifying and marking the schlieren edges among the edge lines to obtain a schlieren edge set;
the unclear-edge marking unit is used for marking the unclear schlieren edges among the schlieren edges in the schlieren edge set;
the schlieren edge repairing unit is used for repairing the schlieren edges marked as unclear in the schlieren edge set into repaired schlieren edges, the repaired schlieren edges forming a repaired schlieren edge set;
and the schlieren region marking unit is used for marking the schlieren regions on the TEM image according to the position of each repaired schlieren edge in the repaired schlieren edge set.
The beneficial effects of the disclosure are as follows: the invention provides a method and a system for recognizing porcine intestinal epithelial injury repair images, which address unclear boundaries between porcine intestinal epithelial cells and overlapping or adhering schlieren edges, improve the recognition rate and clarity of the schlieren edges, reduce errors in the subsequent statistical analysis and in the intestinal epithelial injury repair model for animal experiments, and improve the accuracy of the statistical results.
Drawings
The foregoing and other features of the present disclosure will become more apparent from the following detailed description of embodiments read in conjunction with the drawings, in which like reference characters designate the same or similar elements throughout the several views. It is apparent that the drawings described below are merely some examples of the present disclosure, and that other drawings may be derived from them by those skilled in the art without inventive effort. In the drawings:
FIG. 1 is a flow chart of an image recognition method for repairing damage of porcine intestinal epithelium;
fig. 2 is a structural diagram of an image recognition system for repairing damage of porcine intestinal epithelium.
Detailed Description
The conception, specific structure and technical effects of the present disclosure will be clearly and completely described below in conjunction with the embodiments and the accompanying drawings to fully understand the objects, aspects and effects of the present disclosure. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
Fig. 1 is a flowchart of the porcine intestinal epithelial injury repair image recognition method. The method according to an embodiment of the present invention is described below with reference to Fig. 1 and includes the following steps:
S100, acquiring a TEM image of a pig small intestine section;
S200, converting the TEM image to grayscale to obtain a grayscale image;
S300, performing boundary detection on the grayscale image with a watershed algorithm to obtain a plurality of edge lines;
S400, identifying and marking the schlieren edges among the edge lines to obtain a schlieren edge set;
S500, marking the unclear schlieren edges among the schlieren edges in the schlieren edge set;
S600, repairing the schlieren edges marked as unclear in the schlieren edge set into repaired schlieren edges, the repaired schlieren edges forming a repaired schlieren edge set;
and S700, marking the schlieren regions on the TEM image according to the position of each repaired schlieren edge in the repaired schlieren edge set.
Further, in S100, the TEM image of the pig small intestine section is acquired as follows: the pig small intestine section is fixed with Bouin's solution, embedded in paraffin, stained with hematoxylin-eosin, and imaged under a microscope to collect the section image as the TEM image, where the microscope is any microscope with a bright-field photographing function, such as an ordinary optical microscope, a total internal reflection fluorescence microscope or an ultraviolet microscope.
Further, in S400, the method for identifying and marking the schlieren edges among the edge lines to obtain the schlieren edge set is as follows:
Because a schlieren edge is a continuous edge line in the image and its gray values differ markedly from those of the surrounding pixel points, the schlieren edges are screened out by the following method;
S401, let the set of all edge lines be B1, B1 = {b1_i1}, where b1_i1 is the i1-th edge line in the set B1; set the initial value of i1 to 1, i1 ∈ [1, N1], where N1 is the number of elements in the set B1; and initialize the schlieren edge set as an empty set;
S402, let N2 be the number of pixel points on edge line b1_i1, and set the variable j1 to an initial value of 1, j1 ∈ [1, N2]; compute the average gray value HAVG1 of all pixel points on edge line b1_i1;
S403, let (x1, y1) be the coordinates of the j1-th pixel point on edge line b1_i1 and h(x1, y1) be the gray value of the pixel point at (x1, y1); search the eight-neighborhood of the pixel point at (x1, y1) for the point with the maximum gray value, denoted MAXNE;
S404, obtain a straight line L by connecting the pixel point at (x1, y1) with the point MAXNE; let H1 be the gray value of the pixel point one step beyond MAXNE along L in the direction from (x1, y1) to MAXNE (if that pixel point lies on the image boundary or does not exist, take the gray value of any pixel point in the eight-neighborhood of MAXNE as H1), and let H2 be the gray value of the pixel point one step beyond (x1, y1) along L in the direction from MAXNE to (x1, y1) (if that pixel point lies on the image boundary or does not exist, take the gray value of any pixel point in the eight-neighborhood of (x1, y1) as H2); if H1 < H2, take the direction from the pixel point at (x1, y1) to MAXNE as the forward direction of L and the direction from MAXNE to the pixel point at (x1, y1) as the reverse direction of L; otherwise, take the direction from the pixel point at (x1, y1) to MAXNE as the reverse direction of L and the direction from MAXNE to the pixel point at (x1, y1) as the forward direction of L; (the gray values inside the schlieren edge are larger than those of the outer pixels);
S405, starting from the pixel point at (x1, y1), take the first pixel point whose gray value H3 satisfies the condition MinH(b1_i1) ≤ H3 ≤ MaxH(b1_i1) when searching along L in the forward direction as point P1, and the first pixel point whose gray value H3 satisfies the condition MinH(b1_i1) ≤ H3 ≤ MaxH(b1_i1) when searching along L in the reverse direction as point P2; if no pixel point satisfying the condition MinH(b1_i1) ≤ H3 ≤ MaxH(b1_i1) exists in the forward or reverse direction, take the pixel points at the two intersections of the forward and reverse directions of L with the image boundary of the grayscale image as P1 and P2; let the Euclidean distance between the coordinates of P1 and (x1, y1) be the P1 distance and the Euclidean distance between the coordinates of P2 and (x1, y1) be the P2 distance; if the P1 distance is smaller than the P2 distance, take pixel point P1 as the comparison pixel point P3, otherwise take pixel point P2 as the comparison pixel point P3; compute the average gray value HAVG2 of all pixel points on the segment of L from the pixel point at (x1, y1) to P3; where MinH(b1_i1) is the minimum and MaxH(b1_i1) the maximum among the gray values of all pixel points of edge line b1_i1;
S406, if the gray value h(x1, y1) of the j1-th pixel point on edge line b1_i1 satisfies condition 1 in the forward direction of L or condition 2 in the reverse direction of L, mark the j1-th pixel point on edge line b1_i1 as a schlieren edge point; that is, the gray value h(x1, y1) needs to satisfy condition 1 or condition 2;
The condition 1 is:
[Condition 1: formula image in the original patent, not reproduced here.]
where m1 is the row index of point P3 in the grayscale image matrix and n1 is the column index of point P3 in the grayscale image matrix; U1 = min(x1, m1); V1 = min(y1, n1); u = min(x1, m1) + |x1 - m1|; v = min(y1, n1) + |y1 - n1|; min(x1, m1) is the smaller of x1 and m1; min(y1, n1) is the smaller of y1 and n1; h(i2, j2) is the gray value of the pixel point at coordinates (i2, j2), with i2 and j2 as variables;
The condition 2 is:
[Condition 2: formula image in the original patent, not reproduced here.]
S407, if j1 ≤ N2, increment j1 by 1 and go to step S403; otherwise, determine whether the number of pixel points on edge line b1_i1 marked as schlieren edge points exceeds the preset schlieren edge point threshold; if it does, mark edge line b1_i1 as a schlieren edge, add it to the schlieren edge set, and go to step S408; the schlieren edge point threshold is set to a value in [0.4, 0.8] times N2;
S408, if i1 ≤ N1, increment i1 by 1 and go to step S402; otherwise go to step S409;
and S409, the schlieren edge set is obtained.
Further, in S500, the method for marking the unclear schlieren edges among the schlieren edges in the schlieren edge set is as follows:
Compute the clarity value CLE of each schlieren edge in the schlieren edge set, where CLE is calculated as:
[The CLE formula is given as an image in the original patent and is not reproduced here.]
where R(k2) is the gray guide value of the k2-th pixel point on the schlieren edge, k2 ∈ [1, N4], and N4 is the number of pixel points on the schlieren edge; the gray guide value R(k2) of the k2-th pixel point on the schlieren edge is calculated as:
[The R(k2) formula is given as an image in the original patent and is not reproduced here.]
where (x2, y2) are the coordinates of the k2-th pixel point on the schlieren edge, h(x2, y2) is the gray value of the pixel point at (x2, y2), i.e. the gray value of the k2-th pixel point on the schlieren edge, and h(x2-1, y2), h(x2+1, y2), h(x2, y2+1) and h(x2, y2-1) are the gray values of the pixel points at the coordinates (x2-1, y2), (x2+1, y2), (x2, y2+1) and (x2, y2-1) respectively;
Let CLE_ave be the arithmetic mean of the clarity values CLE of all schlieren edges, and mark every schlieren edge whose clarity value CLE < CLE_ave as an unclear schlieren edge in the schlieren edge set.
Further, in S600, the method for repairing the schlieren edges marked as unclear in the schlieren edge set into repaired schlieren edges and forming the repaired schlieren edge set from the repaired schlieren edges is as follows:
S601, let GMax be the maximum and GMin the minimum among the gray values of all pixel points of the unclear schlieren edge; compute the average gray value Gmean of all pixel points on the unclear schlieren edge;
S602, traverse each pixel point of the unclear schlieren edge in turn; when the gray value h(x3, y3) of the traversed pixel point at coordinates (x3, y3) lies in the range GMin ≤ h(x3, y3) < Gmean, then apply
[an enhancement formula given as an image in the original patent]
or
[an alternative enhancement formula given as an image in the original patent]
to convert the gray value of the pixel point at coordinates (x3, y3) on the unclear schlieren edge from h(x3, y3) to CH(x3, y3), where CH(x3, y3) is the enhanced gray value of the pixel point at (x3, y3);
when the gray value h(x4, y4) of the traversed pixel point at coordinates (x4, y4) lies in the range Gmean ≤ h(x4, y4) ≤ GMax, apply
[an enhancement formula given as an image in the original patent]
or
[an alternative enhancement formula given as an image in the original patent]
to convert the gray value of the pixel point at coordinates (x4, y4) on the unclear schlieren edge from h(x4, y4) to CH(x4, y4), where CH(x4, y4) is the enhanced value of h(x4, y4);
After every such (x3, y3) pixel point of the unclear schlieren edge has been processed into CH(x3, y3) and every such (x4, y4) pixel point into CH(x4, y4), a repaired schlieren edge is obtained, and the set formed by all repaired schlieren edges is taken as the repaired schlieren edge set.
Preferably, in S600, the method for repairing the schlieren edges marked as unclear in the schlieren edge set into repaired schlieren edges and forming the repaired schlieren edge set from the repaired schlieren edges may instead comprise the following steps:
Let GMax be the maximum and GMin the minimum among the gray values of all pixel points of the unclear schlieren edge; compute the average gray value Gmean of all pixel points on the unclear schlieren edge;
Mark every pixel point on the unclear schlieren edge whose gray value is greater than or equal to Gmean and less than or equal to GMax as an adhesion pixel point, and treat the remaining pixel points of the unclear schlieren edge as ordinary pixel points; traverse the ordinary pixel points of the unclear schlieren edge in turn, take each curve segment formed by consecutive ordinary pixel points separated by adhesion pixel points as a first curve segment, and mark every first curve segment whose two end points are more than 3 pixels apart in Euclidean distance as an ordinary curve segment; take the ordinary curve segment with the smallest Euclidean distance between its two end points as the threshold curve segment, and take the Euclidean distance between the two end points of the threshold curve segment as the screening threshold TH;
Traverse the adhesion pixel points of the unclear schlieren edge in turn, take each curve segment formed by consecutive adhesion pixel points separated by ordinary pixel points as a second curve segment, and mark every second curve segment whose two end points are more than TH apart in Euclidean distance as an adhesion curve segment;
Compute the similarity between the adhesion curve segment and each ordinary curve segment with a curve similarity algorithm; obtain the ordinary curve segment with the highest similarity as the substitute line segment for the adhesion curve segment; let DS1 be the Euclidean distance between the two end points of the adhesion curve segment and DS2 the Euclidean distance between the two end points of the substitute line segment; take the straight line connecting the two end points of the adhesion curve segment as the first reference line LV1, the straight line connecting the two end points of the substitute line segment as the second reference line LV2, and the included angle between LV1 and LV2 as the angle AG1;
When DS1 equals DS2, rotate the substitute line segment as a whole by the angle AG1 in the direction from LV2 to LV1, connect the two end points of the substitute line segment to the corresponding two end points of the adhesion curve segment, delete the adhesion curve segment, and let the substitute line segment replace the adhesion curve segment to obtain a repaired schlieren edge;
When DS1 is smaller than DS2, take either of the two end points of the substitute line segment as the reference end point V1; search the substitute line segment for a pixel point V2 whose Euclidean distance from V1 is DS1 to obtain point V2; cut out the curve segment between end point V1 and point V2 on the substitute line segment as the first supplementary line segment; take the straight line connecting the two end points of the first supplementary line segment as the third reference line LV3 and the included angle between LV1 and LV3 as the angle AG2; rotate the first supplementary line segment as a whole by the angle AG2 in the direction from LV3 to LV1, connect its two end points to the corresponding two end points of the adhesion curve segment, and delete the adhesion curve segment, so that the first supplementary line segment replaces the adhesion curve segment and a repaired schlieren edge is obtained;
When DS1 is greater than DS2, rotate the substitute line segment as a whole by the angle AG1 in the direction from LV2 to LV1, take either end point of the substitute line segment as the connecting end point, and connect the connecting end point of the substitute line segment to either of the two end points of the adhesion curve segment, obtaining a spliced curve segment formed by the substitute line segment and the adhesion curve segment; at this point, because the substitute line segment is not long enough, its end point other than the connecting end point cannot be connected to the end point of the adhesion curve segment; take the end point of the substitute line segment other than the connecting end point within the spliced curve segment as the extension end point V3; take the end point of the adhesion curve segment within the spliced curve segment that is not connected to the substitute line segment as the end point to be connected V4; let DS3 be the Euclidean distance between the extension end point V3 and the end point to be connected V4; search the spliced curve segment for a pixel point V5 whose Euclidean distance from the extension end point V3 is DS3 to obtain point V5; copy the curve segment between point V3 and point V5 on the spliced curve segment as the second supplementary line segment; take the straight line connecting the two end points of the second supplementary line segment as the fourth reference line LV4, the straight line connecting the extension end point V3 and the end point to be connected V4 as the fifth reference line LV5, and the included angle between LV4 and LV5 as the angle AG3; rotate the second supplementary line segment as a whole by the angle AG3 in the direction from LV4 to LV5, and connect one of its two end points to the extension end point V3 and the other to the end point to be connected V4, obtaining a repaired schlieren edge;
Preferably, the adhesion curve segment is also deleted after the repaired schlieren edge is obtained.
When no substitute line segment is available, connect the two end points of the adhesion curve segment on the schlieren edge to obtain a repair straight line segment, delete the adhesion curve segment, and replace the adhesion curve segment with this straight line segment to obtain a repaired schlieren edge;
The set formed by all the repaired schlieren edges is taken as the repaired schlieren edge set.
The curve similarity algorithm comprises at least any one of the Fréchet distance algorithm and the Hausdorff distance matching algorithm.
Preferably, in an embodiment of the present disclosure, part of the key C++ source code of the porcine intestinal epithelial injury repair image identification method is as follows:
// mark the unclear schlieren edges among the schlieren edges in the schlieren edge set;
DrawEdgeCurve::DrawEdgeCurve(CP2 *P,int ptNum)
{ for(int i=0;i<ptNum;i++)
{this->P[i]=P[i];
}n=ptNum-1;
};
void DrawEdgeCurve::Draw(CDC*pDC)
{
CPen NewPen,*pOldPen;
NewPen.CreatePen(PS_SOLID,1,RGB(0,0,255));
pOldPen=pDC->SelectObject(&NewPen);
pDC->MoveTo(ROUND(P[0].x),ROUND(P[0].y));
double tStep=0.01;
for(double t=0.0;t<=1.0;t=t+tStep)
{
double x=0.0,y=0.0;
for(int i=0;i<=n;i++)
{
x+=P[i].x*R(n,i)*h(t,i)*h(1-t,n-i);
y+=P[i].y*R(n,i)*h(t,i)*h(1-t,n-i);
}
pDC->LineTo(ROUND(x),ROUND(y));
};
pDC->SelectObject(pOldPen);
NewPen.DeleteObject();
};
// sequentially traverse the adhesion pixel points on the unclear schlieren edge;
double DrawEdgeCurve::R(const int&n,const int&i)
{
return(Grayvalue(n)/(Grayvalue(i)*Grayvalue(n-i)));
};
int DrawEdgeCurve::Grayvalue(int n)
{
int Grayvalue;
if(n==0||n==1)
{
Grayvalue=1;
}
else
Grayvalue=n*Grayvalue(n-1);
return Grayvalue;
};
// repair the schlieren edges marked as unclear in the schlieren edge set
void DrawEdgeCurve::DrawControlPolygon(CDC *pDC)
{
CBrush NewBrush,*pOldBrush;
pOldBrush=(CBrush*)pDC->SelectStockObject(BLACK_BRUSH);
pDC->MoveTo(ROUND(P[0].x),ROUND(P[0].y));
for(int i=0;i<=n;i++)
{ pDC->LineTo(ROUND(P[i].x),ROUND(P[i].y));
pDC->Rectangle(ROUND(P[i].x)-5,ROUND(P[i].y)-5,ROUND(P[i].x)+5,ROUND(P[i].y)+5); // small square marker at each control point (CDC::Rectangle assumed; the original Fix is not an MFC method)
}
pDC->SelectObject(pOldBrush);
}
An embodiment of the present disclosure provides a porcine intestinal epithelial injury repair image recognition system. Fig. 2 is a structural diagram of the porcine intestinal epithelial injury repair image recognition system of the present disclosure. The system of this embodiment comprises: a processor, a memory and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the porcine intestinal epithelial injury repair image recognition method embodiment described above.
The system comprises: a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor executing the computer program to operate the following units of the system:
the TEM image acquisition unit is used for acquiring a TEM image of a pig small intestine section;
the image graying unit is used for graying the TEM image to obtain a grayscale image;
the boundary detection unit is used for carrying out boundary detection on the gray level image through a watershed algorithm to obtain a plurality of edge lines;
the schlieren edge identification unit is used for identifying and marking the schlieren edges among the edge lines to obtain a schlieren edge set;
the unclear-edge marking unit is used for marking the unclear schlieren edges among the schlieren edges in the schlieren edge set;
the schlieren edge repairing unit is used for repairing the schlieren edges marked as unclear in the schlieren edge set into repaired schlieren edges, the repaired schlieren edges forming a repaired schlieren edge set;
and the schlieren region marking unit is used for marking the schlieren regions on the TEM image according to the position of each repaired schlieren edge in the repaired schlieren edge set.
The porcine intestinal epithelial injury repair image recognition system can run on computing devices such as desktop computers, notebook computers, palmtop computers and cloud servers. The device running it includes, but is not limited to, a processor and a memory. It will be understood by those skilled in the art that this is merely an example of the porcine intestinal epithelial injury repair image recognition system and does not limit it; the system may include more or fewer components than shown, may combine certain components, or may use different components; for example, it may further include input/output devices, network access devices, a bus, and the like.
The Processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor or any conventional processor. The processor is the control center of the porcine intestinal epithelial injury repair image recognition system and uses various interfaces and lines to connect the parts of the whole system.
The memory may be used to store the computer program and/or modules, and the processor implements the various functions of the porcine intestinal epithelial injury repair image recognition system by running or executing the computer program and/or modules stored in the memory and by calling the data stored in the memory. The memory may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required by at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to the use of the device (such as audio data or a phone book). In addition, the memory may include high-speed random access memory and may also include non-volatile memory, such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
Although the description of the present disclosure is rather detailed and several illustrated embodiments have been described in particular, it is not intended to be limited to such details or embodiments or to any particular embodiment, and the description is to be construed as effectively covering the intended scope of the disclosure. Furthermore, the foregoing describes the disclosure in terms of embodiments foreseen by the inventor for which an enabling description was available, notwithstanding that insubstantial modifications of the disclosure not presently foreseen may nonetheless represent equivalents thereto.

Claims (6)

1. A porcine intestinal epithelial injury repair image identification method, characterized by comprising the following steps:
S100, acquiring a TEM image of a pig small intestine section;
S200, converting the TEM image to grayscale to obtain a grayscale image;
S300, performing boundary detection on the grayscale image with a watershed algorithm to obtain a plurality of edge lines;
S400, identifying and marking the schlieren edges among the edge lines to obtain a schlieren edge set;
S500, marking the unclear schlieren edges among the schlieren edges in the schlieren edge set;
S600, repairing the schlieren edges marked as unclear in the schlieren edge set into repaired schlieren edges, the repaired schlieren edges forming a repaired schlieren edge set;
S700, marking the schlieren regions on the TEM image according to the position of each repaired schlieren edge in the repaired schlieren edge set;
the method for identifying and marking the strip edges in each edge line to obtain the strip edge set comprises the following steps:
s401, let the set of all edge lines be B1, B1= { B1i1Wherein, b1i1Is the i1 th edge line in the set B1; setting the initial value of i1 to be 1, i1 epsilon [1, N1]N1 is the number of elements in the set B1, and an empty set is set as a strip edge set;
s402, acquiring an edge line b1i1The number of upper pixel points is N2, the variable j1 is set to have an initial value of 1, j1 ∈ [1, N2 ]](ii) a Calculate edge line b1i1Average gray values of all the pixel points are HAVG 1;
s403, if the edge line b1i1The coordinates of the upper j1 th pixel point are (x1, y1), h (x1, y1) is the gray value of the pixel point at the coordinates (x1, y1), and the pixel value or the point with the maximum gray value in the eight neighborhoods of the pixel point at (x1, y1) is searched to obtain a point MAXNE;
s404, obtaining a straight line L by connecting a pixel point at (x1, y1) with a point MAXNE, setting a gray value of a straight line L from a pixel point at coordinates (x1, y1) to a next pixel point of a pixel point at the point MAXNE along the straight line L as H1, and setting a gray value of a straight line L from the point MAXNE to a next pixel point of a pixel point at coordinates (x1, y1) along the straight line L from the point MAXNE to the pixel point at coordinates (x1, y1) as H2; if H1 is less than H2, the direction from the pixel point at the coordinates (x1, y1) to the point MAXNE on the straight line L is made to be the forward direction of L, and the direction from the pixel point at the coordinates (x1, y1) to the pixel point at the coordinates (x1, y1) on the straight line L is made to be the reverse direction of L, otherwise, the direction from the pixel point at the coordinates (x1, y1) to the point MAXNE on the straight line L is made to be the reverse direction of L, and the pixel point at the coordinates (x1, y1) on the straight line L is made to be the forward direction of L;
s405, starting from the pixel point at the coordinate (x1, y1), taking the first pixel point of which the gray value H3 of the forward search pixel point along the straight line L meets the distinguishing condition as a point P1, and taking the first pixel point of which the gray value H3 of the reverse search pixel point along the straight line L meets the distinguishing condition as a point P2; if the straight line L does not have a pixel point meeting the distinguishing condition in the forward direction or the reverse direction, taking the pixel points of two intersection points of the straight line L and the image boundary of the gray image as a point P1 and a point P2; let the euclidean distance value between the coordinate of the point P1 and the coordinate (x1, y1) be the P1 distance, the euclidean distance value between the coordinate of the point P2 and the coordinate (x1, y1) be the P2 distance, if the P1 distance is less than the P2 distance, the pixel point P1 is taken as the comparison pixel point P3, otherwise, the pixel point P2 is taken as the comparison pixel point P3; calculating the average value HAVG2 of all pixel points on a segment from the pixel point at the coordinate (x1, y1) to P3 on the straight line L; wherein the distinguishing condition is MinH (b 1)i1)≤H3≤MaxH(b1i1) ,MaxH(b1i1) Is edge line b1i1The maximum gray value, MinH (b 1), among the gray values of all the pixelsi1) Is edge line b1i1The minimum gray value in the gray values of all the pixel points is obtained;
S406, if the gray value h(x1, y1) of the j1-th pixel point on the edge line b1_i1 satisfies condition 1 in the forward direction of the straight line L or satisfies condition 2 in the reverse direction of the straight line L, mark the j1-th pixel point on the edge line b1_i1 as a schlieren edge point; that is, as long as h(x1, y1) satisfies either condition 1 or condition 2, the j1-th pixel point on the edge line b1_i1 is marked as a schlieren edge point;
the condition 1 is:
[Condition 1 is given only as a formula image in this text record; its terms are defined by the variables below.]
where m1 is the row number of the point P3 in the image matrix of the grayscale image and n1 is the column number of the point P3 in the image matrix of the grayscale image; u1 = min(x1, m1); v1 = min(y1, n1); u = min(x1, m1) + |x1 − m1|, that is, the larger of x1 and m1; v = min(y1, n1) + |y1 − n1|, that is, the larger of y1 and n1; min(x1, m1) is the smaller of x1 and m1; min(y1, n1) is the smaller of y1 and n1; h(i2, j2) is the gray value of the pixel point at the coordinates (i2, j2), and i2 and j2 are variables;
the condition 2 is:
[Condition 2 is likewise given only as a formula image in this text record.]
S407, if j1 ≤ N2, increment the value of j1 by 1 and go to step S403; otherwise, determine whether the number of pixel points on the edge line b1_i1 that are marked as schlieren edge points exceeds the set schlieren edge point threshold, and if so, mark the edge line b1_i1 as a schlieren edge, add it to the schlieren edge set, and go to step S408; the schlieren edge point threshold is set to [0.4, 0.8] times N2;
S408, if i1 ≤ N1, increment the value of i1 by 1 and go to step S402; otherwise go to step S409;
S409, obtain the schlieren edge set composed of all the marked schlieren edges.
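As an illustration only, the following Python sketch mirrors the loop structure of steps S401 to S409. Because conditions 1 and 2 are available here only as formula images, the per-pixel test of steps S403 to S406 is abstracted into a caller-supplied predicate; the function name, the predicate signature and the default threshold ratio are assumptions, not part of the claim.

    import numpy as np

    def identify_schlieren_edges(gray, edge_lines, is_schlieren_point, threshold_ratio=0.4):
        # gray: 2-D array of gray values indexed as gray[x, y]
        # edge_lines: list of edge lines b1_i1, each a list of (x, y) pixel coordinates (set B1)
        # is_schlieren_point: stand-in for steps S403-S406; returns True when the pixel
        #   satisfies condition 1 or condition 2 along the straight line L
        # threshold_ratio: schlieren edge point threshold as a fraction of N2 (claim range [0.4, 0.8])
        schlieren_edges = []                                          # S401: empty schlieren edge set
        for line in edge_lines:                                       # S401/S408: iterate over the edge lines
            n2 = len(line)                                            # S402: number of pixel points N2
            havg1 = float(np.mean([gray[x, y] for (x, y) in line]))   # S402: average gray value HAVG1
            marked = sum(1 for point in line                          # S403-S407: count schlieren edge points
                         if is_schlieren_point(gray, line, point, havg1))
            if marked > threshold_ratio * n2:                         # S407: exceeds the edge point threshold
                schlieren_edges.append(line)                          # mark the edge line as a schlieren edge
        return schlieren_edges                                        # S409: the schlieren edge set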
2. The image recognition method for repairing porcine intestinal epithelial injury according to claim 1, wherein in S100 the method for obtaining the TEM image of the porcine small intestine section comprises: slicing the pig small intestine, fixing the slice with Bouin's solution, embedding it in paraffin, staining it with hematoxylin-eosin, and imaging it under a microscope to collect the slice image as the TEM image, wherein the microscope is any one of an ordinary optical microscope, a total internal reflection fluorescence microscope and an ultraviolet microscope with a bright-field photographing function.
3. The image recognition method for repairing porcine intestinal epithelial injury according to claim 1, wherein in S500 the method for marking the unclear schlieren edges among the schlieren edges in the schlieren edge set comprises:
respectively calculating the clear scale value CLE of each schlieren edge in the schlieren edge set, wherein CLE is calculated as follows:
[The CLE formula is given only as an image in this text record; it accumulates the gray guide values R(k2) over the N4 pixel points of the schlieren edge.]
wherein R(k2) is the gray guide value of the k2-th pixel point on the schlieren edge, k2 ∈ [1, N4], N4 is the number of pixel points on the schlieren edge, and k2 is the accumulation variable; the gray guide value R(k2) of the k2-th pixel point on the schlieren edge is calculated as follows:
[The R(k2) formula is given only as an image in this text record; it is expressed in terms of the gray value h(x2, y2) of the k2-th pixel point and the gray values of its four axial neighbours, as defined below.]
wherein the coordinates of the k2-th pixel point on the schlieren edge are (x2, y2); h(x2, y2) is the gray value of the pixel point at the coordinates (x2, y2), that is, the gray value of the k2-th pixel point on the schlieren edge; h(x2−1, y2) is the gray value of the pixel point at the coordinates (x2−1, y2); h(x2+1, y2) is the gray value of the pixel point at the coordinates (x2+1, y2); h(x2, y2+1) is the gray value of the pixel point at the coordinates (x2, y2+1); and h(x2, y2−1) is the gray value of the pixel point at the coordinates (x2, y2−1);
let the arithmetic mean of the clear scale values CLE of all schlieren edges be CLE_ave, and mark every schlieren edge in the schlieren edge set whose clear scale value CLE < CLE_ave as an unclear schlieren edge.
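For illustration, the sketch below follows the structure of this claim under two stated assumptions, since the CLE and R(k2) formulas are available only as images: R(k2) is taken as the sum of absolute gray-value differences between a pixel and its four axial neighbours, and CLE as the mean of R(k2) over the edge. The function names and these two formula choices are assumptions, not the claimed formulas.

    import numpy as np

    def gray_guide_value(gray, x, y):
        # assumed form of R(k2): sum of absolute differences between h(x, y)
        # and its four axial neighbours h(x-1, y), h(x+1, y), h(x, y-1), h(x, y+1)
        rows, cols = gray.shape
        centre = float(gray[x, y])
        total = 0.0
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if 0 <= nx < rows and 0 <= ny < cols:
                total += abs(centre - float(gray[nx, ny]))
        return total

    def mark_unclear_edges(gray, schlieren_edges):
        # one clear scale value CLE per schlieren edge (assumed: mean of R(k2) over the edge),
        # then flag every edge whose CLE falls below the arithmetic mean CLE_ave as unclear
        cle = [float(np.mean([gray_guide_value(gray, x, y) for (x, y) in edge]))
               for edge in schlieren_edges]
        cle_ave = float(np.mean(cle))
        return [edge for edge, value in zip(schlieren_edges, cle) if value < cle_ave]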
4. The image recognition method for repairing porcine intestinal epithelial injury according to claim 3, wherein in S600 the method for repairing each schlieren edge marked as unclear in the schlieren edge set into a repaired schlieren edge and constructing the repaired schlieren edge set from the repaired schlieren edges comprises:
set GMax as the maximum gray value and GMin as the minimum gray value among the gray values of all pixel points of the unclear schlieren edge; calculate the average gray value of all pixel points on the unclear schlieren edge as Gmean;
sequentially traverse each pixel point of the unclear schlieren edge; when the gray value h(x3, y3) of the traversed pixel point with coordinates (x3, y3) in the unclear schlieren edge lies in the range GMin ≤ h(x3, y3) < Gmean, then, through
[one of two alternative enhancement formulas, given only as images in this text record,]
convert the gray value of the pixel point with coordinates (x3, y3) in the unclear schlieren edge from h(x3, y3) to CH(x3, y3), wherein CH(x3, y3) is the strengthened gray value of the pixel point at (x3, y3);
when the gray value h(x4, y4) of the traversed pixel point with coordinates (x4, y4) in the unclear schlieren edge lies in the range Gmean ≤ h(x4, y4) ≤ GMax, then, through
[one of two alternative enhancement formulas, given only as images in this text record,]
convert the gray value of the pixel point with coordinates (x4, y4) in the unclear schlieren edge from h(x4, y4) to CH(x4, y4), wherein CH(x3, y3) is the strengthened h(x3, y3) and CH(x4, y4) is the strengthened h(x4, y4); process each (x3, y3) pixel point in the unclear schlieren edge into CH(x3, y3) and each (x4, y4) pixel point into CH(x4, y4) to obtain a repaired schlieren edge, and take the set formed by all repaired schlieren edges as the repaired schlieren edge set.
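Because the CH(x3, y3) and CH(x4, y4) formulas appear only as images, the sketch below substitutes one plausible piecewise contrast stretch anchored at GMin, Gmean and GMax: values below the mean are pulled towards GMin and values above it towards GMax. The quadratic mapping and the function name are assumptions standing in for the claimed formulas.

    import numpy as np

    def enhance_unclear_edge(gray, edge):
        # GMin, GMax and Gmean are taken over the pixel points of the unclear schlieren edge
        values = np.array([gray[x, y] for (x, y) in edge], dtype=float)
        gmin, gmax, gmean = values.min(), values.max(), values.mean()
        repaired = gray.astype(float)
        for (x, y), h in zip(edge, values):
            if gmin <= h < gmean and gmean > gmin:
                # assumed CH(x3, y3): pull the lower half of the range towards GMin
                repaired[x, y] = gmin + (h - gmin) ** 2 / (gmean - gmin)
            elif gmean <= h <= gmax and gmax > gmean:
                # assumed CH(x4, y4): push the upper half of the range towards GMax
                repaired[x, y] = gmax - (gmax - h) ** 2 / (gmax - gmean)
        return repaired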
5. The image recognition method for repairing porcine intestinal epithelial injury according to claim 3, wherein in S600, GMax is the maximum gray value and GMin is the minimum gray value among the gray values of the pixel points of the unclear schlieren edge; calculate the average gray value of all pixel points on the unclear schlieren edge as Gmean;
mark all pixel points on the unclear schlieren edge whose gray values are greater than or equal to Gmean and less than or equal to GMax as adhesion pixel points, and take the pixel points on the unclear schlieren edge other than the adhesion pixel points as common pixel points; sequentially traverse the common pixel points on the unclear schlieren edge, take every curve segment formed by consecutive common pixel points separated by adhesion pixel points on the unclear schlieren edge as a first curve segment, and mark each first curve segment whose two end points are more than 3 pixels apart in Euclidean distance as a common curve segment; take the common curve segment with the minimum Euclidean distance between its two end points as the threshold curve segment, and take the Euclidean distance between the two end points of the threshold curve segment as the screening threshold TH;
sequentially traverse the adhesion pixel points on the unclear schlieren edge, take every curve segment formed by consecutive adhesion pixel points separated by common pixel points on the unclear schlieren edge as a second curve segment, and mark each second curve segment whose two end points are more than the threshold TH apart in Euclidean distance as an adhesion curve segment;
calculate the similarity between the adhesion curve segment and each common curve segment by a curve similarity algorithm; obtain and copy the common curve segment with the highest similarity as the substitute line segment of the adhesion curve segment; let the Euclidean distance between the two end points of the adhesion curve segment be DS1 and the Euclidean distance between the two end points of the substitute line segment be DS2; take the straight line obtained by connecting the two end points of the adhesion curve segment as the first reference straight line LV1, take the straight line obtained by connecting the two end points of the substitute line segment as the second reference straight line LV2, and take the included angle between LV1 and LV2 as the angle AG1;
when DS1 is equal to DS2, rotate the substitute line segment as a whole by the angle AG1 in the direction from LV2 to LV1, shift the rotated substitute line segment to the position where its two end points coincide with the two end points of the adhesion curve segment, delete the adhesion curve segment, and let the substitute line segment replace the adhesion curve segment to obtain a repaired schlieren edge;
the specific meaning of rotating the substitute line segment as a whole by the angle AG1 in the direction from LV2 to LV1 is as follows: if rotating LV2 clockwise by the angle AG1 makes it coincide with LV1, rotate the substitute line segment clockwise by the angle AG1; if rotating LV2 counterclockwise by the angle AG1 makes it coincide with LV1, rotate the substitute line segment counterclockwise by the angle AG1;
when DS1 is smaller than DS2, take either one of the two end points of the substitute line segment as the reference end point V1; search for a pixel point V2 on the substitute line segment such that the Euclidean distance between V1 and V2 is DS1, thereby obtaining the point V2; intercept the curve segment between the end point V1 and the end point V2 on the substitute line segment as the first supplementary line segment; take the straight line connecting the two end points of the first supplementary line segment as the third reference straight line LV3, and calculate the included angle between LV1 and LV3 as the angle AG2; rotate the first supplementary line segment as a whole by the angle AG2 in the direction from LV3 to LV1, move the first supplementary line segment to the position where its two end points coincide with the two end points of the adhesion curve segment, and delete the adhesion curve segment, so that the first supplementary line segment replaces the adhesion curve segment to obtain a repaired schlieren edge;
the specific meaning of rotating the first supplementary line segment as a whole by the angle AG2 in the direction from LV3 to LV1 is as follows: if rotating LV3 clockwise by the angle AG2 makes it coincide with LV1, rotate the first supplementary line segment clockwise by the angle AG2; if rotating LV3 counterclockwise by the angle AG2 makes it coincide with LV1, rotate the first supplementary line segment counterclockwise by the angle AG2;
when DS1 is greater than DS2, rotate the substitute line segment as a whole by the angle AG1 in the direction from LV2 to LV1, take either end point of the substitute line segment as the connecting end point, and move the substitute line segment to the position where the connecting end point coincides with one end point of the adhesion curve segment and the Euclidean distance between the other end point of the substitute line segment and the adhesion curve segment is smaller than DS1, thereby obtaining a spliced curve segment formed by the substitute line segment and the adhesion curve segment; at this point, because the length of the substitute line segment is insufficient, its other end point, other than the connecting end point, cannot coincide with the end point of the adhesion curve segment; take the other end point of the substitute line segment in the spliced curve segment, other than the connecting end point, as the expansion end point V3; take the end point of the adhesion curve segment in the spliced curve segment that is not connected with the substitute line segment as the end point V4 to be connected; let the Euclidean distance between the expansion end point V3 and the end point V4 to be connected be DS3; search for a pixel point V5 on the spliced curve segment such that the Euclidean distance between the expansion end point V3 and the point V5 is DS3, thereby obtaining the point V5; copy the curve segment between the point V3 and the point V5 on the spliced curve segment as the second supplementary line segment; take the straight line obtained by connecting the two end points of the second supplementary line segment as the fourth reference straight line LV4, take the straight line obtained by connecting the expansion end point V3 and the end point V4 to be connected as the fifth reference straight line LV5, and take the included angle between LV4 and LV5 as the angle AG3; rotate the second supplementary line segment as a whole by the angle AG3 in the direction from LV4 to LV5, and move the second supplementary line segment to the position where one of its two end points coincides with the expansion end point V3 and the other end point coincides with the end point V4 to be connected, thereby obtaining a repaired schlieren edge;
when no substitute line segment is available, connect the two end points of the adhesion curve segment on the schlieren edge to obtain a third supplementary line segment, delete the adhesion curve segment, and replace the adhesion curve segment with the third supplementary line segment to obtain a repaired schlieren edge;
the specific meaning of rotating the second supplementary line segment as a whole by the angle AG3 in the direction from LV4 to LV5 is as follows: if rotating LV4 clockwise by the angle AG3 makes it coincide with LV5, rotate the second supplementary line segment clockwise by the angle AG3; if rotating LV4 counterclockwise by the angle AG3 makes it coincide with LV5, rotate the second supplementary line segment counterclockwise by the angle AG3;
and take the set formed by all repaired schlieren edges as the repaired schlieren edge set.
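As a minimal geometric sketch of the simplest case above (DS1 equal to DS2), the code below rotates the copied substitute line segment by the signed angle between the two reference chords and translates it so that its end points land on the end points of the adhesion curve segment. The function name, the array layout and the use of a signed angle in place of the clockwise/counterclockwise case split are assumptions.

    import numpy as np

    def replace_adhesion_segment(adhesion, substitute):
        # adhesion, substitute: (N, 2) arrays of (x, y) pixel coordinates;
        # only the DS1 == DS2 case is sketched here
        adhesion = np.asarray(adhesion, dtype=float)
        substitute = np.asarray(substitute, dtype=float)
        a0, a1 = adhesion[0], adhesion[-1]          # end points of the adhesion curve segment (chord LV1)
        s0, s1 = substitute[0], substitute[-1]      # end points of the substitute line segment (chord LV2)
        ang_lv1 = np.arctan2(a1[1] - a0[1], a1[0] - a0[0])
        ang_lv2 = np.arctan2(s1[1] - s0[1], s1[0] - s0[0])
        ag1 = ang_lv1 - ang_lv2                     # signed rotation taking LV2 onto LV1
        rot = np.array([[np.cos(ag1), -np.sin(ag1)],
                        [np.sin(ag1),  np.cos(ag1)]])
        # rotate about the first end point of the substitute, then shift that end point onto a0;
        # with equal chord lengths the other end point then lands on a1
        moved = (substitute - s0) @ rot.T + a0
        return np.rint(moved).astype(int)           # the moved substitute replaces the adhesion segment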
6. An image recognition system for repairing porcine intestinal epithelium damage, the system comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, runs the units of the following system:
the TEM image acquisition unit is used for acquiring a TEM image of the section of the small intestine of the pig;
the image graying unit is used for graying the TEM image to obtain a grayscale image;
the boundary detection unit is used for carrying out boundary detection on the gray level image through a watershed algorithm to obtain a plurality of edge lines;
the schlieren edge identification unit is used for identifying and marking the schlieren edges among the edge lines to obtain a schlieren edge set;
the unclear marking unit is used for marking the unclear schlieren edges among the schlieren edges in the schlieren edge set;
the schlieren edge repair unit is used for repairing the schlieren edges marked as unclear in the schlieren edge set into repaired schlieren edges, the repaired schlieren edges forming a repaired schlieren edge set;
the schlieren edge marking unit is used for marking schlieren edge regions on the TEM image according to the position of each repaired schlieren edge in the repaired schlieren edge set (a code sketch of this unit pipeline follows at the end of this claim);
the method for identifying and marking the schlieren edges among the edge lines to obtain the schlieren edge set comprises the following steps:
S401, let the set of all edge lines be B1, B1 = {b1_i1}, wherein b1_i1 is the i1-th edge line in the set B1; set the initial value of i1 to 1, with i1 ∈ [1, N1], where N1 is the number of elements in the set B1; initialize the schlieren edge set as an empty set;
S402, acquire the number of pixel points on the edge line b1_i1 as N2; set the variable j1 to an initial value of 1, with j1 ∈ [1, N2]; calculate the average gray value of all pixel points on the edge line b1_i1 as HAVG1;
S403, let the coordinates of the j1-th pixel point on the edge line b1_i1 be (x1, y1), and let h(x1, y1) be the gray value of the pixel point at (x1, y1); search the eight-neighborhood of the pixel point at (x1, y1) for the point with the maximum gray value to obtain the point MAXNE;
S404, obtain a straight line L by connecting the pixel point at (x1, y1) with the point MAXNE; let H1 be the gray value of the pixel point on the straight line L that lies one step beyond the point MAXNE when moving along L from the pixel point at (x1, y1) towards the point MAXNE, and let H2 be the gray value of the pixel point on the straight line L that lies one step beyond the pixel point at (x1, y1) when moving along L from the point MAXNE towards the pixel point at (x1, y1); if H1 is less than H2, let the direction on the straight line L from the pixel point at (x1, y1) to the point MAXNE be the forward direction of L and the direction from the point MAXNE to the pixel point at (x1, y1) be the reverse direction of L; otherwise, let the direction from the pixel point at (x1, y1) to the point MAXNE be the reverse direction of L and the direction from the point MAXNE to the pixel point at (x1, y1) be the forward direction of L;
S405, starting from the pixel point at (x1, y1), take the first pixel point found by searching along the forward direction of the straight line L whose gray value H3 satisfies the distinguishing condition as the point P1, and take the first pixel point found by searching along the reverse direction of the straight line L whose gray value H3 satisfies the distinguishing condition as the point P2; if no pixel point satisfying the distinguishing condition exists in the forward or reverse direction of the straight line L, take the pixel points at the two intersections of the straight line L with the image boundary of the grayscale image as the point P1 and the point P2; let the Euclidean distance between the coordinates of the point P1 and (x1, y1) be the P1 distance and the Euclidean distance between the coordinates of the point P2 and (x1, y1) be the P2 distance; if the P1 distance is less than the P2 distance, take the pixel point P1 as the comparison pixel point P3, otherwise take the pixel point P2 as the comparison pixel point P3; calculate the average gray value HAVG2 of all pixel points on the segment of the straight line L from the pixel point at (x1, y1) to P3; wherein the distinguishing condition is MinH(b1_i1) ≤ H3 ≤ MaxH(b1_i1), with MaxH(b1_i1) being the maximum gray value and MinH(b1_i1) being the minimum gray value among the gray values of all pixel points on the edge line b1_i1;
S406, if the gray value h(x1, y1) of the j1-th pixel point on the edge line b1_i1 satisfies condition 1 in the forward direction of the straight line L or satisfies condition 2 in the reverse direction of the straight line L, mark the j1-th pixel point on the edge line b1_i1 as a schlieren edge point; that is, as long as h(x1, y1) satisfies either condition 1 or condition 2, the j1-th pixel point on the edge line b1_i1 is marked as a schlieren edge point;
the condition 1 is:
[Condition 1 is given only as a formula image in this text record; its terms are defined by the variables below.]
where m1 is the row number of the point P3 in the image matrix of the grayscale image and n1 is the column number of the point P3 in the image matrix of the grayscale image; u1 = min(x1, m1); v1 = min(y1, n1); u = min(x1, m1) + |x1 − m1|, that is, the larger of x1 and m1; v = min(y1, n1) + |y1 − n1|, that is, the larger of y1 and n1; min(x1, m1) is the smaller of x1 and m1; min(y1, n1) is the smaller of y1 and n1; h(i2, j2) is the gray value of the pixel point at the coordinates (i2, j2), and i2 and j2 are variables;
the condition 2 is:
[Condition 2 is likewise given only as a formula image in this text record.]
S407, if j1 ≤ N2, increment the value of j1 by 1 and go to step S403; otherwise, determine whether the number of pixel points on the edge line b1_i1 that are marked as schlieren edge points exceeds the set schlieren edge point threshold, and if so, mark the edge line b1_i1 as a schlieren edge, add it to the schlieren edge set, and go to step S408; the schlieren edge point threshold is set to [0.4, 0.8] times N2;
S408, if i1 ≤ N1, increment the value of i1 by 1 and go to step S402; otherwise go to step S409;
S409, obtain the schlieren edge set composed of all the marked schlieren edges.
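To show how the claimed units chain together, here is a hypothetical end-to-end sketch. It reuses the identify_schlieren_edges and mark_unclear_edges sketches given earlier, treats the watershed boundary detection, the per-pixel schlieren test and the edge repair as injected callables, and assumes OpenCV for image loading and graying; the function names, signatures and the red overlay used for marking are illustrative assumptions only.

    import cv2

    def recognition_pipeline(image_path, watershed_edges, is_schlieren_point, repair_edge):
        # watershed_edges(gray) -> list of edge lines, each a list of (row, col) coordinates
        # repair_edge(gray, edge) -> repaired schlieren edge as a list of (row, col) coordinates
        tem = cv2.imread(image_path)                              # TEM image acquisition unit
        gray = cv2.cvtColor(tem, cv2.COLOR_BGR2GRAY)              # image graying unit
        edge_lines = watershed_edges(gray)                        # boundary detection unit (watershed)
        schlieren = identify_schlieren_edges(gray, edge_lines,    # schlieren edge identification unit
                                             is_schlieren_point)
        unclear = mark_unclear_edges(gray, schlieren)             # unclear marking unit
        repaired = [repair_edge(gray, edge) for edge in unclear]  # schlieren edge repair unit
        marked = tem.copy()                                       # schlieren edge marking unit:
        for edge in repaired:                                     # overlay each repaired edge in red
            for (r, c) in edge:
                marked[r, c] = (0, 0, 255)
        return marked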
