CN113034460B - Endometrial gland density estimation method - Google Patents

Endometrial gland density estimation method

Info

Publication number
CN113034460B
CN113034460B CN202110299344.7A
Authority
CN
China
Prior art keywords
image
hysteroscope
picture
gland
endometrium
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110299344.7A
Other languages
Chinese (zh)
Other versions
CN113034460A (en)
Inventor
徐大宝
徐露
赵行平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Kemeisen Medical Technology Co ltd
Original Assignee
Hunan Kemeisen Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Kemeisen Medical Technology Co ltd filed Critical Hunan Kemeisen Medical Technology Co ltd
Priority to CN202110299344.7A priority Critical patent/CN113034460B/en
Publication of CN113034460A publication Critical patent/CN113034460A/en
Application granted granted Critical
Publication of CN113034460B publication Critical patent/CN113034460B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/0002 — Inspection of images, e.g. flaw detection
    • G06T 7/0012 — Biomedical image inspection
    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
    • A61B 1/00002 — Operational features of endoscopes
    • A61B 1/00004 — Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00009 — Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/303 — Instruments for the vagina, i.e. vaginoscopes
    • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 — Computing arrangements based on biological models
    • G06N 3/02 — Neural networks
    • G06N 3/04 — Architecture, e.g. interconnection topology
    • G06N 3/045 — Combinations of networks
    • G06N 3/08 — Learning methods
    • G06T 5/70; G06T 5/90
    • G06T 7/10 — Segmentation; edge detection
    • G06T 7/11 — Region-based segmentation
    • G06T 2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 — Image acquisition modality
    • G06T 2207/10068 — Endoscopic image
    • G06T 2207/20 — Special algorithmic details
    • G06T 2207/20081 — Training; learning
    • G06T 2207/20084 — Artificial neural networks [ANN]
    • G06T 2207/20112 — Image segmentation details
    • G06T 2207/20132 — Image cropping
    • G06T 2207/30 — Subject of image; context of image processing
    • G06T 2207/30204 — Marker

Abstract

The invention discloses an endometrial gland density estimation method in the field of artificial intelligence. The method comprises: training a detection network; acquiring endometrium video data and intercepting hysteroscope pictures; standardizing the hysteroscope pictures; identifying glands in the hysteroscope pictures with the trained detection network and a matching network; de-overlapping and combining the identified hysteroscope pictures to obtain a whole image of the endometrium; and computing the endometrial gland density from it. The invention improves the specificity of intrauterine adhesion classification and the accuracy of prognosis prediction. It analyses the intrauterine adhesion judgment mechanism on the basis of deep learning, accurately locates the pathomorphological target points of intrauterine adhesion, identifies images more accurately through strategy selection, greatly simplifies the computation, pre-selects the relevant interest feature values to improve detection precision and efficiency, collects big data on intrauterine adhesion classification, and provides a homogenized diagnosis and treatment scheme for intrauterine adhesion.

Description

Endometrial gland density estimation method
Technical Field
The invention relates to the field of artificial intelligence, in particular to an endometrial gland density estimation method.
Background
Intrauterine adhesion (IUA), also known as Asherman syndrome, arises when uterine cavity damage or infection caused by various physical or chemical factors injures the endometrium and produces adhesions between the uterine walls, leading to partial or complete occlusion of the uterine cavity and/or cervix. Depending on the adhesion site, the main clinical manifestations are hypomenorrhea, amenorrhea, periodic lower abdominal pain and abnormal pregnancy, making IUA one of the main causes of miscarriage and infertility. In China the most common cause of intrauterine adhesion is induced abortion; with the gradual popularization of hysteroscopy and the resulting increase in uterine cavity operations, the incidence of IUA keeps rising. Its high recurrence rate and poor treatment outcomes seriously affect the fertility of women of childbearing age as well as the stability of patients' families and of society. Disease duration and severity affect the prognosis of treatment, and early detection, early diagnosis and early treatment can improve the pregnancy prognosis of IUA patients to a certain extent. The judged severity of intrauterine adhesion also influences clinical treatment decisions. Early, accurate and noninvasive diagnosis and severity grading of intrauterine adhesion are therefore of great clinical significance. For IUA patients who wish to conceive, the aim of therapy is mainly to obtain offspring; postoperative pregnancy in intrauterine adhesion patients is closely related to the severity of IUA, so grading these patients is essential for predicting prognosis and pregnancy outcome.
At present there is consensus on the diagnosis of intrauterine adhesion, and hysteroscopy is the gold standard. Postoperative prognosis evaluation standards for IUA patients include the American Fertility Society scoring standard, the European Society for Gynaecological Endoscopy grading standard and the Chinese IUA scoring standard, but no standard is internationally accepted, and the endometrium, the index most directly influencing prognosis, is not included in these scoring standards owing to methodological and technical limitations. The endometrium is the determining factor for implantation of the fertilized ovum and development of the embryo. Endometrial glands secrete numerous proteins essential for embryo survival, growth and development, such as mucin MUC-1 and glycoprotein-A, which confirms the importance of endometrial glands in pregnancy. With the advent of the artificial intelligence era, computer-aided medical image analysis plays an increasingly important role in the early diagnosis of disease. A new generation of artificial intelligence technology with deep learning at its core can automatically extract high-dimensional features and thereby achieve higher classification accuracy. A highly accurate endometrial gland density estimation method can therefore provide important assistance for clinical decision making and offer new ideas and theoretical guidance for improving the noninvasive diagnosis rate and prognostic pregnancy rate of IUA.
Disclosure of Invention
The invention aims to provide a method for estimating endometrial gland density, which solves the prior-art technical problem that endometrial gland density can only be judged by visual observation, with low efficiency.
A method for estimating endometrial gland density, the method comprising the following steps:
Step 1: inputting the marked hysteroscope pictures into a training detection network for training to obtain the image characteristics of glands;
Step 2: a doctor puts a medical instrument probe into the uterus to acquire endometrium video data, and hysteroscope pictures are intercepted according to the moving speed of the probe;
Step 3: standardizing the acquired hysteroscope pictures, where the standardization comprises unified definition processing and noise-removal regularization of the hysteroscope pictures, and planarization mapping of the processed images;
Step 4: identifying glands in the hysteroscope pictures using the training network and a matching network;
Step 5: de-overlapping and combining the identified hysteroscope pictures to obtain a whole image of the endometrium;
Step 6: obtaining the endometrial gland density from the size of the whole endometrium image and the number of glands in it.
Further, the specific process of step 2 is: the hysteroscope picture is a square picture with side length l, and the next hysteroscope picture is captured once d > l, where Δt = t2 − t1 and d = Δt × v; v is the linear velocity of the medical instrument probe moving over the endometrium, Δt is the time from the previous capture of a hysteroscope picture to the next, t1 is the time of the frame of the previous hysteroscope picture, and t2 is the time of the frame of the next hysteroscope picture.
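The capture rule of step 2 can be sketched as follows. This is a minimal illustration, not part of the patent; the probe speed, frame timestamps and units are illustrative assumptions.

```python
def should_capture(t_prev, t_next, v, l):
    """Decide whether to grab the next hysteroscope frame.

    t_prev: time of the previously captured frame
    t_next: candidate time for the next frame
    v:      linear velocity of the probe over the endometrium
    l:      side length of the square hysteroscope picture

    A new frame is taken once the probe has travelled d = dt * v > l,
    so that consecutive pictures no longer fully overlap.
    """
    dt = t_next - t_prev
    d = dt * v
    return d > l

# Illustrative numbers: a 10 mm picture side, probe moving at 4 mm/s.
assert not should_capture(0.0, 2.0, 4.0, 10.0)  # d = 8 mm <= l, keep waiting
assert should_capture(0.0, 3.0, 4.0, 10.0)      # d = 12 mm > l, capture
```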
Further, the specific process of step 3 is: the unified definition processing of the hysteroscope pictures uses an image enhancement algorithm whose structure comprises an enhancer and a discriminator; the input of the enhancer is a low-precision image and its output is a high-precision image, while the input of the discriminator is either a real high-precision image or an image produced by the enhancer. Low-precision images from multiple centers are thereby adjusted to high-precision images, eliminating the differences in image precision. The noise-removal regularization uses a machine-noise masking algorithm whose structure comprises a discriminator and a composite generator; the multiple centers are first divided into two types, reference centers and ordinary centers, the discriminator distinguishes whether an input image is a reference-center image or a composite image from an ordinary center, and the composite generator adds a noise mask to the ordinary-center data through a mask generator. A planar coordinate surface is then generated and the processed images are pasted onto it, after which the distance between two successive images is obtained as:
d_i = √((x2 − x1)_i² + (y2 − y1)_i²)
where d_i is the distance from the previous hysteroscope picture to the next; the coordinates and time of the previous capture are (x1, y1)_i and t1, and the coordinates and time of the next capture are (x2, y2)_i and t2.
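The inter-capture distance d_i can be computed directly from the two capture positions on the planar coordinate surface. The Euclidean form is an assumption consistent with the surrounding definitions (the original formula is given only as an embedded image):

```python
import math

def capture_distance(p1, p2):
    """d_i: Euclidean distance between the positions (x1, y1)_i and
    (x2, y2)_i at which two successive hysteroscope pictures were taken."""
    (x1, y1), (x2, y2) = p1, p2
    return math.hypot(x2 - x1, y2 - y1)

assert capture_distance((0.0, 0.0), (3.0, 4.0)) == 5.0  # 3-4-5 triangle
```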
Further, the specific process of step 4 is: relevant strategy selection parameters are input from the strategy selection module; the strategy selection parameters and the feature values of the training detection network are input into the matching network module; the matching network module identifies the glands in each image according to the gland features of the training detection network and the strategy selection parameters, outputs the identified glands, labels each gland and its category (glands with an open ostium and glands without an open ostium), and counts the number of glands in each image.
Further, the selection principle of the strategy selection module is as follows:
f(x) = max[ω1·c_i + ω2·s_i + ω3·a_i + ω4·k_i]
where c_i is the standard gland density, s_i is the patient's age, a_i is the detection accuracy, and k_i is the mean gland size. The four factors are weighted by four corresponding parameters, ω1, ω2, ω3 and ω4, whose values can be set freely according to specific requirements, giving the method high flexibility and expansibility. The candidates are scored against the feature values of the training detection network, and the target with the highest score is the target of interest selected by the strategy, i.e. the preliminarily determined gland category.
Further, the matching network module comprises a target subnet and a search subnet; the target subnet accesses the training detection network features, the search subnet extracts features of the detection image, and the two subnets share the same weights and biases. The search subnet adopts an internal cropping unit based on a bottleneck residual block: the internal cropping unit crops away the features influenced by padding and zero filling inside the block, preventing the convolution filters from learning a position bias. The residual unit is the key module of the template matching network and consists of 3 stacked convolution layers, of sizes 1 × 1, 3 × 3 and 1 × 1, plus a shortcut connection; the 1 × 1 convolution layers reduce and then restore the dimensionality, so the 3 × 3 convolution layer becomes a bottleneck with small input and output sizes.
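The internal cropping idea, discarding the border features contaminated by zero padding so that the filters cannot learn a positional bias, amounts to slicing off the outer ring of the feature map. A minimal sketch, with the crop margin as an assumed parameter (the real unit operates on convolutional feature tensors):

```python
def crop_padding_border(feat, margin=1):
    """Remove the outer `margin` rows and columns of a 2-D feature map
    (nested lists standing in for a (H, W) tensor), i.e. the positions
    influenced by zero padding inside the residual block."""
    return [row[margin:-margin] for row in feat[margin:-margin]]

# A 6x6 map of (row, col) tags; cropping keeps the interior 4x4.
feat = [[(r, c) for c in range(6)] for r in range(6)]
cropped = crop_padding_border(feat, margin=1)
assert len(cropped) == 4 and len(cropped[0]) == 4
assert cropped[0][0] == (1, 1)  # old border positions are gone
```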
Further, the specific process of step 5 is: the concrete image of the glands in each picture is acquired through step 4; for every two adjacent pictures, the front-end part of the previous picture is superposed on the overlapping rear-end part of the next picture according to the actual distance d_i between the two pictures. The gland distribution structure within distance d_i of the front end of the previous picture is compared with the gland distribution structure within distance d_i of the rear end of the next picture: if they are the same, the two partial images are directly superposed; if they do not coincide, the distance is gradually decreased or increased for comparison until, at some distance d_i′, the gland distribution structure at the front end of the previous picture is the same as that at the rear end of the next picture, at which point the superposition is complete. d_i′ is the gradually decreased or increased distance.
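The overlap search of step 5 can be sketched as below. As a simplifying assumption, the gland "distribution structure" is reduced to a 1-D strip signature and the step size is a free parameter; the real comparison would run over 2-D gland maps.

```python
def align_overlap(prev_strip, next_strip, d0, d_max, step=1):
    """Find the overlap width d_i' at which the tail of the previous
    picture's signature equals the head of the next picture's signature.
    Starts from the measured distance d0 and alternately narrows and
    widens it, mirroring step 5 of the method."""
    for delta in range(0, d_max + 1):
        for d in (d0 - delta * step, d0 + delta * step):
            if 0 < d <= min(len(prev_strip), len(next_strip)):
                if prev_strip[-d:] == next_strip[:d]:
                    return d
    return None  # no consistent overlap found

# Illustrative signatures: the true overlap is 2 even though d0 = 3.
prev = [0, 1, 2, 3, 4, 5]
nxt = [4, 5, 6, 7]
assert align_overlap(prev, nxt, d0=3, d_max=3) == 2
```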
Further, the specific process of step 6 is: the coordinates on the whole endometrium image are calculated as follows:
b_x = σ(t_x) + c_x
b_y = σ(t_y) + c_y
b_w = p_w · e^(t_w)
b_h = p_h · e^(t_h)
where (c_x, c_y) are the coordinates of the upper-left corner of the grid cell, σ(t_x) and σ(t_y) are offset values in the same units as d_i or d_i′, (p_w, p_h) are the width and height of the prior box, and (b_x, b_y, b_w, b_h) are the coordinates of the final bounding box. The size of the whole endometrium image is calculated from the coordinates of the final bounding box, and the glands on the whole endometrium image are then counted to obtain the average density. At the same time, the number of glands identified in each picture in step 4 and the picture itself are retained and output together with the average density to complete the density estimation.
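The decoding equations of step 6 follow the standard YOLO-style box parameterisation, so they can be sketched directly; the grid, prior-box and count values below are illustrative assumptions:

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def decode_box(tx, ty, tw, th, cx, cy, pw, ph):
    """Decode network outputs into a bounding box:
    b_x = sigma(t_x) + c_x, b_y = sigma(t_y) + c_y,
    b_w = p_w * e^t_w,      b_h = p_h * e^t_h."""
    return (sigmoid(tx) + cx, sigmoid(ty) + cy,
            pw * math.exp(tw), ph * math.exp(th))

def gland_density(n_glands, bw, bh):
    """Average density = gland count / area of the whole endometrium image."""
    return n_glands / (bw * bh)

bx, by, bw, bh = decode_box(0.0, 0.0, 0.0, 0.0, cx=2, cy=3, pw=4.0, ph=5.0)
assert (bx, by) == (2.5, 3.5)   # sigma(0) = 0.5 offset from the cell corner
assert (bw, bh) == (4.0, 5.0)   # e^0 = 1 keeps the prior-box size
assert gland_density(40, bw, bh) == 2.0
```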
By adopting the technical scheme, the invention has the following technical effects:
the invention can improve the specificity of classification of the intrauterine adhesion and the accuracy of prediction prognosis, analyzes the intrauterine adhesion judgment mechanism based on deep learning, accurately positions the pathomorphism target point of the intrauterine adhesion, identifies images more accurately through strategy selection, can greatly reduce the operation process, can select corresponding interest characteristic values in advance, improves the detection precision and efficiency, collects big data of classification of the intrauterine adhesion, and provides an intrauterine adhesion homogenization diagnosis and treatment scheme.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
FIG. 2 is a schematic block diagram of the present invention.
FIG. 3 is a diagram of an original count picture according to the present invention.
FIG. 4 is a cropped target area picture according to the present invention.
Fig. 5 is a target area picture after gray-scale conversion according to the present invention.
Fig. 6 is a picture of the target area after the equalization process according to the present invention.
FIG. 7 is a picture of a marker after counting by the detection algorithm of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, preferred embodiments are given and the present invention is described in further detail. It should be noted, however, that the numerous details set forth in the description are merely for the purpose of providing the reader with a thorough understanding of one or more aspects of the present invention, which may be practiced without these specific details.
An endometrial gland density estimation method, as shown in figures 1-2, comprises the following steps,
Step 1: the marked hysteroscope pictures are input into the training detection network for training to obtain the image characteristics of glands. The size of the input image is set to 416 × 416 × 3; the glands on the image are first identified manually, feature values are then extracted, and the extracted values are checked against the manual annotations in a second identification pass. Features are then extracted from the image through a feature extraction network.
The training detection network selects sample pairs from the ImageNet dataset and extracts target images from the frames of a video; the target images have a size of 127 × 127 × 3. The target image is input into the corresponding subnet of the template matching network to obtain the target feature layer.
Step 2: the doctor puts the medical instrument probe into the uterus to acquire endometrium video data, and hysteroscope pictures are captured according to the moving speed of the probe. The hysteroscope picture is a square picture with side length l, and the next hysteroscope picture is captured once d > l, where Δt = t2 − t1 and d = Δt × v; v is the linear velocity of the probe moving over the endometrium, Δt is the time from the previous capture of a hysteroscope picture to the next, t1 is the time of the frame of the previous hysteroscope picture, and t2 is the time of the frame of the next hysteroscope picture.
Step 3: the acquired hysteroscope pictures are standardized; the standardization comprises unified definition processing and noise-removal regularization of the hysteroscope pictures, and planarization mapping of the processed images. The unified definition processing uses an image enhancement algorithm whose structure comprises an enhancer and a discriminator; the input of the enhancer is a low-precision image and its output is a high-precision image, while the input of the discriminator is either a real high-precision image or an image produced by the enhancer. Low-precision images from multiple centers are thereby adjusted to high-precision images, eliminating the differences in image precision. The noise-removal regularization uses a machine-noise masking algorithm whose structure comprises a discriminator and a composite generator; the multiple centers are first divided into two types, reference centers and ordinary centers, the discriminator distinguishes whether an input image is a reference-center image or a composite image from an ordinary center, and the composite generator adds a noise mask to the ordinary-center data through a mask generator. A planar coordinate surface is then generated and the processed images are pasted onto it, after which the distance between two successive images is obtained as:
d_i = √((x2 − x1)_i² + (y2 − y1)_i²)
where d_i is the distance from the previous hysteroscope picture to the next; the coordinates and time of the previous capture are (x1, y1)_i and t1, and the coordinates and time of the next capture are (x2, y2)_i and t2.
Step 4: glands are obtained by identifying the hysteroscope pictures with the training network and the matching network. As shown in fig. 2, relevant strategy selection parameters are input from the strategy selection module; the strategy selection parameters and the feature values of the training detection network are input into the matching network module; the matching network module identifies the glands in each image according to the gland features of the training detection network and the strategy selection parameters, outputs the identified glands, labels the volume and category of each gland (glands with an open ostium and glands without an open ostium), and counts the number of glands in each image.
The selection principle of the strategy selection module is as follows:
f(x) = max[ω1·c_i + ω2·s_i + ω3·a_i + ω4·k_i]
where c_i is the standard gland density, s_i is the patient's age, a_i is the detection accuracy, and k_i is the mean gland size. The four factors are weighted by four corresponding parameters, ω1, ω2, ω3 and ω4, whose values can be set freely according to specific requirements, giving the method high flexibility and expansibility. The candidates are scored against the feature values of the training detection network, and the target with the highest score is the target of interest selected by the strategy, i.e. the preliminarily determined gland category.
The matching network module comprises a target subnet and a search subnet; the target subnet accesses the training detection network features, the search subnet extracts features of the detection image, and the two subnets share the same weights and biases. The search subnet adopts an internal cropping unit based on a bottleneck residual block: the internal cropping unit crops away the features influenced by padding and zero filling inside the block, preventing the convolution filters from learning a position bias. The residual unit is the key module of the template matching network and consists of 3 stacked convolution layers, of sizes 1 × 1, 3 × 3 and 1 × 1, plus a shortcut connection; the 1 × 1 convolution layers reduce and then restore the dimensionality, so the 3 × 3 convolution layer becomes a bottleneck with small input and output sizes.
Step 5: the identified hysteroscope pictures are de-overlapped and combined to obtain the whole image of the endometrium. The concrete image of the glands in each picture is acquired through step 4; for every two adjacent pictures, the front-end part of the previous picture is superposed on the overlapping rear-end part of the next picture according to the actual distance d_i between the two pictures. The gland distribution structure within distance d_i of the front end of the previous picture is compared with the gland distribution structure within distance d_i of the rear end of the next picture: if they are the same, the two partial images are directly superposed; if they do not coincide, the distance is gradually decreased or increased for comparison until, at some distance d_i′, the gland distribution structure at the front end of the previous picture is the same as that at the rear end of the next picture, at which point the superposition is complete. d_i′ is the gradually decreased or increased distance.
Step 6: the endometrial gland density is obtained from the size of the whole endometrium image and the number of glands in it. The coordinates on the whole endometrium image are calculated as follows:
b_x = σ(t_x) + c_x
b_y = σ(t_y) + c_y
b_w = p_w · e^(t_w)
b_h = p_h · e^(t_h)
where (c_x, c_y) are the coordinates of the upper-left corner of the grid cell, σ(t_x) and σ(t_y) are offset values in the same units as d_i or d_i′, (p_w, p_h) are the width and height of the prior box, and (b_x, b_y, b_w, b_h) are the coordinates of the final bounding box. The size of the whole endometrium image is calculated from the coordinates of the final bounding box, and the glands on the whole endometrium image are then counted to obtain the average density. At the same time, the number of glands identified in each picture in step 4 and the picture itself are retained and output together with the average density to complete the density estimation.
The obtained density includes the average gland density of the whole endometrium as well as the density and total number of glands in each single picture. From the glands in a single picture it can also be determined which glands are active: an active gland carries a luminous flash point at the centre of its volume, as shown in fig. 3. A gland that is identified from the image pixels but shows no such exposure point is no longer metabolically active and needs to be removed from the count.
As shown in fig. 3, the original counting picture of a single frame obtained from the medical apparatus is a circular picture. After square cropping, the square target-region picture of fig. 4 is obtained; grey-scale conversion then yields the target-region picture of fig. 5, and equalization yields the picture of fig. 6, in which the volume and region size of the glands can be seen more clearly. Fig. 7 shows the marked picture after counting by the detection algorithm: once marked by the algorithm, the specific number and size of the glands can be clearly seen, and whether a gland is active can be identified from its central brightness.
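The grey-scale conversion and equalization steps between fig. 4 and fig. 6 can be sketched as follows. This is a generic luminance conversion plus plain histogram equalization, assumed here as a stand-in for the embodiment's processing (nested lists stand in for image arrays):

```python
def to_gray(rgb_img):
    """Luminance grey-scale conversion of an RGB image (nested lists)."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_img]

def equalize(gray_img, levels=256):
    """Histogram equalization: stretch the grey-level distribution so
    gland volume and region size stand out (fig. 5 -> fig. 6)."""
    flat = [p for row in gray_img for p in row]
    n = len(flat)
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:                       # cumulative distribution
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    lut = [round((c - cdf_min) / max(n - cdf_min, 1) * (levels - 1))
           for c in cdf]                 # remap each grey level
    return [[lut[p] for p in row] for row in gray_img]

img = [[(10, 10, 10), (200, 200, 200)], [(10, 10, 10), (200, 200, 200)]]
g = to_gray(img)
assert g == [[10, 200], [10, 200]]   # grey pixels keep their value
eq = equalize(g)
assert eq == [[0, 255], [0, 255]]    # two-level image stretched to full range
```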
The foregoing is only a preferred embodiment of the present invention, and it should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and these improvements and modifications should also be construed as the protection scope of the present invention.

Claims (6)

1. An endometrial gland density estimation method, characterized in that the method comprises the following steps:
Step 1: inputting the marked hysteroscope pictures into a training detection network for training to obtain the gland image characteristics;
Step 2: a doctor puts a medical instrument probe into the uterus to acquire endometrium video data, and hysteroscope pictures are intercepted according to the moving speed of the probe;
Step 3: standardizing the acquired hysteroscope pictures, the standardization comprising unified definition processing and noise-removal regularization of the hysteroscope pictures, and planarization mapping of the processed images;
Step 4: identifying glands in the hysteroscope pictures using the training network and a matching network;
Step 5: de-overlapping and combining the identified hysteroscope pictures to obtain the whole endometrium image;
Step 6: obtaining the endometrial gland density from the size of the whole endometrium image and the number of glands in it;
the specific process of the step 2 is as follows: the hysteroscope picture is a square picture, the side length is l, and the next hysteroscope picture is obtained when d is more than l, wherein t is t 2 -t 1 D ═ Δ t × v, v is the linear velocity of the probe moving on the endometrium, Δ t is the time from the last time of obtaining the hysteroscope picture to the next time of obtaining the hysteroscope picture, t 1 Time, t, for last acquisition of corresponding frame of hysteroscope picture 2 The time for acquiring the corresponding frame of the hysteroscope picture next time;
the specific process of the step 3 is as follows: the unified definition processing of the hysteroscope picture uses an image enhancement algorithm, the structure of the image enhancement algorithm comprises an enhancer and a discriminator, the input of the enhancer is a low-precision image, the output is a high-precision image, the input of the discriminator is a real high-precision image or an image processed by the image enhancer, the low-precision image in a plurality of centers is adjusted to a high-precision image, the difference of different image precision is eliminated, the regularization process of removing noise is that a machine noise shielding algorithm is used, the structure of the machine noise shielding algorithm comprises the discriminator and a composite generator, the plurality of centers are firstly divided into two types of reference centers and common centers, the discriminator is used for discriminating whether the input image is a reference center image or a composite image of the common centers, the composite generator adds a noise mask to the common center data through the mask generator, and then generates a plane coordinate axial plane, pasting the processed images on the coordinate axis surface, and obtaining the front and the back images as follows:
d_i = √[ (x_2 − x_1)_i² + (y_2 − y_1)_i² ]
where d_i is the displacement between capturing one hysteroscope picture and capturing the next; the coordinates and time of the previous capture are (x_1, y_1)_i and t_1, and the coordinates and time of the next capture are (x_2, y_2)_i and t_2.
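The frame-capture rule of step 2 (take a new frame once the probe displacement d = Δt × v exceeds the frame side length l) and the displacement formula above can be sketched as follows; the function names and argument conventions are illustrative assumptions, not from the patent:

```python
import math

def capture_times(v, l, timestamps):
    """Keep only the timestamps at which a new hysteroscope frame is taken:
    a frame is captured when the probe displacement d = Δt·v since the
    previous capture exceeds the square frame side length l."""
    captured, t_last = [], None
    for t in timestamps:
        if t_last is None or (t - t_last) * v > l:
            captured.append(t)
            t_last = t
    return captured

def displacement(p1, p2):
    """Euclidean displacement d_i between the capture coordinates of two
    consecutive frames, as in the formula above."""
    (x1, y1), (x2, y2) = p1, p2
    return math.hypot(x2 - x1, y2 - y1)
```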
2. The method for estimating endometrial gland density as in claim 1, wherein: the specific process of step 4 is as follows: relevant strategy-selection parameters are input from the strategy selection module; the strategy-selection parameters and the training-detection-network feature values are input into the matching network module; the matching network module identifies the glands in the image according to the gland features of the training detection network and the strategy-selection parameters, then outputs the identified glands and labels the gland volumes and categories, the categories comprising glands with openings and glands without openings; and the number of glands in each image is counted.
3. The method for estimating endometrial gland density as claimed in claim 2, wherein: the strategy selection module selects according to the following principle:
f(x) = max[ ω_1 c_i + ω_2 s_i + ω_3 a_i + ω_4 k_i ]
wherein c_i is the standard gland density, s_i is the age of the patient, a_i is the detection accuracy, and k_i is the mean gland size; the four factors are assigned four corresponding weights, ω_1, ω_2, ω_3 and ω_4 respectively, whose values can be set freely according to specific requirements, giving the method high flexibility and extensibility. The selected candidates are put into the training detection network for feature-value selection, and the target with the highest score is the target of interest selected by the strategy, i.e. the preliminary gland-category determination.
4. The method for estimating endometrial gland density as claimed in claim 2, wherein: the matching network module comprises a target subnet and a search subnet; the target subnet takes the training-detection-network features as input, the search subnet extracts features from the detection image, and the two subnets share the same weights and biases. The search subnet adopts an internal cropping unit based on a bottleneck residual block; the internal cropping unit crops out features affected by padding and zero-filling within the block, preventing the convolution filters from learning a position bias. The residual unit, the key module of the template matching network, consists of 3 stacked convolutional layers plus a shortcut connection; the three layers are 1 × 1, 3 × 3 and 1 × 1 convolutions respectively, where the 1 × 1 convolutional layers reduce and then restore the dimensionality, leaving the 3 × 3 convolutional layer as a bottleneck with small input and output sizes.
5. The method for estimating endometrial gland density as in claim 4, wherein: the specific process of step 5 is as follows: the concrete image of the glands in each picture is acquired through step 4; two adjacent images are superposed according to the rule that the front-end part of the preceding image overlaps the rear-end part of the following image over a distance d_i. The gland-image distribution structure within distance d_i of the front end of the preceding image is compared with the gland-image distribution structure within distance d_i of the rear end of the following image; if the distribution structures are the same, the partial images are superposed directly. If they do not coincide, the distance d_i is gradually decreased or increased for comparison until the gland-image distribution within the adjusted distance d_i′ of the front end of the preceding image matches that within d_i′ of the rear end of the following image, at which point superposition is complete; d_i′ is the gradually decreased or increased distance.
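The overlap search of claim 5 can be sketched as follows; as a stand-in for the patent's comparison of "gland distribution structures", this illustration simply scores candidate overlap widths by the mean absolute pixel error between the trailing strip of one frame and the leading strip of the next:

```python
import numpy as np

def find_overlap(prev_img, next_img, d_max):
    """Search for the overlap width d_i' at which the strip at the end of
    prev_img best matches the strip at the start of next_img, trying each
    candidate width from 1 to d_max and keeping the best-scoring one."""
    best_d, best_err = 1, float("inf")
    for d in range(1, d_max + 1):
        tail = prev_img[:, -d:].astype(float)
        head = next_img[:, :d].astype(float)
        err = np.abs(tail - head).mean()
        if err < best_err:
            best_d, best_err = d, err
    return best_d
```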
6. The method for estimating endometrial gland density as in claim 5, wherein: the specific process of step 6 is as follows: the formulas for calculating the coordinates of the overall endometrium image are:
b_x = σ(t_x) + c_x
b_y = σ(t_y) + c_y
b_w = p_w · e^{t_w}
b_h = p_h · e^{t_h}
wherein (c_x, c_y) represents the coordinates of the upper-left corner of the grid cell, (σ(t_x), σ(t_y)) are offset values (in the same units as d_i or d_i′), (p_w, p_h) are the width and height of the prior box, and (b_x, b_y, b_w, b_h) represents the coordinates of the finally obtained bounding box. The size of the overall endometrium image is calculated from these coordinates, the number of glands on the overall endometrium image is then counted, and the average density is obtained.
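The box-decoding formulas of claim 6 (a YOLO-style parameterization) and the final density computation can be sketched as follows; the function names are illustrative assumptions:

```python
import math

def decode_box(t, cell, prior):
    """Decode predicted offsets (t_x, t_y, t_w, t_h) into a bounding box
    using b_x = σ(t_x)+c_x, b_y = σ(t_y)+c_y, b_w = p_w·e^{t_w},
    b_h = p_h·e^{t_h}, given the grid-cell corner and the prior box."""
    tx, ty, tw, th = t
    cx, cy = cell
    pw, ph = prior
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    return (sigmoid(tx) + cx, sigmoid(ty) + cy,
            pw * math.exp(tw), ph * math.exp(th))

def gland_density(n_glands, b_w, b_h):
    """Average gland density: gland count divided by the area of the
    stitched overall endometrium image (step 6)."""
    return n_glands / (b_w * b_h)
```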
CN202110299344.7A 2021-03-21 2021-03-21 Endometrial gland density estimation method Active CN113034460B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110299344.7A CN113034460B (en) 2021-03-21 2021-03-21 Endometrial gland density estimation method


Publications (2)

Publication Number Publication Date
CN113034460A CN113034460A (en) 2021-06-25
CN113034460B true CN113034460B (en) 2022-09-09

Family

ID=76472267

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110299344.7A Active CN113034460B (en) 2021-03-21 2021-03-21 Endometrial gland density estimation method

Country Status (1)

Country Link
CN (1) CN113034460B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106535738A (en) * 2014-04-01 2017-03-22 弗迪格医疗有限公司 A monitoring system for continuously sensing the uterus
CN110647889A (en) * 2019-08-26 2020-01-03 中国科学院深圳先进技术研究院 Medical image recognition method, medical image recognition apparatus, terminal device, and medium
CN111344801A (en) * 2017-11-22 2020-06-26 通用电气公司 System and method for multimodal computer-assisted diagnosis of prostate cancer

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100158332A1 (en) * 2008-12-22 2010-06-24 Dan Rico Method and system of automated detection of lesions in medical images
EP2599055A2 (en) * 2010-07-30 2013-06-05 Fundação D. Anna Sommer Champalimaud E Dr. Carlos Montez Champalimaud Systems and methods for segmentation and processing of tissue images and feature extraction from same for treating, diagnosing, or predicting medical conditions


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Morphological reconstruction-based algorithm for marking endometrial gland openings under hysteroscopy; Yang Miao et al.; Beijing Biomedical Engineering; October 2015 (No. 05); pp. 468-469 *

Also Published As

Publication number Publication date
CN113034460A (en) 2021-06-25

Similar Documents

Publication Publication Date Title
CN108898160B (en) Breast cancer histopathology grading method based on CNN and imaging omics feature fusion
WO2021203795A1 (en) Pancreas ct automatic segmentation method based on saliency dense connection expansion convolutional network
WO2019001208A1 (en) Segmentation algorithm for choroidal neovascularization in oct image
CN109815888B (en) Novel Pasteur staining method-based abnormal cervical cell automatic identification method
CN107895364B (en) A kind of three-dimensional reconstruction system for the preoperative planning of virtual operation
CN109636805A (en) A kind of uterine neck image lesion region segmenting device and method based on classification priori
CN112967285B (en) Chloasma image recognition method, system and device based on deep learning
CN109035283A (en) It is a kind of precisely to be detected and quantitative analysis method based on the pulmonary emphysema for randomly selecting subregion
CN112215799A (en) Automatic classification method and system for grinded glass lung nodules
CN114677378B (en) Computer-aided diagnosis and treatment system based on ovarian tumor benign and malignant prediction model
CN114627067A (en) Wound area measurement and auxiliary diagnosis and treatment method based on image processing
CN113034460B (en) Endometrial gland density estimation method
CN108836394B (en) Automatic measuring method for descending angle of fetal head
WO2018098697A1 (en) Image feature repeatability measurement method and device
CN111354057B (en) Bone fracture line map drawing method based on image deformation technology
CN112017772A (en) Disease cognition model construction method and system based on woman leucorrhea
WO2023024524A1 (en) Fetal ultrasonic radiomics feature-based chromosomal abnormality prediction model construction method and diagnosis device
Shariaty et al. Severity and progression quantification of covid-19 in ct images: a new deep-learning approach
Tang et al. MRI image segmentation system of uterine fibroids based on AR-Unet network
CN115187577A (en) Method and system for automatically delineating breast cancer clinical target area based on deep learning
CN110399899A (en) Uterine neck OCT image classification method based on capsule network
CN112598669B (en) Lung lobe segmentation method based on digital human technology
CN113409275B (en) Method for determining thickness of transparent layer behind fetal neck based on ultrasonic image and related device
CN113255718B (en) Cervical cell auxiliary diagnosis method based on deep learning cascade network method
Cao et al. A deep learning-based method for cervical transformation zone classification in colposcopy images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant