CN113034460B - Endometrial gland density estimation method
- Publication number: CN113034460B
- Application number: CN202110299344.7A
- Authority: CN (China)
- Prior art keywords: image, hysteroscope, picture, gland, endometrium
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
- G06T7/0012—Biomedical image inspection
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/303—Endoscopes for the vagina (vaginoscopes)
- G06N3/045—Combinations of networks
- G06N3/08—Learning methods
- G06T5/70—Denoising; Smoothing
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T7/11—Region-based segmentation
- G06T2207/10068—Endoscopic image
- G06T2207/20081—Training; Learning
- G06T2207/20084—Artificial neural networks [ANN]
- G06T2207/20132—Image cropping
- G06T2207/30204—Marker
Abstract
The invention discloses an endometrial gland density estimation method belonging to the field of artificial intelligence. The method comprises: training a detection network; acquiring endometrial video data and capturing hysteroscope pictures from it; standardizing the hysteroscope pictures; identifying glands in the hysteroscope pictures using the trained detection network together with a matching network; removing the overlap between the identified hysteroscope pictures and combining them into an overall image of the endometrium; and obtaining the endometrial gland density from that image. The invention improves the specificity of intrauterine adhesion classification and the accuracy of prognosis prediction, analyzes the intrauterine adhesion judgment mechanism based on deep learning, and accurately locates the pathomorphological target points of intrauterine adhesion. Through strategy selection it identifies images more accurately, greatly reduces the amount of computation, and selects the corresponding feature values of interest in advance, improving detection precision and efficiency. It also collects big data on intrauterine adhesion classification and provides a homogenized diagnosis and treatment scheme for intrauterine adhesion.
Description
Technical Field
The invention relates to the field of artificial intelligence, in particular to an endometrial gland density estimation method.
Background
Intrauterine adhesion (IUA), also known as Asherman syndrome, is characterized by damage to the endometrium and adhesion between the uterine walls caused by injury to the uterine cavity, infection and other physical or chemical factors, resulting in partial or complete occlusion of the uterine cavity and/or the uterine cervix. Depending on the location of the adhesion, the main clinical manifestations are hypomenorrhea, amenorrhea, periodic lower abdominal pain, abnormal pregnancy and the like, and IUA is one of the main causes of miscarriage and infertility. In China the most common cause of intrauterine adhesion is induced abortion, and as hysteroscopy has gradually become widespread, the number of uterine cavity operations has increased and the incidence of IUA has grown accordingly. Its high recurrence rate and poor therapeutic outcome seriously affect the fertility of women of childbearing age, as well as the stability of patients' families and of society. The length and severity of the disease course affect the prognosis of treatment, and early detection, diagnosis and treatment can improve the pregnancy prognosis of IUA patients to a certain extent; moreover, the judged severity of intrauterine adhesion influences clinical treatment decisions. Early, accurate and noninvasive diagnosis and severity grading of intrauterine adhesion are therefore of great clinical significance. For IUA patients who wish to conceive, the aim of therapy is mainly to obtain offspring; since postoperative pregnancy is closely related to the severity of IUA, grading patients with intrauterine adhesion is essential for predicting prognosis and pregnancy outcome.
At present there is consensus on the diagnosis of intrauterine adhesion: the gold standard is hysteroscopy. Postoperative prognosis evaluation standards for IUA patients include the American Fertility Society scoring standard, the European Society for Gynaecological Endoscopy grading standard, the Chinese IUA scoring standard and others, but none is internationally accepted, and owing to methodological and technical limitations the endometrium, the index that most directly influences prognosis, is not included in these scoring standards. The endometrium is the determining factor for the implantation of the fertilized ovum and the development of the embryo. The endometrial glands secrete a number of proteins essential for embryo survival, growth and development, such as mucin MUC-1 and glycoprotein-A, which confirms the importance of endometrial glands in pregnancy. With the advent of the artificial intelligence era, computer-aided medical image analysis has played an increasingly important role in the early diagnosis of diseases. A new generation of artificial intelligence technology with deep learning at its core can automatically extract high-dimensional features and thereby achieve higher classification accuracy. An endometrial gland density estimation method with high accuracy can therefore provide important support for clinical medical decision making, and offers new ideas and theoretical guidance for improving the noninvasive diagnosis rate of IUA and the prognostic pregnancy rate.
Disclosure of Invention
The invention aims to provide an endometrial gland density estimation method that solves the technical problem of the prior art, in which endometrial gland density can only be assessed by visual observation, with low efficiency.
A method for estimating endometrial gland density, said method comprising the steps of,
Step 1: inputting the marked hysteroscope pictures into a training detection network for training to obtain the image features of the glands;
Step 2: a doctor puts a medical instrument probe into the uterus to obtain endometrial video data, and hysteroscope pictures are captured according to the moving speed of the probe;
Step 3: standardizing the acquired hysteroscope pictures, the standardization comprising unified definition processing and noise-removal regularization of the hysteroscope pictures, followed by planarization mapping of the processed images;
Step 4: identifying glands in the hysteroscope pictures using the training network and a matching network;
Step 5: removing the overlap between the identified hysteroscope pictures and combining them to obtain an overall image of the endometrium;
Step 6: obtaining the endometrial gland density from the size of the overall endometrium image and the number of glands in it.
Further, the specific process of step 2 is: the hysteroscope picture is a square picture with side length l, and the next hysteroscope picture is captured when d > l, where Δt = t2 − t1 and d = Δt × v; v is the linear velocity of the medical instrument probe moving over the endometrium, Δt is the time elapsed between capturing the previous hysteroscope picture and capturing the next one, t1 is the time at which the frame of the previous hysteroscope picture was captured, and t2 is the time at which the frame of the next hysteroscope picture is captured.
Further, the specific process of step 3 is: the unified definition processing of the hysteroscope picture uses an image enhancement algorithm whose structure comprises an enhancer and a discriminator; the input of the enhancer is a low-precision image and its output is a high-precision image, while the input of the discriminator is either a real high-precision image or an image processed by the image enhancer, so that low-precision images from multiple centers are adjusted to high precision and differences between image precisions are eliminated. The noise-removal regularization uses a machine-noise masking algorithm whose structure comprises a discriminator and a composite generator; the multiple centers are first divided into two types, reference centers and ordinary centers, the discriminator distinguishes whether an input image is a reference-center image or a synthetic ordinary-center image, and the composite generator adds a noise mask to the ordinary-center data through a mask generator. A planar coordinate plane is then generated and the processed images are pasted onto it, after which the distance between two successive images can be obtained as d_i = sqrt((x2 − x1)^2 + (y2 − y1)^2), where d_i is the distance from the position at which the previous hysteroscope picture was captured to the position at which the next one is captured; the coordinates and time of the previous capture are (x1, y1)_i and t1, and the coordinates and time of the next capture are (x2, y2)_i and t2.
Further, the specific process of step 4 is: relevant strategy selection parameters are input from the strategy selection module; the strategy selection parameters and the feature values of the training detection network are input into the matching network module; the matching network module identifies the glands in the image according to the gland features of the training detection network and the strategy selection parameters, outputs the identified glands, labels the glands and their categories, the categories comprising glands with open openings and glands without open openings, and counts the number of glands in each image.
Further, the selection principle of the strategy selection module is:

f(x) = max[ω1·c_i + ω2·s_i + ω3·a_i + ω4·k_i]

where c_i is the standard gland density, s_i is the patient's age, a_i is the detection accuracy, and k_i is the mean gland size; the four factors are assigned the four corresponding weights ω1, ω2, ω3 and ω4 respectively. The weight values can be set freely according to specific requirements, which gives high flexibility and extensibility. The feature values of the training detection network are fed into the selection, and the target with the highest score is the target of interest chosen by the strategy, i.e. the preliminarily determined gland category.
Further, the matching network module comprises a target subnet and a search subnet; the target subnet receives the training detection network features, the search subnet extracts features from the detection image, and the two subnets share the same weights and biases. The search subnet adopts an internal cropping unit based on a bottleneck residual block; the internal cropping unit crops out the features affected by padding and zero filling inside the block, preventing the convolution filters from learning a position bias. The residual unit is a key module of the template matching network and consists of 3 stacked convolution layers together with a shortcut connection; the three layers are 1×1, 3×3 and 1×1 convolutions respectively, where the 1×1 convolution layers are responsible for reducing and then restoring the dimensionality, making the 3×3 convolution layer a bottleneck with small input and output sizes.
Further, the specific process of step 5 is: the specific image of the glands in each picture is acquired through step 4; then, for two adjacent images, the front-end portion of the previous image and the overlapping rear-end portion of the following image are superposed according to the actual distance d_i between the two images. The gland image distribution structure within the distance d_i at the front end of the previous image is compared with the gland image distribution structure within the distance d_i at the rear end of the following image; if they are the same, the partial images are directly superposed. If they cannot be superposed, the distance is gradually decreased or increased for comparison until the gland image distribution structure within the distance d_i' at the front end of the previous image is the same as that within d_i' at the rear end of the following image, at which point the superposition is complete; d_i' is the gradually decreased or increased distance.
Further, the specific process of step 6 is: the formulas for calculating the coordinates of the overall endometrium image are:

b_x = σ(t_x) + c_x

b_y = σ(t_y) + c_y

where (c_x, c_y) are the coordinates of the upper-left corner of the grid cell and (σ(t_x), σ(t_y)) are offset values, expressed in the same units as d_i' or d_i; (p_w, p_h) are the width and height of the prior box, and (b_x, b_y, b_w, b_h) are the coordinates of the final bounding box. The size of the overall endometrium image is calculated from the coordinates of the final bounding box, the number of glands on the overall endometrium image is then counted, and the average density is obtained. Meanwhile, the number of glands identified in each image in step 4 and the image itself are retained and output together with the average density to complete the density estimation.
By adopting the technical scheme, the invention has the following technical effects:
the invention can improve the specificity of classification of the intrauterine adhesion and the accuracy of prediction prognosis, analyzes the intrauterine adhesion judgment mechanism based on deep learning, accurately positions the pathomorphism target point of the intrauterine adhesion, identifies images more accurately through strategy selection, can greatly reduce the operation process, can select corresponding interest characteristic values in advance, improves the detection precision and efficiency, collects big data of classification of the intrauterine adhesion, and provides an intrauterine adhesion homogenization diagnosis and treatment scheme.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
FIG. 2 is a schematic block diagram of the present invention.
FIG. 3 is a diagram of an original count picture according to the present invention.
FIG. 4 is a cropped target area picture according to the present invention.
FIG. 5 is the target area picture after gray-scale conversion according to the present invention.
FIG. 6 is the target area picture after the equalization processing according to the present invention.
FIG. 7 is the marked picture after counting by the detection algorithm of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, preferred embodiments are given below and the present invention is described in further detail. It should be noted, however, that the numerous details set forth in the description are provided merely to give the reader a thorough understanding of one or more aspects of the present invention, and these aspects may be practiced without these specific details.
An endometrial gland density estimation method, as shown in FIGS. 1-2, comprises the following steps.
Step 1: the marked hysteroscope pictures are input into the training detection network for training to obtain the image features of the glands. The size of the input image is set to 416 × 416 × 3. The glands on the image are first identified manually, then feature values are extracted, and the extracted values are checked against the manual identification in a second pass. Features are then extracted from the image through a feature extraction network.
To train the detection network, sample pairs are selected from the ImageNet dataset and target images are extracted from the frames of a video; the target images have a size of 127 × 127 × 3. The target image is input into the corresponding subnet of the template matching network to obtain the target feature layer.
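For illustration, the following is a minimal PyTorch sketch of the Siamese feature extraction implied by the two input sizes above; the backbone layers, channel counts and names are assumptions, since the patent does not specify the network architecture.

```python
import torch
import torch.nn as nn

# Minimal sketch (not the patent's actual network): one shared-weight backbone
# applied to both the 416x416x3 detection image and the 127x127x3 target image.
class FeatureNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(3, stride=2, padding=1),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.backbone(x)

net = FeatureNet()                              # same weights serve both subnets
detect_feat = net(torch.randn(1, 3, 416, 416))  # detection-image features
target_feat = net(torch.randn(1, 3, 127, 127))  # target feature layer
```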
Step 2: the doctor puts the medical instrument probe into the uterus to obtain endometrial video data, and hysteroscope pictures are captured according to the moving speed of the probe. The hysteroscope picture is a square picture with side length l, and the next hysteroscope picture is captured when d > l, where Δt = t2 − t1 and d = Δt × v; v is the linear velocity of the medical instrument probe moving over the endometrium, Δt is the time elapsed between capturing the previous hysteroscope picture and capturing the next one, t1 is the time at which the frame of the previous hysteroscope picture was captured, and t2 is the time at which the frame of the next hysteroscope picture is captured.
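A minimal sketch of this capture rule is given below; the frame timestamps, the velocity value and the common length unit shared by v and l are illustrative assumptions.

```python
# Capture rule of step 2: take a new square picture (side length l) once the
# probe has moved a distance d = (t2 - t1) * v greater than l since the last
# captured frame.
def capture_times(frame_times, v, l):
    """frame_times: video frame timestamps (s); v: probe linear velocity;
    l: picture side length, in the same length unit as v * time."""
    captured = [frame_times[0]]            # keep the first frame as reference
    t1 = frame_times[0]
    for t2 in frame_times[1:]:
        if (t2 - t1) * v > l:              # d > l: capture the next picture
            captured.append(t2)
            t1 = t2                        # this frame becomes the new reference
    return captured

# e.g. a 30 fps video, probe moving at 5 mm/s, 10 mm square pictures
print(capture_times([i / 30.0 for i in range(300)], v=5.0, l=10.0))
```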
Step 3: the acquired hysteroscope pictures are standardized; the standardization comprises unified definition processing and noise-removal regularization of the hysteroscope pictures, followed by planarization mapping of the processed images. The unified definition processing uses an image enhancement algorithm whose structure comprises an enhancer and a discriminator; the input of the enhancer is a low-precision image and its output is a high-precision image, while the input of the discriminator is either a real high-precision image or an image processed by the image enhancer, so that low-precision images from multiple centers are adjusted to high precision and differences between image precisions are eliminated. The noise-removal regularization uses a machine-noise masking algorithm whose structure comprises a discriminator and a composite generator; the multiple centers are first divided into two types, reference centers and ordinary centers, the discriminator distinguishes whether an input image is a reference-center image or a synthetic ordinary-center image, and the composite generator adds a noise mask to the ordinary-center data through a mask generator. A planar coordinate plane is then generated and the processed images are pasted onto it, after which the distance between two successive images can be obtained as d_i = sqrt((x2 − x1)^2 + (y2 − y1)^2), where d_i is the distance from the position at which the previous hysteroscope picture was captured to the position at which the next one is captured; the coordinates and time of the previous capture are (x1, y1)_i and t1, and the coordinates and time of the next capture are (x2, y2)_i and t2.
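The enhancement and noise-masking networks are only outlined above, so the sketch below covers just the final planar-distance step: computing d_i between two successive captures pasted on the coordinate plane.

```python
import math

# d_i = sqrt((x2 - x1)^2 + (y2 - y1)^2): planar distance between the capture
# positions of two successive hysteroscope pictures.
def capture_distance(p1, p2):
    """p1 = (x1, y1), p2 = (x2, y2): planar coordinates of the two captures."""
    (x1, y1), (x2, y2) = p1, p2
    return math.hypot(x2 - x1, y2 - y1)

print(capture_distance((0.0, 0.0), (3.0, 4.0)))  # -> 5.0
```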
Step 4: glands are obtained by identifying the hysteroscope pictures with the training network and the matching network. As shown in FIG. 2, relevant strategy selection parameters are input from the strategy selection module; the strategy selection parameters and the feature values of the training detection network are input into the matching network module; the matching network module identifies the glands in the image according to the gland features of the training detection network and the strategy selection parameters, outputs the identified glands, labels the gland volumes and categories, the categories comprising glands with open openings and glands without open openings, and counts the number of glands in each image.
The selection principle of the strategy selection module is:

f(x) = max[ω1·c_i + ω2·s_i + ω3·a_i + ω4·k_i]

where c_i is the standard gland density, s_i is the patient's age, a_i is the detection accuracy, and k_i is the mean gland size; the four factors are assigned the four corresponding weights ω1, ω2, ω3 and ω4 respectively. The weight values can be set freely according to specific requirements, which gives high flexibility and extensibility. The feature values of the training detection network are fed into the selection, and the target with the highest score is the target of interest chosen by the strategy, i.e. the preliminarily determined gland category.
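A minimal sketch of this scoring rule follows; the weight values and the candidate representation are placeholders chosen for illustration.

```python
# Strategy selection f(x) = max[w1*c_i + w2*s_i + w3*a_i + w4*k_i]: score each
# candidate target from its standard gland density c, patient age s, detection
# accuracy a and mean gland size k, and keep the highest-scoring candidate.
def select_target(candidates, w1=1.0, w2=0.1, w3=2.0, w4=0.5):
    """candidates: dicts with keys 'c', 's', 'a', 'k' (one per candidate)."""
    return max(candidates,
               key=lambda t: w1 * t['c'] + w2 * t['s'] + w3 * t['a'] + w4 * t['k'])

candidates = [
    {'c': 0.8, 's': 32, 'a': 0.91, 'k': 14.0},
    {'c': 0.6, 's': 32, 'a': 0.97, 'k': 12.5},
]
print(select_target(candidates))   # highest score = target of interest
```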
The matching network module comprises a target subnet and a search subnet; the target subnet receives the training detection network features, the search subnet extracts features from the detection image, and the two subnets share the same weights and biases. The search subnet adopts an internal cropping unit based on a bottleneck residual block; the internal cropping unit crops out the features affected by padding and zero filling inside the block, preventing the convolution filters from learning a position bias. The residual unit is a key module of the template matching network and consists of 3 stacked convolution layers together with a shortcut connection; the three layers are 1×1, 3×3 and 1×1 convolutions respectively, where the 1×1 convolution layers are responsible for reducing and then restoring the dimensionality, making the 3×3 convolution layer a bottleneck with small input and output sizes.
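The following PyTorch sketch illustrates such a cropping-inside bottleneck residual unit; the channel sizes and the one-pixel crop width are assumptions.

```python
import torch
import torch.nn as nn

# Bottleneck residual unit with internal cropping: 1x1 -> 3x3 -> 1x1 stacked
# convolutions plus a shortcut connection, then the padding-affected border
# features are cropped away so the filters cannot learn a position bias.
class CropBottleneck(nn.Module):
    def __init__(self, channels=256, bottleneck=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, bottleneck, 1), nn.ReLU(inplace=True),   # reduce
            nn.Conv2d(bottleneck, bottleneck, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(bottleneck, channels, 1),                          # restore
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.relu(self.body(x) + x)  # residual shortcut
        return out[:, :, 1:-1, 1:-1]       # crop the zero-padding border

block = CropBottleneck()
print(block(torch.randn(1, 256, 15, 15)).shape)  # torch.Size([1, 256, 13, 13])
```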
Step 5: the identified hysteroscope pictures are de-overlapped and combined to obtain an overall image of the endometrium. The specific image of the glands in each picture is acquired through step 4; then, for two adjacent images, the front-end portion of the previous image and the overlapping rear-end portion of the following image are superposed according to the actual distance d_i between the two images. The gland image distribution structure within the distance d_i at the front end of the previous image is compared with the gland image distribution structure within the distance d_i at the rear end of the following image; if they are the same, the partial images are directly superposed. If they cannot be superposed, the distance is gradually decreased or increased for comparison until the gland image distribution structure within the distance d_i' at the front end of the previous image is the same as that within d_i' at the rear end of the following image, at which point the superposition is complete; d_i' is the gradually decreased or increased distance.
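A minimal sketch of this de-overlapping rule follows, with glands represented as binary masks. Exact strip equality as the matching criterion, and matching the trailing strip of the previous image against the leading strip of the next one, are assumptions made for illustration.

```python
import numpy as np

# Step 5 sketch: compare the gland pattern in a strip of width d at the facing
# edges of two adjacent images, shrinking/growing d (the d_i' of the text)
# until the two strips match, then join the images without the duplicate strip.
def find_overlap(prev_mask, next_mask, d_init):
    """prev_mask/next_mask: 2-D binary gland masks; d_init: distance d_i."""
    h, w = prev_mask.shape
    widths = [d_init] + [d for k in range(1, d_init)
                         for d in (d_init - k, d_init + k) if 0 < d < w]
    for d in widths:                       # try d_i, then gradually vary it
        if np.array_equal(prev_mask[:, w - d:], next_mask[:, :d]):
            return d                       # gland distributions coincide
    return 0                               # no overlap found

def stitch(prev_img, next_img, d):
    return np.concatenate([prev_img, next_img[:, d:]], axis=1)

a = np.zeros((4, 12), dtype=int); b = np.zeros((4, 12), dtype=int)
a[1, 9:] = 1; b[1, :3] = 1                 # identical gland strip of width 3
print(find_overlap(a, b, d_init=3))        # -> 3
```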
Step 6: the endometrial gland density is obtained from the size of the overall endometrium image and the number of glands in it. The formulas for calculating the coordinates of the overall endometrium image are:

b_x = σ(t_x) + c_x

b_y = σ(t_y) + c_y

where (c_x, c_y) are the coordinates of the upper-left corner of the grid cell and (σ(t_x), σ(t_y)) are offset values, expressed in the same units as d_i' or d_i; (p_w, p_h) are the width and height of the prior box, and (b_x, b_y, b_w, b_h) are the coordinates of the final bounding box. The size of the overall endometrium image is calculated from the coordinates of the final bounding box, the number of glands on the overall endometrium image is then counted, and the average density is obtained. Meanwhile, the number of glands identified in each image in step 4 and the image itself are retained and output together with the average density to complete the density estimation.
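A minimal sketch of this final step follows. The grid-offset decoding uses the two formulas above; since the width/height decoding from the prior box is not written out in the text, taking the overall-image area directly from the prior size (p_w, p_h) is an assumption.

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

# Step 6 sketch: decode the box centre with b_x = sigmoid(t_x) + c_x and
# b_y = sigmoid(t_y) + c_y, take the overall-image area from the prior box,
# and divide the gland count by that area to get the average density.
def gland_density(t_x, t_y, c_x, c_y, p_w, p_h, n_glands):
    b_x = sigmoid(t_x) + c_x
    b_y = sigmoid(t_y) + c_y
    area = p_w * p_h
    return (b_x, b_y), n_glands / area

centre, density = gland_density(0.2, -0.1, 4, 7, 120.0, 80.0, n_glands=53)
print(centre, density)
```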
The density obtained includes the average gland density of the whole endometrium as well as the density and total number of glands in each single image. Moreover, from the glands in a single image it can be determined which glands are active: an active gland shows a bright flash point in the middle of its volume, as shown in FIG. 3. Glands are identified from the image pixels and exposure points, and glands that are no longer metabolizing need to be removed.
As shown in FIG. 3, the original count picture of a single frame obtained from the medical apparatus is a circular picture. After square cropping, the square target-region picture of FIG. 4 is obtained; gray-scale conversion then yields the target-region picture of FIG. 5, and equalization (normalization) processing yields the picture of FIG. 6, in which the volume and region size of the glands can be seen more clearly. FIG. 7 shows the marked picture after counting: after marking by the algorithm, the specific number and size of the glands can be clearly seen, and whether a gland is active can be identified from the brightness at its center.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and such improvements and modifications should also be regarded as falling within the protection scope of the present invention.
Claims (6)
1. An endometrial gland density estimation method, characterized in that it comprises the following steps:
Step 1: inputting the marked hysteroscope pictures into a training detection network for training to obtain the gland image features;
Step 2: a doctor puts a medical instrument probe into the uterus to obtain endometrial video data, and hysteroscope pictures are captured according to the moving speed of the probe;
Step 3: standardizing the acquired hysteroscope pictures, the standardization comprising unified definition processing and noise-removal regularization of the hysteroscope pictures, followed by planarization mapping of the processed images;
Step 4: identifying glands in the hysteroscope pictures using the training network and a matching network;
Step 5: removing the overlap between the identified hysteroscope pictures and combining them to obtain an overall endometrium image;
Step 6: obtaining the endometrial gland density from the size of the overall endometrium image and the number of glands in it;
wherein the specific process of step 2 is: the hysteroscope picture is a square picture with side length l, and the next hysteroscope picture is captured when d > l, where Δt = t2 − t1 and d = Δt × v; v is the linear velocity of the probe moving over the endometrium, Δt is the time elapsed between capturing the previous hysteroscope picture and capturing the next one, t1 is the time at which the frame of the previous hysteroscope picture was captured, and t2 is the time at which the frame of the next hysteroscope picture is captured;
and the specific process of step 3 is: the unified definition processing of the hysteroscope picture uses an image enhancement algorithm whose structure comprises an enhancer and a discriminator; the input of the enhancer is a low-precision image and its output is a high-precision image, while the input of the discriminator is either a real high-precision image or an image processed by the image enhancer, so that low-precision images from multiple centers are adjusted to high precision and differences between image precisions are eliminated; the noise-removal regularization uses a machine-noise masking algorithm whose structure comprises a discriminator and a composite generator; the multiple centers are first divided into two types, reference centers and ordinary centers, the discriminator distinguishes whether an input image is a reference-center image or a synthetic ordinary-center image, and the composite generator adds a noise mask to the ordinary-center data through a mask generator; a planar coordinate plane is then generated and the processed images are pasted onto it, after which the distance between two successive images is obtained as d_i = sqrt((x2 − x1)^2 + (y2 − y1)^2), where d_i is the distance from the position at which the previous hysteroscope picture was captured to the position at which the next one is captured; the coordinates and time of the previous capture are (x1, y1)_i and t1, and the coordinates and time of the next capture are (x2, y2)_i and t2.
2. The method for estimating endometrial gland density as in claim 1, wherein the specific process of step 4 is: relevant strategy selection parameters are input from the strategy selection module; the strategy selection parameters and the feature values of the training detection network are input into the matching network module; the matching network module identifies the glands in the image according to the gland features of the training detection network and the strategy selection parameters, outputs the identified glands, labels the glands and their categories, the categories comprising glands with open openings and glands without open openings, and counts the number of glands in each image.
3. The method for estimating endometrial gland density as claimed in claim 2, wherein the selection principle of the strategy selection module is:

f(x) = max[ω1·c_i + ω2·s_i + ω3·a_i + ω4·k_i]

where c_i is the standard gland density, s_i is the patient's age, a_i is the detection accuracy, and k_i is the mean gland size; the four factors are assigned the four corresponding weights ω1, ω2, ω3 and ω4 respectively; the weight values can be set freely according to specific requirements, which gives high flexibility and extensibility; the feature values of the training detection network are fed into the selection, and the target with the highest score is the target of interest chosen by the strategy, i.e. the preliminarily determined gland category.
4. The method for estimating endometrial gland density as claimed in claim 2, wherein the matching network module comprises a target subnet and a search subnet; the target subnet receives the training detection network features, the search subnet extracts features from the detection image, and the two subnets share the same weights and biases; the search subnet adopts an internal cropping unit based on a bottleneck residual block; the internal cropping unit crops out the features affected by padding and zero filling inside the block, preventing the convolution filters from learning a position bias; the residual unit is a key module of the template matching network and consists of 3 stacked convolution layers together with a shortcut connection; the three layers are 1×1, 3×3 and 1×1 convolutions respectively, where the 1×1 convolution layers are responsible for reducing and then restoring the dimensionality, making the 3×3 convolution layer a bottleneck with small input and output sizes.
5. The method for estimating endometrial gland density as in claim 4, wherein the specific process of step 5 is: the specific image of the glands in each picture is acquired through step 4; then, for two adjacent images, the front-end portion of the previous image and the overlapping rear-end portion of the following image are superposed according to the actual distance d_i between the two images; the gland image distribution structure within the distance d_i at the front end of the previous image is compared with the gland image distribution structure within the distance d_i at the rear end of the following image; if they are the same, the partial images are directly superposed; if they cannot be superposed, the distance is gradually decreased or increased for comparison until the gland image distribution structure within the distance d_i' at the front end of the previous image is the same as that within d_i' at the rear end of the following image, at which point the superposition is complete; d_i' is the gradually decreased or increased distance.
6. The method for estimating endometrial gland density as in claim 5, wherein the specific process of step 6 is: the formulas for calculating the coordinates of the overall endometrium image are:

b_x = σ(t_x) + c_x

b_y = σ(t_y) + c_y

where (c_x, c_y) are the coordinates of the upper-left corner of the grid cell and (σ(t_x), σ(t_y)) are offset values, expressed in the same units as d_i' or d_i; (p_w, p_h) are the width and height of the prior box, and (b_x, b_y, b_w, b_h) are the coordinates of the final bounding box; the size of the overall endometrium image is calculated from the coordinates of the final bounding box, the number of glands on the overall endometrium image is then counted, and the average density is obtained.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202110299344.7A | 2021-03-21 | 2021-03-21 | Endometrial gland density estimation method |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN113034460A | 2021-06-25 |
| CN113034460B | 2022-09-09 |
Family
ID=76472267

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202110299344.7A (Active) | Endometrial gland density estimation method | 2021-03-21 | 2021-03-21 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN113034460B |
Citations (3)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106535738A * | 2014-04-01 | 2017-03-22 | 弗迪格医疗有限公司 | A monitoring system for continuously sensing the uterus |
| CN110647889A * | 2019-08-26 | 2020-01-03 | 中国科学院深圳先进技术研究院 | Medical image recognition method, medical image recognition apparatus, terminal device, and medium |
| CN111344801A * | 2017-11-22 | 2020-06-26 | 通用电气公司 | System and method for multimodal computer-assisted diagnosis of prostate cancer |
Family Cites Families (2)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20100158332A1 * | 2008-12-22 | 2010-06-24 | Dan Rico | Method and system of automated detection of lesions in medical images |
| WO2012016242A2 * | 2010-07-30 | 2012-02-02 | Aureon Biosciences, Inc. | Systems and methods for segmentation and processing of tissue images and feature extraction from same for treating, diagnosing, or predicting medical conditions |

2021-03-21: Application CN202110299344.7A filed in China; granted as patent CN113034460B (status: Active).
Non-Patent Citations (1)

| Title |
|---|
| Marking algorithm for endometrial gland openings under hysteroscopy based on morphological reconstruction; Yang Miao et al.; Beijing Biomedical Engineering; 2015-10-31 (No. 05); pp. 468-469 * |
Also Published As

| Publication Number | Publication Date |
|---|---|
| CN113034460A | 2021-06-25 |
Legal Events

| Date | Code | Title |
|---|---|---|
| | PB01 | Publication |
| | SE01 | Entry into force of request for substantive examination |
| | GR01 | Patent grant |