CN109598709A - Breast auxiliary diagnosis system and method based on fused deep features - Google Patents

Breast auxiliary diagnosis system and method based on fused deep features

Info

Publication number
CN109598709A
CN109598709A (application CN201811440304.4A)
Authority
CN
China
Prior art keywords
mass
deep feature
breast
fusion
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811440304.4A
Other languages
Chinese (zh)
Other versions
CN109598709B (en)
Inventor
王之琼
李默
信俊昌
张倩倩
任捷
黄玉坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northeastern University China
Original Assignee
Northeastern University China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northeastern University China filed Critical Northeastern University China
Priority to CN201811440304.4A priority Critical patent/CN109598709B/en
Publication of CN109598709A publication Critical patent/CN109598709A/en
Application granted granted Critical
Publication of CN109598709B publication Critical patent/CN109598709B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30068Mammography; Breast
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation


Abstract

The present invention provides a breast auxiliary diagnosis system and method based on fused deep features, in the field of medical image post-processing. The system comprises a preprocessing unit, a mass detection unit, a fused-deep-feature extraction unit, and a mass diagnosis unit. The original mammogram is preprocessed and the breast region is divided into several non-overlapping sub-regions; the deep features of each sub-region are extracted with a convolutional neural network (CNN) and clustered with an unsupervised extreme learning machine (US-ELM) to separate mass from non-mass regions; the deep features of each detected mass are then extracted with a CNN and fused with its morphological and texture features into a fused deep feature; finally, the fused deep features are learned with an extreme learning machine (ELM) to obtain a benign/malignant diagnosis of the mass. Applied to breast auxiliary diagnosis, the invention can effectively assist the accurate diagnosis of breast disease.

Description

Breast auxiliary diagnosis system and method based on fused deep features
Technical field
The present invention relates to the field of medical image post-processing, and in particular to a breast auxiliary diagnosis system and method based on fused deep features.
Background technique
Breast cancer seriously endangers women's lives and health; its morbidity and mortality rank first and second among women's diseases. Early detection of masses can effectively reduce breast cancer mortality. Mammography has become the most common method for early breast cancer screening because it is relatively inexpensive and sensitive to small lesions in the breast. In actual diagnosis, however, radiologist fatigue and lapses of attention, together with the complexity of breast structure, can lead to low diagnostic accuracy. Computer-aided mammogram analysis arose to address these problems.
The main pipeline of classical breast auxiliary diagnosis preprocesses the mammogram, then obtains a region of interest and localizes the mass region to complete mass detection; it then extracts features such as shape, texture, and density according to physicians' experience, forms feature vectors, and finally classifies these vectors to obtain a benign/malignant diagnosis.
Although classical breast auxiliary diagnosis has achieved some success, its accuracy still needs improvement. The quality of the feature set used during auxiliary diagnosis directly determines the accuracy of the diagnostic result. Features derived from physicians' experience certainly have their merits, but there inevitably exist features not yet discovered or not expressible by physicians. In recent years, deep learning methods, especially convolutional neural networks, have achieved great success in image recognition, speech recognition, and natural language processing because they extract objective, essential features without manual intervention; this brings a new opportunity for optimizing the feature set in breast auxiliary diagnosis.
Summary of the invention
The technical problem to be solved by the present invention is, in view of the above shortcomings of the prior art, to provide a breast auxiliary diagnosis system and method based on fused deep features: deep learning is used to extract the objective, essential features of breast masses, which are fused with morphological and texture features to form fused deep features, so that both the segmentation and the diagnosis stages of breast auxiliary diagnosis are performed accurately, effectively assisting the accurate diagnosis of breast disease.
In order to solve the above technical problem, the technical solution adopted by the present invention is as follows.
In one aspect, the present invention provides a breast auxiliary diagnosis system based on fused deep features, comprising a preprocessing unit, a mass detection unit, a fused-deep-feature extraction unit, and a mass diagnosis unit.
The preprocessing unit comprises an image denoiser, an image enhancer, a sliding-window generator, and a sub-region divider:
the image denoiser denoises the original mammogram to obtain a denoised image;
the image enhancer emphasizes global or local characteristics of the denoised image, enlarges the differences between different parts, suppresses regions of no interest, and increases the contrast between a suspected mass and the surrounding tissue, yielding an enhanced image;
the sliding-window generator generates a square sliding window placed at the upper-left corner of the breast portion in a left-breast CC image, or at the upper-right corner of the breast portion in a right-breast CC image;
the sub-region divider divides the breast portion of the enhanced image into several non-overlapping sub-regions, which serve as the basis for the subsequent units.
The mass detection unit comprises a sub-region deep-feature extractor, a clusterer, and a segmentation-result extractor:
the sub-region deep-feature extractor extracts the deep features of the sub-regions produced by the preprocessing unit;
the clusterer clusters the deep features of the sub-regions with an unsupervised extreme learning machine (US-ELM), obtaining mass and non-mass regions;
the segmentation-result extractor extracts the edge coordinates of the clustered masses, so that the detection accuracy can be verified and a basis provided for the subsequent breast cancer diagnosis.
The fused-deep-feature extraction unit comprises a deep-feature extractor, a morphological-feature extractor, a texture-feature extractor, and a feature fuser:
the deep-feature extractor extracts, with a CNN, the deep features of each mass detected by the mass detection unit, yielding a deep feature vector;
the morphological-feature extractor extracts the morphological features of the detected mass, yielding a morphological feature vector;
the texture-feature extractor extracts the texture features of the detected mass, yielding a texture feature vector;
the feature fuser fuses the deep, morphological, and texture features of the mass into a fused deep feature vector.
The mass diagnosis unit learns the fused deep feature vectors with an extreme learning machine (ELM) and finally obtains a benign/malignant diagnosis of the mass; it comprises a transformation-matrix generator, a random-parameter generator, a transformer, weight-vector parameter generators, and a parameter selector:
the transformation-matrix generator generates, according to the extreme learning machine (Extreme Learning Machine, ELM) algorithm, the Laplacian transformation matrix of the fused deep features of the mass region in the mammogram;
the random-parameter generator, given the configured number of hidden nodes of the ELM network, randomly generates the weight vectors of the input nodes and the thresholds of the hidden nodes, where the input nodes take the fused deep feature vectors;
the transformer, according to the ELM algorithm, uses the input-node weight vectors and hidden-node thresholds to generate the ELM hidden-layer output matrix H_i of the features;
each weight-vector parameter generator, according to the ELM principle, computes the output-node weight-vector parameter β_i from the hidden-layer output matrix H_i and the output targets;
the parameter selector selects the optimal parameter β among those produced and, following the ELM computation, classifies the fused deep features of the mass region with the optimal β to obtain the final mass classification result.
In another aspect, the present invention also provides a breast auxiliary diagnosis method based on fused deep features, implemented with the above system and comprising the following steps:
Step 1: denoise and enhance the original mammogram, generate a square sliding window, and divide the breast region into several non-overlapping sub-regions.
Step 2: extract the deep features of the breast sub-regions with a convolutional neural network (CNN), cluster the deep features of all sub-regions with an unsupervised extreme learning machine (Unsupervised Extreme Learning Machine, US-ELM), and detect breast masses, obtaining mass and non-mass regions.
Step 3: for each extracted mass region, extract the deep features of the mass with a CNN, extract its morphological and texture features, and fuse the deep, morphological, and texture features into one fused deep feature.
Step 4: learn the fused deep features with an extreme learning machine (ELM) and finally obtain the benign/malignant diagnosis of the mass.
Further, step 1 specifically comprises:
Step 101: apply mean filtering to the mammogram to obtain a denoised image.
Step 102: apply contrast enhancement to the denoised image to obtain an enhanced image.
Step 103: apply edge detection to the enhanced image to obtain the breast edge coordinates.
Step 104: generate a square sliding window placed at the upper-left corner of the breast portion in a left-breast CC image, or at the upper-right corner of the breast portion in a right-breast CC image.
Step 105: slide the generated square window downward from the starting coordinate of the breast in a fixed order to obtain several non-overlapping breast sub-regions; in a left-breast CC image the window slides left-to-right, top-to-bottom, and in a right-breast CC image right-to-left, top-to-bottom.
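As a hedged illustration of step 105, the sketch below partitions an image into non-overlapping 48 × 48 tiles in the stated scan order (48 × 48 is the window size the embodiment uses; the toy image, its dimensions, and the dropping of partial border tiles are illustrative assumptions, not the patent's implementation):

```python
import numpy as np

def partition_into_subregions(image, window=48, left_breast=True):
    """Split an image into non-overlapping square sub-regions.

    Left-breast CC images are scanned left-to-right, right-breast images
    right-to-left; both are scanned top-to-bottom. Partial tiles at the
    borders are simply dropped in this sketch."""
    h, w = image.shape
    rows, cols = h // window, w // window
    subregions = []
    for r in range(rows):
        col_order = range(cols) if left_breast else range(cols - 1, -1, -1)
        for c in col_order:
            y, x = r * window, c * window
            subregions.append(image[y:y + window, x:x + window])
    return subregions

# Toy 96 x 144 "image": 2 rows x 3 columns of 48 x 48 tiles.
img = np.arange(96 * 144, dtype=float).reshape(96, 144)
tiles_left = partition_into_subregions(img, left_breast=True)
tiles_right = partition_into_subregions(img, left_breast=False)
```

Because the tiles do not overlap, each pixel of the covered area appears in exactly one sub-region, which is what lets the later clustering step assign mass/non-mass labels per tile.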
Further, step 2 specifically comprises:
Step 201: design an 8-layer CNN network to extract the deep features of each breast sub-region.
Step 202: cluster the deep features of the sub-regions according to the US-ELM principle, obtaining mass and non-mass regions.
Step 203: extract the edge coordinates of the mass regions from the result of step 202, so that the detection result can be verified and the next diagnostic operation carried out.
Further, step 3 extracts the fused deep features of the mass region, and specifically comprises:
Step 301: design a 10-layer CNN network to extract the deep features of the mass region.
Step 302: extract the morphological features of the mass: circularity g1, normalized-radius entropy g2, normalized-radius variance g3, area ratio g4, and roughness g5, where A is the mass area, P is the perimeter of the mass edge, p_k is the probability that the normalized radius of an edge point falls into the k-th interval (the normalized radii lie in [0, 1], which is divided into K sub-intervals, and the number of edge points in each sub-interval is counted), MN is the total number of edge points, d_i is the normalized radius of the i-th edge point, and d_avg is the average normalized radius of the edge points.
Step 303: extract the texture features of the mass, namely the inverse difference moment t1, entropy t2, energy t3, correlation coefficient t4, and contrast t5; among them,
t2 = Σ P(i, j) · [−ln P(i, j)]
t3 = Σ P(i, j)²
t5 = Σ (i − j)² · P(i, j)
where P(i, j) is the element in row i, column j of the gray-level co-occurrence matrix P, μx and μy are the row and column means of P, and δx and δy are the row and column variances of P.
Step 304: fuse the deep features, morphological features, and texture features extracted in steps 301–303 into one fused deep feature.
Further, step 4 classifies the fused deep features and finally obtains the benign/malignant classification of the mass, specifically comprising:
Step 401: generate the Laplacian matrix of the input nodes according to the ELM principle; the input nodes take the fused deep features.
Step 402: set the number of hidden nodes, and randomly generate the input-node weight vectors and the hidden-node thresholds.
Step 403: according to the ELM principle, use the input-node weight vectors and hidden-node thresholds to convert the feature vectors formed by the fused deep features of step 3 into the ELM hidden-layer output matrix H_i.
Step 404: each weight-vector parameter generator produces a weight-vector parameter β_i from its fused deep feature vectors and sends it to the parameter selector.
Step 405: the parameter selector receives and collects the weight-vector parameters β_i from the generators, compares the classification results obtained with the different parameters, selects the optimal parameter β, and then, following the ELM computation, classifies the input fused deep features with β to obtain the final mass classification result.
The beneficial effects of the above technical solution are as follows: with the breast auxiliary diagnosis system and method based on fused deep features provided by the present invention, after the mammogram is preprocessed, the mass region is obtained by clustering the deep features of the breast sub-regions, realizing mass detection; the deep, morphological, and texture features of the mass region are then extracted and fused into a fused deep feature, which is classified to finally obtain the benign/malignant diagnosis of the mass. The system and method can effectively assist the diagnosis of breast disease.
Brief description of the drawings
Fig. 1 is a schematic diagram of the system structure provided by an embodiment of the present invention;
Fig. 2 is a flowchart of the method provided by an embodiment of the present invention;
Fig. 3 is the structure of the convolutional neural network (CNN) used in the mass detection stage of an embodiment of the present invention;
Fig. 4 is the structure of the convolutional neural network (CNN) used in the mass diagnosis stage of an embodiment of the present invention.
Specific embodiments
Specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings and examples. The following examples are intended to illustrate the present invention, not to limit its scope.
In this embodiment the original mammograms are (I1, I2, …, IN). The structural block diagram of the auxiliary diagnosis system based on fused breast deep features is shown in Fig. 1; the system comprises a preprocessing unit, a mass detection unit, a fused-deep-feature extraction unit, and a mass diagnosis unit.
The preprocessing unit comprises an image denoiser, an image enhancer, a sliding-window generator, and a sub-region divider.
The image denoiser denoises the original mammograms (I1, I2, …, IN) to obtain the denoised images (U1, U2, …, UN).
The image enhancer emphasizes global or local characteristics of the denoised images, enlarges the differences between different parts, suppresses regions of no interest, and increases the contrast between a suspected mass and the surrounding tissue, yielding the enhanced images (T1, T2, …, TN).
The sliding-window generator generates a square sliding window placed at the upper-left corner of the breast portion in a left-breast CC image, or at the upper-right corner of the breast portion in a right-breast CC image; in this embodiment the window is 48 × 48 pixels.
The sub-region divider divides the breast portion of each enhanced image into several non-overlapping sub-regions (S1, S2, ..., Sq), which serve as the basis for the subsequent units, where q is the number of sub-regions.
The mass detection unit comprises a sub-region deep-feature extractor, a clusterer, and a segmentation-result extractor.
The sub-region deep-feature extractor extracts the deep features (F1, F2, ..., Fq) of the sub-regions produced by the preprocessing unit.
The clusterer clusters the deep features of the sub-regions with an unsupervised extreme learning machine (US-ELM), obtaining mass and non-mass regions.
The segmentation-result extractor extracts the edge coordinates of the clustered masses, so that the detection accuracy can be verified and a basis provided for the subsequent breast cancer diagnosis.
The fused-deep-feature extraction unit comprises a deep-feature extractor, a morphological-feature extractor, a texture-feature extractor, and a feature fuser.
The deep-feature extractor extracts, with a CNN, the deep features (f1, f2, ..., fn) of each mass detected by the mass detection unit, where n depends on the structure of the CNN.
The morphological-feature extractor extracts the morphological features (g1, g2, ..., g5) of the mass.
The texture-feature extractor extracts the texture features (t1, t2, ..., t5) of the mass.
The feature fuser fuses the deep features (f1, f2, ..., fn), the morphological features (g1, g2, ..., g5), and the texture features (t1, t2, ..., t5) of the mass into the fused deep feature (FF1, FF2, ..., FFx), where x = n + 5 + 5 is the number of fused features.
The mass diagnosis unit learns the fused deep feature vectors with an extreme learning machine (ELM) and finally obtains the benign/malignant diagnosis of the mass; it comprises a transformation-matrix generator, a random-parameter generator, a transformer, weight-vector parameter generators, and a parameter selector.
The transformation-matrix generator generates, according to the extreme learning machine (Extreme Learning Machine, ELM) algorithm, the Laplacian transformation matrix L of the fused deep features of the mass region in the mammogram.
The random-parameter generator, given the number s of hidden nodes of the ELM network, randomly generates the input-node weight vectors ω1, ω2, ..., ωs and the hidden-node thresholds b1, b2, ..., bs; the input nodes take the fused deep feature vectors.
The transformer, according to the ELM algorithm, uses the input-node weight vectors and hidden-node thresholds to generate the ELM hidden-layer output matrix H_i of the features.
Each weight-vector parameter generator, according to the ELM principle, computes the output-node weight-vector parameter β_i from the hidden-layer output matrix H_i and the output targets.
The parameter selector selects the optimal parameter β among those produced and, following the ELM computation, classifies the fused deep features of the mass region with β to obtain the final mass classification result.
This embodiment also provides a breast auxiliary diagnosis method based on fused deep features, implemented with the above system; as shown in Fig. 2, it comprises the following steps.
Step 1: denoise and enhance the original mammograms (I1, I2, …, IN), generate a square sliding window, and divide the breast region into non-overlapping sub-regions (S1, S2, ..., Sq); specifically:
Step 101: apply mean filtering to the mammograms to obtain the denoised images (U1, U2, …, UN).
Step 102: apply contrast enhancement to the denoised images to obtain the enhanced images (T1, T2, …, TN).
Step 103: apply edge detection to the enhanced images to obtain the breast edge coordinates.
Step 104: generate a square sliding window placed at the upper-left corner of the breast portion in a left-breast CC image, or at the upper-right corner of the breast portion in a right-breast CC image; in this embodiment the window is 48 × 48 pixels.
Step 105: slide the generated square window downward from the starting coordinate of the breast, left-to-right and top-to-bottom in a left-breast CC image and right-to-left and top-to-bottom in a right-breast CC image, obtaining the non-overlapping breast sub-regions (S1, S2, ..., Sq).
Step 2: extract the deep features of the breast sub-regions with a convolutional neural network (CNN) and cluster the deep features of all sub-regions with an unsupervised extreme learning machine (Unsupervised Extreme Learning Machine, US-ELM), obtaining mass and non-mass regions; specifically:
Step 201: design an 8-layer CNN network, shown in Fig. 3, to extract the deep features (F1, F2, ..., Fq) of the breast sub-regions.
Step 202: cluster the deep features of the sub-regions according to the US-ELM principle, obtaining mass and non-mass regions.
Step 203: extract the edge coordinates of the mass regions from the result of step 202, so that the detection result can be verified and the next diagnostic operation carried out.
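Steps 201–203 rely on US-ELM clustering, whose full algorithm (a random ELM feature map followed by a graph-Laplacian eigen-embedding, then k-means) is not spelled out in the text. As a simplified, hedged stand-in, the sketch below clusters sub-region feature vectors directly with a minimal two-cluster k-means; the feature dimension and the synthetic data are invented for illustration:

```python
import numpy as np

def kmeans_two_clusters(features, n_iter=50):
    """Minimal k-means with k=2 over sub-region feature vectors.

    Simplified stand-in for US-ELM: US-ELM first embeds the features via a
    random ELM hidden layer and a Laplacian eigenproblem and only then runs
    k-means on the embedding; here we cluster the raw features directly."""
    X = np.asarray(features, dtype=float)
    centers = X[[0, len(X) // 2]].copy()  # deterministic init for the demo
    for _ in range(n_iter):
        # assign each point to its nearest center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centers; keep the old one if a cluster emptied out
        new_centers = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                                else centers[k] for k in range(2)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Two well-separated blobs standing in for "non-mass" vs "mass" sub-regions.
rng = np.random.default_rng(1)
feats = np.vstack([rng.normal(0.0, 0.1, size=(20, 8)),   # non-mass-like
                   rng.normal(5.0, 0.1, size=(20, 8))])  # mass-like
labels, centers = kmeans_two_clusters(feats)
```

In the patent's pipeline the two resulting clusters are interpreted as mass and non-mass sub-regions, and the mass cluster's tiles are passed to the edge-coordinate extraction of step 203.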
Step 3: for each extracted breast mass region, extract the deep features of the mass with a convolutional neural network (CNN), extract its morphological and texture features, and fuse the deep, morphological, and texture features into one fused deep feature; specifically:
Step 301: design a 10-layer CNN network, shown in Fig. 4, to extract the deep features (f1, f2, ..., fn) of the mass region.
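The exact 8- and 10-layer architectures live in Figs. 3 and 4 and are not specified in the text, so the sketch below only illustrates the basic computation one CNN stage repeats — convolution, ReLU, max-pooling, flattening — on a 48 × 48 patch with two hand-picked kernels (all sizes and kernel values here are illustrative assumptions, not the patent's networks):

```python
import numpy as np

def conv2d_valid(x, kernel):
    """Single-channel 'valid' 2-D convolution (cross-correlation form)."""
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max-pooling; trailing rows/columns are dropped."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def toy_depth_features(patch, kernels):
    """One conv + ReLU + max-pool stage per kernel, flattened into a
    feature vector -- the computation a CNN stage repeats layer by layer."""
    feats = []
    for k in kernels:
        a = np.maximum(conv2d_valid(patch, k), 0.0)  # ReLU activation
        feats.append(max_pool(a).ravel())
    return np.concatenate(feats)

patch = np.random.default_rng(0).random((48, 48))
kernels = [np.ones((5, 5)) / 25.0, np.eye(5) / 5.0]  # smoothing + diagonal edge
f = toy_depth_features(patch, kernels)
```

A real network stacks several such stages with learned kernels and ends in fully connected layers; the flattened activations of a late layer play the role of the deep features (f1, ..., fn), with n fixed by the architecture.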
Step 302: extract the morphological features of the mass: circularity g1, normalized-radius entropy g2, normalized-radius variance g3, area ratio g4, and roughness g5, where A is the mass area, P is the perimeter of the mass edge, p_k is the probability that the normalized radius of an edge point falls into the k-th interval, MN is the total number of edge points, d_i is the normalized radius of the i-th edge point, and d_avg is the average normalized radius of the edge points.
The normalized radii of the edge points lie in [0, 1]; this interval is divided into K sub-intervals, and the number of edge points falling in each sub-interval is counted. The entropy of the normalized radius reflects the dispersion among the normalized radii. In this embodiment, following Wang Zhiqiong, "Research on breast mass detection techniques based on extreme learning machines [D]. 2014", K is set to 100.
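The formula images for g1–g5 do not survive in this text. Under the variable definitions above, plausible reconstructions consistent with the standard literature on mass shape description are as follows; these are common textbook definitions, not verified against the original figures:

```latex
\begin{aligned}
g_1 &= \frac{P^2}{4\pi A} &&\text{(circularity; equals 1 for a perfect disk)}\\[2pt]
g_2 &= -\sum_{k=1}^{K} p_k \ln p_k &&\text{(normalized-radius entropy)}\\[2pt]
g_3 &= \frac{1}{MN}\sum_{i=1}^{MN}\bigl(d_i - d_{avg}\bigr)^2 &&\text{(normalized-radius variance)}\\[2pt]
g_4 &= \frac{1}{d_{avg}\,MN}\sum_{d_i > d_{avg}}\bigl(d_i - d_{avg}\bigr) &&\text{(area ratio)}\\[2pt]
g_5 &= \frac{1}{MN}\sum_{i=1}^{MN}\bigl|d_i - d_{i+1}\bigr| &&\text{(roughness; indices taken cyclically)}
\end{aligned}
```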
Step 303: extract the texture features of the mass, namely the inverse difference moment t1, entropy t2, energy t3, correlation coefficient t4, and contrast t5; among them,
t2 = Σ P(i, j) · [−ln P(i, j)]
t3 = Σ P(i, j)²
t5 = Σ (i − j)² · P(i, j)
where P(i, j) is the element in row i, column j of the gray-level co-occurrence matrix P, μx and μy are the row and column means of P, and δx and δy are the row and column variances of P.
Let f(x, y) be an M × M breast mass region image with Ng gray levels; the gray-level co-occurrence matrix can then be expressed as
P(i, j) = #{ ((x1, y1), (x2, y2)) ∈ (M × M) × (M × M) | f(x1, y1) = i, f(x2, y2) = j }
where P is an Ng × Ng matrix, the distance between (x1, y1) and (x2, y2) equals d, the line through the two pixels forms an angle θ with the coordinate axis, and #(X) denotes the number of elements of the set X. The above formula yields the co-occurrence matrix P(i, j, d, θ), which encodes the angular and positional relationship of pixel pairs; i and j range over the gray values of the mass region f(x, y), i.e. i ≤ Ng, j ≤ Ng.
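To make the co-occurrence-matrix construction and the t1–t5 features concrete, here is a hedged numpy sketch for a single offset (d = 1, θ = 0°, i.e. horizontal neighbors). The inverse-difference-moment and correlation formulas, whose images do not survive in this text, use the standard GLCM definitions consistent with the μ and δ symbols above:

```python
import numpy as np

def glcm(img, levels, d=1):
    """Gray-level co-occurrence matrix P(i, j) for horizontal neighbor
    pairs at distance d (theta = 0 degrees), normalized to sum to 1."""
    P = np.zeros((levels, levels))
    for i, j in zip(img[:, :-d].ravel(), img[:, d:].ravel()):
        P[i, j] += 1.0
    return P / P.sum()

def texture_features(P):
    i, j = np.indices(P.shape)
    t1 = np.sum(P / (1.0 + (i - j) ** 2))               # inverse difference moment
    t2 = -np.sum(P[P > 0] * np.log(P[P > 0]))           # entropy
    t3 = np.sum(P ** 2)                                 # energy
    mu_x, mu_y = np.sum(i * P), np.sum(j * P)
    dx = np.sqrt(np.sum((i - mu_x) ** 2 * P))
    dy = np.sqrt(np.sum((j - mu_y) ** 2 * P))
    t4 = (np.sum(i * j * P) - mu_x * mu_y) / (dx * dy)  # correlation coefficient
    t5 = np.sum((i - j) ** 2 * P)                       # contrast
    return t1, t2, t3, t4, t5

# Tiny 4-level toy image standing in for a mass region.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
t1, t2, t3, t4, t5 = texture_features(glcm(img, levels=4))
```

A full implementation would average these features over several distances d and angles θ; the single-offset version above suffices to show how each ti is read off the normalized matrix P.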
Step 304: fuse the deep features, morphological features, and texture features extracted in steps 301–303 into one fused deep feature for the subsequent mass classification.
Step 4: learn the fused deep features with an extreme learning machine (ELM) and finally obtain the benign/malignant diagnosis of the mass.
Step 401: generate the Laplacian matrix L of the fused deep features according to the ELM principle.
Step 402: set the number of hidden nodes, and randomly generate the input-node weight vectors ω1, ω2, ..., ωs and the hidden-node thresholds b1, b2, ..., bs.
Step 403: according to the ELM principle, use the input-node weight vectors ω1, ..., ωs and the hidden-node thresholds b1, ..., bs to convert the feature vectors formed by the fused deep features of step 3 into the ELM hidden-layer output matrix H_i.
Step 404: each weight-vector parameter generator produces a weight-vector parameter β_i from its fused deep feature vectors and sends it to the parameter selector.
Step 405: the parameter selector receives and collects the weight-vector parameters β_i from the generators, compares the classification results obtained with the different parameters, selects the optimal parameter β, and then, following the ELM computation, classifies the input fused deep features with β to obtain the final mass classification result.
Finally, it should be noted that the above embodiments are merely intended to illustrate, not limit, the technical solutions of the present invention. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope defined by the claims of the present invention.

Claims (6)

1. A breast auxiliary diagnosis system based on fused depth features, characterized in that the system comprises a preprocessing unit, a mass detection unit, a fused depth feature extraction unit, and a mass diagnosis unit;
the preprocessing unit comprises an image denoiser, an image enhancer, a sliding-window generator, and a subregion divider;
the image denoiser is used for denoising the original mammogram to obtain a denoised mammogram;
the image enhancer is used for emphasizing global or local characteristics of the denoised mammogram, enlarging the differences between different parts of the image, suppressing the expression of regions of no interest, and increasing the contrast between a suspected mass and the surrounding tissue, to obtain an enhanced image;
the sliding-window generator is used for generating a square sliding window placed at the upper-left corner of the breast region in a left-breast CC image or at the upper-right corner of the breast region in a right-breast CC image;
the subregion divider is used for dividing the breast region of the enhanced image into several overlapping subregions as the basis for the subsequent processing units;
the mass detection unit comprises a subregion depth feature extractor, a clusterer, and a segmentation result extractor;
the subregion depth feature extractor is used for extracting the depth features of the several subregions obtained by the preprocessing unit;
the clusterer is used for clustering the depth features of each subregion with a clustering algorithm based on an unsupervised extreme learning machine (US-ELM) to obtain breast mass and non-mass regions;
the segmentation result extractor is used for extracting the edge coordinates of the clustered masses, in order to test the accuracy of mass detection and to provide a basis for computer-aided diagnosis of breast cancer;
the fused depth feature extraction unit comprises a depth feature extractor, a morphological feature extractor, a texture feature extractor, and a feature fusion device;
the depth feature extractor is used for extracting, with a CNN network, the depth features of the mass detected by the mass detection unit to obtain a depth feature vector;
the morphological feature extractor is used for extracting the morphological features of the mass detected by the mass detection unit to obtain a morphological feature vector;
the texture feature extractor is used for extracting the texture features of the mass detected by the mass detection unit to obtain a texture feature vector;
the feature fusion device is used for fusing the depth, morphological, and texture features of the mass into a fused depth feature vector;
the mass diagnosis unit is used for learning the fused depth feature vector with an extreme learning machine (ELM) to finally obtain the benign/malignant diagnosis of the mass, and comprises a transformation matrix generator, a random parameter generator, a converter, weight-vector parameter generators, and a parameter selector;
the transformation matrix generator is used for generating, according to the principle of the extreme learning machine (Extreme Learning Machine, ELM) algorithm, the Laplacian transformation matrix of the fused depth features of the mass region in the mammographic (molybdenum-target) image;
the random parameter generator is used for randomly generating the weight vectors of the ELM network input nodes and the thresholds of the hidden nodes according to the set number of hidden nodes of the ELM network, the input nodes being the fused depth feature vector;
the converter is used for generating, according to the principle of the ELM algorithm, the hidden-layer output matrix H_i of the breast subregion features in the ELM from the weight vectors of the input nodes and the thresholds of the hidden nodes;
the weight-vector parameter generators are used for calculating, according to the principle of the extreme learning machine (ELM), the weight-vector parameters β_i of the output nodes from the hidden-layer output matrix H_i and the output targets;
the parameter selector is used for selecting the optimal parameter β from the generated parameters, classifying the fused depth features of the breast mass region with the optimal parameter β according to the computing principle of the ELM, and obtaining the final mass classification result.
2. A breast auxiliary diagnosis method based on fused depth features, implemented with the breast auxiliary diagnosis system based on fused depth features according to claim 1, characterized in that the method comprises the following steps:
Step 1: performing denoising and enhancement preprocessing on the original mammogram, generating a square sliding window, and dividing the breast region into several non-overlapping subregions;
Step 2: extracting the depth features of the breast subregions with a convolutional neural network (CNN), clustering the depth features of all subregions with an unsupervised extreme learning machine (Unsupervised Extreme Learning Machine, US-ELM), and detecting breast masses to obtain breast mass and non-mass regions;
Step 3: for the extracted breast mass region, extracting the depth features of the mass with the convolutional neural network (CNN) while extracting the morphological and texture features of the mass, and fusing the depth, morphological, and texture features of the mass into one fused depth feature;
Step 4: learning the fused depth feature with an extreme learning machine (ELM) to finally obtain the benign/malignant diagnosis of the mass.
3. The breast auxiliary diagnosis method based on fused depth features according to claim 2, characterized in that Step 1 specifically comprises:
Step 101: performing a mean-filtering operation on the mammogram to obtain a denoised image;
Step 102: performing a contrast enhancement operation on the denoised image to obtain an enhanced image;
Step 103: performing an edge detection operation on the enhanced image to obtain the breast edge coordinates;
Step 104: generating a square sliding window and placing it at the upper-left corner of the breast region in a left-breast CC image or the upper-right corner of the breast region in a right-breast CC image;
Step 105: with the generated square sliding window, sliding downward from the starting coordinates of the breast in a certain order to obtain several non-overlapping breast subregions, wherein the sliding window moves from left to right and from top to bottom in a left-breast CC image, and from right to left and from top to bottom in a right-breast CC image.
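The window partition of Steps 104~105 can be sketched as below. This is an illustrative sketch, not part of the claims; the window size, the blank test image, and the exact raster order are assumptions:

```python
import numpy as np

def partition(img, win, side="left"):
    """Cut the breast half of a CC image into non-overlapping win x win tiles.

    One possible ordering: columns left-to-right (mirrored for a right-CC
    image, a hypothetical handling), sliding top-to-bottom within each column.
    """
    h, w = img.shape
    cols = range(0, w - win + 1, win)
    if side == "right":
        cols = reversed(list(cols))   # right-breast CC: sweep right to left
    tiles = []
    for x in cols:
        for y in range(0, h - win + 1, win):   # slide downward
            tiles.append(img[y:y + win, x:x + win])
    return tiles

tiles = partition(np.zeros((128, 96)), 32)
print(len(tiles))  # 3 columns x 4 rows = 12
```

Each tile is then a candidate subregion whose depth features can be extracted and clustered in the next stage.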
4. The breast auxiliary diagnosis method based on fused depth features according to claim 2, characterized in that Step 2 specifically comprises:
Step 201: designing an 8-layer CNN network for extracting the depth features of each breast subregion;
Step 202: clustering the depth features of each subregion according to the principle of the US-ELM to obtain mass and non-mass regions;
Step 203: according to the result of Step 202, extracting the edge coordinates of the mass region, in order to test the mass detection result and carry out the next diagnostic operation.
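A simplified stand-in for the US-ELM clustering of Step 202 can be sketched as below. This is an assumed simplification, not the patent's exact algorithm: a random ELM hidden layer embeds the features, a graph Laplacian L = D − A is built over the embeddings, and the sign of the Fiedler vector splits them into two clusters:

```python
import numpy as np

rng = np.random.default_rng(1)

def us_elm_cluster(X, s=32):
    """US-ELM-style two-way clustering sketch (assumed simplification)."""
    W = 0.2 * rng.normal(size=(X.shape[1], s))  # random input weights
    b = 0.1 * rng.normal(size=s)                # random hidden thresholds
    H = np.tanh(X @ W + b)                      # ELM hidden-layer embedding
    d2 = ((H[:, None, :] - H[None, :, :]) ** 2).sum(-1)
    A = np.exp(-d2 / d2.mean())                 # RBF affinity between samples
    L = np.diag(A.sum(1)) - A                   # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)
    fiedler = vecs[:, 1]                        # second-smallest eigenvector
    return (fiedler > 0).astype(int)            # two clusters: mass / non-mass

# two well-separated synthetic "subregion feature" clouds
X = np.vstack([rng.normal(0, 0.1, (20, 5)), rng.normal(5, 0.1, (20, 5))])
labels = us_elm_cluster(X)
print((labels[:20] == labels[0]).all())
```

The real US-ELM solves a regularized generalized eigenproblem over the hidden-layer outputs and then runs k-means on the embedding; the Fiedler-sign split above is only the k = 2 intuition.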
5. The breast auxiliary diagnosis method based on fused depth features according to claim 2, characterized in that Step 3 extracts the fused depth features of the breast mass region, specifically comprising:
Step 301: designing a 10-layer CNN network for extracting the depth features of the mass region;
Step 302: extracting the morphological features of the mass, including the circularity-likeness g_1, normalized radius entropy g_2, normalized radius variance g_3, area ratio g_4, and roughness g_5, with the specific formulas as follows:
wherein A is the mass area, P is the mass edge perimeter, and p_k is the probability that the normalized radius of an edge point falls in the k-th interval; the normalized radii of the edge points are distributed in [0, 1], this interval is divided into K subintervals, and the number of edge points contained in each subinterval is counted; MN is the total number of edge points, d_i is the normalized radius of the i-th edge point, and d_avg is the average normalized radius of the edge points;
Step 303: extracting the texture features of the mass, including the inverse difference moment t_1, entropy t_2, energy t_3, correlation coefficient t_4, and contrast t_5, with the specific formulas as follows:
t_2 = Σ P(i, j) · [−ln P(i, j)]
t_3 = Σ P²(i, j)
t_5 = Σ (i − j)² · P(i, j)
wherein P(i, j) is the element in the i-th row and j-th column of the gray-level co-occurrence matrix P, μ_x and μ_y are respectively the means of the rows and columns of the gray-level co-occurrence matrix P, and δ_x and δ_y are respectively the variances of the rows and columns of the gray-level co-occurrence matrix P;
Step 304: fusing the depth features, morphological features, and texture features extracted in Steps 301~303 into one fused depth feature.
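The formulas for the shape descriptors of Step 302 appear only as figures in the original, so the sketch below uses standard reconstructions that are assumptions, not the patent's exact formulas: circularity-likeness as 4πA/P², radius entropy as the histogram entropy of the normalized radii over K bins, and radius variance as the variance of the normalized radii:

```python
import numpy as np

def shape_features(contour, K=10):
    """Assumed reconstructions of g1 (circularity), g2 (radius entropy),
    and g3 (radius variance) from an (n, 2) array of mass edge points;
    the area ratio g4 and roughness g5 are omitted here."""
    center = contour.mean(axis=0)
    r = np.linalg.norm(contour - center, axis=1)
    d = r / r.max()                               # normalized radii d_i in [0, 1]
    closed = np.vstack([contour, contour[:1]])    # close the polygon
    per = np.linalg.norm(np.diff(closed, axis=0), axis=1).sum()   # perimeter P
    x, y = contour[:, 0], contour[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    circ = 4 * np.pi * area / per ** 2            # g1: ~1 for a circle
    p, _ = np.histogram(d, bins=K, range=(0, 1))
    p = p / p.sum()                               # p_k over K subintervals
    entropy = -(p[p > 0] * np.log(p[p > 0])).sum()  # g2: radius entropy
    var = ((d - d.mean()) ** 2).mean()              # g3: radius variance
    return circ, entropy, var

t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
circle = np.c_[np.cos(t), np.sin(t)]              # a perfectly round "mass"
circ, ent, var = shape_features(circle)
print(round(circ, 3))  # 1.0
```

On a circular contour all three descriptors hit their extremes (circularity near 1, entropy and variance near 0), which is why they are useful for distinguishing smooth benign masses from spiculated malignant ones.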
6. The breast auxiliary diagnosis method based on fused depth features according to claim 2, characterized in that Step 4 classifies the fused depth feature and finally obtains the benign/malignant classification result of the mass, specifically comprising:
Step 401: using the principle of the ELM, generating the Laplacian matrix of the input nodes, the input nodes being the fused depth feature;
Step 402: setting the number of hidden nodes, and randomly generating the weight vectors of the input nodes and the thresholds of the hidden nodes;
Step 403: according to the principle of the ELM, converting the feature vector formed by the fused depth feature obtained in Step 3 into the hidden-layer output matrix H_i of the breast subregion features in the ELM, using the weight vectors of the input nodes and the thresholds of the hidden nodes;
Step 404: each weight-vector parameter generator generating a weight-vector parameter β_i from its fused depth feature vector and sending it to the parameter selector;
Step 405: the parameter selector receiving and aggregating the weight-vector parameters from each weight-vector parameter generator, selecting the optimal parameter β from the aggregated parameters β_i according to the classification results obtained with the different parameters, and then classifying the input fused depth features with the optimal parameter β according to the computing principle of the ELM, to obtain the final mass classification result.
CN201811440304.4A 2018-11-29 2018-11-29 Mammary gland auxiliary diagnosis system and method based on fusion depth characteristic Active CN109598709B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811440304.4A CN109598709B (en) 2018-11-29 2018-11-29 Mammary gland auxiliary diagnosis system and method based on fusion depth characteristic

Publications (2)

Publication Number Publication Date
CN109598709A true CN109598709A (en) 2019-04-09
CN109598709B CN109598709B (en) 2023-05-26

Family

ID=65960476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811440304.4A Active CN109598709B (en) 2018-11-29 2018-11-29 Mammary gland auxiliary diagnosis system and method based on fusion depth characteristic

Country Status (1)

Country Link
CN (1) CN109598709B (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109893100A (en) * 2019-04-18 2019-06-18 盐城工学院 A kind of method that breast density quantification calculates in breast cancer risk assessment
CN110070538A (en) * 2019-04-28 2019-07-30 华北电力大学(保定) Bolt two-dimensional visual documents structured Cluster method based on form optimization depth characteristic
CN110148467A (en) * 2019-05-16 2019-08-20 东北大学 A kind of Lung neoplasm device of computer aided diagnosis and method based on improvement CNN
CN110348457A (en) * 2019-06-25 2019-10-18 北京邮电大学 A kind of image characteristic extracting method, extraction element, electronic equipment and storage medium
CN110473193A (en) * 2019-08-12 2019-11-19 北京爱康宜诚医疗器材有限公司 Detection method, detection device, storage medium and the processor of acetabular bone defect
CN110570405A (en) * 2019-08-26 2019-12-13 天津大学 pulmonary nodule intelligent diagnosis method based on mixed features
CN110610498A (en) * 2019-08-13 2019-12-24 上海联影智能医疗科技有限公司 Mammary gland molybdenum target image processing method, system, storage medium and equipment
CN110647939A (en) * 2019-09-24 2020-01-03 广州大学 Semi-supervised intelligent classification method and device, storage medium and terminal equipment
CN110675382A (en) * 2019-09-24 2020-01-10 中南大学 Aluminum electrolysis superheat degree identification method based on CNN-LapseLM
CN111291789A (en) * 2020-01-19 2020-06-16 华东交通大学 Breast cancer image identification method and system based on multi-stage multi-feature deep fusion
CN111325709A (en) * 2019-12-26 2020-06-23 联博智能科技有限公司 Wireless capsule endoscope image detection system and detection method
CN111554383A (en) * 2020-04-24 2020-08-18 浙江杜比医疗科技有限公司 Neural network for breast tumor detection and detection system thereof
CN111583320A (en) * 2020-03-17 2020-08-25 哈尔滨医科大学 Breast cancer ultrasonic image typing method and system fusing deep convolutional network and image omics characteristics and storage medium
CN111680687A (en) * 2020-06-09 2020-09-18 江西理工大学 Depth fusion model applied to mammary X-ray image anomaly identification and classification method thereof
WO2021057423A1 (en) * 2019-09-29 2021-04-01 京东方科技集团股份有限公司 Image processing method, image processing apparatus, and storage medium
CN113450309A (en) * 2021-05-28 2021-09-28 北京大学人民医院 Breast cancer ultrasonic image processing method, electronic device and storage medium
CN116258697A (en) * 2023-02-22 2023-06-13 浙江大学 Automatic classification device and method for child skin disease images based on rough labeling
CN117893528A (en) * 2024-03-13 2024-04-16 云南迪安医学检验所有限公司 Method and device for constructing cardiovascular and cerebrovascular disease classification model

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130005599A1 (en) * 2008-11-12 2013-01-03 A Luxembourg Corporation Methods and systems of using exosomes for determining phenotypes
CN104182755A (en) * 2014-08-30 2014-12-03 西安电子科技大学 Mammary gland molybdenum target X-ray image block feature extraction method based on tower-shaped principal component analysis (PCA)
US20150213302A1 (en) * 2014-01-30 2015-07-30 Case Western Reserve University Automatic Detection Of Mitosis Using Handcrafted And Convolutional Neural Network Features
US20160253466A1 (en) * 2013-10-10 2016-09-01 Board Of Regents, The University Of Texas System Systems and methods for quantitative analysis of histopathology images using multiclassifier ensemble schemes
CN106023239A (en) * 2016-07-05 2016-10-12 东北大学 Breast lump segmentation system and method based on mammary gland subarea density clustering
CN106339591A (en) * 2016-08-25 2017-01-18 汤平 Breast cancer prevention self-service health cloud service system based on deep convolutional neural network
US20180075628A1 (en) * 2016-09-12 2018-03-15 Zebra Medical Vision Ltd. Systems and methods for automated detection of an indication of malignancy in a mammographic image
CN108416360A (en) * 2018-01-16 2018-08-17 华南理工大学 Cancer diagnosis system and method based on breast molybdenum target calcification feature
CN108875829A (en) * 2018-06-20 2018-11-23 鲁东大学 A kind of classification method and system of tumor of breast image


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
RAHIMEH ROUHI ET AL.: "Benign and malignant breast tumors classification based on region growing and CNN segmentation", 《EXPERT SYSTEMS WITH APPLICATIONS》 *
LI, JING: "Research on Early Diagnosis of Breast Cancer Based on Deep Learning", China Master's Theses Full-text Database, Medicine and Health Sciences *
WANG, ZHIQIONG: "Research on Breast Mass Detection Technology Based on Extreme Learning Machine", China Doctoral Dissertations Full-text Database, Medicine and Health Sciences *

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109893100A (en) * 2019-04-18 2019-06-18 盐城工学院 A kind of method that breast density quantification calculates in breast cancer risk assessment
CN110070538A (en) * 2019-04-28 2019-07-30 华北电力大学(保定) Bolt two-dimensional visual documents structured Cluster method based on form optimization depth characteristic
CN110070538B (en) * 2019-04-28 2022-04-15 华北电力大学(保定) Bolt two-dimensional visual structure clustering method based on morphological optimization depth characteristics
CN110148467B (en) * 2019-05-16 2023-05-23 东北大学 Pulmonary nodule computer-aided diagnosis device and method based on improved CNN
CN110148467A (en) * 2019-05-16 2019-08-20 东北大学 A kind of Lung neoplasm device of computer aided diagnosis and method based on improvement CNN
CN110348457A (en) * 2019-06-25 2019-10-18 北京邮电大学 A kind of image characteristic extracting method, extraction element, electronic equipment and storage medium
CN110348457B (en) * 2019-06-25 2021-09-21 北京邮电大学 Image feature extraction method, image feature extraction device, electronic equipment and storage medium
CN110473193A (en) * 2019-08-12 2019-11-19 北京爱康宜诚医疗器材有限公司 Detection method, detection device, storage medium and the processor of acetabular bone defect
CN110610498A (en) * 2019-08-13 2019-12-24 上海联影智能医疗科技有限公司 Mammary gland molybdenum target image processing method, system, storage medium and equipment
CN110570405A (en) * 2019-08-26 2019-12-13 天津大学 pulmonary nodule intelligent diagnosis method based on mixed features
CN110675382A (en) * 2019-09-24 2020-01-10 中南大学 Aluminum electrolysis superheat degree identification method based on CNN-LapseLM
CN110647939A (en) * 2019-09-24 2020-01-03 广州大学 Semi-supervised intelligent classification method and device, storage medium and terminal equipment
CN110647939B (en) * 2019-09-24 2022-05-24 广州大学 Semi-supervised intelligent classification method and device, storage medium and terminal equipment
WO2021057423A1 (en) * 2019-09-29 2021-04-01 京东方科技集团股份有限公司 Image processing method, image processing apparatus, and storage medium
CN111325709A (en) * 2019-12-26 2020-06-23 联博智能科技有限公司 Wireless capsule endoscope image detection system and detection method
CN111291789A (en) * 2020-01-19 2020-06-16 华东交通大学 Breast cancer image identification method and system based on multi-stage multi-feature deep fusion
CN111291789B (en) * 2020-01-19 2022-07-05 华东交通大学 Breast cancer image identification method and system based on multi-stage multi-feature deep fusion
CN111583320A (en) * 2020-03-17 2020-08-25 哈尔滨医科大学 Breast cancer ultrasonic image typing method and system fusing deep convolutional network and image omics characteristics and storage medium
CN111554383A (en) * 2020-04-24 2020-08-18 浙江杜比医疗科技有限公司 Neural network for breast tumor detection and detection system thereof
CN111554383B (en) * 2020-04-24 2023-09-05 浙江杜比医疗科技有限公司 Neural network for breast tumor detection and detection system thereof
CN111680687B (en) * 2020-06-09 2022-05-10 江西理工大学 Depth fusion classification method applied to mammary X-ray image anomaly identification
CN111680687A (en) * 2020-06-09 2020-09-18 江西理工大学 Depth fusion model applied to mammary X-ray image anomaly identification and classification method thereof
CN113450309A (en) * 2021-05-28 2021-09-28 北京大学人民医院 Breast cancer ultrasonic image processing method, electronic device and storage medium
CN116258697A (en) * 2023-02-22 2023-06-13 浙江大学 Automatic classification device and method for child skin disease images based on rough labeling
CN116258697B (en) * 2023-02-22 2023-11-24 浙江大学 Automatic classification device and method for child skin disease images based on rough labeling
CN117893528A (en) * 2024-03-13 2024-04-16 云南迪安医学检验所有限公司 Method and device for constructing cardiovascular and cerebrovascular disease classification model
CN117893528B (en) * 2024-03-13 2024-05-17 云南迪安医学检验所有限公司 Method and device for constructing cardiovascular and cerebrovascular disease classification model

Also Published As

Publication number Publication date
CN109598709B (en) 2023-05-26

Similar Documents

Publication Publication Date Title
CN109598709A (en) Mammary gland assistant diagnosis system and method based on fusion depth characteristic
Castro et al. Elastic deformations for data augmentation in breast cancer mass detection
CN110084318B (en) Image identification method combining convolutional neural network and gradient lifting tree
Li et al. Multi-view mammographic density classification by dilated and attention-guided residual learning
CN108416360B (en) Cancer diagnosis system and method based on breast molybdenum target calcification features
CN110472530B (en) Retina OCT image classification method based on wavelet transformation and migration learning
Bhanumathi et al. CNN based training and classification of MRI brain images
CN104715259A (en) Nuclear self-adaptive optimizing and classifying method of X-ray mammary gland images
Fan et al. Lung nodule detection based on 3D convolutional neural networks
Li Research on the detection method of breast cancer deep convolutional neural network based on computer aid
Wang et al. An end-to-end mammogram diagnosis: A new multi-instance and multiscale method based on single-image feature
Bing et al. Sparse representation based multi-instance learning for breast ultrasound image classification
Nedra et al. Detection and classification of the breast abnormalities in Digital Mammograms via Linear Support Vector Machine
Abbas et al. Lungs nodule cancer detection using statistical techniques
Mahbub et al. A modified CNN and fuzzy AHP based breast cancer stage detection system
CN114565786A (en) Tomography image classification device and method based on channel attention mechanism
Rizzi et al. A supervised method for microcalcification cluster diagnosis
Amirjahan et al. Comparative analysis of various classification algorithms for skin Cancer detection
Zheng et al. 3D context-aware convolutional neural network for false positive reduction in clustered microcalcifications detection
Kulkarni et al. A comparative study of different deep learning architectures for benign-malignant mass classification
Valério et al. Deepmammo: deep transfer learning for lesion classification of mammographic images
Indu et al. Diagnosis of lung cancer nodules in ct scan images using fuzzy neural network
Uyun et al. Selection Mammogram Texture Descriptors Based on Statistics Properties Backpropagation Structure
Hassanien et al. Digital mammogram segmentation algorithm using pulse coupled neural networks
Nasiri et al. Breast cancer detection in mammograms using wavelet and contourlet transformations

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant