CN109409413A - X-ray breast lump image automatic classification method - Google Patents


Info

Publication number
CN109409413A
Authority
CN
China
Prior art keywords
image
breast
breast lump
layer
lump image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811140302.3A
Other languages
Chinese (zh)
Other versions
CN109409413B (en)
Inventor
徐勇
孙利雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guizhou University
Original Assignee
Guizhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guizhou University filed Critical Guizhou University
Priority to CN201811140302.3A priority Critical patent/CN109409413B/en
Publication of CN109409413A publication Critical patent/CN109409413A/en
Application granted granted Critical
Publication of CN109409413B publication Critical patent/CN109409413B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03 Recognition of patterns in medical or anatomical images

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an automatic classification method for X-ray breast lump images. First, the weights of the three channels of the RGB breast lump image are adjusted and the channels recombined, so that the generated breast lump image is more separable with respect to the benign and malignant classes. A gray-level network-in-network structure is designed in which an exponential function with base e reassigns the pixels of each feature map with different weights; the pixels are adjusted according to the breast lump classification task so that they do not concentrate in a narrow range, achieving a feature-enhancement effect and improving the separability of the benign and malignant breast lump image classes. The feature maps produced by the gray-level network-in-network are fed into a fully connected layer for feature extraction, and finally a SoftMax layer performs the classification, predicting whether the breast image is benign or malignant.

Description

X-ray breast lump image automatic classification method
Technical field
The present invention relates to the technical field of image processing, and in particular to an automatic classification method for X-ray breast lump images.
Background art
Using artificial intelligence to automatically classify breast lump images and determine whether a breast lump is benign or malignant not only saves a large amount of valuable medical resources and avoids diagnostic errors caused by subjective factors such as the diagnostician's experience and skill level, but also improves diagnostic accuracy and shortens diagnosis time, bringing considerable economic benefit.
A breast image is formed by transmission projection through the body. During transmission projection the rays are deflected by the objects they pass through, so the edges and lump textures of the resulting breast image are blurred. Conventional methods such as the first-order differential Roberts and Prewitt operators or the second-order differential Laplace operator can enhance object edges and similar information in a breast image, but they cannot meet the accuracy requirements of breast image classification. Because breast lumps are highly similar to glands, muscle and other tissues, classifying breast lumps with conventional methods suffers from low efficiency and low accuracy.
Summary of the invention
The technical problem to be solved by the invention is to provide an automatic classification method for X-ray breast lump images that can accurately determine whether the breast lump in an X-ray breast lump image is benign or malignant.
The invention is implemented as follows. The X-ray breast lump image automatic classification method comprises three modules: an RGB channel recombination sub-network, a gray-level breast image convolutional feature extraction module, and a feature extraction and classification module. It proceeds as follows:
1) The RGB channel recombination sub-network adjusts the weights of the three color channels of the RGB breast image, producing an RGB breast lump image recombined with different channel weights;
2) The gray-level breast image convolutional feature extraction module performs convolution on the RGB breast lump image produced in step 1) to generate feature maps. Specifically, an exponential function with base e transforms the gray values of each feature map, stretching the high-brightness regions where the classes differ most and enhancing the differences between RGB breast lump images of different types, thereby achieving feature enhancement;
3) The feature extraction and classification module feeds the feature maps of the breast lump image extracted in step 2) into its fully connected layer for feature extraction and then into a Softmax layer for classification, outputting the benign or malignant class of the breast lump.
The RGB channel recombination sub-network is a three-layer neural network. Its input layer is the three-channel breast lump image in RGB format. After the hidden-layer computation, the three weights between the hidden layer and the output layer, wR, wG and wB, rescale the three channels of the original image to give three single-channel maps; stacking these three maps yields the adjusted breast lump image.
The RGB channel recombination network operates as follows:
Input: ((XR, XG, XB), Y), where XR, XG and XB are the gray-level images of the three channels of the RGB breast lump image, with pixel values in [0, 255], and Y is the classification label of the breast lump image, taking one of the two values {malignant, benign};
Output: (X'R, X'G, X'B), the adjusted gray values of the three RGB channels; the adjusted breast lump image is more separable with respect to the benign and malignant lump classes;
Step a) set the weights between the input layer and the hidden layer: wRi = 1, wGi = 1, wBi = 1;
Step b) initialize the weights between the hidden layer and the output layer, wR, wG and wB, with random values in the range (0.5, 1.2);
Step c) forward pass: compute forward from the input layer; the output layer produces the recombined gray-level breast image;
Step d) the gray-level image produced by the output layer serves as the input of the subsequent modules;
Step e) the output layer receives feedback information from the subsequent modules;
Step f) backward pass: compute backward from the output layer and, using the feedback received by the output layer, adjust the weights between the hidden layer and the output layer by the gradient method;
This completes one round of parameter training and adjustment of the RGB channel recombination network.
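A minimal sketch of this channel recombination is given below, written in PyTorch purely for illustration (the patent does not name a framework). The class and variable names are assumptions; only the per-channel weights wR, wG, wB, their (0.5, 1.2) initialization, and the gradient-based adjustment come from the description above.

```python
import torch
import torch.nn as nn

class RGBChannelRecombination(nn.Module):
    """Rescale the R, G, B channels with learnable weights and restack them."""
    def __init__(self):
        super().__init__()
        # hidden-to-output weights wR, wG, wB, initialized uniformly in (0.5, 1.2)
        self.w = nn.Parameter(torch.empty(3).uniform_(0.5, 1.2))

    def forward(self, x):
        # x: (batch, 3, H, W) RGB breast lump image; the input-to-hidden weights
        # are fixed to 1, so the forward pass reduces to a per-channel rescaling.
        return x * self.w.view(1, 3, 1, 1)
```

Because the three weights receive gradients from the downstream modules when the whole network is trained, steps c) through f) correspond to an ordinary forward and backward pass over this layer.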
The gray-level breast image convolutional feature extraction module contains two stacked gray-level network-in-network blocks. Each gray-level network-in-network block consists of three layers: a convolutional layer, an e-exponential mapping layer and a max-pooling down-sampling layer. The convolutional layer convolves the input image with 30 3*3 kernels to produce 30 feature maps, which are fed into the e-exponential mapping layer for a differentiating mapping of the gray values and then max-pooled to obtain reduced-dimension feature maps. The reduced feature maps are fed into a 5000-dimensional fully connected layer for feature extraction and then into a Softmax layer for classification, outputting whether the breast image is benign or malignant.
The differentiating mapping of gray values performed by the e-exponential mapping layer means using an exponential function with the constant e as its base to compress the low-value regions, where the classes differ little, and to stretch the high-value regions, where they differ more, thereby realizing the feature-enhancement function for the gray-level breast image. The mapping layer with base e is computed as follows:
First, the pixel values lie in [0, 255]; they are compressed into [0, 2], i.e. y1 = 2 * x1 / 255,
where x1 is the input pixel value and y1 is the value mapped into [0, 2];
Then the exponential function with base e maps y1 from [0, 2] into [0, 6.39], i.e. y2 = e^y1 - 1,
where y1 is the gray value mapped into the [0, 2] range and y2 is the value mapped into [0, 6.39] (e^2 - 1 is approximately 6.39).
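A minimal numeric sketch of this two-step mapping follows, assuming the linear compression y1 = 2*x1/255 and the stretch y2 = exp(y1) - 1 implied by the stated ranges [0, 255] to [0, 2] to [0, 6.39]; the function name is illustrative.

```python
import numpy as np

def e_index_mapping(pixels: np.ndarray) -> np.ndarray:
    """Map gray values in [0, 255] to [0, ~6.39] with an e-based stretch."""
    y1 = 2.0 * pixels / 255.0   # linear compression into [0, 2]
    y2 = np.exp(y1) - 1.0       # exponential stretch into [0, e^2 - 1], about [0, 6.39]
    return y2
```

Because the exponential is convex, equal steps in y1 produce larger steps near the top of the range than near the bottom, which is exactly the compression of low-value regions and stretching of high-value regions described above.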
Before step 1), the image first undergoes equal-proportion window scaling, and only then does the RGB channel recombination sub-network adjust the weights of the three color channels of the RGB breast image; the equal-proportion window scaling method pre-processes the original breast image.
The equal-proportion window scaling method normalizes the breast lump image to a square region of fixed size, as follows:
1) Set a square window whose side length equals the long side of the breast lump image to be normalized. Align the top-left corner of the breast lump image with the top-left corner of the window and place the image into the window, then scale the window and the breast lump image together in equal proportion to the fixed size, completing the size normalization of the breast lump image; the normalized breast lump image size is 200*200 pixels (an illustrative sketch of this scaling step is given after the data-set split below);
2) Split the training and test data sets:
90% of the MIAS data set is used to train the breast lump image classification model, and the remaining 10% is used to verify the model's correctness.
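As referenced above, a minimal sketch of the equal-proportion window scaling in step 1) follows, using Pillow purely for illustration; the function name and the zero fill of the unused window area are assumptions, while the top-left alignment and the 200*200 target size come from the description.

```python
from PIL import Image

def normalize_breast_image(img: Image.Image, size: int = 200) -> Image.Image:
    """Pad the image into a square window (side = long side), then resize to size x size."""
    side = max(img.width, img.height)               # window side length = long side
    window = Image.new(img.mode, (side, side), 0)   # blank square window
    window.paste(img, (0, 0))                       # top-left corner alignment
    return window.resize((size, size))              # equal-proportion scaling
```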
When training the breast lump image classification model on 90% of the MIAS data set, the samples are augmented in the following ways:
1) Image translation: the original breast lump image is shifted up, down, left and right to complete the sample augmentation;
2) Image rotation:
the original breast lump image is rotated clockwise and counterclockwise by certain angles to complete the sample augmentation;
3) Image mirroring:
the breast lump image is mirrored left-right and up-down to complete the sample augmentation.
Compared with the prior art, the invention proposes a new deep network framework for breast lump image classification that uses the channel information of the imaging image to enhance the features of the breast image data, providing a new idea and solution for research on breast lump image classification. Using artificial intelligence to automatically classify breast lump images and determine whether a breast lump is benign or malignant not only saves a large amount of valuable medical resources and avoids diagnostic errors caused by subjective factors such as the diagnostician's experience and skill level, but also improves diagnostic accuracy and shortens diagnosis time, bringing considerable economic benefit. At the same time, higher diagnostic accuracy greatly reduces patients' physical and mental suffering and financial burden, benefits treatment, and prolongs survival, and therefore has great social value.
Brief description of the drawings
Fig. 1 is the GL-CNN network structure of the invention;
Fig. 2 is the RGB weight adjustment network of the invention;
Fig. 3 is the gray-level network-in-network (GL-NIN) of the invention;
Fig. 4 is the e-exponential function of the invention;
Fig. 5 is a flow chart of the breast image size normalization method of the invention.
Specific embodiment
An embodiment of the invention: the X-ray breast lump image automatic classification method comprises three modules: an RGB channel recombination sub-network, a gray-level breast image convolutional feature extraction module, and a feature extraction and classification module. It proceeds as follows:
1) The RGB channel recombination sub-network adjusts the weights of the three color channels of the RGB breast image, producing an RGB breast lump image recombined with different channel weights.
The RGB channel recombination sub-network is a three-layer neural network. Its input layer is the three-channel breast lump image in RGB format. After the hidden-layer computation, the three weights between the hidden layer and the output layer, wR, wG and wB, rescale the three channels of the original image to give three single-channel maps; stacking these three maps yields the adjusted breast lump image.
The RGB channel recombination network operates as follows:
Input: ((XR, XG, XB), Y), where XR, XG and XB are the gray-level images of the three channels of the RGB breast lump image, with pixel values in [0, 255], and Y is the classification label of the breast lump image, taking one of the two values {malignant, benign};
Output: (X'R, X'G, X'B), the adjusted gray values of the three RGB channels; the adjusted breast lump image is more separable with respect to the benign and malignant lump classes;
Step a) set the weights between the input layer and the hidden layer: wRi = 1, wGi = 1, wBi = 1;
Step b) initialize the weights between the hidden layer and the output layer, wR, wG and wB, with random values in the range (0.5, 1.2);
Step c) forward pass: compute forward from the input layer; the output layer produces the recombined gray-level breast image;
Step d) the gray-level image produced by the output layer serves as the input of the subsequent modules;
Step e) the output layer receives feedback information from the subsequent modules;
Step f) backward pass: compute backward from the output layer and, using the feedback received by the output layer, adjust the weights between the hidden layer and the output layer by the gradient method;
This completes one round of parameter training and adjustment of the RGB channel recombination network.
2) The gray-level breast image convolutional feature extraction module performs convolution on the RGB breast lump image produced in step 1) to generate feature maps. Specifically, an exponential function with base e transforms the gray values of each feature map, stretching the high-brightness regions where the classes differ most and enhancing the differences between RGB breast lump images of different types, thereby achieving feature enhancement.
The gray-level breast image convolutional feature extraction module contains two gray-level network-in-network blocks (Gray Level Network In Network, GL-NIN). A GL-NIN block consists of three layers: a convolutional layer, an e-exponential mapping layer and a max-pooling down-sampling layer. The convolutional layer convolves the input image with 30 3*3 kernels to produce 30 feature maps, which are fed into the e-exponential mapping layer for the differentiating mapping of the gray values and then max-pooled to obtain reduced-dimension feature maps. The reduced feature maps are fed into a 5000-dimensional fully connected layer for feature extraction and then into a Softmax layer for classification, outputting whether the breast image is benign or malignant (an illustrative sketch of this module follows the mapping formulas below).
The differentiating mapping of gray values performed by the e-exponential mapping layer means using an exponential function with the constant e as its base to compress the low-value regions, where the classes differ little, and to stretch the high-value regions, where they differ more, thereby realizing the feature-enhancement function for the gray-level breast image. The mapping layer with base e is computed as follows:
First, the pixel values lie in [0, 255]; they are compressed into [0, 2], i.e. y1 = 2 * x1 / 255,
where x1 is the input pixel value and y1 is the value mapped into [0, 2];
Then the exponential function with base e maps y1 from [0, 2] into [0, 6.39], i.e. y2 = e^y1 - 1,
where y1 is the gray value mapped into the [0, 2] range and y2 is the value mapped into [0, 6.39].
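A minimal end-to-end sketch of this module, together with the channel recombination of step 1), is given below in PyTorch for illustration only. The 30 3*3 convolutions, the e-based mapping, the max-pooling, the 5000-dimensional fully connected layer, the Softmax output and the 200*200 input size come from the description; the padding, the pooling stride of 2, and the min-max normalization of the feature tensor to [0, 2] inside the mapping layer are assumptions.

```python
import torch
import torch.nn as nn

class RGBChannelRecombination(nn.Module):
    """Learnable per-channel weights wR, wG, wB initialized in (0.5, 1.2)."""
    def __init__(self):
        super().__init__()
        self.w = nn.Parameter(torch.empty(3).uniform_(0.5, 1.2))
    def forward(self, x):
        return x * self.w.view(1, 3, 1, 1)

class EIndexMapping(nn.Module):
    """Compress values into [0, 2], then stretch with exp(.) - 1 (range about [0, 6.39])."""
    def forward(self, x):
        x = 2.0 * (x - x.min()) / (x.max() - x.min() + 1e-8)
        return torch.exp(x) - 1.0

class GLNIN(nn.Module):
    """Gray-level network-in-network block: conv (30 3x3 kernels) -> e-mapping -> max pool."""
    def __init__(self, in_channels):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, 30, kernel_size=3, padding=1)
        self.mapping = EIndexMapping()
        self.pool = nn.MaxPool2d(2)
    def forward(self, x):
        return self.pool(self.mapping(self.conv(x)))

class GLCNN(nn.Module):
    """Two stacked GL-NIN blocks, a 5000-dim fully connected layer and a softmax classifier."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.recombine = RGBChannelRecombination()
        self.features = nn.Sequential(GLNIN(3), GLNIN(30))
        self.fc = nn.Linear(30 * 50 * 50, 5000)        # 200x200 input -> 50x50 feature maps
        self.classifier = nn.Linear(5000, num_classes)
    def forward(self, x):
        x = self.features(self.recombine(x))
        x = self.fc(torch.flatten(x, 1))
        return torch.softmax(self.classifier(x), dim=1)
```

For a batch of normalized 200*200 RGB images, GLCNN()(torch.rand(4, 3, 200, 200)) returns a (4, 2) tensor of benign/malignant class probabilities.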
3) The feature extraction and classification module feeds the feature maps of the breast lump image extracted in step 2) into its fully connected layer for feature extraction and then into a Softmax layer for classification, outputting the benign or malignant class of the breast lump.
Image equal-proportion window scaling method:
The equal-proportion window scaling method normalizes the breast lump image to a square region of fixed size, as follows:
1) Set a square window whose side length equals the long side of the breast lump image to be normalized. Align the top-left corner of the breast lump image with the top-left corner of the window and place the image into the window, then scale the window and the breast lump image together in equal proportion to the fixed size, completing the size normalization of the breast lump image; the normalized breast lump image size is 200*200 pixels;
2) Split the training and test data sets:
90% of the MIAS data set is used to train the breast lump image classification model, and the remaining 10% is used to verify the model's correctness. The MIAS data set contains only 120 breast lumps in total; because this sample size is too small, the samples must be augmented to improve the generalization ability of the model. This embodiment augments the samples with the following schemes.
When training the breast lump image classification model on 90% of the MIAS data set, the samples are augmented in the following ways:
1) Image translation: the original breast lump image is shifted up, down, left and right to complete the sample augmentation. Specifically:
all pixels of the original breast lump image are shifted up, down, left and right in steps that are multiples of 5 pixels, 5 shifts in each direction; pixels moved outside the image are discarded and the vacated positions are filled with 0, expanding the samples 20-fold.
2) Image rotation:
the original breast lump image is rotated clockwise and counterclockwise by certain angles to complete the sample augmentation. Specifically:
with the image center as the rotation center, all pixels of the original breast lump image are rotated clockwise and counterclockwise in steps that are multiples of 5 degrees, 5 rotations in each direction; pixels moved outside the image are discarded and the vacated positions are filled with 0, expanding the samples 20-fold.
3) Image mirroring:
the breast lump image is mirrored left-right and up-down to complete the sample augmentation. Specifically:
first, left-right mirroring: create a blank image of the same size as the breast lump image, read the columns of the original breast lump image from left to right and write them into the blank image from right to left, completing the left-right mirroring of the breast lump image; second, up-down mirroring: create a blank image of the same size as the breast lump image, read the rows of the original breast lump image from top to bottom and write them into the blank image from bottom to top, completing the up-down mirroring of the breast lump image.
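A minimal NumPy/SciPy sketch of these three augmentation schemes follows, for illustration only; the 5-pixel translation steps, the 5-degree rotation steps, the zero fill and the two mirror copies come from the description, while the use of scipy.ndimage and the function name are assumptions.

```python
import numpy as np
from scipy.ndimage import shift, rotate

def augment(img: np.ndarray) -> list:
    """Return translated, rotated and mirrored copies of a 2-D breast lump image."""
    out = []
    # 1) translations: multiples of 5 pixels, 5 shifts in each of the four directions
    for step in range(1, 6):
        for dy, dx in [(-5 * step, 0), (5 * step, 0), (0, -5 * step), (0, 5 * step)]:
            out.append(shift(img, (dy, dx), cval=0))      # vacated pixels filled with 0
    # 2) rotations about the image center: multiples of 5 degrees, both directions
    for step in range(1, 6):
        for angle in (5 * step, -5 * step):
            out.append(rotate(img, angle, reshape=False, cval=0))
    # 3) mirroring: left-right and up-down copies
    out.append(np.fliplr(img))
    out.append(np.flipud(img))
    return out
```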

Claims (7)

1. An X-ray breast lump image automatic classification method, characterized in that it comprises three modules: an RGB channel recombination sub-network; a gray-level breast image convolutional feature extraction module; and a feature extraction and classification module; and in that it proceeds as follows:
1) the RGB channel recombination sub-network adjusts the weights of the three color channels of the RGB breast image, producing an RGB breast lump image recombined with different channel weights;
2) the gray-level breast image convolutional feature extraction module performs convolution on the RGB breast lump image produced in step 1) to generate feature maps; specifically, an exponential function with base e transforms the gray values of each feature map, stretching the high-brightness regions where the classes differ most and enhancing the differences between RGB breast lump images of different types, thereby achieving feature enhancement;
3) the feature extraction and classification module feeds the feature maps of the breast lump image extracted in step 2) into its fully connected layer for feature extraction and then into a Softmax layer for classification, outputting the benign or malignant class of the breast lump.
2. The X-ray breast lump image automatic classification method according to claim 1, characterized in that the RGB channel recombination sub-network is a three-layer neural network, its input layer being the three-channel breast lump image in RGB format; after the hidden-layer computation, the three weights between the hidden layer and the output layer, wR, wG and wB, rescale the three channels of the original image to give three single-channel maps, and stacking these three maps yields the adjusted breast lump image.
3. The X-ray breast lump image automatic classification method according to claim 2, characterized in that the RGB channel recombination network operates as follows:
Input: ((XR, XG, XB), Y), where XR, XG and XB are the gray-level images of the three channels of the RGB breast lump image, with pixel values in [0, 255], and Y is the classification label of the breast lump image, taking one of the two values {malignant, benign};
Output: (X'R, X'G, X'B), the adjusted gray values of the three RGB channels; the adjusted breast lump image is more separable with respect to the benign and malignant lump classes;
Step a) set the weights between the input layer and the hidden layer: wRi = 1, wGi = 1, wBi = 1;
Step b) initialize the weights between the hidden layer and the output layer, wR, wG and wB, with random values in the range (0.5, 1.2);
Step c) forward pass: compute forward from the input layer; the output layer produces the recombined gray-level breast image;
Step d) the gray-level image produced by the output layer serves as the input of the subsequent modules;
Step e) the output layer receives feedback information from the subsequent modules;
Step f) backward pass: compute backward from the output layer and, using the feedback received by the output layer, adjust the weights between the hidden layer and the output layer by the gradient method;
This completes one round of parameter training and adjustment of the RGB channel recombination network.
4. The X-ray breast lump image automatic classification method according to claim 1, characterized in that the gray-level breast image convolutional feature extraction module contains two stacked gray-level network-in-network blocks, each consisting of three layers: a convolutional layer, an e-exponential mapping layer and a max-pooling down-sampling layer; the convolutional layer convolves the input image with 30 3*3 kernels to produce 30 feature maps, which are fed into the e-exponential mapping layer for the differentiating mapping of the gray values and then max-pooled to obtain reduced-dimension feature maps; the reduced feature maps are fed into a 5000-dimensional fully connected layer for feature extraction and then into a Softmax layer for classification, outputting whether the breast image is benign or malignant.
5. The X-ray breast lump image automatic classification method according to claim 4, characterized in that the differentiating mapping of gray values performed by the e-exponential mapping layer means using an exponential function with the constant e as its base to compress the low-value regions, where the classes differ little, and to stretch the high-value regions, where they differ more, thereby realizing the feature-enhancement function for the gray-level breast image; the mapping layer with base e is computed as follows:
First, the pixel values lie in [0, 255]; they are compressed into [0, 2], i.e. y1 = 2 * x1 / 255,
where x1 is the input pixel value and y1 is the value mapped into [0, 2];
Then the exponential function with base e maps y1 from [0, 2] into [0, 6.39], i.e. y2 = e^y1 - 1,
where y1 is the gray value mapped into the [0, 2] range and y2 is the value mapped into [0, 6.39].
6. The X-ray breast lump image automatic classification method according to claim 1, characterized in that before step 1) the image first undergoes equal-proportion window scaling, and only then does the RGB channel recombination sub-network adjust the weights of the three color channels of the RGB breast image; the equal-proportion window scaling method pre-processes the original breast image and normalizes the breast lump image to a square region of fixed size, as follows:
1) set a square window whose side length equals the long side of the breast lump image to be normalized; align the top-left corner of the breast lump image with the top-left corner of the window and place the image into the window, then scale the window and the breast lump image together in equal proportion to the fixed size, completing the size normalization of the breast lump image; the normalized breast lump image size is 200*200 pixels;
2) split the training and test data sets:
90% of the MIAS data set is used to train the breast lump image classification model, and the remaining 10% is used to verify the model's correctness.
7. The X-ray breast lump image automatic classification method according to claim 6, characterized in that, when training the breast lump image classification model on 90% of the MIAS data set, the samples are augmented in the following ways:
1) image translation: the original breast lump image is shifted up, down, left and right to complete the sample augmentation;
2) image rotation:
the original breast lump image is rotated clockwise and counterclockwise to complete the sample augmentation;
3) image mirroring:
the breast lump image is mirrored left-right and up-down to complete the sample augmentation.
CN201811140302.3A 2018-09-28 2018-09-28 Automatic classification method for X-ray breast lump images Active CN109409413B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811140302.3A CN109409413B (en) 2018-09-28 2018-09-28 Automatic classification method for X-ray breast lump images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811140302.3A CN109409413B (en) 2018-09-28 2018-09-28 Automatic classification method for X-ray breast lump images

Publications (2)

Publication Number Publication Date
CN109409413A true CN109409413A (en) 2019-03-01
CN109409413B CN109409413B (en) 2022-09-16

Family

ID=65465581

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811140302.3A Active CN109409413B (en) 2018-09-28 2018-09-28 Automatic classification method for X-ray breast lump images

Country Status (1)

Country Link
CN (1) CN109409413B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110232396A (en) * 2019-04-09 2019-09-13 贵州大学 X-ray breast image deep learning classification method
CN110320285A (en) * 2019-07-08 2019-10-11 创新奇智(青岛)科技有限公司 A kind of steel rail defect recognition methods, system and electronic equipment based on ultrasonic signal
CN111126226A (en) * 2019-12-17 2020-05-08 杭州电子科技大学 Radiation source individual identification method based on small sample learning and feature enhancement

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050129297A1 (en) * 2003-12-15 2005-06-16 Kamath Vidya P. Classification of breast lesion method and system
CN104835155A (en) * 2015-04-30 2015-08-12 浙江大学 Fractal-based early-stage breast cancer calcification point computer auxiliary detection method
CN106339591A (en) * 2016-08-25 2017-01-18 汤平 Breast cancer prevention self-service health cloud service system based on deep convolutional neural network
CN106682435A (en) * 2016-12-31 2017-05-17 西安百利信息科技有限公司 System and method for automatically detecting lesions in medical image through multi-model fusion
CN106780451A (en) * 2016-12-07 2017-05-31 西安电子科技大学 X-ray, ultrasound, infrared image fusion breast lump image detecting method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050129297A1 (en) * 2003-12-15 2005-06-16 Kamath Vidya P. Classification of breast lesion method and system
CN104835155A (en) * 2015-04-30 2015-08-12 浙江大学 Fractal-based early-stage breast cancer calcification point computer auxiliary detection method
CN106339591A (en) * 2016-08-25 2017-01-18 汤平 Breast cancer prevention self-service health cloud service system based on deep convolutional neural network
CN106780451A (en) * 2016-12-07 2017-05-31 西安电子科技大学 X-ray, ultrasound, infrared image fusion breast lump image detecting method
CN106682435A (en) * 2016-12-31 2017-05-17 西安百利信息科技有限公司 System and method for automatically detecting lesions in medical image through multi-model fusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
史盛君等 (Shi Shengjun et al.), "基于神经网络的乳腺X光片中肿块检测法" [Neural network-based detection of masses in mammograms], 《哈尔滨理工大学学报》 (Journal of Harbin University of Science and Technology) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110232396A (en) * 2019-04-09 2019-09-13 贵州大学 X-ray breast image deep learning classification method
CN110232396B (en) * 2019-04-09 2022-07-01 贵州大学 X-ray mammary gland image deep learning classification method
CN110320285A (en) * 2019-07-08 2019-10-11 创新奇智(青岛)科技有限公司 A kind of steel rail defect recognition methods, system and electronic equipment based on ultrasonic signal
CN111126226A (en) * 2019-12-17 2020-05-08 杭州电子科技大学 Radiation source individual identification method based on small sample learning and feature enhancement

Also Published As

Publication number Publication date
CN109409413B (en) 2022-09-16

Similar Documents

Publication Publication Date Title
CN109376636B (en) Capsule network-based eye fundus retina image classification method
CN108268870B (en) Multi-scale feature fusion ultrasonic image semantic segmentation method based on counterstudy
CN110930416B (en) MRI image prostate segmentation method based on U-shaped network
CN107657602A (en) Based on the breast structure disorder recognition methods for migrating convolutional neural networks twice
CN109635835A (en) A kind of breast lesion method for detecting area based on deep learning and transfer learning
CN110321920A (en) Image classification method, device, computer readable storage medium and computer equipment
CN109767440A (en) A kind of imaged image data extending method towards deep learning model training and study
CN109584251A (en) A kind of tongue body image partition method based on single goal region segmentation
CN107909566A (en) A kind of image-recognizing method of the cutaneum carcinoma melanoma based on deep learning
CN106682616A (en) Newborn-painful-expression recognition method based on dual-channel-characteristic deep learning
CN106780448A (en) A kind of pernicious sorting technique of ultrasonic Benign Thyroid Nodules based on transfer learning Yu Fusion Features
CN107403201A (en) Tumour radiotherapy target area and jeopardize that organ is intelligent, automation delineation method
CN109409413A (en) X-ray breast lump image automatic classification method
CN109993735A (en) Image partition method based on concatenated convolutional
CN109035160A (en) The fusion method of medical image and the image detecting method learnt based on fusion medical image
CN106408001A (en) Rapid area-of-interest detection method based on depth kernelized hashing
CN111291789B (en) Breast cancer image identification method and system based on multi-stage multi-feature deep fusion
CN105913086A (en) Computer-aided mammary gland diagnosing method by means of characteristic weight adaptive selection
CN108416821A (en) A kind of CT Image Super-resolution Reconstruction methods of deep neural network
CN111488912B (en) Laryngeal disease diagnosis system based on deep learning neural network
CN111709446B (en) X-ray chest radiography classification device based on improved dense connection network
CN109671060A (en) Area of computer aided breast lump detection method based on selective search and CNN
CN111724345A (en) Pneumonia picture verification device and method capable of adaptively adjusting size of receptive field
CN108304889A (en) A kind of digital breast imaging image radiation group method based on deep learning
CN114581474A (en) Automatic clinical target area delineation method based on cervical cancer CT image

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant