CN112801166A - Melanoma classification method based on feature data and dimension scaling - Google Patents
- Publication number: CN112801166A
- Application number: CN202110094691.6A
- Authority: CN (China)
- Prior art keywords: data, image, feature, image data, melanoma
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F18/2415: Classification techniques based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus false rejection rate
- G06F18/214: Generating training patterns; bootstrap methods, e.g. bagging or boosting
- G06F18/251: Fusion techniques of input or preprocessed data
- G06F18/253: Fusion techniques of extracted features
- G06N3/047: Probabilistic or stochastic networks
- G06N3/048: Activation functions
- G06N3/082: Learning methods modifying the architecture, e.g. adding, deleting or silencing nodes or connections
- G06T3/4046: Scaling the whole image or part thereof using neural networks
- G06T7/0012: Biomedical image inspection
- G06V10/20: Image preprocessing
- G06V10/40: Extraction of image or video features
- G06T2200/32: Indexing scheme involving image mosaicing
- G06T2207/20076: Probabilistic image processing
- G06T2207/20081: Training; learning
- G06T2207/20084: Artificial neural networks [ANN]
- G06T2207/30088: Skin; dermal
- G06T2207/30096: Tumor; lesion
- G06V2201/03: Recognition of patterns in medical or anatomical images
Abstract
The invention discloses a melanoma classification method based on feature data and dimension scaling, which comprises the following steps: performing enhancement processing on an image; extracting data from the enhanced image to obtain target image data; performing data feature extraction on the enhanced image to obtain feature data, the feature data comprising at least gender data and age data; concatenating the obtained target image data with the obtained feature data to obtain fused image data; and classifying the fused image data. By applying the embodiments of the invention, feature data that strongly influence melanoma incidence are fused with a multi-dimensionally scalable EfficientNet, markedly improving the performance and accuracy of the constructed network.
Description
Technical Field
The invention relates to the technical field of computer image recognition, in particular to a melanoma classification method based on feature data and dimension scaling.
Background
Traditionally, experienced doctors have diagnosed melanoma by visual identification with a dermoscope. Such diagnosis has serious drawbacks and limitations. Objectively, naked-eye judgment is heavily affected by hair interference, lighting conditions, and similar artifacts in the image. Subjectively, each doctor judges according to personal experience, so the assessment is strongly subjective. In addition, the eye strain caused by examining skin images for long periods leads to large variation in diagnostic results, poor repeatability, and a low diagnostic rate.
With the popularization of GPU hardware and the rapid development of computer vision, deep learning has been applied to melanoma classification of dermoscopy images. Yasuhiro Fujisawa, Sae Inoue, et al. (Y. Fujisawa, S. Inoue, and Y. Nakamura, "The possibility of deep learning-based, computer-aided skin tumor classifiers," Frontiers Med., vol. 6, p. 191, Aug. 2019) found that the classification capability of some deep-learning-based neural network methods has reached the level of an experienced doctor, and some even outperform the doctor's judgment. Nasiri et al. ("DePicT Melanoma Deep-CLASS: a deep convolutional neural networks approach to classify skin lesion images," BMC Bioinformatics 21, 84 (2020), https://doi.org/10.1186/s12859-020-3351-y) fine-tuned an Inception V3 model pre-trained on ImageNet and then fused handcrafted features with deep-learning features through the dense layers of an artificial neural network; the diagnostic accuracy of the feature fusion model reached 82.02%. On the ISBI 2016 "Skin Lesion Analysis Towards Melanoma Detection" challenge dataset, Vinay B N et al. (Vinay B N, Shah P J, Shekar V, et al., "Detection of Melanoma using Deep Learning Techniques," 2020 International Conference on Computation, Automation and Knowledge Management (ICCAKM), IEEE, 2020: 391-394) segmented images using U-Net and then classified them with a deep residual network, achieving an accuracy of 88.7%. However, the increasing depth of convolutional neural networks brings problems such as overfitting, heavy computation, vanishing gradients, and exploding gradients.
Hu and Wang et al. proposed a model combining multiple deep learning network models (patent application publication No. CN108427963A), which improved accuracy only slightly while greatly increasing the amount of computation. The present method targets the defects of existing melanoma recognition networks: the excessive computation, small accuracy gains, and tendency to overfit the training set caused by increasing network depth and joint learning. It proposes Efficient-Mix, a neural network that fuses feature data such as the age and gender of the identified subject with the multi-dimensionally scalable EfficientNet to classify melanoma; the network's recognition accuracy is 5% higher than that of ResNet50.
Disclosure of Invention
The invention aims to provide a melanoma classification method based on feature data and dimension scaling, so as to solve the prior-art problems of excessive computation, small accuracy gains, and a tendency to overfit the training set.
In order to achieve the above object, the present invention provides a melanoma classification method based on feature data and dimension scaling, comprising:
performing enhancement processing on the image;
extracting data of the image after the enhancement processing to obtain target image data;
performing data feature extraction on the image after the enhancement processing to obtain feature data, wherein the feature data at least comprises: gender data and age data;
performing data connection on the obtained target image data and the obtained characteristic data to obtain fused image data;
and classifying the fused image data.
Optionally, the step of extracting data from the enhanced image to obtain target image data includes:
resizing the image subjected to data enhancement and preprocessing to 384 × 384, and performing transfer learning on an EfficientNet pre-trained on ImageNet, so as to compute the average value of each feature map through a global average pooling layer to avoid overfitting;
and taking the data obtained by the transfer learning as the target image data.
In one implementation, the step of extracting the data features of the enhanced image to obtain feature data includes:
respectively converting the gender data and the age data into input objects and feeding them to a fully connected layer of the multidimensional scaling network;
normalizing the obtained feature data and activating it with a ReLU layer.
Optionally, the step of connecting the obtained target image data and the feature data to obtain fused image data includes:
and connecting the obtained target image data and feature data through a concatenate function to obtain fused image data.
By applying the melanoma classification method based on feature data and dimension scaling provided by the embodiments of the invention, feature data that strongly influence melanoma incidence are fused with the scalable EfficientNet, markedly improving the network's performance and accuracy. Problems encountered during the research were resolved by image preprocessing, data enhancement, the addition of dropout layers, and similar measures, which better optimize the model's fitting and generalization ability, mitigate the excessive computation, overfitting, gradient explosion, and gradient vanishing caused in the prior art by increasing network depth, and raise the network's recognition accuracy by 5% over ResNet50.
Drawings
Fig. 1 is a flowchart illustrating a melanoma classification method based on feature data and dimension scaling according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of an Efficient-Mix network structure according to an embodiment of the present invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention.
As shown in fig. 1, the present invention provides a melanoma classification method based on feature data and dimension scaling, which includes:
s110, enhancing the image;
it should be noted that deep learning is based on huge data volume and reasonably distributed and accurate data labels to realize accurate identification and classification. Therefore, deep learning requires a large amount of data for training, but melanoma data as medical data is limited in amount, and thus data enhancement is very important. The data quantity and diversity of training data are increased through data enhancement, the generalization capability of the model can be improved, and overfitting is avoided. The data enhancement mainly comprises the operations of rotating, cutting, zooming, stretching, contrast enhancement, brightness enhancement and the like on the image. The data enhancement technique expands the size of the training data set by making a series of random changes to the training images to produce similar but different training samples. Randomly changing the training samples can reduce the dependence of the model on certain attributes, thereby improving the generalization capability of the model.
The multi-dimensionally scalable EfficientNet balances all dimensions of the network during scaling, achieving better accuracy and efficiency. A compound coefficient φ is used to scale the network depth, width, and resolution uniformly, as shown in the formula:

depth: d = α^φ; width: w = β^φ; resolution: r = γ^φ, subject to α·β²·γ² ≈ 2 and α ≥ 1, β ≥ 1, γ ≥ 1,

where φ is a user-specified coefficient that controls how much additional resource is available, and α, β, γ determine the specific allocation of that resource to network depth, network width, and image resolution, respectively.
Compared with the deeper VGG-16 network, the multi-dimensionally scaled EfficientNet balances the relationship among network depth, network width, and resolution, effectively avoiding the overfitting, gradient explosion, gradient vanishing, and heavy computation caused by increasing the number of network layers.
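The compound-scaling rule can be made concrete in a few lines. The default α, β, γ below are the values reported for the EfficientNet-B0 baseline in the EfficientNet paper; the patent does not state which values it uses, so treat them as an assumption.

```python
def compound_scale(phi: float, alpha: float = 1.2, beta: float = 1.1,
                   gamma: float = 1.15):
    """Return the (depth, width, resolution) multipliers
    d = alpha**phi, w = beta**phi, r = gamma**phi
    for compound coefficient phi."""
    return alpha ** phi, beta ** phi, gamma ** phi

# The grid-search constraint alpha * beta**2 * gamma**2 ~= 2 means that
# increasing phi by 1 roughly doubles the network's FLOPs.
d, w, r = compound_scale(1)
```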
S120, extracting data of the image after the enhancement processing to obtain target image data;
it can be understood that in the model structure, the melanoma image subjected to data enhancement and preprocessing is firstly resized to 384 × 384, and then the image is input to the efficiency network trained on ImageNet and is subjected to migration learning at the data extraction layer, so as to obtain image data. And calculating the average value of each feature map through a global average pooling layer to avoid overfitting.
S130, performing data feature extraction on the image after the enhancement processing to obtain feature data, wherein the feature data at least comprises: gender data and age data;
in the present invention, characteristic data strongly associated with melanoma, i.e., sex and age, are converted into a subject, a full link layer of the input data is used to extract data characteristics, and then the extracted characteristic data is normalized and activated by the RELU layer. Experiments show that when the output sizes of the sex and the age are adjusted to 50, the influence of the sex and the age is too small, so that the effect of the fused network is poor; when the size is adjusted to 500, the occupation proportion of gender and age is too high, the decision factor of the image is too low, and the accuracy rate is reduced. And when the size is adjusted to be about 100, the model effect is best.
S140, performing data connection on the obtained target image data and the feature data to obtain fused image data;
it can be understood that, in the converged network, a very critical step is to connect the processed image data and the feature data at a converged connection layer by using a concatenate function, and through this step, the Efficient-Mix network model of the present invention merges the feature information and the image information at the same time. The model and the data are more closely associated, melanoma can be predicted more accurately, and compared with an effective network which is judged only according to image information, the identification accuracy is effectively improved by about 1%.
And S150, classifying the fused image data.
It should be noted that the Efficient-Mix network model uses a fully connected layer with ReLU as the activation function, a BN layer for data normalization, and a dropout layer to prevent overfitting; finally, a classification layer with a Sigmoid activation function is added to classify the fused image data, quickly and accurately distinguishing ordinary melanocytic nevi from melanoma. The Efficient-Mix network structure is shown in fig. 2.
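An inference-time sketch of the final classification layer. Dropout and batch normalization act only during training, so they are omitted here, and the weights are zero placeholders rather than trained values.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def classify(fused: np.ndarray, w: np.ndarray, b: float, threshold: float = 0.5):
    """Single sigmoid unit on the fused vector: returns (probability, decision)."""
    p = sigmoid(w @ fused + b)   # probability that the lesion is melanoma
    return p, bool(p >= threshold)

w = np.zeros(1892)               # placeholder weights for the fused vector
p, is_melanoma = classify(np.zeros(1892), w, b=0.0)
```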
Since the class distribution of the SIIM melanoma dataset is extremely unbalanced, the loss function here adopts Focal Loss. Compared with the conventional binary cross-entropy loss, Focal Loss not only addresses the uneven distribution of positive and negative samples through a class-specific weight coefficient, but also uses a modulating factor to balance the proportion of easily classified and easily misclassified samples. The conventional binary cross-entropy loss is

CE(p, y) = -y·log(p) - (1-y)·log(1-p)

where p, between 0 and 1, is the probability output by the activation function that the sample belongs to class 1, and y is the sample's true class label. For a positive sample, the higher the output probability, the lower the loss; conversely, for a negative sample, the lower the output probability, the lower the loss. A common remedy for class imbalance is to give different classes different weight factors: the positive class receives a weight α ∈ [0, 1] and the negative class 1-α. Writing α_t for the weight of either class and, likewise, p_t for the predicted probability of the true class (p_t = p when y = 1, and 1-p otherwise), the weighted CE loss can be derived as shown in the following formula.
CE(p_t) = -α_t·log(p_t)
α can balance the proportion of positive and negative samples, but it cannot distinguish easily classified samples from easily misclassified ones. Focal Loss adds the modulating factor (1-p_t)^γ to the CE loss, giving the Focal Loss shown below, where γ is the focusing parameter. Focal Loss performs better on hard-to-distinguish samples, i.e. when there are many samples with a low predicted probability.
FL(p_t) = -α_t·(1-p_t)^γ·log(p_t)
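The derivation above translates directly into code. The α = 0.25, γ = 2 defaults below are the common values from the Focal Loss literature, not values stated in the patent.

```python
import numpy as np

def focal_loss(p: np.ndarray, y: np.ndarray,
               alpha: float = 0.25, gamma: float = 2.0) -> np.ndarray:
    """Binary focal loss FL(p_t) = -alpha_t * (1 - p_t)**gamma * log(p_t),
    where p is the sigmoid output and y the true label in {0, 1}."""
    p_t = np.where(y == 1, p, 1.0 - p)              # prob. of the true class
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)  # class-specific weight
    return -alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)

# an easy (confidently correct) positive is down-weighted far more
# strongly than a hard (confidently wrong) one
easy = focal_loss(np.array([0.9]), np.array([1]))
hard = focal_loss(np.array([0.1]), np.array([1]))
```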
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed herein shall be covered by the claims of the present invention.
Claims (4)
1. A melanoma classification method based on feature data and dimension scaling is characterized by comprising the following steps:
performing enhancement processing on the image;
extracting data of the image after the enhancement processing to obtain target image data;
performing data feature extraction on the image after the enhancement processing to obtain feature data, wherein the feature data at least comprises: gender data and age data;
performing data connection on the obtained target image data and the obtained characteristic data to obtain fused image data;
and classifying the fused image data.
2. The melanoma classification method based on feature data and dimension scaling according to claim 1, wherein the step of extracting the data of the image after the enhancement processing to obtain target image data comprises:
resizing the image subjected to data enhancement and preprocessing to 384 × 384, and performing transfer learning on an EfficientNet pre-trained on ImageNet, so as to compute the average value of each feature map through a global average pooling layer to avoid overfitting;
and taking the data subjected to the migration learning as target image data.
3. The melanoma classification method based on feature data and dimension scaling according to claim 1, wherein the step of performing data feature extraction on the image after enhancement processing to obtain feature data comprises:
respectively converting the gender data and the age data into input objects and feeding them to a fully connected layer of the multidimensional scaling network;
normalizing the obtained feature data and activating it with a ReLU layer.
4. The melanoma classification method based on feature data and dimension scaling according to claim 1, wherein the step of connecting the obtained target image data and the feature data to obtain fused image data comprises:
and connecting the obtained target image data and feature data through a concatenate function to obtain fused image data.
Priority Applications (1)
- CN202110094691.6A, priority date 2021-01-25, filing date 2021-01-25: Melanoma classification method based on characteristic data and dimension scaling (granted as CN112801166B)
Publications (2)
- CN112801166A (application publication): 2021-05-14
- CN112801166B (granted patent): 2024-03-01
Family
- ID: 75811483
- CN202110094691.6A, filed 2021-01-25, granted as CN112801166B, status: Active
Country Status (1)
- CN: CN112801166B (granted)
Cited By (1)
- CN113610061A, priority 2021-09-30, published 2021-11-05: Method and system for identifying an unstressed conducting wire based on target detection and a residual error network
Citations (5)
- CN107909566A, priority 2017-10-28, published 2018-04-13: An image recognition method for cutaneous melanoma based on deep learning
- CN108427963A, priority 2018-03-02, published 2018-08-21: A melanoma skin-disease classification and recognition method based on deep learning
- CN109711281A, priority 2018-12-10, published 2019-05-03: A deep-learning-based method fusing pedestrian re-identification and feature recognition
- CN111191660A, priority 2019-12-30, published 2020-05-22: Rectal cancer pathology image classification method based on a multi-channel collaborative capsule network
- WO2021000906A1, priority 2019-07-02, published 2021-01-07: SAR image-oriented small-sample semantic feature enhancement method and apparatus
Non-Patent Citations (2)
- Li Hang, Yu Zhen, Ni Dong, Lei Baiying, Wang Tianfu: "Recognition of melanoma in dermoscopy images based on deep residual networks," Chinese Journal of Biomedical Engineering, no. 03
- Hu Haigen, Kong Xiangyong, Zhou Qianwei, Guan Qiu, Chen Shengyong: "Melanoma classification method based on an ensemble of deep convolutional residual networks," Computer Science, no. 05
Also Published As
- CN112801166B, published 2024-03-01
Similar Documents
- CN109583342B: Human face living body detection method based on transfer learning
- CN107316307B: Automatic segmentation method for traditional Chinese medicine tongue images based on a deep convolutional neural network
- CN106803247B: Microangioma image identification method based on a multistage screening convolutional neural network
- CN108364288B: Segmentation method and device for breast cancer pathological images
- CN110428432B: Deep neural network algorithm for automatically segmenting colon gland images
- CN107316294B: Lung nodule feature extraction method based on an improved deep Boltzmann machine
- CN111738064B: Haze concentration identification method for haze images
- CN109389129A: An image processing method, electronic device, and storage medium
- CN112700461B: System for pulmonary nodule detection and characterization class identification
- CN112529870B: Multi-scale CNN lung nodule false-positive elimination method combining the source and frequency domains
- CN112926652B: Fish fine-grained image recognition method based on deep learning
- CN109801305B: SAR image change detection method based on a deep capsule network
- CN110264454B: Cervical cancer histopathological image diagnosis method based on multi-hidden-layer conditional random fields
- Song et al.: Hybrid deep autoencoder with Curvature Gaussian for detection of various types of cells in bone marrow trephine biopsy images
- CN111968124B: Shoulder musculoskeletal ultrasound structure segmentation method based on semi-supervised semantic segmentation
- CN114648806A: Multi-mechanism adaptive fundus image segmentation method
- CN112183237A: Automatic white blood cell classification method based on color-space adaptive threshold segmentation
- CN111815563A: Retinal optic disc segmentation method combining U-Net and region-growing PCNN
- Eassa et al.: Automated detection and classification of galaxies based on their brightness patterns
- CN112801166A: Melanoma classification method based on feature data and dimension scaling
- CN109741351A: A classification-sensitive edge detection method based on deep learning
- Nirmala et al.: HoG-based naive Bayes classifier for glaucoma detection
- Yadav et al.: Oral cancer detection using a deep learning approach
- Vig et al.: Entropy-based multilevel 2D histogram image segmentation using the DEWO optimization algorithm
- CN112001896B: Thyroid border irregularity detection device
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination
- GR01: Patent grant