CN112801166A - Melanoma classification method based on feature data and dimension scaling - Google Patents


Info

Publication number
CN112801166A
CN112801166A
Authority
CN
China
Prior art keywords
data, image, feature, image data, melanoma
Prior art date
Legal status
Granted
Application number
CN202110094691.6A
Other languages
Chinese (zh)
Other versions
CN112801166B (en)
Inventor
陈嘉豪 (Chen Jiahao)
胡西川 (Hu Xichuan)
Current Assignee
Shanghai Maritime University
Original Assignee
Shanghai Maritime University
Priority date
Filing date
Publication date
Application filed by Shanghai Maritime University
Priority to CN202110094691.6A
Publication of CN112801166A
Application granted
Publication of CN112801166B
Status: Active


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/251Fusion techniques of input or preprocessed data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/047Probabilistic or stochastic networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/082Learning methods modifying the architecture, e.g. adding, deleting or silencing nodes or connections
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4046Scaling the whole image or part thereof using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/32Indexing scheme for image data processing or generation, in general involving image mosaicing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30088Skin; Dermal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03Recognition of patterns in medical or anatomical images

Abstract

The invention discloses a melanoma classification method based on feature data and dimension scaling, which comprises the following steps: performing enhancement processing on an image; extracting data from the enhanced image to obtain target image data; performing data feature extraction on the enhanced image to obtain feature data, wherein the feature data at least comprises gender data and age data; performing data connection on the obtained target image data and feature data to obtain fused image data; and classifying the fused image data. By applying the embodiment of the invention, feature data that strongly influence melanoma incidence are fused with a dimension-scalable EfficientNet, so that the performance and accuracy of the constructed network are significantly improved.

Description

Melanoma classification method based on feature data and dimension scaling
Technical Field
The invention relates to the technical field of computer image recognition, in particular to a melanoma classification method based on feature data and dimension scaling.
Background
Previously, experienced doctors diagnosed melanoma by visual identification with a dermoscope. Such diagnosis has considerable drawbacks and limitations. From an objective point of view, naked-eye judgment is severely affected by hair interference, lighting conditions, and other image artifacts. From a subjective perspective, each doctor judges according to personal experience, so the assessment is strongly subjective. In addition, visual fatigue caused by observing skin images for long periods leads to large differences in diagnostic results, poor repeatability, and a lower diagnostic rate.
With the popularization of computer GPUs and the rapid development of computer vision, Yasuhiro Fujisawa, Sae Inoue, et al. (Y. Fujisawa, S. Inoue, and Y. Nakamura, "The possibility of deep learning-based, computer-aided skin tumor classifiers," Frontiers in Medicine, vol. 6, p. 191, Aug. 2019) applied deep learning to melanoma classification of dermoscopy images and found that the classification ability of some neural-network-based methods has reached the level of an experienced doctor, with some even surpassing the doctors' judgment. Nasiri et al. (Nasiri, S., Helsper, J., Jung, M., et al., "DePicT Melanoma Deep-CLASS: a deep convolutional neural networks approach to classify skin lesion images," BMC Bioinformatics 21, 84 (2020), https://doi.org/10.1186/s12859-020-3351-y) fine-tuned an Inception V3 model pre-trained on ImageNet and then fused handcrafted features with deep-learning features through the dense layers of an artificial neural network; the diagnostic accuracy of the feature-fusion model reached 82.02%. On the ISBI 2016 Skin Lesion Analysis Towards Melanoma Detection challenge dataset, Vinay B. N. et al. (Vinay B. N., Shah P. J., Shekar V., et al., "Detection of Melanoma using Deep Learning Techniques," 2020 International Conference on Computation, Automation and Knowledge Management (ICCAKM), IEEE, 2020: 391-394) segmented images using U-Net and then classified them with a deep residual network, achieving an accuracy of 88.7%. However, the increasing depth of convolutional neural networks brings problems such as overfitting, heavy computation, vanishing gradients, and exploding gradients.
Hu Haigen, Wang Yang, et al. proposed a model combining multiple deep learning network models (patent application publication No. CN108427963A), which gained little accuracy while greatly increasing the amount of computation. The present invention targets the shortcomings of existing melanoma recognition networks and ensemble learning: excessive computation caused by ever-deeper networks, limited accuracy gains, and a tendency to overfit the training set. An Efficient-Mix neural network is proposed for melanoma classification, fusing feature data such as the age and gender of the identified subject with the multi-dimension-scalable EfficientNet; compared with ResNet50, the recognition accuracy of this network is improved by 5%.
Disclosure of Invention
The invention aims to provide a melanoma classification method based on feature data and dimension scaling, so as to solve the problems in the prior art of excessive computation, limited accuracy gains, and a tendency to overfit the training set.
In order to achieve the above object, the present invention provides a melanoma classification method based on feature data and dimension scaling, comprising:
performing enhancement processing on the image;
extracting data of the image after the enhancement processing to obtain target image data;
performing data feature extraction on the image after the enhancement processing to obtain feature data, wherein the feature data at least comprises: gender data and age data;
performing data connection on the obtained target image data and the obtained characteristic data to obtain fused image data;
and classifying the fused image data.
Optionally, the step of extracting data from the enhanced image to obtain target image data includes:
resizing the image subjected to data enhancement and preprocessing to 384 × 384, and performing transfer learning on an EfficientNet pre-trained on ImageNet, so as to calculate the average value of each feature map through a global average pooling layer to avoid overfitting;
and taking the data subjected to the transfer learning as target image data.
In one implementation, the step of extracting the data features of the enhanced image to obtain feature data includes:
encoding the gender data and the age data respectively and inputting them to a fully connected layer of the multidimensional scaling network;
normalizing the obtained feature data and activating it with a ReLU layer.
Optionally, the step of connecting the obtained target image data and the feature data to obtain fused image data includes:
and connecting the obtained target image data and the feature data through a concatenate function to obtain fused image data.
By applying the melanoma classification method based on feature data and dimension scaling provided by the embodiment of the invention, feature data that strongly influence melanoma incidence are fused with a dimension-scalable EfficientNet, so that the performance and accuracy of the network are significantly improved. Problems arising during the research were addressed with image preprocessing, data enhancement, the addition of dropout layers, and similar methods, which better optimize the fitting and generalization ability of the model, reduce the excessive computation, overfitting, gradient explosion, and gradient vanishing caused in the prior art by increasing network depth, and improve the recognition accuracy of the network by 5% compared with ResNet50.
Drawings
Fig. 1 is a flowchart illustrating a melanoma classification method based on feature data and dimension scaling according to an embodiment of the present invention.
FIG. 2 is a schematic diagram of an Efficient-Mix network structure according to an embodiment of the present invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention.
As shown in fig. 1, the present invention provides a melanoma classification method based on feature data and dimension scaling, which includes:
s110, enhancing the image;
it should be noted that deep learning is based on huge data volume and reasonably distributed and accurate data labels to realize accurate identification and classification. Therefore, deep learning requires a large amount of data for training, but melanoma data as medical data is limited in amount, and thus data enhancement is very important. The data quantity and diversity of training data are increased through data enhancement, the generalization capability of the model can be improved, and overfitting is avoided. The data enhancement mainly comprises the operations of rotating, cutting, zooming, stretching, contrast enhancement, brightness enhancement and the like on the image. The data enhancement technique expands the size of the training data set by making a series of random changes to the training images to produce similar but different training samples. Randomly changing the training samples can reduce the dependence of the model on certain attributes, thereby improving the generalization capability of the model.
The multi-dimension-scalable EfficientNet balances all dimensions of the network during scaling, achieving better accuracy and efficiency. A compound coefficient $\phi$ is used to scale the network depth, width, and resolution uniformly:

depth: $d = \alpha^{\phi}$, width: $w = \beta^{\phi}$, resolution: $r = \gamma^{\phi}$, subject to $\alpha \cdot \beta^{2} \cdot \gamma^{2} \approx 2$ with $\alpha \ge 1$, $\beta \ge 1$, $\gamma \ge 1$,

where $\phi$ is a compound coefficient controlling the amount of resources used for scaling, and $\alpha$, $\beta$, $\gamma$ determine how those resources are allocated to network depth, network width, and image resolution, respectively.
Compared with the deeper VGG-16 network, the multi-dimension-scalable EfficientNet balances the relationship among network depth, network width, and resolution, effectively avoiding the overfitting, gradient explosion, gradient vanishing, and heavy computation caused by increasing the number of network layers.
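For concreteness, a small sketch of the compound scaling rule (the base coefficients below are the ones reported in the EfficientNet paper; the patent itself does not specify them):

```python
# Compound scaling: given base coefficients and a compound coefficient
# phi, compute the depth/width/resolution multipliers.
ALPHA, BETA, GAMMA = 1.2, 1.1, 1.15  # satisfy ALPHA * BETA**2 * GAMMA**2 ~= 2

def compound_scale(phi: float) -> tuple[float, float, float]:
    depth = ALPHA ** phi        # multiplier on the number of layers
    width = BETA ** phi         # multiplier on the number of channels
    resolution = GAMMA ** phi   # multiplier on the input image size
    return depth, width, resolution

# phi = 3 roughly corresponds to EfficientNet-B3:
# compound_scale(3) -> (1.728, 1.331, 1.521)
```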
S120, extracting data of the image after the enhancement processing to obtain target image data;
it can be understood that in the model structure, the melanoma image subjected to data enhancement and preprocessing is firstly resized to 384 × 384, and then the image is input to the efficiency network trained on ImageNet and is subjected to migration learning at the data extraction layer, so as to obtain image data. And calculating the average value of each feature map through a global average pooling layer to avoid overfitting.
S130, performing data feature extraction on the image after the enhancement processing to obtain feature data, wherein the feature data at least comprises: gender data and age data;
in the present invention, characteristic data strongly associated with melanoma, i.e., sex and age, are converted into a subject, a full link layer of the input data is used to extract data characteristics, and then the extracted characteristic data is normalized and activated by the RELU layer. Experiments show that when the output sizes of the sex and the age are adjusted to 50, the influence of the sex and the age is too small, so that the effect of the fused network is poor; when the size is adjusted to 500, the occupation proportion of gender and age is too high, the decision factor of the image is too low, and the accuracy rate is reduced. And when the size is adjusted to be about 100, the model effect is best.
S140, performing data connection on the obtained target image data and the feature data to obtain fused image data;
it can be understood that, in the converged network, a very critical step is to connect the processed image data and the feature data at a converged connection layer by using a concatenate function, and through this step, the Efficient-Mix network model of the present invention merges the feature information and the image information at the same time. The model and the data are more closely associated, melanoma can be predicted more accurately, and compared with an effective network which is judged only according to image information, the identification accuracy is effectively improved by about 1%.
And S150, classifying the fused image data.
It should be noted that the Efficient-Mix network model uses a fully connected layer with ReLU as the activation function, a BN layer for data normalization, and a dropout layer to prevent overfitting, and finally adds a classification layer with a Sigmoid activation function to classify the fused image data, thereby quickly and accurately distinguishing ordinary melanocytic nevi from melanoma. The Efficient-Mix network structure is shown in fig. 2.
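The corresponding classification head, continuing the fused tensor above (the hidden size and dropout rate are illustrative assumptions; the patent specifies the layer types but not these values):

```python
from tensorflow.keras import layers, Model

# Classification head: FC + ReLU, BN, dropout, then a Sigmoid output
# that separates ordinary melanocytic nevi from melanoma.
x = layers.Dense(256, activation="relu")(fused)
x = layers.BatchNormalization()(x)
x = layers.Dropout(0.5)(x)
prediction = layers.Dense(1, activation="sigmoid", name="melanoma")(x)

model = Model(inputs=[image_in, meta_in], outputs=prediction,
              name="efficient_mix")
```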
Since the class distribution of the SIIM melanoma dataset is extremely unbalanced, the loss function used here is Focal Loss. Compared with the traditional binary cross-entropy loss, Focal Loss not only addresses the uneven distribution of positive and negative samples through a class weighting factor, but also balances the proportion of easily classified and easily misclassified samples with a modulating factor. The conventional binary cross-entropy loss is:

$$CE(p, y) = \begin{cases} -\log(p) & y = 1 \\ -\log(1-p) & \text{otherwise} \end{cases}$$

where $p$ is the probability, between 0 and 1, that the activation function predicts the sample as class 1, and $y$ is the true class of the sample. For positive samples, the higher the predicted probability, the lower the loss; conversely, for negative samples, the lower the predicted probability, the lower the loss. A common remedy for class imbalance is to assign different weights to the classes: positive samples receive a weighting factor $\alpha \in [0, 1]$ and negative samples receive $1 - \alpha$. Writing $\alpha_t$ for the class weight and $p_t$ for the predicted probability of the true class,

$$p_t = \begin{cases} p & y = 1 \\ 1 - p & \text{otherwise} \end{cases} \qquad \alpha_t = \begin{cases} \alpha & y = 1 \\ 1 - \alpha & \text{otherwise} \end{cases}$$

the weighted cross-entropy loss becomes:

$$CE(p_t) = -\alpha_t \log(p_t)$$

$\alpha$ balances the proportion of positive and negative samples, but it cannot distinguish easily classified samples from easily misclassified ones. Focal Loss adds the modulating factor $(1 - p_t)^{\gamma}$ to the CE loss, where $\gamma$ is the focusing parameter:

$$FL(p_t) = -(1 - p_t)^{\gamma} \log(p_t)$$

When many hard-to-distinguish samples receive low predicted probabilities, Focal Loss performs better.
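A minimal sketch of this loss, combining the class weight $\alpha_t$ with the modulating factor as discussed above (the $\alpha$ and $\gamma$ defaults are common choices, not values reported in the patent):

```python
import tensorflow as tf

def binary_focal_loss(alpha: float = 0.25, gamma: float = 2.0):
    """FL(p_t) = -alpha_t * (1 - p_t)**gamma * log(p_t) for binary labels."""
    def loss_fn(y_true, y_pred):
        y_true = tf.cast(y_true, y_pred.dtype)
        eps = tf.keras.backend.epsilon()
        y_pred = tf.clip_by_value(y_pred, eps, 1.0 - eps)
        # p_t: predicted probability of the true class;
        # alpha_t: alpha for positives, 1 - alpha for negatives.
        p_t = y_true * y_pred + (1.0 - y_true) * (1.0 - y_pred)
        alpha_t = y_true * alpha + (1.0 - y_true) * (1.0 - alpha)
        return -tf.reduce_mean(
            alpha_t * tf.pow(1.0 - p_t, gamma) * tf.math.log(p_t))
    return loss_fn

# e.g. model.compile(optimizer="adam", loss=binary_focal_loss(),
#                    metrics=["AUC"])
```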
The foregoing embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the present invention shall still be covered by the claims of the present invention.

Claims (4)

1. A melanoma classification method based on feature data and dimension scaling is characterized by comprising the following steps:
performing enhancement processing on the image;
extracting data of the image after the enhancement processing to obtain target image data;
performing data feature extraction on the image after the enhancement processing to obtain feature data, wherein the feature data at least comprises: gender data and age data;
performing data connection on the obtained target image data and the obtained characteristic data to obtain fused image data;
and classifying the fused image data.
2. The melanoma classification method based on feature data and dimension scaling according to claim 1, wherein the step of extracting the data of the image after the enhancement processing to obtain target image data comprises:
resizing the image subjected to data enhancement and preprocessing to 384 × 384, and performing transfer learning on an EfficientNet pre-trained on ImageNet, so as to calculate the average value of each feature map through a global average pooling layer to avoid overfitting;
and taking the data subjected to the transfer learning as target image data.
3. The melanoma classification method based on feature data and dimension scaling according to claim 1, wherein the step of performing data feature extraction on the image after enhancement processing to obtain feature data comprises:
encoding the gender data and the age data respectively and inputting them to a fully connected layer of the multidimensional scaling network;
normalizing the obtained feature data and activating it with a ReLU layer.
4. The melanoma classification method based on feature data and dimension scaling according to claim 1, wherein the step of connecting the obtained target image data and the feature data to obtain fused image data comprises:
and connecting the obtained target image data and the feature data through a concatenate function to obtain fused image data.
CN202110094691.6A 2021-01-25 2021-01-25 Melanoma classification method based on characteristic data and dimension scaling Active CN112801166B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110094691.6A CN112801166B (en) 2021-01-25 2021-01-25 Melanoma classification method based on characteristic data and dimension scaling


Publications (2)

Publication Number Publication Date
CN112801166A 2021-05-14
CN112801166B 2024-03-01

Family

ID=75811483

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110094691.6A Active CN112801166B (en) 2021-01-25 2021-01-25 Melanoma classification method based on characteristic data and dimension scaling

Country Status (1)

Country Link
CN (1) CN112801166B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113610061A (en) * 2021-09-30 2021-11-05 国网浙江省电力有限公司电力科学研究院 Method and system for identifying unstressed conducting wire based on target detection and residual error network

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107909566A (en) * 2017-10-28 2018-04-13 杭州电子科技大学 A kind of image-recognizing method of the cutaneum carcinoma melanoma based on deep learning
CN108427963A (en) * 2018-03-02 2018-08-21 浙江工业大学 A kind of dermopathic classifying identification method of melanoma based on deep learning
CN109711281A (en) * 2018-12-10 2019-05-03 复旦大学 A kind of pedestrian based on deep learning identifies again identifies fusion method with feature
WO2021000906A1 (en) * 2019-07-02 2021-01-07 五邑大学 Sar image-oriented small-sample semantic feature enhancement method and apparatus
CN111191660A (en) * 2019-12-30 2020-05-22 浙江工业大学 Rectal cancer pathology image classification method based on multi-channel collaborative capsule network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI Hang; YU Zhen; NI Dong; LEI Baiying; WANG Tianfu: "Recognition of Melanoma in Dermoscopy Images Based on Deep Residual Networks", Chinese Journal of Biomedical Engineering, no. 03 *
HU Haigen; KONG Xiangyong; ZHOU Qianwei; GUAN Qiu; CHEN Shengyong: "Melanoma Classification Method Based on an Ensemble of Deep Convolutional Residual Networks", Computer Science, no. 05 *


Also Published As

Publication number Publication date
CN112801166B (en) 2024-03-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant