CN110321968B - Ultrasonic image classification device - Google Patents


Info

Publication number
CN110321968B
CN110321968B (application CN201910624643.6A)
Authority
CN
China
Prior art keywords
image
unit
sample
images
value
Prior art date
Legal status (an assumption, not a legal conclusion): Active
Application number
CN201910624643.6A
Other languages
Chinese (zh)
Other versions
CN110321968A (en)
Inventor
艾雄志
王永华
万频
齐蕾
Current Assignee (the listed assignees may be inaccurate)
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN201910624643.6A priority Critical patent/CN110321968B/en
Publication of CN110321968A publication Critical patent/CN110321968A/en
Application granted granted Critical
Publication of CN110321968B publication Critical patent/CN110321968B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/23 - Clustering techniques
    • G06F18/24 - Classification techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The embodiment of the invention discloses an ultrasound image classification device. A conversion unit performs format conversion on an acquired ultrasound image to obtain a target ultrasound image, and a residual processing unit performs residual processing on the target ultrasound image to obtain a plurality of sub-images. Residual processing divides the ultrasound image into a plurality of sub-images, so that it can be processed in smaller image units, which effectively improves the accuracy of image analysis. Compared with conventional convolutional analysis, residual processing also reduces the amount of data to be processed, improving the efficiency of image classification. A determination unit analyzes the sub-images with a trained capsule network and determines the image category to which the ultrasound image belongs. By performing cluster analysis on the sub-images, the capsule network obtains the correlation between the ultrasound image and each image category, so the image category to which the ultrasound image belongs can be estimated accurately.

Description

Ultrasonic image classification device
Technical Field
The invention relates to the technical field of ultrasonic image processing, in particular to an ultrasonic image classification device.
Background
Papillary thyroid carcinoma is one of the most common types of cancer; its presentation can be classified into irregular shape, unclear boundary, echo unevenness, calcification, and the like. Ultrasound examination is the preferred imaging examination. Medical image data are complex and diverse and place high demands on physicians. When a large number of ultrasound images need to be diagnosed, the diagnostic efficiency of the physician tends to decrease as the number of images increases, and physician fatigue increases the misjudgment rate. Sending results preprocessed by technologies such as artificial intelligence to the physician for judgment can therefore greatly reduce the physician's workload and improve working efficiency.
Existing medical image classification technology uses a convolutional neural network as its basic module. Because of the high complexity of medical images, ultrasound images suffer from a series of problems such as cell overlapping, edge blurring, and vascular interference, and using a convolutional neural network alone as the classifier often produces a large number of false positives; such low accuracy is of little help to physicians in actual medical judgment.
It can be seen that how to improve the accuracy of ultrasound image classification is a problem that needs to be solved by those skilled in the art.
Disclosure of Invention
The embodiment of the invention aims to provide an ultrasonic image classification device which can improve the accuracy of ultrasonic image classification.
In order to solve the technical problems, an embodiment of the present invention provides an ultrasound image classification device, which includes a conversion unit, a residual processing unit, and a determination unit;
the conversion unit is used for carrying out format conversion on the acquired ultrasonic image to obtain a target ultrasonic image;
the residual processing unit is used for carrying out residual processing on the target ultrasonic image to obtain a plurality of sub-images;
the determining unit is used for analyzing the plurality of sub-images by utilizing the trained capsule network and determining the image category to which the ultrasonic image belongs.
Optionally, the determining unit includes an extracting subunit, a clustering subunit and a classifying subunit;
the extraction subunit is used for extracting primary features of each sub-image and converting the primary features into feature vectors according to preset dimensions;
the clustering subunit is used for clustering the feature vectors by utilizing a dynamic routing algorithm to obtain an output result;
and the classifying subunit is used for determining the image category to which the ultrasonic image belongs according to the norm value of the output result.
Optionally, the conversion unit includes a format conversion subunit and a size conversion subunit;
the format conversion subunit is used for converting the acquired ultrasonic image into a jpg image;
the size conversion subunit is configured to fix the jpg image to a target ultrasound image with a preset pixel size by using a bilinear interpolation method.
Optionally, for the training process of the capsule network, the device includes an extraction unit, a clustering unit, a calculation unit, a judgment unit, an adjustment unit and an output unit;
the extraction unit is used for extracting sample primary features of each sample image and converting the sample primary features into sample feature vectors according to preset dimensions; each sample image has a corresponding actual image class value;
the clustering unit is used for carrying out clustering processing on sample feature vectors corresponding to the same target sample image by utilizing a dynamic routing algorithm to obtain a sample output result;
the calculating unit is used for calculating a norm value corresponding to the sample output result; taking the norm value with the maximum value as the sample norm value of the target sample image;
the judging unit is used for judging whether the sample norm value is matched with the actual image class value or not; if not, triggering the adjusting unit; if yes, triggering the output unit;
the adjusting unit is used for adjusting the weight value of the capsule network according to the deviation value of the sample output value and the actual image class value; and returning to the step of extracting the primary characteristics of the samples of each sample image after adjusting the weight value;
and the output unit is used for ending training and outputting a trained capsule network.
Optionally, the clustering unit is specifically configured to obtain the sample output result according to the following formula:

$$v_j = \frac{\|s_j\|^2}{1 + \|s_j\|^2} \cdot \frac{s_j}{\|s_j\|}$$

in which

$$s_j = \sum_i c_{ij}\, \hat{u}_{j|i}, \qquad \hat{u}_{j|i} = W_{ij}\, u_i, \qquad c_{ij} = \frac{\exp(b_{ij})}{\sum_{r=1}^{k} \exp(b_{ir})}$$

where $v_j$ denotes the output result of the $j$-th capsule; $s_j$ denotes the input vector of the $j$-th capsule; the squashing factor $\frac{\|s_j\|^2}{1 + \|s_j\|^2}$ keeps the output value of the capsule layer within the interval $[0, 1]$; $c_{ij}$ denotes the coupling value between adjacent capsule layers $i$ and $j$; $\hat{u}_{j|i}$ is the prediction vector for capsule $j$ computed from the $i$-th capsule layer; $b_{ij}$ denotes the probability that the $i$-th capsule is selected by the $j$-th capsule, with its initial value set to 0 when the routing layer starts executing; $k$ is the total number of capsules in the capsule layer, $j \in [1, k]$; $W_{ij}$ denotes the weight values used for learning and back propagation; and $u_i$ denotes the output of the $i$-th capsule.
Optionally, the judging unit is specifically configured to calculate the deviation value $L_c$ between the sample output value and the actual image class value according to the following formula:

$$L_c = T_c \max(0,\, m^+ - \|v_c\|)^2 + \lambda (1 - T_c) \max(0,\, \|v_c\| - m^-)^2$$

where $c$ denotes an image classification category; $T_c$ is the indicator function of class $c$ ($T_c = 1$ when class $c$ is present, $T_c = 0$ when it is absent); $m^+$ denotes the upper margin parameter; $m^-$ denotes the lower margin parameter; $\lambda$ denotes a scale parameter; and $v_c$ denotes the output of the ultrasound image under category $c$.
Optionally, the image categories include irregular shape images, unclear boundary images, echo non-uniform images, calcified images, and normal images.
Optionally, the device further comprises a display unit; the display unit is used for displaying the image category to which the ultrasonic image belongs.
Optionally, the device further comprises a storage unit;
the storage unit is used for storing the image category to which the ultrasonic image belongs.
Optionally, the device further comprises a statistics unit;
the statistics unit is used for counting the total number of the ultrasonic images acquired in the preset time and the number of the ultrasonic images corresponding to each image type.
According to the technical scheme above, the conversion unit performs format conversion on the acquired ultrasound image to obtain a target ultrasound image, and the residual processing unit performs residual processing on the target ultrasound image to obtain a plurality of sub-images. Residual processing divides the ultrasound image into a plurality of sub-images, so that it can be processed in smaller image units, which effectively improves the accuracy of image analysis. Compared with conventional convolutional analysis, residual processing also reduces the amount of data to be processed, improving the efficiency of image classification. The determination unit analyzes the sub-images with the trained capsule network and determines the image category to which the ultrasound image belongs. By performing cluster analysis on the sub-images, the capsule network obtains the correlation between the ultrasound image and each image category, so the image category to which the ultrasound image belongs can be estimated accurately.
Drawings
For a clearer description of the embodiments of the present invention, the drawings required by the embodiments are briefly introduced below. Apparently, the drawings in the following description are only some embodiments of the present invention; those skilled in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an ultrasound image classification device according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a network model of fusion of a residual network and a capsule network according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. Based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without making any inventive effort are within the scope of the present invention.
In order to better understand the aspects of the present invention, the present invention will be described in further detail with reference to the accompanying drawings and detailed description.
Next, an ultrasound image classification apparatus provided by an embodiment of the present invention will be described in detail. Fig. 1 is a schematic structural diagram of an ultrasound image classification device according to an embodiment of the present invention, where the device includes a conversion unit 11, a residual processing unit 12, and a determination unit 13.
And a conversion unit 11, configured to perform format conversion on the acquired ultrasound image, so as to obtain a target ultrasound image.
After an ultrasound image is acquired, it needs to be preprocessed, that is, converted into an image format that a computer can recognize; in practical applications, the ultrasound image may be converted into a jpg image.
To standardize the ultrasound images, and considering that different ultrasound images differ in size, the format-converted ultrasound image may be fixed to a preset pixel size using bilinear interpolation.
The preset pixel size can be set according to actual requirements; for example, it can be set to 78×78 pixels.
Taking the jpg image as an example, the jpg image may be fixed to a target ultrasound image of 78×78 pixels.
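As a sketch of this size-normalization step, the following pure-Python bilinear interpolation resizes a grayscale image to a fixed size; the function name and the list-of-lists image representation are illustrative assumptions, not part of the patent.

```python
def bilinear_resize(img, out_h, out_w):
    """Resize a 2-D grayscale image (list of rows) with bilinear interpolation."""
    in_h, in_w = len(img), len(img[0])
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            # Map each output pixel back to fractional input coordinates.
            y = i * (in_h - 1) / (out_h - 1) if out_h > 1 else 0.0
            x = j * (in_w - 1) / (out_w - 1) if out_w > 1 else 0.0
            y0, x0 = int(y), int(x)
            y1, x1 = min(y0 + 1, in_h - 1), min(x0 + 1, in_w - 1)
            dy, dx = y - y0, x - x0
            # Weighted average of the four surrounding input pixels.
            out[i][j] = (img[y0][x0] * (1 - dy) * (1 - dx)
                         + img[y0][x1] * (1 - dy) * dx
                         + img[y1][x0] * dy * (1 - dx)
                         + img[y1][x1] * dy * dx)
    return out
```

In practice the device would resize to 78×78 as described, and a library routine (e.g. in OpenCV or Pillow) would normally be used instead of explicit loops; the loops above only make the interpolation itself explicit.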
And a residual processing unit 12, configured to perform residual processing on the target ultrasound image, so as to obtain a plurality of sub-images.
The residual processing unit 12 mainly serves to divide the target ultrasound image into smaller image units for processing, so that the analysis result is finer and more accurate. Compared with the processing mode of a convolutional neural network, performing residual processing on the target ultrasound image reduces the amount of computation in image analysis and thus improves the processing efficiency of image classification.
The residual processing unit 12 may include a plurality of residual modules, where each residual module consists of a 1×1 convolution kernel, a 3×3 convolution kernel, and 32 channels. Correspondingly, performing residual processing on the target ultrasound image yields 32 sub-images.
In practical applications, the residual processing unit 12 may be named according to the number of residual modules it contains: one residual module is named ResNet1, two residual modules ResNet2, and so on, with N residual modules named ResNetN.
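A minimal sketch of one residual module's computation under the structure described (a 1×1 convolution followed by a 3×3 convolution, with the identity shortcut y = x + F(x) that defines residual processing). Single-channel images and fixed kernels are used here for illustration; the patent's modules use 32 channels with learned weights.

```python
def conv2d_same(img, kernel):
    """Zero-padded 'same' 2-D convolution on a single-channel image (pure Python)."""
    kh, kw = len(kernel), len(kernel[0])
    ph, pw = kh // 2, kw // 2
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            s = 0.0
            for di in range(kh):
                for dj in range(kw):
                    y, x = i + di - ph, j + dj - pw
                    if 0 <= y < h and 0 <= x < w:
                        s += img[y][x] * kernel[di][dj]
            out[i][j] = s
    return out

def residual_module(img, kernel_1x1, kernel_3x3):
    """y = x + F(x), where F is the 1x1 -> 3x3 convolution chain."""
    fx = conv2d_same(conv2d_same(img, kernel_1x1), kernel_3x3)
    return [[a + b for a, b in zip(row_x, row_f)] for row_x, row_f in zip(img, fx)]
```

With a 1×1 identity kernel and a centered 3×3 identity kernel, F(x) = x, so the module outputs 2x; the shortcut is what lets gradients bypass the transform during training.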
And the determining unit 13 is used for analyzing the plurality of sub-images by utilizing the trained capsule network and determining the image category to which the ultrasonic image belongs.
The ultrasound image classification device provided by the embodiment of the invention is suitable for classifying ultrasound images of papillary thyroid carcinoma. According to the presentation of papillary thyroid carcinoma, symptoms can be classified as irregular shape, unclear boundary, uneven echo, calcification, and the like. Thus, in capsule network training, the set image categories may include irregular shape images, unclear boundary images, echo non-uniform images, calcification images, and normal images.
In the embodiment of the invention, the sub-images corresponding to the same ultrasonic image are input into a trained capsule network as a group of images, so that the image category of the ultrasonic image is determined.
The determination unit 13 may be divided into an extraction subunit, a clustering subunit and a categorizing subunit according to different layer functions of the capsule network.
And the extraction subunit is used for extracting the primary features of each sub-image and converting the primary features into feature vectors according to preset dimensions.
The primary features reflect the image distribution characteristics of the sub-images; sub-images of different image categories yield different primary features.
The extracted primary features are scalars, which need to be converted into vector form to facilitate subsequent processing and analysis. In the embodiment of the invention, the primary features undergo spatial dimension conversion: a new dimension is expanded on the basis of the primary feature dimensions, yielding feature vectors in one-to-one correspondence with the primary features.
For example, a primary feature output by the convolution layer with shape [8, 24] becomes [8, 8, 3] after the spatial dimension conversion of the initial capsule layer. The number of values in the matrix is unchanged, but one dimension is added, so the feature vectors can represent more parameters, improving recognition accuracy and preparing the input for the next layer.
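The dimension expansion described above amounts to a pure reshaping step: each length-24 feature row is regrouped into eight 3-element vectors, leaving the number of values unchanged (the function name is illustrative).

```python
def expand_dim(features, vec_len):
    """Regroup each flat feature row into capsule vectors of length vec_len.

    A [8, 24] input with vec_len=3 becomes [8, 8, 3]: same values, one extra dimension.
    """
    return [[row[i:i + vec_len] for i in range(0, len(row), vec_len)]
            for row in features]
```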
And the clustering subunit is used for clustering the feature vectors by utilizing a dynamic routing algorithm to obtain an output result.
The clustering of the feature vectors refers to the clustering of the feature vectors corresponding to the same ultrasonic image.
In combination with the above description, the image categories may include 5 categories, and correspondingly, the obtained output results include 5 output results. The 5 output results reflect the relevance of the ultrasound image to the 5 categories, respectively.
And the classifying sub-unit is used for determining the image category to which the ultrasonic image belongs according to the norm value of the output result.
The number of elements in each output result may be 16. The L2 norm is then calculated for each of the 5 output results, that is, the norm value (i.e., length) of each output result; the output result with the largest norm represents the most probable image category. In the embodiment of the invention, the probability that an entity appears is measured by the magnitude of the L2 norm of the output vector: the larger the L2 norm, the higher the probability of the corresponding category. The image category corresponding to the largest norm value is therefore the image category to which the ultrasound image belongs.
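The classification rule above (take the L2 norm of each output vector and pick the category whose norm is largest) can be sketched as follows; the category labels in the usage are illustrative.

```python
def classify(outputs, labels):
    """Return the label whose output vector has the largest L2 norm, plus all norms."""
    norms = [sum(x * x for x in v) ** 0.5 for v in outputs]
    return labels[norms.index(max(norms))], norms
```

Per the text, the device would pass five 16-element output vectors, one per image category.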
The capsule network includes an input layer, a convolution layer, an initial capsule layer (PrimaryCaps), a digit capsule layer (DigitCaps), and an output layer.
In the embodiment of the invention, the classification of the ultrasound image is realized through the cooperation of the residual network and the capsule network. The network model fusing the residual network and the capsule network is shown in fig. 2, where the leftmost part represents the ultrasound image, ResNet represents the residual network, PrimaryCaps represents the initial capsule layer, and DigitCaps represents the digit capsule layer. The ultrasound image is processed by the residual network to obtain a plurality of sub-images; PrimaryCaps presents the sub-images in the form of feature vectors, and DigitCaps analyzes the feature vectors using the dynamic routing algorithm to obtain the output results.
In the embodiment of the invention, the capsule network is trained through the sample image, and the weight value of the capsule network is continuously adjusted, so that the accuracy of the capsule network can meet the preset requirement, and the trained capsule network is obtained.
The trained capsule network is the basis for performing image class analysis. Next, the training process of the capsule network will be described.
The ultrasonic image classification device comprises an extraction unit, a clustering unit, a calculation unit, a judgment unit, an adjustment unit and an output unit aiming at the training process of the capsule network.
The extraction unit is used for extracting sample primary features of each sample image and converting the sample primary features into sample feature vectors according to preset dimensions; wherein each sample image has its corresponding actual image class value.
And the clustering unit is used for carrying out clustering processing on sample feature vectors corresponding to the same target sample image by utilizing a dynamic routing algorithm to obtain a sample output result.
The clustering unit is specifically configured to obtain the sample output result according to the following formula:

$$v_j = \frac{\|s_j\|^2}{1 + \|s_j\|^2} \cdot \frac{s_j}{\|s_j\|}$$

in which

$$s_j = \sum_i c_{ij}\, \hat{u}_{j|i}, \qquad \hat{u}_{j|i} = W_{ij}\, u_i, \qquad c_{ij} = \frac{\exp(b_{ij})}{\sum_{r=1}^{k} \exp(b_{ir})}$$

where $v_j$ denotes the output result of the $j$-th capsule; $s_j$ denotes the input vector of the $j$-th capsule; the squashing factor $\frac{\|s_j\|^2}{1 + \|s_j\|^2}$ keeps the output value of the capsule layer within the interval $[0, 1]$; $c_{ij}$ denotes the coupling value between adjacent capsule layers $i$ and $j$; $\hat{u}_{j|i}$ is the prediction vector for capsule $j$ computed from the $i$-th capsule layer; $b_{ij}$ denotes the probability that the $i$-th capsule is selected by the $j$-th capsule, with its initial value set to 0 when the routing layer starts executing; $k$ is the total number of capsules in the capsule layer, $j \in [1, k]$; $W_{ij}$ denotes the weight values used for learning and back propagation; and $u_i$ denotes the output of the $i$-th capsule.
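A compact pure-Python sketch of the squashing function and the dynamic routing loop defined by the formulas above, assuming the prediction vectors û(j|i) = W_ij·u_i have already been computed. Three routing iterations are a common choice, not a value fixed by the patent.

```python
import math

def squash(s):
    """Squashing nonlinearity: keeps the output vector's norm within [0, 1)."""
    norm2 = sum(x * x for x in s)
    scale = norm2 / (1.0 + norm2) / (math.sqrt(norm2) + 1e-9)
    return [scale * x for x in s]

def softmax(row):
    m = max(row)
    exps = [math.exp(b - m) for b in row]
    total = sum(exps)
    return [e / total for e in exps]

def dynamic_routing(u_hat, iterations=3):
    """u_hat[i][j] is the prediction vector from input capsule i for output capsule j."""
    n_in, n_out, dim = len(u_hat), len(u_hat[0]), len(u_hat[0][0])
    b = [[0.0] * n_out for _ in range(n_in)]  # routing logits b_ij, initialized to 0
    v = []
    for _ in range(iterations):
        c = [softmax(row) for row in b]       # coupling coefficients c_ij
        s = [[sum(c[i][j] * u_hat[i][j][k] for i in range(n_in)) for k in range(dim)]
             for j in range(n_out)]           # s_j = sum_i c_ij * u_hat(j|i)
        v = [squash(sj) for sj in s]          # v_j = squash(s_j)
        for i in range(n_in):                 # agreement update: b_ij += u_hat . v_j
            for j in range(n_out):
                b[i][j] += sum(u_hat[i][j][k] * v[j][k] for k in range(dim))
    return v
```

When the input capsules agree on one output capsule, routing concentrates the coupling coefficients on it, which is the clustering behavior the patent relies on.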
The calculating unit is used for calculating a norm value corresponding to the sample output result; and taking the norm value with the maximum value as the sample norm value of the target sample image.
And the judging unit is used for judging whether the sample norm value is matched with the actual image category value.
Whether the sample norm value matches the actual image class value may mean that the sample norm value is identical to the actual image class value, or that the deviation between them is within a preset range.
When the sample norm value does not match the actual image class value, the category evaluation accuracy of the capsule network does not meet the requirement, and the adjusting unit may be triggered.
The adjusting unit is used for adjusting the weight value of the capsule network according to the deviation value of the sample output value and the actual image class value; and returning to the step of extracting the sample primary features of each sample image after adjusting the weight values.
In the training process of the capsule network, the image type of the ultrasonic image can be preset.
To facilitate distinguishing between multiple image categories, a classification flag may be set for each image category: for example, 0_00001 denotes the first irregular-shape image, 0_00002 the second irregular-shape image, and so on. During capsule network training, the irregular-shape category comprised 444 sample images in total. 1_00001 denotes the first unclear-boundary image; the unclear-boundary category comprised 499 sample images. 2_00001 denotes the first echo-non-uniform image; that category comprised 454 sample images. 3_00001 denotes the first calcification image; the calcification category comprised 515 sample images. 4_00001 denotes the first normal image; the normal category comprised 468 sample images.
The pooling layer in a conventional convolutional network loses some information during dimension reduction, which reduces recognition accuracy. The dynamic routing principle in the capsule network can effectively improve the recognition accuracy of image categories. Its principle is to update the degree of association between the upper and lower capsule layers by judging the weight values, replacing the pooling layer's role in the learning process: if the predicted result is close to the true value, the association value of that layer is increased; if the predicted result is far from the true value, the association value is decreased.
In the embodiment of the invention, the judging unit is specifically configured to calculate the deviation value $L_c$ between the sample output value and the actual image class value according to the following formula:

$$L_c = T_c \max(0,\, m^+ - \|v_c\|)^2 + \lambda (1 - T_c) \max(0,\, \|v_c\| - m^-)^2$$

where $c$ denotes an image classification category; $T_c$ is the indicator function of class $c$ ($T_c = 1$ when class $c$ is present, $T_c = 0$ when it is absent); $m^+$ denotes the upper margin parameter; $m^-$ denotes the lower margin parameter; $\lambda$ denotes a scale parameter; and $v_c$ denotes the output of the ultrasound image under category $c$.
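The margin loss L_c above, summed over all categories, can be sketched as follows. The numeric defaults m+ = 0.9, m- = 0.1, λ = 0.5 are the usual capsule network choices and are assumptions here, since the patent does not fix them.

```python
def margin_loss(v_norms, true_class, m_plus=0.9, m_minus=0.1, lam=0.5):
    """Sum of per-class margin losses over the output-vector norms ||v_c||."""
    total = 0.0
    for c, n in enumerate(v_norms):
        t = 1.0 if c == true_class else 0.0  # indicator T_c
        total += (t * max(0.0, m_plus - n) ** 2
                  + lam * (1.0 - t) * max(0.0, n - m_minus) ** 2)
    return total
```

A confident correct prediction (true-class norm above m+, others below m-) yields zero loss; deviations in either direction produce the back-propagated error the adjusting unit uses.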
When the sample norm value matches the actual image class value, the category evaluation accuracy of the capsule network meets the requirement, and the output unit may be triggered to end training and output the trained capsule network.
According to the technical scheme above, the conversion unit performs format conversion on the acquired ultrasound image to obtain a target ultrasound image, and the residual processing unit performs residual processing on the target ultrasound image to obtain a plurality of sub-images. Residual processing divides the ultrasound image into a plurality of sub-images, so that it can be processed in smaller image units, which effectively improves the accuracy of image analysis. Compared with conventional convolutional analysis, residual processing also reduces the amount of data to be processed, improving the efficiency of image classification. The determination unit analyzes the sub-images with the trained capsule network and determines the image category to which the ultrasound image belongs. By performing cluster analysis on the sub-images, the capsule network obtains the correlation between the ultrasound image and each image category, so the image category to which the ultrasound image belongs can be estimated accurately.
In the embodiment of the invention, in order to facilitate doctors to intuitively understand the category evaluation result of the ultrasonic image, a display unit can be arranged. And displaying the image category to which the ultrasonic image belongs through the display unit.
In practical application, in order to facilitate the subsequent call of the image category evaluation result of the ultrasound image, a storage unit may be provided in the ultrasound image classification device.
And the storage unit is used for storing the image category to which the ultrasonic image belongs.
For example, the ultrasound images and their corresponding image categories may be stored in a correspondence list to facilitate subsequent review calls.
To facilitate understanding of the frequency of occurrence of different image categories, a statistics unit may be provided in the ultrasound image classification device. The statistics unit counts the total number of ultrasound images acquired within a preset time and the number of ultrasound images corresponding to each image category.
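The statistics unit's bookkeeping amounts to counting per-category totals; a minimal sketch follows (names are illustrative, not from the patent).

```python
from collections import Counter

def summarize(predicted_categories):
    """Return the total number of classified images and the count per image category."""
    counts = Counter(predicted_categories)
    return len(predicted_categories), dict(counts)
```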
The ultrasound image classification device provided by the embodiment of the present invention has been described in detail above. In this description, the embodiments are described progressively; each embodiment focuses on its differences from the others, and for identical or similar parts the embodiments may be referred to one another. Since the device disclosed in the embodiment corresponds to the method disclosed in the embodiment, its description is relatively brief, and relevant points can be found in the description of the method. It should be noted that those skilled in the art can make various modifications and adaptations of the invention without departing from its principles, and such modifications and adaptations are intended to fall within the scope of the claims.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative elements and steps are described above generally in terms of functionality in order to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software modules may be disposed in Random Access Memory (RAM), memory, read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.

Claims (9)

1. An ultrasonic image classification device is characterized by comprising a conversion unit, a residual error processing unit and a determination unit;
the conversion unit is used for carrying out format conversion on the acquired ultrasonic image to obtain a target ultrasonic image;
the residual processing unit is used for carrying out residual processing on the target ultrasonic image to obtain a plurality of sub-images;
the determining unit is used for analyzing the plurality of sub-images by utilizing the trained capsule network and determining the image category to which the ultrasonic image belongs;
the determining unit comprises an extracting subunit, a clustering subunit and a classifying subunit;
the extraction subunit is used for extracting primary features of each sub-image, and expanding a new dimension on the basis of the primary feature dimensions according to a preset dimension, so as to obtain feature vectors in one-to-one correspondence with the primary features;
the clustering subunit is used for clustering the feature vectors by utilizing a dynamic routing algorithm to obtain an output result;
and the classifying subunit is used for determining the image category to which the ultrasonic image belongs according to the norm value of the output result.
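The norm-based decision of the classifying subunit can be sketched as follows; the array shapes and the five-class example are illustrative assumptions, not part of the claim:

```python
import numpy as np

def classify_by_norm(capsule_outputs):
    """Pick the class whose capsule output vector has the largest L2 norm.

    capsule_outputs: array of shape (num_classes, capsule_dim).
    Returns the index of the predicted image category.
    """
    norms = np.linalg.norm(capsule_outputs, axis=1)  # one norm per class capsule
    return int(np.argmax(norms))

# Example: 5 class capsules (irregular shape, unclear boundary,
# non-uniform echo, calcification, normal), each an 8-dimensional vector.
outputs = np.zeros((5, 8))
outputs[3, :] = 0.9  # the "calcification" capsule has the longest vector
print(classify_by_norm(outputs))  # prints 3
```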
2. The apparatus of claim 1, wherein the conversion unit comprises a format conversion subunit and a size conversion subunit;
the format conversion subunit is used for converting the acquired ultrasonic image into a jpg image;
the size conversion subunit is configured to fix the jpg image to a target ultrasound image with a preset pixel size by using a bilinear interpolation method.
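Claim 2 only specifies that the image is fixed to a preset pixel size by bilinear interpolation; as an illustration (not the patented implementation), a minimal bilinear resize for a 2-D grayscale image can be sketched in NumPy:

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    """Resize a 2-D grayscale image to (out_h, out_w) with bilinear interpolation."""
    in_h, in_w = img.shape
    # Map each output pixel back to fractional input coordinates.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    # Interpolate horizontally on the two neighbouring rows, then vertically.
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

img = np.arange(16, dtype=float).reshape(4, 4)
resized = bilinear_resize(img, 8, 8)
print(resized.shape)  # (8, 8)
```

In practice an image library (e.g. Pillow with its bilinear resampling filter) would normally be used; the hand-rolled version above only shows what the interpolation does.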
3. The device according to claim 1, characterized in that, for the training process of the capsule network, the device further comprises an extraction unit, a clustering unit, a calculating unit, a judging unit, an adjusting unit and an output unit;
the extraction unit is used for extracting sample primary features of each sample image and converting the sample primary features into sample feature vectors according to preset dimensions; each sample image has a corresponding actual image class value;
the clustering unit is used for carrying out clustering processing on sample feature vectors corresponding to the same target sample image by utilizing a dynamic routing algorithm to obtain a sample output result;
the calculating unit is used for calculating a norm value corresponding to the sample output result; taking the norm value with the maximum value as the sample norm value of the target sample image;
the judging unit is used for judging whether the sample norm value is matched with the actual image class value or not; if not, triggering the adjusting unit; if yes, triggering the output unit;
the adjusting unit is used for adjusting the weight value of the capsule network according to the deviation value of the sample output value and the actual image class value; and returning to the step of extracting the primary characteristics of the samples of each sample image after adjusting the weight value;
and the output unit is used for ending training and outputting a trained capsule network.
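The training flow of claim 3 — predict, take the class with the largest norm, compare with the actual label, adjust weights on mismatch, and finish when everything matches — can be sketched as a loop. Here `predict` and `update` are hypothetical callbacks standing in for the capsule network's forward pass and its weight adjustment; they are not part of the claim:

```python
import numpy as np

def train_capsule_network(samples, labels, predict, update, max_epochs=10):
    """Skeleton of the training flow in claim 3 (illustrative, not the patented code)."""
    for _ in range(max_epochs):
        all_matched = True
        for x, y in zip(samples, labels):
            outputs = predict(x)  # (num_classes, dim) capsule outputs
            pred = int(np.argmax(np.linalg.norm(outputs, axis=1)))
            if pred != y:         # judging unit: norm does not match the class value
                update(x, y)      # adjusting unit: weight update from the deviation
                all_matched = False
        if all_matched:
            return True           # output unit: training finished
    return False

# Toy "network": always predicts the class stored in state; update fixes it.
state = {"cls": 0}
def predict(x):
    out = np.zeros((3, 4)); out[state["cls"], 0] = 1.0; return out
def update(x, y):
    state["cls"] = y

converged = train_capsule_network([1, 2], [2, 2], predict, update)
```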
4. The apparatus of claim 3, wherein the clustering unit is specifically configured to obtain the sample output result according to the following formula:

$$v_j = \frac{\|s_j\|^2}{1+\|s_j\|^2}\,\frac{s_j}{\|s_j\|}$$

in which

$$s_j = \sum_i c_{ij}\,\hat{u}_{j|i}, \qquad c_{ij} = \frac{\exp(b_{ij})}{\sum_{k}\exp(b_{ik})}, \qquad \hat{u}_{j|i} = W_{ij}\,u_i$$

wherein $v_j$ denotes the output result of the $j$-th capsule layer; $s_j$ denotes the input vector of the $j$-th capsule layer; the factor $\frac{\|s_j\|^2}{1+\|s_j\|^2}$ ensures that the output value of the capsule layer lies within the interval $[0,1]$; $c_{ij}$ denotes the coupling value between adjacent capsule layers $i$ and $j$; $\hat{u}_{j|i}$ is the prediction vector for the $j$-th layer computed from the $i$-th capsule layer; $b_{ij}$ denotes the probability that the $i$-th capsule layer is selected by the $j$-th capsule layer, with its initial value set to 0 when the routing layer starts to execute; $k$ is the total number of capsules in the capsule layer, with $j \in [1, k]$; $W_{ij}$ denotes the weight values used for learning and back propagation; and $u_i$ denotes the output of the $i$-th capsule layer.
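The formulas above correspond to the well-known dynamic-routing ("routing-by-agreement") procedure for capsule networks; a minimal NumPy sketch, with illustrative capsule counts and dimensions, might look like:

```python
import numpy as np

def squash(s, axis=-1, eps=1e-9):
    """v = (||s||^2 / (1 + ||s||^2)) * (s / ||s||): keeps direction, norm in [0, 1)."""
    sq = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq / (1.0 + sq)) * s / np.sqrt(sq + eps)

def dynamic_routing(u_hat, iterations=3):
    """u_hat: prediction vectors of shape (num_in, num_out, dim),
    where u_hat[i, j] = W_ij @ u_i. Returns v of shape (num_out, dim)."""
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))  # routing logits b_ij, initialised to 0
    for _ in range(iterations):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # softmax over j
        s = (c[:, :, None] * u_hat).sum(axis=0)               # s_j = sum_i c_ij u_hat
        v = squash(s)
        b += np.einsum('ijd,jd->ij', u_hat, v)                # agreement update
    return v

u_hat = np.random.default_rng(0).normal(size=(6, 3, 4))  # 6 input, 3 output capsules
v = dynamic_routing(u_hat)
print(v.shape)  # (3, 4)
```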
5. The apparatus according to claim 4, wherein the judging unit is specifically configured to calculate the deviation value $L_c$ between the sample output value and the actual image class value according to the following formula:

$$L_c = T_c \max(0,\, m^+ - \|v_c\|)^2 + \lambda (1 - T_c) \max(0,\, \|v_c\| - m^-)^2$$

wherein $c$ denotes the category of the image classification; $T_c$ is the indicator function of class $c$, with $T_c = 1$ when class $c$ is present and $T_c = 0$ when class $c$ is absent; $m^+$ denotes the upper margin parameter; $m^-$ denotes the lower margin parameter; $\lambda$ denotes the scale parameter; and $v_c$ denotes the output of the ultrasound image under category $c$.
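The margin loss of claim 5 can be computed directly from the per-class output norms. The defaults m⁺ = 0.9, m⁻ = 0.1, λ = 0.5 below are the values commonly used in the capsule-network literature, not values fixed by the claim:

```python
import numpy as np

def margin_loss(v_norms, true_class, m_plus=0.9, m_minus=0.1, lam=0.5):
    """L_c = T_c*max(0, m+ - ||v_c||)^2 + lam*(1 - T_c)*max(0, ||v_c|| - m-)^2,
    summed over classes. v_norms holds ||v_c|| for each class c."""
    T = np.zeros_like(v_norms)
    T[true_class] = 1.0  # indicator T_c
    present = T * np.maximum(0.0, m_plus - v_norms) ** 2
    absent = lam * (1.0 - T) * np.maximum(0.0, v_norms - m_minus) ** 2
    return float(np.sum(present + absent))

# A confident, correct prediction incurs zero loss at these margins.
loss = margin_loss(np.array([0.9, 0.1, 0.1]), true_class=0)
```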
6. The apparatus of any of claims 1-5, wherein the image categories include images of irregular shape, images with unclear boundaries, images with non-uniform echogenicity, calcified images, and normal images.
7. The device of any one of claims 1-5, further comprising a display unit; the display unit is used for displaying the image category to which the ultrasonic image belongs.
8. The apparatus of any one of claims 1-5, further comprising a storage unit;
the storage unit is used for storing the image category to which the ultrasonic image belongs.
9. The apparatus of any one of claims 1-5, further comprising a statistics unit;
the statistics unit is used for counting the total number of the ultrasonic images acquired in the preset time and the number of the ultrasonic images corresponding to each image type.
CN201910624643.6A 2019-07-11 2019-07-11 Ultrasonic image classification device Active CN110321968B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910624643.6A CN110321968B (en) 2019-07-11 2019-07-11 Ultrasonic image classification device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910624643.6A CN110321968B (en) 2019-07-11 2019-07-11 Ultrasonic image classification device

Publications (2)

Publication Number Publication Date
CN110321968A CN110321968A (en) 2019-10-11
CN110321968B true CN110321968B (en) 2023-05-05

Family

ID=68121960

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910624643.6A Active CN110321968B (en) 2019-07-11 2019-07-11 Ultrasonic image classification device

Country Status (1)

Country Link
CN (1) CN110321968B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111080168B (en) * 2019-12-30 2021-08-03 国网江苏省电力有限公司信息通信分公司 Power communication network equipment reliability evaluation method based on capsule network
CN111723840A (en) * 2020-05-08 2020-09-29 天津大学 Clustering and style migration method for ultrasonic images
CN112233106A (en) * 2020-10-29 2021-01-15 电子科技大学中山学院 Thyroid cancer ultrasonic image analysis method based on residual capsule network
CN113450325B (en) * 2021-06-28 2022-09-09 什维新智医疗科技(上海)有限公司 Thyroid nodule benign and malignant recognition device
CN115019305B (en) * 2022-08-08 2022-11-11 成都西交智汇大数据科技有限公司 Method, device and equipment for identifying root tip cells and readable storage medium
CN117351012B (en) * 2023-12-04 2024-03-12 天津医科大学第二医院 Fetal image recognition method and system based on deep learning

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN106874921B (en) * 2015-12-11 2020-12-04 清华大学 Image classification method and device
CN108830826B (en) * 2018-04-28 2020-10-20 四川大学 System and method for detecting pulmonary nodules
CN109300107B (en) * 2018-07-24 2021-01-22 深圳先进技术研究院 Plaque processing method, device and computing equipment for magnetic resonance blood vessel wall imaging
CN109840560B (en) * 2019-01-25 2023-07-04 西安电子科技大学 Image classification method based on clustering in capsule network

Also Published As

Publication number Publication date
CN110321968A (en) 2019-10-11

Similar Documents

Publication Publication Date Title
CN110321968B (en) Ultrasonic image classification device
CN110427970B (en) Image classification method, apparatus, computer device and storage medium
JP6547069B2 (en) Convolutional Neural Network with Subcategory Recognition Function for Object Detection
CN111524137B (en) Cell identification counting method and device based on image identification and computer equipment
WO2018120942A1 (en) System and method for automatically detecting lesions in medical image by means of multi-model fusion
CN109447998B (en) Automatic segmentation method based on PCANet deep learning model
Deng et al. Classification of breast density categories based on SE-Attention neural networks
CN110276745B (en) Pathological image detection algorithm based on generation countermeasure network
CN110021425B (en) Comparison detector, construction method thereof and cervical cancer cell detection method
CN110046627B (en) Method and device for identifying mammary gland image
US11830187B2 (en) Automatic condition diagnosis using a segmentation-guided framework
CN109363698A (en) A kind of method and device of breast image sign identification
CN109363699A (en) A kind of method and device of breast image lesion identification
CN109448854A (en) A kind of construction method of pulmonary tuberculosis detection model and application
CN109146891B (en) Hippocampus segmentation method and device applied to MRI and electronic equipment
CN108664986B (en) Based on lpNorm regularized multi-task learning image classification method and system
CN110879982A (en) Crowd counting system and method
CN111462102B (en) Intelligent analysis system and method based on novel coronavirus pneumonia X-ray chest radiography
CN112132166A (en) Intelligent analysis method, system and device for digital cytopathology image
CN111540467B (en) Schizophrenia classification identification method, operation control device and medical equipment
CN114758137A (en) Ultrasonic image segmentation method and device and computer readable storage medium
CN109461144B (en) Method and device for identifying mammary gland image
CN113782184A (en) Cerebral apoplexy auxiliary evaluation system based on facial key point and feature pre-learning
CN111899259A (en) Prostate cancer tissue microarray classification method based on convolutional neural network
CN113034528A (en) Target area and organ-at-risk delineation contour accuracy testing method based on image omics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant