CN118154991A - Few-sample appendix classification system based on ultrasound images and storage medium - Google Patents
- Publication number
- CN118154991A (application number CN202410564817.5A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention belongs to the technical field of medical image processing and specifically relates to a few-sample appendix classification system based on ultrasound images, and a storage medium. The system of the invention comprises: an input module for inputting a patient's appendix ultrasound images and examination information; a segmentation module for locating the appendix in the ultrasound image with an appendix segmentation model and cropping out cross-section and longitudinal-section appendix ultrasound images; a prediction module, integrating classification models, for obtaining prediction data from the examination information and the appendix ultrasound images corresponding to the cross section and the longitudinal section respectively, and obtaining a final classification result through a winner-takes-all strategy; and an output module for outputting the final classification result. The classification model has a memory queue. Compared with existing methods, the system achieves better classification accuracy on ultrasound images with small sample sizes.
Description
Technical Field
The invention belongs to the technical field of medical image processing and specifically relates to a few-sample appendix classification system based on ultrasound images, and a storage medium.
Background
Acute appendicitis is one of the most common surgical emergencies worldwide; the reported lifetime risk is about 7-8% per individual. The appendix is a slender, curved blind-ended tube attached to the cecum, opening on its posteromedial wall and projecting onto the lower right abdomen on the body surface; disease arises when factors such as luminal obstruction, bacterial invasion, or environmental influences occur.
Acute appendicitis belongs to the basic first-visit category in China's hierarchical diagnosis and treatment system, and is divided into four pathological types according to its clinical course and pathological changes: ① acute simple appendicitis; ② acute suppurative appendicitis (also called acute phlegmonous appendicitis); ③ gangrenous appendicitis (leading to perforation); ④ periappendiceal abscess. Types ① and ② are classified as simple acute appendicitis, and ③ and ④ as complex acute appendicitis. The treatment schemes are as follows: ① acute simple appendicitis can be managed conservatively with antibiotics or treated by surgical excision; ② acute suppurative, gangrenous, and perforated appendicitis require surgical excision, with attention to complications; ③ once a periappendiceal abscess has formed, percutaneous aspiration drainage plus antibiotic therapy is required, again with attention to complications. The severity of the disease thus varies with time of onset, and the treatment regimen varies accordingly.
Ultrasound is a first-line modality for rapid diagnosis, and research on intelligent diagnosis of acute appendicitis from ultrasound images would allow regions with insufficient medical resources to enjoy the same high-quality care. It is therefore necessary to develop intelligent diagnostic techniques that classify the severity of acute appendicitis from appendix ultrasound images. Training neural networks requires a large number of samples, but acquisition of appendix ultrasound images depends on patient volume and large datasets are difficult to collect, so few-sample learning techniques are important for ultrasound-image appendix classification.
Meta-learning and data-expansion methods are commonly used in current few-sample learning. Meta-learning trains a model to learn how to learn new tasks efficiently: the model identifies commonalities between different tasks and uses this knowledge to learn new instances quickly from a few examples. Data-expansion methods focus on augmenting the existing data, which helps the model better understand the underlying data structure and improves generalization; the expanded data are typically generated from existing data by data augmentation or data generation. However, these few-sample learning methods still lack sufficient accuracy for classifying acute appendicitis, so there remains a need in the art for new few-sample learning methods suited to this task.
Disclosure of Invention
To address the problems in the prior art, the invention provides a few-sample appendix classification system based on ultrasound images, and a storage medium.
A few-sample appendix classification system based on ultrasound images, comprising:
an input module for inputting a patient's appendix ultrasound images and examination information;
a segmentation module for locating the appendix in the ultrasound image with an appendix segmentation model and cropping out cross-section and longitudinal-section appendix ultrasound images;
a prediction module, integrating classification models, for obtaining prediction data from the examination information and the appendix ultrasound images corresponding to the cross section and the longitudinal section respectively, and obtaining a final classification result through a winner-takes-all strategy;
an output module for outputting the final classification result;
the classification model has a memory queue, which it builds as follows:
calculating the corresponding classification result from the appendix classification feature;
if the queue has not reached its capacity limit, pushing the appendix classification feature and the corresponding classification result into the queue;
if the queue has reached its capacity limit, discarding the earliest appendix classification feature and corresponding classification result under a first-in-first-out policy, then pushing the newly obtained feature and classification result into the queue;
the classification model obtains the prediction data as follows:
if the queue is empty, using the current classification result as the prediction data for the corresponding appendix ultrasound image;
if the queue is not empty, computing the similarity between the current appendix classification feature and the features in the memory queue, and using the classification result of the most similar feature in the queue as the prediction data for the corresponding appendix ultrasound image.
Preferably, the classification model obtains the prediction data according to the following steps:
step 1, constructing an appendix information feature from the patient's examination information, and constructing an appendix ultrasound image feature from the patient's cross-section or longitudinal-section appendix ultrasound image;
step 2, adding the appendix information feature and the appendix ultrasound image feature to obtain the appendix classification feature;
step 3, outputting the classification probability of the appendix classification feature through a fully connected layer and a Softmax layer to obtain a classification result.
Preferably, in step 1, the appendix information feature is obtained as follows:
step 1.1, inputting the patient's examination information and converting all data into standard one-hot codes;
step 1.2, converting each one-hot vector into a uniform 128-dimensional vector through a standard fully connected layer;
step 1.3, summing all 128-dimensional vectors and activating the result with a standard ReLU function;
step 1.4, computing a 512-dimensional feature from the activated feature through a two-layer perceptron network; this is the appendix information feature.
Preferably, in step 1, the appendix ultrasound image feature is obtained as follows:
step 1a, outputting image features of the cross-section or longitudinal-section appendix ultrasound image through a ResNet network with the classification layer removed;
step 1b, computing a 512-dimensional feature from the image features through a two-layer perceptron network; this is the appendix ultrasound image feature.
Preferably, in the training of the classification model, a cross-entropy loss function and a KNN loss function jointly construct the classification loss function; the cross-entropy loss function computes the loss of the classification result derived from the appendix classification feature, and the KNN loss function computes a loss on the similarity of appendix classification features in the memory queue.
Preferably, the patient's examination information includes: age, sex, duration of abdominal pain, white blood cell count, neutrophil percentage, appendix diameter, wall layering, appendiceal lumen contents, periappendiceal mesenteric swelling, periappendiceal abscess formation, ileocecal bowel swelling, free peritoneal effusion, enlarged lymph nodes, and bowel dilation.
Preferably, the segmentation module obtains the cross-section and longitudinal-section appendix ultrasound images according to the following steps:
step a, inputting the appendix ultrasound image into a U-Net network to obtain a predicted appendix location mask;
step b, given the appendix location mask predicted in step a, counting the spatially disconnected mask regions and their areas with a connected-component algorithm, and keeping only the mask region with the largest area as the appendix location mask;
step c, given the appendix location mask obtained in step b, computing the bounding rectangle of the mask with a labeling-based morphological algorithm, and expanding the long and short sides of the bounding rectangle by 64 pixels;
step d, cropping the appendix ultrasound image with the expanded bounding rectangle and resizing the cropped image to 320 × 320 with bilinear interpolation.
Preferably, the classification result classifies the patient's acute appendicitis as: acute simple appendicitis, acute suppurative appendicitis, gangrenous appendicitis, or periappendiceal abscess.
The invention also provides a computer-readable storage medium storing a computer program for implementing the above ultrasound-image-based few-sample appendix classification system.
The invention provides a new classification method integrated into the few-sample appendix classification system: a memory queue is constructed in the classification method, the similarity of feature vectors to past classification experience is used to judge the current classification result, and separate appendix classification models are built for the cross section and the longitudinal section; the prediction data of the two models are combined, and the classification result is obtained through a winner-takes-all strategy. Targeting the tendency of neural networks to overfit when trained on few samples, the method provides a training and inference algorithm that preserves past experience, effectively alleviating the difficulty of few-sample learning and improving the accuracy of few-sample appendix classification from ultrasound images.
It should be apparent that, in light of the foregoing, various modifications, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.
The above aspects of the invention are described in further detail below with reference to specific embodiments in the form of examples. The scope of the above subject matter should not be understood as limited to the following examples; all techniques implemented based on the above description fall within the scope of the invention.
Drawings
FIG. 1 is a schematic flow chart of an embodiment of the present invention;
Fig. 2 is an exemplary diagram of a process for segmenting and cropping an appendiceal ultrasound image by a few-sample appendiceal classification system according to an embodiment of the present invention, wherein (a) is an exemplary diagram of an original image, (b) is an exemplary diagram of a segmentation result, (c) is an exemplary diagram of a cropping frame, and (d) is an exemplary diagram of a post-cropping result.
Detailed Description
It should be noted that, in the embodiments, algorithms for steps such as data acquisition, transmission, storage, and processing that are not specifically described, as well as hardware structures and circuit connections that are not specifically described, may be implemented through the prior art.
Embodiment: few-sample appendix classification system based on ultrasound images
The system of this embodiment comprises:
an input module for inputting a patient's appendix ultrasound images and examination information;
a segmentation module for locating the appendix in the ultrasound image with an appendix segmentation model and cropping out cross-section and longitudinal-section appendix ultrasound images;
a prediction module, integrating classification models, for obtaining prediction data from the examination information and the appendix ultrasound images corresponding to the cross section and the longitudinal section respectively, and obtaining a final classification result through a winner-takes-all strategy;
an output module for outputting the final classification result. The classification result classifies the patient's acute appendicitis as: acute simple appendicitis, acute suppurative appendicitis, gangrenous appendicitis, or periappendiceal abscess.
The classification model has a memory queue, which it builds as follows:
calculating the corresponding classification result from the appendix classification feature;
if the queue has not reached its capacity limit, pushing the appendix classification feature and the corresponding classification result into the queue;
if the queue has reached its capacity limit, discarding the earliest appendix classification feature and corresponding classification result under a first-in-first-out policy, then pushing the newly obtained feature and classification result into the queue;
the classification model obtains the prediction data as follows:
if the queue is empty, using the current classification result as the prediction data for the corresponding appendix ultrasound image;
if the queue is not empty, computing the similarity between the current appendix classification feature and the features in the memory queue, and using the classification result of the most similar feature in the queue as the prediction data for the corresponding appendix ultrasound image.
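The memory-queue behaviour described above can be sketched in pure Python. This is an illustrative reconstruction, not the patented implementation: the queue capacity, cosine similarity as the similarity measure, and the `MemoryQueue` class with its method names are all assumptions.

```python
import math
from collections import deque

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class MemoryQueue:
    """FIFO store of (feature, classification result) pairs; capacity is assumed."""
    def __init__(self, capacity=128):
        # deque with maxlen discards the oldest entry automatically (first-in-first-out)
        self.buf = deque(maxlen=capacity)

    def push(self, feature, label):
        self.buf.append((feature, label))

    def predict(self, feature, current_label):
        # Empty queue: fall back to the model's current classification result.
        if not self.buf:
            return current_label
        # Otherwise return the result stored with the most similar feature.
        best = max(self.buf, key=lambda entry: cosine(feature, entry[0]))
        return best[1]
```

A short usage example: with two entries stored, a query close to the "simple" feature retrieves the "simple" result, and pushing beyond capacity evicts the oldest pair.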
The process of few-sample appendix classification using the system is shown in FIG. 1 and comprises the following steps:
1. Input the patient's appendix ultrasound images and examination information
The patient's examination information consists of scalars, comprising: age, sex, duration of abdominal pain, white blood cell count, neutrophil percentage, appendix diameter, wall layering, appendiceal lumen contents, periappendiceal mesenteric swelling, periappendiceal abscess formation, ileocecal bowel swelling, free peritoneal effusion, enlarged lymph nodes, and bowel dilation.
2. Locate the appendix in the ultrasound image with the appendix segmentation model and crop out cross-section and longitudinal-section appendix ultrasound images
The process is shown in FIG. 2 and proceeds as follows:
step a, inputting the appendix ultrasound image (FIG. 2a) into a U-Net network (the appendix segmentation model) to obtain a predicted appendix location mask (FIG. 2b);
step b, given the appendix location mask predicted in step a, counting the spatially disconnected mask regions and their areas with a connected-component algorithm, and keeping only the mask region with the largest area as the appendix location mask;
step c, given the appendix location mask obtained in step b, computing the bounding rectangle of the mask with a labeling-based morphological algorithm (FIG. 2c), and expanding its long and short sides by 64 pixels (the large rectangle in FIG. 2c);
step d, cropping the appendix ultrasound image with the expanded bounding rectangle and resizing the cropped image to 320 × 320 with bilinear interpolation (FIG. 2d).
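Steps b and c above (keeping the largest connected mask region and expanding its bounding rectangle) can be sketched in pure Python on a binary mask. The 4-connected BFS is an assumption about the connected-component algorithm, `largest_component` and `expanded_bbox` are hypothetical names, and the bilinear 320 × 320 resize of step d is omitted.

```python
from collections import deque

def largest_component(mask):
    """Keep only the largest 4-connected foreground region of a binary mask (step b)."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = []
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not seen[i][j]:
                comp, todo = [], deque([(i, j)])
                seen[i][j] = True
                while todo:  # breadth-first flood fill of one region
                    y, x = todo.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            todo.append((ny, nx))
                if len(comp) > len(best):
                    best = comp
    return best

def expanded_bbox(component, h, w, margin=64):
    """Bounding rectangle of the kept region, expanded by `margin` pixels on each
    side and clamped to the image (step c); the crop of step d would use this box."""
    ys = [p[0] for p in component]
    xs = [p[1] for p in component]
    return (max(min(ys) - margin, 0), max(min(xs) - margin, 0),
            min(max(ys) + margin, h - 1), min(max(xs) + margin, w - 1))
```

On a real mask the margin would stay at the 64 pixels stated above; the clamping keeps the expanded rectangle inside the image.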
The training parameters of the appendix segmentation model are as follows: the model was trained with the Adam optimizer and a classification loss function, with an image batch size of 8, a learning rate of 0.0001, and 200 epochs over the dataset, with all images resampled to 512 × 512.
3. Obtain prediction data from the examination information and the appendix ultrasound images corresponding to the cross section and the longitudinal section respectively, and obtain the final classification result through a winner-takes-all strategy
There are two classification models, one each for obtaining prediction data from the cross-section and the longitudinal-section appendix ultrasound image. Each classification model obtains its prediction data according to the following steps:
step 1, constructing an appendix information feature from the patient's examination information, and constructing an appendix ultrasound image feature from the patient's cross-section or longitudinal-section appendix ultrasound image;
step 2, adding the appendix information feature and the appendix ultrasound image feature to obtain the appendix classification feature;
step 3, outputting the classification probability of the appendix classification feature through a fully connected layer and a Softmax layer to obtain a classification result.
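The winner-takes-all combination of the two section models can be sketched as follows. Resolving the two Softmax outputs by taking the single most confident class across both models is an assumption about how the strategy is implemented, and `winner_takes_all` is a hypothetical name.

```python
def winner_takes_all(prob_cross, prob_long, classes):
    """Pick the single most confident class across both section models.
    prob_cross / prob_long: per-class probabilities (e.g. Softmax outputs)
    from the cross-section and longitudinal-section classification models."""
    best_c = max(range(len(classes)), key=lambda k: prob_cross[k])
    best_l = max(range(len(classes)), key=lambda k: prob_long[k])
    # The model that is more confident about its top class wins outright.
    if prob_cross[best_c] >= prob_long[best_l]:
        return classes[best_c]
    return classes[best_l]
```

For example, if the cross-section model outputs 0.7 for "suppurative" while the longitudinal model's top confidence is only 0.5, the final result is "suppurative".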
In step 1, the appendix information feature is obtained as follows:
step 1.1, inputting the patient's examination information and converting all data into standard one-hot codes;
step 1.2, converting each one-hot vector into a uniform 128-dimensional vector through a standard fully connected layer;
step 1.3, summing all 128-dimensional vectors and activating the result with a standard ReLU function;
step 1.4, computing a 512-dimensional feature from the activated feature through a two-layer perceptron network; this is the appendix information feature.
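Steps 1.1 to 1.4 can be sketched in miniature pure Python. The random weights stand in for learned layers and the dimensions are shrunk from 128/512 to 4/8 for readability, so this is an illustrative sketch rather than the patented network; `linear` and `info_features` are hypothetical names.

```python
import random

random.seed(0)

def linear(vec, out_dim):
    """Toy fully connected layer with fresh random weights (stand-in for a learned layer)."""
    w = [[random.uniform(-0.1, 0.1) for _ in vec] for _ in range(out_dim)]
    return [sum(wi * v for wi, v in zip(row, vec)) for row in w]

def relu(vec):
    return [max(0.0, v) for v in vec]

def info_features(onehots, hidden=4, out=8):
    # Step 1.2: map each one-hot code to a common hidden size.
    mapped = [linear(v, hidden) for v in onehots]
    # Step 1.3: element-wise sum of all mapped vectors, then ReLU activation.
    summed = relu([sum(col) for col in zip(*mapped)])
    # Step 1.4: two-layer perceptron producing the final information feature.
    return linear(relu(linear(summed, hidden)), out)

# Step 1.1 would one-hot encode each examination item, e.g. sex and a binned
# abdominal-pain duration (hypothetical encodings):
feat = info_features([[1, 0], [0, 0, 1]])
```

In the real model each one-hot code would pass through its own trained fully connected layer and the output would be 512-dimensional.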
In step 1, the appendix ultrasound image feature is obtained as follows:
step 1a, outputting image features of the cross-section or longitudinal-section appendix ultrasound image through a ResNet network with the classification layer removed;
step 1b, computing a 512-dimensional feature from the image features through a two-layer perceptron network; this is the appendix ultrasound image feature.
The training parameters of the classification model are as follows: the cross-entropy loss function and the KNN loss jointly construct the classification loss of the appendix classification model; the whole network was trained with the Adam optimizer, with a batch size of 8 for images and information, a learning rate of 0.0001, and 100 epochs over the dataset, with all training images resampled to 320 × 320.
A standard cross-entropy loss function serves as the appendix classification loss. The KNN loss function trains the network so that the similarity between appendix classification features of different categories, and between a feature and queue entries of other categories, is as small as possible:
L_KNN = (1/n) Σ_i Σ_j 1[y_i ≠ y_j] · s_ij
where n is the training batch size, Q denotes the memory queue, and 1[·] is the indicator function; s_ij is the computed similarity, given by s_ij = f(x_i) · f(x_j) / (‖f(x_i)‖ ‖f(x_j)‖), where the function f(·) computes the appendix classification features, x_i and x_j are different training samples in the same training batch, and y_i and y_j are the labels of the corresponding samples.
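A minimal sketch of a KNN-style loss in this spirit, assuming cosine similarity averaged over different-category pairs within a batch (the exact form used by the patent may differ, and `knn_loss` is a hypothetical name):

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def knn_loss(features, labels):
    """Sum of cosine similarities over different-category pairs in the batch,
    scaled by batch size; minimizing it pushes features of different
    categories apart in the feature space."""
    n = len(features)
    total = sum(cosine(features[i], features[j])
                for i in range(n) for j in range(n)
                if i != j and labels[i] != labels[j])
    return total / n
```

When different-category features are already orthogonal the loss is zero; identical features with different labels yield the maximum penalty.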
4. Output the final classification result.
To further illustrate the beneficial effects of the application, the classification performance of different models on acute appendicitis is compared below. The dataset used in this experimental example consists of 647 cases of acute appendicitis confirmed by pathology: 77 cases of acute simple appendicitis, 324 cases of acute suppurative appendicitis, 181 cases of acute gangrenous appendicitis, and 65 cases of acute appendicitis with periappendiceal abscess. On the same dataset, the following three methods were used to classify acute appendicitis:
(1) The system and method of the present embodiment;
- (2) The classical empirical risk minimization algorithm (ERM; reference: Vapnik V., The Nature of Statistical Learning Theory, Springer Science & Business Media, 1999);
- (3) The data expansion method Mixup (reference: Yan S., Song H., Li N., Zou L., and Ren L., Improve unsupervised domain adaptation with mixup training. arXiv preprint arXiv:2001.00677, 2020).
The ERM algorithm trains the model with a standard cross-entropy classification loss, does not use the memory queue or KNN loss proposed here, and uses only the current classification result as the model's inference output. The Mixup algorithm is a popular data-expansion method that expands samples through data synthesis, alleviating the small-sample problem to some extent. The method of this embodiment uses historical information at inference time and, by constraining the similarity between the current sample's classification result and historical classification results during training, makes the feature-space distribution more compact, thereby improving classification accuracy under small samples.
The results are shown in Table 1:
TABLE 1
It can be seen that the method of this example is superior to both comparison methods.
The above examples and experimental example show that the invention provides a new system and method for classifying acute appendicitis which, through the design of the classification model structure, achieves better classification accuracy than existing methods on ultrasound images with small sample sizes.
Claims (9)
1. A few-sample appendix classification system based on ultrasound images, comprising:
an input module for inputting a patient's appendix ultrasound images and examination information;
a segmentation module for locating the appendix in the ultrasound image with an appendix segmentation model and cropping out cross-section and longitudinal-section appendix ultrasound images;
a prediction module, integrating classification models, for obtaining prediction data from the examination information and the appendix ultrasound images corresponding to the cross section and the longitudinal section respectively, and obtaining a final classification result through a winner-takes-all strategy;
an output module for outputting the final classification result;
wherein the classification model has a memory queue, which it builds as follows:
calculating the corresponding classification result from the appendix classification feature;
if the queue has not reached its capacity limit, pushing the appendix classification feature and the corresponding classification result into the queue;
if the queue has reached its capacity limit, discarding the earliest appendix classification feature and corresponding classification result under a first-in-first-out policy, then pushing the newly obtained feature and classification result into the queue;
and the classification model obtains the prediction data as follows:
if the queue is empty, using the current classification result as the prediction data for the corresponding appendix ultrasound image;
if the queue is not empty, computing the similarity between the current appendix classification feature and the features in the memory queue, and using the classification result of the most similar feature in the queue as the prediction data for the corresponding appendix ultrasound image.
2. The few-sample appendix classification system based on ultrasound images of claim 1, wherein the classification model obtains the prediction data according to the following steps:
step 1, constructing an appendix information feature from the patient's examination information, and constructing an appendix ultrasound image feature from the patient's cross-section or longitudinal-section appendix ultrasound image;
step 2, adding the appendix information feature and the appendix ultrasound image feature to obtain the appendix classification feature;
step 3, outputting the classification probability of the appendix classification feature through a fully connected layer and a Softmax layer to obtain a classification result.
3. The ultrasound image-based few-sample appendix classification system of claim 2, wherein in step 1 the appendix information feature is obtained as follows:
Step 1.1: input the patient's detection information and convert every data item into a standard one-hot code;
Step 1.2: convert each one-hot vector into a uniform 128-dimensional vector through its own standard fully connected layer;
Step 1.3: sum all 128-dimensional vectors and activate the result with a standard ReLU function;
Step 1.4: pass the activated feature through a two-layer perceptron network to compute a 512-dimensional feature, which is the appendix information feature.
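Steps 1.1 through 1.4 can be sketched as follows. The three detection items and their bin sizes are illustrative (the real system uses the fourteen items of claim 6), and the 256-dimensional hidden width of the two-layer perceptron is an assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

def one_hot(index, size):
    v = np.zeros(size)
    v[index] = 1.0
    return v

# Step 1.1: each detection item binned and one-hot coded (three shown).
items = [one_hot(1, 2),   # sex
         one_hot(3, 5),   # binned age
         one_hot(0, 3)]   # binned white blood cell count

# Step 1.2: a separate fully connected layer per item maps it to 128 dims.
projected = [rng.standard_normal((128, item.size)) @ item for item in items]

# Step 1.3: sum all 128-d vectors, then apply ReLU.
h = np.maximum(0.0, np.sum(projected, axis=0))

# Step 1.4: a two-layer perceptron yields the 512-d information feature.
W1 = rng.standard_normal((256, 128)) * 0.05
W2 = rng.standard_normal((512, 256)) * 0.05
info_feature = W2 @ np.maximum(0.0, W1 @ h)
```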
4. The ultrasound image-based few-sample appendix classification system of claim 2, wherein in step 1 the appendix ultrasound image feature is obtained as follows:
Step 1a: pass the cross-sectional or longitudinal-sectional appendix ultrasound image through a ResNet network with its classification layer removed to output image features;
Step 1b: pass the image features through a two-layer perceptron network to compute a 512-dimensional feature, which is the appendix ultrasound image feature.
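A sketch of steps 1a and 1b. The `backbone_features` function below is a hypothetical stand-in for the ResNet backbone; in practice one would use, for example, a torchvision ResNet truncated before its final classification layer. The 256-dimensional hidden width is likewise an assumption:

```python
import numpy as np

rng = np.random.default_rng(2)

def backbone_features(image):
    """Stand-in for a ResNet with its classification layer removed:
    global average pooling of feature maps followed by a linear map,
    yielding a 512-d vector. Purely illustrative."""
    pooled = image.mean(axis=(0, 1))   # mimic global average pooling
    W = np.random.default_rng(3).standard_normal((512, pooled.size)) * 0.1
    return W @ pooled

# A cropped 320x320 frame as produced by the segmentation module (claim 7).
image = rng.random((320, 320, 3))
feat = backbone_features(image)        # step 1a

# Step 1b: two-layer perceptron to the final 512-d image feature.
W1 = rng.standard_normal((256, 512)) * 0.05
W2 = rng.standard_normal((512, 256)) * 0.05
image_feature = W2 @ np.maximum(0.0, W1 @ feat)
```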
5. The ultrasound image-based few-sample appendix classification system of claim 1, wherein: in training the classification model, a cross-entropy loss function and a KNN loss function jointly form the classification loss function; the cross-entropy loss function calculates the loss of the classification result obtained from the appendix classification feature, and the KNN loss function calculates a loss based on the similarity between the appendix classification feature and the features stored in the memory queue.
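The claim does not give the exact form of the KNN loss, so the sketch below is one plausible reading: pull the current feature toward its same-class nearest neighbours (by cosine similarity) among the features in the memory queue, and add this to the usual cross-entropy term. All data here is fabricated for illustration.

```python
import numpy as np

def cross_entropy(probs, label):
    # Standard cross-entropy on the predicted class probabilities.
    return float(-np.log(probs[label] + 1e-12))

def knn_loss(feature, queue_feats, queue_labels, label, k=3):
    """Illustrative KNN loss: high similarity of the current feature to
    its same-class k nearest neighbours in the queue gives low loss.
    The exact formulation in the patent is not specified here."""
    f = np.asarray(feature, dtype=float)
    sims = np.array([q @ f / (np.linalg.norm(q) * np.linalg.norm(f) + 1e-12)
                     for q in queue_feats])
    nearest = np.argsort(-sims)[:k]          # indices of k most similar
    same = np.array([queue_labels[i] == label for i in nearest], dtype=float)
    return float(1.0 - (sims[nearest] * same).sum() / k)

# Toy joint loss on fabricated data.
rng = np.random.default_rng(4)
feature = rng.standard_normal(8)
queue_feats = [rng.standard_normal(8) for _ in range(5)]
queue_labels = [0, 1, 0, 2, 1]
probs = np.array([0.7, 0.1, 0.1, 0.1])
total_loss = cross_entropy(probs, 0) + knn_loss(feature, queue_feats, queue_labels, 0)
```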
6. The ultrasound image-based few-sample appendix classification system of claim 1, wherein the patient's detection information includes: age, sex, duration of abdominal pain, white blood cell count, neutrophil percentage, appendix diameter, wall layering, appendiceal lumen content, periappendiceal mesentery swelling, periappendiceal abscess formation, ileocecal intestinal swelling, free abdominal effusion, lymph node enlargement, and intestinal dilatation.
7. The ultrasound image-based few-sample appendix classification system of claim 1, wherein the segmentation module obtains the cross-sectional and longitudinal-sectional appendix ultrasound images as follows:
Step a: input the appendix ultrasound image into a U-Net network to obtain a predicted appendix position mask;
Step b: given the appendix position mask predicted in step a, use a connected-component algorithm to compute the spatially disconnected masks and their corresponding areas, and keep only the mask with the largest area as the appendix position mask;
Step c: given the appendix position mask obtained in step b, compute the circumscribed rectangle of the mask by a label-based morphological algorithm, and expand the long and short sides of the circumscribed rectangle by 64 pixels;
Step d: crop the appendix ultrasound image with the expanded circumscribed rectangle, and resize the cropped appendix ultrasound image to 320 × 320 by bilinear interpolation.
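The post-processing in steps b and c can be sketched with plain NumPy and the standard library (a real pipeline might use `scipy.ndimage.label` for the connected components). Clipping the expanded rectangle to the image border is an assumption the claim leaves open:

```python
import numpy as np
from collections import deque

def largest_component(mask):
    """Step b: keep only the largest 4-connected region of a binary
    mask, via breadth-first-search labelling."""
    mask = mask.astype(bool)
    h, w = mask.shape
    seen = np.zeros_like(mask)
    best_pixels = []
    for sy, sx in zip(*np.nonzero(mask)):
        if seen[sy, sx]:
            continue
        comp, q = [], deque([(sy, sx)])
        seen[sy, sx] = True
        while q:
            y, x = q.popleft()
            comp.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                    seen[ny, nx] = True
                    q.append((ny, nx))
        if len(comp) > len(best_pixels):
            best_pixels = comp
    out = np.zeros_like(mask)
    for y, x in best_pixels:
        out[y, x] = True
    return out

def expanded_box(mask, pad=64):
    """Step c: bounding rectangle of the mask expanded by `pad` pixels
    per side, clipped to the image border (clipping is an assumption)."""
    ys, xs = np.nonzero(mask)
    h, w = mask.shape
    return (max(int(ys.min()) - pad, 0), min(int(ys.max()) + pad + 1, h),
            max(int(xs.min()) - pad, 0), min(int(xs.max()) + pad + 1, w))

# Step d would then crop image[y0:y1, x0:x1] and resize the crop to
# 320x320 with bilinear interpolation (e.g. cv2.resize or PIL).
```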
8. The ultrasound image-based few-sample appendix classification system of claim 1, wherein the classification result classifies the patient's acute appendicitis as one of: acute simple appendicitis, acute suppurative appendicitis, gangrenous appendicitis, or periappendiceal abscess.
9. A computer-readable storage medium having stored thereon a computer program that implements the ultrasound image-based few-sample appendix classification system of any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410564817.5A CN118154991A (en) | 2024-05-09 | 2024-05-09 | Fewer-sample appendix classification system based on ultrasonic image and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118154991A true CN118154991A (en) | 2024-06-07 |
Family
ID=91293298
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410564817.5A Pending CN118154991A (en) | 2024-05-09 | 2024-05-09 | Fewer-sample appendix classification system based on ultrasonic image and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118154991A (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220370046A1 (en) * | 2021-05-20 | 2022-11-24 | Siemens Medical Solutions Usa, Inc. | Robust view classification and measurement in ultrasound imaging |
CN116704288A (en) * | 2023-06-14 | 2023-09-05 | 浙江工业大学 | Training method and recognition method for appendectomy surgical instrument recognition model |
CN117095169A (en) * | 2023-08-31 | 2023-11-21 | 飞依诺科技股份有限公司 | Ultrasonic image disease identification method and system |
Non-Patent Citations (2)
Title |
---|
J. H. GAGAN et al.: "Automated Segmentation of Common Carotid Artery in Ultrasound Images", IEEE Access, vol. 10, 30 May 2022 (2022-05-30), pages 58419 - 58430 *
XIE Ming: "Ultrasound image pathological features and clinical diagnostic value of acute appendicitis", Hebei Medicine, vol. 22, no. 12, 31 December 2016 (2016-12-31), pages 1992 - 1995 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination |