WO2020107156A1 - Automated classification method and device for breast medical ultrasound images

Automated classification method and device for breast medical ultrasound images

Info

Publication number
WO2020107156A1
WO2020107156A1 (PCT/CN2018/117460)
Authority
WO
WIPO (PCT)
Prior art keywords
medical ultrasound
breast
target
ultrasound image
breast medical
Prior art date
Application number
PCT/CN2018/117460
Other languages
English (en)
Chinese (zh)
Inventor
王珊珊
肖韬辉
郑海荣
徐井旭
刘新
梁栋
李程
Original Assignee
深圳先进技术研究院
Priority date
Filing date
Publication date
Application filed by 深圳先进技术研究院
Priority to PCT/CN2018/117460
Publication of WO2020107156A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition

Definitions

  • the present application relates to the technical field of image processing, and in particular to an automatic classification method and device for breast medical ultrasound images.
  • At present, one way to classify breast medical images is to use a restricted Boltzmann machine (RBM) to design a two-layer feature extractor that extracts and learns task-related features, and then use a support vector machine (SVM) to output the feature type of the breast image data.
  • Because the two-layer feature extractor is not a deep learning model in the strict sense, the existing classification methods for breast medical images are not intelligent enough, and their shallow network structure also limits both the quantity and the quality of the extracted features. Therefore, the existing classification methods for breast medical images suffer from low classification accuracy and poor performance.
  • In view of this, the present application provides an automatic classification method and device for breast medical ultrasound images, which can effectively improve the accuracy and efficiency of feature recognition of breast medical ultrasound images, and can effectively and reliably improve the efficiency and intelligence of the image classification process and the accuracy of the classification results of breast medical ultrasound images.
  • this application provides an automated classification method for breast medical ultrasound images, including:
  • HSV color space conversion is performed on the pre-processed target breast medical ultrasound image to obtain the hue H channel feature corresponding to the target breast medical ultrasound image;
  • the preprocessing of the acquired target breast medical ultrasound image includes:
  • the performing HSV color space conversion on the pre-processed target breast medical ultrasound image to obtain the hue H-channel feature corresponding to the target breast medical ultrasound image includes:
  • HSV color space conversion is performed on the pre-processed target breast medical ultrasound image to obtain the HSV color feature corresponding to the target breast medical ultrasound image;
  • the automatic classification method of the breast medical ultrasound image further includes:
  • The deep convolutional neural network is composed of multiple convolutional layers and multiple fully connected layers in sequence. After the breast medical ultrasound image is input through the first convolutional layer, the deep convolutional neural network maps it layer by layer to obtain different layered representations of the breast medical ultrasound image.
  • the acquiring the training sample set of the deep convolutional neural network based on multiple historical breast medical ultrasound images includes:
  • The hue H-channel values corresponding to each historical breast medical ultrasound image constitute the training sample set of the deep convolutional neural network.
  • the automatic classification method of breast medical ultrasound images further includes:
  • A test sample is obtained based on at least one test breast medical ultrasound image;
  • the current deep convolutional neural network is used as the target deep convolutional neural network for classifying breast medical ultrasound images.
  • the automatic classification method of breast medical ultrasound images further includes:
  • If the current deep convolutional neural network does not meet the preset requirements, the current deep convolutional neural network is optimized and/or an updated training sample set is applied to perform model training on the deep convolutional neural network again.
  • this application provides an automatic classification device for breast medical ultrasound images, including:
  • the data preprocessing module is used to preprocess the acquired target breast medical ultrasound image
  • the channel feature extraction module is used to perform HSV color space conversion on the pre-processed target breast medical ultrasound image to obtain the hue H channel feature corresponding to the target breast medical ultrasound image;
  • The model prediction module is used to take the hue H-channel feature corresponding to the pre-processed target breast medical ultrasound image as a prediction sample, input it into a preset target deep convolutional neural network, and use the output of the target deep convolutional neural network as the classification result of the target breast medical ultrasound image.
  • the data preprocessing module includes:
  • Target data acquisition unit used to acquire target breast medical ultrasound images
  • a target data cleaning unit used for performing data cleaning on the target breast medical ultrasound image
  • the target data labeling unit is used to mark the target area in the target breast medical ultrasound image after data cleaning to obtain a sample label of the target breast medical ultrasound image.
  • the channel feature extraction module includes:
  • a target HSV conversion unit configured to perform HSV color space conversion on the pre-processed target breast medical ultrasound image to obtain the HSV color feature corresponding to the target breast medical ultrasound image;
  • the target H channel extraction unit is used to extract the hue H channel value corresponding to the target breast medical ultrasound image from the HSV color feature.
  • the automatic classification device for breast medical ultrasound images further includes:
  • the model building module is used to construct a deep convolutional neural network, and obtain the training sample set of the deep convolutional neural network based on multiple historical breast medical ultrasound images;
  • the model training module is configured to apply the training sample set to perform model training on the deep convolutional neural network.
  • The deep convolutional neural network is composed of multiple convolutional layers and multiple fully connected layers in sequence. After the breast medical ultrasound image is input through the first convolutional layer, the deep convolutional neural network maps it layer by layer to obtain different layered representations of the breast medical ultrasound image.
  • model building module includes:
  • Historical data acquisition unit for acquiring multiple historical breast medical ultrasound images
  • Historical data cleaning unit used for data cleaning of all historical breast medical ultrasound images
  • the historical data labeling unit is used to mark the target areas in each historical breast medical ultrasound image after data cleaning, to obtain sample labels of each historical breast medical ultrasound image;
  • the historical HSV conversion unit is used to perform HSV color space conversion on each historical breast medical ultrasound image to obtain the HSV color features corresponding to each historical breast medical ultrasound image;
  • the historical H channel extraction unit is used to extract the hue H channel value corresponding to each historical breast medical ultrasound image from the HSV color features of each historical breast medical ultrasound image;
  • the training sample set generating unit is configured to compose the hue H channel value corresponding to each historical breast medical ultrasound image into the training sample set of the deep convolutional neural network.
  • the automatic classification device for breast medical ultrasound images further includes:
  • a test sample acquisition unit configured to obtain a test sample based on at least one breast medical ultrasound image for testing
  • a model testing unit used to apply the test samples to perform model testing on the deep convolutional neural network, and use the output of the deep convolutional neural network as a test result;
  • A test result judging unit, used to judge, based on the test result and the known classification result of at least one test breast medical ultrasound image, whether the current deep convolutional neural network meets the preset requirements, and if so, to use the current deep convolutional neural network as the target deep convolutional neural network for classifying breast medical ultrasound images.
  • The test result judging unit is further used to, if the current deep convolutional neural network does not meet the preset requirements, optimize the current deep convolutional neural network and/or apply an updated training sample set to perform model training on the deep convolutional neural network again.
  • the present application provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor.
  • When the processor executes the program, the following is implemented:
  • HSV color space conversion is performed on the pre-processed target breast medical ultrasound image to obtain the hue H channel feature corresponding to the target breast medical ultrasound image;
  • The present application provides a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the following:
  • HSV color space conversion is performed on the pre-processed target breast medical ultrasound image to obtain the hue H channel feature corresponding to the target breast medical ultrasound image;
  • this application provides an automatic classification method and device for breast medical ultrasound images.
  • By preprocessing the acquired target breast medical ultrasound image, an accurate and reliable data foundation can be provided for subsequent image recognition.
  • the target breast medical ultrasound image is converted into HSV color space to obtain the hue H-channel feature corresponding to the target breast medical ultrasound image, which can extract high-throughput and high-level image features for breast ultrasound image feature classification.
  • Using the output of the preset target deep convolutional neural network as the classification result of the target breast medical ultrasound image can effectively improve the accuracy and efficiency of feature recognition of breast medical ultrasound images, effectively and reliably improve the efficiency and intelligence of the classification process, and improve the accuracy of the classification results, which can meet the automation needs of modern breast medicine and effectively adapt to the current development trend of intelligent diagnosis of medical imaging big data.
  • FIG. 1 is a schematic structural diagram between a server S1 and a client device B1 in an embodiment of the present invention.
  • FIG. 2 is a schematic structural diagram of a server S1, a client device B1, and a database server S2 in an embodiment of the present invention.
  • FIG. 3 is a schematic flowchart of an automatic classification method of breast medical ultrasound images in an embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of step 100 in an automated classification method of breast medical ultrasound images in an embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of step 200 in an automatic classification method of breast medical ultrasound images in an embodiment of the present invention.
  • FIG. 6 is a schematic flowchart of an automatic classification method of breast medical ultrasound images including step 001 and step 002 in an embodiment of the present invention.
  • FIG. 7 is a schematic flowchart of step 001 in an automatic classification method of breast medical ultrasound images in an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a deep convolutional neural network in an embodiment of the present invention.
  • FIG. 9 is a schematic flowchart of steps A01 to A05 in an automatic classification method of breast medical ultrasound images in an embodiment of the present invention.
  • FIG. 10 is a schematic flowchart of an apparatus for automatically classifying breast medical ultrasound images in an embodiment of the present invention.
  • FIG. 11 is a schematic flowchart of an automatic classification device for breast medical ultrasound images including a model building module 01 and a model training module 02 in an embodiment of the present invention.
  • FIG. 12 is a schematic flowchart of an automatic classification device for breast medical ultrasound images including a model test module A0 in an embodiment of the present invention.
  • FIG. 13 is a schematic structural diagram of an electronic device in an embodiment of the present invention.
  • The traditional method for classifying breast medical ultrasound images first requires manual segmentation of the region of interest of the breast image and rich prior knowledge to design features that can characterize different regions of interest; features are then extracted and selected manually, and finally different classifiers are designed to distinguish the regions of interest.
  • This process requires many manual operations and a priori knowledge.
  • the number of extracted features is extremely limited, and it is difficult to adapt to the current development trend of intelligent diagnosis of medical imaging big data.
  • Specifically, an existing classification method of breast medical ultrasound images utilizes a restricted Boltzmann machine (RBM) to design a two-layer feature extractor to extract and learn task-related features, and finally uses a support vector machine (SVM) to output the feature type of the breast image data.
  • This method has some limitations compared with the method proposed in this application.
  • First, this technique is not strictly a deep learning model; it has only a two-layer structure.
  • Second, its shallow network structure also leads to a gap in the quantity and quality of its extracted features compared with the technical solution of the present application, resulting in a large difference in feature classification performance. That is to say, the existing technology mostly relies on traditional feature engineering and a shallower network structure, and the quantitative features it extracts are limited in both number and level, which is its biggest disadvantage.
  • As a result, the accuracy of breast ultrasound image data classification is reduced, so improving the quality of feature extraction is the starting point of the proposed technology.
  • In addition, the existing technology requires local segmentation of the breast image, which adds a lot of work; the segmentation task itself is not easy to complete, and the segmentation result directly affects the subsequent classification of the image.
  • The feature extraction method also requires manual design, extraction, and selection, which greatly increases the workload of classification.
  • To this end, this application provides an automatic classification method for breast medical ultrasound images, an automatic classification device for breast medical ultrasound images, an electronic device, and a computer-readable storage medium. Preprocessing the acquired target breast medical ultrasound image provides an accurate and reliable data basis for subsequent image recognition. Performing HSV color space conversion on the preprocessed image yields the hue H-channel feature corresponding to the target breast medical ultrasound image, from which high-throughput, high-level image features can be extracted for breast ultrasound image feature classification. The hue H-channel feature is then used as a prediction sample and input into a preset target deep convolutional neural network, whose output serves as the classification result of the target breast medical ultrasound image. This can effectively improve the accuracy and efficiency of feature recognition of breast medical ultrasound images, and effectively and reliably improve the efficiency and intelligence of the classification process and the accuracy of the classification results.
  • the present application also provides an automatic classification device for breast medical ultrasound images.
  • the device may be a server S1.
  • the server S1 may be in communication with at least one client device B1.
  • The client device B1 may send the target breast medical ultrasound image to the server S1, and the server S1 may receive the target breast medical ultrasound image online.
  • The server S1 may preprocess the acquired target breast medical ultrasound image online or offline, perform HSV color space conversion on the preprocessed image to obtain the hue H-channel feature corresponding to the target breast medical ultrasound image, use that feature as a prediction sample, input it into a preset target deep convolutional neural network, and use the output of the network as the classification result of the target breast medical ultrasound image. Then, the server S1 may send the classification result online to the client device B1, and the client device B1 may receive it online.
  • the server S1 may also be in communication connection with at least one database server S2, where the database server S2 is used to store historical breast medical ultrasound image data.
  • After the database server S2 sends the historical breast medical ultrasound images to the server S1 online, the server S1 can receive them online, obtain a training sample set of the deep convolutional neural network according to the plurality of historical breast medical ultrasound images, and apply the training sample set to perform model training on the deep convolutional neural network.
  • the database server S2 may also be used to store breast medical ultrasound image data for testing.
  • After the database server S2 sends the test breast medical ultrasound image data to the server S1 online, the server S1 can receive the data online, obtain a test sample according to at least one test breast medical ultrasound image, apply the test sample to perform a model test on the deep convolutional neural network, and use the output of the network as the test result. It then determines, based on the test result and the known classification result of the at least one test breast medical ultrasound image, whether the current deep convolutional neural network meets the preset requirements.
  • If not, the current deep convolutional neural network is optimized and/or an updated training sample set is applied to perform model training on the deep convolutional neural network again.
  • the client device B1 may have a display interface so that the user can view the classification result of the target breast medical ultrasound image sent by the server S1 according to the interface.
  • the client device B1 may include a smart phone, a tablet electronic device, a network set-top box, a portable computer, a desktop computer, a personal digital assistant (PDA), a vehicle-mounted device, a smart wearable device, and the like.
  • the smart wearable device may include smart glasses, smart watches, smart bracelets and the like.
  • The automatic classification of breast medical ultrasound images can be performed on the server S1 side as described above, that is, with the architecture shown in FIG. 1, or all operations can be completed in the client device B1, with the client device B1 communicating directly with the database server S2. The choice may depend on the processing capability of the client device B1 and the restrictions of the user's usage scenario; this application does not limit this. If all operations are completed in the client device B1, the client device B1 may further include a processor for performing the specific processing of the automatic classification of breast medical ultrasound images.
  • the network protocol may include, for example, TCP/IP protocol, UDP/IP protocol, HTTP protocol, HTTPS protocol, and so on.
  • The network protocol may also include, for example, the RPC (Remote Procedure Call) protocol and the REST (Representational State Transfer) protocol used on top of the above protocols.
  • It should be noted that the test breast medical ultrasound images are not included in the historical breast medical ultrasound images used for model training, and the known classification results of the test breast medical ultrasound images need to be obtained.
  • the breast medical ultrasound image may be a breast elastic ultrasound image, specifically embodied as breast elastic ultrasound image data.
  • This application uses a radiomics method based on deep convolutional neural networks, which integrates the technical advantages of deep learning in image processing with the cutting-edge thinking of radiomics.
  • Combined with medical imaging big data, performing feature learning and extraction on breast medical ultrasound imaging data improves the ability to classify features of breast ultrasound image data.
  • the following embodiments and application scenarios are used for specific description.
  • the automatic classification method of breast medical ultrasound images specifically includes the following:
  • Step 100 Pre-process the acquired target breast medical ultrasound image.
  • Step 200 Perform HSV color space conversion on the pre-processed target breast medical ultrasound image to obtain a hue H-channel feature corresponding to the target breast medical ultrasound image.
  • Step 300 Use the tone H-channel feature corresponding to the pre-processed target breast medical ultrasound image as a prediction sample, input a preset target deep convolutional neural network, and use the output of the target deep convolutional neural network as the Classification results of medical ultrasound images of target breast.
  • the target deep convolutional neural network is composed of multiple convolutional layers and multiple fully connected layers in sequence.
  • After the breast medical ultrasound image is input through the first convolutional layer of the deep convolutional neural network (DCNN), layer-by-layer mapping is performed within the network to obtain different layered representations of the breast medical ultrasound image.
  • the last layers of the target deep convolutional neural network are composed of fully connected layers, and the activation function of the last layer is a softmax function, which is used to classify local features of breast ultrasound images.
  • the automatic classification method for breast medical ultrasound images extracts high-throughput and high-level image features through deep convolutional neural networks for feature classification of breast ultrasound images.
  • This application is based on large-sample breast ultrasound imaging data, uses the radiomics method to screen out the key radiomic features that characterize the clinical phenotype, and constructs a multi-parameter classification device for regions of interest (ROI) of breast ultrasound imaging.
  • the present application also provides a specific implementation of step 100 in the automated classification method of breast medical ultrasound images, see Figure 4, the step 100 specifically includes the following:
  • Step 101 Acquire a target breast medical ultrasound image.
  • Step 102 Perform data cleaning on the target breast medical ultrasound image.
  • the automatic classification device for breast medical ultrasound images performs data cleaning on target breast elastic ultrasound image data, and deletes some system setting parameters and other information.
  • Step 103 Mark the target area in the target breast medical ultrasound image after data cleaning to obtain a sample label of the target breast medical ultrasound image.
  • In this example, the automatic classification device for breast medical ultrasound images marks the target area in the target breast elastic ultrasound image data after data cleaning to obtain a sample label of the cleaned target breast elastic ultrasound image data.
  • The target region may be a region of interest marked by a doctor in the breast elastic ultrasound image data.
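The preprocessing flow of steps 101 to 103 can be sketched as follows. This is only an illustrative sketch: the record layout and the field names (pixels, system_params, roi, label) are assumptions, since the application does not prescribe a data format.

```python
def preprocess(record):
    """Sketch of steps 101-103 for one target breast ultrasound record."""
    # Step 102: data cleaning - delete system setting parameters and
    # other non-image information from the record.
    cleaned = {k: v for k, v in record.items() if k != "system_params"}
    # Step 103: mark the target area (here a simple index range standing
    # in for a doctor-marked region of interest) to obtain the sample label.
    start, end = cleaned["roi"]
    target_area = cleaned["pixels"][start:end]
    return target_area, cleaned["label"]
```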
  • the present application also provides a specific implementation of step 200 in the automated classification method of breast medical ultrasound images, Referring to FIG. 5, the step 200 specifically includes the following content:
  • Step 201 Perform HSV color space conversion on the pre-processed target breast medical ultrasound image to obtain the HSV color feature corresponding to the target breast medical ultrasound image.
  • Step 202 Extract the hue H channel value corresponding to the target breast medical ultrasound image from the HSV color feature.
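Steps 201 and 202 amount to converting the image to HSV and keeping only the first (hue) channel. A minimal NumPy sketch, assuming the input is an RGB array with values in [0, 1] and hue is scaled to [0, 1] (libraries such as OpenCV use different ranges):

```python
import numpy as np

def hue_channel(rgb):
    """Steps 201-202: convert an RGB image (H x W x 3, floats in [0, 1])
    to HSV and return only the hue H channel, scaled to [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cmax = rgb.max(axis=-1)
    delta = cmax - rgb.min(axis=-1)
    hue = np.zeros_like(cmax)
    valid = delta > 0
    # Piecewise hue definition, depending on which channel is maximal.
    r_max = valid & (cmax == r)
    g_max = valid & (cmax == g) & ~r_max
    b_max = valid & ~r_max & ~g_max
    hue[r_max] = ((g - b)[r_max] / delta[r_max]) % 6
    hue[g_max] = (b - r)[g_max] / delta[g_max] + 2
    hue[b_max] = (r - g)[b_max] / delta[b_max] + 4
    return hue / 6.0  # 0..6 sextants scaled to 0..1
```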
  • this application provides a model prediction scenario of an automatic classification method for breast medical ultrasound images, the specific content is as follows:
  • the present application also provides model establishment and training performed before step 100 in the automatic classification method of breast medical ultrasound images For the process, see FIG. 6.
  • the model building and training process specifically includes the following:
  • Step 001 Construct a deep convolutional neural network, and obtain the training sample set of the deep convolutional neural network based on multiple historical breast medical ultrasound images.
  • Step 002 Apply the training sample set to perform model training on the deep convolutional neural network.
  • the step 001 of obtaining the training sample set of the deep convolutional neural network based on a plurality of historical breast medical ultrasound images specifically includes the following content:
  • Step 001a Acquire multiple historical breast medical ultrasound images
  • Step 002a Perform data cleaning on each historical breast medical ultrasound image
  • Step 003a Mark the target area in each historical breast medical ultrasound image after data cleaning to obtain a sample label of each historical breast medical ultrasound image
  • Step 004a Perform HSV color space conversion on each historical breast medical ultrasound image to obtain the HSV color features corresponding to each historical breast medical ultrasound image;
  • Step 005a Extract the hue H channel value corresponding to each historical breast medical ultrasound image from the HSV color features of each historical breast medical ultrasound image;
  • Step 006a Combine the hue H channel values corresponding to each historical breast medical ultrasound image into the training sample set of the deep convolutional neural network.
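Step 006a then pairs each per-image hue H-channel array with its sample label to form the training sample set. A minimal sketch, under the assumption that all H-channel arrays share one size and labels are integer class indices:

```python
import numpy as np

def build_training_set(h_channels, labels):
    """Step 006a: stack per-image hue H-channel arrays into the
    training sample set X with a parallel label vector y."""
    X = np.stack(h_channels)   # shape: (num_samples, height, width)
    y = np.asarray(labels)     # shape: (num_samples,)
    return X, y
```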
  • this application provides a model establishment and training scenario of an automatic classification method for breast medical ultrasound images.
  • the specific content is as follows:
  • The deep convolutional neural network is composed of multiple convolutional layers and multiple fully connected layers: the input image is mapped layer by layer in the network to obtain different layered representations of the image, thereby achieving deep feature extraction from the image.
  • the deep convolutional neural network specifically includes 5 convolutional layers: module 1 to module 5, and further specifically includes a module 6 composed of 3 fully connected layers, and the modules 1 to 6 are arranged in sequence.
  • The specific structure of the deep convolutional neural network is shown in FIG. 8, and the specific content of each module is shown in Table 1 below.
  • z_j^l = f( Σ_k ( z_k^{l-1} * w_{kj}^l ) + b_j^l ), where * represents the convolution operation between the input of each layer and the filter; z_j^l is the output of the j-th neuron after the convolution of the l-th layer; z_k^{l-1} is the output of the k-th neuron in layer l-1, that is, the input data of layer l; w_{kj}^l is the weight connecting the k-th neuron in layer l-1 to the j-th neuron in layer l; b_j^l is the offset corresponding to the j-th neuron in layer l; and f is the activation function.
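For one input feature map and one filter, the per-layer operation can be sketched as below. As in most deep learning frameworks, the "convolution" is implemented as cross-correlation, and the explicit loops are for clarity rather than speed:

```python
import numpy as np

def conv2d_single(x, w, b):
    """Naive 'valid'-mode 2-D convolution of one input feature map x
    (H x W) with one filter w (kh x kw) plus a scalar offset b."""
    kh, kw = w.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Sum of the elementwise product of the window and the filter.
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w) + b
    return out
```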
  • ReLU functions of other transformation forms, such as Leaky-ReLU and P-ReLU, are also available.
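The ReLU activation and its Leaky-ReLU variant are one-liners in NumPy; the 0.01 negative-side slope used here is a common default, not a value specified in the text:

```python
import numpy as np

def relu(x):
    """Standard ReLU: passes positive values, zeroes out the rest."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Leaky-ReLU: small slope alpha on the negative side instead of 0."""
    return np.where(x > 0, x, alpha * x)
```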
  • the last layers of the deep convolutional neural network designed by this application are composed of fully connected layers.
  • The activation function of the last layer is the softmax function, which is used to classify the local features of breast ultrasound images: Y = softmax(W^L z^{L-1} + b^L), with softmax(a)_i = exp(a_i) / Σ_j exp(a_j).
  • W^L and b^L are the weight and offset of the last fully connected layer, respectively, and z^{L-1} is the output of the preceding layer.
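A sketch of this output stage: the last fully connected layer followed by softmax. The row-vector convention z @ W_L + b_L is an assumption about the weight layout:

```python
import numpy as np

def softmax(a):
    """Numerically stable softmax over the last axis."""
    e = np.exp(a - a.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def output_layer(z, W_L, b_L):
    """Last fully connected layer followed by softmax, producing
    class probabilities for the breast ultrasound ROI features."""
    return softmax(z @ W_L + b_L)
```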
  • the process of acquiring training samples may be performed in parallel with the process of establishing a deep convolutional neural network, or may be performed before or after the process of establishing a deep convolutional neural network.
  • the hue H channel value corresponding to each historical breast elastic ultrasound image data constitutes the training sample set of the deep convolutional neural network.
  • Model training is performed on the deep convolutional neural network with a training sample set composed of hue H channel values corresponding to the historical breast elastic ultrasound image data respectively.
  • The present application also provides a model testing process performed before step 100 and after step 002 in the automatic classification method of breast medical ultrasound images; see FIG. 9.
  • the model testing process specifically includes the following:
  • Step A01 Obtain a test sample based on at least one breast medical ultrasound image for testing
  • Step A02 apply the test sample to perform a model test on the deep convolutional neural network, and use the output of the deep convolutional neural network as the test result;
  • Step A03 based on the test result and the known classification result of at least one test breast medical ultrasound image, determine whether the current deep convolutional neural network meets the preset requirements;
  • If yes, go to step A04; otherwise, go to step A05.
  • Step A04 The current deep convolutional neural network is used as the target deep convolutional neural network for classifying breast medical ultrasound images.
  • Step A05 If the current deep convolutional neural network does not meet the preset requirements, optimize the current deep convolutional neural network and/or apply the updated training sample set to perform model training on the deep convolutional neural network again.
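Steps A01–A05 amount to a test-then-decide loop. The sketch below shows only the decision of step A03; the 0.9 accuracy threshold is an assumed stand-in, since the text leaves the "preset requirements" unspecified:

```python
def passes_test(predictions, known_labels, threshold=0.9):
    """Step A03: compare the network's test output with the known
    classification results and check a preset accuracy requirement.
    The threshold value is an assumption, not specified in the text."""
    correct = sum(p == y for p, y in zip(predictions, known_labels))
    accuracy = correct / len(known_labels)
    return accuracy >= threshold
```

If `passes_test` returns True the current network becomes the target network (step A04); otherwise it is optimized and/or retrained (step A05).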
  • this application provides a model test scenario of the automated classification method for breast medical ultrasound images; the specific contents are as follows:
  • the Hue H channel value corresponding to the breast elastic ultrasound image data for each test is used as the test sample of the deep convolutional neural network.
  • the specific method for optimizing the deep convolutional neural network when returning to the process of establishing the deep convolutional neural network is as follows:
  • the weights and offsets are updated by the back-propagation (BP) algorithm, and the parameters of the deep convolutional neural network are optimized by minimizing the classification cross-entropy loss function shown in equation (4). This makes it possible to extract the features of the region of interest in the breast elastic ultrasound image and achieve rapid and accurate classification of those features.
  • I and K are the number of classification categories and the total number of training samples, respectively.
  • Y′ i is the label of the healthy area or the lesion area.
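Equation (4) itself is not reproduced in this text; a standard categorical cross-entropy consistent with the symbols above (I classes, K training samples, labels Y′) would be L = -(1/K) * sum_k sum_i Y'_{k,i} * log(Y_{k,i}), sketched here with NumPy as an assumption about the referenced formula:

```python
import numpy as np

def cross_entropy_loss(y_true, y_pred, eps=1e-12):
    # L = -(1/K) * sum_k sum_i Y'_{k,i} * log(Y_{k,i})
    # y_true: one-hot labels, shape (K, I); y_pred: softmax outputs, (K, I).
    y_pred = np.clip(y_pred, eps, 1.0)   # guard against log(0)
    return float(-np.mean(np.sum(y_true * np.log(y_pred), axis=1)))
```

A perfect prediction gives a loss of 0, and a uniform two-class prediction gives log 2, which is the usual sanity check for this loss.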
  • the automatic classification method of breast medical ultrasound images provided by the embodiments of the present application can be used to classify features of the breast region of interest, such as those of a healthy area, a lesion area, or a suspicious lesion area, and its application is not limited to the breast.
  • the technology of this application helps to reduce the workload of clinicians and improve the efficiency of doctors by analyzing breast medical image data; in addition, the method can classify breast medical image features with the advantages of being non-invasive, real-time, safe, and convenient to apply.
  • an automatic classification device for breast medical ultrasound images, used to implement the entire content of the automatic classification method for breast medical ultrasound images; see FIG. 10. The automatic classification device for breast medical ultrasound images specifically includes the following:
  • the data pre-processing module 10 is used to pre-process the acquired target breast medical ultrasound image.
  • the channel feature extraction module 20 is configured to perform HSV color space conversion on the pre-processed target breast medical ultrasound image to obtain a hue H-channel feature corresponding to the target breast medical ultrasound image.
  • the model prediction module 30 is configured to use the hue H-channel feature corresponding to the target breast medical ultrasound image as a prediction sample, input it into a preset target deep convolutional neural network, and use the output of the target deep convolutional neural network as the classification result of the target breast medical ultrasound image.
  • the embodiment of the automatic classification apparatus for breast medical ultrasound images provided by the present application may be specifically used to execute all the processing processes of each embodiment of the automatic classification method for breast medical ultrasound images in the above embodiments, and the functions thereof will not be repeated here. Reference may be made to the detailed description of the above method embodiments.
  • the automatic classification device for breast medical ultrasound images preprocesses the acquired target breast medical ultrasound images through the data preprocessing module 10, which can provide an accurate and reliable data basis for subsequent image recognition.
  • the channel feature extraction module 20 performs HSV color space conversion on the pre-processed target breast medical ultrasound image to obtain the hue H-channel feature corresponding to the target breast medical ultrasound image, which can extract high-throughput, high-level image features.
  • the model prediction module 30 uses the hue H-channel feature corresponding to the target breast medical ultrasound image as a prediction sample, inputs it into a preset target deep convolutional neural network, and uses the output of the target deep convolutional neural network as the classification result of the target breast medical ultrasound image. This can effectively improve the accuracy and efficiency of feature recognition in breast medical ultrasound images, effectively and reliably improve the efficiency and intelligence of the classification process, and improve the accuracy of the classification results so as to meet automation requirements.
  • the present application also provides a specific implementation of the data preprocessing module 10 in the automatic classification device for breast medical ultrasound images; the data preprocessing module 10 specifically includes the following content:
  • the target data acquisition unit 11 is used to acquire a target breast medical ultrasound image.
  • the target data cleaning unit 12 is used for performing data cleaning on the target breast medical ultrasound image.
  • the target data labeling unit 13 is configured to mark the target area in the target breast medical ultrasound image after data cleaning to obtain a sample label of the target breast medical ultrasound image.
  • the present application also provides a specific implementation of the channel feature extraction module 20 in the automatic classification device for breast medical ultrasound images; the channel feature extraction module 20 specifically includes the following:
  • the target HSV conversion unit 21 is configured to perform HSV color space conversion on the pre-processed target breast medical ultrasound image to obtain the HSV color feature corresponding to the target breast medical ultrasound image.
  • the target H channel extraction unit 22 is configured to extract the hue H channel value corresponding to the target breast medical ultrasound image from the HSV color feature.
  • the present application also provides a model building module and a model training module in the automatic classification device for breast medical ultrasound images; see FIG. 11. The model building module and model training module specifically include the following:
  • the model building module 01 is used to construct a deep convolutional neural network, and obtain the training sample set of the deep convolutional neural network according to multiple historical breast medical ultrasound images.
  • the model training module 02 is configured to apply the training sample set to perform model training on the deep convolutional neural network.
  • the deep convolutional neural network is composed of multiple convolutional layers followed by multiple fully connected layers. After the breast medical ultrasound image is input through the first convolutional layer, layer-by-layer mapping is performed within the deep convolutional neural network to obtain each layer's representation of the breast medical ultrasound image.
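The layer-by-layer mapping changes the spatial size of the feature maps at each convolutional layer. The sketch below traces those sizes using the standard convolution output formula; the kernel sizes and strides are illustrative assumptions, since the text does not specify the network's dimensions:

```python
def conv_output_size(n, kernel, stride=1, padding=0):
    # Standard formula for one spatial dimension of a convolution layer.
    return (n + 2 * padding - kernel) // stride + 1

def trace_feature_maps(input_size, layers):
    """Follow an input of size `input_size` through a list of
    (kernel, stride) conv layers, returning each layer's output size —
    i.e. the shape of each layer's representation of the image."""
    sizes = []
    n = input_size
    for kernel, stride in layers:
        n = conv_output_size(n, kernel, stride)
        sizes.append(n)
    return sizes
```

The last feature map is then flattened and fed to the fully connected layers for classification.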
  • the model building module 01 also includes the following contents:
  • the historical data acquisition unit 01a is used to acquire multiple historical breast medical ultrasound images;
  • the historical data cleaning unit 01b is used to perform data cleaning on each historical breast medical ultrasound image;
  • the historical data labeling unit 01c is used to mark the target areas in each historical breast medical ultrasound image after data cleaning to obtain sample labels of each historical breast medical ultrasound image;
  • the historical HSV conversion unit 01d is used to perform HSV color space conversion on each historical breast medical ultrasound image to obtain HSV color features corresponding to each historical breast medical ultrasound image;
  • the historical H channel extraction unit 01e is used to extract the hue H channel value corresponding to each historical breast medical ultrasound image from the HSV color features of each historical breast medical ultrasound image;
  • the training sample set generation unit 01f is configured to compose the hue H channel value corresponding to each historical breast medical ultrasound image into the training sample set of the deep convolutional neural network.
  • the present application also provides a model test module A0 in the automatic classification device for breast medical ultrasound images.
  • the model test module A0 specifically includes the following content:
  • the test sample acquisition unit A1, which is used to obtain a test sample based on at least one breast medical ultrasound image for testing;
  • a model testing unit A2 configured to apply the test samples to perform model testing on the deep convolutional neural network, and use the output of the deep convolutional neural network as a test result;
  • a test result determination unit A3, used to determine, based on the test result and the known classification results of the at least one test breast medical ultrasound image, whether the current deep convolutional neural network meets the preset requirements; if so, the current deep convolutional neural network serves as the target deep convolutional neural network for classifying breast medical ultrasound images.
  • the test result judgment unit is also used to, if the current deep convolutional neural network does not meet the preset requirements, optimize the current deep convolutional neural network and/or apply the updated training sample set to retrain the deep convolutional neural network.
  • the automatic classification device for breast medical ultrasound images can be used to classify features of the breast region of interest, such as those of a healthy area, a lesion area, or a suspicious lesion area, and its application is not limited to the breast.
  • the purpose of automatic classification of breast image features is achieved.
  • the technology of this application helps to reduce the workload of clinicians and improve the efficiency of doctors by analyzing breast medical image data; in addition, the device can classify breast medical image features with the advantages of being non-invasive, real-time, safe, and convenient to apply.
  • Embodiments of the present application also provide a specific implementation of an electronic device that can implement all steps in the automatic classification method of breast medical ultrasound images in the foregoing embodiments.
  • the electronic device specifically includes the following:
  • a processor 601, a memory 602, a communications interface 603, and a bus 604;
  • the processor 601, the memory 602, and the communication interface 603 communicate with each other through the bus 604; the communication interface 603 is used to implement information transmission between the automatic classification device for breast medical ultrasound images, servers, client terminals, and other participating institutions;
  • the processor 601 is used to call a computer program in the memory 602; when the processor executes the computer program, it implements all the steps of the automatic classification method for breast medical ultrasound images in the foregoing embodiments. For example, when the processor executes the computer program, the following steps are realized:
  • Step 100 Pre-process the acquired target breast medical ultrasound image.
  • Step 200 Perform HSV color space conversion on the pre-processed target breast medical ultrasound image to obtain a hue H-channel feature corresponding to the target breast medical ultrasound image.
  • Step 300 Use the hue H-channel feature corresponding to the pre-processed target breast medical ultrasound image as a prediction sample, input it into a preset target deep convolutional neural network, and use the output of the target deep convolutional neural network as the classification result of the target breast medical ultrasound image.
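Steps 100–300 can be strung together as below. Here `preprocess` is reduced to intensity normalisation for illustration (the actual cleaning and labeling steps are described in the text), and `model` is a hypothetical stand-in for the trained target deep convolutional neural network:

```python
import colorsys

def preprocess(image):
    # Step 100 stand-in: normalise 8-bit RGB pixels to [0, 1].
    return [[(r / 255, g / 255, b / 255) for (r, g, b) in row]
            for row in image]

def classify(image, model):
    # Step 200: HSV color space conversion, keeping only the hue H channel.
    h = [[colorsys.rgb_to_hsv(*px)[0] for px in row]
         for row in preprocess(image)]
    # Step 300: feed the H-channel prediction sample to the (hypothetical)
    # trained network and return its output as the classification result.
    return model(h)
```

Any callable mapping an H-channel array to a class label can stand in for `model` when exercising this pipeline.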
  • the electronic device provided by the embodiment of the present application can be used to classify features of the breast region of interest, such as those of a healthy area, a lesion area, or a suspicious lesion area, and its application is not limited to the breast.
  • the technology of this application helps to reduce the workload of clinicians and improve the efficiency of doctors by analyzing breast medical image data; in addition, the device can classify breast medical image features with the advantages of being non-invasive, real-time, safe, and convenient to apply.
  • An embodiment of the present application also provides a computer-readable storage medium capable of implementing all the steps in the automated classification method for breast medical ultrasound images in the above embodiments
  • the computer-readable storage medium stores a computer program
  • when the computer program is executed by the processor, all steps of the automatic classification method for breast medical ultrasound images in the foregoing embodiments are implemented. For example, when the processor executes the computer program, the following steps are realized:
  • Step 100 Pre-process the acquired target breast medical ultrasound image.
  • Step 200 Perform HSV color space conversion on the pre-processed target breast medical ultrasound image to obtain a hue H-channel feature corresponding to the target breast medical ultrasound image.
  • Step 300 Use the hue H-channel feature corresponding to the pre-processed target breast medical ultrasound image as a prediction sample, input it into a preset target deep convolutional neural network, and use the output of the target deep convolutional neural network as the classification result of the target breast medical ultrasound image.
  • the computer-readable storage medium provided by the embodiments of the present application can be used to classify features of the breast region of interest, such as those of a healthy area, a lesion area, or a suspicious lesion area, and its application is not limited to the breast.
  • the technology of this application helps to reduce the workload of clinicians and improve the efficiency of doctors by analyzing breast medical image data; in addition, the device can classify breast medical image features with the advantages of being non-invasive, real-time, safe, and convenient to apply.
  • the system, device, module or unit explained in the above embodiments may be specifically implemented by a computer chip or entity, or implemented by a product with a certain function.
  • a typical implementation device is a computer.
  • the computer may be, for example, a personal computer, a laptop computer, an on-board human-machine interaction device, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or any combination of these devices.
  • the functions are divided into various modules and described separately.
  • the functions of each module may be implemented in one or more software and/or hardware, or the modules that implement the same function may be implemented by a combination of multiple submodules or subunits.
  • the device embodiments described above are only schematic.
  • the division of the unit is only a division of logical functions.
  • in actual implementation there may be another division manner; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • as for the controller, in addition to implementing it in the form of pure computer-readable program code, it is entirely possible to logically program the method steps so that the controller realizes the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Therefore, such a controller can be regarded as a hardware component, and the devices included in it for implementing various functions can also be regarded as structures within the hardware component. Or even, the means for realizing various functions can be regarded both as software modules implementing the method and as structures within the hardware component.
  • each flow and/or block in the flowchart and/or block diagram and a combination of the flow and/or block in the flowchart and/or block diagram may be implemented by computer program instructions.
  • These computer program instructions can be provided to the processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce a device for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device; the instruction device implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • the computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • the memory may include non-permanent memory, random access memory (RAM) and/or non-volatile memory in computer-readable media, such as read only memory (ROM) or flash memory (flash RAM). Memory is an example of computer-readable media.
  • Computer-readable media, including permanent and non-permanent, removable and non-removable media, can store information by any method or technology.
  • the information may be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic tape cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by a computing device.
  • as defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
  • the embodiments of the present specification may be provided as methods, systems, or computer program products. Therefore, the embodiments of the present specification may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the embodiments of the present specification may take the form of computer program products implemented on one or more computer usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer usable program code.
  • Embodiments of this specification may be described in the general context of computer-executable instructions executed by a computer, such as program modules.
  • program modules include routines, programs, objects, components, data structures, etc. that perform specific tasks or implement specific abstract data types.
  • the embodiments of the present specification may also be practiced in distributed computing environments in which tasks are performed by remote processing devices connected through a communication network.
  • program modules may be located in local and remote computer storage media including storage devices.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention relates to an automated classification method and device for breast medical ultrasound images, the method comprising the steps of: preprocessing the acquired target breast medical ultrasound images (000); performing HSV color space conversion on the preprocessed target breast medical ultrasound images to obtain a hue H-channel feature corresponding to the target breast medical ultrasound images (100); taking the hue H-channel feature corresponding to the preprocessed target breast medical ultrasound images as a prediction sample, inputting it into a preset target deep convolutional neural network, and taking the output of the target deep convolutional neural network as the classification result of the target breast medical ultrasound images (200). The method can effectively improve the accuracy and efficiency of feature recognition in breast medical ultrasound images, can effectively and reliably improve the efficiency and intelligence of the breast medical ultrasound image classification process, and can improve the accuracy of the classification result.
PCT/CN2018/117460 2018-11-26 2018-11-26 Automated classification method and device for breast medical ultrasound images WO2020107156A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/117460 WO2020107156A1 (fr) 2018-11-26 2018-11-26 Automated classification method and device for breast medical ultrasound images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/117460 WO2020107156A1 (fr) 2018-11-26 2018-11-26 Automated classification method and device for breast medical ultrasound images

Publications (1)

Publication Number Publication Date
WO2020107156A1 true WO2020107156A1 (fr) 2020-06-04

Family

ID=70852458

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/117460 WO2020107156A1 (fr) 2018-11-26 2018-11-26 Automated classification method and device for breast medical ultrasound images

Country Status (1)

Country Link
WO (1) WO2020107156A1 (fr)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080212887A1 (en) * 2005-05-12 2008-09-04 Bracco Imaging S.P.A. Method For Coding Pixels or Voxels of a Digital Image and a Method For Processing Digital Images
CN106339591A (zh) * 2016-08-25 2017-01-18 Tang Ping Self-service health cloud service system for breast cancer prevention based on a deep convolutional neural network
CN107028593A (zh) * 2017-04-14 2017-08-11 Chengdu Knowledge Vision Technology Co., Ltd. Auxiliary detection method for breast ductal carcinoma in situ
CN107705336A (zh) * 2017-04-15 2018-02-16 Beihang University Pathological image staining component adjustment method


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111862015A (zh) * 2020-07-08 2020-10-30 PLA Strategic Support Force Information Engineering University Image quality level determination method, device, and electronic equipment
CN111862015B (zh) * 2020-07-08 2024-03-19 PLA Strategic Support Force Information Engineering University Image quality level determination method, device, and electronic equipment
CN112699948A (zh) * 2020-12-31 2021-04-23 Wuxi Chison Medical Technologies Co., Ltd. Classification method, device, and storage medium for ultrasonic breast lesions
CN113139076A (zh) * 2021-05-20 2021-07-20 Guangdong University of Technology Automatic neural network image labeling method with multi-label deep feature learning
CN113139076B (zh) * 2021-05-20 2024-03-29 Guangdong University of Technology Automatic neural network image labeling method with multi-label deep feature learning
CN113421633A (zh) * 2021-06-25 2021-09-21 Shanghai United Imaging Intelligence Co., Ltd. Feature classification method, computer device, and storage medium
CN116309585A (zh) * 2023-05-22 2023-06-23 Shandong University Breast ultrasound image target region identification method and system based on multi-task learning
CN116309585B (zh) * 2023-05-22 2023-08-22 Shandong University Breast ultrasound image target region identification method and system based on multi-task learning

Similar Documents

Publication Publication Date Title
WO2020107156A1 (fr) Automated classification method and device for breast medical ultrasound images
Yu et al. Deep-learning-empowered breast cancer auxiliary diagnosis for 5GB remote E-health
Madani et al. Fast and accurate view classification of echocardiograms using deep learning
EP3716198A1 (fr) Image reconstruction method and device
WO2020182121A1 (fr) Expression recognition method and related device
WO2020019738A1 (fr) Plaque processing method and device for performing magnetic resonance vessel wall imaging, and computing device
DE112020003547T5 (de) Transfer learning for neural networks
WO2020118618A1 (fr) Breast mass image recognition method and device
CN109614993A (zh) Automated classification method and device for breast medical ultrasound images
Abdi et al. Quality assessment of echocardiographic cine using recurrent neural networks: Feasibility on five standard view planes
CN110033023A (zh) Image data processing method and system based on picture-book recognition
WO2021098534A1 (fr) Similarity determination method and device, network training method and device, search method and device, electronic device, and storage medium
Gilbert et al. Automated left ventricle dimension measurement in 2d cardiac ultrasound via an anatomically meaningful cnn approach
CN116664930A (zh) Personalized federated learning image classification method and system based on self-supervised contrastive learning
JP7225731B2 (ja) Imaging of multivariable data sequences
CN113628221A (zh) Image processing method, image segmentation model training method, and related apparatus
Sallam et al. Mobile-based intelligent skin diseases diagnosis system
Khattar et al. Computer assisted diagnosis of skin cancer: a survey and future recommendations
Nayar et al. Deep learning based model for multi-class classification of cervical cells using pap smear images
Mahmud et al. Quantized depth image and skeleton-based multimodal dynamic hand gesture recognition
CN109636780A (zh) Automatic breast density grading method and device
CN113705595A (zh) Method, device, and storage medium for predicting degree of abnormal cell metastasis
Jayachandran et al. Deep transfer learning for texture classification in colorectal cancer histology
Iqbal et al. Deep-Hist: Breast cancer diagnosis through histopathological images using convolution neural network
CN114224354B (zh) Arrhythmia classification method, device, and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18941320

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 08/11/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18941320

Country of ref document: EP

Kind code of ref document: A1