CN114757950A - Ultrasonic image processing method, device and computer readable storage medium - Google Patents

Ultrasonic image processing method, device and computer readable storage medium

Info

Publication number
CN114757950A
CN114757950A (application CN202210671375.5A; granted publication CN114757950B)
Authority
CN
China
Prior art keywords
image
pixel
gray level
parameter
ultrasound image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210671375.5A
Other languages
Chinese (zh)
Other versions
CN114757950B (en)
Inventor
谈继勇
王旭东
李元伟
杨洪光
刘根
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Hanwei Intelligent Medical Technology Co., Ltd.
Original Assignee
Shenzhen Hanwei Intelligent Medical Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Hanwei Intelligent Medical Technology Co., Ltd.
Priority to CN202210671375.5A
Publication of CN114757950A
Application granted
Publication of CN114757950B
Legal status: Active
Anticipated expiration

Classifications

    All classifications fall under G (Physics) → G06 (Computing; Calculating or Counting) → G06T (Image data processing or generation, in general):
    • G06T 7/00 Image analysis → G06T 7/0002 Inspection of images, e.g. flaw detection → G06T 7/0012 Biomedical image inspection
    • G06T 7/00 Image analysis → G06T 7/10 Segmentation; Edge detection → G06T 7/13 Edge detection
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement → G06T 2207/10 Image acquisition modality → G06T 2207/10132 Ultrasound image
    • G06T 2207/00 → G06T 2207/20 Special algorithmic details → G06T 2207/20021 Dividing image into blocks, subimages or windows
    • G06T 2207/00 → G06T 2207/20 Special algorithmic details → G06T 2207/20081 Training; Learning
    • G06T 2207/00 → G06T 2207/20 Special algorithmic details → G06T 2207/20084 Artificial neural networks [ANN]
    • G06T 2207/00 → G06T 2207/20 Special algorithmic details → G06T 2207/20112 Image segmentation details → G06T 2207/20132 Image cropping

Abstract

The application discloses an ultrasound image processing method, an ultrasound image processing device, and a computer-readable storage medium, belonging to the technical field of image processing. First, the contrast of the ultrasound image to be processed is enhanced with a contrast-limited adaptive histogram equalization algorithm. Then, an improved luminance-adaptive guided filtering algorithm denoises the enhanced ultrasound image, effectively combining denoising, contrast enhancement, and brightness adjustment. Finally, a multi-directional Sobel operator enhances the lesion edge information in the filtered ultrasound image, so that the deep learning model can learn more lesion feature information from the preprocessed breast ultrasound image, the influence of erroneous information on deep learning is reduced, the quality of the ultrasound image to be processed is further improved, and the accuracy of the deep learning model in lesion feature extraction and pathology detection is improved.

Description

Ultrasonic image processing method, device and computer readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an ultrasound image processing method and apparatus, and a computer-readable storage medium.
Background
In recent years, with the rapid development of deep learning, convolutional neural networks have been used to perform tasks such as segmentation, detection, and classification of ultrasound images, which greatly relieves doctors of heavy labor and reduces their workload. In addition, a high-performance deep learning algorithm can effectively improve the accuracy and reliability of auxiliary diagnosis. Therefore, constructing a high-performance deep learning model is of great significance for breast ultrasound diagnosis. However, the ultrasound images used to train a deep learning model suffer from speckle noise interference, low contrast, blurred lesion edges, and similar problems, so the information quality of the ultrasound images is poor, which makes it difficult for the deep learning model to extract lesion features and perform pathology detection.
The above is only for the purpose of assisting understanding of the technical solutions of the present application, and does not represent an admission that the above is prior art.
Disclosure of Invention
The present application mainly aims to provide an ultrasound image processing method, an ultrasound image processing device, and a computer-readable storage medium, so as to solve the technical problem that the poor quality of ultrasound images used for training makes lesion extraction and pathology detection difficult.
In order to achieve the above object, the present application provides an ultrasound image processing method, including:
acquiring an ultrasound image to be processed, and performing contrast-limited adaptive histogram equalization processing on the ultrasound image;
acquiring a guide image corresponding to the ultrasound image after the contrast-limited adaptive histogram equalization processing;
determining a first parameter and a second parameter according to the pixel average value of the ultrasonic image in the window and the pixel average value of the guide image in the window; the first parameter and the second parameter are parameters of a guide filter function, and the guide filter function is a function for performing guide filtering on the ultrasound image after the contrast-limited adaptive histogram equalization processing;
performing guided filtering processing based on the first parameter, the second parameter, the pixel value of a guide image pixel point, and the pixel average value of the ultrasound image, to obtain the ultrasound image after guided filtering processing;
and performing edge detection processing on the ultrasonic image subjected to the guide filtering processing to obtain a target ultrasonic image.
In one embodiment, the step of determining the first parameter and the second parameter according to the average value of the pixels of the ultrasound image in the window and the average value of the pixels of the guide image in the window comprises:
acquiring the number of pixel points in a window, the pixel values of the ultrasound image pixel points, the pixel values of the guide image pixel points, the variance of the guide image in the window, and a smoothing factor;
determining a first parameter according to the number of pixel points in the window, the pixel values of the ultrasound image pixel points, the pixel values of the guide image pixel points, the variance of the guide image in the window, the smoothing factor, the pixel average value of the ultrasound image in the window and the pixel average value of the guide image in the window;
and determining a second parameter according to the pixel average value of the ultrasonic image in the window, the pixel average value of the guide image in the window and the first parameter.
In an embodiment, the step of performing guided filtering processing based on the first parameter, the second parameter, a pixel value of a guide image pixel point, and a pixel average value of the ultrasound image to obtain the ultrasound image after guided filtering processing includes:
determining a first parameter average value corresponding to the first parameter, and determining a second parameter average value corresponding to the second parameter;
acquiring a first product of the first parameter average value and a pixel value of a guide image pixel point;
determining a sum of the first product and the second parameter average;
and determining a second product of the sum and the pixel average value of the ultrasonic image, and determining the ultrasonic image after the guide filtering processing according to the second product.
In one embodiment, the step of performing the contrast-limited adaptive histogram equalization on the ultrasound image comprises:
evenly dividing the ultrasound image into a plurality of sub-regions of a preset size;
determining a limited contrast histogram of each sub-region;
and performing histogram equalization processing on the limited contrast histogram of each subregion.
In one embodiment, the step of determining the limited contrast histogram of each sub-region comprises:
drawing a gray level histogram of each sub-region according to the number of gray levels of each sub-region and the number of pixels of each gray level;
determining the average pixel number of each gray level in the gray level histogram of each sub-region according to the number of gray levels of each sub-region and the number of pixels of each sub-region;
determining the clipping amplitude of each gray level in the gray level histogram of each sub-region according to the average pixel number of each gray level in the gray level histogram of each sub-region and a preset clipping coefficient;
determining the total number of pixels clipped from all the gray levels in the gray level histogram of each sub-region according to the clipping amplitude of each gray level in the gray level histogram of each sub-region;
calculating the average allocated pixel number of each gray level in the gray level histogram of each sub-region according to the total number of pixels clipped from all the gray levels and the number of gray levels of each sub-region;
and performing pixel allocation according to the average allocated pixel number and the clipping amplitude of each gray level in the gray level histogram of each sub-region, to obtain a limited contrast gray level histogram of each sub-region.
In an embodiment, the step of performing edge detection processing on the ultrasound image after the guided filtering processing to obtain a target ultrasound image includes:
acquiring a convolution template in at least one direction;
performing convolution operation on the convolution templates in each direction and the ultrasonic image subjected to the guide filtering processing respectively to obtain gradient values of image pixels corresponding to the central points of the convolution templates;
determining a gradient image according to the gradient value of the image pixel corresponding to the central point of each convolution template;
and obtaining a target ultrasound image according to the gradient image.
In one embodiment, the step of obtaining the target ultrasound image according to the gradient image comprises:
converting the gradient image into a binary image;
determining pixels with gradient values larger than a preset threshold value in the binary image as edge pixels, and determining pixels with gradient values smaller than the preset threshold value in the binary image as non-edge pixels;
and extracting the edge pixels and determining the target ultrasonic image according to the edge pixels.
In an embodiment, after the step of performing the edge detection processing on the ultrasound image after the guiding filtering processing to obtain the target ultrasound image, the method further includes:
and training a deep neural network model by using the target ultrasonic image so as to determine a target focus through the trained deep neural network model.
Further, to achieve the above object, the present application also provides an image processing apparatus, comprising: a memory, a processor, and an ultrasound image processing control program stored on the memory and operable on the processor, wherein the control program, when executed by the processor, implements the steps of the ultrasound image processing method described above.
In addition, to achieve the above object, the present application also provides a computer readable storage medium storing an ultrasound image processing program, which when executed by a processor, implements the steps of the ultrasound image processing method as described above.
According to the technical scheme of the ultrasound image processing method provided by the embodiments of the application, first, the contrast of the ultrasound image to be processed is enhanced with a contrast-limited adaptive histogram equalization algorithm. Then, an improved luminance-adaptive guided filtering algorithm denoises the enhanced ultrasound image, effectively combining denoising, contrast enhancement, and brightness adjustment. Finally, a multi-directional Sobel operator enhances the lesion edge information in the filtered ultrasound image, so that the deep learning model can learn more lesion feature information from the preprocessed breast ultrasound image, the influence of erroneous information on deep learning is reduced, the quality of the ultrasound image to be processed is further improved, and the accuracy of the deep learning model in lesion feature extraction and pathology detection is improved.
Drawings
Fig. 1 is a schematic structural diagram of an ultrasound image processing apparatus according to the present application;
FIG. 2 is a flowchart illustrating a first embodiment of an ultrasound image processing method according to the present application;
fig. 3 is a detailed flowchart of step S130 of the first embodiment of the ultrasound image processing method according to the present application;
fig. 4 is a detailed flowchart of step S150 of the ultrasound image processing method according to the first embodiment of the present application;
fig. 5 is a flowchart of ultrasound image preprocessing according to the present application.
The objects, features, and advantages of the present application will be further described with reference to the accompanying drawings in conjunction with embodiments; the embodiments described are illustrative and are not intended to be exhaustive.
Detailed Description
In order to better understand the above technical solution, exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As shown in fig. 1, fig. 1 is a schematic structural diagram of a hardware operating environment of an image processing apparatus.
As shown in fig. 1, the image processing apparatus may include: a processor 1001, e.g. a CPU, a memory 1005, a user interface 1003, a network interface 1004, a communication bus 1002. The communication bus 1002 is used to implement connection communication among these components. The user interface 1003 may include a Display (Display), an input unit such as a Keyboard (Keyboard), and the optional user interface 1003 may also include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory such as a disk memory. The memory 1005 may alternatively be a storage device separate from the processor 1001 described previously.
It will be understood by those skilled in the art that the configuration of the image processing apparatus shown in fig. 1 does not constitute a limitation of the image processing apparatus, which may include more or fewer components than those shown, combine certain components, or arrange the components differently.
As shown in fig. 1, the memory 1005, as a storage medium, may include an operating system, a network communication module, a user interface module, and an ultrasound image processing program. The operating system is a program that manages and controls the hardware and software resources of the image processing apparatus and supports the running of the ultrasound image processing program as well as other software or programs.
In the image processing apparatus shown in fig. 1, the user interface 1003 is mainly used for connecting a terminal and performing data communication with the terminal; the network interface 1004 is mainly used for connecting a background server and performing data communication with the background server; and the processor 1001 may be used to invoke the ultrasound image processing program stored in the memory 1005.
When the processor 1001 calls the ultrasound image processing program stored in the memory 1005, the following operations are performed:
acquiring an ultrasound image to be processed, and performing contrast-limited adaptive histogram equalization processing on the ultrasound image;
acquiring a guide image corresponding to the ultrasound image after the contrast-limited adaptive histogram equalization processing;
determining a first parameter and a second parameter according to the pixel average value of the ultrasonic image in the window and the pixel average value of the guide image in the window; the first parameter and the second parameter are parameters of a guide filter function, and the guide filter function is a function for performing guide filtering on the ultrasound image subjected to the contrast-limited adaptive histogram equalization processing;
performing guided filtering processing based on the first parameter, the second parameter, the pixel value of a guide image pixel point, and the pixel average value of the ultrasound image, to obtain the ultrasound image after guided filtering processing;
and performing edge detection processing on the ultrasound image subjected to the guided filtering processing to obtain a target ultrasound image.
When the processor 1001 calls the ultrasound image processing program stored in the memory 1005, the following operations are performed:
acquiring the number of pixel points in a window, the pixel values of the ultrasonic image pixel points, the pixel values of the guide image pixel points, the variance of the guide image in the window and a smoothing factor;
determining a first parameter according to the number of pixels in the window, the pixel value of the ultrasound image pixels, the pixel value of the guide image pixels, the variance of the guide image in the window, the smoothing factor, the pixel average value of the ultrasound image in the window and the pixel average value of the guide image in the window;
and determining a second parameter according to the pixel average value of the ultrasonic image in the window, the pixel average value of the guide image in the window and the first parameter.
When the processor 1001 calls the ultrasound image processing program stored in the memory 1005, the following operations are performed:
determining a first parameter average value corresponding to the first parameter, and determining a second parameter average value corresponding to the second parameter;
acquiring a first product of the first parameter average value and the pixel value of the guide image pixel point;
determining a sum of the first product and the average of the second parameter;
and determining a second product of the sum and the pixel average value of the ultrasonic image, and determining the ultrasonic image after the guide filtering processing according to the second product.
When the processor 1001 calls the ultrasound image processing program stored in the memory 1005, the following operations are performed:
evenly dividing the ultrasound image into a plurality of sub-regions of a preset size;
determining a limited contrast histogram of each sub-region;
and performing histogram equalization processing on the limited contrast histogram of each subregion.
When the processor 1001 calls the ultrasound image processing program stored in the memory 1005, the following operations are performed:
drawing a gray level histogram of each sub-region according to the number of gray levels of each sub-region and the number of pixels of each gray level;
determining the average pixel number of each gray level in the gray level histogram of each sub-region according to the number of gray levels of each sub-region and the number of pixels of each sub-region;
determining the clipping amplitude of each gray level in the gray level histogram of each sub-region according to the average pixel number of each gray level in the gray level histogram of each sub-region and a preset clipping coefficient;
determining the total number of pixels clipped from all the gray levels in the gray level histogram of each sub-region according to the clipping amplitude of each gray level in the gray level histogram of each sub-region;
calculating the average allocated pixel number of each gray level in the gray level histogram of each sub-region according to the total number of pixels clipped from all the gray levels and the number of gray levels of each sub-region;
and performing pixel allocation according to the average allocated pixel number and the clipping amplitude of each gray level in the gray level histogram of each sub-region, to obtain a limited contrast gray level histogram of each sub-region.
When the processor 1001 calls the ultrasound image processing program stored in the memory 1005, the following operations are performed:
acquiring a convolution template in at least one direction;
performing convolution operation on the convolution templates in each direction and the ultrasonic image subjected to the guide filtering processing respectively to obtain gradient values of image pixels corresponding to the central points of the convolution templates;
determining a gradient image according to the gradient value of the image pixel corresponding to the central point of each convolution template;
and obtaining a target ultrasonic image according to the gradient image.
When the processor 1001 calls the ultrasound image processing program stored in the memory 1005, the following operations are performed:
converting the gradient image into a binary image;
determining pixels with gradient values larger than a preset threshold value in the binary image as edge pixels, and determining pixels with gradient values smaller than the preset threshold value in the binary image as non-edge pixels;
and extracting the edge pixels and determining the target ultrasonic image according to the edge pixels.
When the processor 1001 calls the ultrasound image processing program stored in the memory 1005, the following operations are performed:
and training a deep neural network model by using the target ultrasonic image so as to determine a target focus through the trained deep neural network model.
The technical solution of the present application will be described below by way of examples.
The first embodiment:
as shown in fig. 2, in a first embodiment of the present application, an ultrasound image processing method of the present application includes the following steps:
step S110, obtaining an ultrasonic image to be processed, and carrying out contrast-limiting adaptive histogram equalization processing on the ultrasonic image.
In this embodiment, consider the related-art process of training a deep neural network model on ultrasound images to be processed. Because these images suffer from speckle noise interference, low contrast, blurred lesion edges, and similar problems, training the model directly on them without preprocessing reduces the precision of the deep neural network model, so that when an ultrasound image to be detected is input into the trained model, lesion extraction and pathology detection become difficult. The related art mainly adopts Gaussian filtering or mean filtering to denoise the ultrasound image; this effectively removes noise interference, but at the same time blurs the lesion edges in the breast ultrasound image, reducing the lesion feature information and harming the performance of the deep learning model. Histogram equalization improves image brightness and contrast, but simultaneously amplifies the noise interference in the breast ultrasound image, so that the noise information degrades model performance. Edge detection algorithms enhance the lesion edges, but their effect is poor due to noise interference. Based on the above, the present application provides an ultrasound image processing method: the ultrasound image is first enhanced with contrast-limited adaptive histogram equalization to improve its contrast, then denoised and brightness-adjusted with improved luminance-adaptive guided filtering, and finally an improved Sobel operator is applied to enhance the lesion edges, yielding a target ultrasound image of better quality.
In this embodiment, the ultrasound image to be processed in the present application may be an ultrasound image corresponding to each part of a body. For example, the ultrasound image to be processed may be a breast ultrasound image, a bladder ultrasound image, a stomach ultrasound image, or the like. The ultrasound image to be processed is taken as a breast ultrasound image as an example in the application. The breast ultrasound image can be detected by medical ultrasound detection equipment, and a public historical ultrasound image can be selected from a hospital database or a website to be used as a breast ultrasound image to be processed. Ultrasound images contain a lot of clinical diagnostic information that can assist the doctor in making a medical diagnosis. The ultrasonic image to be processed can be input into the initial network model for training, and a trained deep neural network model is obtained. In an embodiment, a plurality of breast ultrasound images to be processed may be obtained, each breast ultrasound image to be processed is preprocessed, and the preprocessed breast ultrasound images to be processed are input into an initial network model to be trained, so as to obtain a trained deep neural network model.
In this embodiment, after the breast ultrasound image to be processed is acquired, it is first subjected to contrast-limited adaptive histogram equalization processing. The contrast-limited adaptive histogram equalization algorithm enhances the contrast of the breast ultrasound image to be processed, reducing the loss of blurred lesion edge information during the subsequent denoising of the breast ultrasound image. Specifically, the algorithm limits the stretching range of the local contrast mainly by limiting the height of the local histogram, to prevent speckles caused by noise amplification and local contrast over-emphasis, thereby improving image contrast and image quality.
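As an illustration only, this step can be sketched with OpenCV's built-in CLAHE implementation; this is a minimal sketch assuming an 8-bit grayscale input, and the clip limit and tile grid size shown are illustrative assumptions rather than values prescribed by the present application:

```python
import cv2

def enhance_contrast(ultrasound_gray):
    # Contrast-limited adaptive histogram equalization: the image is
    # split into tiles, each tile's histogram is clipped at a limit
    # before equalization, and tile borders are blended bilinearly.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(ultrasound_gray)
```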
And step S120, acquiring a guide image corresponding to the ultrasound image after the contrast-limited adaptive histogram equalization processing.
In this embodiment, after the contrast-limited adaptive histogram equalization process is performed on the breast ultrasound image to be processed, the guided filtering process is performed on the breast ultrasound image after the contrast-limited adaptive histogram equalization process. The breast ultrasound image is filtered by adopting brightness self-adaptive guide filtering, so that the brightness of the filtered breast ultrasound image is moderate, noise is removed, and the loss of focus edge information is reduced by a strategy of enhancing firstly and then filtering.
In this embodiment, before the guided filtering processing is performed on the breast ultrasound image after the contrast-limited adaptive histogram equalization processing, the guide image required for guided filtering needs to be acquired. Specifically, the guide image may be another separate image, or may be the breast ultrasound image to be processed itself; when the guide image is the breast ultrasound image to be processed itself, the guided filtering becomes an edge-preserving filtering operation. The guide image is used to guide the filtering of the breast ultrasound image to be processed. In this embodiment, the guide image is the guide image corresponding to the breast ultrasound image after the contrast-limited adaptive histogram equalization processing. In one embodiment, the guide image may be the breast ultrasound image to be processed without the contrast-limited adaptive histogram equalization processing.
Step S130, determining a first parameter and a second parameter according to the pixel average value of the ultrasonic image in the window and the pixel average value of the guide image in the window; the first parameter and the second parameter are parameters of a guide filter function, and the guide filter function is a function for performing guide filtering on the ultrasound image after the contrast-limited adaptive histogram equalization processing.
In this embodiment, after the guide image is acquired, the breast ultrasound image to be processed after the contrast-limited adaptive histogram equalization process is subjected to guide filtering based on the guide filtering function. Specifically, the guiding filtering function is a function for guiding filtering the breast ultrasound image after the contrast-limited adaptive histogram equalization processing. The guided filter function includes a first parameter and a second parameter. The first parameter and the second parameter can both be determined by the pixel average value of the ultrasound image in the window and the pixel average value of the guide image in the window. The pixel average value of the ultrasound image in the window may be determined according to the pixel values of the image channels of the ultrasound image in the window, for example, the sum of the pixel values of the three image channels of R/G/B of the ultrasound image is determined, and then the pixel value sums are averaged to obtain the pixel average value of the ultrasound image in the window. Similarly, the pixel average value of the guide image in the window may be determined according to the pixel values of the image channels of the guide image in the window, for example, the sum of the pixel values of the three image channels of R/G/B of the guide image is determined, and then the pixel value sums are averaged to obtain the pixel average value of the guide image in the window.
In this embodiment, the window may be a local window or a global window. For example, when the window is a local window, the size of the window can be set according to actual conditions; when the window is a global window, the size of the global window needs to be adapted to the sizes of the guide image and the ultrasound image to be processed. The present application takes the window as a local window as an example. When the window is a local window, the pixel average value of the ultrasonic image in each local window and the pixel average value of the guide image in each local window can be respectively obtained, and then a first parameter and a second parameter are determined according to the pixel average value of the ultrasonic image in each local window and the pixel average value of the guide image in each local window.
And step S140, performing guiding filtering processing based on the first parameter, the second parameter, the pixel value of the guiding image pixel point and the pixel average value of the ultrasonic image to obtain the ultrasonic image after guiding filtering processing.
In this embodiment, after a first parameter and a second parameter are determined, a guiding filtering process is performed according to the first parameter, the second parameter, a pixel value of a guiding image pixel point, and a pixel average value of the ultrasound image. The method and the device improve the existing guide filtering function, combine the pixel average value of the ultrasonic image on the basis of the ultrasonic image after the original guide filtering processing, realize self-adaptive brightness adjustment, and avoid the phenomenon of excessive enhancement. Specifically, in the prior art, the ultrasound image after the guided filtering process is determined according to the first parameter, the second parameter, and the pixel value of the pixel point of the guided image, that is, the ultrasound image after the guided filtering process is determined according to the following formula:
$$q_i = a_k I_i + b_k, \quad \forall i \in w_k$$

where $a_k$ and $b_k$ are the constant coefficients of the linear function over the window $w_k$ centered at pixel $k$; the relationship between the ultrasound image and the guide image is determined by $a_k$ and $b_k$, with $a_k$ being the first parameter and $b_k$ the second parameter; $I_i$ is the pixel value of guide-image pixel $i$, and $k$ is the index of the window's center pixel.
The method improves a guide filtering function, and determines the ultrasonic image after guide filtering according to the following improved formula:
$$q_i = \left(\bar{a}_i I_i + \bar{b}_i\right) \cdot \bar{p}$$

where $\bar{p}$ is the pixel average of the ultrasound image, $\bar{a}_i$ is the first parameter average, and $\bar{b}_i$ is the second parameter average.
Step S150, performing edge detection processing on the ultrasound image subjected to the guided filtering processing to obtain a target ultrasound image.
In this embodiment, after the breast ultrasound image to be processed has been guided-filtered, edge detection processing is performed on the filtered image to obtain a target ultrasound image that can be used for deep neural network model training. In the edge detection processing, a multi-directional Sobel operator makes the lesion edge information of the breast ultrasound image clearer, so that the deep learning network can learn more edge features, noise interference is reduced, and the accuracy of deep learning in breast ultrasound diagnosis is improved. Specifically, the present application adopts multi-directional convolution templates for the edge detection processing. The number of template directions can be set according to actual conditions; the more directions are used, the more accurately the gradient in each direction can be detected, and the better the edges can be detected. The convolution templates of the present application cover the four directions 0°, 45°, 90°, and 135°, each of size 3 × 3:

0°: [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
45°: [[-1, 2, 0], [2, 0, -2], [0, -2, 1]]
90°: [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
135°: [[0, -2, 1], [-2, 0, 2], [-1, 2, 0]]
In the technical scheme of the embodiment, firstly, the contrast of an ultrasonic image to be processed is enhanced by adopting a contrast-limiting adaptive histogram equalization algorithm; and then, the improved brightness adaptive guided filtering algorithm is used for denoising the enhanced ultrasonic image, so that the functions of denoising, contrast enhancement and brightness adjustment are effectively realized. And finally, the multidirectional Sobel operator is adopted to enhance the focus edge information in the filtered ultrasonic image, so that the deep learning model can be promoted to learn more feature information of the focus in the mammary ultrasonic image after pretreatment, the influence of wrong information on the deep learning is reduced, the quality of the ultrasonic image to be processed is further improved, and the precision of the deep learning model on the extraction of the focus features and the pathological detection is improved.
The second embodiment:
as shown in fig. 3, fig. 3 includes the refinement step of step S130 in the first embodiment of the present application, i.e. steps S131-S133 in fig. 3 are the refinement steps of step S130 in the first embodiment, including:
step S131, acquiring the number of pixel points in a window, the pixel values of the ultrasonic image pixel points, the pixel values of the guide image pixel points, the variance of the guide image in the window and a smoothing factor;
Step S132, determining a first parameter according to the number of pixel points in the window, the pixel value of the ultrasonic image pixel point, the pixel value of the guide image pixel point, the variance of the guide image in the window, the smoothing factor, the pixel average value of the ultrasonic image in the window and the pixel average value of the guide image in the window;
step S133, determining a second parameter according to the average pixel value of the ultrasound image in the window, the average pixel value of the guide image in the window, and the first parameter.
In this embodiment, after the contrast-limited adaptive histogram equalization processing is performed on the ultrasound image to be processed, the guided filtering processing is performed on the processed ultrasound image. In the process of performing the guided filtering, it is necessary to determine the parameters of the guided filter function, i.e., the first parameter and the second parameter, from which the guided filter function is formed. Specifically, the number of pixel points in the window, the pixel values of the ultrasound image pixel points, the pixel values of the guide image pixel points, the variance of the guide image in the window, and a smoothing factor are acquired. Using these variables, combined with the pixel average of the ultrasound image in the window and the pixel average of the guide image in the window, the first parameter and the second parameter are determined.
Specifically, the first parameter may be determined using the following formula:

$$a_k = \frac{\frac{1}{|w|}\sum_{i \in w_k} I_i p_i - \mu_k \bar{p}_k}{\sigma_k^2 + \varepsilon}$$

where $|w|$ is the number of pixel points in the window, $I_i$ is the pixel value of guide-image pixel $i$, $p_i$ is the pixel value of ultrasound-image pixel $i$, $\mu_k$ is the pixel average of the guide image in the window, $\bar{p}_k$ is the pixel average of the ultrasound image in the window, $\sigma_k^2$ is the variance of the guide image in the window, and $\varepsilon$ is a smoothing factor used to prevent the obtained first parameter from becoming too large and to adjust the filtering effect.
Then, after the first parameter is obtained, the second parameter is determined based on it, using the following formula:

$$b_k = \bar{p}_k - a_k \mu_k$$
in this embodiment, the first parameter and the second parameter of the guided filter function can be determined by the above two formulas.
In the technical solution of this embodiment, a guide filtering function for guiding and filtering an ultrasound image is constructed according to a first parameter and a second parameter by determining the first parameter and the second parameter.
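The per-window statistics of this embodiment can be computed with box filters rather than explicit loops over windows. The following sketch, under the assumptions that both images are float32 arrays scaled to [0, 1] and that the window is a square of radius r, illustrates how the first and second parameters might be obtained; the function names are hypothetical and not taken from the patent:

```python
import cv2

def box_mean(img, r):
    # Mean over the (2r+1) x (2r+1) window around each pixel.
    return cv2.boxFilter(img, ddepth=-1, ksize=(2 * r + 1, 2 * r + 1))

def guided_filter_params(I, p, r, eps):
    # I: guide image, p: ultrasound image, both float32 in [0, 1].
    mu_I = box_mean(I, r)         # pixel average of the guide image in the window
    mu_p = box_mean(p, r)         # pixel average of the ultrasound image in the window
    corr_Ip = box_mean(I * p, r)  # windowed mean of I_i * p_i
    var_I = box_mean(I * I, r) - mu_I * mu_I      # variance of the guide image in the window
    a = (corr_Ip - mu_I * mu_p) / (var_I + eps)   # first parameter a_k
    b = mu_p - a * mu_I                           # second parameter b_k
    return a, b
```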
The third embodiment:
the following steps S141 to S144 are the refinement steps of step S140 of the first embodiment, and include:
step S141, determining a first parameter average value corresponding to the first parameter, and determining a second parameter average value corresponding to the second parameter;
Step S142, obtaining a first product of the first parameter average value and the pixel value of the guide image pixel point;
step S143 of determining a sum of the first product and the second parameter average;
step S144, determining a second product of the sum and the pixel average of the ultrasound image, and determining the ultrasound image after the guided filtering process according to the second product.
In this embodiment, after the first parameter and the second parameter are determined, the first parameter average corresponding to the first parameter and the second parameter average corresponding to the second parameter are further determined; the first parameter average is obtained by summing all the first parameters and averaging, and the second parameter average is obtained by summing all the second parameters and averaging. The ultrasound image after the guided filtering processing is determined using the following formula:

$$q_i = \left(\bar{a}_i I_i + \bar{b}_i\right) \cdot \bar{p}, \qquad \bar{a}_i = \frac{1}{|w|}\sum_{k \in w_i} a_k, \quad \bar{b}_i = \frac{1}{|w|}\sum_{k \in w_i} b_k$$

where $\bar{a}_i$ is the first parameter average, $\bar{b}_i$ is the second parameter average, $I_i$ is the pixel value of guide-image pixel $i$, and $\bar{p}$ is the pixel average of the ultrasound image. The window operation is applied over the whole image, and the final averaging yields the ultrasound image after the guided filtering processing, completing the noise removal of the breast ultrasound image.
The fourth embodiment:
the following steps S111 to S113 are the refinement steps of step S110 of the first embodiment, and include:
step S111, evenly dividing the ultrasound image into a plurality of sub-regions of a preset size;
step S112, determining a limited contrast histogram of each sub-region;
in step S113, histogram equalization processing is performed on the limited contrast histogram of each sub-region.
In this embodiment, the preset size may be determined according to actual conditions. The ultrasound image may be divided into a plurality of sub-regions with the same length and width, for example a preset size of 4 × 4 per sub-region, or into sub-regions with different lengths and widths, for example a preset size of 4 × 2 per sub-region. The number of sub-regions can be determined from the actual size of the ultrasound image and the preset size of each sub-region to be divided. The sub-regions are contiguous and do not overlap each other. In an embodiment, the ultrasound image may be converted into a gray image, and the gray image is divided into a plurality of sub-regions of the preset size. The clipping amplitude used to limit the histogram of each sub-region may then be determined using the following formula:

$$T = C \cdot \frac{n_x \cdot n_y}{K}$$

where $C$ is a preset clipping coefficient, $K$ is the number of gray levels in the histogram of each sub-region, $n_x$ is the number of pixels in the horizontal direction of each sub-region, $n_y$ is the number of pixels in the vertical direction of each sub-region, and $T$ is the clipping amplitude.
In this embodiment, after the ultrasound image is evenly divided into a plurality of sub-regions of the preset size, each sub-region has a corresponding number of pixels in the horizontal and vertical directions, so the limited contrast histogram of each sub-region may be generated based on those pixel counts. After the limited contrast histogram of each sub-region is determined, the pixels of each sub-region need to be reallocated so as to perform histogram equalization processing on the limited contrast histogram, and the pixel values are then reconstructed according to the positions of the image blocks: image blocks located in the interior of the ultrasound image undergo a bilinear interpolation operation over adjacent image blocks, while image blocks located at the edge of the image may undergo a linear interpolation operation over adjacent image blocks.
Specifically, in an embodiment, the step of determining the limited contrast histogram of each sub-region includes:
step S1131, drawing a gray level histogram of each sub-region according to the number of gray levels of each sub-region and the number of pixels of each gray level;
step S1132, determining the average pixel number of each gray level in the gray level histogram of each sub-region according to the number of gray levels of each sub-region and the number of pixels of each sub-region;
step S1133, determining the clipping amplitude of each gray level in the gray level histogram of each sub-region according to the average pixel number of each gray level and a preset clipping coefficient;
step S1134, determining the total number of pixels clipped from all the gray levels in the gray level histogram of each sub-region according to the clipping amplitude of each gray level;
step S1135, calculating the average allocated pixel number of each gray level in the gray level histogram of each sub-region according to the total number of pixels clipped and the number of gray levels of each sub-region;
step S1136, performing pixel allocation according to the average allocated pixel number and the clipping amplitude of each gray level, to obtain the limited contrast gray level histogram of each sub-region.
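As an illustrative sketch of steps S1131–S1136 for a single sub-region, assuming an 8-bit gray image with 256 gray levels and ignoring the remainder pixels that do not divide evenly among the levels:

```python
import numpy as np

def limited_contrast_histogram(subregion, clip_coeff, n_gray=256):
    # Gray level histogram of the sub-region (step S1131).
    hist, _ = np.histogram(subregion, bins=n_gray, range=(0, n_gray))
    # Average number of pixels per gray level (step S1132).
    avg = subregion.size / n_gray
    # Clipping amplitude from the average and the clipping coefficient (step S1133).
    T = clip_coeff * avg
    # Total number of pixels clipped off above the amplitude (step S1134).
    excess = np.maximum(hist - T, 0).sum()
    # Average number of redistributed pixels per gray level (step S1135).
    per_level = excess / n_gray
    # Clip and redistribute to obtain the limited contrast histogram (step S1136).
    return np.minimum(hist, T) + per_level
```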
In the technical scheme of this embodiment, the ultrasound image is enhanced by using a contrast-limited adaptive histogram equalization algorithm, so that the contrast of the breast ultrasound image is significantly improved.
Fifth embodiment:
referring to fig. 4, steps S151 to S154 in fig. 4 are refinement steps of step S150 of the first embodiment, including:
step S151 of acquiring a convolution template in at least one direction;
in this embodiment, the edge directions of the convolution templates actually used in the present application are many, and mainly include four directions of 0 °, 45 °, 90 °, and 135 °, where the 0 ° and 90 ° direction templates are perpendicular to each other and are respectively used to detect the edges in the 90 ° and 0 ° directions; the 45 deg. and 135 deg. orientation templates are perpendicular to each other and are used to detect 135 deg. and 45 deg. orientation edges, respectively. The gradient in other directions can be accurately detected, and the edge of the ultrasonic image can be accurately detected. Of course, in other embodiments, 22.5 °, 45 °, 67.5 °, 90 °, 112.5 °, 135 °, and 157.5 ° convolution modules may be used. The more directions the convolution templates are used, the more accurate the edge detection result.
In this embodiment, the convolution templates of the present application each have a size of 3 × 3:

0°: [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
45°: [[-1, 2, 0], [2, 0, -2], [0, -2, 1]]
90°: [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
135°: [[0, -2, 1], [-2, 0, 2], [-1, 2, 0]]

The numerical values in the convolution templates are the template weights; their function is to perform a weighted operation with the pixel values at the corresponding positions so as to better extract edge information. The setting of the template weights fully considers the contribution of the neighborhood pixel points to the directional gradient at the center point. The directional gradient contribution is determined by the distance from the center point and the size of the included angle, on the principle that the closer a pixel is to the center point, or the smaller its included angle, the larger its gradient contribution; the farther away, or the larger the angle, the smaller its contribution to the center-point gradient.
Step S152, performing convolution operation on the convolution templates in each direction and the ultrasonic image subjected to the guide filtering processing respectively to obtain gradient values of image pixels corresponding to the central points of the convolution templates;
step S153, determining a gradient image according to the gradient value of the image pixel corresponding to the central point of each convolution template;
in this embodiment, after the convolution template is acquired, the gradient values and the gradient image are further determined. Specifically, convolution operation is performed on defined convolution templates in four directions and the ultrasonic image respectively, and four direction gradient values of image pixel points corresponding to template center points are solved: g1 (x, y), g2 (x, y), g3 (x, y), g4 (x, y), and then calculating the gradient value of the center point according to the following formula:
$$G(x, y) = \sum_{i=1}^{N} \frac{\left|g_i(x, y)\right|}{\alpha}$$

where $G(x, y)$ represents the gradient image, $\alpha$ is an attenuation factor, $g_i$ denotes the directional gradient value, and $i$ denotes the $i$-th gradient direction; in the present application $N = 4$, corresponding to the four directions described above. To prevent overflow of the gradient values, which would make it impossible to refine the edges, the present application divides the gradient value in each direction by the attenuation factor $\alpha$; in the present application, $\alpha = 10$.
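As an illustrative sketch of steps S151–S153, the four directional convolutions and the attenuated combination might be implemented as follows; the kernel values are copied from this embodiment, while the use of filter2D, its border handling, and the summation of absolute responses are assumptions of this sketch:

```python
import cv2
import numpy as np

KERNELS = [
    np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float32),   # 0 degrees
    np.array([[-1, 2, 0], [2, 0, -2], [0, -2, 1]], dtype=np.float32),   # 45 degrees
    np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=np.float32),   # 90 degrees
    np.array([[0, -2, 1], [-2, 0, 2], [-1, 2, 0]], dtype=np.float32),   # 135 degrees
]

def gradient_image(filtered, alpha=10.0):
    # Apply each directional template with filter2D (correlation; the sign
    # is irrelevant after taking absolute values) and accumulate the
    # attenuated directional gradients (steps S152-S153).
    g = np.zeros(filtered.shape, dtype=np.float32)
    for kernel in KERNELS:
        response = cv2.filter2D(filtered.astype(np.float32), ddepth=-1, kernel=kernel)
        g += np.abs(response) / alpha   # divide by the attenuation factor
    return g
```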
And step S154, obtaining a target ultrasonic image according to the gradient image.
In this embodiment, after the gradient image is obtained, it needs to be further processed: the gradient image is binarized to remove the weak-gradient non-edge regions and obtain an edge contour image, and the target ultrasound image is then determined from the edge contour image. In other words, the non-edge information with lower gradient values in the gradient image is discarded to obtain an edge image retaining the higher gradient values.
In an embodiment, the step of obtaining the target ultrasound image according to the gradient image includes:
step S1541, converting the gradient image into a binary image;
step S1542, determining the pixels with gradient values larger than a preset threshold value in the binarized image as edge pixels, and determining the pixels with gradient values smaller than the preset threshold value in the binarized image as non-edge pixels;
step S1543, extracting the edge pixel, and determining the target ultrasound image according to the edge pixel.
In this embodiment, the following formula is adopted to perform edge extraction, so as to obtain an edge profile image:
$$B(x, y) = \begin{cases} 255, & G(x, y) > TH \\ 0, & G(x, y) \le TH \end{cases}$$

where $TH$ is a preset threshold, which can be set according to actual conditions and needs; pixels with $B(x, y) = 255$ are taken as the edge pixels.
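A minimal sketch of this binarization, assuming the 8-bit convention that edge pixels become 255 and non-edge pixels become 0:

```python
import numpy as np

def binarize_edges(gradient, th):
    # Pixels above the preset threshold TH become edge pixels (255);
    # the rest become non-edge pixels (0).
    return np.where(gradient > th, 255, 0).astype(np.uint8)
```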
In the technical scheme of the embodiment, the edge information of the focus in the filtered ultrasound image is enhanced by adopting the multidirectional Sobel operator, so that the deep learning model is promoted to learn more feature information of the focus in the breast ultrasound image after preprocessing, the influence of wrong information on the deep learning is reduced, and the prediction performance of the deep neural network model is further improved.
The sixth embodiment:
After step S150 of the first embodiment of the present application, that is, after the target ultrasound image is obtained, the deep neural network model may be trained with the target ultrasound image, so that a target lesion can be determined by the trained deep neural network model. Specifically, a deep neural network model is constructed. The training samples required by the network are labeled breast ultrasound images, and the data can be divided into a training set used in the training process of the network model, a verification set used to select the optimal model according to the model's performance during training optimization, and a test set used to evaluate the model's final lesion recognition and detection performance; here, 8906 images are used for training, 989 for verification, and 448 for testing. The mean and standard deviation of the labeled breast ultrasound image set are obtained. The training data are trained with RetinaNet, a deep-learning-based object detection algorithm, to obtain the weights of the neural network model. With the trained deep neural network model, when an ultrasound image to be detected is input into the model, the target lesion can be determined accurately.
In the technical scheme of the embodiment, the ultrasonic image after image preprocessing is adopted to carry out deep neural network model training, so that the focus detection result is more accurate.
The seventh embodiment:
referring to fig. 5, fig. 5 is a flowchart of the ultrasound image preprocessing of the present application. Specifically, a breast ultrasound image set is first input, and each breast ultrasound image in the set is traversed for preprocessing. For each breast ultrasound image, the specific preprocessing procedure is as follows: first, the CLAHE algorithm (contrast-limited adaptive histogram equalization) is adopted to improve the contrast of the breast ultrasound image; then, the improved luminance-adaptive guided filtering is adopted to denoise the breast ultrasound image; next, the multi-directional Sobel operator is applied to enhance the lesion edges; finally, the processed breast ultrasound image is stored. Applying this preprocessing to every breast ultrasound image yields the preprocessed breast ultrasound image set, which is then used to train the deep neural network model, thereby obtaining a deep neural network model with a more accurate lesion prediction result.
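Putting the stages of fig. 5 together, the per-image loop might look like the following sketch, which reuses the hypothetical helper functions introduced in the earlier embodiments (enhance_contrast, guided_filter_params, guided_filter_output, gradient_image, and binarize_edges are illustrative names, and the radius, smoothing factor, and threshold values are assumptions, not values fixed by the patent):

```python
import cv2
import numpy as np

def preprocess(path, r=8, eps=0.01, th=40):
    # 1. CLAHE to improve the breast ultrasound image contrast.
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    enhanced = enhance_contrast(gray)
    # 2. Improved luminance-adaptive guided filtering for denoising;
    #    here the enhanced image serves as its own guide image.
    I = p = enhanced.astype(np.float32) / 255.0
    a, b = guided_filter_params(I, p, r, eps)
    filtered = guided_filter_output(I, p, a, b, r)
    # 3. Multi-directional Sobel operator to enhance lesion edges.
    grad = gradient_image(filtered * 255.0)
    return binarize_edges(grad, th)
```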
While a logical order is shown in the flow chart, in some cases, the steps shown or described may be performed in an order different than that shown or described herein.
Based on the same inventive concept, an embodiment of the present application further provides a computer-readable storage medium. The computer-readable storage medium stores an ultrasound image processing program which, when executed by a processor, implements the steps of the ultrasound image processing method described above and achieves the same technical effects; to avoid repetition, details are not repeated here.
Since the computer-readable storage medium provided in the embodiments of the present application is used to implement the method of those embodiments, those skilled in the art can understand its specific structure and modifications based on the method described herein, so details are not described again. Any computer-readable storage medium that can be used with the method of the embodiments of the present application is intended to fall within the scope of protection of the present application.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, and so on does not indicate any ordering; these words may be interpreted as names.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. An ultrasound image processing method, characterized in that the ultrasound image processing method comprises:
acquiring an ultrasound image to be processed, and performing contrast-limited adaptive histogram equalization processing on the ultrasound image;
acquiring a guide image corresponding to the ultrasound image after the contrast-limited adaptive histogram equalization processing;
determining a first parameter and a second parameter according to the pixel average value of the ultrasound image in a window and the pixel average value of the guide image in the window, wherein the first parameter and the second parameter are parameters of a guided filter function, and the guided filter function is used for performing guided filtering on the ultrasound image after the contrast-limited adaptive histogram equalization processing;
performing guided filtering processing based on the first parameter, the second parameter, the pixel values of the guide image pixel points, and the pixel average value of the ultrasound image, to obtain the ultrasound image after the guided filtering processing;
and performing edge detection processing on the ultrasound image after the guided filtering processing to obtain a target ultrasound image.
2. The ultrasound image processing method according to claim 1, wherein the step of determining the first parameter and the second parameter according to the pixel average value of the ultrasound image in the window and the pixel average value of the guide image in the window comprises:
acquiring the number of pixel points in the window, the pixel values of the ultrasound image pixel points, the pixel values of the guide image pixel points, the variance of the guide image in the window, and a smoothing factor;
determining the first parameter according to the number of pixel points in the window, the pixel values of the ultrasound image pixel points, the pixel values of the guide image pixel points, the variance of the guide image in the window, the smoothing factor, the pixel average value of the ultrasound image in the window, and the pixel average value of the guide image in the window;
and determining the second parameter according to the pixel average value of the ultrasound image in the window, the pixel average value of the guide image in the window, and the first parameter.
3. The ultrasound image processing method according to claim 2, wherein the step of performing guided filtering processing based on the first parameter, the second parameter, the pixel values of the guide image pixel points, and the pixel average value of the ultrasound image to obtain the ultrasound image after the guided filtering processing comprises:
determining a first parameter average value corresponding to the first parameter, and determining a second parameter average value corresponding to the second parameter;
acquiring a first product of the first parameter average value and the pixel values of the guide image pixel points;
determining a sum of the first product and the second parameter average value;
and determining a second product of the sum and the pixel average value of the ultrasound image, and determining the ultrasound image after the guided filtering processing according to the second product.
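Purely for illustration and not part of the claims, a NumPy/OpenCV sketch that follows claims 2 and 3 literally: box-filter window means give the first parameter a and the second parameter b, and the output is (mean of a times the guide image, plus mean of b), multiplied by the window mean of the ultrasound image. The window radius and the smoothing factor eps are assumed values, and the exact normalization used in the embodiment may differ.

```python
import cv2
import numpy as np

def box_mean(x: np.ndarray, r: int) -> np.ndarray:
    # Normalized box filter = window average over a (2r+1)x(2r+1) window.
    return cv2.boxFilter(x, -1, (2 * r + 1, 2 * r + 1))

def guided_denoise(p: np.ndarray, I: np.ndarray = None,
                   r: int = 4, eps: float = 1e-2) -> np.ndarray:
    """p: ultrasound image to filter; I: guide image (defaults to p itself)."""
    p = p.astype(np.float32) / 255.0
    I = p if I is None else I.astype(np.float32) / 255.0
    mu_I, mu_p = box_mean(I, r), box_mean(p, r)
    var_I = box_mean(I * I, r) - mu_I * mu_I          # variance of the guide
    cov_Ip = box_mean(I * p, r) - mu_I * mu_p
    a = cov_Ip / (var_I + eps)                        # first parameter (claim 2)
    b = mu_p - a * mu_I                               # second parameter (claim 2)
    # Claim 3, read literally: (mean of a) * guide + (mean of b), then
    # multiplied by the window mean of the ultrasound image.
    q = (box_mean(a, r) * I + box_mean(b, r)) * mu_p
    return np.clip(q * 255.0, 0.0, 255.0).astype(np.uint8)
```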
4. The ultrasound image processing method according to claim 1, wherein the step of performing contrast-limited adaptive histogram equalization processing on the ultrasound image comprises:
evenly dividing the ultrasound image into a plurality of sub-regions of a preset size;
determining a contrast-limited histogram for each sub-region;
and performing histogram equalization processing on the contrast-limited histogram of each sub-region.
5. The ultrasound image processing method according to claim 4, wherein the step of determining a contrast-limited histogram for each sub-region comprises:
drawing a gray level histogram of each sub-region according to the number of gray levels in the sub-region and the number of pixels at each gray level;
determining the average number of pixels per gray level in the gray level histogram of each sub-region according to the number of gray levels in the sub-region and the number of pixels in the sub-region;
determining a clipping threshold for each gray level in the gray level histogram of each sub-region according to the average number of pixels per gray level and a preset clipping coefficient;
determining the total number of pixels clipped from all gray levels in the gray level histogram of each sub-region according to the clipping threshold of each gray level;
calculating the average number of pixels to be redistributed to each gray level in the gray level histogram of each sub-region according to the total number of clipped pixels and the number of gray levels in the sub-region;
and performing pixel redistribution according to the average number of redistributed pixels per gray level and the clipping threshold of each gray level, to obtain the contrast-limited gray level histogram of each sub-region.
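For illustration and not part of the claims, a short sketch of the clip-and-redistribute computation of claim 5, assuming 256 gray levels per sub-region; clip_coeff stands for the preset clipping coefficient and its default is an assumed value.

```python
import numpy as np

def limited_contrast_hist(region: np.ndarray, clip_coeff: float = 3.0,
                          levels: int = 256) -> np.ndarray:
    """region: one uint8 sub-region of the ultrasound image."""
    hist = np.bincount(region.ravel(), minlength=levels).astype(np.float64)
    avg = region.size / levels                  # average pixels per gray level
    clip_limit = clip_coeff * avg               # clipping threshold per level
    excess = np.maximum(hist - clip_limit, 0.0).sum()  # total clipped pixels
    hist = np.minimum(hist, clip_limit)
    hist += excess / levels                     # redistribute evenly
    return hist
```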
6. The ultrasound image processing method according to claim 1, wherein the step of performing edge detection processing on the ultrasound image after the guided filtering processing to obtain the target ultrasound image comprises:
acquiring convolution templates in at least one direction;
performing a convolution operation between the convolution template in each direction and the ultrasound image after the guided filtering processing, to obtain the gradient value of the image pixel corresponding to the center point of each convolution template;
determining a gradient image according to the gradient values of the image pixels corresponding to the center points of the convolution templates;
and obtaining the target ultrasound image according to the gradient image.
7. The ultrasound image processing method according to claim 6, wherein the step of obtaining the target ultrasound image according to the gradient image comprises:
converting the gradient image into a binary image;
determining pixels whose gradient values are larger than a preset threshold in the binary image as edge pixels, and determining pixels whose gradient values are smaller than the preset threshold as non-edge pixels;
and extracting the edge pixels, and determining the target ultrasound image according to the edge pixels.
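As a small illustration of claim 7, the gradient image can be thresholded into edge and non-edge pixels in one step; the threshold value used here is an assumed parameter.

```python
import numpy as np

def extract_edges(gradient: np.ndarray, threshold: float = 50.0) -> np.ndarray:
    """255 where the gradient exceeds the threshold (edge pixel), 0 otherwise."""
    return np.where(gradient > threshold, 255, 0).astype(np.uint8)
```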
8. The ultrasound image processing method according to claim 1, wherein after the step of performing edge detection processing on the ultrasound image after the guided filtering processing to obtain the target ultrasound image, the method further comprises:
training a deep neural network model with the target ultrasound image, so that a target lesion can be determined by the trained deep neural network model.
9. An image processing apparatus, characterized by comprising: a memory, a processor, and an ultrasound image processing program stored in the memory and executable on the processor, wherein the ultrasound image processing program, when executed by the processor, implements the steps of the ultrasound image processing method according to any one of claims 1-8.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores an ultrasound image processing program, and the ultrasound image processing program, when executed by a processor, implements the steps of the ultrasound image processing method according to any one of claims 1-8.
CN202210671375.5A 2022-06-15 2022-06-15 Ultrasonic image processing method, device and computer readable storage medium Active CN114757950B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210671375.5A CN114757950B (en) 2022-06-15 2022-06-15 Ultrasonic image processing method, device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN114757950A (en) 2022-07-15
CN114757950B (en) 2022-11-01

Family

ID=82336939

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210671375.5A Active CN114757950B (en) 2022-06-15 2022-06-15 Ultrasonic image processing method, device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114757950B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115908428A (en) * 2023-03-03 2023-04-04 山东大学齐鲁医院 Image processing method and system for adjusting finger retractor
CN116740115A (en) * 2023-08-14 2023-09-12 国网电商科技有限公司 Image edge detection method and device
CN116934755A (en) * 2023-09-18 2023-10-24 中国人民解放军总医院第八医学中心 Pulmonary tuberculosis CT image enhancement system based on histogram equalization

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030174890A1 (en) * 2002-03-14 2003-09-18 Masaki Yamauchi Image processing device and ultrasonic diagnostic device
CN104318527A (en) * 2014-10-21 2015-01-28 浙江工业大学 Method for de-noising medical ultrasonic image based on wavelet transformation and guide filter
CN104966272A (en) * 2015-05-29 2015-10-07 中国农业大学 Underwater sea cucumber image processing method and system
CN105825484A (en) * 2016-03-23 2016-08-03 华南理工大学 Depth image denoising and enhancing method based on deep learning
CN109492653A (en) * 2018-11-15 2019-03-19 深圳市比邻星精密技术有限公司 Breast lesion volume measuring method, device, computer equipment and storage medium
US20200008784A1 (en) * 2018-07-05 2020-01-09 Hitachi, Ltd. Ultrasonic imaging device and image processing device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN Xiaodong et al.: "Texture preservation of fractional differential weighted guided filtering for ultrasound images" (分数阶微分加权引导滤波对超声图像的纹理保持), Optics and Precision Engineering (光学精密工程) *

Also Published As

Publication number Publication date
CN114757950B (en) 2022-11-01

Similar Documents

Publication Publication Date Title
CN114757950B (en) Ultrasonic image processing method, device and computer readable storage medium
Shen et al. An automated lung segmentation approach using bidirectional chain codes to improve nodule detection accuracy
CN109840913B (en) Method and system for segmenting tumor in mammary X-ray image
CN109363698B (en) Method and device for identifying mammary gland image signs
CN110176010B (en) Image detection method, device, equipment and storage medium
CN108109140A (en) Low Grade Gliomas citric dehydrogenase non-destructive prediction method and system based on deep learning
CN111429451B (en) Medical ultrasonic image segmentation method and device
CN109363697B (en) Method and device for identifying focus of breast image
JP2009541838A (en) Method, system and computer program for determining a threshold in an image including image values
CN108364297B (en) Blood vessel image segmentation method, terminal and storage medium
CN113706473B (en) Method for determining long and short axes of focus area in ultrasonic image and ultrasonic equipment
Hamad et al. Brain's tumor edge detection on low contrast medical images
CN105447879A (en) Method and apparatus for detecting breast muscle in breast image
CN112750137A (en) Liver tumor segmentation method and system based on deep learning
CN114332132A (en) Image segmentation method and device and computer equipment
CN112102259A (en) Image segmentation algorithm based on boundary guide depth learning
EP3663982B1 (en) Method to improve the segmentation performance of a computer implemented deep learning algorithm
CN113012127A (en) Cardiothoracic ratio measuring method based on chest medical image
CN111861984B (en) Method and device for determining lung region, computer equipment and storage medium
CN115631194B (en) Method, device, equipment and medium for identifying and detecting intracranial aneurysm
CN113379770B (en) Construction method of nasopharyngeal carcinoma MR image segmentation network, image segmentation method and device
CN112529918B (en) Method, device and equipment for segmenting brain room area in brain CT image
CN111753723B (en) Fingerprint identification method and device based on density calibration
CN114445419A (en) Lung segment segmentation method, device and system based on bronchial topological structure
CN114693671A (en) Lung nodule semi-automatic segmentation method, device, equipment and medium based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant