WO2022249217A1 - A system and method for detecting varicocele using ultrasound images in supine position - Google Patents

A system and method for detecting varicocele using ultrasound images in supine position

Info

Publication number
WO2022249217A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
filter
ultrasound image
applying
interest
Prior art date
Application number
PCT/JO2021/050003
Other languages
French (fr)
Inventor
Omar AL ZOUBI
Mohammad ABU AWAD
Ayman ABDALLAH
Original Assignee
Jordan University Of Science And Technology
Al-Zaytoonah University Of Jordan
Priority date
Filing date
Publication date
Application filed by Jordan University Of Science And Technology, Al-Zaytoonah University Of Jordan filed Critical Jordan University Of Science And Technology
Priority to PCT/JO2021/050003 priority Critical patent/WO2022249217A1/en
Publication of WO2022249217A1 publication Critical patent/WO2022249217A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0891 Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of blood vessels
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/40 Positioning of patients, e.g. means for holding or immobilising parts of the patient's body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B8/5223 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/155 Segmentation; Edge detection involving morphological operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5269 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10132 Ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30101 Blood vessel; Artery; Vein; Vascular

Definitions

  • The present disclosure relates to systems and methods for detecting dilations in the pampiniform plexus, i.e. varicocele, and more particularly to systems and methods for detecting varicocele by using ultrasound images of a patient's testicles taken in the supine position.
  • Varicocele may be defined as abnormal tortuosity or dilatation of the pampiniform venous plexus; it occurs in about 10-15% of men and adolescent males. From a clinical point of view, varicocele presents as a small palpable scrotal mass that is rarely accompanied by moderate pain. Varicocele may also be associated with infertility.
  • Diagnosis of varicocele is usually performed by examining medical images of a patient in three positions taken to the left and right sides of a testicle, namely supine, valsalva, and standing positions.
  • The supine position is the most important of the three, so physicians rely on it primarily, alongside the other positions, to diagnose varicocele.
  • Examination of the medical images, such as ultrasound images, is usually performed manually by physicians.
  • Several attempts have been made in the art to develop systems and methods that proceduralize the detection of varicocele.
  • the United States patent application publication US20190392953 discloses a computerized system and method for skin lesion diagnosis.
  • the user enters one or more images of the lesion and answers an online questionnaire.
  • Using image analysis techniques, information from the questionnaire, information from the user profile and, optionally, additional external information, the system makes a diagnosis of the lesion.
  • The Russian patent application publication RU2177727 discloses a method for diagnosing cases of relapsing varicocele by determining the blood circulation flow rate in the spermatic vein using real-time ultrasonic scanning. The maximum difference in blood circulation flow rate between the injured and healthy veins is determined. If the difference is greater than 50%, a relapsing varicocele diagnosis is set.
  • The Russian patent application publication RU2205598 discloses a method of predicting lesions of the renal vein in varicocele through introduction of a physiological solution into a catheter's lumen at a predefined rate until the left renal vein is filled along its entire length.
  • The physician should detect the development of defects in the left renal vein and establish the presence of stenosis and its character.
  • If the vein widens, the physician should diagnose functional stenosis; if no widening occurs, organic stenosis should be diagnosed.
  • aspects of the present disclosure provide a method for detecting varicocele in a patient using at least one two-dimensional ultrasound image, the method may include a pre-processing stage, a processing stage and a detection stage, wherein the detection stage may include the steps of:
  • Applying, by a detection unit, a Canny edge detection method on a region of interest of the at least one ultrasound image to detect edges of objects found within the region of interest;
  • the pre-processing stage may include the steps of:
  • Segmenting, by a segmentation unit, the at least one ultrasound image after applying the first filter, in order to extract a region of interest in the at least one ultrasound image.
  • the suitable color space may be YCbCr color space yellow channel.
  • the first filter may be a median filter configured to set and smoothen the edges of objects contained in the at least one ultrasound image, and to reduce impulsive noise in the at least one ultrasound image while keeping image edges and lines.
  • the segmentation unit may perform Otsu Segmentation using a suitable segmentation parameter to the at least one ultrasound image to extract the region of interest, and wherein the region of interest may contain veins of a patient’s testicle.
  • The Otsu segmentation may have a segmentation parameter of about 0.1.
  • the processing stage may include applying, by a second filtering unit, a plurality of de-noising and smoothing filters to the at least one ultrasound image after the pre-processing stage.
  • The plurality of de-noising and smoothing filters may include a Histogram filter, a Wiener filter, a Clahe filter, a Gaussian filter, and an Anisotropic Diffusion filter.
  • the Canny detection method may be achieved by the following steps:
  • the Canny detection method may have thresholding values ranging from 0 to about 150.
  • The system may include a first filtering unit configured to apply a first filter to the at least one ultrasound image; an image conversion unit configured to convert a color space of the at least one ultrasound image into a suitable color space; an image segmentation unit configured to extract a region of interest containing veins of the patient's testicle; a second filtering unit configured to apply a plurality of de-noising and smoothing filters to the region of interest; a detection unit configured to detect edges of objects contained in the region of interest; a processing unit configured to identify a roundness metric, to find the Euclidean distance between two points at the edges of each object contained in the region of interest, and to determine whether the patient has varicocele; and a display to display the result.
  • the suitable color space may be YCbCr color space yellow channel
  • the first filter may be a median filter.
  • the segmentation unit may be configured to perform Otsu Segmentation using a suitable segmentation parameter.
  • The Otsu segmentation may have a segmentation parameter of about 0.1.
  • The plurality of de-noising and smoothing filters may include a Histogram filter, a Wiener filter, a Clahe filter, a Gaussian filter, and an Anisotropic Diffusion filter.
  • The detection unit may be configured to apply the Canny detection method to the extracted region of interest using a suitable thresholding value.
  • the suitable thresholding value may range from 0 to about 150.
  • FIG. 1 illustrates a flowchart of a method for diagnosing varicocele using ultrasound images of a patient’s testicles in supine position, the method being configured in accordance with embodiments of the present disclosure.
  • FIG. 2 illustrates a flowchart of a pre-processing stage of a method for diagnosing varicocele using ultrasound images of a patient’s testicles in supine position, the method being configured in accordance with embodiments of the present disclosure.
  • FIG. 3 illustrates a flowchart of a processing stage of a method for diagnosing varicocele using ultrasound images of a patient's testicles in supine position, the method being configured in accordance with embodiments of the present disclosure.
  • FIG. 4 illustrates a flowchart of a detection stage of a method for diagnosing varicocele using ultrasound images of a patient's testicles in supine position, the method being configured in accordance with embodiments of the present disclosure.
  • FIG. 5A illustrates an original ultrasound image of a patient's testicle having varicocele.
  • FIG. 5B illustrates the ultrasound image of FIG. 5A after converting the color space of such image to YCbCr color space Yellow channel.
  • FIG. 5C illustrates the ultrasound image of FIG. 5B after applying a first filter to such image, the filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 5D illustrates the ultrasound image of FIG. 5C after segmentation, the segmentation being performed in accordance with embodiments of the present disclosure.
  • FIG. 5E illustrates the ultrasound image of FIG. 5D after removing background and unnecessary information using region property function, the background and unnecessary information being removed in accordance with embodiments of the present disclosure.
  • FIG. 5F illustrates the ultrasound image after multiplying FIG. 5C with FIG. 5E.
  • FIG. 5G illustrates the ultrasound image of FIG. 5F after applying a Histogram Filter, the Histogram Filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 5H illustrates the ultrasound image of FIG. 5G after applying a Wiener Filter, the Wiener Filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 5I illustrates the ultrasound image of FIG. 5H after applying a Clahe Filter, the Clahe Filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 5J illustrates the ultrasound image of FIG. 5I after applying a Gaussian Filter, the Gaussian Filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 5K illustrates the ultrasound image of FIG. 5J after applying an Anisotropic Filter, the Anisotropic Filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 5L illustrates the ultrasound image of FIG. 5K after applying a Canny Edge Detector followed by a Binary Gradient Mask, the Binary Gradient Mask being configured in accordance with embodiments of the present disclosure.
  • FIG. 5M illustrates the ultrasound image of FIG. 5L after applying a Dilated Gradient Mask, the Dilated Gradient Mask being configured in accordance with embodiments of the present disclosure.
  • FIG. 5N illustrates the ultrasound image of FIG. 5M after applying an image closing function, the image closing function being configured in accordance with embodiments of the present disclosure.
  • FIG. 5O illustrates the ultrasound image of FIG. 5N after applying an inverse image closing function, the inverse image closing function being configured in accordance with embodiments of the present disclosure.
  • FIG. 5P illustrates the ultrasound image of FIG. 5O after multiplying it with the image of FIG. 5K.
  • FIG. 5Q illustrates the ultrasound image of FIG. 5P after thresholding in accordance with embodiments of the present disclosure.
  • FIG. 5R illustrates the ultrasound image of FIG. 5Q after determining boundaries around detected veins in accordance with embodiments of the present disclosure.
  • FIG. 5S illustrates the ultrasound image of FIG. 5R after applying a roundness metric in accordance with embodiments of the present disclosure.
  • FIG. 5T illustrates the ultrasound image of FIG. 5S after determining the Euclidean distance for all detected veins in accordance with embodiments of the present disclosure.
  • FIG. 5U illustrates the ultrasound image of FIG. 5T after highlighting veins having a Euclidean distance within Euclidean range in accordance with embodiments of the present disclosure.
  • FIG. 5V illustrates the ultrasound image of FIG. 5U after detecting varicocele veins in accordance with embodiments of the present disclosure.
  • FIG. 5W illustrates the ground truth image of the ultrasound image of FIG. 5V.
  • FIG. 6A illustrates an original ultrasound image of a normal testicle.
  • FIG. 6B illustrates the ultrasound image of FIG. 6A after converting the color space of such image to YCbCr color space Yellow channel.
  • FIG. 6C illustrates the ultrasound image of FIG. 6B after applying a first filter to such image, the filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 6D illustrates the ultrasound image of FIG. 6C after segmentation, the segmentation being performed in accordance with embodiments of the present disclosure.
  • FIG. 6E illustrates the ultrasound image of FIG. 6D after removing background and unnecessary information using region property function, the background and unnecessary information being removed in accordance with embodiments of the present disclosure.
  • FIG. 6F illustrates the ultrasound image after multiplying FIG. 6C with FIG. 6E.
  • FIG. 6G illustrates the ultrasound image of FIG. 6F after applying a Histogram Filter, the Histogram Filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 6H illustrates the ultrasound image of FIG. 6G after applying a Wiener Filter, the Wiener Filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 6I illustrates the ultrasound image of FIG. 6H after applying a Clahe Filter, the Clahe Filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 6J illustrates the ultrasound image of FIG. 6I after applying a Gaussian Filter, the Gaussian Filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 6K illustrates the ultrasound image of FIG. 6J after applying an Anisotropic Filter, the Anisotropic Filter being configured in accordance with embodiments of the present disclosure.
  • FIG. 6L illustrates the ultrasound image of FIG. 6K after applying a Canny Edge Detector followed by a Binary Gradient Mask, the Binary Gradient Mask being configured in accordance with embodiments of the present disclosure.
  • FIG. 6M illustrates the ultrasound image of FIG. 6L after applying a Dilated Gradient Mask, the Dilated Gradient Mask being configured in accordance with embodiments of the present disclosure.
  • FIG. 6N illustrates the ultrasound image of FIG. 6M after applying an image closing function, the image closing function being configured in accordance with embodiments of the present disclosure.
  • FIG. 6O illustrates the ultrasound image of FIG. 6N after applying an inverse image closing function, the inverse image closing function being configured in accordance with embodiments of the present disclosure.
  • FIG. 6P illustrates the ultrasound image of FIG. 6O after multiplying it with the image of FIG. 6K.
  • FIG. 6Q illustrates the ultrasound image of FIG. 6P after thresholding in accordance with embodiments of the present disclosure.
  • FIG. 6R illustrates the ultrasound image of FIG. 6Q after determining boundaries around detected veins in accordance with embodiments of the present disclosure.
  • FIG. 6S illustrates the ultrasound image of FIG. 6R after applying a roundness metric in accordance with embodiments of the present disclosure.
  • FIG. 6T illustrates the ultrasound image of FIG. 6S after determining the Euclidean distance for all detected veins in accordance with embodiments of the present disclosure.
  • FIG. 6U illustrates the ultrasound image of FIG. 6T after highlighting veins having a Euclidean distance within Euclidean range in accordance with embodiments of the present disclosure.
  • FIG. 6V illustrates the ultrasound image of FIG. 6U after detecting varicocele veins in accordance with embodiments of the present disclosure; the image shows that no varicocele is present.
  • FIG. 6W illustrates the ground truth image of the ultrasound image of FIG. 6V.
  • FIG. 7 illustrates a block diagram of a system for detecting varicocele based on at least one two-dimensional ultrasound image of a patient’s testicle, the system being configured in accordance with embodiments of the present disclosure.
  • FIGS. 1-4 illustrate a method for diagnosing varicocele using two-dimensional ("2-D") ultrasound images of a patient's testicles in supine position.
  • the method may include a pre-processing stage (process block 1-1) to extract at least one region of interest in the at least one ultrasound image to be examined, the at least one region of interest may contain multiple objects; a processing stage (process block 1-2) to clarify the at least one ultrasound image; and a detection stage (process block 1-3).
  • The pre-processing stage may include providing at least one 2-D ultrasound image of a patient's testicle taken in a supine position (process block 2-1), followed by applying a first filter using a first filtering unit to the provided ultrasound images (process block 2-3), and segmenting the at least one ultrasound image using an image segmentation unit (process block 2-4).
  • FIGS. 5A, 5B illustrate ultrasound images of a patient’s testicle having varicocele before and after converting the color space of the image to YCbCr color space Yellow channel in accordance with embodiments of the present disclosure, respectively.
  • FIGS. 6A, 6B illustrate ultrasound images of a normal testicle before and after converting the color space of the image to YCbCr color space Yellow channel in accordance with embodiments of the present disclosure, respectively.
  • the first filter may be a conventional median filter configured to set and smoothen the edges of the objects contained in the ultrasound images, and to reduce impulsive noise in the image while keeping the useful features and image edges and lines.
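The median filtering described above can be sketched as follows; this is a minimal illustration only, and the 3x3 window with edge padding is an assumption, since the patent does not specify the kernel size:

```python
import numpy as np

def median_filter(img, k=3):
    """k x k median filter: replaces each pixel by the median of its
    neighbourhood, suppressing impulsive (salt-and-pepper) noise while
    preserving edges and lines."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

# A single impulsive-noise pixel in a flat region is removed:
img = np.full((5, 5), 10, dtype=np.uint8)
img[2, 2] = 255
filtered = median_filter(img)   # filtered[2, 2] == 10
```

Unlike a mean filter, the median never invents intermediate gray levels, which is why edges stay sharp.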
  • FIGS. 5C, 6C illustrate ultrasound images of FIGS. 5B, 6B after applying the median filter according to the embodiments of the present disclosure, respectively.
  • the image segmentation unit in embodiments of the present disclosure may be configured to perform image thresholding and remove unneeded parts from the image while keeping the region of interest, i.e. the testicle.
  • The unneeded parts may include the background and any text contained in the ultrasound image.
  • the image segmentation unit in embodiments of the present disclosure may perform conventional Otsu Segmentation using a suitable segmentation parameter, preferably 0.1.
  • The image to be segmented contains two classes of pixels, e.g., foreground and background, each class having a different color.
  • The Otsu method finds a robust threshold by minimizing the conflict between the regions outside and inside the object. It finds the threshold parameter that minimizes the intra-class variance, defined as a weighted sum of the variances of the two classes, as per the following formula: $\sigma_w^2(t) = w_1(t)\,\sigma_1^2(t) + w_2(t)\,\sigma_2^2(t)$
  • wherein $w_1$ and $w_2$ represent the probabilities of the two classes separated by a threshold parameter $t$; and
  • $\sigma_1^2$ and $\sigma_2^2$ are the variances of these two classes.
  • Otsu segmentation depends on the fact that minimization of the intra-class variance is equivalent to maximization of the inter-class variance, as per the following formula: $\sigma_b^2(t) = \sigma^2 - \sigma_w^2(t) = w_1(t)\,w_2(t)\,[\mu_1(t) - \mu_2(t)]^2$, wherein $\sigma_b^2$ represents the inter-class variance; $\sigma^2$ represents the total variance; $\sigma_w^2$ represents the intra-class variance; $t$ represents a thresholding parameter; $w_1$ and $w_2$ represent the probabilities of the two classes separated by the threshold parameter, respectively; and $\mu_1$ and $\mu_2$ represent the means of the two classes, respectively.
  • The class probability $w_1(t)$ and class mean $\mu_1(t)$ can then be identified from the threshold $t$ according to the following formulas: $w_1(t) = \sum_{i=0}^{t-1} P(i)$ and $\mu_1(t) = \frac{1}{w_1(t)} \sum_{i=0}^{t-1} P(i)\,x(i)$
  • wherein $P(i)$ represents the probability of gray level $i$; and
  • $x(i)$ represents the value at the center of the $i$-th histogram bin.
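The Otsu thresholding described above can be sketched as follows; the 256-level histogram and the exhaustive search over all thresholds are assumptions for illustration, not the patent's exact implementation:

```python
import numpy as np

def otsu_threshold(img, bins=256):
    """Exhaustive search for the threshold t maximizing the inter-class
    variance w1(t) * w2(t) * (mu1(t) - mu2(t))^2, which is equivalent
    to minimizing the intra-class variance."""
    hist, _ = np.histogram(img, bins=bins, range=(0, bins))
    p = hist / hist.sum()              # P(i): probability of gray level i
    x = np.arange(bins)                # x(i): histogram bin centers
    best_t, best_var = 0, -1.0
    for t in range(1, bins):
        w1, w2 = p[:t].sum(), p[t:].sum()
        if w1 == 0 or w2 == 0:
            continue                   # a class is empty at this split
        mu1 = (p[:t] * x[:t]).sum() / w1
        mu2 = (p[t:] * x[t:]).sum() / w2
        var_b = w1 * w2 * (mu1 - mu2) ** 2
        if var_b > best_var:
            best_t, best_var = t, var_b
    return best_t

# Two well-separated pixel populations: the threshold lands between them.
img = np.array([10] * 50 + [200] * 50)
t = otsu_threshold(img)
```

In practice the segmentation parameter mentioned above would further tune how the resulting threshold is applied.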
  • FIGS. 5D, 6D illustrate the ultrasound images after applying the first filter and segmentation according to the embodiments of the present disclosure.
  • the removal of unneeded parts that contain unnecessary information in the image may be performed by using a region property function.
  • FIG. 5E illustrates an ultrasound image of a patient's testicle having varicocele after removing background and unnecessary information using the region property function;
  • FIG. 6E illustrates an ultrasound image of a normal testicle after removing background and unnecessary information using the region property function.
  • FIGS. 5E, 6E are multiplied with FIGS. 5C, 6C, respectively.
  • This multiplication preserves the parts of the ultrasound image that are present in both images before and after performing the region property function, and deletes the parts that are not found in both.
  • FIG. 5F illustrates an ultrasound image of a patient’s testicle after multiplication of FIG. 5E with FIG. 5C
  • FIG. 6F illustrates an ultrasound image of a normal testicle after multiplication of FIG. 6E with FIG. 6C.
  • the processing stage may include applying, by the second filtering unit, a plurality of de-noising and smoothing filters (process blocks 3-1, 3-2, 3-3, 3-4, 3-5).
  • The plurality of de-noising and smoothing filters may include a Histogram filter (process block 3-1), a Wiener filter (process block 3-2), a Clahe filter (process block 3-3), a Gaussian filter (process block 3-4), and an Anisotropic Diffusion filter (process block 3-5).
  • FIGS. 5G-5K illustrate ultrasound images of a patient's testicle having varicocele after applying said filters, respectively;
  • FIGS. 6G-6K illustrate ultrasound images of a normal testicle after applying the said filters, respectively.
  • The Histogram Filter (process block 3-1) is a type of Bayes filter that represents a belief as a histogram.
  • the histogram gives the information about the gray-levels of pixels in the at least one ultrasound image.
  • The histogram is represented as an array over the pixel intensities of the image.
  • The intensity values in a grayscale image range from 0 to 255; 0 and 255 represent black and white, respectively, and the values in between give different shades of gray, such as dark gray or light gray.
  • A histogram of a grayscale image counts the number of occurrences of each intensity value. The following formula illustrates the histogram filter: $h(c_i) = n_i$
  • wherein $h(c_i)$ is the Histogram Filter;
  • $c_i$ is the $i$-th gray level of the image, for a total of $L$ gray values; and $n_i$ is the number of occurrences of gray level $c_i$ in the image.
  • The histogram is a plot of $h(c_i)$ or $P(c_i)$ versus $c_i$, wherein $P(c_i) = n_i / n$ is the probability of occurrence of gray level $c_i$ and $n$ is the total number of pixels.
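The counts h(c_i) = n_i and probabilities P(c_i) = n_i / n can be computed directly; a minimal sketch, assuming 256 gray levels:

```python
import numpy as np

def gray_histogram(img, levels=256):
    """h(c_i) = n_i (occurrence counts) and P(c_i) = n_i / n
    (occurrence probabilities) over the gray levels of the image."""
    h = np.bincount(img.ravel(), minlength=levels)
    p = h / img.size
    return h, p

img = np.array([[0, 0, 255], [128, 128, 128]], dtype=np.uint8)
h, p = gray_histogram(img)   # h[0] == 2, h[128] == 3, h[255] == 1
```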
  • FIGS. 5G, 6G illustrate ultrasound images of testicles taken in a supine position after applying the Histogram filter to FIGS. 5F, 6F, respectively.
  • The Wiener filter (process block 3-2) is used to remove noise in the at least one ultrasound image, particularly white noise.
  • The filter minimizes the mean-squared error $e^2 = E\{(f - \hat{f})^2\}$, wherein $E$ is the expected value of the argument; $f$ is the at least one ultrasound image; and $\hat{f}$ is the estimate of the at least one ultrasound image.
  • In the frequency domain, the estimate is given by: $\hat{F}(u, v) = \left[ \frac{H^*(u, v)}{|H(u, v)|^2 + S_\eta(u, v)/S_f(u, v)} \right] G(u, v)$
  • wherein $\hat{F}(u, v)$ is the frequency-domain estimate of the image; $H(u, v)$ is the degradation function; $H^*(u, v)$ is the complex conjugate of $H(u, v)$; $S_\eta(u, v)$ is the power spectrum of the noise; $S_f(u, v)$ is the power spectrum of the undegraded image; and $G(u, v)$ is the transform of the degraded (noised) image.
  • FIGS. 5H, 6H illustrate ultrasound images of testicles taken in a supine position after applying the Wiener filter to FIGS. 5G, 6G, respectively.
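The frequency-domain Wiener estimate above can be sketched as follows; replacing the ratio S_eta/S_f by a constant K is an assumption of this sketch (a common simplification), not the patent's exact parameterization:

```python
import numpy as np

def wiener_frequency(g, h_freq, k=0.01):
    """Frequency-domain Wiener estimate
    F_hat(u, v) = conj(H) / (|H|^2 + K) * G(u, v),
    where the constant K stands in for the noise-to-signal power
    ratio S_eta / S_f."""
    G = np.fft.fft2(g)
    F_hat = np.conj(h_freq) / (np.abs(h_freq) ** 2 + k) * G
    return np.real(np.fft.ifft2(F_hat))

# With H = 1 (no blur) the filter reduces to a mild global attenuation:
g = np.random.default_rng(0).normal(100.0, 5.0, (8, 8))
restored = wiener_frequency(g, np.ones((8, 8)), k=0.01)
```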
  • the Clahe filter may improve contrast in the at least one ultrasound image by working on small regions, i.e. tiles, in the image and calculating the contrast transform function for each tile individually.
  • the contrast of the image especially in homogeneous areas, can be limited to avoid amplifying any noise that might be present in the image.
  • The histogram of each tile is first estimated as $p_r(r_k) = n_k / (MN)$, wherein $p_r(r_k)$ is an estimate of the probability of occurrence of intensity level $r_k$ in the image; $[0, L-1]$ is the range of the intensity levels; $n_k$ is the number of pixels with intensity $r_k$; and $MN$ represents the area of the original image.
  • The new distribution of the histogram is then $s_k = (L-1)\sum_{j=0}^{k} p_r(r_j)$, wherein $s_k$ is the new distribution of the histogram; and $p_r(r_j)$ is the estimated probability of occurrence of intensity level $r_j$.
  • The histogram is clipped at $\beta = \frac{MN}{L}\left(1 + \frac{\alpha}{100}(s_{max} - 1)\right)$, wherein $\beta$ is the clip limit; $\alpha$ is the clip factor; and $s_{max}$ is the maximum allowable slope.
  • FIGS. 5I, 6I illustrate ultrasound images of a testicle taken in a supine position after applying the Clahe filter to FIGS. 5H, 6H, respectively.
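The clip-limit formula above is easy to evaluate; a minimal sketch with illustrative tile values (the 8x8 tile, 256 levels, clip factor 100 and max slope 4 are assumptions, not values from the patent):

```python
def clahe_clip_limit(m, n, L, alpha, s_max):
    """Clip limit beta = (M*N / L) * (1 + (alpha / 100) * (s_max - 1)).
    Histogram counts above beta are clipped and redistributed before
    the per-tile equalization, which limits noise amplification in
    homogeneous areas."""
    return (m * n) / L * (1 + (alpha / 100.0) * (s_max - 1))

# An 8x8 tile, L = 256 gray levels, clip factor 100, max slope 4:
beta = clahe_clip_limit(8, 8, 256, 100, 4)
```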
  • The Gaussian filter may separate the roughness and waviness components of the at least one ultrasound image, thus blurring the at least one ultrasound image and removing noise.
  • The Gaussian filter works according to the following function: $G(x, y) = \frac{1}{2\pi\sigma^2} e^{-\frac{x^2 + y^2}{2\sigma^2}}$, wherein $\sigma$ is the standard deviation of the distribution (the distribution is assumed to have a mean of 0); $x$ is the x-direction of the at least one ultrasound image; and $y$ is the y-direction of the at least one ultrasound image.
  • FIGS. 5J, 6J illustrate ultrasound images of a testicle taken in a supine position after applying the Gaussian filter to FIGS. 5I, 6I, respectively.
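Discretizing the Gaussian function above gives the convolution kernel used for the blurring; the 5x5 size and sigma = 1 below are illustrative assumptions:

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """Discrete 2-D Gaussian G(x, y) = exp(-(x^2 + y^2) / (2 sigma^2))
    / (2 pi sigma^2), renormalized so the kernel sums to 1 and does
    not change the overall image brightness."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    g = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2)) / (2.0 * np.pi * sigma ** 2)
    return g / g.sum()

k = gaussian_kernel(5, 1.0)   # peak at the center, symmetric falloff
```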
  • the Anisotropic filter may enhance the resolution of the at least one ultrasound image by using the insertion of a forward-and-backward nonlinear diffusion post-processing which provides suppression of ringing.
  • The diffusion on pixel intensities makes the texture as smooth as possible.
  • Diffusion is prevented from occurring across edges, which keeps the edges in the image as they are.
  • wherein $\nabla I$ represents the image gradient, which is compared against a gradient threshold so that smoothing is suppressed where the gradient is large.
  • FIGS. 5K, 6K illustrate ultrasound images of a testicle taken in a supine position after applying the Anisotropic filter to FIGS. 5J, 6J, respectively.
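The patent does not give the diffusion equations explicitly; the classical Perona-Malik scheme below is one common formulation consistent with the description (smoothing inside regions, suppressed across edges). The iteration count, kappa and step size are illustrative assumptions:

```python
import numpy as np

def perona_malik(img, iterations=10, kappa=20.0, lam=0.2):
    """Classical Perona-Malik anisotropic diffusion: the conduction
    coefficient c = exp(-(grad / kappa)^2) approaches 0 at strong
    edges, so diffusion smooths textures but is prevented across
    edges. Borders wrap via np.roll, a simplification of this sketch."""
    u = img.astype(float).copy()
    for _ in range(iterations):
        # Finite differences toward the four neighbours
        diffs = [np.roll(u, s, axis=a) - u for a in (0, 1) for s in (1, -1)]
        u = u + lam * sum(np.exp(-(d / kappa) ** 2) * d for d in diffs)
    return u

img = np.random.default_rng(1).normal(0.0, 1.0, (16, 16))
smoothed = perona_malik(img)   # noise variance drops, mean is preserved
```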
  • The detection stage may include applying, by a detection unit, a Canny detection method (process block 4-1), applying a binary gradient mask (process block 4-2), applying a dilated gradient mask (process block 4-3), applying image closing (process block 4-4), inversing the image closing (process block 4-5), multiplying the inverse image closing with the Anisotropic image (process block 4-6), thresholding the image using threshold values from about 0 to about 150 (process block 4-7), determining boundaries of objects (process block 4-8), determining a roundness metric (process block 4-9), selecting objects according to the determined roundness metric (process block 4-10), identifying the Euclidean distance between two points on the edges of each of the selected objects (process block 4-11), and selecting objects located within the Euclidean range (process block 4-12).
  • the Canny detection method may include smoothing the at least one ultrasound image by applying a Gaussian filter as previously described in process block 3-4 described above, identifying gradient magnitude and direction, suppressing non-maxima points in the gradient magnitude, and thresholding the results after suppressing non-maxima points.
  • $p_y(x, y)$ is the partial derivative of $f_1(x, y)$ in the direction of $y$;
  • $G(x, y) = \sqrt{p_x^2(x, y) + p_y^2(x, y)}$ is the gradient magnitude at the point $(x, y)$; and
  • $\theta(x, y) = \arctan(p_y / p_x)$ represents the gradient direction of $(p_x, p_y)$.
  • The suppression of non-maxima points in the at least one ultrasound image may be performed in order to obtain a single-pixel edge. This may be achieved by iterating through the image: if the gradient magnitude G(x, y) of a point (x, y) is not greater than the two adjacent interpolated values in the direction of the gradient, the point (x, y) is marked as a non-edge point; otherwise it is an edge point.
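The gradient magnitude and direction that the Canny stage thresholds and suppresses can be sketched as follows; using central finite differences for the partial derivatives is an assumption of this sketch:

```python
import numpy as np

def gradient_mag_dir(img):
    """Gradient magnitude G(x, y) = sqrt(px^2 + py^2) and direction
    theta(x, y) = arctan2(py, px): the quantities used by the Canny
    detector's thresholding and non-maximum suppression."""
    f = img.astype(float)
    px = np.gradient(f, axis=1)   # partial derivative along x (columns)
    py = np.gradient(f, axis=0)   # partial derivative along y (rows)
    return np.hypot(px, py), np.arctan2(py, px)

# A vertical step edge: the magnitude peaks at the step and the
# direction there is 0 (the gradient points along +x).
img = np.zeros((5, 6))
img[:, 3:] = 100.0
mag, theta = gradient_mag_dir(img)
```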
  • FIGS. 5L, 6L illustrate ultrasound images of testicles taken in a supine position after applying binary gradient mask
  • FIGS. 5M, 6M illustrate ultrasound images of testicles taken in a supine position after applying dilated gradient mask.
  • FIGS. 5N, 6N illustrate ultrasound images of testicles taken in a supine position after image closing of FIGS. 5M, 6M, respectively;
  • FIGS. 5O, 6O illustrate ultrasound images of testicles taken in a supine position after applying inverse image closing to FIGS. 5N, 6N, respectively.
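The image closing applied above is morphological dilation followed by erosion, which bridges small gaps in the detected edge contours. A minimal sketch with an assumed 3x3 structuring element (np.roll wraps at the borders, a simplification):

```python
import numpy as np

def dilate(b):
    """Binary dilation with a 3x3 square structuring element."""
    out = np.zeros_like(b)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out |= np.roll(np.roll(b, di, axis=0), dj, axis=1)
    return out

def erode(b):
    """Binary erosion with a 3x3 square structuring element."""
    out = np.ones_like(b)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out &= np.roll(np.roll(b, di, axis=0), dj, axis=1)
    return out

def close_binary(b):
    """Image closing = dilation followed by erosion."""
    return erode(dilate(b))

# A one-pixel break in an edge contour is bridged by closing:
b = np.zeros((5, 7), dtype=bool)
b[2, :] = True
b[2, 3] = False          # the gap
closed = close_binary(b)
```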
  • the multiplication of the inversed image closing with the Anisotropic image may provide the initial areas of dilated veins within the at least one region of interest of the at least one ultrasound image.
  • FIGS. 5P, 6P illustrate ultrasound images of testicles taken in a supine position after multiplying FIGS. 5O, 6O with their respective Anisotropic images.
  • the threshold value in embodiments of the present disclosure may range from 0 to about 150; this range eliminates non-dilated veins in the at least one ultrasound image (process blocks 4-7, 4-8).
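The thresholding step can be sketched as a band-pass on pixel intensity, zeroing everything outside the 0 to 150 range quoted above (function and parameter names are ours):

```python
def band_threshold(img, lo=0, hi=150):
    """Keep pixel intensities in [lo, hi]; zero values outside the band,
    which the disclosure says correspond to non-dilated veins."""
    return [[v if lo <= v <= hi else 0 for v in row] for row in img]
```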
  • FIGS. 5Q, 6Q illustrate ultrasound images of testicles taken in a supine position after applying thresholding values from about 0 to about 150 to FIGS. 5P, 6P, respectively.
  • FIGS. 5R, 6R illustrate ultrasound images of testicles taken in a supine position after determining boundaries around detected objects of FIGS. 5Q, 6Q, respectively.
  • the region of interest of the at least one ultrasound image may contain edges of different shapes; however, dilated veins usually have circular or even semi-circular shapes in ultrasound imaging.
  • the step of determining a roundness metric (process block 4-9) identifies the circular and non-circular objects according to the following formula: k = 4π · area / perimeter², wherein k is the roundness metric (equal to 1 for a perfect circle); area is the surface area of the object; and perimeter is the perimeter of the object.
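A minimal implementation of this circularity measure; the 4π normalization (which makes k = 1 for a perfect circle) is the standard form of the metric and is our reading of the garbled formula in the source:

```python
import math

def roundness_metric(area, perimeter):
    """Circularity k = 4*pi*area / perimeter**2; equals 1.0 for a perfect
    circle and decreases as the shape becomes less circular."""
    return 4.0 * math.pi * area / perimeter ** 2
```

For a circle of radius r (area πr², perimeter 2πr) the metric is exactly 1; for a unit square (area 1, perimeter 4) it is π/4, roughly 0.785, so a cutoff just below 1 separates round vein cross-sections from other edge shapes.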
  • FIGS. 5S, 6S illustrate ultrasound images of testicles taken in a supine position after applying a roundness metric to FIGS. 5R, 6R, respectively.
  • if the identified Euclidean distance is within the pre-defined range, the vein is a dilated vein, and the patient would be diagnosed as having varicocele; if the Euclidean distance is not within the said range, the patient would not be diagnosed as having varicocele (process block 4-12).
  • FIGS. 5T, 6T illustrate ultrasound images of testicles taken in a supine position after identifying the Euclidean distance between two points on each of the round objects of FIGS. 5S, 6S, respectively.
  • FIGS. 5U, 6U illustrate ultrasound images of testicles taken in a supine position after highlighting the identified objects with a Euclidean distance within a pre-defined Euclidean range from about 23 to about 60 pixels.
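The final decision step reduces to a range test on the Euclidean distance between two edge points. A sketch using the 23 to 60 pixel range from the disclosure (the helper name and point representation are ours):

```python
import math

EUCLIDEAN_MIN, EUCLIDEAN_MAX = 23.0, 60.0   # pre-defined range, in pixels

def is_dilated_vein(p1, p2):
    """Flag a round object as a dilated vein when the Euclidean distance
    between two points on its edge falls inside the pre-defined range."""
    return EUCLIDEAN_MIN <= math.dist(p1, p2) <= EUCLIDEAN_MAX
```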
  • FIGS. 5V, 6V illustrate ultrasound images of testicles taken in a supine position after highlighting detected varicocele.
  • FIGS. 5W, 6W illustrate ground truth ultrasound images of testicles taken in supine position.
  • Embodiments of the present disclosure further provide a system for detecting varicocele depending on at least one two-dimensional ultrasound image of a patient’s testicle, the system may include a pre-processing unit 1 having an image conversion unit 10 configured to convert a color space of the at least one ultrasound image into a suitable color space, such as YCbCr color space yellow channel, a first filtering unit 11 configured to apply a first filter, such as a median filter, to the at least one ultrasound image, and an image segmentation unit 12 configured to extract a region of interest containing veins of the patient’s testicle, for example by applying Otsu segmentation with a segmentation parameter of 0.1.
  • the system may further include a processing unit 2 configured to apply a plurality of de-noising and smoothing filters to the region of interest, the plurality of filters may include a Histogram filter, a Weiner filter, a Clahe filter, a Gaussian filter, and an Anisotropic Diffusion filter; a detection unit 3 configured to detect edges of objects contained in the region of interest by using a suitable edge detection method such as Canny edge detection with a thresholding value ranging from 0 to about 150.
  • the detection unit 3 may also be configured to identify roundness metric by applying a suitable roundness metric formula, to find Euclidean distance between edges of objects present in the region of interest, and to determine if the patient has varicocele or not.
  • the system of the present disclosure may also include a display 4 to display the result.
  • aspects of the present disclosure may be a system, a method, and/or a computer program product at any possible technical detail level of integration. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.), or embodiments combining software and hardware aspects that may all generally be referred to herein as a “circuitry,” “module,” or “system.” Furthermore, aspects of the present disclosure may take the form of a program product embodied in one or more computer readable storage medium(s) having computer readable program code embodied thereon. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
  • the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (“RAM”), a read-only memory (“ROM”), an erasable programmable read-only memory (“EPROM” or Flash memory), a static random access memory (“SRAM”), a portable compact disc read-only memory (“CD-ROM”), a digital video disk (“DVD”), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
  • a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network, and/or a wireless network.
  • the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of aspects of the present disclosure may be assembler instructions, instruction-set-architecture (“ISA”) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages.
  • the computer readable program instructions may execute entirely on the user’s computer, partly on the user’s computer, as a stand-alone software package, partly on the user’s computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer may be connected to the user’s computer through any type of network, including a local area network (“LAN”) or a wide area network (“WAN”), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • electronic circuitry including, for example, programmable logic circuitry, field- programmable gate arrays (“FPGA”), or programmable logic arrays (“PLA”) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus, or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the block diagrams’ block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which includes one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the Figures.
  • two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.


Abstract

There is provided a method for detecting varicocele using at least one ultrasound image of a patient's testicle, the method may include a pre-processing stage, a processing stage and a detection stage. The detection stage may include the steps of applying, by a detection unit, a canny edge detection method on a region of interest of the at least one ultrasound image to detect edges of objects found within the region of interest; determining, by a detection unit, a roundness metric; selecting, by the detection unit, objects having a pre-determined roundness metric; identifying, by the detection unit, Euclidean distance between two points at edges of each of the selected objects individually; and determining, by the detection unit, if the patient has varicocele based on the identified Euclidean distance. The present disclosure also provides a system for executing the method.

Description

A SYSTEM AND METHOD FOR DETECTING VARICOCELE USING ULTRASOUND IMAGES IN SUPINE POSITION
TECHNICAL FIELD
[01] The present disclosure relates to systems and methods for detecting dilations in pampiniform plexus, i.e. varicocele, and more particularly to systems and methods for detecting varicocele by using ultrasound images of a patient’s testicles taken in supine positions.
BACKGROUND INFORMATION
[02] Varicocele may be defined as abnormal tortuosity or dilatation of the pampiniform venous plexus; it occurs in about 10-15% of men and adolescent males. From a clinical point of view, varicocele can present as a palpable, small scrotal mass which is rarely accompanied by moderate pain. Varicocele may also be associated with infertility.
[03] Diagnosis of varicocele is usually performed by examining medical images of a patient in three positions taken on the left and right sides of a testicle, namely the supine, Valsalva, and standing positions. The supine position is the most important among them, so physicians mainly rely on it, alongside the other positions, to diagnose varicocele. Examination of the medical images, such as ultrasound images, is usually performed manually by physicians. However, some attempts have been made in the art to develop systems and methods to proceduralize the detection of varicocele.
[04] For instance, the United States patent application publication US20190392953 discloses a computerized system and method for skin lesion diagnosis. The user enters one or more images of the lesion and answers an online questionnaire. Using image analysis techniques, information from the questionnaire, information from the user profile and optionally additional external information, the system makes a diagnosis of the lesion.
[05] The Russian patent application publication RU2177727 discloses a method for diagnosing relapse in varicocele by determining the blood circulation flow rate in the spermatic vein using a real-time ultrasonic scanning method. The maximum difference in blood circulation flow rate between the injured and healthy vein is determined. If the difference is greater than 50%, a diagnosis of relapsing varicocele is made.
[06] The Russian patent application publication RU2205598 discloses a method of predicting lesions of the renal vein in varicocele through introduction of a physiological solution into a catheter's lumen at a predefined rate until the left renal vein is filled along its entire length. Using a monitor, the physician detects the development of defects in the left renal vein and establishes the presence of stenosis and its character. If the narrowed area of the left renal vein widens, the physician should diagnose functional stenosis; if it does not widen, organic stenosis should be diagnosed.
[07] The prior art systems and methods cited above require intervention by a health care professional or physician in order to be performed; they are therefore subject to human error, and their accuracy is limited.
SUMMARY
[08] Therefore, it is an object of the present disclosure to provide an objective, non-invasive system and method for detecting and diagnosing varicocele based on ultrasound images automatically, without the need for intervention of healthcare professionals or users.
[09] Aspects of the present disclosure provide a method for detecting varicocele in a patient using at least one two-dimensional ultrasound image, the method may include a pre-processing stage, a processing stage and a detection stage, wherein the detection stage may include the steps of:
Applying, by a detection unit, a canny edge detection method on a region of interest of the at least one ultrasound image to detect edges of objects found within the region of interest;
Determining, by a detection unit, a roundness metric;
Selecting, by the detection unit, objects having a pre-determined roundness metric;
Identifying, by the detection unit, Euclidean distance between the edges of each of the selected objects; and Determining, by the detection unit, if the patient has varicocele based on the identified Euclidean distance.
[010] In aspects of the present disclosure, the pre-processing stage may include the steps of:
Providing at least one ultrasound image of a patient’s testicle taken in a supine position;
Converting, by an image conversion unit, a color space of the at least one ultrasound image into a suitable color space;
Applying, by a first filtering unit, a first filter to the at least one ultrasound image after converting it into the suitable color space; and
Segmenting, by a segmentation unit, the at least one ultrasound image after applying the first filter in order to extract a region of interest in the at least one ultrasound image.
[011] In some aspects, the suitable color space may be YCbCr color space yellow channel.
[012] In aspects of the present disclosure, the first filter may be a median filter configured to set and smoothen the edges of objects contained in the at least one ultrasound image, and to reduce impulsive noise in the at least one ultrasound image while keeping image edges and lines.
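To illustrate why a median filter removes impulsive (salt-and-pepper) noise while preserving edges, here is a minimal pure-Python version; the 3×3 window is an assumption, as the disclosure does not state a window size:

```python
from statistics import median

def median_filter_3x3(img):
    """Replace each interior pixel with the median of its 3x3 neighbourhood;
    an isolated impulse cannot survive, but a step edge is preserved."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]            # borders are left unchanged
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = median(window)
    return out
```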
[013] In some aspects, the segmentation unit may perform Otsu Segmentation using a suitable segmentation parameter to the at least one ultrasound image to extract the region of interest, and wherein the region of interest may contain veins of a patient’s testicle.
[014] In some aspects, the Otsu segmentation may have a segmentation parameter of 0.1.
[015] In aspects of the present disclosure, the processing stage may include applying, by a second filtering unit, a plurality of de-noising and smoothing filters to the at least one ultrasound image after the pre-processing stage.
[016] In some aspects, the plurality of de-noising and smoothing filters may include a Histogram filter, a Weiner filter, a Clahe filter, a Gaussian filter, and an Anisotropic Diffusion filter.
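Of the filters listed above, the Gaussian filter is the easiest to sketch: a normalized 2-D kernel is built and then convolved with the image. The size and sigma below are illustrative defaults, not values taken from the disclosure:

```python
import math

def gaussian_kernel(size=5, sigma=1.0):
    """Build a normalized 2-D Gaussian kernel: weights follow
    exp(-(dx^2 + dy^2) / (2*sigma^2)) and sum to 1."""
    c = size // 2
    k = [[math.exp(-((x - c) ** 2 + (y - c) ** 2) / (2.0 * sigma ** 2))
          for x in range(size)] for y in range(size)]
    total = sum(sum(row) for row in k)
    return [[v / total for v in row] for row in k]
```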
[017] In aspects of the present disclosure, the Canny detection method may be achieved by the following steps:
Applying a binary gradient mask to the region of interest of the at least one ultrasound image;
Applying a dilated gradient mask to an image resulting from applying the binary gradient mask;
Applying image closing to an image resulting from applying the dilated gradient mask;
Performing inverse image closing;
Multiplying a resulting image from inverse image closing with an image with applied Anisotropic filter;
Applying thresholding values to the resulting image after multiplication; and
Determining edges of objects in the region of interest in the at least one ultrasound image.
[018] In some aspects, the Canny detection method may have thresholding values ranging from 0 to about 150.
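The dilated-gradient-mask and image-closing steps listed above are standard binary morphology. A minimal sketch with a 3×3 structuring element (the disclosure does not specify the element, so 3×3 is an assumption); closing is implemented as dilation followed by erosion, with erosion expressed through the complement-dilation duality:

```python
def dilate(mask):
    """Binary dilation with a 3x3 structuring element: a pixel becomes 1
    if any pixel in its 3x3 neighbourhood is 1 (thickens edge fragments)."""
    h, w = len(mask), len(mask[0])
    return [[int(any(mask[ny][nx]
                     for ny in range(max(0, y - 1), min(h, y + 2))
                     for nx in range(max(0, x - 1), min(w, x + 2))))
             for x in range(w)] for y in range(h)]

def close_binary(mask):
    """Morphological closing = dilation then erosion; erosion is computed
    as the complement of the dilation of the complement."""
    comp = lambda m: [[1 - v for v in row] for row in m]
    return comp(dilate(comp(dilate(mask))))
```

Closing a mask such as [[1, 1, 0, 1, 1]] fills the one-pixel gap, which is how broken vein outlines become connected regions before the inversion step.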
[019] Other aspects of the present disclosure provide a system for detecting varicocele in a patient based on at least one ultrasound image of a patient's testicle taken in a supine position, the system may include a first filtering unit configured to apply a first filter to the at least one ultrasound image; an image conversion unit configured to convert a color space of the at least one ultrasound image into a suitable color space; an image segmentation unit configured to extract a region of interest containing veins of the patient's testicle; a second filtering unit configured to apply a plurality of de-noising and smoothing filters to the region of interest; a detection unit configured to detect edges of objects contained in the region of interest; a processing unit configured to identify a roundness metric, to find the Euclidean distance between two points at the edges of each object contained in the region of interest, and to determine if the patient has varicocele or not; and a display to display the result.
[020] In some aspects, the suitable color space may be YCbCr color space yellow channel.
[021] In some aspects of the present disclosure, the first filter may be a median filter.
[022] In aspects of the present disclosure, the segmentation unit may be configured to perform Otsu Segmentation using a suitable segmentation parameter.
[023] In some aspects, the Otsu segmentation may have a segmentation parameter of 0.1.
[024] In aspects of the present disclosure, the plurality of de-noising and smoothing filters may include a Histogram filter, a Weiner filter, a Clahe filter, a Gaussian filter, and an Anisotropic Diffusion filter.
[025] In aspects of the present disclosure, the detection unit may be configured to apply the Canny detection method to the extracted region of interest using a suitable thresholding value.
[026] In some aspects, the suitable thresholding value may range from 0 to about 150.
BRIEF DESCRIPTION OF THE DRAWINGS
[027] The disclosure will now be described with reference to the accompanying drawings, without however limiting the scope and spirit of the disclosure thereto, and in which:
[028] FIG. 1 illustrates a flowchart of a method for diagnosing varicocele using ultrasound images of a patient’s testicles in supine position, the method being configured in accordance with embodiments of the present disclosure.
[029] FIG. 2 illustrates a flowchart of a pre-processing stage of a method for diagnosing varicocele using ultrasound images of a patient’s testicles in supine position, the method being configured in accordance with embodiments of the present disclosure.
[030] FIG. 3 illustrates a flowchart of a processing stage of a method for diagnosing varicocele using ultrasound images of a patient's testicles in supine position, the method being configured in accordance with embodiments of the present disclosure.
[031] FIG. 4 illustrates a flowchart of a detection stage of a method for diagnosing varicocele using ultrasound images of a patient's testicles in supine position, the method being configured in accordance with embodiments of the present disclosure.
[032] FIG. 5A illustrates an original ultrasound image of a patient's testicle having varicocele.
[033] FIG. 5B illustrates the ultrasound image of FIG. 5A after converting the color space of such image to YCbCr color space Yellow channel.
[034] FIG. 5C illustrates the ultrasound image of FIG. 5B after applying a first filter to such image, the filter being configured in accordance with embodiments of the present disclosure.
[035] FIG. 5D illustrates the ultrasound image of FIG. 5C after segmentation, the segmentation being performed in accordance with embodiments of the present disclosure.
[036] FIG. 5E illustrates the ultrasound image of FIG. 5D after removing background and unnecessary information using region property function, the background and unnecessary information being removed in accordance with embodiments of the present disclosure.
[037] FIG. 5F illustrates the ultrasound image after multiplying FIG. 5C with FIG. 5E.
[038] FIG. 5G illustrates the ultrasound image of FIG. 5F after applying a Histogram Filter, the Histogram Filter being configured in accordance with embodiments of the present disclosure.
[039] FIG. 5H illustrates the ultrasound image of FIG. 5G after applying a Weiner Filter, the Weiner Filter being configured in accordance with embodiments of the present disclosure.
[040] FIG. 5I illustrates the ultrasound image of FIG. 5H after applying a Clahe Filter, the Clahe Filter being configured in accordance with embodiments of the present disclosure.
[041] FIG. 5J illustrates the ultrasound image of FIG. 5I after applying a Gaussian Filter, the Gaussian Filter being configured in accordance with embodiments of the present disclosure.
[042] FIG. 5K illustrates the ultrasound image of FIG. 5J after applying an Anisotropic Filter, the Anisotropic Filter being configured in accordance with embodiments of the present disclosure.
[043] FIG. 5L illustrates the ultrasound image of FIG. 5K after applying a Canny Edge Detector followed by a Binary Gradient Mask, the Binary Gradient Mask being configured in accordance with embodiments of the present disclosure.
[044] FIG. 5M illustrates the ultrasound image of FIG. 5L after applying a Dilated Gradient Mask, the Dilated Gradient Mask being configured in accordance with embodiments of the present disclosure.
[045] FIG. 5N illustrates the ultrasound image of FIG. 5M after applying an image closing function, the image closing function being configured in accordance with embodiments of the present disclosure.
[046] FIG. 5O illustrates the ultrasound image of FIG. 5N after applying an inverse image closing function, the inverse image closing function being configured in accordance with embodiments of the present disclosure.
[047] FIG. 5P illustrates the ultrasound image of FIG. 5O after multiplying it with the image of FIG. 5K.
[048] FIG. 5Q illustrates the ultrasound image of FIG. 5P after thresholding in accordance with embodiments of the present disclosure.
[049] FIG. 5R illustrates the ultrasound image of FIG. 5Q after determining boundaries around detected veins in accordance with embodiments of the present disclosure.
[050] FIG. 5S illustrates the ultrasound image of FIG. 5R after applying a roundness metric in accordance with embodiments of the present disclosure.
[051] FIG. 5T illustrates the ultrasound image of FIG. 5S after determining the Euclidean distance for all detected veins in accordance with embodiments of the present disclosure.
[052] FIG. 5U illustrates the ultrasound image of FIG. 5T after highlighting veins having a Euclidean distance within Euclidean range in accordance with embodiments of the present disclosure.
[053] FIG. 5V illustrates the ultrasound image of FIG. 5U after detecting varicocele veins in accordance with embodiments of the present disclosure.
[054] FIG. 5W illustrates the ground truth image of the ultrasound image of FIG. 5V.
[055] FIG. 6A illustrates an original ultrasound image of a normal testicle.
[056] FIG. 6B illustrates the ultrasound image of FIG. 6A after converting the color space of such image to YCbCr color space Yellow channel.
[057] FIG. 6C illustrates the ultrasound image of FIG. 6B after applying a first filter to such image, the filter being configured in accordance with embodiments of the present disclosure.
[058] FIG. 6D illustrates the ultrasound image of FIG. 6C after segmentation, the segmentation being performed in accordance with embodiments of the present disclosure.
[059] FIG. 6E illustrates the ultrasound image of FIG. 6D after removing background and unnecessary information using region property function, the background and unnecessary information being removed in accordance with embodiments of the present disclosure.
[060] FIG. 6F illustrates the ultrasound image after multiplying FIG. 6C with FIG. 6E.
[061] FIG. 6G illustrates the ultrasound image of FIG. 6F after applying a Histogram Filter, the Histogram Filter being configured in accordance with embodiments of the present disclosure.
[062] FIG. 6H illustrates the ultrasound image of FIG. 6G after applying a Weiner Filter, the Weiner Filter being configured in accordance with embodiments of the present disclosure.
[063] FIG. 6I illustrates the ultrasound image of FIG. 6H after applying a Clahe Filter, the Clahe Filter being configured in accordance with embodiments of the present disclosure.
[064] FIG. 6J illustrates the ultrasound image of FIG. 6I after applying a Gaussian Filter, the Gaussian Filter being configured in accordance with embodiments of the present disclosure.
[065] FIG. 6K illustrates the ultrasound image of FIG. 6J after applying an Anisotropic Filter, the Anisotropic Filter being configured in accordance with embodiments of the present disclosure.
[066] FIG. 6L illustrates the ultrasound image of FIG. 6K after applying a Canny Edge Detector followed by a Binary Gradient Mask, the Binary Gradient Mask being configured in accordance with embodiments of the present disclosure.
[067] FIG. 6M illustrates the ultrasound image of FIG. 6L after applying a Dilated Gradient Mask, the Dilated Gradient Mask being configured in accordance with embodiments of the present disclosure.
[068] FIG. 6N illustrates the ultrasound image of FIG. 6M after applying an image closing function, the image closing function being configured in accordance with embodiments of the present disclosure.
[069] FIG. 6O illustrates the ultrasound image of FIG. 6N after applying an inverse image closing function, the inverse image closing function being configured in accordance with embodiments of the present disclosure.
[070] FIG. 6P illustrates the ultrasound image of FIG. 6O after multiplying it with the image of FIG. 6K.
[071] FIG. 6Q illustrates the ultrasound image of FIG. 6P after thresholding in accordance with embodiments of the present disclosure.
[072] FIG. 6R illustrates the ultrasound image of FIG. 6Q after determining boundaries around detected veins in accordance with embodiments of the present disclosure.
[073] FIG. 6S illustrates the ultrasound image of FIG. 6R after applying a roundness metric in accordance with embodiments of the present disclosure.
[074] FIG. 6T illustrates the ultrasound image of FIG. 6S after determining the Euclidean distance for all detected veins in accordance with embodiments of the present disclosure.
[075] FIG. 6U illustrates the ultrasound image of FIG. 6T after highlighting veins having a Euclidean distance within Euclidean range in accordance with embodiments of the present disclosure.
[076] FIG. 6V illustrates the ultrasound image of FIG. 6U after detecting varicocele veins in accordance with embodiments of the present disclosure, it is shown in the image that no varicocele is present.
[077] FIG. 6W illustrates the ground truth image of the ultrasound image of FIG. 6V.
[078] FIG. 7 illustrates a block diagram of a system for detecting varicocele based on at least one two-dimensional ultrasound image of a patient’s testicle, the system being configured in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[079] FIGS. 1-4 illustrate a method for diagnosing varicocele using two-dimensional ("2-D") ultrasound images of a patient's testicles in supine position, the method may include a pre-processing stage (process block 1-1) to extract at least one region of interest in the at least one ultrasound image to be examined, the at least one region of interest may contain multiple objects; a processing stage (process block 1-2) to clarify the at least one ultrasound image; and a detection stage (process block 1-3). In embodiments of the present disclosure, the pre-processing stage (process block 1-1) may include providing at least one 2-D ultrasound image of a patient's testicle taken in a supine position (process block 2-1), converting a color space of the at least one ultrasound image into a suitable color space using an image conversion unit (process block 2-2), followed by applying a first filter using a first filtering unit to the provided ultrasound images (process block 2-3), and segmenting the at least one ultrasound image using an image segmentation unit (process block 2-4). FIGS. 5A, 5B illustrate ultrasound images of a patient's testicle having varicocele before and after converting the color space of the image to YCbCr color space Yellow channel in accordance with embodiments of the present disclosure, respectively. FIGS. 6A, 6B illustrate ultrasound images of a normal testicle before and after converting the color space of the image to YCbCr color space Yellow channel in accordance with embodiments of the present disclosure, respectively.
[080] In embodiments of the present disclosure the first filter may be a conventional median filter configured to set and smoothen the edges of the objects contained in the ultrasound images, and to reduce impulsive noise in the image while keeping the useful features and image edges and lines. FIGS. 5C, 6C illustrate ultrasound images of FIGS. 5B, 6B after applying the median filter according to the embodiments of the present disclosure, respectively.
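The median filtering step above can be illustrated with a short pure-Python sketch. This is not the code of the disclosure; the 3×3 window and replicate-border handling are assumptions, since the disclosure does not specify a kernel size:

```python
def median_filter_3x3(img):
    """Apply a 3x3 median filter to a 2-D grayscale image (list of lists).

    Border pixels are handled by replicating the nearest edge value.
    """
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = []
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)  # replicate borders
                    xx = min(max(x + dx, 0), w - 1)
                    window.append(img[yy][xx])
            window.sort()
            out[y][x] = window[4]  # median of the 9 window values
    return out

# A single impulse ("salt" noise) in a flat region is removed entirely,
# while the surrounding intensities are left untouched.
noisy = [[10, 10, 10],
         [10, 255, 10],
         [10, 10, 10]]
print(median_filter_3x3(noisy)[1][1])  # → 10
```

This illustrates why the median filter suppresses impulsive noise while preserving edges: an isolated outlier never reaches the middle of the sorted window.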
[081] The image segmentation unit in embodiments of the present disclosure may be configured to perform image thresholding and remove unneeded parts from the image while keeping the region of interest, i.e. the testicle. The unneeded parts may include the background and any text contained in the ultrasound image. The image segmentation unit in embodiments of the present disclosure may perform conventional Otsu segmentation using a suitable segmentation parameter, preferably 0.1.
[082] According to Otsu segmentation, the image to be segmented contains two classes of pixels, e.g., foreground and background, each class having a different color. The Otsu method finds a robust threshold by minimizing the intra-class variance between the regions outside and inside the object, defined as a weighted sum of the variances of the two classes, as per the following formula:

σw²(t) = w0(t)·σ0²(t) + w1(t)·σ1²(t)

wherein σw²(t) represents the intra-class variance; w0 and w1 represent the probabilities of the two classes separated by a threshold parameter (t); and σ0² and σ1² are the variances of these two classes.

[083] Otsu segmentation depends on the fact that minimizing the intra-class variance is equivalent to maximizing the inter-class variance, as per the following formula:

σb²(t) = σ² − σw²(t) = w1(t)·w2(t)·[μ1(t) − μ2(t)]²

wherein σb²(t) represents the inter-class variance; σ² represents the total variance of the pixel values; σw²(t) represents the intra-class variance; t represents a thresholding parameter; w1 and w2 represent the probabilities of the two classes separated by the threshold parameter, respectively; and μ1 and μ2 represent the means of the two classes, respectively.

[084] The class probability w1(t) can then be identified from the threshold (t) according to the following formula:

w1(t) = Σ P(i), for i = 0, 1, ..., t − 1

wherein P(i) represents the probability of gray level i.

[085] Whereas the class mean μ1(t) can be determined according to the formula:

μ1(t) = [Σ P(i)·x(i)] / w1(t), for i = 0, 1, ..., t − 1

wherein P(i) represents the probability of gray level i; and x(i) represents the value at the center of the i-th histogram bin.
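The Otsu search described above can be sketched in pure Python. This is an illustrative implementation of the standard algorithm (maximizing the inter-class variance w0·w1·(μ0 − μ1)²), not the code of the disclosure; the 16-level bimodal histogram is a made-up example:

```python
def otsu_threshold(hist):
    """Return the threshold t that maximizes inter-class variance.

    hist[i] is the count of pixels having gray level i.
    """
    total = sum(hist)
    sum_all = sum(i * n for i, n in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(len(hist)):
        w0 += hist[t]          # pixels at or below t form class 0
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0
        mu1 = (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2  # inter-class variance
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal histogram: a dark peak at levels 1-3, a bright peak at 12-14.
hist = [0] * 16
for lvl in (1, 2, 3):
    hist[lvl] = 10
for lvl in (12, 13, 14):
    hist[lvl] = 10
t = otsu_threshold(hist)
print(t)  # → 3 (the first level that fully separates the two peaks)
```

Any threshold between the two peaks yields the same separation; the loop keeps the first one because the comparison is strict.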
[086] FIGS. 5D, 6D illustrate the ultrasound images after applying the first filter and segmentation according to the embodiments of the present disclosure.
[087] In embodiments of the present disclosure, the removal of unneeded parts that contain unnecessary information in the image, such as the patient’s name, hospital name, date, and time, may be performed by using a region property function. FIG. 5E illustrates an ultrasound image of a patient’s testicle after removing the background and unnecessary information using the region property function, and FIG. 6E illustrates an ultrasound image of a normal testicle after removing the background and unnecessary information using the region property function.
[088] After removing the unneeded parts and unnecessary information, the resulting images, i.e. FIGS. 5E, 6E, are multiplied with FIGS. 5C, 6C, respectively. This multiplication preserves the parts of the ultrasound image that are present in both ultrasound images before and after performing the region property function, and deletes the parts of the ultrasound images that are not found in both. FIG. 5F illustrates an ultrasound image of a patient’s testicle after multiplication of FIG. 5E with FIG. 5C, and FIG. 6F illustrates an ultrasound image of a normal testicle after multiplication of FIG. 6E with FIG. 6C.
[089] In embodiments of the present disclosure, the processing stage (process block 1-2) may include applying, by a second filtering unit, a plurality of de-noising and smoothing filters (process blocks 3-1, 3-2, 3-3, 3-4, 3-5).
[090] The plurality of de-noising and smoothing filters may include a Histogram filter (process block 3-1), a Wiener filter (process block 3-2), a CLAHE filter (process block 3-3), a Gaussian filter (process block 3-4), and an Anisotropic Diffusion filter (process block 3-5). FIGS. 5G-5K illustrate ultrasound images of a patient’s testicle after applying said filters, respectively, and FIGS. 6G-6K illustrate ultrasound images of a normal testicle after applying said filters, respectively.
[091] The Histogram filter (process block 3-1) is a type of Bayes filter that represents information as a histogram. The histogram gives information about the gray levels of the pixels in the at least one ultrasound image, and is stored as an array. In a grayscale image the intensity value ranges from 0 to 255; the extremes appear as black and white, and the values in between give different shades of gray, such as dark gray or light gray. A histogram of a grayscale image counts the number of occurrences of each intensity value. The formula below illustrates the histogram filter:

h(ci) = ni, for i = 0, 1, 2, ..., L − 1

wherein h(ci) is the histogram function; ci is the i-th gray level out of a total of L gray levels; and ni is the number of occurrences of gray level ci in the image.

The probability of occurrence of gray levels can then be expressed as:

P(ci) = ni / n

wherein n is the total number of pixels in the image. Thus, the histogram is a plot of h(ci) or P(ci) versus ci, wherein P(ci) is the probability of occurrence of gray level ci.
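The histogram counts h(ci) and probabilities P(ci) defined above can be computed in a few lines of Python. This is a generic illustration, not the code of the disclosure; the 2×3 image is a made-up example:

```python
from collections import Counter

def gray_histogram(img, levels=256):
    """Return (h, p): occurrence counts n_i of each gray level c_i and
    the probabilities P(c_i) = n_i / n, where n is the pixel count."""
    counts = Counter(pixel for row in img for pixel in row)
    n = sum(counts.values())
    h = [counts.get(i, 0) for i in range(levels)]
    p = [c / n for c in h]
    return h, p

img = [[0, 0, 128],
       [128, 255, 255]]
h, p = gray_histogram(img)
print(h[0], h[128], h[255])  # → 2 2 2
print(p[128])                # → 0.3333333333333333
```

By construction the probabilities sum to 1, which is what makes the histogram usable as a discrete distribution in later stages.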
[092] FIGS. 5G, 6G illustrate ultrasound images of testicles taken in a supine position after applying the Histogram filter to FIGS. 5F, 6F, respectively.
[093] In embodiments of the present disclosure, the Wiener filter (process block 3-2) is used to remove noise in the at least one ultrasound image, and more particularly white noise. The Wiener filter treats the at least one ultrasound image and the noise as random processes, and the objective is to find an estimate (F̂) of the at least one ultrasound image (F) such that the mean square error between them is minimized, wherein the error measure can be computed according to the following formula:

e² = E{(F − F̂)²}

wherein e² is the minimized error; E is the expected value of the argument; F is the at least one ultrasound image; and F̂ is the estimate of the at least one ultrasound image.
The minimum of the error function is given in the frequency domain by the expression:

F̂(u, v) = [ (1 / H(u, v)) · |H(u, v)|² / (|H(u, v)|² + Sη(u, v) / Sf(u, v)) ] · G(u, v)

wherein F̂(u, v) is the frequency-domain estimate of the image; H(u, v) is the degradation function; Sη(u, v) is the power spectrum of the noise; Sf(u, v) is the power spectrum of the undegraded image; G(u, v) is the transform of the degraded (noised) image; and H*(u, v) is the complex conjugate of H(u, v). Note that (1/H)·|H|² = H*, so the bracketed term can equivalently be written as H*(u, v) / (|H(u, v)|² + Sη(u, v)/Sf(u, v)).
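The Wiener expression above can be evaluated per frequency bin with ordinary complex arithmetic. The sketch below is illustrative only; the degradation function H, degraded-image transform G, and the spectra are made-up values, and the equivalent conjugate form of the bracketed term is used:

```python
def wiener_component(H, G, Sn, Sf):
    """Evaluate F̂(u,v) = [H*(u,v) / (|H(u,v)|² + Sη/Sf)] · G(u,v)
    for a single frequency bin (H, G are complex; Sn, Sf are real)."""
    ratio = Sn / Sf if Sf else float("inf")
    return (H.conjugate() / (abs(H) ** 2 + ratio)) * G

H, G = complex(0.5, 0.25), complex(2.0, -1.0)

# With no noise (Sη = 0) the filter reduces to the inverse filter G/H.
assert abs(wiener_component(H, G, 0.0, 1.0) - G / H) < 1e-12

# With heavy noise the response is driven toward zero: the frequency
# is effectively ignored, exactly as described in the next paragraph.
print(abs(wiener_component(H, G, 1e9, 1.0)))  # a value very near 0
```

The two limiting cases here match the behavior stated in paragraph [095].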
[094] FIGS. 5H, 6H illustrate ultrasound images of testicles taken in a supine position after applying the Wiener filter to FIGS. 5G, 6G, respectively.
[095] In embodiments of the present disclosure, if the noise ratio Sη(u, v)/Sf(u, v) is zero, the noise power spectrum vanishes and the Wiener filter reduces to the inverse filter. If Sη(u, v)/Sf(u, v) is large, the Wiener filter response approaches zero, and that frequency is effectively ignored.
[096] The CLAHE filter (process block 3-3) may improve contrast in the at least one ultrasound image by working on small regions, i.e. tiles, in the image and calculating the contrast transform function for each tile individually. The contrast of the image, especially in homogeneous areas, can be limited to avoid amplifying any noise that might be present in the image. Contrast Limited Adaptive Histogram Equalization is determined based on the formulae shown below:

h(rk) = nk

wherein h(rk) is the histogram function; rk is the k-th intensity value; and nk is the number of pixels in the image with intensity rk.

pr(rk) = nk / MN, for k = 0, 1, 2, ..., L − 1

wherein pr(rk) is an estimate of the probability of occurrence of intensity level rk in the image; [0, L−1] is the range of intensity levels; and MN represents the number of pixels in the original image.

sk = (L − 1) · Σ pr(rj), for j = 0, 1, ..., k

wherein sk is the new (equalized) distribution of the histogram; and pr(rj) is the estimated probability of occurrence of intensity level rj.

β = (MN / L) · (1 + (α / 100) · (smax − 1))

wherein β is the clip limit; α is the clip factor; smax is the maximum allowable slope; and (MN/L) is the clip limit value when α = 0.
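The contrast-limiting step can be sketched as follows. This is an illustration, not the code of the disclosure: the uniform redistribution of the clipped excess is one common scheme, and the tile size, α, and smax values are made-up:

```python
def clip_limit(MN, L, alpha, s_max):
    """β = (MN / L) · (1 + (α / 100) · (smax − 1)), per the formula above."""
    return (MN / L) * (1 + (alpha / 100.0) * (s_max - 1))

def clip_histogram(hist, limit):
    """Clip histogram bins at `limit` and spread the excess uniformly —
    the 'contrast limited' step applied to one tile's histogram."""
    excess = sum(max(0, c - limit) for c in hist)
    clipped = [min(c, limit) for c in hist]
    bonus = excess // len(hist)  # uniform redistribution (integer share)
    return [c + bonus for c in clipped]

beta = clip_limit(MN=64 * 64, L=256, alpha=100, s_max=4)
print(beta)  # → 64.0  (16 · (1 + 1·3))

hist = [200, 10, 5, 1]
print(clip_histogram(hist, 50))  # → [87, 47, 42, 38]
```

Clipping the dominant bin before equalization is what keeps homogeneous areas from having their noise amplified.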
[097] FIGS. 5I, 6I illustrate ultrasound images of a testicle taken in a supine position after applying the CLAHE filter to FIGS. 5H, 6H, respectively.
[098] In embodiments of the present disclosure, the Gaussian filter (process block 3-4) may separate the roughness and waviness components of the at least one ultrasound image, thus blurring the at least one ultrasound image and removing noise. The Gaussian filter works according to the following function:

G(x, y) = (1 / (2πσ²)) · e^(−(x² + y²) / (2σ²))

wherein σ is the standard deviation of the distribution (the distribution is assumed to have a mean of 0); x is the x-direction of the at least one ultrasound image; and y is the y-direction of the at least one ultrasound image.
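A discrete kernel built from the function above can be sketched in pure Python. The 3×3 size and σ = 1 are illustrative choices, not values from the disclosure (the 1/(2πσ²) factor cancels under normalization, so it is omitted):

```python
import math

def gaussian_kernel(size, sigma):
    """Build a normalized size×size kernel sampled from
    G(x, y) ∝ exp(−(x² + y²) / (2σ²)), centered on (0, 0)."""
    r = size // 2
    k = [[math.exp(-(x * x + y * y) / (2 * sigma * sigma))
          for x in range(-r, r + 1)] for y in range(-r, r + 1)]
    s = sum(sum(row) for row in k)
    # Normalize so the weights sum to 1 and filtering preserves brightness.
    return [[v / s for v in row] for row in k]

k = gaussian_kernel(3, 1.0)
print(round(sum(sum(row) for row in k), 6))  # → 1.0
print(k[1][1] > k[0][0])                     # → True (center weight largest)
```

Convolving the image with this kernel is the blurring step; larger σ widens the bell and smooths more aggressively.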
[099] FIGS. 5J, 6J illustrate ultrasound images of a testicle taken in a supine position after applying the Gaussian filter to FIGS. 5I, 6I, respectively.
[0100] The Anisotropic Diffusion filter (process block 3-5) may enhance the resolution of the at least one ultrasound image by inserting forward-and-backward nonlinear diffusion post-processing, which suppresses ringing. Diffusion of the pixel intensities makes the texture as smooth as possible, while a threshold function prevents diffusion across edges, so the edges in the image are preserved. The formula for the Anisotropic Diffusion filter is as follows:

It = div(c(|∇I|) · ∇I)

wherein It is the rate of change of the image intensity; div is the divergence operator; ∇I is the image gradient; |∇I| is the magnitude of the gradient; and c is a decreasing function of the gradient magnitude, such that:

c(s) = 1 / (1 + (s / k)²)

wherein s is the gradient magnitude (slope); and k is a gradient thresholding parameter.
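The diffusion above can be sketched as a Perona–Malik iteration, shown in one dimension for brevity. This is an illustration, not the code of the disclosure; k, the step size λ, and the iteration count are made-up values:

```python
def perona_malik_1d(signal, k=10.0, lam=0.2, steps=20):
    """1-D anisotropic diffusion: I_t = div(c(|∇I|)·∇I),
    with c(s) = 1 / (1 + (s/k)²).

    Small gradients (noise) diffuse freely; gradients much larger than k
    get c ≈ 0, so edges are preserved."""
    s = list(signal)
    for _ in range(steps):
        nxt = s[:]
        for i in range(1, len(s) - 1):
            east = s[i + 1] - s[i]
            west = s[i - 1] - s[i]
            c_e = 1.0 / (1.0 + (east / k) ** 2)
            c_w = 1.0 / (1.0 + (west / k) ** 2)
            nxt[i] = s[i] + lam * (c_e * east + c_w * west)
        s = nxt
    return s

# A step edge (0 → 100) with a small noise bump: the bump is smoothed
# away, while the edge survives because its gradient far exceeds k.
sig = [0, 0, 5, 0, 0, 100, 100, 100]
out = perona_malik_1d(sig)
print(out[2] < 5)            # → True (noise bump reduced)
print(out[5] - out[4] > 80)  # → True (edge largely preserved)
```

The same update applies per pixel in 2-D with four neighbor directions instead of two.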
[0101] FIGS. 5K, 6K illustrate ultrasound images of a testicle taken in a supine position after applying the Anisotropic Diffusion filter to FIGS. 5J, 6J, respectively. [0102] In embodiments of the present disclosure, the detection stage (process block 1-3) may include applying, by a detection unit, a Canny detection method (process block 4-1), applying a binary gradient mask (process block 4-2), applying a dilated gradient mask (process block 4-3), applying image closing (process block 4-4), inversing image closing (process block 4-5), multiplying the inverse image closing with the Anisotropic image (process block 4-6), thresholding the image using threshold values from about 0 to about 150 (process block 4-7), determining boundaries of objects (process block 4-8), determining a roundness metric (process block 4-9), selecting objects according to the determined roundness metric (process block 4-10), identifying the Euclidean distance between two points on the edges of each of the selected objects (process block 4-11), and selecting objects having a Euclidean distance in the range of 23-60 pixels, which represent the final detection (process block 4-12).
[0103] The Canny detection method (process block 4-1) may include smoothing the at least one ultrasound image by applying a Gaussian filter as previously described in process block 3-4, identifying the gradient magnitude and direction, suppressing non-maxima points in the gradient magnitude, and thresholding the results after suppressing non-maxima points.
[0104] The identification of the gradient magnitude and direction in the detection unit may be achieved by finite-difference approximations of the partial derivatives of the smoothed ultrasound image f1(x, y) in the x and y directions, using the first-order finite differences over a 2×2 neighborhood, in accordance with the following formulae:

px(x, y) = (f1(x+1, y) − f1(x, y) + f1(x+1, y+1) − f1(x, y+1)) / 2

py(x, y) = (f1(x, y+1) − f1(x, y) + f1(x+1, y+1) − f1(x+1, y)) / 2

G(x, y) = √([px(x, y)]² + [py(x, y)]²)

θ(x, y) = arctan(py(x, y) / px(x, y))

wherein f(x, y) represents the original image; g(x, y) represents the Gaussian filter; f1(x, y) is the smoothed image; px(x, y) is the partial derivative of f1(x, y) in the x-direction; py(x, y) is the partial derivative of f1(x, y) in the y-direction; G(x, y) is the gradient magnitude at the point (x, y); and θ(x, y) represents the gradient direction of (px, py).
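The finite-difference formulae above translate directly into code. This is an illustrative sketch (the 2×2 test image is made-up), with f1 indexed as f1[y][x]:

```python
import math

def gradient(f1, x, y):
    """2×2 finite-difference gradient of a smoothed image f1, per the
    formulae above: returns (px, py, magnitude G, direction theta)."""
    px = (f1[y][x + 1] - f1[y][x] + f1[y + 1][x + 1] - f1[y + 1][x]) / 2.0
    py = (f1[y + 1][x] - f1[y][x] + f1[y + 1][x + 1] - f1[y][x + 1]) / 2.0
    G = math.hypot(px, py)       # sqrt(px² + py²)
    theta = math.atan2(py, px)   # gradient direction
    return px, py, G, theta

# A vertical step edge: intensity jumps from 0 to 90 along x.
f1 = [[0, 90],
      [0, 90]]
px, py, G, theta = gradient(f1, 0, 0)
print(px, py)  # → 90.0 0.0
print(G)       # → 90.0
print(theta)   # → 0.0 (gradient points along +x, across the edge)
```

Non-maxima suppression then keeps only the pixels where G is a local maximum along the direction theta.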
[0105] In embodiments of the present disclosure, the suppression of non-maxima points in the at least one ultrasound image may be performed in order to obtain a single-pixel-wide edge. This may be achieved by iterating through the image: if the gradient magnitude G(x, y) of a point (x, y) is not greater than the two adjacent interpolated values in the direction of θ(x, y), the point (x, y) is marked as a non-edge point; otherwise it is an edge point.
[0106] The contrast between selected objects in the at least one region of interest and the image background differs greatly, thus the binary gradient mask (process block 4-2) and dilated gradient mask (process block 4-3) help in detecting changes in contrast. FIGS. 5L, 6L illustrate ultrasound images of testicles taken in a supine position after applying the binary gradient mask, and FIGS. 5M, 6M illustrate ultrasound images of testicles taken in a supine position after applying the dilated gradient mask.
[0107] In embodiments of the present disclosure, the application of image closing and inverse image closing (process blocks 4-4, 4-5) helps differentiate the objects in the at least one region of interest from the background of the image, as well as close the selected objects’ boundaries. FIGS. 5N, 6N illustrate ultrasound images of testicles taken in a supine position after image closing of FIGS. 5M, 6M, respectively, and FIGS. 5O, 6O illustrate ultrasound images of testicles taken in a supine position after applying inverse image closing to FIGS. 5N, 6N, respectively.
[0108] Furthermore, the multiplication of the inversed image closing with the Anisotropic image (process block 4-6) may provide the initial areas of dilated veins within the at least one region of interest of the at least one ultrasound image. FIGS. 5P, 6P illustrate ultrasound images of testicles taken in a supine position after multiplying FIGS. 5O, 6O with their respective Anisotropic images.
[0109] In a histogram distribution of the gradient magnitude image after non-maxima suppression, the number of pixels is accumulated in the direction of increasing gradient magnitude. When the accumulated value reaches a pre-determined percentage of the total, the corresponding gradient magnitude is set as the high threshold value. In the gradient magnitude image after non-maxima suppression, if the gradient magnitude of a point is greater than the high threshold value, the point is regarded as an edge. The threshold value in embodiments of the present disclosure may range from 0 to about 150; this range eliminates non-dilated veins in the at least one ultrasound image (process blocks 4-7, 4-8). FIGS. 5Q, 6Q illustrate ultrasound images of testicles taken in a supine position after applying thresholding values from about 0 to about 150 to FIGS. 5P, 6P, respectively, and FIGS. 5R, 6R illustrate ultrasound images of testicles taken in a supine position after determining boundaries around detected objects of FIGS. 5Q, 6Q, respectively.
[0110] After detecting the edges, the region of interest of the at least one ultrasound image may contain edges of different shapes; however, dilated veins usually have circular or even semi-circular shapes in ultrasound imaging. Thus, the step of determining a roundness metric (process block 4-9) identifies the circular and non-circular objects according to the following formula:

k = 4π · area / perimeter²

wherein k is the roundness metric; area is the surface area of the object; and perimeter is the perimeter of the object.

The objects within the region of interest that have a roundness metric from about 0.9 to about 0.96 are considered to be substantially circular or semi-circular, i.e. veins, and are selected (process block 4-10). [0111] FIGS. 5S, 6S illustrate ultrasound images of testicles taken in a supine position after applying the roundness metric to FIGS. 5R, 6R, respectively.
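The roundness metric can be checked against two reference shapes. This is an illustration of the 4π·area/perimeter² formula above, not the code of the disclosure; the radius and rectangle dimensions are arbitrary:

```python
import math

def roundness(area, perimeter):
    """k = 4π·area / perimeter² — equals 1 for a perfect circle and
    drops toward 0 as the shape becomes elongated."""
    return 4 * math.pi * area / perimeter ** 2

# An ideal circle of radius r scores exactly 1.
r = 7.0
print(round(roundness(math.pi * r * r, 2 * math.pi * r), 6))  # → 1.0

# A 4×1 rectangle scores well below the 0.9–0.96 acceptance band,
# so it would be rejected as a non-vein object.
print(roundness(4 * 1, 2 * (4 + 1)) < 0.9)  # → True
```

In practice, discretization of the perimeter on a pixel grid keeps even near-circular objects slightly below 1, which is consistent with an acceptance band below 1 such as 0.9–0.96.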
[0112] In embodiments of the present disclosure, after selecting the objects according to the determined roundness metric, the Euclidean distance between two points at the perimeter of each selected vein is calculated (process block 4-11) using the following formula:

dist = √((x1 − x2)² + (y1 − y2)²)

wherein dist is the Euclidean distance between the edges of the selected vein; x1 is the x-position of the first point at the edge; x2 is the x-position of the second point at the edge; y1 is the y-position of the first point at the edge; and y2 is the y-position of the second point at the edge.

If the Euclidean distance between the first and second points in the same vein is within the range of about 23 pixels to about 60 pixels, then the vein is a dilated vein, and the patient would be diagnosed as having varicocele. But if the Euclidean distance is not within said range, then the patient would not be diagnosed as having varicocele (process block 4-12).
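The final decision rule can be sketched directly from the formula and the 23–60 pixel range. The edge-point coordinates below are made-up examples:

```python
import math

def is_dilated_vein(p1, p2, lo=23.0, hi=60.0):
    """Classify a vein as dilated when the Euclidean distance between
    two edge points falls within [lo, hi] pixels (23–60 per the
    disclosure)."""
    dist = math.dist(p1, p2)  # sqrt((x1 - x2)² + (y1 - y2)²)
    return lo <= dist <= hi

print(is_dilated_vein((10, 10), (40, 50)))  # → True  (distance 50)
print(is_dilated_vein((10, 10), (15, 22)))  # → False (distance 13)
print(is_dilated_vein((0, 0), (60, 80)))    # → False (distance 100)
```

A positive result on any selected vein would correspond to a varicocele diagnosis in the workflow above; `math.dist` requires Python 3.8 or later.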
[0113] FIGS. 5T, 6T illustrate ultrasound images of testicles taken in a supine position after identifying the Euclidean distance between two points on each of the round objects of FIGS. 5S, 6S, respectively.
[0114] FIGS. 5U, 6U illustrate ultrasound images of testicles taken in a supine position after highlighting the identified objects with a Euclidean distance within a pre-defined Euclidean range from about 23 to about 60 pixels.
[0115] FIGS. 5V, 6V illustrate ultrasound images of testicles taken in a supine position after highlighting detected varicocele.
[0116] FIGS. 5W, 6W illustrate ground truth ultrasound images of testicles taken in a supine position. [0117] Embodiments of the present disclosure further provide a system for detecting varicocele based on at least one two-dimensional ultrasound image of a patient’s testicle. The system may include a pre-processing unit 1 having an image conversion unit 10 configured to convert a color space of the at least one ultrasound image into a suitable color space, such as the YCbCr color space yellow channel, a first filtering unit 11 configured to apply a first filter, such as a median filter, to the at least one ultrasound image, and an image segmentation unit 12 configured to extract a region of interest containing veins of the patient’s testicle, for example by applying Otsu segmentation with a segmentation parameter of 0.1. The system may further include a processing unit 2 configured to apply a plurality of de-noising and smoothing filters to the region of interest, the plurality of filters may include a Histogram filter, a Wiener filter, a CLAHE filter, a Gaussian filter, and an Anisotropic Diffusion filter; and a detection unit 3 configured to detect edges of objects contained in the region of interest by using a suitable edge detection method, such as Canny edge detection with a thresholding value ranging from 0 to about 150. The detection unit 3 may also be configured to identify a roundness metric by applying a suitable roundness metric formula, to find the Euclidean distance between edges of objects present in the region of interest, and to determine whether the patient has varicocele. The system of the present disclosure may also include a display 4 to display the result.
[0118] Aspects of the present disclosure may be a system, a method, and/or a computer program product at any possible technical detail level of integration. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.), or embodiments combining software and hardware aspects that may all generally be referred to herein as a “circuitry,” “module,” or “system.” Furthermore, aspects of the present disclosure may take the form of a program product embodied in one or more computer readable storage medium(s) having computer readable program code embodied thereon. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure. (However, any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium.) [0119] The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. 
A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (“RAM”), a read-only memory (“ROM”), an erasable programmable read-only memory (“EPROM” or Flash memory), a static random access memory (“SRAM”), a portable compact disc read-only memory (“CD-ROM”), a digital video disk (“DVD”), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
[0120] Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network, and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
[0121] Computer readable program instructions for carrying out operations of aspects of the present disclosure may be assembler instructions, instruction-set-architecture (“ISA”) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user’s computer, partly on the user’s computer, as a stand-alone software package, partly on the user’s computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user’s computer through any type of network, including a local area network (“LAN”) or a wide area network (“WAN”), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (“FPGA”), or programmable logic arrays (“PLA”) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
[0122] Aspects of the present disclosure have been described herein with reference to flowchart and/or block diagrams of methods, and systems according to embodiments of the present disclosure. It will be understood that some blocks of the block diagrams, and combinations of blocks in the block diagrams, can be implemented by computer readable program instructions.
[0123] These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks. [0124] The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus, or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the block diagrams’ block or blocks.
[0125] The block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which includes one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart diagrams, and combinations of blocks in the block diagrams and/or flowchart diagrams, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
[0126] The use of the word “a” or “an” when used in conjunction with the term “comprising” in the claims and/or the specification may mean “one,” but it is also consistent with the meaning of “one or more,” “at least one,” and “one or more than one.” Throughout this application, the term “about” is used to indicate that a value includes the inherent variation of error for the system, the method being employed to determine the value, or the variation that exists among the study subjects.

CLAIMS

What is claimed is:
1. A method for detecting varicocele in a patient using at least one two-dimensional ultrasound image, the method comprises a pre-processing stage, a processing stage and a detection stage, wherein the detection stage comprises the steps of:
Applying, by a detection unit, a Canny edge detection method on a region of interest of the at least one ultrasound image to detect edges of objects found within the region of interest;
Determining, by the detection unit, a roundness metric;
Selecting, by the detection unit, objects having a pre-determined roundness metric;
Identifying, by the detection unit, Euclidean distance between the edges of each of the selected objects; and
Determining, by the detection unit, if the patient has varicocele based on the identified Euclidean distance.
2. The method of claim 1, wherein the pre-processing stage comprises the steps of:
Providing at least one ultrasound image of a patient’s testicle taken in a supine position;
Converting, by an image conversion unit, a color pattern of the at least one ultrasound image into a suitable color space;
Applying, by a first filtering unit, a first filter to the at least one ultrasound image after converting it into the suitable color space; and
Segmenting, by a segmentation unit, the at least one ultrasound image after applying the first filter in order to extract a region of interest in the at least one ultrasound image.
3. The method of claim 2, wherein the suitable color space comprises YCbCr color space yellow channel.
4. The method of claim 2, wherein the first filter comprises a median filter configured to set and smoothen the edges of objects contained in the at least one ultrasound image, and to reduce impulsive noise in the at least one ultrasound image while keeping image edges and lines.
5. The method of claim 2, wherein the segmentation unit performs Otsu Segmentation using a suitable segmentation parameter to the at least one ultrasound image to extract the region of interest, and wherein the region of interest contains veins of a patient’s testicle.
6. The method of claim 5, wherein Otsu segmentation has a segmentation parameter of 0.1.
7. The method of claim 1, wherein the processing stage comprises applying, by a second filtering unit, a plurality of de-noising and smoothing filters to the at least one ultrasound image after the pre-processing stage.
8. The method of claim 7, wherein the plurality of de-noising and smoothing filters comprises a Histogram filter, a Wiener filter, a CLAHE filter, a Gaussian filter, and an Anisotropic Diffusion filter.
9. The method of claim 1, wherein the Canny detection method comprises the steps of:
Applying a binary gradient mask to the region of interest of the at least one ultrasound image;
Applying a dilated gradient mask to an image resulting from applying the binary gradient mask;
Applying image closing to an image resulting from applying the dilated gradient mask;
Performing inverse image closing;
Multiplying a resulting image from inverse image closing with an image with applied Anisotropic filter;
Applying thresholding values to the resulting image after multiplication; and
Determining edges of objects in the region of interest in the at least one ultrasound image.
10. The method of claim 9, wherein the Canny detection method has thresholding values ranging from 0 to about 150.
11. A system for detecting varicocele in a patient based on at least one ultrasound image of a patient’s testicle taken in a supine position, the system comprises a pre processing unit having an image conversion unit configured to convert a color space of the at least one ultrasound image into a suitable color space , a first filtering unit configured to apply a first filter to the at least one ultrasound image, and an image segmentation unit configured to extract a region of interest containing veins of the patient’s testicle; a second filtering unit configured to apply a plurality of de-noising and smoothing filters to the region of interest; a detection unit configured to detect edges of objects contained in the region of interest, the detection unit is configured to identify roundness metric, to find Euclidean distance between two points at edges of each object contained in the region of interest, and to determine if the patient has varicocele or not; and a display to display the result.
12. The system of claim 11, wherein the suitable color space comprises the Y (luma) channel of the YCbCr color space.
13. The system of claim 11, wherein the first filter comprises a median filter.
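The claim 12/13 pre-processing (take the Y channel of a YCbCr conversion, then median-filter it) can be sketched as below. The BT.601 luma weights and the 3×3 median window are assumptions; the patent does not specify either.

```python
import numpy as np
from scipy.ndimage import median_filter

def preprocess(rgb):
    """Convert an RGB ultrasound frame to the YCbCr luma (Y) channel
    using ITU-R BT.601 weights, then apply a median filter to suppress
    speckle while preserving edges. Window size is an assumption."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma channel
    return median_filter(y, size=3)
```

A median filter is a natural first filter for ultrasound because speckle is impulsive: the median rejects isolated bright or dark pixels without the edge blurring a linear smoother would cause.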
14. The system of claim 11, wherein the segmentation unit is configured to perform Otsu segmentation using a suitable segmentation parameter.
15. The system of claim 14, wherein the Otsu segmentation has a segmentation parameter of 0.1.
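Otsu's method itself is parameter-free, so how the claimed "segmentation parameter of 0.1" enters is not specified; the sketch below implements plain Otsu thresholding and, as one possible reading, applies the 0.1 value as a multiplicative adjustment of the Otsu level. That interpretation is an assumption, not the patent's disclosure.

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Plain Otsu: pick the grey level maximising the between-class
    variance of the image histogram."""
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    hist = hist.astype(float)
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)                 # class-0 weight up to each bin
    w1 = w0[-1] - w0                     # class-1 weight
    m0 = np.cumsum(hist * centers)
    mu0 = np.where(w0 > 0, m0 / np.maximum(w0, 1e-12), 0)
    mu1 = np.where(w1 > 0, (m0[-1] - m0) / np.maximum(w1, 1e-12), 0)
    var_between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.argmax(var_between)]

def segment(img, param=0.1):
    """Binarize using the Otsu level scaled by the segmentation
    parameter (the scaling interpretation is an assumption)."""
    return img > param * otsu_threshold(img)
```

Lowering the effective threshold below the pure Otsu level would bias the mask toward including faint hypoechoic vein lumens, which is consistent with the goal of not losing small vessels at the segmentation stage.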
16. The system of claim 11, wherein the plurality of de-noising and smoothing filters comprises a histogram equalization filter, a Wiener filter, a CLAHE filter, a Gaussian filter, and an Anisotropic Diffusion filter.
17. The system of claim 11, wherein the detection unit is configured to apply the Canny detection method to the extracted region of interest using a suitable thresholding value.
18. The system of claim 17, wherein the suitable thresholding value ranges from 0 to about 150.
PCT/JO2021/050003 2021-05-23 2021-05-23 A system and method for detecting varicocele using ultrasound images in supine position WO2022249217A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JO2021/050003 WO2022249217A1 (en) 2021-05-23 2021-05-23 A system and method for detecting varicocele using ultrasound images in supine position

Publications (1)

Publication Number Publication Date
WO2022249217A1 (en) 2022-12-01

Family

ID=84229664

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JO2021/050003 WO2022249217A1 (en) 2021-05-23 2021-05-23 A system and method for detecting varicocele using ultrasound images in supine position

Country Status (1)

Country Link
WO (1) WO2022249217A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090080738A1 (en) * 2007-05-01 2009-03-26 Dror Zur Edge detection in ultrasound images
US20180168728A1 (en) * 2003-09-30 2018-06-21 Biolitec Unternehmensbeteiligungs Ii Ag Method for Treatment of Varicocele
US20190272441A1 (en) * 2014-05-06 2019-09-05 Nant Holdings Ip, Llc Image-based feature detection using edge vectors
US20210155982A1 (en) * 2019-11-21 2021-05-27 10X Genomics, Inc. Pipeline for spatial analysis of analytes

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Asyraf Mohd, Rahim Abd, Othman Nurmiza, Hafizah Wan Mahani, Mahmud Wan: "Preliminary study of image processing techniques for the detection of varicocele based on 2D ultrasound images", Journal of Physics: Conference Series, vol. 1049, 2018, p. 12074, XP093013882 *
Parthasarathy, G.; Ramanathan, L.; Anitha, K.; Justindhas, Y.: "Predicting Source and Age of Brain Tumor Using Canny Edge Detection Algorithm and Threshold Technique", Asian Pacific Journal of Cancer Prevention, vol. 20, no. 5, May 2019, pp. 1409-1414, XP093013873, DOI: 10.31557/APJCP.2019.20.5.1409 *
Pauroso, S.; Di Leo, N.; Fulle, I.; Di Segni, M.; Alessi, S.; Maggini, E.: "Varicocele: Ultrasonographic assessment in daily clinical practice", Journal of Ultrasound, vol. 14, no. 4, December 2011, pp. 199-204, XP093013876, ISSN 1971-3495, DOI: 10.1016/j.jus.2011.08.001 *

Legal Events

121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21942875; Country of ref document: EP; Kind code of ref document: A1)

NENP: Non-entry into the national phase (Ref country code: DE)