CN110974286B - Method and device for detecting breast X-ray image tumor - Google Patents

Method and device for detecting breast X-ray image tumor

Info

Publication number
CN110974286B
CN110974286B (granted; application CN201911268460.1A, also published as CN201911268460A)
Authority
CN
China
Prior art keywords
breast
axial
ray image
tumor
nipple
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911268460.1A
Other languages
Chinese (zh)
Other versions
CN110974286A (en)
Inventor
俞旸
陈琦程
张超仁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Huajian Blue Ocean Medical Technology Co ltd
Original Assignee
Beijing Huajian Blue Ocean Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Huajian Blue Ocean Medical Technology Co ltd filed Critical Beijing Huajian Blue Ocean Medical Technology Co ltd
Priority to CN201911268460.1A
Publication of CN110974286A
Application granted
Publication of CN110974286B
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 — Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/12 — Arrangements for detecting or locating foreign bodies
    • A61B 6/50 — Apparatus or devices for radiation diagnosis specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B 6/502 — specially adapted for diagnosis of breast, i.e. mammography
    • A61B 6/52 — Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/5205 — involving processing of raw data to produce diagnostic data

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Veterinary Medicine (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The embodiments of the present application provide a method, an apparatus, an electronic device, and a computer-readable storage medium for detecting tumors in breast X-ray images, addressing the low accuracy of existing breast X-ray tumor detection approaches. The method for detecting a breast X-ray image tumor comprises the following steps: inputting the axial breast X-ray image and the lateral oblique breast X-ray image into a tumor detection model; determining the tumor contour coordinate data of the axial breast X-ray image and of the lateral oblique breast X-ray image; determining the center point coordinates of the axial breast mass and of the lateral oblique breast mass; acquiring the axial nipple coordinate and the lateral oblique nipple coordinate; calculating the axial nipple-to-mass distance from the axial breast mass center point coordinate and the axial nipple coordinate, and the lateral oblique nipple-to-mass distance from the lateral oblique breast mass center point coordinate and the lateral oblique nipple coordinate; and, when the two distances differ by less than a preset value, confirming that the axial breast X-ray image tumor and the lateral oblique breast X-ray image tumor are the same breast mass.

Description

Method and device for detecting breast X-ray image tumor
Technical Field
The application relates to the technical field of artificial intelligence, in particular to a method and a device for detecting breast X-ray image tumor, electronic equipment and a computer readable storage medium.
Background
Breast cancer is the most common malignant tumor in women, and the number of patients has continued to rise in recent years, seriously threatening women's health and quality of life. Early detection and early diagnosis of breast cancer are key to improving treatment outcomes. Breast X-ray (mammography) examination is regarded as an effective early screening method for breast cancer and can effectively reduce breast cancer mortality. With the rapid development of artificial intelligence, computer-aided techniques have gradually been introduced into medical imaging to assist manual film reading and improve the accuracy of detection and diagnosis. One of the main problems faced by current breast X-ray detection methods is how to achieve high lesion detection sensitivity while producing few false-positive results.
Disclosure of Invention
In view of the above, the embodiments of the present application provide a method, an apparatus, an electronic device, and a computer-readable storage medium for detecting tumors in breast X-ray images, which address the low accuracy of existing breast X-ray image detection methods.
According to an aspect of the present application, a method for detecting a breast X-ray image tumor according to an embodiment of the present application includes: inputting an axial breast X-ray image and a lateral oblique breast X-ray image into a tumor detection model, respectively, to obtain an axial breast X-ray image tumor and a lateral oblique breast X-ray image tumor marked by the tumor detection model in the two images, wherein the tumor detection model is a deep neural network model established based on a deep learning algorithm; determining the tumor contour coordinate data of the axial breast X-ray image and of the lateral oblique breast X-ray image, respectively; determining the axial breast mass center point coordinate and the lateral oblique breast mass center point coordinate based on the respective tumor contour coordinate data; acquiring the axial nipple coordinate and the lateral oblique nipple coordinate based on the axial and lateral oblique breast X-ray images; calculating the axial nipple-to-mass distance from the axial breast mass center point coordinate and the axial nipple coordinate, and the lateral oblique nipple-to-mass distance from the lateral oblique breast mass center point coordinate and the lateral oblique nipple coordinate; and, when the difference between the axial nipple-to-mass distance and the lateral oblique nipple-to-mass distance is smaller than a preset value, confirming that the axial breast X-ray image tumor and the lateral oblique breast X-ray image tumor are the same breast mass.
In one embodiment of the present application, acquiring the axial nipple coordinate based on the axial mammography image includes: acquiring an axial breast boundary image based on the axial breast X-ray image; acquiring an axial maximum mask image based on the axial breast X-ray image; subtracting the axial maximum mask image from the axial breast boundary image to obtain a nipple region including the outer-nipple boundary line; and obtaining the axial nipple coordinate based on the outer-nipple boundary line.
In an embodiment of the present application, acquiring the axial breast boundary image based on the axial breast X-ray image includes: acquiring a boundary image without morphological operations and a boundary image with morphological operations based on the axial breast X-ray image; acquiring a binary threshold image based on the axial breast X-ray image; and subtracting both boundary images from the binary threshold image to obtain the axial breast boundary image.
In an embodiment of the present application, before acquiring the axial nipple coordinate based on the outer-nipple boundary line, acquiring the axial nipple coordinate based on the axial mammography image further includes: filtering out object regions located at the top of the axial breast X-ray image from the nipple region.
In an embodiment of the present application, before acquiring the axial breast boundary image based on the axial breast X-ray image, acquiring the axial nipple coordinate based on the axial breast X-ray image further includes: converting the axial breast X-ray image into a uniform target format through preprocessing.
In one embodiment of the present application, determining the axial breast mass center point coordinate based on the axial breast X-ray image tumor comprises: acquiring an axial minimum circumscribed rectangle surrounding the axial breast X-ray image tumor; and calculating the center point coordinate of the axial minimum circumscribed rectangle as the axial breast mass center point coordinate.
In an embodiment of the present application, the axial mammography image and the lateral-oblique mammography image are any two of the following mammography image types respectively: an axial image, a lateral image, and a diagonal lateral image.
In one embodiment of the present application, the tumor detection model is an instance segmentation model.
In an embodiment of the present application, the preset value is 3 mm.
According to another aspect of the present application, there is provided a breast X-ray image tumor detection apparatus including: a first acquisition module configured to input an axial breast X-ray image and a lateral oblique breast X-ray image into a tumor detection model, respectively, to obtain an axial breast X-ray image tumor and a lateral oblique breast X-ray image tumor marked by the tumor detection model in the two images, wherein the tumor detection model is a deep neural network model established based on a deep learning algorithm; a first determining module configured to determine the tumor contour coordinate data of the axial breast X-ray image and of the lateral oblique breast X-ray image, respectively; a second determining module configured to determine the axial breast mass center point coordinate and the lateral oblique breast mass center point coordinate based on the respective tumor contour coordinate data; a second acquisition module configured to acquire the axial nipple coordinate and the lateral oblique nipple coordinate based on the axial and lateral oblique breast X-ray images; a first calculating module configured to calculate the axial nipple-to-mass distance from the axial breast mass center point coordinate and the axial nipple coordinate, and the lateral oblique nipple-to-mass distance from the lateral oblique breast mass center point coordinate and the lateral oblique nipple coordinate; and a first judging module configured to confirm that the axial breast X-ray image tumor and the lateral oblique breast X-ray image tumor are the same breast mass when the difference between the axial nipple-to-mass distance and the lateral oblique nipple-to-mass distance is smaller than a preset value.
In an embodiment of the present application, the second acquisition module includes: a first acquisition unit configured to acquire an axial breast boundary image based on the axial breast X-ray image; a second acquisition unit configured to acquire an axial maximum mask image based on the axial mammogram; a third acquisition unit configured to subtract the axial maximum mask image from the axial breast boundary image to obtain a nipple region including the outer-nipple boundary line; and a fourth acquisition unit configured to obtain the axial nipple coordinate based on the outer-nipple boundary line.
In an embodiment of the present application, the first acquisition unit is further configured to: acquire a boundary image without morphological operations and a boundary image with morphological operations based on the axial breast X-ray image; acquire a binary threshold image based on the axial breast X-ray image; and subtract both boundary images from the binary threshold image to obtain the axial breast boundary image.
In an embodiment of the present application, the second acquisition module further includes: a filtering unit configured to filter out object regions located at the top of the axial mammography image from the nipple region before the axial nipple coordinate is acquired based on the outer-nipple boundary line.
In an embodiment of the present application, the second acquisition module further includes: a preprocessing unit configured to convert the axial breast X-ray image into a uniform target format through preprocessing before the axial breast boundary image is acquired based on the axial breast X-ray image.
In an embodiment of the application, the second determining module is further configured to: acquire an axial minimum circumscribed rectangle surrounding the axial breast X-ray image tumor; and calculate the center point coordinate of the axial minimum circumscribed rectangle as the axial breast mass center point coordinate.
In an embodiment of the present application, the axial mammography image and the lateral-oblique mammography image are any two of the following mammography image types respectively: an axial image, a lateral image, and a diagonal lateral image.
In one embodiment of the present application, the tumor detection model is an instance segmentation model.
In an embodiment of the present application, the preset value is 3 mm.
According to another aspect of the present application, an embodiment of the present application provides an electronic device, including: a processor; a memory; and computer program instructions stored in the memory, which when executed by the processor, cause the processor to perform the mammography image tumor detection method of any one of the above.
According to another aspect of the present application, an embodiment of the present application provides a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform a mammography image tumor detection method as described in any one of the preceding claims.
According to another aspect of the present application, an embodiment of the present application provides a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform a mammography image tumor detection method as described in any one of the above.
The embodiment of the application provides a method, a device, electronic equipment and a computer readable storage medium for detecting breast X-ray image tumor, which are used for determining the breast X-ray image tumor by utilizing a deep neural network model established based on a deep learning algorithm and acquiring the center point coordinate and nipple coordinate of the breast tumor; and judging whether the breast tumors detected in the different breast X-ray images are the same tumor or not by comparing the distances between the center point coordinates and the nipple coordinates of the breast tumors in the different breast X-ray images. Therefore, the breast X-ray image tumor detection mode combines a deep learning algorithm and a common computer-aided judgment mode, and finally comprehensively judges breast X-ray image tumors by comparing different breast X-ray images, so that a breast tumor detection mechanism with high sensitivity and low false positive rate is constructed.
Drawings
Fig. 1 is a flowchart illustrating a method for detecting a breast X-ray image tumor according to an embodiment of the application.
Fig. 2 is a flowchart illustrating a method for determining coordinates of a center point of an axial breast tumor based on the axial breast X-ray image tumor according to an embodiment of the present application.
Fig. 3a and 3b are schematic diagrams illustrating a method for detecting a breast X-ray image tumor according to an embodiment of the present application, wherein the method uses a minimum circumscribed rectangle to determine coordinates of a center point of an axial breast tumor.
Fig. 4 is a flow chart of acquiring axial nipple coordinates based on axial mammograms in a breast X-ray image tumor detection method according to an embodiment of the present application.
Fig. 5a to 5f are schematic diagrams illustrating the principle of acquiring axial nipple coordinates in a breast X-ray image tumor detection method according to an embodiment of the present application.
Fig. 6 is a flowchart illustrating a preprocessing procedure for axial or lateral mammography X-ray image in a mammography X-ray image tumor detection method according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of a breast X-ray image tumor detecting apparatus according to an embodiment of the application.
Fig. 8 is a schematic structural diagram of a breast X-ray image tumor detecting apparatus according to another embodiment of the present application.
Fig. 9 is a schematic structural diagram of a breast X-ray image tumor detecting apparatus according to another embodiment of the present application.
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Fig. 1 is a flowchart illustrating a method for detecting a breast X-ray image tumor according to an embodiment of the application. As shown in fig. 1, the method for detecting breast X-ray image tumor comprises the following steps:
Step 101: inputting the axial breast X-ray image and the lateral oblique breast X-ray image into a tumor detection model, respectively, to obtain the axial breast X-ray image tumor and the lateral oblique breast X-ray image tumor marked by the tumor detection model in the two images, wherein the tumor detection model is a deep neural network model established based on a deep learning algorithm.
The axial (craniocaudal, CC) and lateral oblique (mediolateral oblique, MLO) mammograms may be transmission images formed by projecting an X-ray beam through the breast, such as molybdenum-target breast radiographs, also known as molybdenum-target images. It should be appreciated that the specific presentation of the axial and lateral oblique mammography images may vary with the imaging modality. The axial image is formed by projecting the beam from the head toward the feet of the patient; the lateral oblique image is formed by projecting the beam at 45 degrees from the upper inner side of the breast toward the lower outer side, or from the lower outer side toward the upper inner side. In one embodiment of the present application, the tumor detection model may be an instance segmentation model, such as a Mask R-CNN model. Mask R-CNN is a two-stage framework: the first stage scans the axial or lateral oblique mammogram and generates proposals (regions that may contain a target), and the second stage classifies the proposals and generates bounding boxes and masks, where the region delimited by a bounding box is an axial or lateral oblique breast X-ray image tumor. However, it should be understood that other deep neural network architectures may be used for the tumor detection model, and the specific type of tumor detection model is not strictly limited by the present application.
Step 102: determining the tumor contour coordinate data of the axial breast X-ray image and the tumor contour coordinate data of the lateral oblique breast X-ray image based on the respective images.
Step 103: determining the axial breast mass center point coordinate and the lateral oblique breast mass center point coordinate based on the tumor contour coordinate data of the axial breast X-ray image and of the lateral oblique breast X-ray image, respectively.
In one embodiment of the present application, the minimum bounding rectangle may be used to calculate the center point coordinates of the axial breast tumor and the center point coordinates of the lateral breast tumor. Taking the example of determining the center point coordinates of the axial breast tumor based on the axial breast X-ray image tumor, as shown in fig. 2, the method may specifically include the following steps:
Step 1031: acquiring an axial minimum circumscribed rectangle surrounding the tumor based on the axial breast X-ray image tumor.
As shown in fig. 3a, the minimum bounding rectangle (Minimum Bounding Rectangle, MBR) of a two-dimensional shape (e.g. a set of points, lines, or polygons) is the smallest rectangle that encloses it, determined from the maximum abscissa, minimum abscissa, maximum ordinate, and minimum ordinate among the vertices of the given shape. In Python, OpenCV's minAreaRect can be used to generate the minimum (rotated) bounding rectangle: the call is cv2.minAreaRect(cnt), where cnt is an array or vector of points (the coordinates of an arbitrary number of points). The function returns a Box2D structure: (center (x, y), (width, height), rotation angle of the minimum bounding rectangle). To draw this rectangle, its 4 vertex coordinates are needed, which can be obtained with the function cv2.boxPoints(), returned in the form [[x0, y0], [x1, y1], [x2, y2], [x3, y3]].
Step 1032: calculating the center point coordinate of the axial minimum circumscribed rectangle as the axial breast mass center point coordinate.
As shown in fig. 3b, the center point coordinates can be calculated from the four vertices [x0, y0], [x1, y1], [x2, y2], [x3, y3] of the minimum bounding rectangle obtained in the previous step. Specifically, the center point is the average of two diagonally opposite vertices. The x-axis coordinate of the center point is x = (x0 + x2)/2 or (x1 + x3)/2, and the y-axis coordinate is y = (y0 + y2)/2 or (y1 + y3)/2.
It should be appreciated that the specific procedure for determining the coordinates of the center point of the lateral-oblique breast tumor based on the lateral-oblique breast X-ray image tumor may be similar to that given in the above embodiments, and will not be described in detail herein.
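The center-point computation described above can be sketched as follows. This is a pure-Python stand-in for the OpenCV calls (the vertex list mimics what cv2.boxPoints would return); the example coordinates are illustrative only.

```python
# Sketch: center of a minimum bounding rectangle from its four vertices,
# in the [[x0, y0], ..., [x3, y3]] order produced by cv2.boxPoints().
# Vertices 0 and 2 are diagonally opposite, as are 1 and 3.

def rect_center(box):
    """Average two diagonally opposite vertices to get the rectangle center."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = box
    return ((x0 + x2) / 2.0, (y0 + y2) / 2.0)

box = [(10.0, 20.0), (50.0, 20.0), (50.0, 80.0), (10.0, 80.0)]
print(rect_center(box))  # (30.0, 50.0)
```

Averaging the other diagonal, (x1 + x3)/2 and (y1 + y3)/2, gives the same point, as noted in the text.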
Step 104: acquiring the axial nipple coordinate and the lateral oblique nipple coordinate based on the axial breast X-ray image and the lateral oblique breast X-ray image.
In an embodiment of the present application, taking the axial nipple coordinate obtained based on the axial mammography image as an example, as shown in fig. 4, the method may specifically include the following steps:
step 1041: an axial breast boundary image is acquired based on the axial breast X-ray image.
Specifically, a boundary image without morphological operations (as shown in fig. 5a) and a boundary image with morphological operations (as shown in fig. 5b) may first be acquired from the axial mammography image. In one embodiment of the application, a Sauvola thresholding algorithm may be used together with morphological closing and Laplacian filtering to generate binary masks, and the largest mask usable for extracting contour lines is selected to obtain the morphologically operated boundary image. A binary threshold image is then acquired from the axial mammogram (as shown in fig. 5c). Finally, both boundary images are subtracted from the binary threshold image to obtain the axial breast boundary image (as shown in fig. 5d).
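The subtraction idea behind boundary extraction can be illustrated on a toy binary mask: a boundary image is what remains when an eroded copy of a mask is subtracted from it. This is only a minimal pure-Python sketch; a real pipeline would use OpenCV or scikit-image for Sauvola thresholding and morphology.

```python
# Toy sketch: boundary = binary mask minus its morphological erosion.

def erode(mask):
    """3x3 cross erosion: a pixel stays 1 only if it and its 4-neighbours are 1."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if mask[y][x] and all(
                0 <= y + dy < h and 0 <= x + dx < w and mask[y + dy][x + dx]
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
            ):
                out[y][x] = 1
    return out

def boundary(mask):
    """Pixels in the mask but not in its erosion form the boundary."""
    er = erode(mask)
    return [[m - e for m, e in zip(mrow, erow)] for mrow, erow in zip(mask, er)]

mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
b = boundary(mask)
print(sum(sum(row) for row in b))  # 8 boundary pixels (the 3x3 block minus its center)
```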
Step 1042: an axial maximum mask image is acquired based on the axial mammogram.
Specifically, the largest mask is found in the axial mammogram, and the pixels appearing in that mask form the axial maximum mask image, as shown in fig. 5e. This step is needed because some pixels remain unchanged when the threshold and boundary images are subtracted.
Step 1043: the on-axis maximum mask image is subtracted from the on-axis breast boundary image to obtain a nipple area including an off-nipple boundary line.
The boundary images obtained by boundary detection may pick up even the faintest pixels invisible to the eye, while the threshold image removes the pixels of minimum intensity. The nipple region including the outer-nipple boundary line is therefore obtained by subtracting the axial maximum mask image from the axial breast boundary image, as shown in fig. 5f. The nipple region does not necessarily contain only the outer-nipple boundary line; it may also contain other areas, such as the white area shown in the upper right-hand corner of fig. 5f.
Step 1044: acquiring the axial nipple coordinates based on the outer-nipple boundary line.
Specifically, the Hu moments (image moments) of the outer-nipple boundary line may be calculated to obtain the axial nipple coordinates x0 and y0.
In an embodiment of the present application, considering that the nipple region may include areas other than the outer-nipple boundary line, logical post-processing rules may be set to filter them out, for example: the outer-nipple boundary line cannot be located at the top of the image. In this way, object regions at the top of the axial mammogram can be filtered out of the nipple region to finally determine the outer-nipple boundary line.
It should be appreciated that the specific procedure for acquiring lateral-oblique nipple coordinates based on lateral-oblique mammography images may be similar to that given in the above embodiments, and will not be described in detail herein.
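The coordinate extraction in step 1044 amounts to a centroid computed from raw image moments (cx = M10/M00, cy = M01/M00), which is what cv2.moments() would yield for the outer-nipple boundary contour. A minimal pure-Python sketch, with made-up boundary pixels for illustration:

```python
# Sketch: centroid of a set of boundary pixels via raw image moments.

def centroid(points):
    """points: iterable of (x, y) boundary pixels; returns (cx, cy)."""
    m00 = m10 = m01 = 0.0
    for x, y in points:
        m00 += 1   # zeroth moment: pixel count
        m10 += x   # first moment in x
        m01 += y   # first moment in y
    return m10 / m00, m01 / m00

pts = [(4, 10), (6, 10), (5, 9), (5, 11)]
print(centroid(pts))  # (5.0, 10.0)
```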
Step 105: calculating the axial nipple-to-mass distance from the axial breast mass center point coordinate and the axial nipple coordinate, and calculating the lateral oblique nipple-to-mass distance from the lateral oblique breast mass center point coordinate and the lateral oblique nipple coordinate.
Specifically, taking the calculation of the axial nipple-to-mass distance from the axial breast mass center point coordinate and the axial nipple coordinate as an example, the distance from the nipple to the mass center point can be calculated with the Euclidean distance formula:

|AB| = sqrt((x1 − x0)² + (y1 − y0)²)

where |AB| is the axial nipple-to-mass distance, (x1, y1) is the axial breast mass center point coordinate, and (x0, y0) is the axial nipple coordinate.
It should be appreciated that the specific procedure for calculating the lateral oblique nipple-to-mass distance from the lateral oblique breast mass center point coordinate and the lateral oblique nipple coordinate may be similar to that given in the above embodiments, and will not be described in detail herein.
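The distance formula above is a direct one-liner in Python; the coordinates in the usage line are illustrative only:

```python
# Sketch: nipple-to-mass distance as the Euclidean distance between the
# nipple coordinate (x0, y0) and the mass center point (x1, y1).
import math

def nipple_mass_distance(nipple, center):
    (x0, y0), (x1, y1) = nipple, center
    return math.hypot(x1 - x0, y1 - y0)

print(nipple_mass_distance((0.0, 0.0), (3.0, 4.0)))  # 5.0
```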
Step 106: when the difference between the axial nipple-to-mass distance and the lateral oblique nipple-to-mass distance is smaller than a preset value, the axial breast X-ray image tumor and the lateral oblique breast X-ray image tumor are confirmed to be the same breast mass.
In an embodiment of the present application, the preset value may be 3 mm. That is, when the difference between the axial and lateral oblique nipple-to-mass distances is less than 3 mm, the axial breast X-ray image tumor and the lateral oblique breast X-ray image tumor are confirmed to be the same breast mass and are marked, for example, in yellow. In one embodiment of the present application, when the difference between the two distances is greater than the preset value, both tumors are marked as suspicious, for example, in blue.
It should be understood that the magnitude of the preset value can be adjusted according to the actual application scene requirement, and the specific magnitude of the preset value is not strictly limited in the application.
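The matching rule of step 106 can be sketched in a few lines. Note the tolerance default of 3 mm comes from the embodiment above; expressing the two distances in millimetres (rather than pixels) is an assumption here, since a real system would convert using the pixel-spacing metadata of the image.

```python
# Sketch of the view-matching rule: two views show the same mass when their
# nipple-to-mass distances differ by less than a preset tolerance.

def same_mass(dist_cc_mm, dist_mlo_mm, tol_mm=3.0):
    """True if the CC-view and MLO-view nipple-to-mass distances agree."""
    return abs(dist_cc_mm - dist_mlo_mm) < tol_mm

assert same_mass(41.2, 43.0)       # 1.8 mm apart -> same mass
assert not same_mass(41.2, 47.0)   # 5.8 mm apart -> flagged as suspicious
```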
In an embodiment of the present application, before the axial breast boundary image or the lateral oblique breast boundary image is acquired based on the axial breast X-ray image or the lateral oblique breast X-ray image, the axial breast X-ray image or the lateral oblique breast X-ray image further needs to be converted into a uniform target format through preprocessing. As shown in fig. 6, taking the case where the axial breast X-ray image or the lateral oblique breast X-ray image is a DCM (Digital Imaging and Communications in Medicine, DICOM) format image as an example, the parameters of the DCM file are first read to obtain its transfer syntax, the DCM file is decoded according to the transfer syntax, and the dataset data elements are obtained after decoding succeeds. An object pointer of the DICOM data structure is then defined, and the transformation mode of the DCM file is parsed to obtain the number of dataset windows, the region-of-interest lookup table function, and the number of lookup tables. Different processing is then applied according to the dataset window type. When the dataset window type is type 1 (corresponding to a linear function), the DCM file is parsed according to the window type and the window parameters, and a PNG (Portable Network Graphics) file with a default bit depth of 16 bits is saved; this is repeated until all windows of interest have been processed, and the flow ends. When the dataset window type is type 0 (corresponding to types other than the linear function), the DCM file is likewise parsed according to the window type and the window parameters, and the flow ends after the default 16-bit PNG file is saved.
When the dataset window type is type 2 (corresponding to a nonlinear function), the DCM file is parsed using its n-th (n being a natural number) lookup table, and a default 16-bit PNG file is saved for each table, looping until the VOI LUT (region-of-interest conversion) judgment indicates that the loop can end.
Therefore, according to the breast X-ray image tumor detection method provided by the embodiment of the application, the breast X-ray image tumor is determined by using the deep neural network model established based on the deep learning algorithm, and the center point coordinate and the nipple coordinate of the breast tumor are obtained; and judging whether the breast tumors detected in the different breast X-ray images are the same tumor or not by comparing the distances between the center point coordinates and the nipple coordinates of the breast tumors in the different breast X-ray images. Therefore, the breast X-ray image tumor detection mode combines a deep learning algorithm and a common computer-aided judgment mode, and finally comprehensively judges breast X-ray image tumors by comparing different breast X-ray images, so that a breast tumor detection mechanism with high sensitivity and low false positive rate is constructed.
Fig. 7 is a schematic structural diagram of a breast X-ray image tumor detecting apparatus according to an embodiment of the application. As shown in fig. 7, the mammography image tumor detection apparatus 70 includes:
the first obtaining module 701 is configured to input the axial breast X-ray image and the lateral oblique breast X-ray image into a tumor detection model respectively to obtain an axial breast X-ray image tumor and a lateral oblique breast X-ray image tumor marked by the tumor detection model in the axial breast X-ray image and the lateral oblique breast X-ray image, wherein the tumor detection model is a deep neural network model established based on a deep learning algorithm;
a first determining module 702 configured to determine mass contour coordinate data of the axial mammogram and mass contour coordinate data of the lateral oblique mammogram based on the axial mammogram and lateral oblique mammogram, respectively;
a second determining module 703 configured to determine an axial breast mass center point coordinate and a lateral oblique breast mass center point coordinate, respectively, based on the mass contour coordinate data of the axial breast X-ray image and the mass contour coordinate data of the lateral oblique breast X-ray image;
a second acquisition module 704 configured to acquire an axial nipple coordinate and a lateral nipple coordinate based on the axial breast X-ray image and the lateral breast X-ray image;
a first calculation module 705 configured to calculate an axial nipple distance from the axial breast mass center point coordinates and the axial nipple coordinates, and to calculate a lateral oblique nipple distance from the lateral oblique breast mass center point coordinates and the lateral oblique nipple coordinates; and
a first judging module 706 configured to confirm that the axial breast X-ray image tumor and the lateral oblique breast X-ray image tumor are the same breast tumor when the difference between the axial nipple distance and the lateral oblique nipple distance is less than a preset value.
Therefore, the breast X-ray image tumor detection device provided by the embodiment of the application determines the breast X-ray image tumor by using the deep neural network model established based on the deep learning algorithm, and acquires the center point coordinate and the nipple coordinate of the breast tumor; and judging whether the breast tumors detected in the different breast X-ray images are the same tumor or not by comparing the distances between the center point coordinates and the nipple coordinates of the breast tumors in the different breast X-ray images. Therefore, the breast X-ray image tumor detection mode combines a deep learning algorithm and a common computer-aided judgment mode, and finally comprehensively judges breast X-ray image tumors by comparing different breast X-ray images, so that a breast tumor detection mechanism with high sensitivity and low false positive rate is constructed.
In one embodiment of the present application, as shown in fig. 8, the second obtaining module 704 includes:
a first acquisition unit 7041 configured to acquire an axial breast boundary image based on the axial breast X-ray image;
a second acquisition unit 7042 configured to acquire an axial maximum mask image based on the axial breast X-ray image;
a third acquisition unit 7043 configured to subtract the axial maximum mask image from the axial breast boundary image to acquire a nipple area including a nipple outer boundary line; and
a fourth acquisition unit 7044 configured to acquire the axial nipple coordinates based on the nipple outer boundary line.
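The mask-subtraction step performed by the third acquisition unit can be sketched on toy binary masks (nested lists stand in for images here; a real implementation would typically use NumPy arrays or OpenCV, and the function name is ours):

```python
def subtract_mask(minuend, subtrahend):
    # Pixel-wise subtraction of two binary masks: keep the pixels set in
    # the first mask but not in the second, clamped at zero.
    return [
        [1 if a == 1 and b == 0 else 0 for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(minuend, subtrahend)
    ]

# Toy 4x4 example: the breast boundary image minus the maximum mask image
# leaves only the pixels protruding beyond the mask (the nipple area).
boundary = [
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 1, 1, 0],
]
max_mask = [
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(subtract_mask(boundary, max_mask))  # only the protruding rows remain
```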
In an embodiment of the present application, the first obtaining unit 7041 is further configured to:
acquiring an axial boundary image without morphological operation and a lateral oblique boundary image with morphological operation based on the axial breast X-ray image;
acquiring a binary threshold image based on the axial breast X-ray image; and
the axial boundary image and the lateral oblique boundary image are subtracted from the binary threshold image to obtain an axial breast boundary image.
In one embodiment of the present application, as shown in fig. 9, the second obtaining module 704 further includes:
the filtering unit 7045 is configured to filter out the object area located at the top of the axial breast X-ray image from the nipple area before the axial nipple coordinates are acquired based on the nipple outer boundary line.
In one embodiment of the present application, as shown in fig. 9, the second obtaining module 704 further includes:
the preprocessing unit 7046 is configured to convert the axial mammography image into a unified target format by preprocessing before acquiring the axial breast boundary image based on the axial mammography image.
In an embodiment of the present application, the first determining module 702 is further configured to:
acquiring an axial minimum circumscribed rectangle surrounding the axial breast X-ray image tumor based on the axial breast X-ray image tumor; and
calculating the center point coordinates of the axial minimum circumscribed rectangle as the axial breast mass center point coordinates.
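The minimum-circumscribed-rectangle step above can be sketched as follows, assuming an axis-aligned bounding rectangle of the mass contour (the patent does not specify whether the rectangle may be rotated; the function name is ours):

```python
def mass_center_from_contour(contour):
    # Center of the axis-aligned minimum circumscribed rectangle of a mass
    # contour, used as the mass center point coordinate. `contour` is a
    # sequence of (x, y) pixel coordinates along the mass outline.
    xs = [p[0] for p in contour]
    ys = [p[1] for p in contour]
    return ((min(xs) + max(xs)) / 2.0, (min(ys) + max(ys)) / 2.0)

# Illustrative contour: rectangle spans x in [12, 40], y in [28, 60]
contour = [(12, 30), (40, 28), (35, 60), (15, 55)]
print(mass_center_from_contour(contour))  # (26.0, 44.0)
```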
In one embodiment of the present application, the axial breast X-ray image and the lateral oblique breast X-ray image are respectively any two of the following breast X-ray image categories: an axial image, a lateral image, and a lateral oblique image.
In one embodiment of the present application, the tumor detection model is an instance segmentation model.
In one embodiment of the application, the preset value is 3mm.
The specific functions and operations of the various modules in the mammography image tumor detection apparatus 70 described above have been described in detail above with reference to the mammography image tumor detection method described in fig. 1-6. Therefore, a repetitive description thereof will be omitted herein.
It should be noted that the mammography image tumor detection apparatus 70 according to the embodiment of the present application may be integrated into the electronic device 90 as a software module and/or a hardware module, in other words, the electronic device 90 may include the mammography image tumor detection apparatus 70. For example, the mammography image tumor detection apparatus 70 may be a software module in the operating system of the electronic device 90, or may be an application developed for it; of course, the mammography image tumor detection apparatus 70 can also be one of a plurality of hardware modules of the electronic device 90.
In another embodiment of the present application, the mammography image tumor detection apparatus 70 and the electronic device 90 may also be separate devices (e.g., servers), and the mammography image tumor detection apparatus 70 may be connected to the electronic device 90 via a wired and/or wireless network and transmit interactive information according to a prescribed data format.
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the application. As shown in fig. 10, the electronic device 90 includes: one or more processors 901 and memory 902; and computer program instructions stored in the memory 902, which when executed by the processor 901, cause the processor 901 to perform the mammography image tumor detection method of any one of the embodiments described above.
The processor 901 may be a Central Processing Unit (CPU) or other form of processing unit having data processing and/or instruction execution capabilities and may control other components in the electronic device to perform desired functions.
Memory 902 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. Volatile memory may include, for example, random access memory (RAM) and/or cache memory. Non-volatile memory may include, for example, read-only memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 901 may execute the program instructions to implement the steps in the mammography image tumor detection method of the various embodiments of the present application described above and/or other desired functions. Information such as light intensity, compensation light intensity, and the position of the filter may also be stored in the computer-readable storage medium.
In one example, the electronic device 90 may further include: an input device 903 and an output device 904, which are interconnected by a bus system and/or other form of connection mechanism (not shown in fig. 10).
For example, where the electronic device is a robot, such as on an industrial line, the input device 903 may be a camera for capturing the position of the part to be processed. When the electronic device is a stand-alone device, the input means 903 may be a communication network connector for receiving the acquired input signal from an external, removable device. In addition, the input device 903 may also include, for example, a keyboard, mouse, microphone, and the like.
The output device 904 may output various information to the outside, and may include, for example, a display, a speaker, a printer, and a communication network and a remote output apparatus connected thereto, and the like.
Of course, only some of the components of the electronic device 90 that are relevant to the present application are shown in fig. 10 for simplicity, components such as buses, input/output interfaces, etc. are omitted. In addition, the electronic device 90 may include any other suitable components depending on the particular application.
In addition to the methods and apparatus described above, embodiments of the present application may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps of the mammography image tumor detection method of any one of the embodiments described above.
The computer program product may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ and conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's computing device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium, having stored thereon computer program instructions, which when executed by a processor, cause the processor to perform the steps in the mammography image tumor detection method according to the various embodiments of the present application described in the "exemplary mammography image tumor detection method" section described above.
A computer-readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present application have been described above in connection with specific embodiments, however, it should be noted that the advantages, benefits, effects, etc. mentioned in the present application are merely examples and not intended to be limiting, and these advantages, benefits, effects, etc. are not to be considered as essential to the various embodiments of the present application. Furthermore, the specific details disclosed herein are for purposes of illustration and understanding only, and are not intended to be limiting, as the application is not necessarily limited to practice with the above described specific details.
The block diagrams of the devices, apparatuses, and systems referred to in the present application are only illustrative examples and are not intended to require or imply that connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses, and systems may be connected, arranged, and configured in any manner. Words such as "including," "comprising," and "having" are open-ended words meaning "including but not limited to," and are used interchangeably therewith. The term "or" as used herein refers to, and is used interchangeably with, the term "and/or," unless the context clearly indicates otherwise. The term "such as" as used herein refers to, and is used interchangeably with, the phrase "such as, but not limited to."
It is also noted that in the apparatus, devices and methods of the present application, the components or steps may be disassembled and/or assembled. Such decomposition and/or recombination should be considered as equivalent aspects of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit embodiments of the application to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, a person of ordinary skill in the art will recognize certain variations, modifications, alterations, additions, and subcombinations thereof.
The foregoing description of the preferred embodiments of the application is not intended to be limiting, but rather is to be construed as including any modifications, equivalents, and alternatives falling within the spirit and principles of the application.

Claims (9)

1. A method for detecting breast X-ray image tumor, comprising:
respectively inputting an axial position breast X-ray image and a lateral oblique breast X-ray image into a tumor detection model to obtain an axial position breast X-ray image tumor and a lateral oblique breast X-ray image tumor marked by the tumor detection model in the axial position breast X-ray image and the lateral oblique breast X-ray image, wherein the tumor detection model is a deep neural network model established based on a deep learning algorithm;
determining mass contour coordinate data of the axial position breast X-ray image and mass contour coordinate data of the lateral oblique breast X-ray image based on the axial position breast X-ray image and the lateral oblique breast X-ray image respectively;
determining an axial breast mass center point coordinate and a lateral oblique breast mass center point coordinate based on the tumor contour coordinate data of the axial breast X-ray image and the tumor contour coordinate data of the lateral oblique breast X-ray image respectively;
acquiring an axial nipple coordinate and a lateral oblique nipple coordinate based on the axial breast X-ray image and the lateral oblique breast X-ray image;
calculating an axial nipple mammary gland distance according to the axial breast tumor central point coordinate and the axial nipple coordinate, and calculating a lateral oblique nipple mammary gland distance according to the lateral oblique breast tumor central point coordinate and the lateral oblique nipple coordinate; and
when the difference between the axial nipple mammary gland distance and the lateral oblique nipple mammary gland distance is smaller than a preset value, confirming that the axial position breast X-ray image tumor and the lateral oblique breast X-ray image tumor are the same breast tumor;
wherein:
the acquiring the axial nipple coordinate based on the axial breast X-ray image comprises:
acquiring an axial breast boundary image based on the axial breast X-ray image;
acquiring an axial maximum mask image based on the axial breast X-ray image;
subtracting the axial maximum mask image from the axial breast boundary image to obtain a nipple area including a nipple outer boundary line; and
acquiring the axial nipple coordinate based on the nipple outer boundary line.
2. The method of claim 1, wherein the acquiring an axial breast boundary image based on the axial breast X-ray image comprises:
acquiring an axial boundary image without morphological operation and a lateral oblique boundary image with morphological operation based on the axial breast X-ray image;
acquiring a binary threshold image based on the axial breast X-ray image; and
subtracting the axial boundary image and the lateral oblique boundary image from the binary threshold image to obtain the axial breast boundary image.
3. The method of claim 1, wherein prior to acquiring the axial nipple coordinates based on the nipple outer boundary line, the acquiring axial nipple coordinates based on the axial mammogram further comprises:
and filtering out the object area which is positioned on the top of the axial mammary X-ray image in the nipple area.
4. The method of claim 1, wherein prior to acquiring an axial breast boundary image based on the axial breast X-ray image, the acquiring axial nipple coordinates based on the axial breast X-ray image further comprises:
and converting the axial breast X-ray image into a uniform target format through preprocessing.
5. The method of claim 1, wherein determining the axial breast mass center point coordinates based on the axial breast X-ray image mass comprises:
acquiring an axial minimum circumscribed rectangle surrounding the breast X-ray image tumor based on the axial breast X-ray image tumor; and
calculating the center point coordinates of the axial minimum circumscribed rectangle as the axial breast mass center point coordinates.
6. The method of claim 1, wherein the axial breast X-ray image and the lateral oblique breast X-ray image are respectively any two of the following breast X-ray image categories: an axial image, a lateral image, and a lateral oblique image.
7. The method of claim 1, wherein the tumor detection model is an instance segmentation model.
8. The method of claim 1, wherein the preset value is 3mm.
9. A mammography image tumor detection apparatus, comprising:
the first acquisition module is configured to input an axial position breast X-ray image and a lateral oblique breast X-ray image into a tumor detection model respectively to acquire an axial position breast X-ray image tumor and a lateral oblique breast X-ray image tumor marked by the tumor detection model in the axial position breast X-ray image and the lateral oblique breast X-ray image, wherein the tumor detection model is a deep neural network model established based on a deep learning algorithm;
a first determination module configured to determine mass contour coordinate data of the axial breast X-ray image and mass contour coordinate data of the lateral breast X-ray image based on the axial breast X-ray image and the lateral breast X-ray image, respectively;
the second determining module is configured to determine an axial breast mass center point coordinate and a lateral oblique breast mass center point coordinate based on the tumor contour coordinate data of the axial breast X-ray image and the tumor contour coordinate data of the lateral oblique breast X-ray image respectively;
a second acquisition module configured to acquire an axial nipple coordinate and a lateral nipple coordinate based on the axial breast X-ray image and the lateral breast X-ray image;
the first calculating module is configured to calculate an axial nipple mammary gland distance according to the axial breast tumor central point coordinate and the axial nipple coordinate, and calculate a lateral oblique nipple mammary gland distance according to the lateral oblique breast tumor central point coordinate and the lateral oblique nipple coordinate; and
the first judging module is configured to confirm that the axial position breast X-ray image tumor and the lateral oblique breast X-ray image tumor are the same breast tumor when the difference between the axial nipple mammary gland distance and the lateral oblique nipple mammary gland distance is smaller than a preset value;
wherein:
the second acquisition module includes:
a first acquisition unit configured to acquire an axial breast boundary image based on the axial breast X-ray image;
a second acquisition unit configured to acquire an axial maximum mask image based on the axial mammogram image;
a third acquisition unit configured to subtract the axial maximum mask image from the axial breast boundary image to acquire a nipple area including an extranipple boundary line; and
and a fourth acquisition unit configured to acquire the axial nipple coordinates based on the extranipple boundary line.
CN201911268460.1A 2019-12-11 2019-12-11 Method and device for detecting breast X-ray image tumor Active CN110974286B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911268460.1A CN110974286B (en) 2019-12-11 2019-12-11 Method and device for detecting breast X-ray image tumor


Publications (2)

Publication Number Publication Date
CN110974286A CN110974286A (en) 2020-04-10
CN110974286B true CN110974286B (en) 2023-11-17

Family

ID=70092487

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911268460.1A Active CN110974286B (en) 2019-12-11 2019-12-11 Method and device for detecting breast X-ray image tumor

Country Status (1)

Country Link
CN (1) CN110974286B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111768878A (en) * 2020-06-30 2020-10-13 杭州依图医疗技术有限公司 Method for visually guiding focus and computer readable storage medium
CN114820590B (en) * 2022-06-06 2023-04-07 北京医准智能科技有限公司 Image processing method, image processing apparatus, electronic device, and medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003009209A1 (en) * 2001-07-18 2003-01-30 Icad, Inc. Computer-aided method and system for detecting spiculated lesions in a mammogram

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7865002B2 (en) * 2006-06-02 2011-01-04 Carolina Imaging Specialists, Inc. Methods and apparatus for computer automated diagnosis of mammogram images
US20170164924A1 (en) * 2015-12-15 2017-06-15 Konica Minolta, Inc. Ultrasound image diagnostic apparatus




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant