CN107292815B - Method and device for processing mammary gland image and mammary gland imaging equipment - Google Patents

Method and device for processing mammary gland image and mammary gland imaging equipment

Info

Publication number
CN107292815B
CN107292815B (application CN201710447718.9A)
Authority
CN
China
Prior art keywords
image
transformation
frequency
gray
breast
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710447718.9A
Other languages
Chinese (zh)
Other versions
CN107292815A (en)
Inventor
赵书睿
周海华
江春花
王汉禹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN201710447718.9A priority Critical patent/CN107292815B/en
Publication of CN107292815A publication Critical patent/CN107292815A/en
Priority to EP17914031.4A priority patent/EP3622476A4/en
Priority to CA3067078A priority patent/CA3067078C/en
Priority to PCT/CN2017/120325 priority patent/WO2018227943A1/en
Priority to CA3168047A priority patent/CA3168047A1/en
Priority to CN201780092082.9A priority patent/CN110832540B/en
Priority to US16/023,340 priority patent/US10949950B2/en
Application granted granted Critical
Publication of CN107292815B publication Critical patent/CN107292815B/en
Priority to US17/201,084 priority patent/US11562469B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/04 Context-preserving transformations, e.g. by using an importance map
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/50 Apparatus or devices for radiation diagnosis specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B6/502 Apparatus or devices for radiation diagnosis specially adapted for diagnosis of breast, i.e. mammography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30068 Mammography; Breast

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Theoretical Computer Science (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the invention disclose a method and a device for processing a breast image, and a breast imaging apparatus. The method comprises the following steps: acquiring an original scan image, and obtaining a low-frequency image and a high-frequency image from the original scan image; determining a gray-scale transformation region of the low-frequency image according to a breast width, a breast compression thickness, and a gray-scale transformation distance determination model; and performing gray-scale transformation on the gray-scale transformation region to obtain a thickness-equalized target low-frequency image, and reconstructing the target low-frequency image and the high-frequency image to generate a target image. With this technical scheme, the gray-scale transformation region requiring thickness equalization in a breast image can be acquired accurately and automatically, so that a breast image with a more uniform gray-scale distribution is obtained and a better thickness equalization effect is achieved.

Description

Method and device for processing mammary gland image and mammary gland imaging equipment
Technical Field
Embodiments of the invention relate to medical image processing technology, and in particular to a method and a device for processing a breast image and a breast imaging apparatus.
Background
In full-field digital mammography, the breast is imaged while being compressed by a compression plate. However, because the compression plate does not press all the way to the edge of the breast, and because of the distribution characteristics of breast tissue, there is a large gray-level difference between the central area and the edge area of the captured breast image; that is, the gray-level distribution of the breast image is not uniform. In this case, if the image is displayed with a fixed contrast, only part of the breast tissue can be observed, which may lead to missed diagnosis.
At present, the problem of uneven gray-level distribution in breast images is generally addressed by performing gray-scale transformation (i.e., thickness equalization) on the breast image, and common methods for determining the gray-scale transformation range include the fixed-distance method and the model method. However, the fixed-distance method ignores individual differences entirely and often fails to achieve an ideal thickness equalization effect, while the model method is constrained by its modeling assumptions, so the thickness equalization effect it achieves is also limited.
Disclosure of Invention
Embodiments of the invention provide a method and a device for processing a breast image, and a breast imaging apparatus, which accurately and automatically acquire the gray-scale transformation region requiring thickness equalization in a breast image, thereby obtaining a breast image with a more uniform gray-scale distribution and achieving a better thickness equalization effect.
In a first aspect, an embodiment of the present invention provides a method for processing a breast image, including:
acquiring an original scan image, and obtaining a low-frequency image and a high-frequency image from the original scan image;
determining a gray-scale transformation region of the low-frequency image according to a breast width, a breast compression thickness, and a gray-scale transformation distance determination model; and
performing gray-scale transformation on the gray-scale transformation region to obtain a thickness-equalized target low-frequency image, and reconstructing the target low-frequency image and the high-frequency image to generate a target image.
In a second aspect, an embodiment of the present invention further provides a device for processing a breast image, where the device includes:
an image acquisition module, configured to acquire an original scan image and obtain a low-frequency image and a high-frequency image from the original scan image;
a gray-scale transformation region determination module, configured to determine a gray-scale transformation region of the low-frequency image according to a breast width, a breast compression thickness, and a gray-scale transformation distance determination model; and
a target image generation module, configured to perform gray-scale transformation on the gray-scale transformation region to obtain a thickness-equalized target low-frequency image, and to reconstruct the target low-frequency image and the high-frequency image to generate a target image.
In a third aspect, an embodiment of the present invention further provides a breast imaging apparatus, including: an X-ray source for emitting X-rays; a detector for collecting the X-rays emitted by the X-ray source; a breast support plate for supporting the breast; a compression plate for compressing the breast; and an image processor configured to acquire an original scan image collected by the detector and obtain a low-frequency image and a high-frequency image from it, determine a gray-scale transformation region of the low-frequency image according to a breast width, a breast compression thickness, and a gray-scale transformation distance determination model, perform gray-scale transformation on the gray-scale transformation region to obtain a thickness-equalized target low-frequency image, and reconstruct the target low-frequency image and the high-frequency image to generate a target image.
According to embodiments of the invention, an original scan image is acquired and a low-frequency image and a high-frequency image are obtained from it, so that image detail is preserved during processing and only the contrast of the image is changed, achieving a uniform gray-scale distribution. In acquiring the gray-scale transformation region, the breast width and breast compression thickness are used, so that individual differences between examinees and operational differences during image acquisition are taken into account; this solves the problem of poor thickness equalization caused by individual differences and makes the acquired gray-scale transformation region more accurate. The gray-scale transformation region is determined automatically by the gray-scale transformation distance determination model from the breast width and breast compression thickness, rather than empirically, which avoids over-reliance on empirical parameters, reduces the influence of subjective factors in processing, and further improves the accuracy of the acquired region. Performing gray-scale transformation on that region yields a thickness-equalized target low-frequency image, and reconstructing it with the high-frequency image generates the target image, so a breast image with a more uniform gray-scale distribution and a better thickness equalization effect is obtained, which better meets clinical requirements.
Drawings
Fig. 1 is a flowchart of a method for processing a breast image according to a first embodiment of the present invention;
FIG. 2 is a schematic view of breast imaging in an embodiment of the present invention;
FIG. 3 shows an original scan image and the corresponding scan image in the LOG domain in an embodiment of the present invention;
FIG. 4 is a breast segmentation template according to a first embodiment of the present invention;
FIG. 5 is a low-frequency original image, a low-frequency breast image, and a high-frequency original image according to a first embodiment of the present invention;
FIG. 6 is a flowchart of a method for processing a breast image according to a second embodiment of the present invention;
fig. 7 is a schematic structural diagram of a breast image processing apparatus according to a third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a method for processing a breast image according to an embodiment of the present invention. The method may be performed by a device for processing a breast image, which may be implemented in software and/or hardware and integrated into a medical apparatus capable of mammography, such as a breast flat-panel radiography system, a dedicated screen-film radiography system, or a full-field digital mammography (FFDM) system. As shown in fig. 1, the method of this embodiment specifically includes the following steps:
and S110, acquiring an original scanning image, and acquiring a low-frequency image and a high-frequency image according to the original scanning image.
In particular, the breast may be imaged using a mammography system such as an FFDM system. As shown in fig. 2, which is a cross-sectional view through the nipple while the breast is compressed, the imaging process is as follows: first, the support plate 201 and the compression plate 202 compress the breast 203, and then the breast 203 is imaged from the imaging view 204, yielding the original scan image 310 shown in fig. 3. As can be seen from fig. 2, when the breast 203 is compressed, the breast contour edge region 205 often cannot be compressed well because of the compression force and compression angle; that is, the compression of the breast is not uniform, so the compressed thickness of the breast 203 varies, and the resulting original scan image 310 in fig. 3 has an uneven gray-scale distribution. For example, the middle region 312 of the breast is clearly darker than the edge region 311 and the root region 313, and the gray level of the edge region 311 is close to that of the background region 314. Using the original scan image directly as a diagnostic reference is therefore unfavorable for diagnosing breast disease, so thickness equalization, i.e. gray-scale transformation, must be performed on it; more specifically, gray-scale transformation must be applied to the breast edge region of the original scan image to obtain a breast image that meets clinical requirements.
After the original scan image is obtained, it may be preprocessed in order to reduce computational complexity and the amount of computation. For example, using the fact that X-ray attenuation follows an exponential law, the original scan image is logarithmically transformed into a scan image in the logarithmic (LOG) domain, i.e., the LOG scan image 320 shown in fig. 3. The LOG scan image may then be filtered, for example with bilateral filtering (good detail preservation) or wavelet filtering (broad applicability), to obtain a low-frequency image and a high-frequency image. The low-frequency image determines the overall appearance (overall brightness) of the breast image, while the high-frequency image carries its detail, so the subsequent gray-scale transformation is applied mainly to the low-frequency image, changing only the contrast of the image without affecting its details.
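The LOG-domain decomposition described above can be sketched as follows. This is an illustrative implementation only: the patent suggests bilateral or wavelet filtering, for which a Gaussian low-pass stands in here as a simpler placeholder, and the `sigma` and `eps` values are assumptions, not values from the patent.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def decompose(raw, sigma=8.0, eps=1.0):
    """Split a raw scan into low- and high-frequency parts in the LOG domain.

    X-ray attenuation follows an exponential law, so the log transform
    linearizes thickness effects. The low-frequency part carries overall
    contrast; the residual high-frequency part carries image detail.
    """
    log_img = np.log(raw.astype(np.float64) + eps)  # LOG-domain scan image
    low = gaussian_filter(log_img, sigma)           # low frequency: overall shape
    high = log_img - low                            # high frequency: detail
    return low, high
```

Because `high` is defined as the residual, adding the two parts back together recovers the LOG scan image exactly, which is what makes the later reconstruction step (target low-frequency image plus high-frequency image) lossless with respect to detail.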
Of course, before or after the logarithmic transformation, or before or after the filtering, the breast region may be extracted (i.e., cropped) using the breast segmentation template of fig. 4, giving a breast image that contains only the breast region and no background. The breast segmentation template is obtained in advance by performing segmentation detection on the original scan image and removing the directly exposed area (i.e., the background area), the chest wall area, any implant area, and the like. That is, the preprocessing sequence may be cropping, logarithmic transformation, then filtering; or logarithmic transformation, cropping, then filtering; or logarithmic transformation, filtering, then cropping; the execution order may be chosen according to actual needs.
S120, determining a gray-scale transformation region of the low-frequency image according to the breast width, the breast compression thickness, and the gray-scale transformation distance determination model.
The breast width refers to the width of the widest part of the breast in the breast image; specifically, it may be the maximum perpendicular distance from the breast contour to the image edge on the side facing away from the contour. Referring to fig. 2, the breast width 206 generally corresponds to the perpendicular distance from the nipple to that image edge. The breast compression thickness refers to the thickness of the portion of the compressed breast in contact with the compression plate during imaging; referring to fig. 2, the thickness of the portion of the compressed breast 203 in contact with the compression plate 202 is the breast compression thickness 207, which can generally be obtained from the imaging parameters of the mammography system. The gray-scale transformation distance determination model is a model established in advance from training sample data for determining the gray-scale transformation distance; it may be a statistical model or an intelligent algorithm model such as a machine learning model.
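Measuring the breast width from a segmentation mask can be sketched as below. The orientation (chest wall along column 0, breast extending toward higher column indices) is an assumption for illustration; the patent only defines the width geometrically.

```python
import numpy as np

def breast_width(mask):
    """Breast width from a binary segmentation mask (True = breast pixel).

    Assuming the chest wall lies along column 0, the width is the largest
    column index (plus one) reached by any breast pixel, i.e. the maximum
    perpendicular distance from the breast contour to the chest-wall edge.
    """
    cols = np.where(mask.any(axis=0))[0]  # columns containing breast pixels
    return 0 if cols.size == 0 else int(cols.max()) + 1
```

In practice the mask would come from the breast segmentation template of fig. 4; here it is simply a boolean array.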
Specifically, the breast width and breast compression thickness are obtained as model parameters of the gray-scale transformation distance determination model; the gray-scale transformation distance is then computed from these parameters with the model, and the region of the low-frequency image requiring thickness equalization is determined from that distance.
Illustratively, the gray-scale transformation distance determination model is obtained by pre-training as follows: at least two groups of historical breast images and the historical breast compression thicknesses corresponding to them are acquired; the historical breast width and historical gray-scale transformation distance are determined from each historical breast image, and the historical breast width, historical breast compression thickness, and historical gray-scale transformation distance are taken as model training parameters; model training is then performed using these parameters and a set model to obtain the gray-scale transformation distance determination model.
A historical breast image is a breast image of a previous examinee obtained clinically before the current examinee's image is taken. The historical breast compression thickness is the compression thickness corresponding to a historical breast image, i.e., the value reported by the mammography system when that image was acquired; the historical breast width is the breast width corresponding to the historical image. The gray-scale transformation distance is the distance within a breast image over which gray-scale transformation is required, generally the width of the uncompressed breast portion during imaging; in fig. 2, for example, the width of the portion not in contact with the compression plate, i.e., the breast contour edge region 205, is the gray-scale transformation distance 208. The historical gray-scale transformation distance is the distance corresponding to a historical breast image, generally a value adjusted manually and empirically to improve the equalization effect when that image was thickness-equalized. The set model is the preset form of the gray-scale transformation distance determination model, which may be a statistical model or an intelligent algorithm model; a statistical model may be linear or nonlinear. Preferably, in this embodiment, the set model is chosen to be a statistical model in view of computational complexity; and since the gray-scale transformation distance is related to both the breast width and the breast compression thickness, the set model is more specifically a binary linear model.
Specifically, the corresponding historical breast width and historical gray-scale transformation distance are determined from each acquired historical breast image. Each historical breast image thus has a corresponding set of historical breast width, historical breast compression thickness, and historical gray-scale transformation distance, recorded as one group of model training parameters; at least two historical breast images therefore give at least two groups of model training parameters. The gray-scale transformation distance is then set as the dependent variable of the set model, and the breast width and breast compression thickness as its independent variables, so the specific form of the set model can be written as d = a·W + b·T + c, where d is the gray-scale transformation distance, W the breast width, and T the breast compression thickness.
Model fitting is then performed on the set model using these groups of model training parameters to determine the specific values of the model coefficients a, b and c, yielding the gray-scale transformation distance determination model, which can subsequently be used to determine gray-scale transformation distances.
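The fitting step above amounts to an ordinary least-squares fit of the binary linear model. A minimal sketch follows; the variable names and the use of `numpy.linalg.lstsq` are illustrative choices, since the patent fixes only the model form d = a·W + b·T + c, not the fitting procedure.

```python
import numpy as np

def fit_distance_model(widths, thicknesses, distances):
    """Least-squares fit of d = a*W + b*T + c.

    W: historical breast widths, T: historical compression thicknesses,
    d: historical (manually tuned) gray-scale transformation distances.
    Returns the fitted coefficients (a, b, c).
    """
    W = np.asarray(widths, dtype=float)
    T = np.asarray(thicknesses, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = np.column_stack([W, T, np.ones_like(W)])  # design matrix [W, T, 1]
    (a, b, c), *_ = np.linalg.lstsq(A, d, rcond=None)
    return a, b, c

def predict_distance(a, b, c, width, thickness):
    """Gray-scale transformation distance for a new examinee."""
    return a * width + b * thickness + c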
Of course, during clinical application the gray-scale transformation distance determination model can be further optimized using application data. That is, if in clinical use the distance given by the model leads to a less-than-ideal thickness equalization result, the distance can be adjusted manually until a better result is obtained; the adjusted distance, together with the corresponding breast width and breast compression thickness, is then used as a new group of model training parameters to further optimize the model.
S130, performing gray-scale transformation on the gray-scale transformation region to obtain a thickness-equalized target low-frequency image, and reconstructing the target low-frequency image and the high-frequency image to generate a target image.
Specifically, after the gray-scale transformation region is determined, the corresponding gray-scale transformation parameters may be obtained, and that region of the low-frequency image obtained in S110 is transformed with these parameters to produce a thickness-equalized low-frequency image, i.e., the target low-frequency image. The target low-frequency image is then reconstructed with the high-frequency image obtained in S110, i.e., the gray values of corresponding pixels in the two images are added, to obtain the thickness-equalized image, i.e., the target image.
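The transform-and-reconstruct step can be sketched as below. The multiplicative `gain` is a stand-in for the gray-scale transformation parameters, which the patent derives separately; using a single scalar correction inside the region is an assumption for illustration only.

```python
import numpy as np

def equalize_and_reconstruct(low, high, region_mask, gain):
    """Thickness-equalization sketch in the LOG domain.

    Scales the low-frequency values inside the gray-scale transformation
    region (the under-compressed breast edge), then adds back the
    high-frequency image pixel-wise to form the target image.
    """
    target_low = low.copy()            # leave the input low-frequency image intact
    target_low[region_mask] *= gain    # gray-scale transform the edge region only
    return target_low + high           # pixel-wise reconstruction
```

Because only the low-frequency part is modified, image detail (carried by `high`) is preserved and only the local contrast changes, which is the stated goal of the decomposition.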
According to the technical scheme of this embodiment, an original scan image is acquired and a low-frequency image and a high-frequency image are obtained from it, so that image detail is preserved during processing and only the contrast of the image is changed, achieving a uniform gray-scale distribution. In acquiring the gray-scale transformation region, the breast width and breast compression thickness are used, so that individual differences between examinees and operational differences during image acquisition are taken into account; this solves the problem of poor thickness equalization caused by individual differences and makes the acquired gray-scale transformation region more accurate. The gray-scale transformation region is determined automatically by the gray-scale transformation distance determination model from the breast width and breast compression thickness, rather than empirically, which avoids over-reliance on empirical parameters, reduces the influence of subjective factors in processing, and further improves the accuracy of the acquired region. Performing gray-scale transformation on that region yields a thickness-equalized target low-frequency image, and reconstructing it with the high-frequency image generates the target image, so a breast image with a more uniform gray-scale distribution and a better thickness equalization effect is obtained, which better meets clinical requirements.
On the basis of the above technical solution, S110 may preferably be: acquiring an original scan image, and filtering it to obtain a low-frequency original image and a high-frequency original image; and segmenting the low-frequency original image to obtain a low-frequency breast image. Correspondingly, S130 may be: performing gray-scale transformation on the gray-scale transformation region of the low-frequency original image to obtain a thickness-equalized target low-frequency original image; and reconstructing the target low-frequency original image and the high-frequency original image to generate a target original image.
Specifically, an original scan image is acquired and LOG-transformed to obtain a LOG scan image. The LOG scan image is then filtered to obtain the corresponding low-frequency original image 501 and high-frequency original image 502, both of which include the background region, as shown in fig. 5. The low-frequency original image 501 is then cropped using the breast segmentation template of fig. 4 to obtain the low-frequency breast image 503, which serves as the base image for the subsequent determination of gray-scale transformation parameters. The advantage of this arrangement is that the preprocessing operations of logarithmic transformation, filtering, and cropping effectively reduce the data volume of the image, thereby reducing the computational complexity and amount of computation in the subsequent determination of gray-scale transformation parameters. Of course, the low-frequency and high-frequency original images may also be obtained directly from the original scan image.
Accordingly, in S130, after the gray-scale transformation region is determined, the corresponding gray-scale transformation parameters may be obtained and used to transform that region of the low-frequency original image 501, producing a thickness-equalized low-frequency original image, i.e., the target low-frequency original image, which includes the background region. The target low-frequency original image and the high-frequency original image 502 are then reconstructed to obtain a thickness-equalized original image, i.e., the target original image. The image finally applied clinically is thus a complete image that fully corresponds to the original scan image and contains the background region, which can facilitate subsequent processing.
On the basis of the above technical solution, S110 may preferably also be: acquiring an original scan image, and obtaining a breast image from it; and filtering the breast image to obtain a low-frequency breast image and a high-frequency breast image. Correspondingly, in S130, the gray-scale transformation region of the low-frequency breast image may be transformed to obtain a thickness-equalized target low-frequency breast image; and the target low-frequency breast image and the high-frequency breast image are reconstructed to generate a target breast image.
Specifically, an original scan image is acquired and LOG-transformed to obtain a LOG scan image, which is then cropped using the breast segmentation template of fig. 4 to obtain a breast image containing only the breast region. The breast image is then filtered to obtain the corresponding low-frequency and high-frequency breast images; the low-frequency breast image is the image 503 shown in fig. 5. The advantage of this arrangement is that the preprocessing operations of logarithmic transformation, cropping, and filtering further reduce the data volume of the image, thereby reducing the computational complexity and amount of computation in the subsequent determination of gray-scale transformation parameters and generation of the target image. Likewise, the object of the cropping may also be the original scan image.
Accordingly, in S130, after the gray scale transformation region is determined, the corresponding gray scale transformation parameters are obtained and used to perform gray scale transformation on that region of the acquired low-frequency breast image, yielding a thickness-balanced low-frequency breast image, i.e. the target low-frequency breast image, which contains only the breast region. The target low-frequency breast image and the acquired high-frequency breast image are then reconstructed into a thickness-balanced breast image, i.e. the target breast image. The image finally obtained for clinical use contains no background region, which reduces the data volume in subsequent processing operations.
Example two
Fig. 6 is a flowchart of a method for processing a breast image according to a second embodiment of the present invention. In this embodiment, the step of "determining the gray scale transformation region of the low-frequency image according to the breast width, the breast compression thickness and the gray scale transformation distance determination model" is further optimized on the basis of the first embodiment, and the gray scale transformation region itself may be further refined on that basis. Explanations of terms identical or corresponding to those of the above embodiment are omitted here.
A method for processing a breast image according to a second embodiment of the present invention is described below with reference to fig. 6, where the method of this embodiment includes:
s210, acquiring an original scanning image, and acquiring a low-frequency image and a high-frequency image according to the original scanning image.
S220, determining the gray scale transformation distance corresponding to the low-frequency image according to the breast width, the breast compression thickness and the gray scale transformation distance determination model.
Specifically, a corresponding breast image is acquired from any one of the original scan image, the low-frequency image and the high-frequency image, and its breast width is then determined from that breast image. The breast width may be determined by first obtaining, for each pixel on the breast contour, the perpendicular distance to the image edge on the side facing away from the contour, and taking the maximum of these distances as the breast width. However, since the breast image is segmented from the original scan image, when the original scan image includes other body regions such as an arm, the breast segmentation template may include them as well, so the breast image may contain body regions other than the breast. The maximum width in the breast image therefore does not necessarily correspond to the breast width, and the breast image needs to be further processed to remove the image portions belonging to other body regions. The perpendicular distance from each pixel on the breast contour of the processed image to the opposite image edge is then obtained, and the maximum value is taken as the breast width. The image portions possibly occupied by other body regions can usually be identified by eye in clinically acquired breast images, so an exclusion range can be defined from human experience.
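A minimal sketch of the width measurement, assuming a binary breast mask whose chest-wall side lies along column 0 (a common mammography layout, not stated explicitly in the text); under that assumption the maximal perpendicular distance from the contour to the opposite edge is simply the maximal column extent of the mask.

```python
import numpy as np

def breast_width(mask):
    """Breast width from a binary breast mask, in pixels.

    Assumes the chest-wall edge is at column 0, so the width is the
    largest column extent of the foreground.
    """
    cols = np.where(mask.any(axis=0))[0]
    return int(cols.max()) + 1 if cols.size else 0

# Toy mask: a breast-like region touching the left (chest-wall) edge.
mask = np.zeros((6, 8), dtype=bool)
mask[1:5, 0:5] = True
width = breast_width(mask)
```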
Then, using the acquired breast width W and breast compression thickness T as input parameters, the gray scale transformation distance corresponding to the low-frequency image can be obtained through the gray scale transformation distance determination model Dis = a × W + b × T + c.
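The model is a single linear expression, so its evaluation is trivial; the coefficient values below are made-up placeholders, since in the patent a, b and c come out of the pre-trained model.

```python
def gray_transform_distance(width, thickness, a=0.1, b=0.2, c=5.0):
    """Dis = a*W + b*T + c; the default a, b, c are placeholders only,
    the real coefficients come from model training."""
    return a * width + b * thickness + c

dis = gray_transform_distance(100.0, 50.0)   # placeholder W and T values
```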
And S230, determining the region extending from the breast contour of the low-frequency image inward by the gray scale transformation distance as the gray scale transformation region of the low-frequency image.
Specifically, after the gray scale conversion distance is determined, a region in the low-frequency image, which needs to be subjected to gray scale conversion, that is, a gray scale conversion region, can be determined according to the gray scale conversion distance. Namely: starting from the breast contour of the low-frequency image, extending the length of the gray level transformation distance radially to the inside of the breast region, a boundary similar to the breast contour can be determined, and the region between the boundary and the breast contour is the gray level transformation region to be determined. Referring to fig. 3, taking a LOG scan image 320 as an example, in the LOG scan image 320, starting from a breast contour 321, a length of a gray scale transformation distance 323 extends toward the inside of the breast region along a radial direction 322 of the breast contour 321, so as to obtain a breast contour similar boundary 324, and a region formed between the breast contour similar boundary 324 and the breast contour 321 is a gray scale transformation region 325 in the LOG scan image 320.
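The band between the contour and the contour-like inner boundary can be computed as the set of breast pixels whose minimum distance to the contour does not exceed Dis. The sketch below does this by brute force for clarity; a distance transform would replace the inner loop in a real implementation.

```python
import numpy as np

def contour_pixels(mask):
    """Contour pixels: foreground pixels with a background 4-neighbour.

    Edge-mode padding keeps image borders (e.g. the chest-wall edge)
    from being counted as contour.
    """
    padded = np.pad(mask, 1, mode="edge")
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    return np.argwhere(mask & ~interior)

def gray_transform_region(mask, dis):
    """Breast pixels whose minimum Euclidean distance to the breast
    contour is at most `dis` (brute force)."""
    cont = contour_pixels(mask).astype(float)
    region = np.zeros_like(mask)
    for y, x in np.argwhere(mask):
        d = np.sqrt(((cont - (y, x)) ** 2).sum(axis=1)).min()
        region[y, x] = d <= dis
    return region

mask = np.zeros((7, 7), dtype=bool)
mask[1:6, 1:6] = True                      # toy 5x5 breast region
region = gray_transform_region(mask, dis=1.0)
```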
S240, determining a gray value transformation interval according to the gray values of the gray scale transformation region.
Specifically, to perform gray scale transformation on the gray scale transformation region of the low-frequency image, specific gray scale transformation parameters are needed as the operational basis of the transformation, for example a gray scale transformation formula corresponding to at least one gray value interval to be transformed. To determine these parameters, a reasonable gray value transformation interval is determined first: the target maximum gray value (the interval maximum of the gray value transformation interval) and the target minimum gray value (the interval minimum of the gray value transformation interval) that need to undergo gray scale transformation in the low-frequency image are determined, and the interval is formed from these two values. That is, the gray value transformation interval is determined according to the gray values of the gray scale transformation region. The gray values of all pixels of the low-frequency image that fall within this interval are then transformed according to the subsequently determined gray scale transformation parameters.
Exemplarily, S240 may include:
respectively determining a first pixel point set and a second pixel point set corresponding to a first edge and a second edge of the gray scale transformation area; determining the maximum gray value of the first pixel point set, and determining the maximum gray value as the maximum interval value of the gray value transformation interval; determining a gray level average value of the second pixel point set, and determining the gray level average value as a minimum interval value of a gray level value transformation interval; and determining the gray value transformation interval according to the interval maximum value and the interval minimum value.
The first edge is a breast contour in the gray scale transformation region, and the second edge is an edge of the gray scale transformation region far away from the breast contour. Referring to fig. 3, taking the LOG scan image 320 as an example, the breast contour 321 is a first edge in the gray scale transformed region 325, and the breast contour similarity boundary 324 is a second edge in the gray scale transformed region 325. Correspondingly, the first pixel point set is the pixel point set formed by all the pixel points on the breast contour, and the second pixel point set is the pixel point set formed by all the pixel points on the second edge.
Specifically, in the gray value transformation area, gray value statistics is performed on gray values of all pixel points in the first pixel point set, a maximum gray value is found, and the maximum gray value is used as an interval maximum value of a gray value transformation interval. Meanwhile, mean value calculation is carried out on the gray values of all the pixels in the second pixel point set to obtain a gray mean value, and the gray mean value is used as the minimum interval value of the gray value transformation interval. Then, a gradation value conversion section is determined from the determined section maximum value and section minimum value. It should be noted that in this embodiment, the order of acquiring the interval endpoint values of the gray value conversion interval is not limited, that is, the above-mentioned execution order of determining the interval maximum value and the interval minimum value may be performed in sequence, or may be performed in reverse order.
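The two endpoint rules above reduce to a maximum and a mean over the two edge pixel sets. All gray values below are invented for illustration.

```python
import numpy as np

# Hypothetical gray values sampled along the two edges of the gray scale
# transformation region (numbers are made up for illustration).
first_edge_grays = np.array([520, 580, 550])    # pixels on the breast contour
second_edge_grays = np.array([300, 320, 340])   # pixels on the inner boundary

interval_max = int(first_edge_grays.max())       # maximum gray on the contour
interval_min = float(second_edge_grays.mean())   # mean gray of the inner edge
```

As the text notes, the order in which the two endpoints are computed does not matter.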
And S250, determining the gray level transformation parameters of the low-frequency image according to the gray level value transformation interval.
Specifically, after the gray value transformation interval is determined, the gray scale transformation formula within it can be determined according to actual needs. The interval may be kept whole or divided into several gray value transformation sub-intervals; the division may be empirical, or performed automatically according to the gray values of the interval, for example by gray-level equalization or according to the gray values corresponding to different distances from the breast contour. The transformation formula for each interval or sub-interval may be linear or non-linear, as required by the actual application.
Illustratively, S250 may include:
dividing the gray value transformation interval into N gray value transformation subintervals, and determining a gray value transformation line segment corresponding to each gray value transformation subinterval, wherein N is a positive integer; and performing curve fitting on the N gray level transformation line segments to obtain gray level transformation parameters of the low-frequency image.
Specifically, a distance-gray curve of the breast region corresponding to the gray value transformation interval may be determined first, and the interval may then be divided into a positive integer number N of gray value transformation sub-intervals according to that curve. Here, the distance refers to the minimum distance from a pixel in the image to the breast contour: connecting the pixel to every pixel on the contour yields a set of line segments of various lengths, and the shortest of them is the minimum distance from that pixel to the contour.
According to the description of S240, the two edges of the gray value transformation interval correspond to the first edge and the second edge, so the minimum distances from the different pixels of the corresponding breast region to the breast contour can be obtained by traversing all pixels between the second edge and the first edge. For each minimum distance there is at least one corresponding pixel, and the gray value associated with that distance is obtained by averaging the gray values of those pixels. From the traversal result, the distinct minimum distances and their corresponding gray values can thus be obtained simultaneously, and a distance-gray curve can be established with the minimum distance as the abscissa and the corresponding gray value as the ordinate.
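The curve construction above is a group-by-distance average. A minimal sketch, with invented per-pixel samples:

```python
import numpy as np

def distance_gray_curve(min_dists, grays):
    """Discrete distance-gray curve: for every distinct minimum distance,
    average the gray values of the pixels at that distance."""
    min_dists = np.asarray(min_dists)
    grays = np.asarray(grays, dtype=float)
    xs = np.unique(min_dists)                                  # abscissas
    ys = np.array([grays[min_dists == d].mean() for d in xs])  # ordinates
    return xs, ys

# Hypothetical (min distance to contour, gray value) samples per pixel.
d = np.array([0, 0, 1, 1, 2])
g = np.array([500.0, 540.0, 420.0, 460.0, 400.0])
xs, ys = distance_gray_curve(d, g)
```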
It can be understood that, in actual processing, the distance-gray curve is a discrete curve with finitely many abscissa values: the minimum abscissa is the minimum distance of the pixel corresponding to the interval maximum, the maximum abscissa is the minimum distance of the pixel corresponding to the interval minimum, and the abscissa values in between are the actually counted minimum distances, which form a finite set rather than a numerical continuum. The gray value transformation interval may then be divided into a corresponding number of sub-intervals, i.e. N gray value transformation sub-intervals, according to the finite number (assumed N) of gray values on the curve other than those corresponding to the maximum and minimum abscissas.
Then, the gray scale transformation line segment corresponding to each sub-interval may be determined. The slope of each segment may be set manually, or determined from the two endpoint values of the sub-interval and the endpoint values of the whole interval; for example, the slope may be defined as the quotient of twice the interval minimum of the gray value transformation interval and the sum of the two endpoint values of the sub-interval. Once the slope is determined, the segment expression can be obtained. For the first segment, the initial point is an endpoint value of the gray value transformation interval, so its expression follows directly from the point-slope form; the initial point of each subsequent segment is the terminal point of the previous one, whose value can be computed from the previous segment's expression and the sub-interval endpoint, so every segment's expression can likewise be obtained by the point-slope form. In this way, the gray scale transformation line segment for each sub-interval is determined.
Then, all the obtained gray scale transformation line segments may be fitted to obtain the gray scale transformation parameters; for example, curve fitting may be performed with least squares, Lagrangian interpolation, Newton iteration or cubic spline interpolation to obtain a gray scale transformation curve, whose parameters are the gray scale transformation parameters. The advantage of determining the parameters in this finer-grained way is that gray-level jumps in the transformed low-frequency image are reduced, making the gray scale transformation effect more continuous and smooth.
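The segment chaining and the fitting step can be sketched together. The slope rule follows the example quoted in the text (twice the interval minimum divided by the sum of the sub-interval endpoints); starting the first segment on the identity line y = x is an assumption made here for illustration, as is the choice of a least-squares polynomial (one of the listed fitting options) and all numeric values.

```python
import numpy as np

def segment_chain(breakpoints, gmin):
    """Chain of gray-transform line segments over the sub-intervals
    given by `breakpoints`; each segment starts at the previous
    segment's terminal point (point-slope form)."""
    xs, ys = [breakpoints[0]], [float(breakpoints[0])]   # assumed start on y = x
    for x0, x1 in zip(breakpoints, breakpoints[1:]):
        k = 2.0 * gmin / (x0 + x1)       # slope rule from the text's example
        ys.append(ys[-1] + k * (x1 - x0))
        xs.append(x1)
    return np.array(xs), np.array(ys)

# Hypothetical interval [320, 580] split into three sub-intervals.
bp = np.array([320.0, 400.0, 480.0, 580.0])
xs, ys = segment_chain(bp, gmin=320.0)

# Smooth the piecewise-linear mapping with a least-squares polynomial;
# its coefficients stand in for the gray scale transformation parameters.
coeffs = np.polyfit(xs, ys, 3)
```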
And S260, carrying out gray level transformation on the gray level transformation area according to the gray level transformation parameters to obtain a target low-frequency image with balanced thickness, and reconstructing the target low-frequency image and the high-frequency image to generate a target image.
In the technical solution of this embodiment, the gray scale transformation distance of the low-frequency image is determined from the breast width, the breast compression thickness and the gray scale transformation distance determination model, so the distance is obtained automatically, individual differences between examinees are taken into account, the processing flow is automated, and the accuracy of the acquired distance is improved. Determining the gray value transformation interval from that distance, and the transformation parameters from the interval, further automates the breast image processing flow, reduces manual involvement, and further improves processing accuracy. A breast image with more uniform gray distribution and better thickness equalization is thus obtained, which better meets clinical requirements.
The following is an embodiment of a device for processing a breast image according to an embodiment of the present invention, which belongs to the same inventive concept as the method for processing a breast image according to the above embodiments, and reference may be made to the above embodiment of the method for processing a breast image for details which are not described in detail in the embodiment of the device for processing a breast image.
EXAMPLE III
Fig. 7 is a schematic structural diagram of a breast image processing device provided in the third embodiment, where the device specifically includes: an image acquisition module 710, a gray scale transformation region determination module 720, and a target image generation module 730.
The image obtaining module 710 is configured to obtain an original scanned image, and obtain a low-frequency image and a high-frequency image according to the original scanned image;
a gray level transformation region determining module 720, configured to determine the gray level transformation region of the low-frequency image according to the breast width, the breast compression thickness and the gray scale transformation distance determination model;
and the target image generation module 730 is configured to perform gray scale transformation on the gray scale transformation region determined by the gray scale transformation region determination module 720 to obtain a target low-frequency image with balanced thickness, and reconstruct the target low-frequency image and the high-frequency image to generate a target image.
The breast image processing device solves the problems that breast image processing relies too heavily on empirical parameters and that the equalization effect is poor: the gray scale transformation region requiring thickness equalization is acquired accurately and automatically, a breast image with more uniform gray distribution is obtained, and a better breast image thickness equalization effect is achieved.
Optionally, the image obtaining module 710 is specifically configured to:
acquiring an original scanning image, and filtering the original scanning image to respectively acquire a low-frequency original image and a high-frequency original image;
segmenting the low-frequency original image to obtain a low-frequency mammary gland image;
correspondingly, the target image generation module 730 is specifically configured to:
carrying out gray level transformation on the gray level transformation area of the low-frequency original image to obtain a target low-frequency original image with balanced thickness;
and reconstructing the target low-frequency original image and the high-frequency original image to generate a target original image.
Optionally, the image obtaining module 710 is further specifically configured to:
acquiring an original scanning image, and acquiring a mammary gland image according to the original scanning image;
filtering the mammary gland image to respectively obtain a low-frequency mammary gland image and a high-frequency mammary gland image;
correspondingly, the target image generation module 730 is further specifically configured to:
carrying out gray level transformation on the gray level transformation area of the low-frequency mammary gland image to obtain a target low-frequency mammary gland image with balanced thickness;
and reconstructing the target low-frequency mammary gland image and the high-frequency mammary gland image to generate a target mammary gland image.
Optionally, the gray scale transformation area determining module 720 is specifically configured to:
determining the gray scale transformation distance according to the breast width, the breast compression thickness and the gray scale transformation distance determination model;
and determining the region extending from the breast contour of the low-frequency image inward by the gray scale transformation distance as the gray scale transformation region.
Optionally, the target image generation module 730 includes:
the gray value transformation interval determination submodule is used for determining a gray value transformation interval according to the gray value of the gray value transformation area;
the gray level transformation parameter determination submodule is used for determining the gray level transformation parameter of the low-frequency image according to the gray level transformation interval;
and the gray level conversion submodule is used for carrying out gray level conversion on the gray level conversion area according to the gray level conversion parameter.
Further, the gray value transformation interval determination submodule is specifically configured to:
respectively determining a first pixel point set and a second pixel point set corresponding to a first edge and a second edge of the gray scale transformation region, wherein the first edge is a breast contour in the gray scale transformation region, and the second edge is a side edge far away from the breast contour in the gray scale transformation region;
determining the maximum gray value of the first pixel point set, and determining the maximum gray value as the maximum interval value of the gray value transformation interval;
determining a gray level average value of the second pixel point set, and determining the gray level average value as a minimum interval value of a gray level value transformation interval;
and determining the gray value transformation interval according to the interval maximum value and the interval minimum value.
Further, the gray scale transformation parameter determination submodule is specifically configured to:
dividing the gray value transformation interval into N gray value transformation subintervals, and determining a gray value transformation line segment corresponding to each gray value transformation subinterval, wherein N is a positive integer;
and performing curve fitting on the N gray level transformation line segments to obtain gray level transformation parameters of the low-frequency image.
Optionally, on the basis of the above apparatus, the apparatus further includes: a gray-scale transformation distance determination model training module 740, the gray-scale transformation distance determination model training module 740 configured to pre-train the gray-scale transformation distance determination model by:
acquiring at least two groups of historical breast images and historical breast compression thicknesses corresponding to the historical breast images;
determining historical breast width and historical gray scale transformation distance according to the historical breast image, and determining the historical breast width, the historical breast compression thickness and the historical gray scale transformation distance as model training parameters;
and performing model training by using the model training parameters and a set model to obtain a gray level transformation distance determination model, wherein the set model is a statistical model.
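The training steps above amount to fitting a = , b, c of Dis = a*W + b*T + c to historical triples. A minimal sketch with linear least squares (one way to realise the "statistical model"); all training numbers are synthetic placeholders, not clinical data.

```python
import numpy as np

# Hypothetical historical data: breast width W, compression thickness T,
# and the gray scale transformation distance Dis measured from past exams.
W = np.array([180.0, 200.0, 150.0, 220.0, 170.0])
T = np.array([45.0, 60.0, 40.0, 55.0, 50.0])
Dis = 0.08 * W + 0.3 * T + 4.0          # synthetic ground-truth distances

# Fit Dis = a*W + b*T + c by linear least squares.
A = np.column_stack([W, T, np.ones_like(W)])
a, b, c = np.linalg.lstsq(A, Dis, rcond=None)[0]
```

Once a, b and c are fixed, the trained model is exactly the expression Dis = a × W + b × T + c used in S220.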
The processing device for the mammary gland image provided by the embodiment of the invention can execute the processing method for the mammary gland image provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
It should be noted that, in the embodiment of the processing apparatus for a breast image, the units and modules included in the processing apparatus are only divided according to the functional logic, but are not limited to the above division as long as the corresponding functions can be realized; in addition, specific names of the functional units and modules are only used for distinguishing one functional unit from another, and are not used for limiting the protection scope of the invention.
Example four
A fourth embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, perform a method for processing a breast image, the method including:
acquiring an original scanning image, and acquiring a low-frequency image and a high-frequency image according to the original scanning image;
determining a gray scale transformation region of the low-frequency image according to the breast width, the breast compression thickness and a gray scale transformation distance determination model;
and carrying out gray level transformation on the gray level transformation area to obtain a target low-frequency image with balanced thickness, and reconstructing the target low-frequency image and the high-frequency image to generate a target image.
Of course, the storage medium containing the computer-executable instructions provided by the embodiments of the present invention is not limited to the method operations described above, and may also perform related operations in the method for processing a breast image provided by any embodiment of the present invention.
From the above description of the embodiments, it is obvious for those skilled in the art that the present invention can be implemented by software and necessary general hardware, and certainly, can also be implemented by hardware, but the former is a better embodiment in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, and the computer software product may be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) to execute the method for processing a breast image according to the embodiments of the present invention.
EXAMPLE five
An embodiment of the present invention provides a breast imaging apparatus, which at least includes:
an X-ray source for emitting X-rays;
the detector is used for acquiring X rays emitted by the X-ray source and is used for acquiring an original scanning image;
a breast support plate for placing a breast;
a compression plate for compressing the breast; and
an image processor for processing an image, wherein the image processor is configured to perform a method of processing a breast image, the method comprising:
acquiring an original scanning image acquired by a detector, and acquiring a low-frequency image and a high-frequency image according to the original scanning image;
determining a gray scale transformation region of the low-frequency image according to the breast width, the breast compression thickness and a gray scale transformation distance determination model;
and carrying out gray level transformation on the gray level transformation area to obtain a target low-frequency image with balanced thickness, and reconstructing the target low-frequency image and the high-frequency image to generate a target image.
Of course, the image processor of the breast imaging apparatus provided by the embodiment of the present invention is not limited to the method operations described above, and may also perform related operations in the breast image processing method provided by any embodiment of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (9)

1. A method for processing a breast image, comprising:
acquiring an original scanning image, and acquiring a low-frequency image and a high-frequency image according to the original scanning image;
determining a gray scale transformation region of the low-frequency image according to a breast width, a breast compression thickness and a gray scale transformation distance determination model;
carrying out gray level transformation on the gray level transformation area to obtain a target low-frequency image with balanced thickness, and reconstructing the target low-frequency image and the high-frequency image to generate a target image;
wherein the determining a gray scale transformation region of the low-frequency image according to the breast width, the breast compression thickness and the gray scale transformation distance determination model comprises:
determining the gray scale transformation distance corresponding to the low-frequency image according to the breast width, the breast compression thickness and the gray scale transformation distance determination model, wherein the gray scale transformation distance determination model is a model established in advance from training sample data for determining the gray scale transformation distance, and is a statistical model or a machine learning model;
and determining a region from the mammary gland contour of the low-frequency image to the gray scale transformation distance range as a gray scale transformation region of the low-frequency image.
2. The method of claim 1, wherein the acquiring an original scan image and obtaining a low-frequency image and a high-frequency image from the original scan image comprises:
acquiring an original scan image, and filtering the original scan image to obtain a low-frequency original image and a high-frequency original image, respectively;
segmenting the low-frequency original image to obtain a low-frequency breast image;
and acquiring the breast width from the low-frequency breast image;
correspondingly, the performing gray scale transformation on the gray scale transformation region to obtain a thickness-equalized target low-frequency image, and reconstructing the target low-frequency image and the high-frequency image to generate a target image comprises:
performing gray scale transformation on the gray scale transformation region of the low-frequency original image to obtain a thickness-equalized target low-frequency original image;
and reconstructing the target low-frequency original image and the high-frequency original image to generate a target original image.
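The filtering step in claim 2 splits the scan into a low-frequency image and a high-frequency residual whose sum reconstructs the input. A minimal sketch, assuming a simple box low-pass filter in place of the unspecified filter:

```python
import numpy as np

def split_frequency(img: np.ndarray, ksize: int = 5):
    """Split img into a smoothed low-frequency part and the high-frequency
    residual, so that low + high reconstructs the input exactly."""
    pad = ksize // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    # Box low-pass filter (a stand-in for whatever filter the method uses).
    low = np.zeros(img.shape, dtype=float)
    for dy in range(ksize):
        for dx in range(ksize):
            low += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    low /= ksize * ksize
    high = img - low                  # residual carries edges and detail
    return low, high

img = np.arange(36, dtype=float).reshape(6, 6)
low, high = split_frequency(img)
```

Because the decomposition is additive, reconstruction after transforming only the low-frequency part preserves the high-frequency detail, which is the point of the claimed pipeline.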
3. The method of claim 1, wherein the acquiring an original scan image and obtaining a low-frequency image and a high-frequency image from the original scan image comprises:
acquiring an original scan image, and obtaining a breast image from the original scan image;
filtering the breast image to obtain a low-frequency breast image and a high-frequency breast image, respectively;
and acquiring the breast width from the low-frequency breast image or the high-frequency breast image;
correspondingly, the performing gray scale transformation on the gray scale transformation region to obtain a thickness-equalized target low-frequency image, and reconstructing the target low-frequency image and the high-frequency image to generate a target image comprises:
performing gray scale transformation on the gray scale transformation region of the low-frequency breast image to obtain a thickness-equalized target low-frequency breast image;
and reconstructing the target low-frequency breast image and the high-frequency breast image to generate a target breast image.
4. The method of claim 1, wherein the performing gray scale transformation on the gray scale transformation region comprises:
determining a gray value transformation interval according to the gray values of the gray scale transformation region;
determining a gray scale transformation parameter of the low-frequency image according to the gray value transformation interval;
and performing gray scale transformation on the gray scale transformation region according to the gray scale transformation parameter.
5. The method of claim 4, wherein the determining a gray value transformation interval according to the gray values of the gray scale transformation region comprises:
determining a first pixel point set and a second pixel point set corresponding to a first edge and a second edge of the gray scale transformation region, respectively, wherein the first edge is the breast contour of the gray scale transformation region, and the second edge is the edge of the gray scale transformation region farther from the breast contour;
determining the maximum gray value of the first pixel point set, and taking the maximum gray value as the interval maximum of the gray value transformation interval;
determining the mean gray value of the second pixel point set, and taking the mean gray value as the interval minimum of the gray value transformation interval;
and determining the gray value transformation interval from the interval maximum and the interval minimum.
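The interval bounds in the claim above reduce to a maximum over the contour-edge pixels and a mean over the far-edge pixels. A sketch with hypothetical helper and argument names:

```python
import numpy as np

def gray_value_interval(contour_values, far_edge_values):
    """Interval maximum: max gray value on the breast-contour edge.
    Interval minimum: mean gray value on the edge farther from the contour."""
    interval_max = float(np.max(contour_values))
    interval_min = float(np.mean(far_edge_values))
    return interval_min, interval_max

# Toy edge pixel sets: bright thin-tissue contour, dimmer inner edge.
lo, hi = gray_value_interval([180, 200, 190], [90, 110])
```

The resulting `[lo, hi]` interval delimits which gray values the subsequent transformation operates on.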
6. The method of claim 4, wherein the determining a gray scale transformation parameter of the low-frequency image according to the gray value transformation interval comprises:
dividing the gray value transformation interval into N gray value transformation subintervals, and determining a gray scale transformation line segment corresponding to each gray value transformation subinterval, wherein N is a positive integer;
and performing curve fitting on the N gray scale transformation line segments to obtain the gray scale transformation parameter of the low-frequency image.
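The piecewise-segments-then-curve-fit step can be sketched as follows. The per-subinterval gains and the polynomial curve model (`np.polyfit`) are both assumptions of this sketch; the claim fixes neither the segment slopes nor the fitted curve family.

```python
import numpy as np

def fit_transform_curve(lo: float, hi: float, gains, degree: int = 3):
    """Build one linear segment per subinterval of [lo, hi], chain the
    segments end to end, then fit one smooth polynomial through the
    segment endpoints; its coefficients act as transformation parameters."""
    edges = np.linspace(lo, hi, len(gains) + 1)
    ys = [lo]
    for gain, width in zip(gains, np.diff(edges)):
        ys.append(ys[-1] + gain * width)   # each segment applies its own slope
    return np.polyfit(edges, ys, degree)   # curve fitting over the endpoints

# N = 3 subintervals, stronger gain near the interval minimum.
coeffs = fit_transform_curve(100.0, 200.0, gains=[2.0, 1.5, 1.0])
```

With four endpoints and a degree-3 fit the polynomial interpolates the segment chain exactly, so the curve passes through 100 at the interval minimum and 250 at the maximum.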
7. The method according to any one of claims 1 to 6, wherein the gray scale transformation distance determination model is obtained by pre-training as follows:
acquiring at least two groups of historical breast images and the historical breast compression thicknesses corresponding to the historical breast images;
determining a historical breast width and a historical gray scale transformation distance from each historical breast image, and taking the historical breast width, the historical breast compression thickness, and the historical gray scale transformation distance as model training parameters;
and performing model training using the model training parameters and a set model to obtain the gray scale transformation distance determination model, wherein the set model is a statistical model.
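One plausible reading of the statistical model in claim 7 is an ordinary least-squares regression from (breast width, compression thickness) to transformation distance. The linear form and the function names are assumptions of this sketch:

```python
import numpy as np

def train_distance_model(widths, thicknesses, distances):
    """Fit distance ~ a*width + b*thickness + c by least squares."""
    X = np.column_stack([widths, thicknesses, np.ones(len(widths))])
    params, *_ = np.linalg.lstsq(X, np.asarray(distances, float), rcond=None)
    return params  # (a, b, c)

def predict_distance(params, width, thickness):
    a, b, c = params
    return a * width + b * thickness + c

# Synthetic "historical" data generated from a known relation.
w = [10.0, 12.0, 14.0, 16.0]
t = [3.0, 6.0, 4.0, 5.0]
d = [0.5 * wi + 2.0 * ti + 1.0 for wi, ti in zip(w, t)]
params = train_distance_model(w, t, d)
```

On noise-free data generated from the model's own form, the fit recovers the coefficients exactly, which is a quick sanity check for such a pipeline.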
9. A device for processing a breast image, comprising:
an image acquisition module, configured to acquire an original scan image and obtain a low-frequency image and a high-frequency image from the original scan image;
a gray scale transformation region determination module, configured to determine a gray scale transformation region of the low-frequency image according to a breast width, a breast compression thickness, and a gray scale transformation distance determination model;
and a target image generation module, configured to perform gray scale transformation on the gray scale transformation region to obtain a thickness-equalized target low-frequency image, and reconstruct the target low-frequency image and the high-frequency image to generate a target image;
wherein the gray scale transformation region determination module is specifically configured to:
determine a gray scale transformation distance according to the breast width, the breast compression thickness, and the gray scale transformation distance determination model, wherein the gray scale transformation distance determination model is a model established in advance from training sample data for determining the gray scale transformation distance, and is a statistical model or a machine learning model;
and determine a region extending from the breast contour of the low-frequency image over the gray scale transformation distance as the gray scale transformation region.
10. A breast imaging apparatus, comprising:
an X-ray source for emitting X-rays;
a detector for receiving the X-rays emitted by the X-ray source to acquire an original scan image;
a breast support plate for holding a breast;
a compression plate for compressing the breast; and
an image processor configured to perform the method of processing a breast image according to any one of claims 1 to 7.
CN201710447718.9A 2017-06-14 2017-06-14 Method and device for processing mammary gland image and mammary gland imaging equipment Active CN107292815B (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
CN201710447718.9A CN107292815B (en) 2017-06-14 2017-06-14 Method and device for processing mammary gland image and mammary gland imaging equipment
CA3168047A CA3168047A1 (en) 2017-06-14 2017-12-29 System and method for image processing
CA3067078A CA3067078C (en) 2017-06-14 2017-12-29 System and method for image processing
PCT/CN2017/120325 WO2018227943A1 (en) 2017-06-14 2017-12-29 System and method for image processing
EP17914031.4A EP3622476A4 (en) 2017-06-14 2017-12-29 System and method for image processing
CN201780092082.9A CN110832540B (en) 2017-06-14 2017-12-29 Image processing system and method
US16/023,340 US10949950B2 (en) 2017-06-14 2018-06-29 System and method for image processing
US17/201,084 US11562469B2 (en) 2017-06-14 2021-03-15 System and method for image processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710447718.9A CN107292815B (en) 2017-06-14 2017-06-14 Method and device for processing mammary gland image and mammary gland imaging equipment

Publications (2)

Publication Number Publication Date
CN107292815A CN107292815A (en) 2017-10-24
CN107292815B 2020-09-01

Family

ID=60097804

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710447718.9A Active CN107292815B (en) 2017-06-14 2017-06-14 Method and device for processing mammary gland image and mammary gland imaging equipment

Country Status (1)

Country Link
CN (1) CN107292815B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA3067078C (en) * 2017-06-14 2022-10-04 Chunhua JIANG System and method for image processing
US10949950B2 (en) 2017-06-14 2021-03-16 Shanghai United Imaging Healthcare Co., Ltd. System and method for image processing
CN107862691B (en) * 2017-11-24 2020-06-19 上海联影医疗科技有限公司 Method, apparatus, computer and medium for detecting non-breast regions in breast images
CN108460754B (en) * 2018-01-15 2020-08-18 浙江深博医疗技术有限公司 Method for converting low-frequency ultrasonic image into high-frequency ultrasonic image
CN110163857B (en) * 2019-05-24 2022-03-04 上海联影医疗科技股份有限公司 Image background area detection method and device, storage medium and X-ray system
CN110544250B (en) * 2019-09-06 2022-09-13 上海联影医疗科技股份有限公司 Medical image processing method and system
EP4178446A4 (en) * 2020-08-10 2023-06-07 Shanghai United Imaging Healthcare Co., Ltd. Imaging systems and methods
CN114627037A (en) * 2020-11-26 2022-06-14 季华实验室 Electrostatic spinning thickness detection method and device
CN113674367B (en) * 2021-08-20 2024-03-26 上海宝藤生物医药科技股份有限公司 Pretreatment method of lipoprotein cholesterol reagent scan after electrophoresis

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104274201A (en) * 2014-10-10 2015-01-14 深圳先进技术研究院 Method, system and equipment for tomography of mammary gland and image acquisition and processing method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2391132A1 (en) * 2002-06-21 2003-12-21 Dan Rico Method and apparatus for determining peripheral breast thickness
JP5275668B2 (en) * 2008-04-04 2013-08-28 富士フイルム株式会社 Image processing apparatus and image processing method
KR101689867B1 (en) * 2010-09-15 2016-12-27 삼성전자주식회사 Method for processing image, image processing apparatus and medical image system for performing the same
JP2013233415A (en) * 2012-04-11 2013-11-21 Fujifilm Corp Radiation image photographing apparatus, radiation image photographing program, and radiation image photographing method
JP5844296B2 (en) * 2012-06-11 2016-01-13 富士フイルム株式会社 Radiation image processing apparatus and method
CN104574361B (en) * 2014-11-27 2017-12-29 沈阳东软医疗系统有限公司 A kind of mammary gland peripheral tissues balanced image processing method and device
CN105701796B (en) * 2015-12-31 2018-09-18 上海联影医疗科技有限公司 The thickness equalization methods and device of breast image, mammography system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104274201A (en) * 2014-10-10 2015-01-14 深圳先进技术研究院 Method, system and equipment for tomography of mammary gland and image acquisition and processing method

Also Published As

Publication number Publication date
CN107292815A (en) 2017-10-24

Similar Documents

Publication Publication Date Title
CN107292815B (en) Method and device for processing mammary gland image and mammary gland imaging equipment
CN107316291B (en) Mammary gland image processing method and mammary gland imaging equipment
Weldon et al. Gabor filter design for multiple texture segmentation
US5923775A (en) Apparatus and method for signal dependent noise estimation and reduction in digital images
Kanwal et al. Region based adaptive contrast enhancement of medical X-ray images
US7764820B2 (en) Multi-threshold peripheral equalization method and apparatus for digital mammography and breast tomosynthesis
US11995808B2 (en) Method for X-ray dental image enhancement
CN109523458B (en) High-precision sparse angle CT reconstruction method combined with sparse induction dynamic guided filtering
CN105701796B (en) The thickness equalization methods and device of breast image, mammography system
CN111028310B (en) Method, device, terminal and medium for determining scanning parameters of breast tomography
CN116071355A (en) Auxiliary segmentation system and method for peripheral blood vessel image
Hasikin et al. Adaptive fuzzy intensity measure enhancement technique for non-uniform illumination and low-contrast images
Georgieva et al. An application of dental X-ray image enhancement
CN106691505B (en) Method and device for processing uniformity and contrast of ultrasonic image
CN106780413B (en) Image enhancement method and device
CN107564021A (en) Detection method, device and the digital mammographic system of highly attenuating tissue
RU2343538C1 (en) Method for correction of digital x-ray images
CN110136085B (en) Image noise reduction method and device
JP2005269451A5 (en)
CN114298920B (en) Super-visual field CT image reconstruction model training and super-visual field CT image reconstruction method
CN116563166A (en) Image enhancement method, device, storage medium and equipment
Packard et al. Glandular segmentation of cone beam breast CT volume images
CN116071337A (en) Endoscopic image quality evaluation method based on super-pixel segmentation
Naidu et al. Enhancement of X-ray images using various Image Processing Approaches
CN110136089B (en) Human embryo heart ultrasonic image enhancement method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: No. 2258, Chengbei Road, Jiading District, Shanghai 201807

Patentee after: Shanghai Lianying Medical Technology Co., Ltd

Address before: No. 2258, Chengbei Road, Jiading District, Shanghai 201807

Patentee before: SHANGHAI UNITED IMAGING HEALTHCARE Co.,Ltd.
