CN112801114A - Method and device for determining projection position information of mammary gland image - Google Patents


Info

Publication number
CN112801114A
CN112801114A (application CN202110075957.2A, granted as CN112801114B)
Authority
CN
China
Prior art keywords
projection position
position information
mammary gland
image
information
Prior art date
Legal status
Granted
Application number
CN202110075957.2A
Other languages
Chinese (zh)
Other versions
CN112801114B (en)
Inventor
石磊
程根
史晶
Current Assignee
Hangzhou Yitu Medical Technology Co., Ltd.
Original Assignee
Hangzhou Yitu Medical Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Hangzhou Yitu Medical Technology Co., Ltd.
Priority to CN202110075957.2A
Publication of CN112801114A
Application granted
Publication of CN112801114B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/5866 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10116 X-ray image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30068 Mammography; Breast

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Epidemiology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Library & Information Science (AREA)
  • Databases & Information Systems (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention discloses a method for determining the projection position information of breast images, comprising: acquiring first projection position information contained in the label information of each breast image in a group of breast images; acquiring, through an image recognition model, second projection position information and its confidence for each breast image; and, when the first and second projection position information of a breast image are inconsistent and the confidence falls within a preset range, taking the second projection position information as the projection position information of that breast image. By automatically recognizing projection position information with a trained deep-learning image recognition model, the method can add projection position information to breast images that lack it and can detect and correct erroneous projection position information.

Description

Method and device for determining projection position information of mammary gland image
Technical Field
The invention relates to the field of medical technology, and in particular to a method and device for determining the projection position information of a breast image, and to a breast image display method.
Background
Mammography is a commonly used breast examination. When breast images are taken, one image is normally acquired at the MLO (mediolateral oblique) view and one at the CC (craniocaudal) view for each of the left and right breasts. Fig. 1 is a schematic view of a set of breast images from a mammography examination: the four images are the left MLO view, right MLO view, left CC view, and right CC view. Traditional PACS vendors match images according to custom image tags written by the imaging device; these tags generally record left/right breast information, view (projection position) information, device manufacturer information, and so on. In some cases, however, operator error during acquisition leaves the tag information wrong or missing. Incorrect tag information may affect the physician's subsequent assessment of lesions and can even lead to misdiagnosis.
Disclosure of Invention
To solve the problems identified in the background art, namely that breast images may lack projection position information or carry inaccurate projection position information, the invention provides a method that automatically generates projection position information and corrects erroneous projection position information by deep-learning image recognition.
In order to achieve the above object, the present invention provides a method of determining the projection position information of a breast image, comprising:
acquiring first projection position information contained in the label information of each breast image in a group of breast images;
acquiring, through an image recognition model, second projection position information and its confidence for each breast image;
and, when the first and second projection position information of a breast image are inconsistent and the confidence falls within a preset range, taking the second projection position information as the projection position information of the breast image.
Optionally, the method further includes:
and when the first projection position information contained in the mammary gland image label information cannot be acquired, taking the second projection position information as the projection position information of the mammary gland image.
Optionally, the projection position information includes a CC view and an MLO view, and the method further includes:
when more than two CC views or more than two MLO views appear in the projection position information of the group of breast images, selecting the two images with the highest confidence as the breast images of that view.
Optionally, the method further includes:
and determining left and right breast information of the breast image according to the characteristic information in the breast image.
Optionally, the feature information includes:
one or more of calcification, mass, and architectural distortion.
Optionally, the method further includes:
acquiring, from the breast image label information, device information of the device that generated the breast image;
and, when the device information contains preset information, taking the first projection position information as the projection position information of the breast image.
Optionally, the device information includes one of a device manufacturer and a device model.
The invention also provides a device for determining the projection position information of a breast image, comprising:
a first information acquisition unit, configured to acquire the first projection position information contained in the label information of each breast image in a group of breast images;
a second information acquisition unit, configured to acquire, through the image recognition model, the second projection position information of each breast image and its confidence;
and a projection position information acquisition unit, configured to take the second projection position information as the projection position information of the breast image when the first and second projection position information are inconsistent and the confidence falls within a preset range.
Optionally, in the device described above,
the first information acquisition unit is further configured to acquire, from the breast image label information, the device information of the device that generated the breast image;
and the projection position information acquisition unit is further configured to take the first projection position information as the projection position information of the breast image when the device information contains preset information.
The invention also provides a breast image display method, comprising:
displaying a breast image in response to a first operation, wherein displaying the breast image includes displaying the projection position information of the breast image and the way in which that projection position information was obtained.
According to the method and device for determining the projection position information of a breast image, the projection position information is recognized automatically through a trained image recognition model; projection position information can be added to breast images that lack it, and erroneous projection position information can be found and corrected.
Furthermore, considering that the image recognition model may generalize poorly to breast images generated by devices of certain manufacturers or models, the method and device preset the device information for which generalization is poor; when a breast image was generated by a device of a specified manufacturer or model, use of the image recognition model is disabled, preventing erroneous projection position information from misleading the physician into an incorrect diagnosis.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic representation of a set of breast images in a mammography exam;
fig. 2 is a flowchart illustrating a method for determining projection position information of a breast image according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of lesion depth determination in a CC breast image according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of determining a lesion depth in an MLO breast image according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an apparatus for determining projection location information of a breast image according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the invention is described in further detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art from the embodiments given here without creative effort fall within the protection scope of the present invention.
Fig. 2 is a flowchart of a method of determining the projection position information of a breast image according to an embodiment of the present invention. As shown in Fig. 2, the method comprises:
s101, first projection position information contained in label information of each mammary gland image in a group of mammary gland images is obtained.
And S102, acquiring second projection position information and confidence thereof of each mammary gland image through the image recognition model.
S103, when the first projection position information and the second projection position information of the mammary gland image are inconsistent and the confidence value belongs to a preset range, taking the second projection position information as the projection position information of the mammary gland image.
S101, acquiring the first projection position information contained in the label information of each breast image in a group of breast images. Taking a breast molybdenum-target (mammography) image as an example, the breast image is a standard DICOM image (Digital Imaging and Communications in Medicine, the international standard for medical images and related information), and the attribute information of the breast image is recorded in the DICOM file through corresponding tags. The attribute information includes the projection position information entered by the physician when the image is taken. Specifically, during acquisition, after helping the patient into position, the physician selects the projection position through the operation interface or operation buttons of the mammography device, and the device adds the projection position information to the attribute information of the image. On some devices the attribute information is also rendered directly into the image, for example in the upper-left corner, in a small font at a location that does not obscure breast tissue; in that case the projection position information can be acquired from the breast image by a character recognition technique such as OCR.
S102, acquiring, through the image recognition model, the second projection position information of each breast image and its confidence. The image recognition model is trained on a large number of breast images together with projection position annotations produced by professionals reading the films manually. The trained model outputs the projection position information of a breast image to be recognized and the confidence corresponding to that projection position information; the higher the confidence, the more certain the model's output. The confidence may be represented by a score in the interval [0, 1]. For example, when the model judges whether the projection position of a breast image is the MLO view, a score closer to 1 means the MLO view is more likely, and a score closer to 0 means the CC view is more likely.
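As a concrete illustration of how such a [0, 1] score can be mapped to a view label and a confidence, the following minimal sketch may help; the function name and the 0.5 cut-off are assumptions for illustration, not taken from the patent.

```python
def view_from_score(mlo_score: float):
    """Map a model score in [0, 1] to (view, confidence).

    Scores near 1 indicate the MLO view and scores near 0 the CC
    view, so the CC confidence is 1 - mlo_score.  The 0.5 cut-off
    is an assumed convention, not specified in the patent."""
    if mlo_score >= 0.5:
        return "MLO", mlo_score
    return "CC", 1.0 - mlo_score
```

Under this convention, a score of 0.02 reads as a CC confidence of 0.98, matching the way confidences are quoted in the examples of this description.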
S103, when the first and second projection position information of a breast image are inconsistent and the confidence falls within a preset range, taking the second projection position information as the projection position information of the breast image.
In most cases the first and second projection position information of a breast image agree, but occasionally they do not; for example, the first projection position information of a breast image may be the CC view while the second is the MLO view. In this embodiment, considering that operator error is rare, the second projection position information is adopted only when the model's output is relatively certain, for example when the confidence of the second projection position information is greater than or equal to 0.8.
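The decision logic of S103, together with the missing-tag fallback described further below, can be sketched as follows; the function and parameter names are assumptions for illustration, and 0.8 is the example threshold mentioned above.

```python
def resolve_view(label_view, model_view, confidence, threshold=0.8):
    """Decide the final view for one breast image.

    label_view: view from the image tag ("CC"/"MLO"), or None if missing.
    model_view: view predicted by the image recognition model.
    confidence: model confidence in model_view, in [0, 1].

    The tag is kept unless it is missing, or contradicted by a
    sufficiently confident model prediction."""
    if label_view is None:
        return model_view  # tag missing: fall back to the model
    if label_view != model_view and confidence >= threshold:
        return model_view  # confident contradiction: correct the tag
    return label_view
```

For a tag saying CC and a confident MLO prediction (confidence 0.98), the MLO view wins; at confidence 0.6 the tag is kept.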
The method further comprises preprocessing the breast image. The preprocessing includes binarizing the breast image, that is, segmenting the breast edge. This reduces irrelevant detail in the breast image and improves the recognition accuracy of the image recognition model.
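A minimal sketch of such a binarization, using only NumPy; the relative threshold of 0.1 is an assumed default, not a value given in the patent.

```python
import numpy as np

def binarize(image: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Binarize a grayscale breast image so the breast region is 1
    and the dark background is 0, exposing the breast contour.

    `threshold` is a fraction of the maximum intensity; 0.1 is an
    assumed default for illustration."""
    scaled = image.astype(np.float64) / max(image.max(), 1)
    return (scaled > threshold).astype(np.uint8)
```

The resulting mask keeps only the breast silhouette, which is what the view classifier mainly relies on.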
The method for determining the projection position information of a breast image further comprises: when the first projection position information cannot be acquired from the breast image label information, taking the second projection position information as the projection position information of the breast image. In some breast images the projection position information is simply missing; in that case there is no inconsistency between first and second projection position information as described above, and the second projection position information is used directly as the projection position information of the breast image.
Further, the method for determining the projection position information of a breast image according to the embodiment of the present invention also includes: the projection position information comprises a CC view and an MLO view, and when more than two CC views or more than two MLO views appear in the projection position information of the group of breast images, the two breast images with the highest confidence are selected as the images of that view.
In mammography, depending on the patient and the circumstances, sometimes only one breast is imaged, but in most cases both are. When both breasts are imaged there are four breast images: two CC-view images and two MLO-view images.
However, in some cases, although S101 yields two CC-view images and two MLO-view images from the label information of a group of breast images, after the second projection position information and its confidence are acquired in S102 and the judgment of S103 is applied, three images may end up assigned the CC view or three the MLO view. This clearly violates the requirement that a mammography examination output two CC-view and two MLO-view breast images, so a further judgment is needed to determine which two images are the correct CC views or MLO views.
This situation is explained in detail below through specific examples.
After the first group of four breast images is received, the first projection position information contained in the label information of each image is acquired: the label information of the first and second breast images indicates the CC view, and that of the third and fourth indicates the MLO view. The second projection position information and its confidence are then acquired through the image recognition model: the CC-view confidence of the first breast image is 0.98, of the second 0.02, of the third 0.97, and of the fourth 0.03. The model thus considers the credibility of the second image being a CC view to be only 0.02, that is, the credibility of it being an MLO view is 0.98; this confidence falls within the preset range (with 0.8 as the threshold, 0.98 > 0.8), so the output is treated as certain, and the MLO view is taken as the projection position of the second breast image. By the same reasoning, the CC view is taken as the projection position of the third breast image.
After the second group of four breast images is received, the first projection position information in each image's label information is acquired: the first and second images are labeled CC view, the third and fourth MLO view. The second projection position information and its confidence are then acquired through the image recognition model: the CC-view confidence of the first breast image is 0.98, of the second 0.99, of the third 0.97, and of the fourth 0.03. The model thus considers the third image a CC view with a confidence within the preset range, so its output is treated as certain. However, since a mammography examination cannot output three CC-view breast images, the MLO view is retained as the projection position of the third breast image (among the three CC candidates, the third image's CC confidence is the lowest).
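The group-level constraint used in the second example (a group of four images must contain exactly two CC views and two MLO views) can be enforced by ranking the CC scores, as in this sketch; names are assumptions for illustration.

```python
def enforce_two_per_view(cc_scores):
    """Assign views to a group of breast images given each image's
    CC score in [0, 1] (the MLO score being 1 - CC score): the two
    images with the highest CC scores become CC views and the rest
    MLO views, so exactly two of each view remain."""
    order = sorted(range(len(cc_scores)), key=lambda i: cc_scores[i], reverse=True)
    views = ["MLO"] * len(cc_scores)
    for i in order[:2]:
        views[i] = "CC"
    return views
```

For the second example's scores this yields ["CC", "CC", "MLO", "MLO"]: the third image, with the lowest CC score among the three candidates, is kept as an MLO view.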
The method for determining the projection position information of a breast image in the embodiment of the invention further comprises: determining the left and right breast information of a breast image according to the feature information in the breast image.
Among breast images, the contour shapes of images of different views differ, so a trained image recognition model can distinguish the views. Left and right breast images of the same view, however, show no consistent difference in contour shape; any differences are individual rather than systematic, so an image recognition model has difficulty telling left from right within one view (an experienced physician cannot either). In this embodiment, determining left and right breast information according to feature information means: on the premise that at least one image on the CC view or the MLO view carries left/right breast information, determining the corresponding image of the other view according to the feature information in that image.
To this end, in this embodiment, the lesions contained in each breast image of a group are identified by a lesion identification model, and the lesion information serves as the feature information, which includes one or more of calcification, mass, and architectural distortion. Extracting the feature information is performed by a trained lesion identification model, which is prior art and is not described further here.
After the feature information of each breast image is obtained, the similarity between features may be computed with different metrics: by a distance metric over the features in the breast images, such as Euclidean distance or Mahalanobis distance, or by measures such as the cosine of the angle between feature vectors or information entropy. For distance-based measures, the larger the distance between features, the smaller their similarity, that is, the more dissimilar the features.
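Two of the named metrics, as a sketch in plain Python; representing features as equal-length numeric sequences is an assumption for illustration.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two feature vectors:
    larger distance means lower similarity."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors:
    1.0 for parallel (maximally similar) vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)
```

Mahalanobis distance additionally needs a covariance estimate over the feature distribution, so it is omitted from this sketch.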
The process of determining left and right breast information after acquiring the feature information is described in detail below with a specific embodiment. Assume that after the third group of four breast images is received, the first projection position information and the left/right breast information contained in the label information of each image are acquired. The label information of the first breast image indicates the right breast and carries no projection position information; the label information of the second, third, and fourth images contains neither left/right breast information nor projection position information. The projection position of each image is determined by the method described above: the first and second breast images are confirmed as CC views, and the third and fourth as MLO views. Since the first image is known to be of the right breast, the second image, the other CC view, can be determined to be of the left breast. The first breast image is then compared with the third and fourth images to judge the similarity of the features they contain.
Feature similarity comparison compares the similarity between the features in each breast image. A feature usable for such a distance is the depth of a lesion, which can be determined from the position of the lesion and the position of the nipple. Fig. 3 is a schematic diagram of determining lesion depth in a CC-view breast image according to an embodiment of the invention, and Fig. 4 is the corresponding schematic for an MLO-view breast image. In the right CC-view image of Fig. 3, or the MLO-view image of Fig. 4, draw an arc of preset radius centered on the nipple; the straight line through the intersection points of this arc with the breast edge is the first straight line. The straight line through the nipple and perpendicular to the first straight line is the second straight line. The lesion is projected onto the second straight line, and the distance between the projection and the nipple is taken as the depth of the lesion. Because a lesion depth determined from the lesion position and the nipple position remains essentially unchanged across views, the same lesion can be identified accurately and quickly across different views in this way.
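The depth computation reduces to projecting the nipple-to-lesion vector onto the direction of the second straight line. A geometric sketch, assuming 2-D image coordinates; names are illustrative, not the patent's.

```python
import math

def lesion_depth(nipple, line_dir, lesion):
    """Depth of a lesion: length of the projection of the
    nipple-to-lesion vector onto the unit direction of the second
    straight line (the line through the nipple, perpendicular to
    the chord joining the arc/breast-edge intersections).
    All points and directions are (x, y) tuples."""
    vx, vy = lesion[0] - nipple[0], lesion[1] - nipple[1]
    n = math.hypot(*line_dir)
    ux, uy = line_dir[0] / n, line_dir[1] / n
    return abs(vx * ux + vy * uy)
```

Because the depth is measured along a nipple-anchored axis, it is largely invariant to the rotation between the CC and MLO views, which is what makes it a usable matching feature.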
In this embodiment, the lesion in the first breast image is the first lesion, the lesion in the third image is the third lesion, and the lesion in the fourth image is the fourth lesion. The similarity between the first and third lesions may be determined from their depths: if the depth of the first lesion is depth 1 and the depth of the third lesion is depth 2, and distance is used to measure the first similarity difference between them, then the first similarity difference is |depth 1 - depth 2|. In the same way, the depths of the first and fourth lesions give a second similarity difference. The two similarity differences are then compared; the smaller the difference, the higher the similarity between the two lesions. When the first similarity difference is smaller than the second, the third breast image is the right MLO-view image; when the first similarity difference is larger than the second, the fourth breast image is the right MLO-view image.
To improve matching accuracy, other information in the lesion information may further be considered, such as the sign of the lesion and the size of the lesion. The similarity between the sign of the first lesion and the signs of the third and fourth lesions is determined from those signs, and the similarity between the first lesion and the third and fourth lesions is then determined from both the sign similarity and the depth similarity. Illustratively, if the depth of the first lesion is depth 1, the depth of the third lesion is depth 2, the sign of the first lesion is sign 1, the sign of the third lesion is sign 2, and the similarity difference between the first lesion and the third lesion is the first similarity difference, then the first similarity difference = |depth 1 - depth 2| + |sign 1 - sign 2|. A second similarity difference between the first lesion and the fourth lesion is obtained in the same way and compared with the first similarity difference; when the first similarity difference is smaller than the second similarity difference, the third breast image is determined to be the right MLO projection position image, and when the first similarity difference is larger, the fourth breast image is determined to be the right MLO projection position image.
In other embodiments, the similarity between the size of the first lesion and the sizes of the third and fourth lesions may be determined from those sizes, and the similarity between the first lesion and the third and fourth lesions may then be determined from both the size similarity and the depth similarity. Illustratively, if the depth of the first lesion is depth 1, the depth of the third lesion is depth 2, the size of the first lesion is size 1, the size of the third lesion is size 2, and the similarity difference between the first lesion and the third lesion is the first similarity difference, then the first similarity difference = |depth 1 - depth 2| + |size 1 - size 2|. A second similarity difference between the first lesion and the fourth lesion is obtained in the same way and compared with the first similarity difference; when the first similarity difference is smaller than the second similarity difference, the third breast image is determined to be the right MLO projection position image, and when the first similarity difference is larger, the fourth breast image is determined to be the right MLO projection position image.
In further embodiments, the similarity between the sign of the first lesion and the signs of the third and fourth lesions may be determined from those signs; the similarity between the size of the first lesion and the sizes of the third and fourth lesions may be determined from those sizes; and the similarity between the first lesion and the third and fourth lesions may then be determined from the sign similarity, the size similarity, and the depth similarity together. Illustratively, if the depth of the first lesion is depth 1, the depth of the third lesion is depth 2, the sign of the first lesion is sign 1, the sign of the third lesion is sign 2, the size of the first lesion is size 1, the size of the third lesion is size 2, and the similarity difference between the first lesion and the third lesion is the first similarity difference, then the first similarity difference = |depth 1 - depth 2| + |sign 1 - sign 2| + |size 1 - size 2|. A second similarity difference between the first lesion and the fourth lesion is obtained in the same way and compared with the first similarity difference; when the first similarity difference is smaller than the second similarity difference, the third breast image is determined to be the right MLO projection position image, and when the first similarity difference is larger, the fourth breast image is determined to be the right MLO projection position image.
In the foregoing implementations, coefficients corresponding to the different parameter types may also be determined when calculating the similarity. Taking the similarity formula of the third implementation as an example, the similarity difference between the first lesion and the third or fourth lesion may be a|depth 1 - depth 2| + b|sign 1 - sign 2| + c|size 1 - size 2|, where the coefficients a, b, and c may all be determined according to practical experience.
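The weighted comparison above can be sketched as follows; the feature dictionaries, weights, and function names are illustrative assumptions, not part of the patent:

```python
def similarity_difference(f1, f2, a=1.0, b=1.0, c=1.0):
    """Weighted similarity difference between two lesions, combining depth,
    sign, and size terms; a smaller value means a higher similarity."""
    return (a * abs(f1["depth"] - f2["depth"])
            + b * abs(f1["sign"] - f2["sign"])
            + c * abs(f1["size"] - f2["size"]))

def pick_matching_image(first, candidates, **weights):
    """Index of the candidate lesion most similar to `first`, e.g. to decide
    which of the third/fourth images is the right MLO projection position image."""
    diffs = [similarity_difference(first, cand, **weights) for cand in candidates]
    return diffs.index(min(diffs))
```

With a, b, and c left at 1.0 this reduces to the unweighted formula of the third implementation; in practice the coefficients would be tuned on verified data, as the paragraph suggests.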
In this embodiment, the sign of a lesion may include at least one category; for example, the sign may include one or a combination of calcification, mass, and structural distortion. When determining the similarity between the sign of the first lesion and the signs of the third and fourth lesions, a first vector is determined from the confidences that the first lesion belongs to each category, and a second vector is determined from the confidences that the third or fourth lesion belongs to each category. The distance between the first vector and the second vector is then obtained and used as the similarity between the sign of the first lesion and the signs of the third and fourth lesions.
In the implementation process, a breast image may be input to a calcification detection model, a mass detection model, and a structural distortion detection model, respectively, to determine the vector corresponding to a lesion in that image. If, after the first breast image is input to the three models, the confidence that the first lesion is a calcification is 0.9, the confidence that it is a mass is 0.1, and the confidence that it is a structural distortion is 0, the first vector corresponding to the first lesion is (0.9, 0.1, 0). Similarly, after the third or fourth breast image is input to the three models, the second vector corresponding to the third or fourth lesion might be (0.8, 0.1, 0). The distance between the first vector and the second vector is then the sign similarity between the first lesion and the third or fourth lesion; this distance may be computed as the L2 norm of the difference between the two vectors, or in other manners.
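Using the example confidences above, the sign distance is simply the L2 norm of the difference between the two per-category vectors (a minimal sketch; the function name is illustrative):

```python
import math

def sign_distance(v1, v2):
    """L2 distance between two per-category confidence vectors
    (calcification, mass, structural distortion)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(v1, v2)))

first_vector = (0.9, 0.1, 0.0)   # confidences for the first lesion
second_vector = (0.8, 0.1, 0.0)  # confidences for the third/fourth lesion
# sign_distance(first_vector, second_vector) is approximately 0.1 here.
```

A smaller distance means the two lesions present more similar signs; identical vectors give a distance of zero.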
In addition, considering that the shape of a lesion may be approximately elliptical, the size of the lesion may be determined according to the major diameter, the minor diameter, and the mean diameter of the lesion.
The method for determining the projection position information of the breast image of the embodiment further includes:
and acquiring the equipment information for generating the mammary gland image contained in the mammary gland image label information.
And when the equipment information contains preset information, taking first projection position information as projection position information of the mammary gland image.
A breast image generated by an X-ray examination device is a standard DICOM image. A DICOM image carries corresponding tag information that records attribute information of the image, such as the manufacturer of the X-ray examination device, the device model, the projection position information corresponding to the breast image, and the left/right breast information; the manufacturer, the projection position information, and the like can be obtained from this tag information. For example, for a breast image from Fuji, the manufacturer is determined to be "FUJIFILM Corporation" from the "Manufacturer" tag, and then, according to Fuji's breast image tag rules, the left/right breast information, the image category information, and the projection position information "R MAMMOGRAPHY, CC" are determined from the "Acquisition Device Processing Description" tag (R means the image is of the right breast, MAMMOGRAPHY means it is a molybdenum-target image, and CC means it is a CC projection position image).
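A vendor-specific tag value such as the one above can be split into laterality and projection position. A minimal sketch, assuming the comma-separated "R MAMMOGRAPHY, CC" layout described for this vendor (the function name is illustrative, and other vendors use different layouts, so unrecognized values fall through to None):

```python
def parse_acquisition_description(value):
    """Parse a vendor tag value such as "R MAMMOGRAPHY, CC" into
    (laterality, projection_position); returns (None, None) when the
    value does not match the expected two-part layout."""
    parts = [p.strip() for p in value.split(",")]
    if len(parts) != 2:
        return None, None
    head, projection = parts[0].split(), parts[1]
    if len(head) != 2 or head[0] not in ("R", "L"):
        return None, None
    return head[0], projection  # e.g. ("R", "CC")
```

In a full pipeline the raw string would first be read from the DICOM tag (for example with a DICOM library such as pydicom), and the parsed result would become the first projection position information of the method.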
In practice, verification over a large amount of data has shown that the image recognition model generalizes poorly to breast images generated by the X-ray examination devices of some manufacturers, and its error rate on such images is relatively high; for images from such devices it is therefore preferable to use the first projection position information from the tag information directly.
The device information includes one of the device manufacturer and the device model. That is, when the breast image comes from an X-ray examination device produced by a specified manufacturer, or from a specified model of X-ray examination device, the first projection position information is used directly as the projection position information of the breast image.
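Putting the tag-based and model-based branches together, the decision described by the method can be sketched as follows (the function and parameter names are illustrative, and the confidence range and preset device list are configuration values, not figures given in the patent):

```python
def resolve_projection(first_info, second_info, confidence,
                       device_info, preset_devices, conf_range=(0.5, 1.0)):
    """Choose the projection position information for one breast image.

    first_info:  projection position from the DICOM tag information (or None).
    second_info: projection position predicted by the image recognition model.
    """
    # Trusted devices: always use the tag-based first projection position info.
    if device_info in preset_devices and first_info is not None:
        return first_info
    # No tag information available: fall back to the model prediction.
    if first_info is None:
        return second_info
    # Tag and model disagree and the model confidence is in the preset range:
    # trust the model's second projection position information.
    low, high = conf_range
    if first_info != second_info and low <= confidence <= high:
        return second_info
    return first_info
```

This mirrors claims 1, 2, and 6: the preset device list overrides the model, a missing tag falls back to the model, and a confident disagreement is resolved in the model's favor.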
The embodiment of the present invention further provides an apparatus for determining projection position information of a breast image, which is used for executing the above method for determining projection position information of a breast image, and the apparatus includes:
the first information acquisition unit 101 is configured to acquire first projection position information included in label information of each breast image in a group of breast images.
And the second information acquisition unit is used for acquiring second projection position information and confidence thereof in the mammary gland image through the image recognition model.
And the projection position information acquisition unit is used for taking the second projection position information as the projection position information of the mammary gland image when the first projection position information and the second projection position information of the mammary gland image are inconsistent and the confidence degree belongs to a preset range.
In the present embodiment,
the first information obtaining unit 101 is further configured to obtain the device information for generating the breast image included in the breast image tag information.
The projection position information obtaining unit 103 is further configured to, when the device information includes preset information, use first projection position information as projection position information of the breast image.
The embodiment of the invention also provides a mammary gland image display method, which comprises the following steps: and responding to the first operation to display the mammary gland image, wherein the step of displaying the mammary gland image comprises displaying the projection position information of the mammary gland image and the acquisition way of the projection position information.
The first operation may be an operation command that an operator inputs to the breast image display device by clicking with a mouse or touching a touch screen. Displaying the projection position information of the breast image means displaying a schematic label of the CC or MLO projection position around the breast image, or within the image at a position that does not interfere with observing the breast. Displaying the acquisition way of the projection position information means displaying, likewise around the breast image or within it at a non-interfering position, whether the projection position was obtained from the first projection position information or from the second projection position information. Alternatively, the acquisition way may be marked only when the projection position was obtained from one of the two, with unmarked projection position information understood by default to have been obtained from the other.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. A method of determining projection position information for a breast image, comprising:
acquiring first projection position information contained in label information of each mammary gland image in a group of mammary gland images;
acquiring second projection position information and confidence thereof of each mammary gland image through the image recognition model;
and when the first projection position information and the second projection position information of the mammary gland image are inconsistent and the confidence degree belongs to a preset range, taking the second projection position information as the projection position information of the mammary gland image.
2. The method of claim 1, further comprising:
and when the first projection position information contained in the mammary gland image label information cannot be acquired, taking the second projection position information as the projection position information of the mammary gland image.
3. The method of claim 1, wherein the projection position information includes a CC projection position and an MLO projection position, the method further comprising:
in the projection position information of the group of mammary gland images, when the number of CC projection positions or MLO projection positions is greater than two, selecting the two with the highest confidences as the mammary gland images of that projection position.
4. The method of claim 1, further comprising:
and determining left and right breast information of the breast image according to the characteristic information in the breast image.
5. The method of claim 4,
the characteristic information includes one or more of calcifications, masses and structural distortions.
6. The method of claim 1, further comprising:
acquiring equipment information for generating a mammary gland image contained in the mammary gland image label information;
and when the equipment information contains preset information, taking first projection position information as projection position information of the mammary gland image.
7. The method of claim 6,
the device information includes one of a device manufacturer and a device model.
8. An apparatus for determining projection position information of a breast image, comprising:
the first information acquisition unit is used for acquiring first projection position information contained in label information of each mammary gland image in a group of mammary gland images;
the second information acquisition unit is used for acquiring second projection position information and confidence thereof in the mammary gland image through the image recognition model;
and the projection position information acquisition unit is used for taking the second projection position information as the projection position information of the mammary gland image when the first projection position information and the second projection position information of the mammary gland image are inconsistent and the confidence degree belongs to a preset range.
9. The apparatus of claim 8,
the first information acquisition unit is further used for acquiring the equipment information for generating the mammary gland image contained in the mammary gland image label information;
the projection position information acquisition unit is further used for taking the first projection position information as the projection position information of the mammary gland image when the equipment information contains preset information.
10. A breast image display method, comprising:
and responding to a first operation to display a mammary gland image, wherein the display of the mammary gland image comprises displaying the projection position information of the mammary gland image and the acquisition way of the projection position information.
CN202110075957.2A 2021-01-20 2021-01-20 Method and device for determining projection position information of breast image Active CN112801114B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110075957.2A CN112801114B (en) 2021-01-20 2021-01-20 Method and device for determining projection position information of breast image

Publications (2)

Publication Number Publication Date
CN112801114A true CN112801114A (en) 2021-05-14
CN112801114B CN112801114B (en) 2024-03-08

Family

ID=75810809

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110075957.2A Active CN112801114B (en) 2021-01-20 2021-01-20 Method and device for determining projection position information of breast image

Country Status (1)

Country Link
CN (1) CN112801114B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109345515A (en) * 2018-09-17 2019-02-15 代黎明 Sample label confidence calculations method, apparatus, equipment and model training method
CN111353549A (en) * 2020-03-10 2020-06-30 创新奇智(重庆)科技有限公司 Image tag verification method and device, electronic device and storage medium
CN111414946A (en) * 2020-03-12 2020-07-14 腾讯科技(深圳)有限公司 Artificial intelligence-based medical image noise data identification method and related device
CN111430014A (en) * 2020-03-31 2020-07-17 杭州依图医疗技术有限公司 Display method, interaction method and storage medium of glandular medical image
US20200311923A1 (en) * 2019-03-26 2020-10-01 The Johns Hopkins University Method and Apparatus for Registration of Different Mammography Image Views
CN112115913A (en) * 2020-09-28 2020-12-22 杭州海康威视数字技术股份有限公司 Image processing method, device and equipment and storage medium

Similar Documents

Publication Publication Date Title
CN108520519B (en) Image processing method and device and computer readable storage medium
CN109754387B (en) Intelligent detection and positioning method for whole-body bone imaging radioactive concentration focus
US10565710B2 (en) Systems and user interfaces for determination of electro magnetically identified lesions as included in medical images of differing perspectives
US10318839B2 (en) Method for automatic detection of anatomical landmarks in volumetric data
US10424067B2 (en) Image processing apparatus, image processing method and storage medium
US10916010B2 (en) Learning data creation support apparatus, learning data creation support method, and learning data creation support program
JP4640845B2 (en) Image processing apparatus and program thereof
CN111160367A (en) Image classification method and device, computer equipment and readable storage medium
JP2016067946A (en) Medical image data processing apparatus and medical image data processing method
KR20140055152A (en) Apparatus and method for aiding lesion diagnosis
CN107752979B (en) Automatic generation method of artificial projection, medium and projection image determination device
US11610329B2 (en) Visualization system for visualizing an alignment accuracy
WO2019217903A1 (en) Automated screening of medical data
CN111916186A (en) Chest X-ray intelligent diagnosis system and method by sequential AI diagnosis model
CN113808125A (en) Medical image processing method, focus type identification method and related product
US10726548B2 (en) Confidence determination in a medical imaging video clip measurement based upon video clip image quality
US8073229B2 (en) Image analysis of tube tip positioning
WO2012106580A2 (en) Methods and apparatus for computer-aided radiological detection and imaging
CN111401102A (en) Deep learning model training method and device, electronic equipment and storage medium
US8577108B2 (en) Method for detecting anatomical structures
CN112801114B (en) Method and device for determining projection position information of breast image
CN113538414B (en) Lung image registration method and lung image registration device
JP2005270635A (en) Method for processing image and device for processing image
CN112885435B (en) Method, device and system for determining image target area
JP2018050761A (en) Image processing apparatus and image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant