CN109363699A - Method and device for identifying lesions in breast images - Google Patents
Method and device for identifying lesions in breast images
- Publication number
- CN109363699A (application CN201811203383.7A)
- Authority
- CN
- China
- Prior art keywords
- image
- breast
- characteristic
- lesion
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5217—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data extracting a diagnostic or physiological parameter from medical diagnostic data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/502—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of breast, i.e. mammography
Abstract
Embodiments of the present invention provide a method and device for identifying lesions in breast images, relating to the field of machine learning. The method comprises: obtaining a breast image, where the breast images include images of both breasts taken in different projection views; inputting the breast image into a feature extraction module to obtain feature images of the breast image at different sizes; taking the image, in the same projection view, of the breast on the other side as a reference image of the breast image, inputting it into the feature extraction module, and obtaining reference feature images at different sizes; determining breast lesion identification frames according to the feature images and the reference feature images; and determining the breast lesions of the breast image according to the breast lesion identification frames determined from each feature image.
Description
Technical field
Embodiments of the present invention relate to the field of machine learning, and in particular to a method and device for identifying lesions in breast images.
Background art
Currently, mammography uses low-dose X-rays to examine the human breast; it can detect lesions such as masses and tumors, helps detect breast cancer early, and reduces the breast cancer mortality rate. Breast imaging is an effective detection method that can be used to diagnose a variety of diseases of the female breast. Its most important use, of course, is in screening for breast cancer, especially early-stage breast cancer. If the early manifestations of breast cancer on breast images can be detected effectively, the help to doctors is therefore enormous.
After a patient's breast image is obtained, a doctor judges the lesions in the breast image based on personal experience; this approach is inefficient and highly subjective.
Summary of the invention
Embodiments of the present invention provide a method and device for identifying lesions in breast images, to solve the prior-art problem that judging breast lesions in breast images from a doctor's experience is inefficient.
An embodiment of the present invention provides a method for identifying lesions in breast images, comprising:
obtaining a breast image; the breast images include images of both breasts taken in different projection views;
inputting the breast image into a feature extraction module to obtain feature images of the breast image at different sizes;
taking the image, in the same projection view, of the breast on the other side as a reference image of the breast image, inputting it into the feature extraction module, and obtaining reference feature images at different sizes;
determining breast lesion identification frames according to the feature images and the reference feature images;
determining the breast lesions of the breast image according to the breast lesion identification frames determined from each feature image.
In one possible implementation, determining breast lesion identification frames according to the feature images and the reference feature images comprises:
determining a first breast lesion identification frame in the feature image and a second breast lesion identification frame in the reference feature image;
if the position and/or size of the first breast lesion identification frame and the second breast lesion identification frame are determined to be identical, deleting the first breast lesion identification frame.
In one possible implementation, the feature extraction module includes N convolution modules, each of which is a down-sampling convolution block or an up-sampling convolution block; the feature images extracted by the down-sampling and up-sampling convolution blocks differ in size. Each of the N convolution modules includes a first convolutional layer and a second convolutional layer; the number of feature images output by the first convolutional layer is less than the number of feature images input to the first convolutional layer, and the number of feature images output by the second convolutional layer is greater than the number of feature images input to the first convolutional layer; N is greater than 0.
For any one of the different-size feature images of the breast image, breast lesion identification frames are determined from that feature image.
In one possible implementation, obtaining the different-size feature images of the breast image comprises:
passing the breast image sequentially through N/2 down-sampling convolution blocks to extract N/2 first feature images of the breast image;
passing the first feature image output by the N/2-th down-sampling convolution block sequentially through N/2 up-sampling convolution blocks to extract N/2 second feature images of the breast image, where the second feature images extracted by each up-sampling convolution block differ in size;
merging first feature images and second feature images of identical size to determine the N different-size feature images of the breast image.
In one possible implementation, the neural convolutional network model further includes a feature preprocessing module located before the N convolution modules; inputting the breast image into the feature extraction module comprises:
inputting the breast image into the feature preprocessing module, where the feature preprocessing module includes one convolutional layer, one BN layer, one ReLU layer and one pooling layer, and the convolution kernel of the feature preprocessing module is larger than the convolution kernels in the N convolution modules;
alternatively, the feature preprocessing module includes multiple consecutive convolutional layers, one BN layer, one ReLU layer and one pooling layer, and the convolution kernel size of the feature preprocessing module equals the size of the largest convolution kernel among the N convolution modules.
In one possible implementation, before the breast image is input into the feature extraction module, the method further includes:
obtaining the original file of the breast image;
selecting at least one set of window width and window level in the original file of the breast image, and obtaining the breast image in picture format corresponding to each set of window width and window level;
using the breast images in picture format corresponding to the at least one set of window width and window level as the breast image input into the feature extraction module.
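The window-width/window-level conversion described above can be sketched as a simple intensity mapping. This is a minimal illustration under assumptions: the function name, the 8-bit output range, and the example raw values and window setting do not come from the patent.

```python
import numpy as np

def apply_window(raw: np.ndarray, width: float, level: float) -> np.ndarray:
    """Map raw detector values to an 8-bit picture-format image
    using a window of the given width centred on the given level."""
    low, high = level - width / 2.0, level + width / 2.0
    clipped = np.clip(raw.astype(np.float64), low, high)
    return ((clipped - low) / (high - low) * 255.0).astype(np.uint8)

# Hypothetical 16-bit mammogram values and one example window setting.
raw = np.array([[0, 2000], [4000, 8000]], dtype=np.uint16)
img = apply_window(raw, width=4000, level=2000)
print(img)  # values below the window map to 0, above it to 255
```

Running the same raw file through several window settings, as the claim suggests, would simply mean calling `apply_window` once per (width, level) pair and feeding each resulting picture-format image to the feature extraction module.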
An embodiment of the present invention provides a device for identifying lesions in breast images, comprising:
an acquiring unit for obtaining a breast image, where the breast images include images of both breasts taken in different projection views;
a processing unit for inputting the breast image into the feature extraction module to obtain the different-size feature images of the breast image; taking the image, in the same projection view, of the breast on the other side as a reference image of the breast image, inputting it into the feature extraction module, and obtaining different-size reference feature images; determining breast lesion identification frames according to the feature images and the reference feature images; and determining the breast lesions of the breast image according to the breast lesion identification frames determined from each feature image.
In one possible implementation, the processing unit is specifically configured to:
determine a first breast lesion identification frame in the feature image and a second breast lesion identification frame in the reference feature image; and, if the position and/or size of the first breast lesion identification frame and the second breast lesion identification frame are determined to be identical, delete the first breast lesion identification frame.
In another aspect, an embodiment of the present invention provides a computing device, including at least one processing unit and at least one storage unit, where the storage unit stores a computer program that, when executed by the processing unit, causes the processing unit to perform the steps of any of the methods described above.
In a further aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program executable by a computing device; when the program runs on the computing device, it causes the computing device to perform the steps of any of the methods described above.
In the embodiments of the present invention, because the feature images of the breast image are extracted and the breast lesions in each feature image are identified, breast lesions can be identified quickly, improving the efficiency of breast lesion identification.
In addition, the convolutional neural network model is configured so that the number of channels output by the first convolutional layer is reduced, and the number of channels output by the second convolutional layer is increased back to the number of channels input to the first convolutional layer. In this way, the useful information in the image is effectively preserved during convolution while the parameter count is reduced, which improves the effectiveness of feature extraction and, in turn, the accuracy of lesion detection in breast images.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1a is a schematic diagram of a breast image provided by an embodiment of the present invention;
Fig. 1b is a schematic diagram of a breast image provided by an embodiment of the present invention;
Fig. 1c is a schematic diagram of a breast image provided by an embodiment of the present invention;
Fig. 1d is a schematic diagram of a breast image provided by an embodiment of the present invention;
Fig. 2 is a schematic flowchart of a method for identifying lesions in breast images provided by an embodiment of the present invention;
Fig. 3a is a structural schematic diagram of a feature extraction module provided by an embodiment of the present invention;
Fig. 3b is a structural schematic diagram of a feature extraction module provided by an embodiment of the present invention;
Fig. 3c is a structural schematic diagram of a feature extraction module provided by an embodiment of the present invention;
Fig. 3 is a schematic flowchart of breast image lesion identification provided by an embodiment of the present invention;
Fig. 4 is a schematic flowchart of breast image lesion identification provided by an embodiment of the present invention;
Fig. 5 is a structural schematic diagram of a device for identifying lesions in breast images provided by an embodiment of the present invention;
Fig. 6 is a structural schematic diagram of a computing device provided by an embodiment of the present invention.
Detailed description of the embodiments
To make the purpose, technical solutions and beneficial effects of the present invention clearer, the present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention and are not intended to limit it.
The embodiments of the present invention are described by taking breast X-ray images as an example; other image types are not repeated here. Mammography uses low-dose X-rays (about 0.7 millisievert) to examine the human (mainly female) breast; it can detect lesions such as masses and tumors, helps detect breast cancer early, and reduces the breast cancer mortality rate. Some countries recommend that older women (generally over 45 years of age) undergo mammography periodically (at intervals ranging from one to five years) to screen for early-stage breast cancer. A breast imaging examination generally comprises four X-ray exposures: images of the two breasts in two projection views each (the craniocaudal CC view and the mediolateral oblique MLO view), giving four breast images, as shown in Figs. 1a-d.
The prior art often detects only isolated calcification or mass-type lesions; it cannot detect multiple lesion types at the same time, so its range of application is narrow. For lesions such as calcifications, methods based on elementary image features are used; such methods are fairly simple, and their detection accuracy is also poor. Multiple types, such as calcification, mass, asymmetry and structural distortion, may be present in the same lesion; the detection accuracy of such methods is poor and cannot meet application requirements.
To address the above problems, an embodiment of the present invention provides a method for identifying lesions in breast images, as shown in Fig. 2, comprising:
Step 201: obtain a breast image;
Step 202: input the breast image into the feature extraction module to obtain feature images of the breast image at different sizes;
In one possible implementation, the feature extraction module includes N convolution modules, each of which is a down-sampling convolution block and/or an up-sampling convolution block; the feature images extracted by the down-sampling and up-sampling convolution blocks differ in size. Each of the N convolution modules includes a first convolutional layer and a second convolutional layer; the number of feature images output by the first convolutional layer is less than the number of feature images input to the first convolutional layer, and the number of feature images output by the second convolutional layer is greater than the number of feature images input to the second convolutional layer; N is greater than 0.
For example, the feature extraction module may include three down-sampling convolution blocks. Each convolution module may include a first convolutional layer and a second convolutional layer, where the first convolutional layer comprises a convolution, a batch normalization (BN) layer connected to the convolution, and an activation function layer connected to the BN layer; the convolution module shown in Fig. 3a includes the first convolutional layer and the second convolutional layer.
To increase the depth of the feature extraction module, as shown in Fig. 3b, in one possible implementation a feature image passes through a convolution module as follows:
Step 1: the feature image input to the convolution module is fed into the first convolutional layer to obtain a first feature image; the convolution kernel of the first convolutional layer may be N1*m*m*N2, where N1 is the number of channels of the feature image input to the convolution module and N2 is the number of channels of the first feature image; N1 > N2;
Step 2: the first feature image is fed into the second convolutional layer to obtain a second feature image; the convolution kernel of the second convolutional layer may be N2*m*m*N3, where N3 is the number of channels of the second feature image; N3 > N2;
Step 3: the feature image input to the convolution module and the second feature image are merged, and the result is taken as the feature image output by the convolution module.
In one specific embodiment, the number of feature images output by the second convolutional layer may equal the number of feature images input to the first convolutional layer, that is, N3 = N1.
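The channel bookkeeping in Steps 1-3 above can be illustrated with a quick parameter count. This is a sketch under assumed values (N1 = N3 = 64, N2 = 16, m = 3; the patent gives no concrete channel counts):

```python
def conv_params(c_in: int, c_out: int, k: int) -> int:
    """Weight count of a k*k convolution from c_in to c_out channels (bias ignored)."""
    return c_in * c_out * k * k

# Assumed channel counts: module input N1, bottleneck N2, output N3.
N1, N2, N3, m = 64, 16, 64, 3

# A single direct 3x3 convolution that keeps the channel count.
direct = conv_params(N1, N3, m)

# The patent's two-layer design: the first layer shrinks channels (N1 -> N2),
# the second layer expands them back (N2 -> N3).
bottleneck = conv_params(N1, N2, m) + conv_params(N2, N3, m)

print(direct, bottleneck)  # the bottleneck pair uses fewer weights
```

Reducing channels before the expensive m*m convolution and restoring them afterwards keeps the module's input and output channel counts unchanged while, in this example, roughly halving the weights, which matches the stated benefit of retaining information while reducing the parameter amount.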
The above way of determining the feature images corresponding to a breast image is only one possible implementation; in other possible implementations, the feature images corresponding to the breast image may also be determined in other ways, without specific limitation.
It should be understood that the activation function in the embodiments of the present invention may be of various types; for example, it may be a rectified linear unit (ReLU), without specific limitation.
Since the images input in the embodiments of the present invention are two-dimensional, the feature extraction module in the embodiments of the present invention may be the feature extraction module of a two-dimensional (2D) convolutional neural network. Correspondingly, the convolution kernel size of the first convolutional layer may be m*m and the convolution kernel size of the second convolutional layer may be n*n; m and n may be the same or different, without limitation here, where m and n are integers greater than or equal to 1. The number of feature images output by the first convolutional layer is less than the number of feature images input to the first convolutional layer; the number of feature images output by the second convolutional layer is greater than the number of feature images input to the second convolutional layer.
Further, to optimize the feature extraction module, in one possible implementation, as shown in Fig. 3c, a third convolutional layer is placed between the first convolutional layer and the second convolutional layer; the feature image input to the third convolutional layer is the image output by the first convolutional layer, and the feature image output by the third convolutional layer is the image input to the second convolutional layer. The convolution kernel size of the third convolutional layer may be k*k, where k may be the same as or different from m and n, without limitation here.
In one specific embodiment, the convolution kernel of the first convolutional layer is 3*3, the convolution kernel of the second convolutional layer is 3*3, and the convolution kernel of the third convolutional layer is 1*1. This arrangement of convolution kernels effectively enlarges the receptive field of feature extraction and helps improve the accuracy of breast lesion identification.
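The receptive-field effect of stacking these kernels can be checked with simple arithmetic. The sketch below assumes stride-1 convolutions, which the patent does not specify:

```python
def receptive_field(kernels, strides=None):
    """Input-side receptive field of a stack of convolutions; stride 1 by default."""
    strides = strides or [1] * len(kernels)
    rf, jump = 1, 1
    for k, s in zip(kernels, strides):
        rf += (k - 1) * jump  # each layer widens coverage by (k-1) input steps
        jump *= s             # stride multiplies the step between output pixels
    return rf

# The 3*3 -> 1*1 -> 3*3 arrangement from the specific embodiment above.
print(receptive_field([3, 1, 3]))  # 5
```

The 1*1 middle layer mixes channels without shrinking spatial coverage, so the stack sees the same 5*5 input patch as two plain 3*3 layers while adding depth between them.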
Feature images of different sizes may be feature images with different pixel dimensions; for example, a feature image of 500 × 500 pixels and a feature image of 1000 × 1000 pixels are feature images of different sizes.
Optionally, the different-size feature images of the breast image are extracted using a breast lesion detection model trained in advance, the model being determined by training a 2D convolutional neural network on multiple annotated breast images.
Optionally, before the different-size feature images of the breast image are extracted, the image is scaled to a specific size so that the ratio of pixels to physical length is the same in all directions.
In another possible implementation, the feature extraction module includes N/2 down-sampling convolution blocks and N/2 up-sampling convolution blocks; obtaining the different-size feature images of the breast image comprises:
passing the breast image sequentially through the N/2 down-sampling convolution blocks to extract N/2 first feature images of the breast image;
passing the first feature image output by the N/2-th down-sampling convolution block sequentially through the N/2 up-sampling convolution blocks to extract N/2 second feature images of the breast image, where the second feature images extracted by each up-sampling convolution block differ in size;
merging first feature images and second feature images of identical size to determine the N different-size feature images of the breast image.
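The size bookkeeping of this down-sampling/up-sampling path can be sketched as follows. The `downsample` and `upsample` functions are placeholder operations standing in for the learned convolution blocks; they only reproduce the halving and doubling of the side length so that the merge of same-size first and second feature images can be shown:

```python
import numpy as np

def downsample(x):   # stand-in for a down-sampling convolution block: halve H and W
    return x[:, ::2, ::2]

def upsample(x):     # stand-in for an up-sampling convolution block: double H and W
    return x.repeat(2, axis=1).repeat(2, axis=2)

x = np.zeros((8, 64, 64))                  # (channels, H, W) input feature map
d1 = downsample(x)                         # first feature image, 32 x 32
d2 = downsample(d1)                        # first feature image, 16 x 16
u1 = upsample(d2)                          # second feature image, back to 32 x 32
merged = np.concatenate([d1, u1], axis=0)  # merge the same-size pair along channels
print(merged.shape)  # (16, 32, 32)
```

Merging along the channel axis is one common reading of "merged"; the patent does not say whether the pair is concatenated or summed.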
To enlarge the receptive field of feature extraction and improve its performance, in one possible implementation a feature preprocessing module is placed before the feature extraction module; the feature preprocessing module includes one convolutional layer, one BN layer, one ReLU layer and one pooling layer, and its convolution kernel is larger than the convolution kernel of any of the N convolution modules.
Preferably, the convolution kernel of the convolutional layer may be 7*7 with a stride of 2 pixels, and the pooling layer is a 2*2 max pooling. Through the feature preprocessing module, the image area is reduced rapidly, the side length becoming 1/4 of the original, which effectively enlarges the receptive field of the feature images, extracts shallow features quickly, and reduces the loss of raw information.
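The claim that the side length becomes 1/4 of the original can be verified with the standard output-size formula. The padding of 3 for the 7*7 convolution and the input side length of 1024 are assumptions:

```python
def conv_out(size: int, kernel: int, stride: int, pad: int) -> int:
    """Spatial output size of a convolution or pooling layer."""
    return (size + 2 * pad - kernel) // stride + 1

side = 1024                                                    # assumed input side length
after_conv = conv_out(side, kernel=7, stride=2, pad=3)         # 7*7 conv, stride 2
after_pool = conv_out(after_conv, kernel=2, stride=2, pad=0)   # 2*2 max pooling
print(after_conv, after_pool)  # 512 256 -> the side length is 1/4 of the original
```

The stride-2 convolution halves the side length and the 2*2 pooling halves it again, which is where the factor of 4 comes from.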
In one possible implementation, the feature preprocessing module includes multiple consecutive convolutional layers, one BN layer, one ReLU layer and one pooling layer; the convolution kernel size of the feature preprocessing module equals the size of the largest convolution kernel among the N convolution modules.
Passing the image through the feature preprocessing module may comprise: inputting the breast image into the feature preprocessing module to obtain a preprocessed feature image, and using the preprocessed feature image as the input of the feature extraction module.
Step 203: for any one of the different-size feature images of the breast image, determine breast lesion identification frames from that feature image.
Optionally, breast lesion identification frames are determined from the feature image using a breast lesion detection model trained in advance, the model being determined by training a 2D convolutional neural network on multiple breast images with annotated breast lesions. The regions selected by the breast lesion identification frames determined from the feature image do not necessarily all contain breast lesions; therefore, each identification frame must be screened according to its breast lesion probability, and identification frames whose breast lesion probability is below a preset threshold are deleted, where the breast lesion probability is the probability that the region selected by the identification frame is a breast lesion.
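The probability screening step can be sketched as a simple filter. The frame encoding (x, y, width, height, probability), the example values, and the threshold of 0.5 are all hypothetical:

```python
# Hypothetical identification frames: (x, y, width, height, lesion probability).
frames = [
    (120, 80, 40, 40, 0.92),
    (300, 210, 25, 30, 0.15),
    (510, 340, 60, 55, 0.71),
]

THRESHOLD = 0.5  # assumed preset threshold; the patent does not give a value

# Delete every frame whose lesion probability is below the preset threshold.
kept = [f for f in frames if f[4] >= THRESHOLD]
print(len(kept))  # 2 frames survive the screening
```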
Step 204: determine the breast lesions of the breast image according to the breast lesion identification frames determined from each feature image.
Specifically, after the breast lesion identification frames are determined, the frames are output as the breast lesions in the breast image. The output breast lesion parameters include the centre coordinate of the breast lesion and the diameter of the breast lesion, where the centre coordinate of the breast lesion is the centre coordinate of its identification frame, and the diameter of the breast lesion is the distance from the centre of the identification frame to one of its sides.
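The conversion from an identification frame to the output lesion parameters can be sketched as below. The frame encoding (top-left corner plus width and height) and the choice of the nearest side for the diameter are assumptions; the patent only says the diameter is the distance from the centre of the frame to one of its sides:

```python
def frame_to_lesion(x: float, y: float, w: float, h: float):
    """Convert an identification frame (top-left corner, width, height) into
    the output lesion parameters: centre coordinate and diameter, taking the
    diameter as the distance from the centre to the nearest side."""
    cx, cy = x + w / 2.0, y + h / 2.0
    diameter = min(w, h) / 2.0
    return (cx, cy), diameter

centre, diameter = frame_to_lesion(100, 60, 40, 30)
print(centre, diameter)  # (120.0, 75.0) 15.0
```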
Because feature images of different sizes are extracted from the breast image and the breast lesions in each feature image are identified, large breast lesions can be detected and, at the same time, small breast lesions can also be detected, improving the precision of breast lesion detection. Secondly, compared with manually judging whether a breast image contains breast lesions, the method of automatically detecting breast lesions in this application effectively improves the efficiency of breast lesion detection.
Among the breast lesion identification frames determined from the different feature images, multiple identification frames may correspond to one breast lesion; if the number of breast lesions in the breast image were determined directly from the number of identification frames, the detected number of breast lesions would deviate greatly. Therefore, each feature image must be converted to the same size and aligned, the breast lesion identification frames determined from each feature image must be screened, and the screened identification frames are determined as the breast lesions in the breast image.
To further improve the recognition accuracy of breast lesions, in one possible implementation the breast images include breast images of both breasts in different projection positions. Inputting the breast image into the feature extraction module comprises: taking the breast image of the contralateral breast in the same projection position as a reference image of the breast image, inputting it into the feature extraction module, and obtaining a reference feature image. For any one of the feature images of different sizes of the breast image, determining a breast lesion identification frame from that feature image comprises:
determining a first breast lesion identification frame in the feature image and a second breast lesion identification frame in the reference feature image;
if it is determined that the position and/or size of the first breast lesion identification frame and the second breast lesion identification frame are identical, deleting the first breast lesion identification frame.
To further improve the recognition accuracy of breast lesions, as shown in Fig. 3, an embodiment of the present invention provides a breast image lesion recognition method, comprising:
Step 301: obtaining breast images, the breast images including breast images of both breasts in different projection positions;
Step 302: inputting the breast image into the feature extraction module to obtain feature images of different sizes of the breast image;
Step 303: taking the breast image of the contralateral breast in the same projection position as a reference image of the breast image, inputting it into the feature extraction module, and obtaining reference feature images of different sizes;
Step 304: determining breast lesion identification frames according to the feature images and the reference feature images;
Step 305: determining the breast lesions of the breast image according to the breast lesion identification frames determined from each feature image.
By recognizing with reference to the reference feature images, the recognition accuracy of lesion identification frames in the feature images is further improved, interference from normal mammary gland tissue is avoided, and the lesion detection rate is increased.
In one possible implementation, the screening of breast lesion identification frames comprises:
Step 1: determining a first breast lesion identification frame in the feature image and a second breast lesion identification frame in the reference feature image;
Step 2: if it is determined that the position and/or size of the first breast lesion identification frame and the second breast lesion identification frame are identical, deleting the first breast lesion identification frame.
Optionally, the screening of breast lesion identification frames may further comprise the following steps:
Step 1: determining, from the breast lesion identification frames of each feature image, the identification frame with the highest breast lesion probability.
Step 2: calculating the intersection-over-union between the identification frame with the highest breast lesion probability and each of the other identification frames.
Step 3: deleting the other identification frames whose intersection-over-union exceeds a preset threshold.
Step 4: determining, from the remaining identification frames, the one with the highest breast lesion probability, and repeating the screening until no identification frames remain.
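The four screening steps above amount to greedy non-maximum suppression over the identification frames. A minimal sketch in Python follows; the frames, probabilities and box coordinates are hypothetical, and the `box`/`prob` dictionary layout is an illustrative assumption:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def screen_frames(frames, iou_threshold=0.5):
    """Greedy screening: repeatedly keep the highest-probability frame
    and delete the remaining frames whose IoU with it exceeds the
    preset threshold."""
    remaining = sorted(frames, key=lambda f: f["prob"], reverse=True)
    kept = []
    while remaining:
        best = remaining.pop(0)
        kept.append(best)
        remaining = [f for f in remaining
                     if iou(best["box"], f["box"]) <= iou_threshold]
    return kept

# Hypothetical frames: two nearly coincident high-probability frames
# (the same lesion detected twice) and one separate frame.
frames = [
    {"box": (0, 0, 10, 10),   "prob": 0.96},
    {"box": (1, 1, 11, 11),   "prob": 0.95},  # overlaps the first, deleted
    {"box": (50, 50, 60, 60), "prob": 0.85},
]
print([f["prob"] for f in screen_frames(frames)])  # [0.96, 0.85]
```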
The screening of breast lesion identification frames is illustrated below with a specific example. Suppose the identification frames determined in the feature images are A, B, C, D, E and F, with breast lesion probabilities P(A)=0.9, P(B)=0.85, P(C)=0.95, P(D)=0.75, P(E)=0.96 and P(F)=0.65. Sorting the identification frames by breast lesion probability in descending order gives E, C, A, B, D, F, so the identification frame with the highest breast lesion probability among the feature images is E. The intersection-over-union (IOU) between frame E and each of the other identification frames is then calculated, where the IOU is computed as in formula (1):

IOU = area(m ∩ n) / area(m ∪ n)    (1)

where m is the identification frame with the highest breast lesion probability, n is the identification frame compared with frame m, and IOU is the intersection-over-union between frames m and n, i.e. the area of their overlap divided by the area of their union.
Suppose the preset threshold is 0.5. If the IOU between frames C and E is greater than 0.5, the IOU between frames A and E is greater than 0.5, and the IOUs between frame E and each of frames B, D and F are all less than 0.5, then frames C and A are deleted, and frame E is determined as a breast lesion in the breast image.
Further, the remaining identification frames B, D and F are sorted by breast lesion probability, and the frame with the highest probability, frame B, is selected. The IOU between frames B and D and the IOU between frames B and F are then calculated. If the IOU between frames B and D is greater than 0.5 while the IOU between frames B and F is less than 0.5, frame D is deleted, and frames B and F are determined as breast lesions in the breast image. Because the identification frames determined in the feature images are screened according to their breast lesion probabilities and the IOU between frames, repeated detection and output of the same breast lesion in the breast image is avoided, improving the accuracy of the breast lesion count detected in the breast image.
The process of determining the breast lesion detection model by training a convolutional neural network on multiple breast images with annotated breast lesions is described in detail below. As shown in Fig. 4, it comprises the following steps:
Step 401: obtaining breast images as training samples.
Specifically, the acquired breast images may be used directly as training samples, or an enhancement operation may be applied to the acquired breast images to expand the volume of training data. Enhancement operations include, but are not limited to: random translation by a set number of pixels (e.g. 0-20 pixels), random rotation by a set angle (e.g. -15 to 15 degrees), and random scaling by a set factor (e.g. 0.85-1.15x).
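A dependency-free sketch of one of these enhancement operations (random translation) is given below; the image size and shift range are illustrative, and rotation and scaling, which need interpolation, would typically use e.g. scipy.ndimage.rotate and scipy.ndimage.zoom:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_translate(image, max_shift=20):
    """Randomly translate a 2-D image up/down/left/right by up to
    `max_shift` pixels, padding the vacated border with zeros."""
    dy, dx = rng.integers(-max_shift, max_shift + 1, size=2)
    out = np.zeros_like(image)
    h, w = image.shape
    ys, yd = (dy, 0) if dy >= 0 else (0, -dy)  # source / destination offsets
    xs, xd = (dx, 0) if dx >= 0 else (0, -dx)
    out[yd:h - ys, xd:w - xs] = image[ys:h - yd, xs:w - xd]
    return out

img = np.arange(100.0).reshape(10, 10)
aug = random_translate(img, max_shift=3)
print(aug.shape)  # (10, 10)
```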
Step 402: manually annotating the breast lesions in the training samples.
The breast lesions in the training samples may be annotated by professionals such as physicians; the annotation includes the center coordinate and the diameter of each breast lesion. Specifically, the breast lesions may be annotated by several physicians, with the final breast lesions and their parameters determined by a multi-reader vote, and the result saved as a mask image. It should be noted that manual annotation and sample enhancement may be performed in either order: the breast lesions in the training samples may be annotated first and the annotated samples then enhanced, or the samples may be enhanced first and the enhanced samples then annotated.
Step 403: inputting the training samples into the convolutional neural network for training, and determining the breast lesion recognition model.
The structure of the convolutional neural network includes an input layer, down-sampling convolution blocks, up-sampling convolution blocks, a target detection network and an output layer. The training samples are preprocessed and then input into the above convolutional neural network; a loss function is computed between the output breast lesions and the mask images of the annotated training samples; and the breast lesion detection model is determined by repeated iteration using the back-propagation algorithm and the SGD optimization algorithm.
Further, the process of extracting feature images of different sizes of the breast image using the breast lesion detection model determined by the above training comprises the following steps:
Step 1: sequentially passing the breast image through N/2 down-sampling convolution blocks to extract N/2 first feature images of the breast image.
The first feature images extracted by the down-sampling convolution blocks differ in size, and N/2 is greater than 0.
Optionally, a down-sampling convolution block includes a first convolutional layer, a second convolutional layer, a group connection layer, a front-back connection layer and a down-sampling layer.
Step 2: sequentially passing the first feature images output by the N/2 down-sampling convolution blocks through N/2 up-sampling convolution blocks to extract N/2 second feature images of the breast image.
The second feature images extracted by the up-sampling convolution blocks differ in size.
Optionally, an up-sampling convolution block includes a convolutional layer, a group connection layer, a front-back connection layer, an up-sampling layer and a synthesis connection layer. A convolutional layer comprises a convolution operation, a batch normalization layer and a ReLU layer.
Step 3: merging first feature images and second feature images of identical size to determine the feature images of different sizes of the breast image.
The synthesis connection layer in an up-sampling convolution block merges a first feature image and a second feature image of identical size to determine a feature image. Optionally, the merge combines the channel counts of the first feature image and the second feature image; the size of the merged feature image is identical to the size of the first and second feature images.
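As a rough illustration of Steps 1-3, the sketch below mimics the down-sampling/up-sampling structure with toy numpy operations: max pooling and nearest-neighbour repetition stand in for the real convolution blocks (which, per the text, also contain convolution, BN and ReLU layers), while the merge concatenates channels so that the merged feature image keeps the spatial size of its inputs, as described above. All shapes are illustrative assumptions:

```python
import numpy as np

def downsample_block(x):
    """Toy down-sampling block: halve the spatial size by 2x2 max
    pooling and double the channel count by duplication."""
    c, h, w = x.shape
    pooled = x.reshape(c, h // 2, 2, w // 2, 2).max(axis=(2, 4))
    return np.concatenate([pooled, pooled], axis=0)

def upsample_block(x):
    """Toy up-sampling block: double the spatial size by
    nearest-neighbour repetition and halve the channel count."""
    c = x.shape[0]
    up = x.repeat(2, axis=1).repeat(2, axis=2)
    return up[: c // 2]

def merge(first, second):
    """Merge same-size first and second feature images by channel
    concatenation; the spatial size is unchanged."""
    assert first.shape[1:] == second.shape[1:]
    return np.concatenate([first, second], axis=0)

x = np.random.rand(4, 32, 32)   # (channels, height, width)
f1 = downsample_block(x)        # first feature image, 16x16
f2 = downsample_block(f1)       # 8x8
s1 = upsample_block(f2)         # second feature image, 16x16
merged = merge(f1, s1)          # same spatial size, channels combined
print(f1.shape, s1.shape, merged.shape)
```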
Further, the process of determining breast lesion identification frames from a feature image using the breast lesion detection model determined by the above training comprises the following steps:
Step 1: for any pixel in the feature image, spreading outward from that pixel to determine a first region centered on it.
Step 2: arranging multiple default frames in the first region according to preset rules.
Since breast lesions vary in shape, the default frames may be set to various shapes. The preset rule may align the center of each default frame with the center of the first region, or align a corner of each default frame with a corner of the first region, etc.
In a specific embodiment, the default frames for breast lesions are chosen by treating each pixel of each feature image as an anchor point. Multiple default frames with different aspect ratios are set on each anchor point. For each default frame, a coordinate-and-size offset and a confidence are predicted by convolving the feature image, and the default frame is determined according to the offset and the confidence.
Step 3: for any default frame, predicting the position deviation between the default frame and the first region.
Step 4: adjusting the default frame according to the position deviation to determine the breast lesion identification frame, and predicting the breast lesion probability of the identification frame.
Here, the breast lesion probability is the probability that the region selected by the identification frame is a breast lesion. By predicting the position deviation between the default frame and the first region and then adjusting the default frame by that deviation to determine the identification frame, the identification frame encloses the breast lesion region in the feature image more closely, improving the accuracy of breast lesion detection.
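The anchor-point/default-frame mechanism can be sketched as follows. The square-root aspect-ratio parameterisation and the exponential size offsets follow common single-stage detector conventions and are assumptions, since the text does not give exact formulas; the anchor position, base size and offsets are hypothetical values:

```python
import numpy as np

def default_frames(cx, cy, base_size=16, aspect_ratios=(0.5, 1.0, 2.0)):
    """Place default frames of several aspect ratios centred on one
    anchor point (a feature-map pixel). Boxes are (x1, y1, x2, y2)."""
    boxes = []
    for ar in aspect_ratios:
        w = base_size * np.sqrt(ar)
        h = base_size / np.sqrt(ar)
        boxes.append((cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2))
    return boxes

def adjust(box, dx, dy, dw, dh):
    """Adjust a default frame by predicted coordinate and size offsets
    to obtain the lesion identification frame (exp size offsets are a
    detector-convention assumption)."""
    x1, y1, x2, y2 = box
    cx, cy = (x1 + x2) / 2 + dx, (y1 + y2) / 2 + dy
    w, h = (x2 - x1) * np.exp(dw), (y2 - y1) * np.exp(dh)
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)

anchors = default_frames(32, 32)
print(len(anchors))  # 3
refined = adjust(anchors[1], dx=1.5, dy=-0.5, dw=0.1, dh=0.0)
```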
A specific training process may comprise: inputting the training images into the above convolutional neural network for computation. On input, multiple images of the lesion at different window width / window level settings are passed in. During training, from the prediction frames output by the network, the set of prediction frames with the highest confidence and the set of prediction frames with the largest overlap with the training annotations are selected. The loss function is a weighted sum of the cross entropy between the prediction-frame confidence and the sample annotation, and the cross entropy on the offset between the annotated lesion and the prediction frame. Training proceeds by back-propagation, with the optimization performed by the SGD algorithm with momentum and step ("ladder") learning-rate decay.
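The optimizer named above (SGD with momentum and step decay) can be sketched on a toy loss as follows; all hyperparameter values are illustrative assumptions, not values from the patent:

```python
import numpy as np

def sgd_momentum_step_decay(grad_fn, w, lr=0.1, momentum=0.9,
                            decay_every=30, decay_factor=0.1, steps=90):
    """SGD with momentum and step ("ladder") learning-rate decay.
    `grad_fn(w)` returns the gradient of the loss at w."""
    v = np.zeros_like(w)
    for t in range(steps):
        lr_t = lr * decay_factor ** (t // decay_every)  # step decay
        v = momentum * v - lr_t * grad_fn(w)            # momentum update
        w = w + v
    return w

# Toy example: minimise the quadratic loss 0.5 * ||w||^2 (gradient: w).
w0 = np.array([5.0, -3.0])
w_final = sgd_momentum_step_decay(lambda w: w, w0)
print(np.linalg.norm(w_final))
```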
When the algorithm is applied, the input image is preprocessed by a preprocessing module to improve the effect of feature extraction.
In one possible implementation, obtaining the breast image comprises:
Step 1: determining a binarized image of the captured breast image according to Gaussian filtering;
Step 2: obtaining the connected regions of the binarized image, and taking the region of the breast image corresponding to the largest connected region as the segmented mammary gland image;
Step 3: adding the segmented mammary gland image to a preset image template to generate a preprocessed mammary gland image, and using the preprocessed mammary gland image as the breast image input to the feature extraction module.
Specifically, the input of the preprocessing module is a breast image saved in DICOM format. Preprocessing may include gland segmentation and image normalization. The main purpose of gland segmentation is to extract the mammary gland part from the input breast image and discard irrelevant, interfering image content; image normalization converts the image to a unified format. Specifically:
In step 1, the binarization threshold may be obtained by applying the maximum between-class distance method to the grey-level histogram of the image.
In step 2, independent region blocks may be obtained from the binarization result by flood fill, and the area of each region block counted; the region of the image corresponding to the largest region block is taken as the segmented mammary gland image.
In step 3, the preset image template may be a square image with a black background; specifically, the segmented mammary gland image may be extended to a 1:1 square image by black-border padding.
In addition, the output breast image may be scaled by pixel; for example, the image may be interpolated and scaled to 4096 x 4096 pixels.
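A dependency-light sketch of segmentation steps 1-3 follows: a histogram-based between-class-variance threshold, flood-fill connected regions with the largest region kept, and black square padding. The Gaussian blur and the 4096 x 4096 rescaling are omitted, and all image sizes and pixel values are illustrative:

```python
import numpy as np
from collections import deque

def binarize(image):
    """Binarise by a threshold chosen to maximise the between-class
    variance of the grey-level histogram (a common reading of the
    maximum between-class distance method mentioned above)."""
    t_best, var_best = 0.0, -1.0
    for t in np.linspace(image.min(), image.max(), 64):
        fg, bg = image[image > t], image[image <= t]
        if fg.size == 0 or bg.size == 0:
            continue
        w0, w1 = bg.size / image.size, fg.size / image.size
        var = w0 * w1 * (bg.mean() - fg.mean()) ** 2
        if var > var_best:
            var_best, t_best = var, t
    return image > t_best

def largest_region(mask):
    """Label 4-connected regions by flood fill and keep the largest
    (the segmented breast area)."""
    h, w = mask.shape
    seen = np.zeros(mask.shape, dtype=bool)
    best = np.zeros(mask.shape, dtype=bool)
    for sy, sx in zip(*np.nonzero(mask)):
        if seen[sy, sx]:
            continue
        region, queue = [], deque([(sy, sx)])
        seen[sy, sx] = True
        while queue:
            y, x = queue.popleft()
            region.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                    seen[ny, nx] = True
                    queue.append((ny, nx))
        if len(region) > best.sum():
            best = np.zeros(mask.shape, dtype=bool)
            ys, xs = zip(*region)
            best[list(ys), list(xs)] = True
    return best

def pad_square(image):
    """Place the segmented image on a black square template (1:1)."""
    h, w = image.shape
    side = max(h, w)
    out = np.zeros((side, side), dtype=image.dtype)
    out[:h, :w] = image
    return out

img = np.zeros((8, 6))
img[1:5, 1:4] = 200.0   # large bright region (the breast)
img[6, 5] = 180.0       # small bright speck, discarded
mask = largest_region(binarize(img))
square = pad_square(img * mask)
print(mask.sum(), square.shape)  # 12 (8, 8)
```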
For mammary glands, because of radiation dose and other external shooting factors, the window width and window level of the breast image may be adjusted to obtain a better breast lesion recognition effect. In one possible implementation, before the breast image is input into the feature extraction module, the method further comprises:
obtaining the original file of the breast image;
choosing at least one group of window width / window level values from the original file of the breast image, and obtaining a breast image in picture format corresponding to each group of window width / window level values;
using the breast images in picture format corresponding to the at least one group of window width / window level values as the breast images input to the feature extraction module.
In a specific embodiment, the DICOM image may be converted into PNG images through three groups of window width / window level values: for example, the first group has a window width of 4000 and a window level of 2000; the second group has a window width of 1000 and a window level of 2000; the third group has a window width of 1500 and a window level of 1500.
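The windowing step can be sketched as follows using the three window groups from the embodiment above; the linear clip-and-scale mapping is the standard DICOM windowing transform, and the 8-bit output range is an assumption:

```python
import numpy as np

def apply_window(raw, width, center):
    """Map raw DICOM pixel values to an 8-bit image using one group of
    window width / window level values, clipping outside the window."""
    low = center - width / 2.0
    scaled = (raw - low) / float(width)
    return (np.clip(scaled, 0.0, 1.0) * 255.0).astype(np.uint8)

# The three window (width, level) groups from the embodiment above.
windows = [(4000, 2000), (1000, 2000), (1500, 1500)]
raw = np.array([[0, 1500, 2000, 4000]], dtype=np.float64)
channels = [apply_window(raw, w, c) for w, c in windows]
print(channels[0])  # values 0, 95, 127, 255
```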
Based on the same technical idea, an embodiment of the present invention provides a breast lesion recognition device which, as shown in Fig. 5, can perform the flow of the breast lesion recognition method. The device includes an acquiring unit 501 and a processing unit 502.
The acquiring unit 501 is configured to obtain breast images, the breast images including breast images of both breasts in different projection positions.
The processing unit 502 is configured to: input the breast image into the feature extraction module to obtain feature images of different sizes of the breast image; take the breast image of the contralateral breast in the same projection position as a reference image of the breast image and input it into the feature extraction module to obtain reference feature images of different sizes; determine breast lesion identification frames according to the feature images and the reference feature images; and determine the breast lesions of the breast image according to the breast lesion identification frames determined from each feature image.
In one possible implementation, the processing unit 502 is specifically configured to: determine a first breast lesion identification frame in the feature image and a second breast lesion identification frame in the reference feature image; and, if it is determined that the position and/or size of the first breast lesion identification frame and the second breast lesion identification frame are identical, delete the first breast lesion identification frame.
In one possible implementation, the processing unit 502 is specifically configured to: sequentially pass the breast image through N/2 down-sampling convolution blocks to extract N/2 first feature images of the breast image; sequentially pass the first feature images output by the N/2 down-sampling convolution blocks through N/2 up-sampling convolution blocks to extract N/2 second feature images of the breast image, the second feature images extracted by the up-sampling convolution blocks differing in size; and merge first feature images and second feature images of identical size to determine the feature images of different sizes of the breast image.
In one possible implementation, the feature processing module further includes a feature preprocessing module placed before it; the processing unit 502 is specifically configured to:
input the breast image into the feature preprocessing module, the feature preprocessing module including one convolutional layer, one BN layer, one ReLU layer and one pooling layer, the convolution kernel of the feature preprocessing module being larger than the convolution kernels in the N convolution modules;
or, the feature preprocessing module including multiple consecutive convolutional layers, one BN layer, one ReLU layer and one pooling layer, the convolution kernel of the feature preprocessing module being equal in size to the largest convolution kernel in the N convolution modules.
In one possible implementation, the acquiring unit 501 is configured to obtain the original file of the breast image; and the processing unit 502 is specifically configured to: choose at least one group of window width / window level values from the original file of the breast image, obtain a breast image in picture format corresponding to each group of window width / window level values, and use the breast images in picture format corresponding to the at least one group of window width / window level values as the breast images input to the feature extraction module.
In one possible implementation, the breast images include breast images of both breasts in different projection positions, and the acquiring unit 501 is configured to: take the breast image of the contralateral breast in the same projection position as a reference image of the breast image, input it into the feature extraction module, and obtain a reference feature image; determine a first breast lesion identification frame in the feature image and a second breast lesion identification frame in the reference feature image; and, if it is determined that the position and/or size of the first breast lesion identification frame and the second breast lesion identification frame are identical, delete the first breast lesion identification frame.
An embodiment of the present invention provides a computing device including at least one processing unit and at least one storage unit, wherein the storage unit stores a computer program which, when executed by the processing unit, causes the processing unit to perform the steps of the breast lesion recognition method. Fig. 6 shows a hardware structure diagram of the computing device described in an embodiment of the present invention; the computing device may specifically be a desktop computer, a portable computer, a smartphone, a tablet computer, etc. Specifically, the computing device may include a memory 801, a processor 802 and a computer program stored on the memory; when executing the program, the processor 802 implements the steps of any of the breast lesion recognition methods in the above embodiments. The memory 801 may include read-only memory (ROM) and random access memory (RAM), and provides the processor 802 with the program instructions and data stored in the memory 801.
Further, the computing device described in the embodiments of the present application may also include an input device 803 and an output device 804, etc. The input device 803 may include a keyboard, a mouse, a touch screen, etc.; the output device 804 may include a display device such as a liquid crystal display (LCD), a cathode ray tube (CRT), a touch screen, etc. The memory 801, processor 802, input device 803 and output device 804 may be connected by a bus or in other ways; in Fig. 6 they are connected by a bus as an example. The processor 802 calls the program instructions stored in the memory 801 and, according to the obtained program instructions, executes the breast lesion recognition method provided by the above embodiments.
An embodiment of the present invention also provides a computer-readable storage medium storing a computer program executable by a computing device; when the program runs on the computing device, it causes the computing device to perform the steps of the breast lesion recognition method.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device which realizes the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are executed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thereby provide steps for realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
Although preferred embodiments of the present invention have been described, those skilled in the art, once aware of the basic inventive concept, may make additional changes and modifications to these embodiments. Therefore, the appended claims are intended to be interpreted as including the preferred embodiments and all changes and modifications falling within the scope of the present invention.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. If these modifications and variations of the present invention fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include them.
Claims (10)
1. A breast image lesion recognition method, characterized by comprising:
obtaining breast images, the breast images including breast images of both breasts in different projection positions;
inputting the breast image into a feature extraction module to obtain feature images of different sizes of the breast image;
taking the breast image of the contralateral breast in the same projection position as a reference image of the breast image, and inputting it into the feature extraction module to obtain reference feature images of different sizes;
determining breast lesion identification frames according to the feature images and the reference feature images;
determining the breast lesions of the breast image according to the breast lesion identification frames determined from each feature image.
2. The method as claimed in claim 1, characterized in that determining the breast lesion identification frame according to the feature image and the reference feature image comprises:
determining a first breast lesion identification frame in the feature image and a second breast lesion identification frame in the reference feature image;
if it is determined that the position and/or size of the first breast lesion identification frame and the second breast lesion identification frame are identical, deleting the first breast lesion identification frame.
3. The method as claimed in claim 1, characterized in that the feature extraction module includes N convolution modules; each of the N convolution modules is a down-sampling convolution block or an up-sampling convolution block; the feature images extracted by the down-sampling convolution blocks and up-sampling convolution blocks differ in size; each of the N convolution modules includes a first convolutional layer and a second convolutional layer; the number of feature images output by the first convolutional layer is less than the number of feature images input to the first convolutional layer; the number of feature images output by the second convolutional layer is greater than the number of feature images input to the first convolutional layer; N is greater than 0; and
for any one of the feature images of different sizes of the breast image, a breast lesion identification frame is determined from that feature image.
4. The method as claimed in claim 1, characterized in that obtaining the feature images of different sizes of the breast image comprises:
sequentially passing the breast image through N/2 down-sampling convolution blocks to extract N/2 first feature images of the breast image;
sequentially passing the first feature images output by the N/2 down-sampling convolution blocks through N/2 up-sampling convolution blocks to extract N/2 second feature images of the breast image, the second feature images extracted by the up-sampling convolution blocks differing in size;
merging first feature images and second feature images of identical size to determine the feature images of different sizes of the breast image.
5. The method according to claim 1, wherein the convolutional neural network model further comprises a feature preprocessing module located before the N convolution modules, and inputting the breast image into the feature extraction module comprises:
inputting the breast image into the feature preprocessing module, wherein the feature preprocessing module comprises one convolutional layer, one BN layer, one ReLU layer and one pooling layer, and the convolution kernel of the feature preprocessing module is larger than the convolution kernels in the N convolution modules;
or, the feature preprocessing module comprises a plurality of consecutive convolutional layers, one BN layer, one ReLU layer and one pooling layer, and the convolution kernel size of the feature preprocessing module is equal to that of the largest convolution kernel in the N convolution modules.
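The preprocessing stem of claim 5 is a conv → BN → ReLU → pool pipeline. A minimal sketch, with the (large-kernel) convolution left as a caller-supplied callable since its kernel size and weights are not fixed here, and inference-style BN without learned scale/shift:

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    """Per-channel normalisation (no learned gamma/beta, an assumption)."""
    mean = x.mean(axis=(1, 2), keepdims=True)
    var = x.var(axis=(1, 2), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x):
    """2x2 max pooling (pool size is an assumption)."""
    c, h, w = x.shape
    return x.reshape(c, h // 2, 2, w // 2, 2).max(axis=(2, 4))

def preprocess(image, conv):
    """Claim-5-style stem: convolution stand-in, then BN, ReLU, pooling."""
    return max_pool(relu(batch_norm(conv(image))))

rng = np.random.default_rng(2)
stem_out = preprocess(rng.standard_normal((1, 16, 16)), conv=lambda x: x)
```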
6. The method according to claim 1, wherein before inputting the breast image into the feature extraction module, the method further comprises:
obtaining an original file of the breast image;
selecting at least one set of window width and window level from the original file of the breast image, and obtaining a breast image in picture format corresponding to each set of window width and window level; and
using the breast image in picture format corresponding to the at least one set of window width and window level as the breast image input into the feature extraction module.
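Converting a raw medical-image file to picture format with one window width/level pair is conventionally a linear windowing transform: clip intensities to [level − width/2, level + width/2] and rescale to 0–255. A sketch of that standard transform (the specific width/level values below are illustrative, not taken from the patent):

```python
import numpy as np

def apply_window(pixels, width, level):
    """Map raw intensities to an 8-bit picture for one window
    width/level pair: clip to [level - width/2, level + width/2],
    then scale linearly to 0..255."""
    lo, hi = level - width / 2.0, level + width / 2.0
    clipped = np.clip(pixels.astype(np.float64), lo, hi)
    return np.round((clipped - lo) / (hi - lo) * 255.0).astype(np.uint8)

raw = np.array([[-100, 40, 1000]])          # example raw intensities
img = apply_window(raw, width=400, level=40)
```

Choosing several width/level sets, as the claim allows, yields several picture-format images of the same breast emphasising different tissue contrasts.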
7. A device for breast image lesion identification, comprising:
an acquiring unit configured to obtain breast images, the breast images comprising images of the two breasts taken at different projection positions; and
a processing unit configured to: input the breast image into a feature extraction module to obtain characteristic images of different sizes of the breast image; input the image of the contralateral breast taken at the same projection position as the breast image into the feature extraction module as a reference image of the breast image, obtaining reference characteristic images of different sizes; determine breast lesion identification frames according to the characteristic images and the reference characteristic images; and determine the breast lesion of the breast image according to the breast lesion identification frames determined from each characteristic image.
8. The device according to claim 7, wherein the processing unit is specifically configured to:
determine a first breast lesion identification frame in the characteristic image and a second breast lesion identification frame in the reference characteristic image; and
if it is determined that the first breast lesion identification frame and the second breast lesion identification frame are identical in position and/or size, delete the first breast lesion identification frame.
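The deletion rule of claim 8 can be sketched as follows: a frame that reappears at the matching position and size in the contralateral same-view reference image is treated as symmetric normal structure and dropped. The frame representation, the requirement that both position and size match, and the tolerance parameter are all assumptions for illustration:

```python
def deduplicate_frames(frames, reference_frames, tol=0.0):
    """Drop any lesion identification frame whose position and size both
    match a frame found in the reference (contralateral same-view) image.
    Frames are (x, y, w, h) tuples; tol is an assumed matching tolerance."""
    kept = []
    for (x, y, w, h) in frames:
        matched = any(abs(x - rx) <= tol and abs(y - ry) <= tol
                      and abs(w - rw) <= tol and abs(h - rh) <= tol
                      for rx, ry, rw, rh in reference_frames)
        if not matched:
            kept.append((x, y, w, h))
    return kept

frames = [(10, 10, 5, 5), (40, 40, 8, 8)]
reference = [(10, 10, 5, 5)]          # same frame found in the other breast
kept = deduplicate_frames(frames, reference)
```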
9. A computing device, comprising at least one processing unit and at least one storage unit, wherein the storage unit stores a computer program which, when executed by the processing unit, causes the processing unit to perform the steps of the method according to any one of claims 1 to 6.
10. A computer-readable storage medium storing a computer program executable by a computing device, wherein, when the program runs on the computing device, the computing device is caused to perform the steps of the method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811203383.7A CN109363699B (en) | 2018-10-16 | 2018-10-16 | Method and device for identifying focus of breast image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109363699A true CN109363699A (en) | 2019-02-22 |
CN109363699B CN109363699B (en) | 2022-07-12 |
Family
ID=65400002
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811203383.7A Active CN109363699B (en) | 2018-10-16 | 2018-10-16 | Method and device for identifying focus of breast image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109363699B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060177125A1 (en) * | 2005-02-08 | 2006-08-10 | Regents Of The University Of Michigan | Computerized detection of breast cancer on digital tomosynthesis mammograms |
CN101373479A (en) * | 2008-09-27 | 2009-02-25 | 华中科技大学 | Method and system for searching computer picture of mammary gland x-ray radiography |
CN105975785A (en) * | 2016-05-13 | 2016-09-28 | 深圳市前海安测信息技术有限公司 | Mammary gland screening image automatic processing system and method |
CN106572824A (en) * | 2014-07-18 | 2017-04-19 | 皇家飞利浦有限公司 | Stenosis assessment |
CN107665491A (en) * | 2017-10-10 | 2018-02-06 | 清华大学 | The recognition methods of pathological image and system |
CN107680678A (en) * | 2017-10-18 | 2018-02-09 | 北京航空航天大学 | Based on multiple dimensioned convolutional neural networks Thyroid ultrasound image tubercle auto-check system |
CN108090889A (en) * | 2016-11-21 | 2018-05-29 | 医渡云(北京)技术有限公司 | Galactophore image establishment of coordinate system method and device |
CN108648192A (en) * | 2018-05-17 | 2018-10-12 | 杭州依图医疗技术有限公司 | A kind of method and device of detection tubercle |
2018-10-16: application CN201811203383.7A filed in China; granted as CN109363699B (legal status: Active)
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020077961A1 (en) * | 2018-10-16 | 2020-04-23 | 杭州依图医疗技术有限公司 | Image-based breast lesion identification method and device |
TWI769370B (en) * | 2019-03-08 | 2022-07-01 | 太豪生醫股份有限公司 | Focus detection apparatus and method thereof |
CN109978894A (en) * | 2019-03-26 | 2019-07-05 | 成都迭迦科技有限公司 | A kind of lesion region mask method and system based on three-dimensional mammary gland color ultrasound |
CN110013264A (en) * | 2019-04-29 | 2019-07-16 | 北京青燕祥云科技有限公司 | X-ray image recognition methods, device, electronic equipment and storage medium |
CN110400302A (en) * | 2019-07-25 | 2019-11-01 | 杭州依图医疗技术有限公司 | The method and device of lesion information in a kind of determination, display breast image |
CN110400302B (en) * | 2019-07-25 | 2021-11-09 | 杭州依图医疗技术有限公司 | Method and device for determining and displaying focus information in breast image |
WO2021136505A1 (en) * | 2019-12-31 | 2021-07-08 | Shanghai United Imaging Healthcare Co., Ltd. | Imaging systems and methods |
CN111325282A (en) * | 2020-03-05 | 2020-06-23 | 北京深睿博联科技有限责任公司 | Mammary gland X-ray image identification method and device suitable for multiple models |
CN111415332A (en) * | 2020-03-05 | 2020-07-14 | 北京深睿博联科技有限责任公司 | Mammary gland X-ray image linkage method and device |
CN111415332B (en) * | 2020-03-05 | 2023-10-24 | 北京深睿博联科技有限责任公司 | Mammary gland X-ray image linkage method and device |
CN111325282B (en) * | 2020-03-05 | 2023-10-27 | 北京深睿博联科技有限责任公司 | Mammary gland X-ray image identification method and device adapting to multiple models |
CN113344854A (en) * | 2021-05-10 | 2021-09-03 | 深圳瀚维智能医疗科技有限公司 | Breast ultrasound video-based focus detection method, device, equipment and medium |
TWI832671B (en) * | 2023-01-13 | 2024-02-11 | 國立中央大學 | Mammography intelligent diagnosis method by using machine learning from mammography image |
Also Published As
Publication number | Publication date |
---|---|
CN109363699B (en) | 2022-07-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109447065A (en) | A kind of method and device of breast image identification | |
CN109363699A (en) | A kind of method and device of breast image lesion identification | |
CN109363698A (en) | A kind of method and device of breast image sign identification | |
US10127675B2 (en) | Edge-based local adaptive thresholding system and methods for foreground detection | |
CN109363697A (en) | A kind of method and device of breast image lesion identification | |
CN109447998B (en) | Automatic segmentation method based on PCANet deep learning model | |
Deng et al. | Classification of breast density categories based on SE-Attention neural networks | |
US9741112B2 (en) | Generating image-based diagnostic tests by optimizing image analysis and data mining of co-registered images | |
US7646902B2 (en) | Computerized detection of breast cancer on digital tomosynthesis mammograms | |
CN110942446A (en) | Pulmonary nodule automatic detection method based on CT image | |
CN106846344A (en) | A kind of image segmentation optimal identification method based on the complete degree in edge | |
CN109389129A (en) | A kind of image processing method, electronic equipment and storage medium | |
CN109087703A (en) | Abdominal cavity CT image peritonaeum metastatic marker method based on depth convolutional neural networks | |
CN108648192A (en) | A kind of method and device of detection tubercle | |
CN110046627B (en) | Method and device for identifying mammary gland image | |
CN109461144A (en) | A kind of method and device of breast image identification | |
CN108830842A (en) | A kind of medical image processing method based on Corner Detection | |
CN112990214A (en) | Medical image feature recognition prediction model | |
CN115423806B (en) | Breast mass detection method based on multi-scale cross-path feature fusion | |
Hou et al. | Mass segmentation for whole mammograms via attentive multi-task learning framework | |
CN106023188A (en) | Breast tumor feature selection method based on Relief algorithm | |
CN114581474A (en) | Automatic clinical target area delineation method based on cervical cancer CT image | |
Wang et al. | A region-line primitive association framework for object-based remote sensing image analysis | |
CN111062909A (en) | Method and equipment for judging benign and malignant breast tumor | |
CN109635866B (en) | Method of processing an intestinal image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | | |
SE01 | Entry into force of request for substantive examination | | |
GR01 | Patent grant | | |