CN109685077A - Breast lump image recognition method and device - Google Patents
Breast lump image recognition method and device
- Publication number
- CN109685077A (application CN201811525188.6A)
- Authority
- CN
- China
- Prior art keywords
- feature map
- magnetic resonance
- mammary gland
- magnetic resonance image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
Abstract
This specification provides a breast lump image recognition method and device. The method comprises: obtaining a breast magnetic resonance image to be identified; and inputting the breast magnetic resonance image to be identified into a pre-built breast lump recognition model to obtain a lump recognition result for the image. The lump recognition model is a deep fully convolutional neural network model: the encoding stage of the model performs feature extraction with the basic convolution modules of a U-shaped convolutional neural network model, and the decoding stage uses dense connections to resize the feature maps to be fused to a unified size before fusing them. The embodiments of this specification realize automatic recognition of breast lumps and improve the accuracy of breast lump recognition.
Description
Technical field
This specification belongs to the technical field of image processing, and in particular relates to a breast lump image recognition method and device.
Background technique
Breast cancer is the cancer with the highest incidence among women, and the cure rate of early-stage breast cancer is much higher than that of advanced breast cancer; early discovery, early diagnosis, and early treatment are key to reducing the breast cancer fatality rate. Imaging examinations such as mammography (molybdenum-target), ultrasound, and magnetic resonance imaging are common techniques for early breast cancer screening, and breast cancer screening can be performed by identifying breast lumps in the examination results. The size and shape of breast lumps differ widely between patients, which poses a great challenge to automatic segmentation methods, yet lump segmentation is the first step of all subsequent breast cancer diagnosis and treatment. Realizing automatic segmentation of breast lumps is therefore of great significance to computer-aided diagnosis systems.
In the prior art, mammogram analysis systems mainly use manually extracted features to locate or classify breast lumps. Manual feature extraction depends on the expertise and heuristic knowledge of the researcher, often carries limitations and subjectivity, and the results can be strongly affected by it. Prior-art automatic segmentation methods for breast lumps generally give only the position of a lump, without information about its shape and size. This field therefore needs a technical solution that can accurately segment the breast lump region.
Summary of the invention
The purpose of this specification is to provide a breast lump image recognition method and device that realize automatic recognition of breast lumps and improve the accuracy of breast lump recognition.
In one aspect, an embodiment of this specification provides a breast lump image recognition method, comprising:
obtaining a breast magnetic resonance image to be identified;
inputting the breast magnetic resonance image to be identified into a pre-built breast lump recognition model, and obtaining a lump recognition result for that image;
wherein the lump recognition model is a deep fully convolutional neural network model; the encoding stage of the model performs feature extraction with the basic convolution modules of a U-shaped convolutional neural network model, and the decoding stage uses dense connections to resize the feature maps to be fused to a unified size before fusing them.
Further, in another embodiment of the method, the basic convolution modules of the U-shaped convolutional neural network model perform feature extraction with the following formula:
y1 = δ(W13 * δ(W12 * δ(W11 * x1 + b11) + b12) + b13)
where y1 denotes the extracted feature map, x1 the input breast magnetic resonance image, δ the first activation function, W11, W12, W13 the weights of the different convolutional layers, and b11, b12, b13 the corresponding bias parameters.
Further, in another embodiment of the method, the method also comprises adjusting the extracted feature maps with the deep fully convolutional neural network model using the following formula:
y2 = δ((W22 * δ(W21 * x2 + b21) + b22) + x2)
where y2 denotes the adjusted feature map, x2 the extracted feature map, δ the first activation function, W21, W22 the weights of the different convolutional layers, and b21, b22 the corresponding bias parameters.
Further, in another embodiment of the method, the method also comprises:
assigning different weight values to the extracted feature maps with the deep fully convolutional neural network model;
performing feature fusion according to the weight value of each feature map.
Further, in another embodiment of the method, assigning different weight values to the extracted feature maps comprises:
applying average pooling to the extracted feature maps;
connecting the average-pooled feature maps through fully connected layers and activating them with a second activation function to obtain the weight value of each feature map.
Further, in another embodiment of the method, assigning different weight values to the extracted feature maps comprises obtaining the weight values with the following formulas:
Zc = (1 / (H × M)) · Σ_{h=1}^{H} Σ_{m=1}^{M} xc(h, m)
S = σ(W32 · δ(W31 · Z + b31) + b32)
where Zc denotes the average-pooling result of the c-th feature map, H the height of the c-th feature map, M its width, xc the c-th input feature map, S the vector of feature-map weight values, Z the vector of pooling results of the feature maps, σ the second activation function, W31, W32 the weights of the different fully connected layers, and b31, b32 the corresponding bias parameters.
Further, in another embodiment of the method, the breast lump recognition model is built as follows:
obtaining multiple samples of data, each sample comprising a breast magnetic resonance image and the lump annotation in that image;
establishing the breast lump recognition model, taking the breast magnetic resonance images in the sample data as the input data of the model and the corresponding lump annotations as its output data, and training the model until it meets a preset requirement.
Further, in another embodiment of the method, the method also comprises:
comparing the image parameters of the breast magnetic resonance image to be identified with those of the breast magnetic resonance images in the sample data;
if the image parameters are inconsistent, adjusting the breast lump recognition model with the breast magnetic resonance image to be identified;
performing lump recognition on the breast magnetic resonance image to be identified with the adjusted breast lump recognition model.
In another aspect, this specification provides a breast lump image recognition device, comprising:
an image acquisition module for obtaining a breast magnetic resonance image to be identified;
a lump recognition module for inputting the breast magnetic resonance image to be identified into a pre-built breast lump recognition model and obtaining a lump recognition result for that image;
wherein the lump recognition model is a deep fully convolutional neural network model; the encoding stage of the model performs feature extraction with the basic convolution modules of a U-shaped convolutional neural network model, and the decoding stage uses dense connections to resize the feature maps to be fused to a unified size before fusing them.
Further, in another embodiment of the device, the basic convolution modules of the U-shaped convolutional neural network model perform feature extraction with the following formula:
y1 = δ(W13 * δ(W12 * δ(W11 * x1 + b11) + b12) + b13)
where y1 denotes the extracted feature map, x1 the input breast magnetic resonance image, δ the first activation function, W11, W12, W13 the weights of the different convolutional layers, and b11, b12, b13 the corresponding bias parameters.
Further, in another embodiment of the device, the deep fully convolutional neural network model includes a feature adaptation module for adjusting the extracted feature maps with the following formula:
y2 = δ((W22 * δ(W21 * x2 + b21) + b22) + x2)
where y2 denotes the adjusted feature map, x2 the extracted feature map, δ the first activation function, W21, W22 the weights of the different convolutional layers, and b21, b22 the corresponding bias parameters.
Further, in another embodiment of the device, the deep fully convolutional neural network model also includes a channel attention module for:
assigning different weight values to the extracted feature maps;
performing feature fusion according to the weight value of each feature map.
Further, in another embodiment of the device, the channel attention module is specifically configured to:
apply average pooling to the extracted feature maps;
connect the average-pooled feature maps through fully connected layers and activate them with a second activation function to obtain the weight value of each feature map.
Further, in another embodiment of the device, the channel attention module is specifically configured to obtain the weight values with the following formulas:
Zc = (1 / (H × M)) · Σ_{h=1}^{H} Σ_{m=1}^{M} xc(h, m)
S = σ(W32 · δ(W31 · Z + b31) + b32)
where Zc denotes the average-pooling result of the c-th feature map, H the height of the c-th feature map, M its width, xc the c-th input feature map, S the vector of feature-map weight values, Z the vector of pooling results of the feature maps, σ the second activation function, W31, W32 the weights of the different fully connected layers, and b31, b32 the corresponding bias parameters.
Further, in another embodiment of the device, the device also includes a model construction module for building the breast lump recognition model as follows:
obtaining multiple samples of data, each sample comprising a breast magnetic resonance image and the lump annotation in that image;
establishing the breast lump recognition model, the model containing multiple model parameters;
taking the breast magnetic resonance images in the sample data as the input data of the model and the corresponding lump annotations as its output data, and adjusting the model parameters until the model meets a preset requirement.
Further, in another embodiment of the device, the device also includes a model adjustment module for:
comparing the image parameters of the breast magnetic resonance image to be identified with those of the breast magnetic resonance images in the sample data;
if the image parameters are inconsistent, adjusting the breast lump recognition model with the breast magnetic resonance image to be identified;
performing lump recognition on the breast magnetic resonance image to be identified with the adjusted breast lump recognition model.
In yet another aspect, this specification provides a breast lump image recognition processing apparatus, comprising at least one processor and a memory for storing processor-executable instructions, wherein the processor, when executing the instructions, implements the breast lump image recognition method of the embodiments of this specification.
In still another aspect, this specification provides a breast lump image recognition system, comprising at least one processor and a memory for storing processor-executable instructions, wherein the processor, when executing the instructions, implements the breast lump image recognition method of the embodiments of this specification.
The breast lump image recognition method, device, processing apparatus, and system provided by this specification are based on deep learning: a U-shaped convolutional neural network model is combined with a densely connected convolutional neural network model to build an asymmetric encoder-decoder breast lump recognition model structure. A breast magnetic resonance image to be identified is then input into the built model to obtain its lump recognition result, realizing automatic recognition of breast lumps without manual visual inspection and improving the recognition result. The method better retains the features that benefit the segmentation result and weakens the effect of useless features. It greatly improves the model's performance on magnetic resonance breast lump segmentation, needs no subsequent network model to further optimize the segmentation result, reduces computation cost, speeds up image reading, and can better assist doctors in real-time imaging diagnosis.
Detailed description of the invention
In order to illustrate more clearly of this specification embodiment or technical solution in the prior art, below will to embodiment or
Attached drawing needed to be used in the description of the prior art is briefly described, it should be apparent that, the accompanying drawings in the following description is only
The some embodiments recorded in this specification, for those of ordinary skill in the art, in not making the creative labor property
Under the premise of, it is also possible to obtain other drawings based on these drawings.
Fig. 1 is a flow diagram of the breast lump image recognition method in one embodiment of this specification;
Fig. 2 is a schematic diagram of the network architecture of the breast lump recognition model in one embodiment of this specification;
Fig. 3 is a schematic diagram of the module structure of one embodiment of the breast lump image recognition device provided by this specification;
Fig. 4 is a schematic structural diagram of the breast lump image recognition device in another embodiment of this specification;
Fig. 5 is a schematic structural diagram of the breast lump image recognition device in yet another embodiment of this specification;
Fig. 6 is a hardware block diagram of a breast lump recognition server applying an embodiment of the present application.
Specific embodiment
In order to make those skilled in the art more fully understand the technical solution in this specification, below in conjunction with this explanation
Attached drawing in book embodiment is clearly and completely described the technical solution in this specification embodiment, it is clear that described
Embodiment be only this specification a part of the embodiment, instead of all the embodiments.The embodiment of base in this manual,
Every other embodiment obtained by those of ordinary skill in the art without making creative efforts, all should belong to
The range of this specification protection.
The breast lump image recognition method in this specification can be applied to a client or a server. The client can be an electronic device such as a smartphone, a tablet computer, a smart wearable device (smartwatch, virtual reality glasses, virtual reality helmet, etc.), or a smart in-vehicle device.
Specifically, Fig. 1 is a flow diagram of the breast lump image recognition method in one embodiment of this specification. As shown in Fig. 1, the overall flow of the method provided in this embodiment can include:
Step 102: obtain a breast magnetic resonance image to be identified.
Magnetic resonance examination is currently a relatively common medical examination method. In this embodiment, a user's breast magnetic resonance image can be obtained, and breast lump recognition is performed on the obtained image to be identified.
Step 104: input the breast magnetic resonance image to be identified into the pre-built breast lump recognition model, and obtain the lump recognition result for that image.
The lump recognition model is a deep fully convolutional neural network model; the encoding stage of the model performs feature extraction with the basic convolution modules of a U-shaped convolutional neural network model, and the decoding stage uses dense connections to resize the feature maps to be fused to a unified size before fusing them.
In a specific implementation, the breast lump recognition model can be built with a deep learning method. For example, model training can be performed with existing breast magnetic resonance image data of breast cancer patients to learn the functional mapping from an input magnetic resonance image to an output breast lump segmentation result. Fig. 2 is a schematic diagram of the network architecture of the breast lump recognition model in one embodiment of this specification. As shown in Fig. 2, the breast lump recognition model in this embodiment can be a deep fully convolutional neural network model that combines a U-shaped convolutional neural network model (U-Net) with a densely connected convolutional neural network model (DenseNet). The encoding stage can use the basic convolution modules of the U-shaped model for feature extraction, while the decoding stage can use the dense connections of the densely connected model to resize the feature maps to be fused to a unified size and then perform feature fusion.
U-Net can be understood as a variant of the convolutional neural network whose main structure resembles the letter U, hence the name. The U-Net network consists of two parts: a contracting path and an expanding path. The contracting path is mainly used to capture contextual information in the picture, while the symmetric expanding path precisely locates the parts of the picture that need to be segmented. DenseNet can be understood as a convolutional neural network with dense connections: in this network there is a direct connection between any two layers, that is, the input of each layer is the union of the outputs of all preceding layers, and the feature maps learned by each layer are passed directly to all following layers as input. Dense connections alleviate the vanishing-gradient problem, strengthen feature propagation, encourage feature reuse, greatly reduce the number of parameters, and improve the accuracy of image recognition.
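The dense connectivity described above can be sketched in a few lines: each layer receives the channel-wise concatenation of the input and every earlier layer's output. The toy NumPy version below uses 1×1 (channel-mixing) convolutions in place of full spatial convolutions to keep the sketch short; the layer count and growth rate are illustrative assumptions.

```python
import numpy as np

def dense_block(x, layer_weights):
    """Toy dense block: layer i sees the channel-concatenation of the input
    and every previous layer's output, mirroring DenseNet's dense connectivity.

    x: (C, H, W) feature maps; layer_weights[i]: (growth, C + i * growth)
    channel-mixing matrices standing in for 1x1 convolutions.
    """
    features = [x]
    for W in layer_weights:
        inp = np.concatenate(features, axis=0)                    # union of earlier outputs
        out = np.maximum(np.einsum('oc,chw->ohw', W, inp), 0.0)   # 1x1 conv + ReLU
        features.append(out)                                      # fed to every later layer
    return np.concatenate(features, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    C, H, Wd, growth = 4, 8, 8, 2
    x = rng.standard_normal((C, H, Wd))
    ws = [rng.standard_normal((growth, C + i * growth)) * 0.1 for i in range(3)]
    y = dense_block(x, ws)
    print(y.shape)   # (10, 8, 8): 4 input channels + 3 layers x growth 2
```

Because every layer's output is kept and concatenated, features are reused rather than recomputed, which is what keeps the parameter count low.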
As shown in Fig. 2, the basic convolution modules on the left can be basic convolution modules similar to those of the U-shaped convolutional neural network model, used for feature extraction; the downward arrows between them denote max pooling. The channel attention modules on the right can be understood as the decoding stage, and their upward arrows denote bilinear interpolation; the feature map of the encoding stage at the same level can take part in the feature fusion of the decoding stage. The arrowed connecting lines on the right of the channel attention modules denote dense connections: in some embodiments of this specification, the network modules of the decoding stage of the deep fully convolutional neural network model are densely connected, i.e. the channel attention modules in Fig. 2 are densely connected. A channel attention module resizes the feature map extracted by the encoding stage at the same level and the feature maps of the other channel attention modules to a unified size and then performs feature fusion.
For example, the topmost channel attention module in Fig. 2 can take the feature map obtained by the encoding stage on its left, the feature maps output by the three channel attention modules below it, and the feature map output by the feature adaptation module at the bottom, resize them to a unified size, and then fuse them. These feature maps can be understood as the feature maps to be fused by the topmost channel attention module.
The embodiments of this specification add dense connections to the decoding stage. These dense connections upsample feature maps of different abstraction levels to a unified size and then merge them directly. Feature maps are thereby reused: feature maps of high abstraction better guide the classification result, while feature maps of low abstraction better retain position information, so the segmentation result is greatly improved.
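The decoding-stage fusion described above — upsampling feature maps of different abstraction levels to one size and merging them directly — can be sketched as follows. Nearest-neighbour upsampling is used here for brevity where Fig. 2 of the patent uses bilinear interpolation; the channel counts and sizes are illustrative assumptions.

```python
import numpy as np

def upsample_nearest(x, target_hw):
    """Upsample (C, h, w) feature maps to (C, H, W) by integer repetition
    (a simple stand-in for the bilinear interpolation of the decoding stage)."""
    C, h, w = x.shape
    H, W = target_hw
    return x.repeat(H // h, axis=1).repeat(W // w, axis=2)

def dense_fuse(feature_maps, target_hw):
    """Resize every feature map to a unified size, then concatenate them
    along the channel axis -- the dense fusion of the decoding stage."""
    resized = [upsample_nearest(f, target_hw) for f in feature_maps]
    return np.concatenate(resized, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Feature maps from three abstraction levels: deeper = smaller, more channels.
    fmaps = [rng.standard_normal((c, s, s)) for c, s in [(2, 16), (4, 8), (8, 4)]]
    fused = dense_fuse(fmaps, (16, 16))
    print(fused.shape)   # (14, 16, 16)
```

Merging by concatenation (rather than addition) keeps low-abstraction position information and high-abstraction semantic information side by side for the following layers.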
After the breast lump recognition model is built, the breast magnetic resonance image to be identified is input into it, and the model produces the lump recognition result. For example, the model can identify whether the image contains a lump and, if so, the region where the lump is located, or the shape and size of the lump. The bright spot in the output image on the far right of Fig. 2 can represent the segmented lump, i.e. the lump region in the breast magnetic resonance image input on the left. A doctor can use the identified lump region for further diagnosis and treatment, or for other medical research.
The breast lump image recognition method provided by this embodiment is based on deep learning: a U-shaped convolutional neural network model is combined with a densely connected convolutional neural network model to build an asymmetric encoder-decoder breast lump recognition model structure. A breast magnetic resonance image to be identified is then input into the built model to obtain its lump recognition result, realizing automatic recognition of breast lumps without manual visual inspection and improving the recognition result. The method better retains the features that benefit the segmentation result and weakens the effect of useless features. It greatly improves the network's performance on magnetic resonance breast lump segmentation, needs no subsequent network model to further optimize the segmentation result, reduces computation cost, speeds up image reading, and can better assist doctors in real-time imaging diagnosis.
On the basis of the above embodiments, in some embodiments of this specification, the basic convolution modules of the U-shaped convolutional neural network model can perform feature extraction with the following formula:
y1 = δ(W13 * δ(W12 * δ(W11 * x1 + b11) + b12) + b13)     (1)
where y1 denotes the extracted feature map, i.e. the output of the basic convolution module; x1 denotes the input breast magnetic resonance image, i.e. the input of the module; δ denotes the first activation function (which can be the rectified linear function); W11, W12, W13 denote the weights of the different convolutional layers; and b11, b12, b13 denote the corresponding bias parameters.
In a specific implementation, as shown in Fig. 2, a basic convolution module can implement the function of formula (1) and use it for feature extraction. Multiple convolution operations can be performed in a basic convolution module, i.e. the module can contain multiple convolutional layers, each with its own weight W11, W12, W13 and bias parameter b11, b12, b13. In this embodiment a basic convolution module contains 3 convolutional layers; another number of convolutional layers can be set as needed, and formula (1) can be adapted or modified accordingly. This embodiment imposes no specific limit.
Basic convolution modules can be connected by max pooling, which enlarges the receptive field and, to a certain extent, makes the network invariant to translations of the input image. After each max pooling layer, the resolution of the feature map is halved and the number of channels is doubled.
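Formula (1) and the pooling step above can be sketched as follows: three convolution-plus-activation steps in sequence, followed by a 2×2 max pool that halves the resolution. The sketch uses 1×1 (channel-mixing) convolutions in place of full spatial convolutions, and ReLU for δ; both are illustrative assumptions.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def basic_conv_module(x, params):
    """Formula (1): y1 = d(W13 * d(W12 * d(W11 * x1 + b11) + b12) + b13),
    with each (W, b) acting as a 1x1 channel-mixing convolution on (C, H, W) maps.
    """
    y = x
    for W, b in params:   # three convolutional layers, each followed by δ
        y = relu(np.einsum('oc,chw->ohw', W, y) + b[:, None, None])
    return y

def max_pool2(x):
    """2x2 max pooling: halves the resolution; in the full model the channel
    count is doubled by the next basic convolution module."""
    C, H, W = x.shape
    return x.reshape(C, H // 2, 2, W // 2, 2).max(axis=(2, 4))

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    x = rng.standard_normal((1, 16, 16))          # one-channel MR image slice
    params = [
        (rng.standard_normal((8, 1)), np.zeros(8)),   # W11, b11
        (rng.standard_normal((8, 8)), np.zeros(8)),   # W12, b12
        (rng.standard_normal((8, 8)), np.zeros(8)),   # W13, b13
    ]
    y1 = basic_conv_module(x, params)
    print(y1.shape, max_pool2(y1).shape)   # (8, 16, 16) (8, 8, 8)
```

Adding or removing (W, b) pairs in `params` corresponds to changing the number of convolutional layers, as the embodiment allows.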
This embodiment uses basic convolution modules similar to those of U-Net for feature extraction, which can reduce the amount of sample data required and improve the efficiency of data processing.
On the basis of the above embodiments, in one embodiment of this specification, the method can also comprise adjusting the extracted feature maps with the deep fully convolutional neural network model using the following formula:
y2 = δ((W22 * δ(W21 * x2 + b21) + b22) + x2)     (2)
where y2 denotes the adjusted feature map, i.e. the output of the feature adaptation module; x2 denotes the extracted feature map, i.e. the input of the module; δ denotes the first activation function (which can be the rectified linear function); W21, W22 denote the weights of the different convolutional layers; and b21, b22 denote the corresponding bias parameters.
In a specific implementation, as shown in Fig. 2, the deep fully convolutional neural network model, i.e. the breast lump identification model, in the embodiments of this specification may further include a feature adaptation module. The feature adaptation module can adjust the feature maps produced by the basic convolution modules during encoding, so as to improve the subsequent feature fusion. The feature adaptation module may implement the function of the above formula (2) and adjust the feature maps accordingly; it may also contain multiple convolutional layers, with different convolutional layers corresponding to different weights W21, W22 and biases b21, b22. In the embodiments of this specification the feature adaptation module may include 2 convolutional layers; a different number of convolutional layers may also be set as needed, in which case formula (2) can be adapted or modified accordingly, and this is not specifically limited in the embodiments of this specification. The feature maps of different abstraction levels produced during encoding can, after passing through the feature adaptation module, be fused with the feature maps of the decoding process in a cascaded (concatenation) manner.

By using the feature adaptation module in the breast lump identification model, the embodiments of this specification can adjust the feature maps, improve the effect of feature fusion, and further improve the accuracy of breast lump identification.
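The residual adjustment of formula (2) can be sketched as follows; scalar weights stand in for the two convolutional layers here, a simplification made only for this illustration:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def feature_adapt(x2, w21, b21, w22, b22):
    """Residual feature-map adjustment mirroring formula (2):
    y2 = relu((W22 * relu(W21 * x2 + b21) + b22) + x2).
    Scalar weights stand in for the convolution kernels (assumption).
    """
    inner = relu(w21 * x2 + b21)
    # Skip connection: the input x2 is added back before activation.
    return relu((w22 * inner + b22) + x2)
```

With the second layer's weight and bias set to zero, the module reduces to relu(x2), showing how the skip connection preserves the input feature map.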
On the basis of the above embodiments, in one embodiment of this specification, the method further includes:
assigning different weight values to the extracted feature maps using the deep fully convolutional neural network model;
performing feature fusion according to the weight value corresponding to each feature map.
In a specific implementation, the embodiments of this specification may assign different weight values to different feature maps, for example: setting a relatively high weight value for a feature map that carries more information, and a relatively low weight value for a feature map that carries little or no useful information. Feature fusion is then performed according to the weight value of each feature map, for example: the weight value corresponding to each feature map can be multiplied onto the feature map to be fused, increasing the influence of important feature maps on the lump recognition result and reducing the influence of unimportant feature maps.
In one embodiment of this specification, average pooling may be applied to the extracted feature maps, for example: the feature maps adjusted by the feature adaptation module are average-pooled, then passed through two fully connected layers, and finally activated by a second activation function, such as the Sigmoid function, to generate the weight of each feature map; the weight value is then multiplied back onto the feature map to be fused. Average pooling means averaging all values within a local receptive field; the Sigmoid function is an S-shaped function common in biology, also known as the S-shaped growth curve.
In one embodiment of this specification, the weight value corresponding to each feature map can be obtained using the following formula (3):

Zc = (1 / (H × M)) Σ(i=1..H) Σ(j=1..M) xc(i, j),   S = σ(W32 * δ(W31 * Z + b31) + b32)   (3)

In the above formula, Zc denotes the pooling result of the c-th feature map after average pooling; H denotes the height of the c-th feature map; M denotes the width of the c-th feature map; i may take values from 1 to H and j may take values from 1 to M; xc may denote the c-th feature map input to the channel attention module; S denotes the weight value vector of the feature maps (which may contain the weight values of multiple feature maps); Z denotes the pooling result vector of the feature maps after pooling (which may contain the pooling results of multiple feature maps, e.g. Zc); σ denotes the second activation function (which may be the Sigmoid function); W31 and W32 denote the weights of the different fully connected layers; and b31 and b32 denote the bias parameters of the different fully connected layers.
After obtaining the weight value corresponding to each feature map, the product of each feature map and its corresponding weight value can be used as the output for feature fusion. For example:

y3 = Sc · xc   (4)

In the above formula, y3 may denote the output of the channel attention module, Sc may denote the weight value corresponding to the c-th feature map, and xc may denote the c-th feature map input to the channel attention module.
It should be noted that the formulas in the embodiments of this specification are only schematic expressions; the above formulas may also be adjusted, transformed or modified according to actual needs, and this is not specifically limited in the embodiments of this specification.

By using the channel attention module to assign different weight values to different feature maps before feature fusion, the embodiments of this specification realize a screening of the feature maps, can increase the influence of important feature maps, and further improve the accuracy of the lump recognition result.
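The channel attention described above (per-channel average pooling, two fully connected layers, Sigmoid activation, then rescaling each feature map) can be sketched as follows; the bottleneck weight shapes and the use of the first activation function between the two fully connected layers are assumptions of this illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    return np.maximum(z, 0.0)

def channel_attention(x, w31, b31, w32, b32):
    """Channel attention sketch.

    x: feature maps of shape (C, H, M)
    w31: (r, C) and w32: (C, r) -- fully connected weights (assumed shapes)
    Returns the reweighted feature maps and the weight vector S.
    """
    # Zc = (1 / (H * M)) * sum over i, j of xc(i, j)  -- average pooling
    z = x.mean(axis=(1, 2))                       # shape (C,)
    # S = sigmoid(W32 . relu(W31 . Z + b31) + b32)
    s = sigmoid(w32 @ relu(w31 @ z + b31) + b32)  # shape (C,)
    # y_c = S_c * x_c : multiply each weight back onto its feature map
    return s[:, None, None] * x, s
```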
On the basis of the above embodiments, in an embodiment of this specification, the breast lump identification model can be constructed in the following manner:
obtaining multiple sample data, where the sample data includes: breast magnetic resonance images and lump annotations in the breast magnetic resonance images;
establishing the breast lump identification model, using the breast magnetic resonance images in the sample data as the input data of the breast lump identification model and the lump annotations in the corresponding breast magnetic resonance images as the output data of the breast lump identification model, and training the breast lump identification model until it meets a preset requirement.
In a specific implementation, breast magnetic resonance images of historical users can be obtained as sample data; the sample data may be breast magnetic resonance images of users diagnosed with breast cancer. The sample data may also include the lump annotations in the obtained breast magnetic resonance images, which serve as training labels; the specific number of sample data can be chosen according to actual needs and is not specifically limited in the embodiments of this specification. In one embodiment of this specification, after multiple breast magnetic resonance images are obtained, the obtained images can be normalized, that is, the pixels of the breast magnetic resonance images in the sample data are processed uniformly, which facilitates subsequent model training. Lump annotation is then performed on the normalized breast magnetic resonance images; the annotation can be done by a professional doctor or according to the diagnostic results of the users, and contents such as the position and size of the lump can be marked.
After the sample data is prepared, the breast lump identification model can be constructed, e.g. the network architecture of the breast lump identification model is built; for the specific network architecture, reference can be made to the description of the above embodiments, which is not repeated here. As shown in Fig. 2, in some embodiments of this specification the lump identification model may include basic convolution modules, feature adaptation modules and channel attention modules, where the basic convolution modules and feature adaptation modules can be understood as the encoding process and the channel attention modules can be understood as the decoding process. The breast lump identification model may also include multiple model parameters, such as the size of the convolution kernels, the number of convolutional layers, etc. After the breast lump identification model is constructed, the breast magnetic resonance images in the sample data can be used as the input data of the model and the lump annotations in the corresponding breast magnetic resonance images as the output data, and model training is performed on the breast lump identification model until it meets a preset requirement, for example: the model training can be considered finished when the output accuracy of the model meets the requirement or the number of training rounds meets the requirement.

The embodiments of this specification construct the breast lump identification model through deep learning training, which enables automatic identification of breast lumps without manual identification and improves the accuracy of breast lump identification.
Breast lump identification in the embodiments of this specification mainly includes two stages: model training and prediction. On the basis of the above embodiments, in one embodiment of this specification, in the training stage, existing breast magnetic resonance data can be used to optimize the designed deep fully convolutional network and learn its parameters; in the test stage, the trained network model is used to analyze new data not seen during training, i.e. the breast magnetic resonance image to be identified. When performing lump identification on a breast magnetic resonance image to be identified, the image parameters (such as the main magnetic field strength and the relaxation times T1 and T2) of the image to be identified and of the breast magnetic resonance images in the training-stage sample data can be compared, to check whether the two image data follow the same distribution, i.e. whether the contrast, signal-to-noise ratio, etc. of the images are consistent. If the breast magnetic resonance image to be identified and the sample data follow the same distribution, lump identification can be performed directly using the trained breast lump identification model. When they do not follow the same distribution, a small amount of new data, i.e. the breast magnetic resonance image to be identified (or images whose parameters are consistent with it), can be used to quickly fine-tune the parameters of the constructed breast lump identification model, and lump identification is then performed on the image to be identified using the adjusted breast lump identification model. By continuously updating and adjusting the breast lump identification model, the applicability of the model is improved and the accuracy of the lump recognition result is further improved.
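The distribution check described above can be sketched by comparing simple intensity statistics of the new image against the training samples; the choice of statistics (mean intensity and standard deviation as a stand-in for contrast/signal-to-noise) and the tolerance are assumptions of this illustration:

```python
import numpy as np

def same_distribution(new_img, sample_imgs, tol=0.2):
    """Return True if the new breast MR image roughly matches the
    training distribution. If False, the model should be fine-tuned
    before prediction, as described above.
    """
    new_mean, new_std = new_img.mean(), new_img.std()
    ref_mean = np.mean([im.mean() for im in sample_imgs])
    ref_std = np.mean([im.std() for im in sample_imgs])
    # Relative drift checks on each statistic (illustrative thresholds).
    mean_ok = abs(new_mean - ref_mean) <= tol * max(abs(ref_mean), 1e-8)
    std_ok = abs(new_std - ref_std) <= tol * max(ref_std, 1e-8)
    return mean_ok and std_ok
```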
The embodiments of this specification propose an asymmetric encoding-decoding backbone structure, design a new feature fusion and screening mechanism, and introduce dense connections. The segmentation method in the embodiments of this specification can better preserve the beneficial effect of useful features on the segmentation result while weakening the effect of useless features. As a result, the performance of the network on magnetic resonance breast lump segmentation can be greatly improved, no subsequent network is needed to further refine the segmentation result, the computational cost is reduced, the image analysis process is accelerated, and doctors can be better assisted in real-time imaging diagnosis.
It should be noted that the breast lump image recognition method in the embodiments of this specification is not limited to identifying breast lumps; it can also be used in other image recognition processes, for example: identifying other focal areas (such as brain tumors). Magnetic resonance images of other body parts can be used to train and construct corresponding identification models, completing the automatic identification of other focal areas.
The above method embodiments in this specification are described in a progressive manner; for identical or similar parts between the embodiments, reference may be made to each other, and each embodiment focuses on its differences from the other embodiments. For related parts, reference is made to the description of the method embodiments.
Based on the breast lump image recognition method described above, one or more embodiments of this specification also provide a breast lump image recognition device. The device may include systems (including distributed systems), software (applications), modules, components, servers, clients, etc. that use the methods of the embodiments of this specification, combined with the necessary hardware for implementation. Based on the same innovative idea, the devices provided in one or more embodiments of this specification are as described in the following embodiments. Since the way the devices solve the problem is similar to the method, for the specific implementation of the devices of the embodiments of this specification, reference may be made to the implementation of the foregoing method, and repeated parts are not described again. As used below, the term "unit" or "module" may refer to a combination of software and/or hardware that realizes a predetermined function. Although the devices described in the following embodiments are preferably realized in software, realization in hardware, or in a combination of software and hardware, is also possible and conceivable.
Specifically, Fig. 3 is a schematic diagram of the module structure of an embodiment of the breast lump image recognition device provided by this specification. As shown in Fig. 3, the breast lump image recognition device provided in this specification includes: an image acquisition module 31 and a lump identification module 32, in which:
the image acquisition module 31 can be used to obtain a breast magnetic resonance image to be identified;
the lump identification module 32 can be used to input the breast magnetic resonance image to be identified into a constructed breast lump identification model and obtain the lump recognition result in the breast magnetic resonance image to be identified;
wherein the lump identification model uses a deep fully convolutional neural network model; the encoding process of the deep fully convolutional neural network model uses the basic convolution modules of a U-shaped convolutional neural network model for feature extraction, and the decoding process of the deep fully convolutional neural network model uses dense connections, unifying the sizes of the feature maps to be fused before fusing them.
The breast lump image recognition device provided by the embodiments of this specification is based on deep learning, combining a U-shaped convolutional neural network model with a densely connected convolutional neural network model to construct an asymmetric encoding-decoding breast lump identification model structure. The breast magnetic resonance image to be identified is then input into the constructed breast lump identification model, and the lump recognition result of the image to be identified can be obtained, realizing automatic identification of breast lumps without manual visual identification and improving the accuracy of breast lump recognition. The breast lump image recognition method provided by this specification can better preserve the beneficial effect of useful features on the segmentation result while weakening the effect of useless features. The performance of the network on magnetic resonance breast lump segmentation can be greatly improved, no subsequent network is needed to further refine the segmentation result, the computational cost is reduced, the image analysis process is accelerated, and doctors can be better assisted in real-time imaging diagnosis.
On the basis of the above embodiments, the basic convolution modules in the U-shaped convolutional neural network model perform feature extraction using the following formula:

y1 = δ(W13 * δ(W12 * δ(W11 * x1 + b11) + b12) + b13)

In the above formula, y1 denotes the extracted feature map, x1 denotes the input breast magnetic resonance image, δ denotes the first activation function, W11, W12 and W13 denote the weights of the different convolutional layers, and b11, b12 and b13 denote the bias parameters of the different convolutional layers.
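The formula above (three stacked convolution-plus-activation layers) can be sketched in NumPy; the single-channel "same"-padded convolution and the scalar biases are simplifying assumptions of this illustration, not the patent's own implementation:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def conv2d(x, w, b):
    """Single-channel 2D convolution with 'same' zero padding
    (deep-learning convention, i.e. cross-correlation)."""
    kh, kw = w.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * w) + b
    return out

def basic_conv_module(x1, layers):
    """y1 = relu(W13 * relu(W12 * relu(W11 * x1 + b11) + b12) + b13).
    `layers` is a list of three (weight, bias) pairs."""
    y = x1
    for w, b in layers:
        y = relu(conv2d(y, w, b))
    return y
```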
The breast lump image recognition device provided by the embodiments of this specification uses U-Net-like basic convolution modules for feature extraction, which can reduce the amount of sample data required and improve the efficiency of data processing.
On the basis of the above embodiments, the deep fully convolutional neural network model includes a feature adaptation module for adjusting the extracted feature map using the following formula:

y2 = δ((W22 * δ(W21 * x2 + b21) + b22) + x2)

In the above formula, y2 denotes the adjusted feature map, x2 denotes the extracted feature map, δ denotes the first activation function, W21 and W22 denote the weights of the different convolutional layers, and b21 and b22 denote the bias parameters of the different convolutional layers.

By using the feature adaptation module in the breast lump identification model, the embodiments of this specification can adjust the feature maps, improve the effect of feature fusion, and further improve the accuracy of breast lump identification.
On the basis of the above embodiments, the deep fully convolutional neural network model further includes a channel attention module for:
assigning different weight values to the extracted feature maps using the deep fully convolutional neural network model;
performing feature fusion according to the weight value corresponding to each feature map.
The embodiments of this specification assign different weight values to different feature maps, increasing the influence of important feature maps on the lump recognition result and reducing the influence of unimportant feature maps.
On the basis of the above embodiments, the channel attention module is specifically used for:
performing average pooling on the extracted feature maps;
connecting the average-pooled feature maps through fully connected layers and activating them with a second activation function, so as to obtain the weight value corresponding to each feature map.
On the basis of the above embodiments, the channel attention module is specifically used to obtain the weight value corresponding to each feature map using the following formula:

Zc = (1 / (H × M)) Σ(i=1..H) Σ(j=1..M) xc(i, j),   S = σ(W32 * δ(W31 * Z + b31) + b32)

In the above formula, Zc denotes the pooling result of the c-th feature map after average pooling, H denotes the height of the c-th feature map, M denotes the width of the c-th feature map, xc denotes the c-th input feature map, S denotes the weight value vector of the feature maps, Z denotes the pooling result vector of the feature maps after pooling, σ denotes the second activation function, W31 and W32 denote the weights of the different fully connected layers, and b31 and b32 denote the bias parameters of the different fully connected layers.
By using the channel attention module to assign different weight values to different feature maps before feature fusion, the embodiments of this specification realize a screening of the feature maps, can increase the influence of important feature maps, and further improve the accuracy of the lump recognition result.
Fig. 4 is a schematic structural diagram of the breast lump image recognition device in another embodiment of this specification. As shown in Fig. 4, on the basis of the above embodiments, the device further includes a model construction module 41 for constructing the breast lump identification model in the following manner:
obtaining multiple sample data, where the sample data includes: breast magnetic resonance images and lump annotations in the breast magnetic resonance images;
establishing the breast lump identification model, using the breast magnetic resonance images in the sample data as the input data of the breast lump identification model and the lump annotations in the corresponding breast magnetic resonance images as the output data of the breast lump identification model, and training the breast lump identification model until it meets a preset requirement.
The embodiments of this specification construct the breast lump identification model through deep learning training, which enables automatic identification of breast lumps without manual identification and improves the accuracy of breast lump identification.
Fig. 5 is a schematic structural diagram of the breast lump image recognition device in yet another embodiment of this specification. As shown in Fig. 5, on the basis of the above embodiments, the device further includes a model adjustment module 51 for:
comparing the image parameters of the breast magnetic resonance image to be identified with those of the breast magnetic resonance images in the sample data;
if the image parameters of the breast magnetic resonance image to be identified and of the breast magnetic resonance images in the sample data are inconsistent, adjusting the breast lump identification model using the breast magnetic resonance image to be identified;
performing lump identification on the breast magnetic resonance image to be identified using the adjusted breast lump identification model.
By continuously updating and adjusting the breast lump identification model, the embodiments of this specification improve the applicability of the model and further improve the accuracy of the lump recognition result.
It should be noted that, according to the description of the method embodiments, the device described above may also include other implementations. For specific implementations, reference may be made to the description of the related method embodiments, which is not repeated here one by one.
The embodiments of this specification also provide a breast lump image recognition processing apparatus, including: at least one processor and a memory for storing processor-executable instructions; when executing the instructions, the processor implements the breast lump image recognition method of the above embodiments, for example:
obtaining a breast magnetic resonance image to be identified;
inputting the breast magnetic resonance image to be identified into a constructed breast lump identification model to obtain the lump recognition result in the breast magnetic resonance image to be identified;
wherein the lump identification model uses a deep fully convolutional neural network model; the encoding process of the deep fully convolutional neural network model uses the basic convolution modules of a U-shaped convolutional neural network model for feature extraction, and the decoding process of the deep fully convolutional neural network model uses dense connections, unifying the sizes of the feature maps to be fused before fusing them.
The storage medium may include a physical device for storing information, typically storing digitized information by electrical, magnetic or optical means. The storage medium may include: devices that store information by electrical means, such as various memories, e.g. RAM and ROM; devices that store information by magnetic means, such as hard disks, floppy disks, magnetic tapes, magnetic core memories, magnetic bubble memories and USB flash drives; and devices that store information by optical means, such as CDs and DVDs. Of course, there are also readable storage media of other forms, such as quantum memories and graphene memories.
It should be noted that, according to the description of the method embodiments, the processing apparatus described above may also include other implementations. For specific implementations, reference may be made to the description of the related method embodiments, which is not repeated here one by one.
The breast lump identification system provided by this specification may be a standalone breast lump identification system, or may be applied in a variety of data analysis and processing systems. The system may include any of the breast lump image recognition devices of the above embodiments. The system may be a standalone server, or may include server clusters, systems (including distributed systems), software (applications), practical operating devices, logic gate devices, quantum computers, etc. that use one or more of the methods or devices of one or more embodiments of this specification, combined with the necessary terminal devices for implementation. The system may include at least one processor and a memory storing computer-executable instructions; when executing the instructions, the processor implements the steps of the method described in any one or more of the above embodiments.
The method embodiments provided by the embodiments of this specification can be executed in a mobile terminal, a computer terminal, a server or a similar computing device. Taking execution on a server as an example, Fig. 6 is a hardware block diagram of a breast lump identification server according to an embodiment of the present application. As shown in Fig. 6, the server 10 may include one or more (only one is shown in the figure) processors 100 (the processor 100 may include, but is not limited to, a processing unit such as a microprocessor MCU or a programmable logic device FPGA), a memory 200 for storing data, and a transmission module 300 for communication functions. Those of ordinary skill in the art will understand that the structure shown in Fig. 6 is only illustrative and does not limit the structure of the above electronic device. For example, the server 10 may also include more or fewer components than shown in Fig. 6, for example other processing hardware such as a database, a multi-level cache or a GPU, or may have a configuration different from that shown in Fig. 6.
The memory 200 can be used to store software programs and modules of application software, such as the program instructions/modules corresponding to the breast lump image recognition method in the embodiments of this specification; by running the software programs and modules stored in the memory 200, the processor 100 executes various functional applications and data processing. The memory 200 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory or other non-volatile solid-state memory. In some examples, the memory 200 may further include memory located remotely from the processor 100, and such remote memory can be connected to the terminal through a network. Examples of the above network include, but are not limited to, the Internet, enterprise intranets, local area networks, mobile communication networks and combinations thereof.
The transmission module 300 is used to receive or send data via a network. Specific examples of the above network may include a wireless network provided by the communication provider of the terminal. In one example, the transmission module 300 includes a network interface controller (NIC), which can be connected to other network devices through a base station so as to communicate with the Internet. In one example, the transmission module 300 may be a radio frequency (RF) module, which is used to communicate with the Internet wirelessly.
The specific embodiments of this specification have been described above. Other embodiments fall within the scope of the appended claims. In some cases, the actions or steps recited in the claims can be performed in an order different from that in the embodiments and still achieve the desired result. In addition, the processes depicted in the drawings do not necessarily require the particular order shown, or a sequential order, to achieve the desired result. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The methods or devices described in the above embodiments provided by this specification can realize their business logic through a computer program recorded on a storage medium; the storage medium can be read and executed by a computer to realize the effects of the solutions described in the embodiments of this specification.
The above breast lump image recognition method or device provided by the embodiments of this specification can be realized by a processor executing corresponding program instructions in a computer, for example: realized on the PC side using the c++ language under a Windows operating system, realized under a Linux system, realized on a smart terminal using programming languages of systems such as Android or iOS, or realized on the basis of the processing logic of a quantum computer, etc.
It should be noted that, according to the descriptions of the related method embodiments, the devices, computer storage media and systems described above in this specification may also include other implementations; for specific implementations, reference may be made to the description of the corresponding method embodiments, which is not repeated here one by one.
All the embodiments in this specification are described in a progressive manner; for identical or similar parts between the embodiments, reference may be made to each other, and each embodiment focuses on its differences from the other embodiments. In particular, for the hardware-plus-program class embodiments, since they are basically similar to the method embodiments, the description is relatively simple, and for related parts reference is made to the description of the method embodiments.
The embodiments of this specification are not limited to what must comply with industry communication standards, standard computer data processing and data storage rules, or the situations described in one or more embodiments of this specification. Embodiments slightly modified on the basis of certain industry standards, or of self-defined approaches, or of the practice described in the embodiments can also achieve implementation effects that are identical, equivalent or similar to those of the above embodiments, or that are foreseeable after variation. Embodiments obtained using these modified or varied data acquisition, storage, judgment and processing methods, etc. may still fall within the scope of the optional embodiments of this specification.
In the 1990s, an improvement of a technology could be clearly distinguished as an improvement in hardware (for example, an improvement to circuit structures such as diodes, transistors and switches) or an improvement in software (an improvement of a method flow). However, with the development of technology, many improvements of method flows today can be regarded as direct improvements of hardware circuit structures. Designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized with hardware entity modules. For example, a programmable logic device (PLD) (such as a field programmable gate array (FPGA)) is such an integrated circuit whose logic function is determined by the user's programming of the device. Designers program by themselves to "integrate" a digital system on a piece of PLD, without needing to ask a chip manufacturer to design and make a dedicated integrated circuit chip. Moreover, nowadays, instead of manually making integrated circuit chips, this programming is mostly realized with "logic compiler" software, which is similar to the software compiler used in program development; the source code before compiling must also be written in a specific programming language, which is called a hardware description language (HDL). There is not only one kind of HDL, but many kinds, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM and RHDL (Ruby Hardware Description Language); at present, VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are the most commonly used. Those skilled in the art should also understand that, by only slightly logically programming a method flow with the above hardware description languages and programming it into an integrated circuit, a hardware circuit realizing the logical method flow can easily be obtained.
A controller can be implemented in any suitable manner. For example, the controller can take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller. Examples of the controller include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicone Labs C8051F320. A memory controller can also be implemented as part of the control logic of a memory. Those skilled in the art also know that, in addition to implementing a controller purely by computer-readable program code, the method steps can be programmed in logic so that the controller realizes the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller can therefore be regarded as a hardware component, and the devices included in it for realizing various functions can also be regarded as structures within the hardware component. A device for realizing various functions can even be regarded both as a software module implementing a method and as a structure within a hardware component.
The systems, devices, modules, or units illustrated in the above embodiments can be implemented by a computer chip or an entity, or by a product having a certain function. A typical implementation device is a computer. Specifically, the computer can be, for example, a personal computer, a laptop computer, a vehicle-mounted human-computer interaction device, a cellular phone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an e-mail device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
Although one or more embodiments of this specification provide method operation steps as described in an embodiment or flowchart, more or fewer operation steps may be included based on conventional or non-inventive means. The order of steps enumerated in an embodiment is only one of many possible execution orders and does not represent the only one. When an actual device or end product executes, the steps can be executed sequentially or in parallel according to the methods shown in the embodiments or drawings (for example, in a parallel-processor or multi-threaded environment, or even a distributed data-processing environment). The terms "include", "comprise", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, product, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, product, or device. Without further limitation, the presence of an element does not preclude the existence of other identical or equivalent elements in the process, method, product, or device that includes it. Words such as "first" and "second" are used to denote names and do not denote any particular order.
For convenience of description, the above apparatus is described in terms of separate modules divided by function. Of course, when one or more embodiments of this specification are implemented, the functions of the modules can be realized in one or more pieces of software and/or hardware, and a module realizing a given function can also be realized by a combination of multiple sub-modules or sub-units. The apparatus embodiments described above are merely illustrative. For example, the division into units is only a division by logical function; other division schemes are possible in actual implementation. For example, multiple units or components can be combined or integrated into another system, or some features can be ignored or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed can be indirect couplings or communication connections through some interfaces, devices, or units, and can be electrical, mechanical, or in other forms.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be realized by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data-processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data-processing device produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions can also be stored in a computer-readable memory capable of directing a computer or another programmable data-processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions can also be loaded onto a computer or another programmable data-processing device, so that a series of operation steps are executed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thereby provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPU), an input/output interface, a network interface, and memory.
The memory may include non-persistent memory in a computer-readable medium, random-access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and can store information by any method or technology. The information can be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage, graphene storage, other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media (transitory media), such as modulated data signals and carrier waves.
Those skilled in the art will understand that one or more embodiments of this specification can be provided as a method, a system, or a computer program product. Accordingly, one or more embodiments of this specification can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, one or more embodiments of this specification can take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
One or more embodiments of this specification can be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, and the like that perform particular tasks or implement particular abstract data types. One or more embodiments of this specification can also be practiced in distributed computing environments, in which tasks are performed by remote processing devices connected through a communication network. In a distributed computing environment, program modules can be located in both local and remote computer storage media, including storage devices.
The embodiments in this specification are described in a progressive manner; identical and similar parts of the embodiments may be cross-referenced, and each embodiment focuses on its differences from the others. The system embodiments in particular are described relatively briefly because they are substantially similar to the method embodiments; for related details, refer to the description of the method embodiments. In the description of this specification, reference to the terms "one embodiment", "some embodiments", "an example", "a specific example", "some examples", and the like means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of this specification. In this specification, schematic statements of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described can be combined in any suitable manner in any one or more embodiments or examples. In addition, provided they do not conflict with each other, those skilled in the art can combine the different embodiments or examples described in this specification, and the features of different embodiments or examples.
The foregoing descriptions are merely examples of one or more embodiments of this specification and are not intended to limit them. To those skilled in the art, one or more embodiments of this specification can have various changes and variations. Any modification, equivalent replacement, or improvement made within the spirit and principles of this specification shall be included within the scope of the claims.
Claims (18)
1. A breast lump image recognition method, characterized by comprising:
obtaining a breast magnetic resonance image to be recognized;
inputting the breast magnetic resonance image to be recognized into a constructed breast lump recognition model to obtain a lump recognition result for the breast magnetic resonance image to be recognized;
wherein the lump recognition model uses a deep fully convolutional neural network model, the encoding process of the deep fully convolutional neural network model performs feature extraction using the basic convolution module of a U-shaped convolutional neural network model, and the decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused before fusing them.
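The "unify sizes, then densely fuse" decoding step of claim 1 can be sketched as follows. This is a minimal NumPy illustration, not the patent's implementation: nearest-neighbour upsampling and channel-wise concatenation are assumed as the resizing and fusion operations.

```python
import numpy as np

def upsample_nn(x, factor):
    """Nearest-neighbour upsampling of a (C, H, W) feature map by an integer factor."""
    return x.repeat(factor, axis=1).repeat(factor, axis=2)

def dense_fuse(feature_maps, target_hw):
    """Resize every feature map to a common (target_hw x target_hw) size,
    then concatenate along the channel axis (dense skip-connection fusion)."""
    resized = []
    for f in feature_maps:
        h = f.shape[1]
        assert target_hw % h == 0, "sketch assumes integer upsampling factors"
        resized.append(upsample_nn(f, target_hw // h))
    return np.concatenate(resized, axis=0)
```

With a 4x4 map of two channels and an 8x8 map of three channels, fusing at size 8 yields a single (5, 8, 8) tensor that a subsequent convolution can consume.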
2. The method of claim 1, characterized in that the basic convolution module of the U-shaped convolutional neural network model performs feature extraction using the following formula:
y1 = δ(W13 * δ(W12 * δ(W11 * x1 + b11) + b12) + b13)
where y1 denotes the extracted feature map, x1 denotes the input breast magnetic resonance image, δ denotes the first activation function, W11, W12, W13 denote the weights of the respective convolutional layers, and b11, b12, b13 denote the offset parameters of the respective convolutional layers.
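The formula of claim 2 is three stacked convolution-plus-activation layers. A minimal NumPy sketch follows; the 3x3 kernel size, 'same' padding, and ReLU as the first activation function δ are assumptions, since the claim does not fix them.

```python
import numpy as np

def conv2d(x, w, b):
    """'Same'-padded 2D cross-correlation: x is (C_in, H, W),
    w is (C_out, C_in, kh, kw), b is (C_out,)."""
    c_out, c_in, kh, kw = w.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((0, 0), (ph, ph), (pw, pw)))
    h, wd = x.shape[1], x.shape[2]
    y = np.empty((c_out, h, wd))
    for o in range(c_out):
        for i in range(h):
            for j in range(wd):
                y[o, i, j] = np.sum(xp[:, i:i + kh, j:j + kw] * w[o]) + b[o]
    return y

def relu(x):
    # stands in for the first activation function delta (an assumption)
    return np.maximum(x, 0.0)

def basic_conv_module(x, params):
    """y1 = delta(W13 * delta(W12 * delta(W11 * x1 + b11) + b12) + b13)."""
    (w1, b1), (w2, b2), (w3, b3) = params
    return relu(conv2d(relu(conv2d(relu(conv2d(x, w1, b1)), w2, b2)), w3, b3))
```

The nesting mirrors the claim exactly: each layer's output feeds the next convolution, and the final activation produces the extracted feature map.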
3. The method of claim 1, characterized in that the method further comprises adjusting the extracted feature map with the deep fully convolutional neural network model using the following formula:
y2 = δ((W22 * δ(W21 * x2 + b21) + b22) + x2)
where y2 denotes the adjusted feature map, x2 denotes the extracted feature map, δ denotes the first activation function, W21, W22 denote the weights of the respective convolutional layers, and b21, b22 denote the offset parameters of the respective convolutional layers.
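The adjustment formula of claim 3 is a residual (skip-connection) block: two convolutions are applied and the input is added back before the final activation. A minimal NumPy sketch, using 1x1 convolutions for brevity (the claim does not specify kernel size) and ReLU as δ (an assumption):

```python
import numpy as np

def relu(x):
    # stands in for the first activation function delta (an assumption)
    return np.maximum(x, 0.0)

def conv1x1(x, w, b):
    """1x1 convolution over channels: x is (C_in, H, W), w is (C_out, C_in)."""
    return np.einsum('oc,chw->ohw', w, x) + b[:, None, None]

def feature_adjust(x, w21, b21, w22, b22):
    """y2 = delta((W22 * delta(W21 * x2 + b21) + b22) + x2):
    residual adjustment of an extracted feature map."""
    return relu(conv1x1(relu(conv1x1(x, w21, b21)), w22, b22) + x)
```

Note the design choice the formula encodes: with all weights at zero the block reduces to y2 = δ(x2), so the adjustment can only refine, never erase, the incoming features.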
4. The method of claim 1, characterized in that the method further comprises:
assigning different weight values to the extracted feature maps with the deep fully convolutional neural network model;
performing feature fusion according to the weight value corresponding to each feature map.
5. The method of claim 4, characterized in that assigning different weight values to the extracted feature maps comprises:
performing average pooling on the extracted feature maps;
connecting the average-pooled feature maps with a fully connected layer and activating with the second activation function to obtain the weight value corresponding to each feature map.
6. The method of claim 5, characterized in that assigning different weight values to the extracted feature maps comprises obtaining the weight value corresponding to each feature map using the following formulas:
Zc = (1 / (H · M)) Σi Σj xc(i, j)
S = σ(W32 * δ(W31 * Z + b31) + b32)
where Zc denotes the average-pooling result for the c-th feature map, H denotes the height of the c-th feature map, M denotes the width of the c-th feature map, xc denotes the c-th input feature map, S denotes the weight-value vector of the feature maps, Z denotes the vector of pooling results, σ denotes the second activation function, W32, W31 denote the weights of the respective fully connected layers, and b31, b32 denote the offset parameters of the respective fully connected layers.
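Claims 5-6 describe a squeeze-and-excitation style channel attention: global average pooling per feature map, two fully connected layers, and the second activation function σ producing one weight per channel. A minimal NumPy sketch under the assumptions that σ is the sigmoid and the inner activation is ReLU (neither is fixed by the claims):

```python
import numpy as np

def channel_attention(x, w31, b31, w32, b32):
    """x: feature maps of shape (C, H, W).
    Z: global average pooling over each map; S: per-channel weights in (0, 1).
    Returns the re-weighted maps and the weight vector S."""
    z = x.mean(axis=(1, 2))                          # Z_c = (1/(H*M)) sum of x_c
    hidden = np.maximum(w31 @ z + b31, 0.0)          # first FC layer + inner activation
    s = 1.0 / (1.0 + np.exp(-(w32 @ hidden + b32)))  # second FC layer + sigma (sigmoid)
    return s[:, None, None] * x, s
```

Scaling each map by its learned weight S_c lets the network emphasize informative channels before the feature fusion of claim 4.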
7. The method of claim 1, characterized in that the breast lump recognition model is constructed by the following method:
obtaining multiple pieces of sample data, the sample data comprising breast magnetic resonance images and the lump annotations in the breast magnetic resonance images;
establishing the breast lump recognition model;
taking the breast magnetic resonance images in the sample data as input data of the breast lump recognition model and the lump annotations in the corresponding breast magnetic resonance images as output data of the breast lump recognition model, and training the breast lump recognition model until it meets a preset requirement.
8. The method of claim 7, characterized in that the method further comprises:
comparing the image parameters of the breast magnetic resonance image to be recognized with those of the breast magnetic resonance images in the sample data;
if the image parameters of the breast magnetic resonance image to be recognized are inconsistent with those of the breast magnetic resonance images in the sample data, adjusting the breast lump recognition model using the breast magnetic resonance image to be recognized;
performing lump recognition on the breast magnetic resonance image to be recognized using the adjusted breast lump recognition model.
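The adjustment trigger of claim 8 is a consistency check on acquisition parameters before inference. A sketch of that check; the parameter names (`resolution`, `voxel_spacing`, `sequence`) are hypothetical illustrations, as the claim does not enumerate which image parameters are compared:

```python
def needs_adjustment(image_params, sample_params):
    """Compare the parameters of a new image against those of the training
    samples; any mismatch means the model should be adjusted (fine-tuned)
    before recognizing the new image. Parameter keys are hypothetical."""
    keys = ("resolution", "voxel_spacing", "sequence")
    return any(image_params.get(k) != sample_params.get(k) for k in keys)
```

In use, a True result would route the new image through a model-adjustment step first, and only then through lump recognition.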
9. A breast lump image recognition device, characterized by comprising:
an image acquisition module for obtaining a breast magnetic resonance image to be recognized;
a lump recognition module for inputting the breast magnetic resonance image to be recognized into a constructed breast lump recognition model to obtain a lump recognition result for the breast magnetic resonance image to be recognized;
wherein the lump recognition model uses a deep fully convolutional neural network model, the encoding process of the deep fully convolutional neural network model performs feature extraction using the basic convolution module of a U-shaped convolutional neural network model, and the decoding process of the deep fully convolutional neural network model uses dense connections to unify the sizes of the feature maps to be fused before fusing them.
10. The device of claim 9, characterized in that the basic convolution module of the U-shaped convolutional neural network model performs feature extraction using the following formula:
y1 = δ(W13 * δ(W12 * δ(W11 * x1 + b11) + b12) + b13)
where y1 denotes the extracted feature map, x1 denotes the input breast magnetic resonance image, δ denotes the first activation function, W11, W12, W13 denote the weights of the respective convolutional layers, and b11, b12, b13 denote the offset parameters of the respective convolutional layers.
11. The device of claim 9, characterized in that the deep fully convolutional neural network model includes a feature adaptation module for adjusting the extracted feature map using the following formula:
y2 = δ((W22 * δ(W21 * x2 + b21) + b22) + x2)
where y2 denotes the adjusted feature map, x2 denotes the extracted feature map, δ denotes the first activation function, W21, W22 denote the weights of the respective convolutional layers, and b21, b22 denote the offset parameters of the respective convolutional layers.
12. The device of claim 9, characterized in that the deep fully convolutional neural network model further includes a channel attention module for:
assigning different weight values to the extracted feature maps with the deep fully convolutional neural network model;
performing feature fusion according to the weight value corresponding to each feature map.
13. The device of claim 12, characterized in that the channel attention module is specifically configured to:
perform average pooling on the extracted feature maps;
connect the average-pooled feature maps with a fully connected layer and activate with the second activation function to obtain the weight value corresponding to each feature map.
14. The device of claim 13, characterized in that the channel attention module is specifically configured to obtain the weight value corresponding to each feature map using the following formulas:
Zc = (1 / (H · M)) Σi Σj xc(i, j)
S = σ(W32 * δ(W31 * Z + b31) + b32)
where Zc denotes the average-pooling result for the c-th feature map, H denotes the height of the c-th feature map, M denotes the width of the c-th feature map, xc denotes the c-th input feature map, S denotes the weight-value vector of the feature maps, Z denotes the vector of pooling results, σ denotes the second activation function, W32, W31 denote the weights of the respective fully connected layers, and b31, b32 denote the offset parameters of the respective fully connected layers.
15. The device of claim 9, characterized in that the device further includes a model construction module that constructs the breast lump recognition model by the following method:
obtaining multiple pieces of sample data, the sample data comprising breast magnetic resonance images and the lump annotations in the breast magnetic resonance images;
establishing the breast lump recognition model, taking the breast magnetic resonance images in the sample data as input data of the breast lump recognition model and the lump annotations in the corresponding breast magnetic resonance images as output data of the breast lump recognition model, and training the breast lump recognition model until it meets a preset requirement.
16. The device of claim 15, characterized in that the device further includes a model adjustment module for:
comparing the image parameters of the breast magnetic resonance image to be recognized with those of the breast magnetic resonance images in the sample data;
if the image parameters of the breast magnetic resonance image to be recognized are inconsistent with those of the breast magnetic resonance images in the sample data, adjusting the breast lump recognition model using the breast magnetic resonance image to be recognized;
performing lump recognition on the breast magnetic resonance image to be recognized using the adjusted breast lump recognition model.
17. A breast lump image recognition processing apparatus, characterized by comprising at least one processor and a memory storing processor-executable instructions, wherein the processor, when executing the instructions, implements the method of any one of claims 1-8.
18. A breast lump image recognition system, characterized by comprising at least one processor and a memory storing processor-executable instructions, wherein the processor, when executing the instructions, implements the method of any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811525188.6A CN109685077A (en) | 2018-12-13 | 2018-12-13 | A kind of breast lump image-recognizing method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811525188.6A CN109685077A (en) | 2018-12-13 | 2018-12-13 | A kind of breast lump image-recognizing method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109685077A true CN109685077A (en) | 2019-04-26 |
Family
ID=66186612
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811525188.6A Pending CN109685077A (en) | 2018-12-13 | 2018-12-13 | A kind of breast lump image-recognizing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109685077A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107169974A (en) * | 2017-05-26 | 2017-09-15 | 中国科学技术大学 | It is a kind of based on the image partition method for supervising full convolutional neural networks more |
US20180068436A1 (en) * | 2016-09-07 | 2018-03-08 | International Business Machines Corporation | Multiple algorithm lesion segmentation |
CN108346145A (en) * | 2018-01-31 | 2018-07-31 | 浙江大学 | The recognition methods of unconventional cell in a kind of pathological section |
CN108805188A (en) * | 2018-05-29 | 2018-11-13 | 徐州工程学院 | A kind of feature based recalibration generates the image classification method of confrontation network |
CN108986067A (en) * | 2018-05-25 | 2018-12-11 | 上海交通大学 | Pulmonary nodule detection method based on cross-module state |
2018-12-13: application CN201811525188.6A filed (CN109685077A, status: Pending)
Non-Patent Citations (2)
Title |
---|
HUI SUN et al.: "AUNet: Breast Mass Segmentation of Whole Mammograms", arXiv *
XIAOMENG LI et al.: "H-DenseUNet: Hybrid Densely Connected UNet for Liver and Tumor Segmentation from CT Volumes", arXiv *
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110349166A (en) * | 2019-06-11 | 2019-10-18 | 东软医疗系统股份有限公司 | A kind of blood vessel segmentation method, device and equipment being directed to retinal images |
CN110738231A (en) * | 2019-07-25 | 2020-01-31 | 太原理工大学 | Method for classifying mammary gland X-ray images by improving S-DNet neural network model |
CN110599451B (en) * | 2019-08-05 | 2023-01-20 | 平安科技(深圳)有限公司 | Medical image focus detection and positioning method, device, equipment and storage medium |
CN110599451A (en) * | 2019-08-05 | 2019-12-20 | 平安科技(深圳)有限公司 | Medical image focus detection positioning method, device, equipment and storage medium |
US11961227B2 (en) | 2019-08-05 | 2024-04-16 | Ping An Technology (Shenzhen) Co., Ltd. | Method and device for detecting and locating lesion in medical image, equipment and storage medium |
CN111105347B (en) * | 2019-11-19 | 2020-11-13 | 贝壳找房(北京)科技有限公司 | Method, device and storage medium for generating panoramic image with depth information |
US11055835B2 (en) | 2019-11-19 | 2021-07-06 | Ke.com (Beijing) Technology, Co., Ltd. | Method and device for generating virtual reality data |
CN111105347A (en) * | 2019-11-19 | 2020-05-05 | 贝壳技术有限公司 | Method, device and storage medium for generating panoramic image with depth information |
US11721006B2 (en) | 2019-11-19 | 2023-08-08 | Realsee (Beijing) Technology Co., Ltd. | Method and device for generating virtual reality data |
CN111311547A (en) * | 2020-01-20 | 2020-06-19 | 北京航空航天大学 | Ultrasonic image segmentation device and ultrasonic image segmentation method |
CN111292299A (en) * | 2020-01-21 | 2020-06-16 | 长沙理工大学 | Mammary gland tumor identification method and device and storage medium |
CN111339862A (en) * | 2020-02-17 | 2020-06-26 | 中国地质大学(武汉) | Remote sensing scene classification method and device based on channel attention mechanism |
CN111415741A (en) * | 2020-03-05 | 2020-07-14 | 北京深睿博联科技有限责任公司 | Breast X-ray image classification model training method based on implicit appearance learning |
CN111415741B (en) * | 2020-03-05 | 2023-09-26 | 北京深睿博联科技有限责任公司 | Mammary gland X-ray image classification model training method based on implicit apparent learning |
CN111583210A (en) * | 2020-04-29 | 2020-08-25 | 北京小白世纪网络科技有限公司 | Automatic breast cancer image identification method based on convolutional neural network model integration |
CN112201328A (en) * | 2020-10-09 | 2021-01-08 | 浙江德尚韵兴医疗科技有限公司 | Breast mass segmentation method based on cross attention mechanism |
CN112201328B (en) * | 2020-10-09 | 2022-06-21 | 浙江德尚韵兴医疗科技有限公司 | Breast mass segmentation method based on cross attention mechanism |
CN113449785B (en) * | 2021-06-18 | 2022-08-05 | 浙江大学 | Eyelid tumor digital pathological section image multi-classification method based on deep learning |
CN113449785A (en) * | 2021-06-18 | 2021-09-28 | 浙江大学 | Eyelid tumor digital pathological section image multi-classification method based on deep learning |
CN113688822A (en) * | 2021-09-07 | 2021-11-23 | 河南工业大学 | Time sequence attention mechanism scene image identification method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109685077A (en) | A kind of breast lump image-recognizing method and device | |
CN108596882B (en) | Pathological picture recognition method and device | |
WO2020118618A1 (en) | Mammary gland mass image recognition method and device | |
CN111161275B (en) | Method and device for segmenting target object in medical image and electronic equipment | |
CN111488921B (en) | Intelligent analysis system and method for panoramic digital pathological image | |
Tang et al. | E²Net: An edge enhanced network for accurate liver and tumor segmentation on CT scans | |
CN110263656B (en) | Cancer cell identification method, device and system | |
CN110728674B (en) | Image processing method and device, electronic equipment and computer readable storage medium | |
AU2017331352B2 (en) | Mapping of breast arterial calcifications | |
CN109376756B (en) | System, computer device and storage medium for automatically identifying lymph node transferred from upper abdomen based on deep learning | |
CN109241967A (en) | Thyroid ultrasound automatic image recognition system, computer equipment, storage medium based on deep neural network | |
TW202042181A (en) | Method, device and electronic equipment for depth model training and storage medium thereof | |
CN109948680B (en) | Classification method and system for medical record data | |
CN110047068A (en) | MRI brain tumor dividing method and system based on pyramid scene analysis network | |
CN108629772A (en) | Image processing method and device, computer equipment and computer storage media | |
CN106682127A (en) | Image searching system and method | |
CN113705595A (en) | Method, device and storage medium for predicting degree of abnormal cell metastasis | |
US20210052918A1 (en) | Clinical target volume delineation method and electronic device | |
Karunakar et al. | An unparagoned application for red blood cell counting using marker controlled watershed algorithm for android mobile | |
CN109636780A (en) | Breast density automatic grading method and device | |
CN109816665A (en) | Fast segmentation method and device for optical coherence tomography images | |
CN113393445B (en) | Breast cancer image determination method and system | |
CN114757906A (en) | Method for detecting leucocytes in microscopic image based on DETR model | |
Zhang et al. | State-of-the-Art and Challenges in Pancreatic CT Segmentation: A Systematic Review of U-Net and Its Variants | |
CN110706209B (en) | Method for positioning tumor in brain magnetic resonance image of grid network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20190426 |