CN109872307B - Method for detecting tumor in biological tissue image, corresponding device and medium - Google Patents

Method for detecting tumor in biological tissue image, corresponding device and medium

Info

Publication number
CN109872307B
CN109872307B (application CN201910092954.2A)
Authority
CN
China
Prior art keywords
image
target
tumor
typing
biological tissue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910092954.2A
Other languages
Chinese (zh)
Other versions
CN109872307A (en
Inventor
江铖
田宽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201910092954.2A priority Critical patent/CN109872307B/en
Priority to CN201910684650.5A priority patent/CN110428405A/en
Publication of CN109872307A publication Critical patent/CN109872307A/en
Application granted granted Critical
Publication of CN109872307B publication Critical patent/CN109872307B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06N 3/02 Neural networks; G06N 3/08 Learning methods
    • G06N 3/126 Evolutionary algorithms, e.g. genetic algorithms or genetic programming
    • G06T 7/0012 Biomedical image inspection
    • G06T 7/11 Region-based segmentation
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G06T 7/155 Segmentation; Edge detection involving morphological operators
    • G06T 7/187 Segmentation; Edge detection involving region growing; region merging; connected component labelling
    • G06T 2207/20028 Bilateral filtering
    • G06T 2207/20084 Artificial neural networks [ANN]
    • G06T 2207/30068 Mammography; Breast
    • G06T 2207/30096 Tumor; Lesion

Abstract

The application discloses a method for detecting a tumor in a biological tissue image, comprising: acquiring a target biological tissue image; determining the typing type of the target biological tissue image; determining a target object in the target biological tissue image and a probability that the target object is a tumor; comparing the probability that the target object is a tumor with a probability threshold corresponding to the typing type of the target biological tissue image, wherein biological tissue images of different typing types correspond to different probability thresholds; and if the probability that the target object is a tumor is greater than the probability threshold corresponding to the typing type of the target biological tissue image, detecting that the target object is a tumor. With the technical scheme of the application, when a biological tissue image, for example a breast image, is detected, the typing type of the biological tissue image can be determined, and biological tissues of different typing types use different tumor probability thresholds, so that the accuracy of tumor detection is improved.

Description

Method for detecting tumor in biological tissue image, corresponding device and medium
Technical Field
The application relates to the field of computer technology, and in particular to a method for detecting a tumor in a biological tissue image, a method for training a model, and corresponding devices, equipment and media.
Background
Breast molybdenum-target images (mammograms) are widely used in early screening for breast cancer. A tumor is an important local feature for judging whether the breast is normal; in particular, locating suspected malignant tumors gives doctors a better basis for judging benignity and malignancy, and is one of the most important clues for diagnosing breast cancer.
Tumors vary in morphology and size and often exhibit low contrast. Traditional tumor detection techniques mainly perform supervised detection based on prior characteristics of the tumor, or perform unsupervised segmentation using the separability between the tumor and other tissues, and often face the problem of difficult tumor localization.
In the prior art, a suspicious region is first segmented using a segmentation algorithm, and tumors are then screened according to the shape of the suspicious region, so the screening accuracy is very low.
Disclosure of Invention
The embodiment of the application provides a method for detecting a tumor in a biological tissue image, which can improve the accuracy of detecting the tumor in the biological tissue image. The embodiment of the application also provides a corresponding device and a storage medium.
A first aspect of the present application provides a method of detecting a tumor in an image of biological tissue, comprising:
acquiring a target biological tissue image;
determining a typing type of the target biological tissue image;
determining a target object in the target biological tissue image and a probability that the target object is a tumor;
comparing the probability that the target object is the tumor with probability threshold values corresponding to the typing types of the target biological tissue images, wherein the probability threshold values corresponding to the biological tissue images of different typing types are different;
and if the probability that the target object is the tumor is greater than the probability threshold corresponding to the typing type of the target biological tissue image, detecting that the target object is the tumor.
In one possible implementation, the biological tissue image is a breast image, and the determining the typing type of the target biological tissue image may include:
determining the typing type of the target breast image through a target breast typing model, wherein the target breast typing model is obtained by training on a plurality of breast images of different typing types and the typing type information of each breast image.
In one possible implementation, the method for detecting a tumor in a breast image may further include:
marking the detected mass;
and outputting the breast image containing the mark.
In one possible implementation, the different typing types include at least two of the following typing types: the fat type, the few-gland type, the multi-gland type and the dense type;
the probability threshold corresponding to the fat type is a, the probability threshold corresponding to the few-gland type is b, the probability threshold corresponding to the multi-gland type is c, and the probability threshold corresponding to the dense type is d, where a, b, c and d are all greater than 0 and a < b < c < d.
In one possible implementation, the determining the target object in the target biological tissue image and the probability that the target object is a tumor may include:
preprocessing and segmenting the target breast image to determine segmented sub-images;
inputting the segmentation sub-images into a classification model, and determining a target sub-object contained in each segmentation sub-image, wherein the target sub-object is contained in the target object;
determining a probability of the target object being a tumor from each target sub-object.
In one possible implementation, the determining, according to each target sub-object, the probability that the target object is a tumor may include:
combining the mutually overlapped areas in each target sub-object;
and determining the probability that the target object is the tumor according to the set of the merged target sub-objects.
A second aspect of the present application provides a method of training a model, comprising:
acquiring a sample image set, wherein the sample image set comprises a plurality of biological tissue images with different typing types and the typing type information of each biological tissue image;
training an initial biological tissue typing model through the set of images to determine reference parameters of the initial biological tissue typing model;
inputting the reference parameters into the initial biological tissue typing model to determine a target biological tissue typing model, wherein the target biological tissue typing model is used for determining the typing type of the biological tissue image.
In one possible implementation, the training an initial breast typing model through the image set to determine reference parameters of the initial breast typing model may include:
extracting characteristic information of each mammary gland image;
taking the characteristic information of each mammary gland image and the typing type information corresponding to the mammary gland image as a group of training parameters;
and training the initial breast typing model through a group of training parameters corresponding to each of the plurality of breast images so as to determine the reference parameters of the initial breast typing model.
A third aspect of the present application provides an apparatus for detecting a tumor in an image of a biological tissue, comprising:
an acquisition unit for acquiring an image of a target biological tissue;
a first determination unit configured to determine a type of classification of the target biological tissue image acquired by the acquisition unit;
a second determination unit configured to determine a target object in the target biological tissue image acquired by the acquisition unit and a probability that the target object is a tumor;
a comparison unit, configured to compare the probability that the target object determined by the second determination unit is a tumor with a probability threshold corresponding to a type of the target biological tissue image determined by the first determination unit, where the probability thresholds corresponding to biological tissue images of different types are different;
and the detection unit is used for detecting that the target object is the tumor if the probability that the target object is the tumor compared by the comparison unit is greater than the probability threshold corresponding to the type of the target biological tissue image.
In a possible implementation manner, the first determining unit is configured to determine, when the biological tissue image is a breast image, a typing type of the target breast image through a target breast typing model, where the target breast typing model is obtained by training a plurality of breast images of different typing types and information of the typing type of each breast image.
In one possible implementation manner, the apparatus may further include:
a marking unit for marking the detected tumor;
and the output unit is used for outputting the mammary gland image marked by the marking unit.
In one possible implementation, the different typing types include at least two of the following typing types: the fat type, the few-gland type, the multi-gland type and the dense type;
the probability threshold corresponding to the fat type is a, the probability threshold corresponding to the few-gland type is b, the probability threshold corresponding to the multi-gland type is c, and the probability threshold corresponding to the dense type is d, where a, b, c and d are all greater than 0 and a < b < c < d.
In a possible implementation manner, the second determining unit is configured to:
preprocessing and segmenting the target breast image to determine segmented sub-images;
inputting the segmentation sub-images into a classification model, and determining a target sub-object contained in each segmentation sub-image, wherein the target sub-object is contained in the target object;
determining a probability of the target object being a tumor from each target sub-object.
In a possible implementation manner, the second determining unit is configured to:
combining the mutually overlapped areas in each target sub-object;
and determining the probability that the target object is the tumor according to the set of the merged target sub-objects.
A fourth aspect of the present application provides an apparatus for training a model, comprising:
an acquisition unit configured to acquire a sample image set including a plurality of biological tissue images of different typing types and typing type information of each biological tissue image;
a training unit for training an initial biological tissue typing model through the image set to determine reference parameters of the initial biological tissue typing model;
a determination unit for inputting the reference parameter into the initial biological tissue typing model to determine a target biological tissue typing model for determining a typing type of the biological tissue image.
In a possible implementation manner, the training unit is configured to: when the biological tissue image is a breast image,
extracting characteristic information of each mammary gland image;
taking the characteristic information of each mammary gland image and the typing type information corresponding to the mammary gland image as a group of training parameters;
and training the initial breast typing model through a group of training parameters corresponding to each of the plurality of breast images so as to determine the reference parameters of the initial breast typing model.
A fifth aspect of embodiments of the present application provides a computer device, including: an input/output (I/O) interface, a processor, and a memory having program instructions stored therein;
the processor is configured to execute program instructions stored in the memory to perform the method according to the first aspect or any one of the possible implementations of the first aspect.
A sixth aspect of embodiments of the present application provides a computer device, including: an input/output (I/O) interface, a processor, and a memory having program instructions stored therein;
the processor is configured to execute program instructions stored in the memory to perform the method according to the second aspect or any one of the possible implementations of the second aspect.
A seventh aspect of embodiments of the present application provides a computer-readable storage medium, including instructions that, when executed on a computer device, cause the computer device to perform the method according to the first aspect or any one of the possible implementation manners of the first aspect.
An eighth aspect of embodiments of the present application provides a computer-readable storage medium, which includes instructions that, when executed on a computer device, cause the computer device to perform the method according to any one of the possible implementation manners of the second aspect or the second aspect.
A ninth aspect of the present application provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of the first aspect or any one of the possible implementations of the first aspect.
A tenth aspect of the present application provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of the second aspect or any one of the possible implementations of the second aspect.
An eleventh aspect of the present application provides a medical image inspection system including an image scanning apparatus and an image processing apparatus;
the image scanning device is used for scanning a medical image and sending the medical image to the image processing device;
an image processing apparatus for performing the method of any one of the first aspects or performing the method of any one of the second aspects.
According to the scheme provided by the embodiment of the application, when the biological tissue image is detected, the typing type of the biological tissue image can be detected, and different tumor probability thresholds exist in biological tissues of different typing types, so that the tumor detection accuracy is improved.
Drawings
FIG. 1 is a schematic diagram of an example of a scenario for training a target breast typing model in an embodiment of the present application;
FIG. 2 is a schematic diagram of an embodiment of a method for training a model in an embodiment of the present application;
FIG. 3 is a schematic diagram of a scene for detecting a tumor in a breast image according to an embodiment of the present application;
FIG. 4 is a schematic diagram of another example of a scene for detecting a tumor in a breast image according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an embodiment of a method for detecting a tumor in a breast image according to an embodiment of the present application;
FIG. 6 is a schematic diagram of another embodiment of a method for detecting a tumor in a breast image according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an embodiment of an apparatus for training a model in an embodiment of the present application;
FIG. 8 is a schematic diagram of an embodiment of an apparatus for detecting a tumor in a breast image according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of another embodiment of an apparatus for detecting a tumor in a breast image according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an embodiment of a computer device in the embodiment of the present application.
Detailed Description
Embodiments of the present application will now be described with reference to the accompanying drawings, and it is to be understood that the described embodiments are merely illustrative of some, but not all, embodiments of the present application. As can be known to those skilled in the art, with the development of technology and the emergence of new scenarios, the technical solution provided in the embodiments of the present application is also applicable to similar technical problems.
The embodiment of the application provides a method for detecting a tumor in a biological tissue image, which can improve the accuracy of detecting the tumor in the biological tissue image. The embodiment of the application also provides a corresponding device and a storage medium. The following are detailed below.
With the development of artificial intelligence technology, a machine can assist in judging whether a biological tissue image, such as a breast image, contains a tumor. The embodiments of the application consider that the amount of glandular tissue differs from breast to breast: some breasts have more glands and some have fewer, and the amount of glandular tissue affects the judgment of tumors in a breast. Therefore, in the embodiments of the application, when a tumor in a breast image is determined, the typing type of the breast image is taken into account, and the tumor condition in the breast image is determined in combination with that typing type, so that the accuracy of tumor detection can be improved. The typing type of a breast image can be determined by training a breast typing model: a breast image is input into the breast typing model, and its typing type is output. The breast typing model may be a deep neural network model and can be obtained by training on a large number of sample breast images of different typing types. Of course, the breast image is only used as an example in the embodiments of the present application; other biological tissue images that can be classified by typing type also belong to the solutions claimed in the embodiments of the present application.
The following describes a process of training a breast typing model in the embodiment of the present application with reference to a scene diagram of the training model of fig. 1.
As shown in fig. 1, a scenario embodiment of the training model provided in the embodiment of the present application may include: the computer device 10 obtains from the database 20 a sample image set comprising a plurality of breast images of different typing types and the typing type information of each breast image. The data in the sample image set may be collected in advance, and the typing type information of each breast image may be annotated by experts in the relevant field. In the embodiment of the present application, breast images can be divided into the following typing types: the fat type, the few-gland type, the multi-gland type and the dense type. Of course, it should be noted that the embodiments of the present application provide an idea of classifying breast images by typing type; other related schemes that classify breast images by typing type, even if the division or the names of the typing types differ from those of the present application, are within the scope of the claims of the embodiments of the present application as far as the classification of breast images by typing type is concerned.
An initial breast typing model is configured on the computer device 10, and the computer device 10 can train the initial breast typing model through the image set to determine a reference parameter of the initial breast typing model; inputting the reference parameters into the initial breast typing model to determine a target breast typing model, wherein the target breast typing model is used for determining the typing type of the breast image.
In one possible implementation, the initial breast typing model may be understood as a deep neural network function whose coefficients are unknown; the coefficients in this unknown state may be understood as the reference parameters of the initial breast typing model. The feature information of each breast image may be understood as a set of input parameters, the corresponding typing type information may be understood as the corresponding output parameter, and the relationship between the input parameters and the output parameter may be expressed as y = f(x1, x2, ..., xN). By inputting multiple groups of input parameters and output parameters into the initial breast typing model, the unknown reference parameters can be determined, the training of the model is completed, and the target breast typing model is obtained. Thus, after a breast image is input into the target breast typing model, the model extracts the feature information (x1, x2, ..., xN) of the image and outputs the corresponding output parameter y, thereby detecting the typing type of the breast image.
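As a minimal, hedged illustration of this mapping (not the patent's actual network), the following Python sketch fits a small classifier y = f(x1, ..., xN) from per-image feature vectors to one of the four typing labels; the feature dimension, layer sizes and optimizer are assumptions for the example only.

```python
# Minimal sketch: fit y = f(x1, ..., xN), mapping assumed per-image feature
# vectors to one of four typing labels. The fitted weights play the role of the
# unknown "reference parameters" of the initial typing model.
import tensorflow as tf

NUM_FEATURES = 64   # assumed feature dimension (illustrative only)
NUM_TYPES = 4       # fat, few-gland, multi-gland, dense

model = tf.keras.Sequential([
    tf.keras.Input(shape=(NUM_FEATURES,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_TYPES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Fitting on (feature-vector, typing-label) pairs determines the unknown
# coefficients, after which model.predict(features) returns the typing output y:
# model.fit(features, labels, epochs=..., batch_size=...)
```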
Accordingly, in combination with the scenario example shown in fig. 1, as shown in fig. 2, an embodiment of the method for training a model provided in the embodiment of the present application may include:
101. a sample image set is acquired, the sample image set including a plurality of biological tissue images of different typing types, and typing type information of each biological tissue image.
102. Training an initial biological tissue typing model through the set of images to determine reference parameters of the initial biological tissue typing model.
In a possible implementation manner, if the biological tissue image is a breast image, step 102 may include:
extracting feature information of each breast image;
taking the characteristic information of each mammary gland image and the typing type information corresponding to the mammary gland image as a group of training parameters;
and training the initial breast typing model through a group of training parameters corresponding to the plurality of breast images respectively so as to determine the reference parameters of the initial breast typing model.
Of course, if the biological tissue image is an image of another body part, the above process of training the reference parameters may also be used.
103. Inputting the reference parameters into the initial biological tissue typing model to determine a target biological tissue typing model, wherein the target biological tissue typing model is used for determining the typing type of the biological tissue image.
The method for training a model provided in the embodiment of the application can train a target biological tissue typing model for determining the typing type of a biological tissue image. For the training process of the target biological tissue typing model, reference may be made to the description of fig. 1, which is not repeated herein.
As can be seen from the above description, the target biological tissue typing model can be trained; in particular, a target breast typing model can be trained, so that when a tumor is detected in a breast image, the target breast typing model can first be used to determine the typing type of the target breast image before the tumor is detected. When detecting a tumor, the probability that a region is a tumor is usually determined by analyzing the characteristics of the region; if a single comparison threshold were used to judge tumors in breast images of every typing type, the recognition accuracy would undoubtedly be reduced. Therefore, in the embodiment of the present application, a different probability threshold for judging a mass is configured for each typing type of breast image.
For example: the above describes that breast images can be classified into fat type, oligogland type, polygland type, and dense type in the present embodiment. The probability threshold corresponding to the fat type may be a, the probability threshold corresponding to the few gland type may be b, the probability threshold corresponding to the multiple gland type may be c, the probability threshold corresponding to the dense type may be d, and a, b, c and d are all greater than 0. Since fat type, few glands, and easily detectable masses in breast images, a does not need to be set too large, and among these several types, masses in breast images of fat type are most easily detected, so a is the smallest, and so on, a < b < c < d. The values of a, b, c, and d may be set by empirical values or calculated by some algorithms, and the determination manner of the values of a, b, c, and d is not limited in the embodiment of the present application.
Therefore, when a tumor in a breast image is detected, the probability threshold corresponding to the typing type of that breast image can be used for the judgment, which improves the accuracy of tumor detection. The process of detecting a tumor in a breast image in an embodiment of the present application is described below with reference to fig. 3.
Fig. 3 is a schematic view of a scene for detecting a tumor in a breast image according to an embodiment of the present application.
As shown in fig. 3, a scene embodiment of detecting a tumor in a breast image in the present application may include an image capturing device 30, a computer device 40 and a display 50, which may transmit images to one another via a network. The image capturing device 30 may be a molybdenum-target machine; it may acquire a target breast image of a patient by means of radiation and transmit the target breast image to the computer device 40 through the network. The computer device 40 may determine the typing type of the target breast image; then determine a target object in the target breast image and the probability that the target object is a tumor; compare the probability that the target object is a tumor with the probability threshold corresponding to the typing type of the target breast image, wherein breast images of different typing types correspond to different probability thresholds; and, if the probability that the target object is a tumor is greater than the probability threshold corresponding to the typing type of the target breast image, detect that the target object is a tumor.
In a possible implementation manner, the computer device 40 is configured with the target breast typing model in the embodiment corresponding to fig. 1, and the typing type of the target breast image can be determined by the target breast typing model.
When the computer device 40 detects a tumor, it may mark the detected tumor and output the breast image containing the mark. The location of the tumor is marked in the breast image shown on the display 50 in fig. 3. Of course, the tumor can be marked in many ways, not limited to the one shown in fig. 3.
Of course, the scheme for detecting a tumor in a breast image provided in the embodiment of the present application is not limited to the scenario shown in fig. 3, in which the breast image is acquired and the result is analyzed on site. If only a breast film is available, the scheme provided in the embodiment of the present application can also be implemented by acquiring an image of that film.
As shown in fig. 4, in another scene embodiment of the present application for detecting a tumor in a breast image, the image capturing device 30 may be a camera, which may capture an image of a breast examination film and transmit it to the computer device 40; the subsequent processing performed by the computer device 40 is substantially the same as the process described in the embodiment corresponding to fig. 3 and is not repeated here.
Of course, the scenes corresponding to fig. 3 and fig. 4 are only two examples. In other scenes, if the computer device is integrated with the display, detecting the tumor in the breast image and displaying the output result may be realized by one device, as long as that device provides the functions of the computer device 40 in the scenes corresponding to fig. 3 and fig. 4. If the image capturing function, the computer device and the display are integrated on one device, the processes of image acquisition, detection and display in the scenes corresponding to fig. 3 and fig. 4 can be realized by a single device.
The scenes shown in fig. 3 and fig. 4 above are described by taking breast detection as an example; in fact, the embodiments of the present application are not limited to breast detection and are likewise applicable to other biological tissue images.
as shown in fig. 5, an embodiment of a method for detecting a tumor in a biological tissue image according to an embodiment of the present disclosure may include:
201. an image of the target biological tissue is acquired.
The biological tissue image may be, for example, a target breast image, and the manner of acquiring the target breast image can be understood by referring to the acquisition manners in the two scenes of fig. 3 or fig. 4 described above.
202. Determining a typing type of the target biological tissue image.
For example: the type of classification of the target breast image may be: adipose, glandular, or dense.
203. A target object in the target biological tissue image is determined, as well as a probability that the target object is a tumor.
The tumor in the embodiment of the present application may be a suspected malignant tumor, and certainly, is not limited to a suspected malignant tumor, and may also be a benign tumor.
In one possible implementation, when the biological tissue image is a breast image, the step 203 may include:
preprocessing and segmenting the target breast image to determine segmented sub-images;
inputting the segmentation sub-images into a classification model, and determining a target sub-object contained in each segmentation sub-image, wherein the target sub-object is contained in the target object;
determining a probability of the target object being a tumor from each target sub-object.
Determining the probability that the target object is a tumor according to each target sub-object may include:
combining the mutually overlapped areas in each target sub-object;
and determining the probability that the target object is the tumor according to the set of the merged target sub-objects.
The merging is actually performed for two overlapping portions, one of which is to be removed.
204. And comparing the probability that the target object is the tumor with probability threshold values corresponding to the typing types of the target biological tissue images, wherein the probability threshold values corresponding to the biological tissue images of different typing types are different.
Taking a breast image as an example, this step 204 can be understood by referring to the following Table 1:
TABLE 1
Typing type of the target breast image | Probability threshold
Fat type | a
Few-gland type | b
Multi-gland type | c
Dense type | d
Let E be the probability that the target object is a tumor. If the typing type of the target breast image is the fat type, E is compared with a; if it is the few-gland type, E is compared with b; if it is the multi-gland type, E is compared with c; and if it is the dense type, E is compared with d.
205. And if the probability that the target object is the tumor is greater than the probability threshold corresponding to the typing type of the target biological tissue image, detecting that the target object is the tumor.
For example, if E is 0.45, a is 0.4, b is 0.5, c is 0.6 and d is 0.7: when the typing type of the target breast image is the fat type, E > a and the target object is determined to be a tumor; when the typing type of the target breast image is the few-gland type, the multi-gland type or the dense type, E is smaller than b, c and d respectively, and the target object is determined not to be a tumor.
In the embodiment of the application, when the breast image is detected, the typing type of the breast image can be detected, and different breast types have different tumor probability threshold values, so that the accuracy of tumor detection is improved.
In a possible implementation manner, after step 205, the method may further include: marking the detected mass, and outputting the breast image containing the mark. For such possible implementations, reference may be made to the corresponding content of fig. 3 or fig. 4, which is not repeated herein.
In the following, referring to fig. 6, another embodiment of the method for detecting a tumor in a breast image according to the embodiment of the present application is described by taking the breast image as an example.
As shown in fig. 6, another embodiment of the method for detecting a tumor in a breast image provided by the embodiment of the present application may include:
on one branch, the molybdenum-target picture, i.e. the target breast image, is input into a breast typing network, i.e. the target breast typing model described above. The breast typing network is obtained by training on training samples; for the specific training process, reference may be made to the embodiment corresponding to fig. 1.
On the other branch, the molybdenum target picture, namely the target mammary gland image, is preprocessed, and the preprocessed image is segmented.
The two branches are then combined: the mass judgment uses the probability threshold corresponding to the typing type of the target breast image determined by the breast typing network, and the detection result is obtained after the mass judgment.
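A hedged orchestration sketch of how the two branches fit together is given below; every helper name is a hypothetical stand-in for a component described later in this section, passed in as a callable so the sketch stays self-contained.

```python
# Hedged sketch of the two-branch flow of fig. 6. The callables and the
# threshold map are hypothetical stand-ins for the typing network, the
# preprocessing/segmentation pipeline and the mass classifier described below.
def detect_masses(mammogram, classify_typing, preprocess, segment_candidates,
                  mass_probability, thresholds):
    typing_type = classify_typing(mammogram)                # branch 1: breast typing network
    threshold = thresholds[typing_type]                     # typing-specific threshold a/b/c/d
    candidates = segment_candidates(preprocess(mammogram))  # branch 2: preprocessing + segmentation
    scored = [(region, mass_probability(region)) for region in candidates]
    # fusion: keep only candidates whose probability exceeds the typing-specific threshold
    return [(region, p) for region, p in scored if p > threshold]
```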
The preprocessing process may include:
(1) Sample normalization
The gray-scale range of the image is stretched to 0-255 by linear stretching, which improves the robustness of subsequent processing.
(2) Breast region segmentation
The breast region is extracted through a morphological opening operation and binarization, and background such as labels is removed: the opening operation removes fine tissue and noise, and binary segmentation with the Otsu method effectively extracts the breast tissue region.
(3) Histogram equalization
The subsequent segmentation algorithm is based on the image histogram; therefore, histogram equalization is applied to improve the robustness of the subsequent processing.
(4) Bilateral filtering
Bilateral filtering is used to remove noise that may be present in breast tissue and to some extent improve region homogeneity, and bilateral filtering does not destroy the segmentation edges.
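The following OpenCV/NumPy sketch strings steps (1)-(4) together; the kernel size and bilateral-filter parameters are illustrative assumptions, not values taken from the patent.

```python
# Hedged sketch of preprocessing steps (1)-(4); structuring-element and filter
# parameters are assumptions for illustration.
import cv2
import numpy as np

def preprocess(img_gray: np.ndarray) -> np.ndarray:
    # (1) Sample normalization: linearly stretch the gray range to 0-255
    img = cv2.normalize(img_gray, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

    # (2) Breast region segmentation: opening removes fine tissue, noise and
    #     labels; Otsu binarization extracts the breast tissue region
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    opened = cv2.morphologyEx(img, cv2.MORPH_OPEN, kernel)
    _, mask = cv2.threshold(opened, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    img = cv2.bitwise_and(img, img, mask=mask)

    # (3) Histogram equalization, since later segmentation works on the histogram
    img = cv2.equalizeHist(img)

    # (4) Bilateral filtering: removes noise and improves region homogeneity
    #     without destroying segmentation edges
    return cv2.bilateralFilter(img, d=9, sigmaColor=75, sigmaSpace=75)
```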
The image segmentation may include:
(1) Genetic algorithm segmentation
The target breast image is reduced in dimensionality using a two-dimensional wavelet transform (with 3 levels); after normalization, the histogram of the low-detail image is computed, and image segmentation is performed according to this histogram. The histogram is segmented with a genetic algorithm: genes use a binary coding whose length equals the number of gray levels, and a bit value of 0 indicates that the corresponding gray level is a segmentation threshold. The cost function of the genetic algorithm takes maximum between-class variance and minimum within-class variance as its criterion. A standard genetic-algorithm procedure is used: after population initialization, the three steps of selection, crossover and mutation are iterated until convergence (initial population size 30, 40 iterations, selection rate 10%, crossover rate 80%, mutation rate 10%); the resulting segmentation thresholds are output, and the original image is segmented according to these thresholds.
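A simplified sketch of this genetic threshold search follows. It keeps the binary gene coding and the quoted hyperparameters, but scores candidates with between-class variance only and omits the wavelet dimensionality reduction and the within-class-variance term, so it is an approximation of the procedure described above rather than a faithful implementation.

```python
# Simplified genetic threshold search over a normalized 256-bin histogram
# (hist.sum() == 1). A 0 bit in a gene marks that gray level as a threshold.
import numpy as np

POP, GENS, SEL_RATE, CROSS_RATE, MUT_RATE = 30, 40, 0.10, 0.80, 0.10
LEVELS = 256

def thresholds_of(gene):
    return np.flatnonzero(gene == 0)

def fitness(gene, hist):
    ts = thresholds_of(gene)
    if ts.size == 0 or ts.size > 8:              # keep the threshold count reasonable
        return -np.inf
    edges = np.concatenate(([0], ts, [LEVELS]))
    total_mean = np.dot(np.arange(LEVELS), hist)
    score = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):    # between-class variance over the classes
        w = hist[lo:hi].sum()
        if w > 0:
            mu = np.dot(np.arange(lo, hi), hist[lo:hi]) / w
            score += w * (mu - total_mean) ** 2
    return score

def ga_segment_thresholds(hist, rng=np.random.default_rng(0)):
    pop = (rng.random((POP, LEVELS)) > 0.02).astype(np.uint8)  # mostly 1s -> few thresholds
    for _ in range(GENS):
        scores = np.array([fitness(g, hist) for g in pop])
        elite = pop[np.argsort(scores)[::-1][:max(1, int(SEL_RATE * POP))]]
        children = []
        while len(children) < POP:
            a, b = elite[rng.integers(len(elite), size=2)]
            child = a.copy()
            if rng.random() < CROSS_RATE:                      # one-point crossover
                cut = rng.integers(1, LEVELS)
                child[cut:] = b[cut:]
            if rng.random() < MUT_RATE:                        # flip one random bit
                child[rng.integers(LEVELS)] ^= 1
            children.append(child)
        pop = np.array(children)
    best = pop[np.argmax([fitness(g, hist) for g in pop])]
    return thresholds_of(best)
```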
(2) Morphological opening operation
A morphological opening operation is performed on the segmented image to break connections between glandular structures and the like, which facilitates subsequent region extraction.
(3) Region block extraction
From the segmentation result, regions with the higher gray levels are extracted first, for example the regions of the top-5 gray levels; among the regions satisfying the conditions, up to 10 regions with the largest areas are selected as candidate regions for each molybdenum-target image.
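A hedged sketch of this candidate extraction is shown below; since the exact region conditions are not spelled out here, only the gray-level and area criteria are implemented, and the connectivity choice is an assumption.

```python
# Keep the top-5 gray classes of the segmented image, then retain (up to) the
# 10 largest connected regions as candidate region blocks.
import cv2
import numpy as np

def extract_candidate_regions(seg: np.ndarray, top_levels: int = 5, max_regions: int = 10):
    keep = np.unique(seg)[-top_levels:]                         # highest gray classes
    mask = np.isin(seg, keep).astype(np.uint8)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    order = np.argsort(stats[1:, cv2.CC_STAT_AREA])[::-1] + 1   # label 0 is background
    return [labels == lab for lab in order[:max_regions]]       # one boolean mask per candidate
```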
The mass determination may include:
(1) neural network training and classification
The method uses data from domestic hospitals, with experts hired to label the data (2200+ samples). Suspected malignant masses are used as positive samples, and the remaining clearly benign masses and background regions are used as negative samples. After data augmentation (because molybdenum-target pictures are grayscale, augmentation is mainly performed by flipping and cropping, and color-space augmentation is not needed; in addition, each suspected malignant mass sample contains the whole mass region plus a small amount of surrounding background), the samples are input as training data into the InceptionV3 model released by Google, with the number of output classes of the model reset to 2. For weight initialization, the model first uses the ImageNet dataset, then the public DDSM dataset, and finally transfer learning is performed with the training data of the invention to obtain the final model weights (the gradient-descent algorithm is RMSprop, the batch size is 64, the initial learning rate is 0.01, and the maximum number of iterations is 100000). After model training is completed, for any input candidate region block, the network computes the probability of it being a suspected malignant mass; generally, a probability greater than 0.5 is considered a suspected malignant mass.
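A hedged sketch of this candidate classifier is given below: InceptionV3 initialized from ImageNet weights with a 2-class head, compiled with the quoted RMSprop settings. The DDSM pre-training stage, the exact data pipeline and the iteration budget are omitted; `train_ds`, the 299x299 patch size and the replication of the grayscale patch to three channels are assumptions.

```python
# InceptionV3-based suspected-malignant-mass classifier (sketch).
import tensorflow as tf

base = tf.keras.applications.InceptionV3(weights="imagenet", include_top=False,
                                          input_shape=(299, 299, 3), pooling="avg")
outputs = tf.keras.layers.Dense(2, activation="softmax")(base.output)  # 2 output classes
model = tf.keras.Model(base.input, outputs)
model.compile(optimizer=tf.keras.optimizers.RMSprop(learning_rate=0.01),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Flip/crop augmentation (no color-space augmentation for grayscale patches) is
# assumed to be applied when building train_ds, a dataset of (patch, label) pairs:
# model.fit(train_ds.batch(64), ...)

def suspected_malignant_probability(patch):
    """patch: 299x299x3 array in [0, 255] (grayscale replicated to 3 channels)."""
    x = tf.keras.applications.inception_v3.preprocess_input(patch[None, ...].astype("float32"))
    return float(model(x)[0, 1])        # > 0.5 is treated as suspected malignant
```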
(2) Non-maximum suppression
For regions judged to be suspected malignant masses, a non-maximum suppression method is used to remove overlapping regions, with the overlap threshold set to 50%. The main purpose is to reduce the false-alarm rate while improving the accuracy of locating suspected malignant masses.
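A minimal sketch of this overlap removal follows: standard non-maximum suppression over the candidate boxes, keeping the highest-probability box and discarding boxes that overlap it by more than the quoted 50% threshold. The (x1, y1, x2, y2) box representation is an assumption.

```python
# Standard NMS over candidate boxes scored by their mass probability.
import numpy as np

def non_max_suppression(boxes: np.ndarray, scores: np.ndarray, iou_thresh: float = 0.5):
    order = scores.argsort()[::-1]
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + areas - inter)
        order = rest[iou <= iou_thresh]         # drop boxes overlapping more than 50%
    return keep                                 # indices of retained candidate regions
```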
Breast typing may include:
(1) breast typing network training and classification
Using data from domestic hospitals, experts were engaged to label breast typing data (6000+ samples). The training phase uses the InceptionV3 deep-learning network released by Google, with the number of output classes of the model reset to 4; the initialization weights of the model use the ImageNet dataset, the gradient-descent algorithm is Adam, the batch size is 64, the initial learning rate is 0.0001, and the maximum number of iterations is 100. In the application stage, the molybdenum-target picture input into the system is fed into the breast typing network to obtain the breast typing classification.
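A hedged configuration sketch of this typing network follows; it mirrors the mass classifier above but with a 4-class head and the quoted Adam settings, and treating the 100-iteration budget as training epochs is an assumption.

```python
# Breast typing network (sketch): InceptionV3 backbone, 4 typing classes.
import tensorflow as tf

base = tf.keras.applications.InceptionV3(weights="imagenet", include_top=False,
                                          input_shape=(299, 299, 3), pooling="avg")
typing_model = tf.keras.Model(
    base.input, tf.keras.layers.Dense(4, activation="softmax")(base.output))
typing_model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
                     loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# typing_ds is an assumed dataset of (mammogram, typing-label) pairs:
# typing_model.fit(typing_ds.batch(64), epochs=100)
```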
(2) Selection of threshold for mass determination
According to the breast typing obtained from the classification, the corresponding mass-judgment threshold is returned for fat-type, few-gland-type, multi-gland-type and dense-type breasts; the thresholds are a, b, c and d respectively, with a < b < c < d. That is, for breasts with more glands, the threshold for judging a mass is raised accordingly, which helps avoid interference from mass-like glandular tissue when locating masses.
In the above, a scheme for training a model and detecting a tumor in a breast image is introduced, and a corresponding apparatus in the embodiment of the present application is described below with reference to the accompanying drawings.
As shown in fig. 7, an embodiment of the apparatus 70 for training a model provided in the embodiment of the present application may include:
an acquiring unit 701 configured to acquire a sample image set including a plurality of biological tissue images of different typing types and typing type information of each biological tissue image;
a training unit 702, configured to train an initial biological tissue typing model through the image set acquired by the acquiring unit 701, so as to determine a reference parameter of the initial biological tissue typing model;
a determining unit 703, configured to input the reference parameters trained by the training unit 702 into the initial biological tissue typing model to determine a target biological tissue typing model, where the target biological tissue typing model is used to determine a typing type of a biological tissue image.
The apparatus for training a model can train a target biological tissue typing model for determining the typing type of a biological tissue image, so that tumors can be judged in combination with the typing type, which improves the accuracy of tumor identification in the biological tissue image.
In a possible implementation manner, the training unit is configured to: when the image of the biological tissue is an image of a breast,
extracting feature information of each breast image;
taking the characteristic information of each mammary gland image and the typing type information corresponding to the mammary gland image as a group of training parameters;
and training the initial breast typing model through a group of training parameters corresponding to the plurality of breast images respectively so as to determine the reference parameters of the initial breast typing model.
As shown in fig. 8, an embodiment of an apparatus 80 for detecting a tumor in a biological tissue image provided by the embodiment of the present application may include:
an acquisition unit 801 for acquiring an image of a target biological tissue;
a first determination unit 802 for determining a type of typing of the target biological tissue image acquired by the acquisition unit 801;
a second determination unit 803 for determining a target object in the target biological tissue image acquired by the acquisition unit 801 and a probability that the target object is a tumor;
a comparing unit 804, configured to compare the probability that the target object determined by the second determining unit 803 is a tumor with the probability threshold corresponding to the type of the target biological tissue image determined by the first determining unit 802, where the probability thresholds corresponding to biological tissue images of different types are different;
a detecting unit 805, configured to detect that the target object is a tumor if the probability that the target object is a tumor compared by the comparing unit 804 is greater than the probability threshold corresponding to the type of the target biological tissue image.
The device for detecting the tumor in the biological tissue image, provided by the embodiment of the application, can detect the typing type of the biological tissue image when the biological tissue image is detected, and biological tissues of different typing types have different tumor probability threshold values, so that the accuracy of tumor detection is improved.
In a possible implementation manner, the first determining unit 802 is configured to determine, when the biological tissue image is a breast image, a typing type of the target breast image through a target breast typing model, where the target breast typing model is obtained by training a plurality of breast images of different typing types and information of the typing type of each breast image.
As shown in fig. 9, another embodiment of the apparatus 80 for detecting a tumor in a breast image provided by the embodiment of the present application may further include:
a marking unit 806 for marking the tumor detected by the detection unit 805;
an output unit 807 for outputting an image containing the breast marked with the marking unit 806.
In a possible implementation manner, the probability threshold corresponding to the fat type is a, the probability threshold corresponding to the few-gland type is b, the probability threshold corresponding to the multi-gland type is c, and the probability threshold corresponding to the dense type is d, where a, b, c and d are all greater than 0 and a < b < c < d.
In a possible implementation manner, the second determining unit 803 is configured to:
preprocessing and segmenting the target breast image to determine segmented sub-images;
inputting the segmentation sub-images into a classification model, and determining a target sub-object contained in each segmentation sub-image, wherein the target sub-object is contained in the target object;
determining a probability of the target object being a tumor from each target sub-object.
In a possible implementation manner, the second determining unit 803 is configured to:
combining the mutually overlapped areas in each target sub-object;
and determining the probability that the target object is the tumor according to the set of the merged target sub-objects.
Fig. 10 is a schematic structural diagram of a computer device 90 provided in an embodiment of the present application. The computer device 90 includes a processor 910, a memory 940 and an input/output (I/O) interface 930, the memory 940 may include a read-only memory and a random access memory, and provides operating instructions and data to the processor 910. A portion of the memory 940 may also include non-volatile random access memory (NVRAM).
In some embodiments, memory 940 stores elements, executable modules or data structures, or a subset thereof, or an expanded set thereof as follows:
in the embodiment of the present application, in the process of training the model, the processor 910 performs the following process by calling the operation instructions (which may be stored in the operating system) stored in the memory 940:
acquiring a sample image set, wherein the sample image set comprises a plurality of biological tissue images with different typing types and the typing type information of each biological tissue image;
training an initial biological tissue typing model through the set of images to determine reference parameters of the initial biological tissue typing model;
inputting the reference parameters into the initial biological tissue typing model to determine a target biological tissue typing model, wherein the target biological tissue typing model is used for determining the typing type of the biological tissue image.
According to the model-training scheme, the target biological tissue typing model for determining the typing type of a biological tissue image can be trained, so that tumors can be judged in combination with the typing type, which improves the accuracy of tumor identification in the biological tissue image.
Processor 910 controls the operation of computer device 90, and processor 910 may also be referred to as a CPU (Central Processing Unit). Memory 940 may include both read-only memory and random-access memory, and provides instructions and data to processor 910. A portion of the memory 940 may also include non-volatile random access memory (NVRAM). In a particular application, the various components of the computer device 90 are coupled together by a bus system 920, wherein the bus system 920 may include a power bus, a control bus, a status signal bus, etc., in addition to a data bus. For clarity of illustration, however, the various buses are designated as bus system 920 in the figure.
The method disclosed in the embodiments of the present application may be applied to the processor 910 or implemented by the processor 910. The processor 910 may be an integrated circuit chip having signal-processing capabilities. During implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 910. The processor 910 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps and logical blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM or a register. The storage medium is located in the memory 940, and the processor 910 reads the information in the memory 940 and performs the steps of the above method in combination with its hardware.
In one possible implementation, the processor 910 is configured to: when the image of the biological tissue is an image of a breast,
extracting feature information of each breast image;
taking the characteristic information of each mammary gland image and the typing type information corresponding to the mammary gland image as a group of training parameters;
and training the initial breast typing model through a group of training parameters corresponding to the plurality of breast images respectively so as to determine the reference parameters of the initial breast typing model.
In the process of the computer device 90 for detecting a tumor in a breast image, the processor 910 executes the following process by calling the operation instructions stored in the memory 940 (the operation instructions may be stored in the operating system):
acquiring a target biological tissue image;
determining a typing type of the target biological tissue image;
determining a target object in the target biological tissue image and a probability that the target object is a tumor;
comparing the probability that the target object is the tumor with probability threshold values corresponding to the typing types of the target biological tissue images, wherein the probability threshold values corresponding to the biological tissue images of different typing types are different;
and if the probability that the target object is the tumor is greater than the probability threshold corresponding to the typing type of the target biological tissue image, detecting that the target object is the tumor.
According to the scheme provided by the embodiment of the application, when the biological tissue image is detected, the typing type of the biological tissue image can be detected, and different tumor probability thresholds exist in biological tissues of different typing types, so that the tumor detection accuracy is improved.
In one possible implementation, the processor 910 is configured to: when the biological tissue image is a breast image, determine the typing type of the target breast image through a target breast typing model, wherein the target breast typing model is obtained by training on a plurality of breast images of different typing types and the typing type information of each breast image.
In one possible implementation, the processor 910 is configured to mark the detected mass;
and the input/output (I/O) interface 930 is configured to output the breast image containing the mark.
In one possible implementation, the different typing types include at least two of the following typing types: the fat type, the few-gland type, the multi-gland type and the dense type;
the probability threshold corresponding to the fat type is a, the probability threshold corresponding to the few-gland type is b, the probability threshold corresponding to the multi-gland type is c, and the probability threshold corresponding to the dense type is d, where a, b, c and d are all greater than 0 and a < b < c < d.
In one possible implementation, the processor 910 is configured to:
preprocessing and segmenting the target breast image to determine segmented sub-images;
inputting the segmentation sub-images into a classification model, and determining a target sub-object contained in each segmentation sub-image, wherein the target sub-object is contained in the target object;
determining a probability of the target object being a tumor from each target sub-object.
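A schematic version of this patch-wise step is sketched below; classify_patch is a hypothetical callable that returns a region (or None) and a tumor probability for each segmented sub-image:

def classify_sub_images(sub_images, classify_patch):
    """Collect the target sub-objects found by the classification model
    across all segmented sub-images."""
    target_sub_objects = []
    for patch in sub_images:
        region, prob = classify_patch(patch)   # one pass of the classification model per sub-image
        if region is not None:
            target_sub_objects.append((region, prob))
    return target_sub_objects

The probability of the overall target object is then derived from these per-patch results, for example by the merging step described next.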
In one possible implementation, the processor 910 is configured to:
merging the regions that overlap one another among the target sub-objects;
and determining the probability that the target object is a tumor according to the merged set of target sub-objects.
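One way to realize this merging (an assumption; the embodiments do not fix the exact rule) is to union overlapping axis-aligned boxes among the target sub-objects and keep the highest probability within each merged group:

def merge_overlapping(sub_objects):
    """Merge overlapping boxes among the target sub-objects.
    Each sub-object is ((x1, y1, x2, y2), prob); overlapping boxes are
    unioned and the merged object keeps the maximum tumor probability.
    A full implementation would repeat the pass until no overlaps remain."""
    def overlaps(a, b):
        return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

    merged = []
    for box, prob in sub_objects:
        for i, (mbox, mprob) in enumerate(merged):
            if overlaps(box, mbox):
                merged[i] = ((min(box[0], mbox[0]), min(box[1], mbox[1]),
                              max(box[2], mbox[2]), max(box[3], mbox[3])),
                             max(prob, mprob))
                break
        else:
            merged.append((box, prob))
    return merged

The probability that the target object is a tumor can then be taken, for example, as the maximum probability over the merged set.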
The above description of the computer device 90 can be understood with reference to the description of fig. 1 to 6, and will not be repeated herein.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired means (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing the associated hardware; the program may be stored in a computer-readable storage medium, and the storage medium may include: ROM, RAM, magnetic disks, optical disks, and the like.
The method for detecting a tumor in a breast image, the method for training a model, the corresponding devices, and the storage medium provided in the embodiments of the present application are described in detail above. Specific examples are used herein to illustrate the principles and embodiments of the present application, and the description of the above embodiments is only intended to help in understanding the method and its core ideas. Meanwhile, a person skilled in the art may, based on the ideas of the present application, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (9)

1. A method of detecting tumors in images of biological tissue, comprising:
acquiring a target biological tissue image, wherein the target biological tissue image is a breast image;
determining a typing type of the target biological tissue image;
determining a target object in the target biological tissue image and the probability that the target object is a tumor through a trained InceptionV3 model, wherein, for any input image, a probability label indicating whether the input image is a tumor is obtained through the trained InceptionV3 model;
comparing the probability that the target object is a tumor with the probability threshold corresponding to the typing type of the target biological tissue image, wherein the probability thresholds corresponding to biological tissue images of different typing types are different, and the different typing types comprise at least two of the following typing types: the fat type, the few-gland type, the many-gland type, and the dense type, wherein the probability threshold corresponding to the fat type is a, the probability threshold corresponding to the few-gland type is b, the probability threshold corresponding to the many-gland type is c, and the probability threshold corresponding to the dense type is d, a, b, c, and d are all greater than 0, and a < b < c < d;
if the probability that the target object is the tumor is greater than the probability threshold value corresponding to the typing type of the target biological tissue image, detecting that the target object is the tumor;
wherein the determining a target object in the target biological tissue image and a probability that the target object is a tumor comprises:
preprocessing a target breast image, and segmenting the preprocessed image to determine a segmentation sub-image;
inputting the segmentation sub-images into a classification model, and determining a target sub-object contained in each segmentation sub-image, wherein the target sub-object is contained in the target object;
determining the probability of the target object being a tumor according to each target sub-object;
wherein the preprocessing the target breast image comprises:
stretching the gray scale range of the target breast image to 0-255 through linear stretching;
removing small fragmented regions and noise by using a morphological opening operation, and performing binary classification by using the Otsu segmentation method to extract the breast tissue region;
performing histogram equalization;
removing noise in breast tissue using bilateral filtering;
wherein the segmenting the pre-processed image comprises:
reducing the dimensionality of the preprocessed target breast image by using a two-dimensional wavelet transform, and performing the segmentation operation according to a segmentation threshold, wherein the segmentation threshold is obtained through a genetic algorithm cost function that takes maximum between-class variance and minimum within-class variance as the criteria, and a general genetic algorithm procedure is used: after population initialization, the three processes of selection, crossover, and mutation are iterated until convergence, and the segmentation threshold is output;
applying a morphological segmentation operation to the segmented image;
and extracting region blocks with higher gray levels, and selecting candidate regions from among the regions that satisfy the conditions.
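To make the preprocessing and threshold-search steps recited in claim 1 concrete, the following OpenCV/NumPy/PyWavelets sketch follows the same sequence (linear stretch, opening plus Otsu tissue extraction, histogram equalization, bilateral filtering, then wavelet reduction and a genetic search for a threshold maximizing the between-class variance). Every parameter value below (kernel sizes, filter settings, population size, mutation range) is an illustrative assumption rather than something fixed by the claim:

import cv2
import numpy as np
import pywt  # PyWavelets, for the two-dimensional wavelet transform

def preprocess(breast_img):
    """Linear stretch, opening + Otsu tissue extraction, equalization, bilateral filtering.
    Expects a single-channel grayscale image."""
    img = cv2.normalize(breast_img, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)  # stretch to 0-255
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
    opened = cv2.morphologyEx(img, cv2.MORPH_OPEN, kernel)           # remove small fragments and noise
    _, tissue_mask = cv2.threshold(opened, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    img = cv2.bitwise_and(img, tissue_mask)                          # keep the breast tissue region
    img = cv2.equalizeHist(img)                                      # histogram equalization
    return cv2.bilateralFilter(img, 9, 75, 75)                       # edge-preserving denoising

def ga_threshold(img, pop_size=20, generations=40, seed=0):
    """Genetic search for a gray-level threshold maximizing the between-class
    variance (equivalently minimizing the within-class variance)."""
    rng = np.random.default_rng(seed)
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    levels = np.arange(256)

    def fitness(t):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            return 0.0
        mu0 = (levels[:t] * prob[:t]).sum() / w0
        mu1 = (levels[t:] * prob[t:]).sum() / w1
        return w0 * w1 * (mu0 - mu1) ** 2                            # between-class variance

    pop = rng.integers(1, 255, size=pop_size)                        # population initialization
    for _ in range(generations):                                     # selection, crossover, mutation
        scores = np.array([fitness(t) for t in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]     # selection: keep the better half
        children = (rng.choice(parents, pop_size // 2) +
                    rng.choice(parents, pop_size // 2)) // 2         # crossover: average two parents
        children = np.clip(children + rng.integers(-8, 9, children.size), 1, 254)  # mutation
        pop = np.concatenate([parents, children])
    return int(pop[np.argmax([fitness(t) for t in pop])])

def segment(preprocessed):
    """Wavelet dimensionality reduction, GA-derived threshold, morphological cleanup."""
    approx, _ = pywt.dwt2(preprocessed.astype(float), "haar")        # keep the approximation band
    approx = cv2.normalize(approx, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    mask = (approx > ga_threshold(approx)).astype(np.uint8) * 255    # segment by the GA threshold
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)            # high-gray candidate regions

A real system would then take the connected components of the returned mask as segmented sub-images and keep only those whose size and gray level satisfy the recited conditions.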
2. The method of claim 1, wherein the determining the type of typing of the target biological tissue image comprises:
determining the typing type of the target breast image through a target breast typing model, wherein the target breast typing model is obtained by training with a plurality of breast images of different typing types and the typing type information of each breast image.
3. The method of claim 2, further comprising:
marking the detected tumor;
and outputting the breast image containing the mark.
4. The method of claim 1, wherein determining the probability of the target object being a tumor from each target sub-object comprises:
combining the mutually overlapped areas in each target sub-object;
and determining the probability that the target object is the tumor according to the set of the merged target sub-objects.
5. An apparatus for detecting a tumor in a breast image, comprising:
an acquisition unit, configured to acquire a target biological tissue image, wherein the target biological tissue image is a breast image;
a first determination unit configured to determine a type of classification of the target biological tissue image acquired by the acquisition unit;
a second determining unit, configured to determine, through a trained InceptionV3 model, a target object in the target biological tissue image acquired by the acquisition unit and the probability that the target object is a tumor, wherein, for any input image, a probability label indicating whether the input image is a tumor is obtained through the trained InceptionV3 model;
a comparing unit, configured to compare the probability that the target object is a tumor determined by the second determining unit with the probability threshold corresponding to the typing type of the target biological tissue image determined by the first determination unit, wherein the probability thresholds corresponding to biological tissue images of different typing types are different, and the different typing types include at least two of the following typing types: the fat type, the few-gland type, the many-gland type, and the dense type, wherein the probability threshold corresponding to the fat type is a, the probability threshold corresponding to the few-gland type is b, the probability threshold corresponding to the many-gland type is c, and the probability threshold corresponding to the dense type is d, a, b, c, and d are all greater than 0, and a < b < c < d;
a detection unit, configured to detect that the target object is a tumor if the probability that the target object is a tumor, as compared by the comparing unit, is greater than the probability threshold corresponding to the typing type of the target biological tissue image;
wherein the second determining unit is specifically configured to:
preprocessing a target breast image, and segmenting the preprocessed image to determine a segmentation sub-image;
inputting the segmentation sub-images into a classification model, and determining a target sub-object contained in each segmentation sub-image, wherein the target sub-object is contained in the target object;
determining the probability of the target object being a tumor according to each target sub-object;
wherein the second determining unit is specifically configured to: stretching the gray-scale range of the target breast image to 0-255 through linear stretching; removing small fragmented regions and noise by using a morphological opening operation, and performing binary classification by using the Otsu segmentation method to extract the breast tissue region; performing histogram equalization; removing noise in the breast tissue by using bilateral filtering;
reducing the dimensionality of the preprocessed target breast image by using a two-dimensional wavelet transform, and performing the segmentation operation according to a segmentation threshold, wherein the segmentation threshold is obtained through a genetic algorithm cost function that takes maximum between-class variance and minimum within-class variance as the criteria, and a general genetic algorithm procedure is used: after population initialization, the three processes of selection, crossover, and mutation are iterated until convergence, and the segmentation threshold is output; applying a morphological segmentation operation to the segmented image; and extracting region blocks with higher gray levels, and selecting candidate regions from among the regions that satisfy the conditions.
6. The apparatus of claim 5, wherein
the first determination unit is configured to determine the typing type of the target breast image through a target breast typing model when the biological tissue image is a breast image, wherein the target breast typing model is obtained by training with a plurality of breast images of different typing types and the typing type information of each breast image.
7. The apparatus of claim 6, further comprising:
a marking unit, configured to mark the tumor detected by the detection unit;
and an output unit, configured to output the breast image marked by the marking unit.
8. A computer device, characterized in that the computer device comprises: an input/output (I/O) interface, a processor, and a memory having program instructions stored therein;
the processor is configured to execute program instructions stored in the memory to perform the method of any of claims 1-4.
9. A medical image inspection system, characterized in that the medical image inspection system comprises an image scanning device and an image processing device;
the image scanning device is used for scanning a medical image and sending the medical image to the image processing device;
an image processing apparatus for performing the method of any one of claims 1 to 4.
CN201910092954.2A 2019-01-30 2019-01-30 Method for detecting tumor in biological tissue image, corresponding device and medium Active CN109872307B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910092954.2A CN109872307B (en) 2019-01-30 2019-01-30 Method for detecting tumor in biological tissue image, corresponding device and medium
CN201910684650.5A CN110428405A (en) 2019-01-30 2019-01-30 Method, relevant device and the medium of lump in a kind of detection biological tissue images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910092954.2A CN109872307B (en) 2019-01-30 2019-01-30 Method for detecting tumor in biological tissue image, corresponding device and medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201910684650.5A Division CN110428405A (en) 2019-01-30 2019-01-30 Method, relevant device and the medium of lump in a kind of detection biological tissue images

Publications (2)

Publication Number Publication Date
CN109872307A CN109872307A (en) 2019-06-11
CN109872307B true CN109872307B (en) 2022-01-07

Family

ID=66918282

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201910092954.2A Active CN109872307B (en) 2019-01-30 2019-01-30 Method for detecting tumor in biological tissue image, corresponding device and medium
CN201910684650.5A Pending CN110428405A (en) 2019-01-30 2019-01-30 Method, relevant device and the medium of lump in a kind of detection biological tissue images

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201910684650.5A Pending CN110428405A (en) 2019-01-30 2019-01-30 Method, relevant device and the medium of lump in a kind of detection biological tissue images

Country Status (1)

Country Link
CN (2) CN109872307B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110414539A (en) * 2019-08-05 2019-11-05 腾讯科技(深圳)有限公司 A kind of method and relevant apparatus for extracting characterization information
CN111080642A (en) * 2019-12-31 2020-04-28 北京推想科技有限公司 Tissue typing method and device based on medical image and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9392986B2 (en) * 2011-02-14 2016-07-19 University Of Rochester Method and apparatus for cone beam breast CT image-based computer-aided detection and diagnosis
EP3612981A1 (en) * 2017-04-19 2020-02-26 Siemens Healthcare GmbH Target detection in latent space
US20180306794A1 (en) * 2017-04-20 2018-10-25 The Regents Of The University Of California Methods of Producing Gene Expression Profiles of Subjects Having Cancer and Kits for Practicing Same
CN107958453B (en) * 2017-12-01 2022-01-28 深圳蓝韵医学影像有限公司 Method and device for detecting lesion region of mammary gland image and computer storage medium
CN108921821A (en) * 2018-06-01 2018-11-30 中国人民解放军战略支援部队信息工程大学 Method of discrimination based on the LASSO mammary cancer armpit lymph gland transfering state returned
CN109146848A (en) * 2018-07-23 2019-01-04 东北大学 A kind of area of computer aided frame of reference and method merging multi-modal galactophore image

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101401730A (en) * 2008-11-14 2009-04-08 南京大学 Fast detecting method for mammary gland tumour shadiness area based on layered structure
CN104732213A (en) * 2015-03-23 2015-06-24 中山大学 Computer-assisted lump detecting method based on mammary gland magnetic resonance image
CN104771228A (en) * 2015-03-23 2015-07-15 中山大学 Method and device for judging whether breast mass is benign or malignant
CN106355043A (en) * 2015-11-19 2017-01-25 小红象医疗科技有限公司 Automatic analysis system and method for breast infrared information
CN108464840A (en) * 2017-12-26 2018-08-31 安徽科大讯飞医疗信息技术有限公司 A kind of breast lump automatic testing method and system
CN108395986A (en) * 2018-03-23 2018-08-14 余晖 Human papilloma virus automatic parting direction detection device based on deep learning
CN109191424A (en) * 2018-07-23 2019-01-11 哈尔滨工业大学(深圳) A kind of detection of breast lump and categorizing system, computer readable storage medium

Also Published As

Publication number Publication date
CN109872307A (en) 2019-06-11
CN110428405A (en) 2019-11-08

Similar Documents

Publication Publication Date Title
Li et al. Pulmonary nodule classification with deep convolutional neural networks on computed tomography images.
Foggia et al. Benchmarking HEp-2 cells classification methods
Albalawi et al. Classification of breast cancer mammogram images using convolution neural network
WO2019184851A1 (en) Image processing method and apparatus, and training method for neural network model
Torrents-Barrena et al. Computer-aided diagnosis of breast cancer via Gabor wavelet bank and binary-class SVM in mammographic images
Unni et al. Tumour detection in double threshold segmented mammograms using optimized GLCM features fed SVM
Öztürk et al. Comparison of HOG, MSER, SIFT, FAST, LBP and CANNY features for cell detection in histopathological images
Abdel-Nasser et al. Towards cost reduction of breast cancer diagnosis using mammography texture analysis
CN109872307B (en) Method for detecting tumor in biological tissue image, corresponding device and medium
Lobo et al. Classification and segmentation techniques for detection of lung cancer from CT images
Salazar-Licea et al. Location of mammograms ROI's and reduction of false-positive
Baboo et al. A classification and analysis of pulmonary nodules in CT images using random forest
Kumarganesh et al. An efficient approach for brain image (tissue) compression based on the position of the brain tumor
Iqbal et al. A heteromorphous deep CNN framework for medical image segmentation using local binary pattern
Mabrouk et al. Computer aided detection of large lung nodules using chest computer tomography images
Ryan et al. Image classification with genetic programming: Building a stage 1 computer aided detector for breast cancer
GB2457022A (en) Creating a fuzzy inference model for medical image analysis
Bajcsi et al. Towards feature selection for digital mammogram classification
Torres et al. Lesion detection in breast ultrasound images using a machine learning approach and genetic optimization
Herwanto et al. Association technique based on classification for classifying microcalcification and mass in mammogram
Vyshnavi et al. Breast Density Classification in Mammogram Images
Liu et al. Breast mass detection with kernelized supervised hashing
Santamaria-Pang et al. Cell segmentation and classification via unsupervised shape ranking
Vijayarajan et al. A novel comparative study on breast cancer detection using different types of classification techniques
Karale et al. A screening CAD tool for the detection of microcalcification clusters in mammograms

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant