CN113537181A - CB micronucleus microscopic image identification and analysis method and system based on neural network - Google Patents

CB micronucleus microscopic image identification and analysis method and system based on neural network

Info

Publication number
CN113537181A
Authority
CN
China
Prior art keywords
image
cell
micronucleus
microscopic
images
Prior art date
Legal status
Pending
Application number
CN202111090307.1A
Other languages
Chinese (zh)
Inventor
申相
周正干
马腾飞
李朝文
杨富城
温占波
Current Assignee
Beijing Huironghe Technology Co Ltd
Original Assignee
Beijing Huironghe Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Huironghe Technology Co Ltd
Priority to CN202111090307.1A
Publication of CN113537181A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)
  • Investigating Or Analysing Biological Materials (AREA)

Abstract

The invention discloses a CB-method micronucleus microscopic image identification and analysis method and system based on a convolutional neural network. The automatic identification method comprises the following steps: acquiring a CB-method micronucleus microscopic image; extracting cell images from the CB-method micronucleus microscopic image; dividing the extracted cell images into first single independent cell images and adherent cell mass images; separating the adherent cell mass images into a plurality of second single independent cell images; and inputting all of the first single independent cell images and all of the second single independent cell images into a convolutional neural network model to identify binuclear cell images containing micronuclei. Because the micronucleus-containing binuclear cell images are identified by the convolutional neural network model, both the identification speed and the identification rate for such images are improved.

Description

CB micronucleus microscopic image identification and analysis method and system based on neural network
Technical Field
The present application relates to the field of radiation biodosimetry, and in particular to a CB-method micronucleus microscopic image identification and analysis method and system based on a neural network.
Background
Cytokinesis-block (CB) micronucleus analysis (for example, cytokinesis-block micronucleus analysis of human peripheral blood lymphocytes) is an internationally accepted method of radiobiological dose estimation.
At present, the related technologies mostly rely on manual microscope slide reading to identify the micronucleus-containing binuclear cells of the CB-method micronucleus assay; this manual approach is time-consuming, labor-intensive, and easily influenced by human subjectivity.
Disclosure of Invention
In view of the above, the present application is proposed to provide a neural-network-based CB-method micronucleus microscopic image identification and analysis method and system that overcomes, or at least partially solves, the above-mentioned problems.
According to one aspect of the invention, the method for automatically identifying CB-method micronucleus microscopic images based on a convolutional neural network comprises the following steps: acquiring a CB-method micronucleus microscopic image; extracting a cell image from the CB-method micronucleus microscopic image; dividing the extracted cell image into a first single independent cell image and an adherent cell mass image; separating the adherent cell mass image into a plurality of second single independent cell images; and inputting all of the first single independent cell images and all of the second single independent cell images into a convolutional neural network model to identify a binuclear cell image containing a micronucleus.
Optionally, the step of extracting a cell image from the CB-method micronucleus microscopic image includes: carrying out RGB channel splitting on the CB-method micronucleus microscopic image to obtain a channel component map of the CB-method micronucleus microscopic image; performing a dual-threshold iterative algorithm on the channel component map to determine a cytoplasm gray threshold and a nucleus gray threshold corresponding to the channel component map; thresholding the channel component map according to the cytoplasm gray threshold; merging the thresholded image with the CB-method micronucleus microscopic image; and extracting the cell image from the merged image by contour searching, thereby extracting the cell image from the CB-method micronucleus microscopic image.
Optionally, before the step of merging the thresholded image with the CB-method micronucleus microscopic image, the method further includes: performing median filtering on the thresholded image; and the step of merging the thresholded image with the CB-method micronucleus microscopic image includes: merging the image subjected to thresholding and median filtering with the CB-method micronucleus microscopic image.
Optionally, the step of performing a dual-threshold iterative algorithm on the channel component map includes: performing a double-threshold iterative algorithm on the channel component diagram according to the following iterative formula:
T_{i+1,1} = \frac{1}{2}\left[\frac{\sum_{k=0}^{T_{i,1}} k\,h_k}{\sum_{k=0}^{T_{i,1}} h_k} + \frac{\sum_{k=T_{i,1}+1}^{T_{i,2}} k\,h_k}{\sum_{k=T_{i,1}+1}^{T_{i,2}} h_k}\right]

T_{i+1,2} = \frac{1}{2}\left[\frac{\sum_{k=T_{i,1}+1}^{T_{i,2}} k\,h_k}{\sum_{k=T_{i,1}+1}^{T_{i,2}} h_k} + \frac{\sum_{k=T_{i,2}+1}^{255} k\,h_k}{\sum_{k=T_{i,2}+1}^{255} h_k}\right]
where h_k denotes the number of pixels with gray level k in the channel component map, T_{i+1,1} denotes the cytoplasm gray threshold after the i-th iteration, and T_{i+1,2} denotes the nucleus gray threshold after the i-th iteration; when T_{i+1,1} = T_{i,1}, T_{i+1,1} (or T_{i,1}) is determined as the cytoplasm gray threshold corresponding to the channel component map; and when T_{i+1,2} = T_{i,2}, T_{i+1,2} (or T_{i,2}) is determined as the nucleus gray threshold corresponding to the channel component map.
Optionally, the CB-method micronucleus microscopic image is an image formed by cells stained with Giemsa dye, and the channel component map is a G-channel component map.
Optionally, the step of dividing the extracted cell image into a first single independent cell image and an adherent cell mass image includes: judging whether the morphological features of each extracted cell image meet preset conditions; if the preset conditions are met, classifying the corresponding cell image as the first single independent cell image; and if the preset conditions are not met, classifying the corresponding cell image as the adherent cell mass image.
Optionally, the morphological feature includes an elongation and a defect rate, and the elongation and the defect rate are determined by the following formulas:
EI=min(d,w)/max(d,w);
DR=(circle-area)/area;
wherein EI is an extension rate, d is a width of a corresponding minimum circumscribed rectangle of the cell image, w represents a length of the corresponding minimum circumscribed rectangle of the cell image, DR is a defect rate, area is an area of the corresponding cell image, and circle is an area of the corresponding minimum circumscribed circle of the cell image.
Optionally, the preset condition includes: the elongation is greater than 0.5 and less than 1.0, and the defect rate is greater than 0 and less than 0.5.
Optionally, the step of separating the adherent cell mass image into a plurality of second single independent cell images comprises: performing an ultimate erosion operation on the adherent cell mass image so as to obtain the geometric center of the target area corresponding to each second single independent cell image; performing multiple dilation operations centered on the geometric centers until all the target areas are connected; and separating the adherent cell mass image into the plurality of second single independent cell images by taking the boundaries where the target areas meet as the separation lines between the second single independent cell images.
Optionally, before the step of performing multiple dilation operations centered on the geometric centers until the target areas are connected, the method further includes: screening all the geometric centers to remove those geometric centers (seed regions) whose size in pixels is smaller than a pixel-count threshold.
Optionally, the convolutional neural network includes a first convolutional layer, a first max-pooling layer, a second convolutional layer, a second max-pooling layer, a third convolutional layer, a fourth convolutional layer, a third max-pooling layer, a first fully connected layer, a second fully connected layer and a third fully connected layer, through which the image data is passed in sequence; after the image data passes through the third fully connected layer, the convolutional neural network model generates the reliability probability value of the category to which the image data belongs by using a softmax function.
Optionally, each neuron of the first and second fully connected layers uses a dropout rate of 50% during training.
According to another aspect of the present application, there is provided an automatic identification system for CB-method micronucleus microscopic images based on a convolutional neural network, comprising: one or more processors; and a memory storing instructions executable by the one or more processors to cause the automatic identification system to perform any of the automatic identification methods described above.
According to another aspect of the present application, there is provided an automatic analysis method for CB-method micronucleus microscopic images based on a convolutional neural network, comprising: acquiring a micronucleus-containing binuclear cell image identified by any one of the automatic identification methods described above; extracting the cell nucleus images and micronucleus images in the identified micronucleus-containing binuclear cell image; and screening and judging the extracted cell nucleus images and micronucleus images so as to determine the micronucleus count of the micronucleus-containing binuclear cell.
Optionally, the step of screening and judging the extracted cell nucleus images and micronucleus images includes: sorting the area values ai of the extracted cell nucleus images and micronucleus images into an array from largest to smallest, where i = 0, 1, 2, ..., n and n is the total number of cell nucleus images and micronucleus images minus 1; and judging and screening each value in the array in turn: when 20 pixels < ai < 0.2 × Ai, the corresponding image is determined to be a micronucleus image; when ai ≤ 20 pixels, the corresponding image is screened out; and when ai ≥ 0.2 × Ai, the corresponding image is determined to be a cell nucleus image; where A0 = a0, Ai = A(i-1) when a(i-1) is judged to be a micronucleus image or is screened out, and Ai = (A(i-1) + a(i-1))/2 when a(i-1) is judged to be a cell nucleus image.
Optionally, the step of extracting the cell nucleus image and the micronucleus image in the identified binuclear cell image containing the micronucleus includes: and extracting a cell nucleus image and a micronucleus image in the identified binuclear cell image containing the micronucleus by using a K-means clustering algorithm.
According to another aspect of the present application, there is provided an automatic analysis system for CB-method microkernel microscopy images based on convolutional neural network, comprising: one or more processors; and a memory storing instructions executable by the one or more processors to cause the automated analysis system to perform any of the automated analysis methods described above.
Because the micronucleus-containing binuclear cell images are identified by a convolutional neural network model, the identification of micronucleus-containing binuclear cells in CB-method micronucleus assays is automated, workers are freed from tedious slide-reading work, the identification process is standardized and unified, and the use of the convolutional neural network model improves both the identification speed and the identification rate for micronucleus-containing binuclear cell images.
Drawings
Other objects and advantages of the present invention will become apparent from the following description of the invention which refers to the accompanying drawings, and may assist in a comprehensive understanding of the invention.
FIG. 1 is a flow chart of a method for automatically identifying CB-method micronucleus microscopic images based on a convolutional neural network according to some embodiments of the present application;
FIG. 2 is a flow chart of cell image extraction in the automatic identification method of CB-method micronucleus microscopic images based on a convolutional neural network according to some embodiments of the present application;
FIG. 3 is a flow chart of cell image extraction in a method for automatically identifying CB-method micronucleus microscopic images based on a convolutional neural network according to another embodiment of the present application;
FIG. 4 is a flow chart of dividing an extracted cell image into a first single independent cell image and an adherent cell mass image in an automatic identification method of CB-method micronucleus microscopic images based on a convolutional neural network according to some embodiments of the present application;
FIG. 5 is a flow chart of a method for automatically analyzing CB-method micronucleus microscopic images based on a convolutional neural network according to some embodiments of the present application;
FIG. 6 is a flow chart of screening and judging the extracted cell nucleus images and micronucleus images in the method for automatically analyzing CB-method micronucleus microscopic images based on a convolutional neural network according to some embodiments of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings of the embodiments of the present invention. It should be apparent that the described embodiment is one embodiment of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the invention without any inventive step, are within the scope of protection of the invention.
Unless defined otherwise, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this invention belongs.
The embodiments of the present application first provide an automatic identification method for CB-method micronucleus microscopic images based on a convolutional neural network.
In machine learning, the convolutional neural network is a deep feedforward artificial neural network and can perform image processing.
As will be understood by those skilled in the art, a CB-method micronucleus microscopic image refers to a microscope image of cells used for cytokinesis-block micronucleus analysis; that is, the CB-method micronucleus microscopic image may be any cell-containing microscope image for which the number of micronuclei in micronucleus-containing binuclear cells needs to be determined. The corresponding cells may be human peripheral blood lymphocytes. Cytokinesis-block micronucleus analysis, currently the most effective dose-estimation method apart from the chromosome-aberration gold standard, adds cytochalasin B during cell culture and correlates the micronucleus rate of binuclear cells with the absorbed dose; it is commonly used for estimating unknown doses in radiation biodosimetry. Micronuclei derive from whole chromosomes or chromosome fragments whose division is delayed by ionizing radiation. When a cell is irradiated, such lagging material is not incorporated into the main nucleus during the anaphase of nuclear division and instead remains in the cytoplasm as a small round body. Compared with dicentric chromosome analysis, cytokinesis-block micronucleus analysis is simpler and faster to perform, and is therefore better suited to rapid biological dosimetry for large numbers of people.
Fig. 1 is a flowchart of an automatic identification method for CB-method micronucleus microscopic images based on a convolutional neural network according to some embodiments of the present application. As shown in Fig. 1, the automatic identification method for CB-method micronucleus microscopic images based on a convolutional neural network according to embodiments of the present application includes the following steps:
and step S102, acquiring a CB method microkernel microscopic image.
And step S104, extracting a cell image from the CB method micronucleus microscopic image.
Step S106, dividing the extracted cell image into a first single independent cell image and an adhesion cell mass image.
Step S108, separating the adhesion cell mass image into a plurality of second single independent cell images.
Step S110, inputting all the first single independent cell images and all the second single independent cell images into a convolutional neural network model to identify binuclear cell images containing micronuclei.
In step S102, the CB-method micronucleus microscopic image may be obtained from an image database; for example, the CB-method micronucleus microscopic image is read from the database of an automatic microscopic image acquisition system. The CB-method micronucleus microscopic image may have various resolutions, for example 2048 × 2048.
In step S104, the CB micronuclear microscopic image may be preprocessed, and then the cell image may be separated from the background image, so as to extract the cell image from the CB micronuclear microscopic image.
In step S106, the extracted cell image may be divided into a first single independent cell image and an adherent cell mass image according to the defect rate and the extension rate of each cell image.
In step S108, the adherent cell mass image may be separated into a plurality of second single independent cell images using a watershed automatic separation algorithm, an erosion-dilation algorithm, or the like.
In step S110, all the single independent cell images may be input into the convolutional neural network model for cell type determination, so as to obtain a type (including a type of a dual-core cell image including a micronucleus) to which each image belongs and a reliability probability value.
According to the embodiments of the present application, the micronucleus-containing binuclear cell images are identified by a convolutional neural network model, so that the identification of micronucleus-containing binuclear cells in CB-method micronucleus assays is automated, workers are freed from tedious slide-reading work, the identification process is standardized and unified, and the use of the convolutional neural network model improves both the identification speed and the identification rate for micronucleus-containing binuclear cell images.
Fig. 2 is a flowchart of cell image extraction in the automatic identification method of CB-method micronucleus microscopic images based on a convolutional neural network according to some embodiments of the present application. In some embodiments of the present application, the step of extracting the cell image from the CB-method micronucleus microscopic image may specifically include the following steps:
Step S202, carrying out RGB channel splitting on the CB-method micronucleus microscopic image to obtain a channel component map of the CB-method micronucleus microscopic image.
Step S204, performing a dual-threshold iterative algorithm on the channel component map to determine the cytoplasm gray threshold and the nucleus gray threshold corresponding to the channel component map.
Step S206, thresholding the channel component map according to the cytoplasm gray threshold.
Step S208, merging the thresholded image with the CB-method micronucleus microscopic image.
Step S210, extracting the cell image from the merged image by contour searching, thereby extracting the cell image from the CB-method micronucleus microscopic image.
Here, the CB-method micronucleus microscopic image may be an image of cells stained with Giemsa dye, and the channel component map in step S202 may be the G-channel component map. Giemsa staining renders the cells dark blue to purplish red (depending on the pH during staining); in practical tests the inventors found that, under these conditions, the G-channel component map gives the best segmentation effect and the broadest applicability.
In step S204, a double-threshold iterative algorithm may be performed on the channel component map according to the following iterative formula:
T_{i+1,1} = \frac{1}{2}\left[\frac{\sum_{k=0}^{T_{i,1}} k\,h_k}{\sum_{k=0}^{T_{i,1}} h_k} + \frac{\sum_{k=T_{i,1}+1}^{T_{i,2}} k\,h_k}{\sum_{k=T_{i,1}+1}^{T_{i,2}} h_k}\right]

T_{i+1,2} = \frac{1}{2}\left[\frac{\sum_{k=T_{i,1}+1}^{T_{i,2}} k\,h_k}{\sum_{k=T_{i,1}+1}^{T_{i,2}} h_k} + \frac{\sum_{k=T_{i,2}+1}^{255} k\,h_k}{\sum_{k=T_{i,2}+1}^{255} h_k}\right]
where h_k denotes the number of pixels with gray level k (k ranging from 0 to 255) in the channel component map, T_{i+1,1} denotes the cytoplasm gray threshold after the i-th iteration, and T_{i+1,2} denotes the nucleus gray threshold after the i-th iteration; when T_{i+1,1} = T_{i,1}, T_{i+1,1} (or T_{i,1}) is determined as the cytoplasm gray threshold corresponding to the channel component map; and when T_{i+1,2} = T_{i,2}, T_{i+1,2} (or T_{i,2}) is determined as the nucleus gray threshold corresponding to the channel component map.
Understandably, T_{i,1} represents the boundary threshold between the image background and the cytoplasm color, and T_{i,2} represents the boundary threshold between the cytoplasm color and the nucleus color. When i = 0, the initial value T_{1,1} can be selected according to the experimental conditions, for example T_{1,1} = (kmax + kmin)/3, and the initial value T_{1,2} can likewise be selected according to the experimental conditions, for example T_{1,2} = 2(kmax + kmin)/3, where kmax is the maximum gray value in the CB-method micronucleus microscopic image and kmin is the minimum gray value in the CB-method micronucleus microscopic image.
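A minimal sketch of this dual-threshold iteration is given below, assuming the conventional iterative-mean update, which is consistent with the convergence condition and initial values described above; because the published formulas are rendered as images, the exact update may differ in detail, and the function name and iteration cap are illustrative.

```python
import numpy as np

def dual_threshold_iteration(gray_channel, max_iter=100):
    """Iteratively estimate the cytoplasm threshold t1 and the nucleus
    threshold t2 from the histogram of a channel component map."""
    hist, _ = np.histogram(gray_channel, bins=256, range=(0, 256))
    k = np.arange(256)
    k_min, k_max = int(gray_channel.min()), int(gray_channel.max())

    # Initial values suggested in the description above.
    t1 = (k_max + k_min) // 3        # background / cytoplasm boundary T_{1,1}
    t2 = 2 * (k_max + k_min) // 3    # cytoplasm / nucleus boundary T_{1,2}

    def class_mean(lo, hi):
        # Mean gray level of the pixels whose gray level lies in [lo, hi].
        w = hist[lo:hi + 1]
        return float(np.dot(k[lo:hi + 1], w)) / max(int(w.sum()), 1)

    for _ in range(max_iter):
        new_t1 = int(round((class_mean(0, t1) + class_mean(t1 + 1, t2)) / 2))
        new_t2 = int(round((class_mean(t1 + 1, t2) + class_mean(t2 + 1, 255)) / 2))
        if new_t1 == t1 and new_t2 == t2:   # convergence: T_{i+1} = T_i
            break
        t1, t2 = new_t1, new_t2
    return t1, t2
```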
In step S206, the channel component map is thresholded according to the cytoplasmic gray threshold, so as to obtain a binary map, for example, the gray of the pixel having the gray value smaller than the cytoplasmic gray threshold is assigned to 0, and the gray of the pixel having the gray value greater than or equal to the cytoplasmic gray threshold is assigned to 255.
In step S208, the non-cell part of the merged image may be a pure black color, and the cell part may be an original color. The merging operation of the images is known per se to the person skilled in the art and will not be described in further detail here.
In step S210, contour searching cannot be performed directly on the original image, and the purpose of the merging is to make the contours stand out; the cell images can therefore be extracted by searching for contours in the merged image.
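As an illustration, the following sketch strings steps S202 to S210 together using OpenCV (with the optional median filtering of the embodiment below included); the function name, the median kernel size and the mask polarity are assumptions chosen for the example, not details taken from the original.

```python
import cv2

def extract_cell_images(bgr_image, cytoplasm_threshold):
    """Extract candidate cell images from a CB-method micronucleus
    microscopic image (steps S202-S210, plus the optional median filter)."""
    # RGB channel splitting; OpenCV stores images as B, G, R, and the
    # G-channel component map works best for Giemsa-stained slides.
    _, g_channel, _ = cv2.split(bgr_image)

    # Thresholding with the cytoplasm gray threshold. The mask polarity is
    # chosen here so that the (darker) stained cell regions become white.
    _, mask = cv2.threshold(g_channel, cytoplasm_threshold, 255,
                            cv2.THRESH_BINARY_INV)

    # Median filtering removes small speckle impurities.
    mask = cv2.medianBlur(mask, 5)

    # Merging: cell regions keep their original colour, the rest is black.
    merged = cv2.bitwise_and(bgr_image, bgr_image, mask=mask)

    # Contour search on the mask; each contour yields one candidate cell.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    cells = []
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        cells.append(merged[y:y + h, x:x + w])
    return cells, contours
```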
Fig. 3 is a flowchart of cell image extraction in an automatic identification method for CB-method micronucleus microscopic images based on a convolutional neural network according to another embodiment of the present application. As shown in Fig. 3, in this embodiment the step of extracting the cell image from the CB-method micronucleus microscopic image may specifically include the following steps:
Step S302, carrying out RGB channel splitting on the CB-method micronucleus microscopic image to obtain a channel component map of the CB-method micronucleus microscopic image.
Step S304, a double-threshold iterative algorithm is performed on the channel component map to determine a grayscale threshold of cytoplasm and a grayscale threshold of nucleus corresponding to the channel component map.
Step S306, carrying out thresholding processing on the channel component map according to the cytoplasm gray threshold value.
Step S308, performs median filtering processing on the image subjected to thresholding processing.
Step S310, merging the image subjected to thresholding and median filtering with the CB-method micronucleus microscopic image.
Step S312, extracting a cell image from the combined image by searching the contour, thereby extracting the cell image from the CB method micronucleus microscopic image.
In this embodiment, step S308 removes small impurity specks from the image using median filtering, which is well known to those skilled in the art and is not described in detail here; the remaining steps of this embodiment may refer to the embodiment described above and are likewise not repeated.
In some embodiments of the present application, the step of dividing the extracted cell image into a first single independent cell image and an adherent cell mass image may include: judging whether the morphological features of each extracted cell image meet preset conditions; if the preset conditions are met, classifying the corresponding cell image as a first single independent cell image; and if the preset conditions are not met, classifying the corresponding cell image as an adherent cell mass image.
Because the image of a single cell and the image of an adherent cell mass generally differ substantially in shape, the two can be effectively distinguished by the morphological features of each cell image.
The morphological characteristics may include an elongation and a defect rate, which may be determined by the following formula:
EI=min(d,w)/max(d,w);
DR=(circle-area)/area;
where EI is an elongation, d is a width of a minimum bounding rectangle of the corresponding cell image, w represents a length of the minimum bounding rectangle of the corresponding cell image, DR is a defect rate, area is an area of the corresponding cell image, and circle is an area of a minimum bounding circle of the corresponding cell image.
The value corresponding to the preset condition may be determined according to an experimental condition, for example, the preset condition may include: the elongation is more than 0.5 and less than 1.0, and the defect rate is more than 0 and less than 0.5. The larger the value of circle-area, the more the portion representing a defect. After a large number of parameter tests, when DR is selected to be 0 < DR < 0.5, the number of the obtained candidate cells is the largest, so that DR is selected to be 0-0.5.
Fig. 4 is a flowchart of dividing the extracted cell image into a first single independent cell image and an adherent cell mass image in the automatic identification method of the CB-based micronucleus microscopic image based on the convolutional neural network according to some embodiments of the present application, and as shown in fig. 4, the dividing the extracted cell image into the first single independent cell image and the adherent cell mass image may specifically include the following steps:
in step S402, the elongation EI and defect DR of the cells are calculated.
In step S404, it is determined whether EI is greater than 0.5 and less than 1.0 and DR is greater than 0 and less than 0.5. If yes, go to step S406, otherwise go to step S408.
Step S406, the corresponding cell image is divided into a first single independent cell image.
Step S408, the corresponding cell image is divided into an adherent cell mass image.
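The classification rule of steps S402 to S408 can be sketched as follows; the function name, the use of OpenCV contour utilities, and the idea of operating on contours rather than raw images are assumptions made for the example, while the default threshold ranges mirror the preset conditions above.

```python
import math
import cv2

def is_single_cell(contour, ei_range=(0.5, 1.0), dr_range=(0.0, 0.5)):
    """Return True if a cell contour is a single independent cell, False if
    it is treated as part of an adherent cell mass (steps S402-S408)."""
    # Elongation EI: short side / long side of the minimum bounding rectangle.
    (_, _), (w, d), _ = cv2.minAreaRect(contour)
    if max(w, d) == 0:
        return False
    ei = min(d, w) / max(d, w)

    # Defect rate DR: excess of the minimum enclosing circle over the cell area.
    area = cv2.contourArea(contour)
    if area == 0:
        return False
    (_, _), radius = cv2.minEnclosingCircle(contour)
    circle = math.pi * radius * radius
    dr = (circle - area) / area

    return ei_range[0] < ei < ei_range[1] and dr_range[0] < dr < dr_range[1]
```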
In some embodiments of the present application, the step of separating the adherent cell mass image into a plurality of second single independent cell images may comprise: performing an ultimate erosion operation on the adherent cell mass image to obtain the geometric center (i.e. the seed point) of the target area corresponding to each second single independent cell image; performing multiple dilation operations centered on the geometric centers until all the target areas are connected; and separating the adherent cell mass image into the plurality of second single independent cell images by taking the boundaries where the target areas meet as the separation lines between the second single independent cell images.
Since the extreme erosion operation, the dilation operation, the watershed algorithm, etc. are known per se to those skilled in the art, they will not be described herein.
Before the step of performing multiple dilation operations centered on the geometric centers until the target areas are connected, the method may further comprise: screening all the geometric centers to remove those whose size in pixels is smaller than a pixel-count threshold. That is, a geometric center that is screened out is not used as a center for the dilation operations in the subsequent calculation.
Since the shape and size of the cells in the image of the adherent cell mass are different and some impurity residues which are not completely eliminated exist, the phenomenon of excessive separation can occur by using the conventional watershed algorithm. Therefore, a threshold value is set for the size of the seed point, and the over-separation phenomenon is suppressed by removing the too small seed point.
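A compact sketch of this seed-based separation is shown below; the ultimate erosion keeps each region just before it disappears, undersized seeds are screened out, and a marker-controlled watershed stands in for the repeated dilation that grows the seeds back until they meet. The seed-size threshold and the use of watershed as the growth step are assumptions made for the example.

```python
import cv2
import numpy as np

def separate_adherent_mass(mask, min_seed_pixels=30):
    """Split a binary adherent-cell-mass mask into labelled single cells."""
    kernel = np.ones((3, 3), np.uint8)

    # Ultimate erosion: keep every connected region as it is about to vanish;
    # the surviving fragments act as the seed points (geometric centres).
    seeds = np.zeros_like(mask)
    current = mask.copy()
    while cv2.countNonZero(current) > 0:
        eroded = cv2.erode(current, kernel)
        n, labels = cv2.connectedComponents(current)
        for i in range(1, n):
            region = np.uint8(labels == i) * 255
            if cv2.countNonZero(cv2.bitwise_and(region, eroded)) == 0:
                seeds = cv2.bitwise_or(seeds, region)
        current = eroded

    # Screen out undersized seeds to suppress over-separation caused by
    # residual impurities.
    n, labels = cv2.connectedComponents(seeds)
    markers = np.zeros(mask.shape, np.int32)
    label = 0
    for i in range(1, n):
        if np.count_nonzero(labels == i) >= min_seed_pixels:
            label += 1
            markers[labels == i] = label

    # Grow the surviving seeds back inside the mask; the marker-controlled
    # watershed takes the place of the repeated dilation, and its ridge lines
    # (-1) act as the separation lines between the individual cells.
    markers[mask == 0] = label + 1                     # background marker
    markers = cv2.watershed(cv2.cvtColor(mask, cv2.COLOR_GRAY2BGR), markers)
    return markers
```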
The convolutional neural network may comprise a first convolutional layer, a first max-pooling layer, a second convolutional layer, a second max-pooling layer, a third convolutional layer, a fourth convolutional layer, a third max-pooling layer, a first fully connected layer, a second fully connected layer and a third fully connected layer, through which the image data is passed in sequence; after the image data passes through the third fully connected layer, the convolutional neural network model generates the reliability probability value of the category to which the image data belongs by using a softmax function.
In some embodiments, each neuron of the first fully connected layer and the second fully connected layer uses a dropout rate of 50% during training. A dropout rate of 50% produces the strongest random-dropout effect and therefore the best resistance to overfitting.
In particular, the trained convolutional neural network model can be used to determine the category of each obtained single independent cell. The designed convolutional neural network model contains 7 layers (4 convolutional layers and 3 fully connected layers). The first convolutional layer (Conv1) filters a 3 × 132 × 132 input image with 8 kernels of size 3 × 3. The second convolutional layer (Conv2) has 16 filters of size 3 × 3, and the third convolutional layer (Conv3) and the fourth convolutional layer (Conv4) have 32 and 48 filters of size 3 × 3, respectively. The first fully connected layer (FC1) has 40 neurons, each using a dropout rate of 50% during training. The second fully connected layer (FC2) has 20 neurons, each likewise using a dropout rate of 50% during training. The third fully connected layer (FC3) has 5 neurons. Max-pooling layers with 2 × 2 kernels are used after Conv1, Conv2 and Conv4. A rectified linear unit (ReLU) activation function is used after each convolutional and fully connected layer, and the softmax function finally generates a probability distribution over the five class labels. Because the area values of different cell images differ, all images are scaled, without stretching, to 132 × 132 before training (this size is close to the average area of the binuclear cell images). The convolutional neural network model was implemented using the Keras package of TensorFlow. The learning rate was initialized to 0.001 and reduced by a factor of 10 after half of the iterations; the momentum term was set to 0.9 and the weight decay to 0.0001. All other parameters use standard settings. After 60 iterations of training, taking about 40 hours on an NVIDIA GeForce GTX 1080 Ti GPU, an identification model with a high identification accuracy of 85% is obtained. The embodiments of the present application train the convolutional neural network model through repeated iterations on a large number of micronucleus images, thereby obtaining an identification model with a high identification rate and high identification accuracy, which can overcome the low identification rate (30-60%) of binuclear cells and micronuclei in existing micronucleus image analysis algorithms. After each image is identified by the convolutional neural network model, the category to which it belongs and the corresponding reliability probability value are obtained.
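A minimal Keras sketch of this 7-layer architecture follows; the channels-last input shape, the padding choice, the loss function and the optimizer configuration are assumptions made for the example, and the learning-rate schedule and weight decay described above are omitted for brevity.

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_model(num_classes=5):
    """4 convolutional layers + 3 fully connected layers, 132 x 132 inputs,
    50% dropout on FC1 and FC2, softmax over five classes."""
    inputs = keras.Input(shape=(132, 132, 3))
    x = layers.Conv2D(8, 3, activation="relu", padding="same")(inputs)   # Conv1
    x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(16, 3, activation="relu", padding="same")(x)       # Conv2
    x = layers.MaxPooling2D(2)(x)
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(x)       # Conv3
    x = layers.Conv2D(48, 3, activation="relu", padding="same")(x)       # Conv4
    x = layers.MaxPooling2D(2)(x)
    x = layers.Flatten()(x)
    x = layers.Dense(40, activation="relu")(x)                           # FC1
    x = layers.Dropout(0.5)(x)
    x = layers.Dense(20, activation="relu")(x)                           # FC2
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)         # FC3
    return keras.Model(inputs, outputs)

model = build_model()
model.compile(optimizer=keras.optimizers.SGD(learning_rate=0.001, momentum=0.9),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```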
The embodiments of the present application also provide an automatic identification system for CB-method micronucleus microscopic images based on a convolutional neural network; the automatic identification system comprises one or more processors and a memory, wherein the memory stores instructions executable by the one or more processors, and the instructions cause the automatic identification system to execute any one of the automatic identification methods described above.
The embodiments of the present application further provide an automatic analysis method for CB-method micronucleus microscopic images based on a convolutional neural network. Fig. 5 is a flowchart of the automatic analysis method for CB-method micronucleus microscopic images based on a convolutional neural network according to some embodiments of the present application; as shown in Fig. 5, the automatic analysis method may include:
step S502, acquiring the image of the binuclear cell containing the micronucleus identified by any one of the above automatic identification methods.
In step S504, a cell nucleus image and a micronucleus image in the identified binuclear cell image containing micronuclei are extracted.
Step S506, the extracted cell nucleus image and the micronucleus image are screened and judged to determine the micronucleus number of the binuclear cells containing the micronucleus.
In step S504, a K-means clustering algorithm may be used to extract a cell nucleus image and a micronucleus image in the identified binuclear cell image containing micronuclei. That is, the K-means clustering algorithm is used to separate cytoplasm from nucleus and micronucleus.
According to the automatic analysis method provided by the embodiments of the present application, the micronucleus-containing binuclear cell images are identified by a convolutional neural network model, the analysis of micronucleus-containing binuclear cells in CB-method micronucleus assays is automated, workers are freed from tedious slide-reading work, the analysis process is standardized and unified, and both the analysis speed and the identification rate for micronucleus-containing binuclear cell images are improved.
Because cell staining is never perfectly uniform, a blurred cytoplasm background is often produced, and a conventional threshold segmentation algorithm cannot extract the cell nucleus and micronuclei from the cytoplasm. In the embodiments of the present application, a K-means clustering segmentation algorithm based on the pixel characteristics of a large number of micronucleus-containing binuclear cell images can extract the cell nucleus and micronuclei effectively even from blurred cytoplasm, which helps to improve the micronucleus identification rate.
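The following sketch illustrates such a K-means separation on per-pixel color values; the cluster count, the choice of raw BGR features and the rule that the darkest cluster corresponds to the nucleus and micronuclei are assumptions for the example rather than parameters taken from the original.

```python
import cv2
import numpy as np

def extract_nucleus_mask(cell_bgr, k=3):
    """Cluster the pixels of a binuclear cell image with K-means and return a
    mask of the darkest cluster, taken to be the nucleus/micronucleus class."""
    pixels = cell_bgr.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(pixels, k, None, criteria, 5,
                                    cv2.KMEANS_PP_CENTERS)
    darkest = int(np.argmin(centers.sum(axis=1)))   # Giemsa nuclei stain darkest
    labels = labels.reshape(cell_bgr.shape[:2])
    return np.uint8(labels == darkest) * 255
```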
The step of screening and judging the extracted cell nucleus images and micronucleus images may include: sorting the area values ai of the extracted cell nucleus images and micronucleus images into an array from largest to smallest, where i = 0, 1, 2, ..., n and n is the total number of cell nucleus images and micronucleus images minus 1; and judging and screening each value in the array in turn: when 20 pixels < ai < 0.2 × Ai, the corresponding image is determined to be a micronucleus image; when ai ≤ 20 pixels, the corresponding image is screened out; and when ai ≥ 0.2 × Ai, the corresponding image is determined to be a cell nucleus image; where A0 = a0, Ai = A(i-1) when a(i-1) is judged to be a micronucleus image or is screened out, and Ai = (A(i-1) + a(i-1))/2 when a(i-1) is judged to be a cell nucleus image.
The number of micronuclei is counted according to the micronucleus-scoring criteria in the standard issued by the International Atomic Energy Agency. Fig. 6 is a flowchart of screening and judging the extracted cell nucleus images and micronucleus images in the automatic analysis method of CB-method micronucleus microscopic images based on a convolutional neural network according to some embodiments of the present application; as shown in Fig. 6, the specific steps of screening and judging may include:
step S602, the extracted cell nucleus images and the area values ai of the micronucleus images are sorted into arrays from big to small.
Step S604, judging whether 20 pixel points are less than Ai and less than 0.2 multiplied by Ai. If so, go to step S608, otherwise, go to step S606.
Step S606, judge whether Ai is not less than 0.2 × Ai. If yes, go to step S612, otherwise go to step S610.
In step S608, the number of microkernels is increased by 1.
In step S610, the corresponding image is screened out (indicating that the corresponding image is an impurity, and is neither a micronucleus nor a nucleus), so that a (i + 1) = (Ai + Ai)/2.
Step S612, determine whether all area values ai are traversed. If yes, go to step S616, otherwise go to step S614.
In step S614, i = i +1, and after step S614, the process returns to step S604, where the next area value is determined.
Step S616, counting the number of the microkernels. It is understood that the image of the cell nucleus and the like may be counted, so that various data such as the micronucleus rate of the cell, the micronucleus cell rate and the like may be obtained.
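The screening loop of Fig. 6 can be summarized in the short sketch below; the function name and the return of separate nucleus and micronucleus counts are assumptions for the example, while the running reference area follows the update rules given above.

```python
def count_micronuclei(areas, min_pixels=20, ratio=0.2):
    """Classify sorted region areas into nuclei, micronuclei and impurities,
    returning (nucleus_count, micronucleus_count)."""
    areas = sorted(areas, reverse=True)     # a0 >= a1 >= ... >= an
    if not areas:
        return 0, 0
    nuclei = micronuclei = 0
    reference = areas[0]                    # A0 = a0
    for a in areas:
        if a <= min_pixels:
            continue                        # impurity: screened out, A unchanged
        if a < ratio * reference:
            micronuclei += 1                # micronucleus image, A unchanged
        else:
            nuclei += 1                     # cell nucleus image
            reference = (reference + a) / 2 # Ai = (A(i-1) + a(i-1)) / 2
    return nuclei, micronuclei
```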
The embodiments of the present application also provide an automatic analysis system for CB-method micronucleus microscopic images based on a convolutional neural network, the automatic analysis system comprising: one or more processors; and a memory storing instructions executable by the one or more processors to cause the automatic analysis system to perform any of the automatic analysis methods described above.
It can be understood that the automatic identification and analysis methods and systems for CB-method micronucleus microscopic images based on a convolutional neural network provided by the embodiments of the present application can be used in fields such as the health examination of radiation workers, biological dose estimation for large populations exposed to nuclear radiation, and the detection of the potential genotoxicity and cytotoxicity of compounds. It is also to be understood that the automatic identification and analysis methods provided by the embodiments of the present application may be executed by one or more processors and that, in some embodiments, the micronucleus counts or the other data mentioned in this embodiment may also be obtained manually from the micronucleus-containing binuclear cell images identified by the automatic identification method and system for CB-method micronucleus microscopic images based on a convolutional neural network.
It should also be noted that, in the case of the embodiments of the present invention, features of the embodiments and examples may be combined with each other to obtain a new embodiment without conflict.
The above description is only an embodiment of the present invention, but the scope of the present invention is not limited thereto, and the scope of the present invention is subject to the scope of the claims.

Claims (17)

1. A CB method micro-nuclear microscopic image automatic identification method based on a convolutional neural network comprises the following steps:
acquiring a CB method microkernel microscopic image;
extracting a cell image from the CB method micronucleus microscopic image;
dividing the extracted cell image into a first single independent cell image and an adhesion cell mass image;
separating the adherent cell mass image into a plurality of second individual independent cell images;
inputting all of the first single independent cell images and all of the second single independent cell images into a convolutional neural network model to identify a binuclear cell image containing a micronucleus.
2. The automatic identification method according to claim 1, wherein the step of extracting a cell image from the CB-method micronucleus microscopic image comprises:
carrying out RGB channel splitting on the CB method micronucleus microscopic image to obtain a channel component image of the CB method micronucleus microscopic image;
performing a double-threshold iterative algorithm on the channel component map to determine a cytoplasm gray threshold and a nucleus gray threshold corresponding to the channel component map;
performing thresholding processing on the channel component image according to the gray threshold of the cytoplasm;
merging the image subjected to thresholding with the CB method micronucleus microscopic image;
and extracting the cell image from the combined image by searching the contour, thereby extracting the cell image from the CB method micronucleus microscopic image.
3. The automatic recognition method according to claim 2,
before the step of combining the thresholded image and the CB method micronucleus microscopic image, the method further comprises: carrying out median filtering processing on the thresholded image;
the step of combining the thresholded image and the CB method micronucleus microscopic image comprises: merging the image subjected to thresholding and median filtering with the CB method micronucleus microscopic image.
4. The automatic identification method of claim 2, wherein the step of performing a dual-threshold iterative algorithm on the channel component map comprises:
performing a double-threshold iterative algorithm on the channel component diagram according to the following iterative formula:
T_{i+1,1} = \frac{1}{2}\left[\frac{\sum_{k=0}^{T_{i,1}} k\,h_k}{\sum_{k=0}^{T_{i,1}} h_k} + \frac{\sum_{k=T_{i,1}+1}^{T_{i,2}} k\,h_k}{\sum_{k=T_{i,1}+1}^{T_{i,2}} h_k}\right]

T_{i+1,2} = \frac{1}{2}\left[\frac{\sum_{k=T_{i,1}+1}^{T_{i,2}} k\,h_k}{\sum_{k=T_{i,1}+1}^{T_{i,2}} h_k} + \frac{\sum_{k=T_{i,2}+1}^{255} k\,h_k}{\sum_{k=T_{i,2}+1}^{255} h_k}\right]
where h_k denotes the number of pixels with gray level k in the channel component map, T_{i+1,1} denotes the cytoplasm gray threshold after the i-th iteration, and T_{i+1,2} denotes the nucleus gray threshold after the i-th iteration; when T_{i+1,1} = T_{i,1}, T_{i+1,1} (or T_{i,1}) is determined as the cytoplasm gray threshold corresponding to the channel component map; and when T_{i+1,2} = T_{i,2}, T_{i+1,2} (or T_{i,2}) is determined as the nucleus gray threshold corresponding to the channel component map.
5. The automatic recognition method according to claim 2,
the CB method micronucleus microscopic image is an image formed by cells stained by a Giemsa dye, and the channel component diagram is a G channel component diagram.
6. The automatic recognition method according to claim 1, wherein the step of dividing the extracted cell image into a first single independent cell image and a stuck cell mass image comprises:
judging whether the morphological characteristics of each extracted cell image meet preset conditions or not;
if the preset conditions are met, dividing the corresponding cell image into the first single independent cell image;
and if the preset condition is not met, dividing the corresponding cell image into the adhesion cell mass image.
7. The automatic identification method according to claim 6, wherein the morphological features include an elongation and a defect rate, the elongation and defect rate being determined by the following formula:
EI=min(d,w)/max(d,w);
DR=(circle-area)/area;
wherein EI is an extension rate, d is a width of a corresponding minimum circumscribed rectangle of the cell image, w represents a length of the corresponding minimum circumscribed rectangle of the cell image, DR is a defect rate, area is an area of the corresponding cell image, and circle is an area of the corresponding minimum circumscribed circle of the cell image.
8. The automatic recognition method according to claim 7, wherein the preset condition includes:
the elongation is greater than 0.5 and less than 1.0, and the defect rate is greater than 0 and less than 0.5.
9. The automatic identification method of claim 1, wherein the step of separating the adherent cell mass image into a plurality of second individual independent cell images comprises:
performing an ultimate erosion operation on the adherent cell mass image so as to obtain the geometric center of the target area corresponding to each second single independent cell image;
performing multiple dilation operations with the geometric center as the center until all the target areas are connected;
and separating the adhered cell mass image into a plurality of second single independent cell images by taking the connected boundary of each target area as a separation line of the plurality of second single independent cell images.
10. The automatic identification method according to claim 9, wherein before the step of performing a plurality of dilation operations until each of the target regions is connected with the geometric center as a center, the method further comprises:
and screening all the geometric centers to screen out those geometric centers whose size in pixels is smaller than a pixel-count threshold.
11. The automatic recognition method according to claim 1,
the convolutional neural network comprises a first convolutional layer, a first max-pooling layer, a second convolutional layer, a second max-pooling layer, a third convolutional layer, a fourth convolutional layer, a third max-pooling layer, a first fully connected layer, a second fully connected layer and a third fully connected layer, through which the image data is transmitted in sequence; and after the image data passes through the third fully connected layer, the convolutional neural network model generates the reliability probability value of the category to which the image data belongs by using a softmax function.
12. The automatic recognition method according to claim 11,
each neuron of the first fully connected layer and the second fully connected layer uses a dropout rate of 50% during training.
13. A CB method micro-nuclear microscopic image automatic identification system based on a convolutional neural network, comprising:
one or more processors; and
a memory storing instructions executable by the one or more processors to cause the auto-id system to perform the auto-id method of any one of claims 1-12.
14. A CB method micro-nuclear microscopic image automatic analysis method based on a convolutional neural network comprises the following steps:
acquiring an image of a binuclear cell containing micronuclei identified by the automatic identification method according to any one of claims 1 to 12;
extracting a cell nucleus image and a micronucleus image in the identified binuclear cell image containing micronucleus;
and screening and judging the extracted cell nucleus image and the micronucleus image so as to determine the micronucleus number of the binuclear cells containing the micronuclei.
15. The automatic analysis method according to claim 14, wherein the step of screening and determining the extracted nuclear image and micronuclear image comprises:
sorting the area values ai of the extracted cell nucleus images and micronucleus images into an array from largest to smallest, wherein i = 0, 1, 2, ..., n, and n is the total number of cell nucleus images and micronucleus images minus 1;
sequentially judging and screening each value in the array: when 20 pixels < ai < 0.2 × Ai, determining the corresponding image to be a micronucleus image; when ai ≤ 20 pixels, screening out the corresponding image; and when ai ≥ 0.2 × Ai, determining the corresponding image to be a cell nucleus image; wherein,
A0 = a0, Ai = A(i-1) when a(i-1) is judged to be a micronucleus image or is screened out, and Ai = (A(i-1) + a(i-1))/2 when a(i-1) is judged to be a cell nucleus image.
16. The automatic analysis method according to claim 14, wherein the step of extracting the cell nucleus image and the micronucleus image in the identified binuclear cell image containing micronuclei includes:
and extracting a cell nucleus image and a micronucleus image in the identified binuclear cell image containing the micronucleus by using a K-means clustering algorithm.
17. An automatic analysis system of CB method micro-nuclear microscopic images based on a convolutional neural network comprises:
one or more processors; and
a memory storing instructions executable by the one or more processors to cause the automated analysis system to perform the automated analysis method of any of claims 14 to 16.
CN202111090307.1A 2021-09-17 2021-09-17 CB microkernel microscopic image identification and analysis method and system based on neural network Pending CN113537181A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111090307.1A CN113537181A (en) 2021-09-17 2021-09-17 CB microkernel microscopic image identification and analysis method and system based on neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111090307.1A CN113537181A (en) 2021-09-17 2021-09-17 CB microkernel microscopic image identification and analysis method and system based on neural network

Publications (1)

Publication Number Publication Date
CN113537181A true CN113537181A (en) 2021-10-22

Family

ID=78093335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111090307.1A Pending CN113537181A (en) 2021-09-17 2021-09-17 CB microkernel microscopic image identification and analysis method and system based on neural network

Country Status (1)

Country Link
CN (1) CN113537181A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101639941A (en) * 2009-01-13 2010-02-03 中国人民解放军军事医学科学院放射与辐射医学研究所 Method for extracting binuclear lymphocyte accurately and quickly in CB method micronucleated cell image
US8394592B2 (en) * 2006-06-12 2013-03-12 The Board Of Regents Of The University Of Texas System Methods for assessing cancer susceptibility to carcinogens in tobacco products
US20190362491A1 (en) * 2016-09-13 2019-11-28 Swansea University Computer-implemented apparatus and method for performing a genetic toxicity assay
CN110838126A (en) * 2019-10-30 2020-02-25 东莞太力生物工程有限公司 Cell image segmentation method, cell image segmentation device, computer equipment and storage medium
CN113240620A (en) * 2021-01-29 2021-08-10 西安理工大学 Highly adhesive and multi-size brain neuron automatic segmentation method based on point markers

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8394592B2 (en) * 2006-06-12 2013-03-12 The Board Of Regents Of The University Of Texas System Methods for assessing cancer susceptibility to carcinogens in tobacco products
CN101639941A (en) * 2009-01-13 2010-02-03 中国人民解放军军事医学科学院放射与辐射医学研究所 Method for extracting binuclear lymphocyte accurately and quickly in CB method micronucleated cell image
US20190362491A1 (en) * 2016-09-13 2019-11-28 Swansea University Computer-implemented apparatus and method for performing a genetic toxicity assay
CN110838126A (en) * 2019-10-30 2020-02-25 东莞太力生物工程有限公司 Cell image segmentation method, cell image segmentation device, computer equipment and storage medium
CN113240620A (en) * 2021-01-29 2021-08-10 西安理工大学 Highly adhesive and multi-size brain neuron automatic segmentation method based on point markers

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
JOHN W. WILLS et al.: "Inter-laboratory automation of the in vitro micronucleus assay using imaging flow cytometry and deep learning", BIORXIV *
XIANG SHEN et al.: "Rapid and Automatic Detection of Micronuclei in Binucleated Lymphocytes Image", RESEARCH SQUARE *
郭晓敏: "Research on particle counting methods based on microscopic images" (in Chinese), China Master's Theses Full-text Database, Information Science and Technology *
闫学昆: "Research on key technologies and system implementation for automatic analysis of CB-method micronucleus images" (in Chinese), China Doctoral Dissertations Full-text Database, Information Science and Technology *

Similar Documents

Publication Publication Date Title
US20230127698A1 (en) Automated stereology for determining tissue characteristics
US8605981B2 (en) Centromere detector and method for determining radiation exposure from chromosome abnormalities
CN111292305A (en) Improved YOLO-V3 metal processing surface defect detection method
CN111524137B (en) Cell identification counting method and device based on image identification and computer equipment
KR20190043135A (en) Systems and methods for classifying biological particles
CN113537182B (en) Automatic identification method and system for metaphase mitosis microscopic image of chromosome
WO2012109571A1 (en) Systems and methods for object identification
Travieso et al. Pollen classification based on contour features
CN112329664A (en) Method for evaluating prokaryotic quantity of prokaryotic embryo
CN112784767A (en) Cell example segmentation algorithm based on leukocyte microscopic image
CN110414317B (en) Full-automatic leukocyte classification counting method based on capsule network
Tehsin et al. Myeloma cell detection in bone marrow aspiration using microscopic images
US20210264130A1 (en) Method and apparatus for training a neural network classifier to classify an image depicting one or more objects of a biological sample
Shah et al. Automatic detection and classification of tuberculosis bacilli from ZN-stained sputum smear images using watershed segmentation
CN113435285A (en) Automatic analysis method and system for chromosome karyotype of hematological tumor
CN112613505A (en) Cell micronucleus identification, positioning and counting method based on deep learning
CN111862004A (en) Tumor cell phenotype identification and counting method based on cell fluorescence image
CN113537181A (en) CB microkernel microscopic image identification and analysis method and system based on neural network
Cosio et al. Automatic counting of immunocytochemically stained cells
CN112750118B (en) Novel method and system for identifying cell number in single cell pore plate sequencing based on automatic visual detection
CN112669288B (en) Cell target expression prediction method, system and device based on digital pathological image
CN114627308A (en) Extraction method and system of bone marrow cell morphological characteristics
Le et al. An automated framework for counting lymphocytes from microscopic images
Chitra et al. Detection of aml in blood microscopic images using local binary pattern and supervised classifier
Gim et al. A novel framework for white blood cell segmentation based on stepwise rules and morphological features

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination