CN110378313B - Cell cluster identification method and device and electronic equipment

Info

Publication number
CN110378313B
Authority
CN
China
Prior art keywords
image
cell
gray
scanning
cell mass
Prior art date
Legal status
Active
Application number
CN201910683872.5A
Other languages
Chinese (zh)
Other versions
CN110378313A (en
Inventor
孙明建
常江龙
郎彬
周超
马威
吴云松
Current Assignee
Jiuyisanluling Medical Technology Nanjing Co ltd
Original Assignee
Jiuyisanluling Medical Technology Nanjing Co ltd
Application filed by Jiuyisanluling Medical Technology Nanjing Co ltd
Priority to CN201910683872.5A
Publication of CN110378313A
Application granted
Publication of CN110378313B

Classifications

    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06T7/0012 Biomedical image inspection
    • G06V20/698 Matching; Classification (microscopic objects, e.g. biological cells or cellular parts)
    • G06T2207/10024 Color image
    • G06T2207/10056 Microscopic image
    • G06T2207/20024 Filtering details
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30024 Cell structures in vitro; Tissue sections in vitro

Abstract

The application provides a cell mass identification method and device and an electronic device. The cell mass identification method comprises the following steps: obtaining scanned images of cervical fluid-based cells to be detected at different scanning magnifications; obtaining a first grayscale image and a second grayscale image based on the scanned images, wherein the scanning magnification corresponding to the first grayscale image is a first magnification and the scanning magnification corresponding to the second grayscale image is a second magnification; obtaining a target binarization threshold based on the first grayscale image; processing the second grayscale image according to the target binarization threshold to obtain a target binary image; determining a candidate region according to the target binary image and the first grayscale image; and identifying the image with the highest scanning magnification among the scanned images according to the candidate region to obtain a cell mass identification result. This alleviates the prior-art difficulty of rapidly identifying cell masses.

Description

Cell cluster identification method and device and electronic equipment
Technical Field
The application relates to the field of medical cell image processing, in particular to a cell mass identification method and device and electronic equipment.
Background
With the rapid development of the medical industry, more and more effort has been invested in cervical cytology research. In practice, however, most studies focus on single cells, with less analysis of cell clusters.
For cell clusters in cervical fluid-based cell images, the causes of the various cell clusters are complex: some are real cell clusters, while others are false cell clusters caused by improper processing of the original data. Because of these complex causes, cell clusters are difficult to identify rapidly with prior-art methods.
Disclosure of Invention
An object of the embodiments of the present application is to provide a cell mass identification method and device and an electronic apparatus, so as to overcome the prior-art difficulty of quickly identifying the various cell masses in cervical fluid-based cells.
In a first aspect, an embodiment of the present application provides a cell mass identification method, including:
obtaining scanned images of cervical fluid-based cells to be detected at different scanning magnifications;
obtaining a first grayscale image and a second grayscale image based on the scanned images, wherein the scanning magnification corresponding to the first grayscale image is a first magnification, and the scanning magnification corresponding to the second grayscale image is a second magnification;
obtaining a target binarization threshold based on the first grayscale image;
processing the second grayscale image according to the target binarization threshold to obtain a target binary image;
determining a candidate region according to the target binary image and the first grayscale image;
and identifying the image with the highest scanning magnification among the scanned images according to the candidate region to obtain a cell mass identification result.
By this method, the image at the second scanning magnification can be processed with the target binarization threshold obtained at the first scanning magnification, yielding a target binary image whose regions are divided into 0s and 1s. The images at the two scanning magnifications are then combined to obtain a candidate region for the cervical fluid-based cells to be detected, and the cell mass is finally identified based on the candidate region. Since a highly reliable candidate region can be determined from the image color and the cell distribution alone before the identification stage, the identification difficulty is reduced and the processing efficiency of the whole identification process is improved.
With reference to the first aspect, in one possible design, the obtaining a first grayscale image based on a scanned image includes:
identifying impurity regions of the image at the first magnification in the scanned images; performing histogram statistics on the image of each identified impurity region to obtain a gray value corresponding to each impurity region; and processing the gray values of the impurity regions according to the median of all the gray values of the impurity regions to obtain the first grayscale image.
Through this implementation, deleting the impurity regions reduces their influence on the subsequent calculation of the target binarization threshold and improves the accuracy of the threshold.
With reference to the first aspect, in one possible design, the step of identifying an impurity region of an image at a first magnification in the scanned image includes:
grayscaling the image at the first magnification in the scanned images to obtain first grayscale data; performing binarization processing on the first grayscale data according to a set first threshold to obtain a first binarized image; and screening the first binarized image with a set pixel threshold to obtain the impurity regions.
Through this implementation, the impurity regions can be screened out of the image at the first magnification.
With reference to the first aspect, in one possible design, the step of obtaining a target binarization threshold based on the first grayscale image includes:
calculating the cell proportion of the cervical fluid-based cells to be detected according to the image at the third magnification in the scanned images, and calculating the cell density of the cervical fluid-based cells to be detected according to the image at the third magnification in the scanned images; and calculating the target binarization threshold according to the cell density, the cell proportion and the first grayscale image.
Through this implementation, the color difference between the cell nucleus and the surrounding parenchyma is fully considered, and the target binarization threshold calculated based on the cell density, the cell proportion and the first grayscale image is accurate, which improves the reliability of the candidate region.
With reference to the first aspect, in one possible design, the step of calculating the cell density of the cervical fluid-based cells to be detected according to the image at the third magnification in the scanned images includes:
obtaining a third grayscale image according to the image at a third magnification in the scanned images, wherein the third magnification is lower than the first magnification and the second magnification; obtaining a second binarized image according to the third grayscale image; detecting an envelope boundary of the second binarized image through a current detection rule; when an envelope boundary of the second binarized image is detected, updating the current cell density and returning to the step of detecting an envelope boundary of the second binarized image through the current detection rule; and when no envelope boundary of the second binarized image is detected, taking the current cell density as the cell density corresponding to the second binarized image.
Through this implementation, the overall distribution characteristics of the cells can be captured from the image at the lower scanning magnification, and the cell density that participates in the calculation of the target binarization threshold is thereby determined.
With reference to the first aspect, in a possible design, the step of processing the second grayscale image according to the target binarization threshold to obtain a target binary image includes:
performing preliminary filtering on the second grayscale image based on the cell density of the cervical fluid-based cells to be detected to obtain a preliminary filtered image; and performing binarization processing on the preliminary filtered image according to the target binarization threshold to obtain the target binary image.
Through this implementation, the preliminary filtering reduces interference caused by noise while retaining as many suspicious cell mass regions as possible, and a binary image with the non-clumped regions removed is obtained, which reduces the identification difficulty and improves the identification efficiency.
With reference to the first aspect, in one possible design, the cell mass recognition result includes a plurality of cell mass classes;
the plurality of cell pellet categories include: background, glandular cell mass, squamous epithelial cell mass, neutrophil aggregation, squamous epithelial cell overlap, fuzzy non-cell mass, fuzzy cell mass.
Therefore, true cell masses and false cell masses in the same section can be distinguished, and a plurality of specific cell mass types can be determined.
With reference to the first aspect, in a possible design, the step of identifying, according to the candidate region, an image with a highest scanning magnification in the scanned images to obtain a cell mass identification result includes:
inputting the image data set of the candidate region in the image with the highest scanning magnification among the scanned images into the multi-level convolutional layers of a neural network to obtain first convolution results corresponding to part of the convolutional layers; sending the first convolution results to the multi-level deconvolution layers of the neural network to obtain second convolution results; determining a confidence value group according to the first convolution results and the second convolution results; and determining the cell mass identification result according to each confidence value in the confidence value group.
Through this implementation, the neural network enables rapid identification; the confidence value group determined from the first and second convolution results fully fuses the shallow features of the multi-level convolutional layers with the deep features of the multi-level deconvolution layers, so the detected object can be identified more accurately.
In a second aspect, the present application provides a cell mass identification device, including:
the scanned image acquisition module is used for acquiring scanned images of the cervical fluid-based cells to be detected at different scanning magnifications;
the preprocessing module is used for obtaining a first grayscale image and a second grayscale image based on the scanned images, wherein the scanning magnification corresponding to the first grayscale image is a first magnification, and the scanning magnification corresponding to the second grayscale image is a second magnification;
the threshold calculation module is used for obtaining a target binarization threshold based on the first grayscale image;
the filtering module is used for processing the second grayscale image according to the target binarization threshold to obtain a target binary image;
the candidate region determination module is used for determining a candidate region according to the target binary image and the first grayscale image;
and the identification module is used for identifying the image with the highest scanning magnification among the scanned images according to the candidate region to obtain a cell mass identification result.
With this cell mass identification device, the cell mass identification method provided in the first aspect can be executed; the determined candidate region reduces the identification difficulty and improves the identification efficiency.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor, a memory storing machine-readable instructions executable by the processor, the machine-readable instructions, when executed by the processor, performing the steps of the method of the first aspect as previously described when the electronic device is run.
In a fourth aspect, the present application provides a readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps in the method according to the first aspect.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Fig. 2 is a flowchart of a cell mass identification method according to an embodiment of the present disclosure.
Fig. 3 is a partial flowchart of step S2 in a cell mass identification method according to an embodiment of the present application.
Fig. 4 is a schematic diagram of three scanned images provided in the embodiment of the present application.
Fig. 5 is a flowchart of step S3 in a cell mass identification method according to an embodiment of the present application.
Fig. 6 is a schematic diagram of a neural network according to an embodiment of the present disclosure.
Fig. 7 is a schematic functional block diagram of a cell mass identification device according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
The inventor has found that in the field of cervical fluid-based cytology, in view of the complex cause of various cell masses, it is difficult to quickly identify the true cell masses and the types thereof from images of cervical fluid-based cells doped with false cell masses or other impurities.
If the cervical glandular cell clusters are segmented first and the segmented cell clusters are then subjected to traditional feature extraction, feature mapping and feature fusion, the operation is difficult, and some real cell clusters are easily missed during segmentation and extraction.
The inventors have observed, from statistics over a large number of cervical fluid-based cytology images, that the nucleus color in a real cell mass is generally darker than the color of the surrounding parenchyma, where the parenchyma denotes the area outside the nucleus. The following embodiments are therefore proposed on this basis to alleviate the prior-art difficulty of rapidly identifying cell masses in cervical fluid-based cells.
Fig. 1 is a block diagram of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 may be used for executing the cell mass identification method in the embodiment of the present application, and the electronic device 100 includes a processor 101, a memory 102, a communication interface 103, a display unit 104, a bus 105, and the like. The processor 101, the memory 102, the communication interface 103, and the display unit 104 are electrically connected to each other directly or indirectly through the bus 105 to realize data transmission or interaction. The processor 101 is for executing executable modules, such as computer programs, stored in the memory 102.
The processor 101 may be an integrated circuit chip having signal processing capability. The Processor 101 may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; but may also be a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, discrete gate or transistor logic, discrete hardware components. The processor 101 may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. The method performed by the electronic device 100 defined by the processes disclosed in the embodiments of the present application can be applied to the processor 101, or implemented by the processor 101.
The Memory 102 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The memory 102 is used for storing programs and may, for example, store the cell mass identification apparatus 200. The cell mass identification device 200 includes at least one software function module which can be stored in the memory 102 in the form of software or firmware, or be fixed in the Operating System (OS) of the electronic device 100. Upon receiving an execution instruction, the processor 101 executes the program stored in the memory 102 to implement the cell mass identification method. Access to the memory 102 by the processor 101, and possibly other components, is under the control of a memory controller.
The bus 105 may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, or an EISA (Extended Industry Standard Architecture) bus. Only one bi-directional arrow is shown in fig. 1, but this does not mean there is only one bus or one type of bus.
The electronic device 100 implements wired or wireless communication connection with other external devices through at least one communication interface 103. For example, the electronic device 100 may receive scan data of a scanning device or image data sent by other external devices through the communication interface 103, for example, receive scan data sent by a scanner to obtain a scanned image.
The display unit 104 is used to display image data for user reference, and the displayed content may be some processing result of the processor 101. The display unit 104 may be a liquid crystal display or a touch display.
It will be understood by those of ordinary skill in the art that the structure shown in fig. 1 is merely exemplary and is not intended to limit the structure of the electronic device 100. Electronic device 100 may also have more or fewer components than shown in FIG. 1, or a different configuration than shown in FIG. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
In the embodiment of the present application, the electronic device 100 may be a device having an arithmetic processing capability, such as a server, a personal computer, or a mobile device.
Referring to fig. 2, fig. 2 is a flowchart illustrating a cell mass recognition method applied to the electronic device 100 shown in fig. 1 according to an embodiment of the disclosure. The specific process shown in fig. 2 will be described in detail below.
The cell mass identification method comprises a candidate region extraction stage and an identification stage. After the candidate region is determined in the candidate region extraction stage, the identification stage may be entered to identify the image data obtained according to the candidate region mapping, so as to obtain the cell mass identification result.
As shown in fig. 2, the cell mass identification method includes the steps of: S1-S6.
S1: and obtaining scanning images of the cervical fluid-based cells to be detected under different scanning magnifications.
S2: and obtaining a first gray level image and a second gray level image based on the scanned image, wherein the scanning multiplying power corresponding to the first gray level image is a first multiplying power, and the scanning multiplying power corresponding to the second gray level image is a second multiplying power.
S3: and obtaining a target binarization threshold value based on the first gray level image.
S4: and processing the second gray level image according to the target binary threshold value to obtain a target binary image.
S5: and determining a candidate region according to the target binary image and the first gray-scale image.
S6: and identifying the image with the highest scanning magnification in the scanned image according to the candidate area to obtain a cell mass identification result.
Wherein, S1-S5 are candidate region extraction stages, and S6 is a recognition stage.
For S1, the cervical fluid-based cell to be tested may include real cell clusters, false cell clusters, and impurities, which may be formed during the preparation process of the cervical fluid-based cell to be tested. For convenience of description, the cervical fluid-based cell to be tested is hereinafter referred to simply as the target to be tested.
For the same target to be detected, scanned images at different scanning magnifications can be obtained with microscopes of different scanning magnifications or with different eyepiece-objective combinations of the same microscope. For example, the scanning magnifications may include a first magnification, a second magnification, a third magnification and a fourth magnification, wherein the third magnification is lower than the first, second and fourth magnifications, and the first magnification may be lower than the second magnification.
In one example, the first magnification may be 2.5 times, the second magnification may be 5 times, the third magnification may be 0.15625 times, and the fourth magnification may be 20 times. In other examples, the scan magnification may be more or less, and the particular scan magnification values should not be construed as limiting the application.
For S2, it may include: a first gray scale image is obtained based on an image corresponding to a first magnification in the scanned image, and a second gray scale image is obtained based on an image corresponding to a second magnification in the scanned image.
For S3-S4, the target binarization threshold value can be determined according to the size of the first gray-scale image and the cell distribution condition in the target to be detected. And carrying out binarization processing on the second gray level image through the determined target binarization threshold value to obtain a target binary image. In the cell mass identification method in the embodiment of the application, the determined target binarization threshold has an important influence on the determination of the candidate region.
The target binary image may be regarded as an image from which non-clumped regions are removed, and a large number of real cell masses and a large number of suspicious cell masses remain in the target binary image. The non-clumped region may represent a null region, and the probability of the presence of a cell clump in the null region is extremely low. The suspicious cell mass may be an actual cell mass, and the suspicious cell mass may be identified through the identification step of S6.
For S5, a related region corresponding to the target binary image may be determined in the first grayscale image through a position mapping relationship between the target binary image and the first grayscale image, and a partial region may be screened from the related region as a candidate region.
For step S6, the position of the coordinates of the candidate region in the image at each scanning magnification may be obtained according to the position mapping relationship between the images at each scanning magnification and the determined candidate region, that is, the candidate region in the image at each scanning magnification is obtained. To improve the recognition accuracy, the image data of the candidate region in the image of the highest scan magnification may be taken as the original data set for the recognition stage.
The raw data set can be input into a neural network model for identification so as to obtain a cell mass identification result about the target to be detected.
It should be noted that the real cell mass type can be included in the final cell mass recognition result, i.e. the differentiation between real and false cell masses is realized.
In one embodiment, the cell mass recognition result may include a plurality of cell mass types and may also include a cell mass location.
The above-mentioned S1-S6 can distinguish various cell masses in the same target.
By this cell mass identification method, the scanned images of the same cervical fluid-based cell target at different scanning magnifications are processed: a first grayscale image and a target binarization threshold are obtained based on the scanned image at the first magnification; a second grayscale image, obtained from the scanned image at the second magnification, is processed with the target binarization threshold to obtain a target binary image; a candidate region is then determined in the first grayscale image through the association between the target binary image at the second magnification and the first grayscale image at the first magnification; and the image at the selected scanning magnification is identified through the mapping relationship between the candidate region and the images at the other scanning magnifications to obtain the cell mass identification result. In this method, each grayscale processing and binarization processing takes into account that the nucleus color is darker than that of the surrounding parenchyma, so the cell mass can be identified quickly while as many suspicious cell masses as possible are retained.
Because the grayscale processing and binarization processing are based on color, darker areas can be retained as much as possible, preserving suspicious cell mass regions and preventing real cell mass regions from being missed. Compared with directly identifying the original cervical fluid-based cell scanned image, this reduces the identification difficulty and improves the identification efficiency. Compared with identification based on the image features of various cell masses after complex feature extraction and fusion, the cell mass identification method provided in the embodiments of the present application requires no complex feature extraction before the identification stage and can identify cell masses quickly.
Optionally, in order to improve the calculation accuracy of the target binarization threshold and prevent darker impurity regions from affecting its calculation, the impurity regions in the scanned image at the first magnification may be processed first to obtain the first grayscale image. As shown in fig. 3, the process of obtaining the first grayscale image in S2 may include steps S21-S23.
S21: an impurity region of an image at a first magnification in a scanned image is identified.
S22: and performing histogram statistics on the image of each identified impurity region to obtain a gray value corresponding to each impurity region.
S23: and processing the gray values of the impurity regions according to the median of all the gray values of the impurity regions to obtain a first gray image.
Wherein the impurities may be formed during the process of making the slices. As shown in fig. 4, three sub-graphs of (i), (ii), and (iii) are schematic diagrams of an impurity-free region, a region with more impurities, and a region with less impurities, respectively, and a plurality of black arc-shaped thick lines around the slice corresponding to the sub-graph (ii) and the sub-graph (iii) represent impurities. It should be noted that the three sub-graphs in fig. 4 are only schematic, and the three sub-graphs correspond to images of different cervical cells to be detected.
Since the black impurity region is conspicuous, the impurity processing can be performed using an image at the first magnification where the scanning magnification is not high.
By performing histogram statistics on the identified impurity regions and processing them based on the median of all the statistical gray values, the impurity regions can be deleted and a first grayscale image free of impurity regions is obtained; this prevents the overly dark impurity colors from affecting the subsequent calculation of the target binarization threshold.
As an embodiment, S21 may specifically include the following sub-steps: S211-S213.
S211: graying an image at a first magnification in the scanned image to obtain first grayed data.
S212: and carrying out binarization processing on the first grayed data according to a set first threshold value to obtain a first binarized image.
The first threshold may be set by a person skilled in the art according to actual needs, for example, the first threshold may be 65, 70, 75, and in the embodiment of the present application, the first threshold is set to be 70.
S213: and screening the first binary image by using a set pixel threshold value to obtain an impurity region.
The set pixel threshold may include a first pixel threshold and a second pixel threshold. The first pixel threshold may be greater than the second pixel threshold. The first pixel threshold may be calculated by the following first expression:
[Formula image in the original publication: the first expression defines the first pixel threshold S in terms of w0 and h0.]
S denotes the first pixel threshold, and w0 and h0 denote the width and height of the image at the first magnification, respectively. The second pixel threshold may be set according to actual needs; for example, it may be 2000, 2500 or 3000.
As one implementation, a region of the first binarized image whose pixel count is greater than 2500 and smaller than the first pixel threshold S may be taken as an impurity region, and the impurity regions obtained in this way may be presented in rectangular-frame form.
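As an illustration, the following Python/OpenCV sketch shows one possible reading of S211-S213. It is a sketch rather than the patented implementation: the connected-component screening is an assumption about how the rectangular frames are obtained, and first_pixel_threshold stands in for S, whose formula image is not reproduced in the text.

```python
import cv2

def find_impurity_regions(image_first_mag, first_pixel_threshold, second_pixel_threshold=2500):
    """Sketch of S211-S213: screen dark impurity regions at the first magnification.

    first_pixel_threshold stands in for S (its formula image is not reproduced);
    2500 is the second pixel threshold named in the text.
    """
    # S211: grayscale the image at the first magnification
    gray = cv2.cvtColor(image_first_mag, cv2.COLOR_BGR2GRAY)
    # S212: binarize with the set first threshold (70 in the embodiment);
    # impurities are dark, so keep pixels below the threshold (assumed polarity)
    _, binary = cv2.threshold(gray, 70, 255, cv2.THRESH_BINARY_INV)
    # S213: keep connected regions whose pixel count lies between the two thresholds
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    boxes = []
    for i in range(1, n):  # label 0 is the background
        if second_pixel_threshold < stats[i, cv2.CC_STAT_AREA] < first_pixel_threshold:
            x, y, w, h = stats[i, :4]
            boxes.append((x, y, w, h))  # rectangular-frame form of an impurity region
    return gray, boxes
```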
After the impurity regions are obtained, for S22, histogram statistics may be performed on the image in each rectangular-frame impurity region over the 256 gray levels from 0 to 255, giving the pixel count at each gray level within the same impurity region. According to the histogram statistics, the pixel counts are accumulated from the lowest gray level to the highest until the ratio of the accumulated sum to the total pixel count reaches a first percentage; the highest gray level included in that accumulated sum is taken as the gray value of the rectangular frame. The gray value corresponding to each impurity region can be obtained in this way. The first percentage is illustrated as 90% in this example; in other embodiments, the first percentage may be a set value such as 85%, 88%, 92% or 95%.
After the gray values of the plurality of impurity regions are obtained, in S23, the median of all these gray values may be computed, and the gray values of all impurity regions involved in the statistics may be reset to this median to obtain the first grayscale image.
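Continuing the sketch, S22-S23 can be read as follows; the 90% first percentage comes from the text, while the cumulative-histogram details are assumptions.

```python
import numpy as np

def suppress_impurities(gray, boxes, first_percentage=0.9):
    """Sketch of S22-S23: per-region gray value via a cumulative histogram,
    then reset every impurity region to the median of those gray values."""
    gray = gray.copy()
    region_values = []
    for x, y, w, h in boxes:
        patch = gray[y:y + h, x:x + w]
        hist = np.bincount(patch.ravel(), minlength=256)
        # lowest gray level at which the cumulative pixel count reaches 90%
        cum = np.cumsum(hist)
        region_values.append(int(np.searchsorted(cum, first_percentage * patch.size)))
    if region_values:
        median = int(np.median(region_values))
        for x, y, w, h in boxes:
            gray[y:y + h, x:x + w] = median
    return gray  # first grayscale image with impurity regions suppressed
```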
Optionally, as shown in fig. 5, the S3 in the cell mass identification method may specifically include the following sub-steps: S31-S33.
S31: and calculating the cell ratio of the cervical liquid-based cells to be detected according to the images at the third time in the scanning images.
S32: and calculating the cell density of the cervical liquid-based cells to be detected according to the images at the third time in the scanning images.
S33: and calculating a target binary threshold value according to the cell density, the cell ratio and the first gray level image.
Wherein, S31, S32 can be executed simultaneously. For example, the cell fraction can be determined in the course of determining the cell density.
For S33, as an embodiment, histogram statistics may be performed on the first grayscale image obtained in step S2. The statistical range includes 256 gray levels of 0-255, wherein the gray level 0 corresponds to the pixel value A0The gray level 1 corresponds to the pixel value A1By analogy, the gray level 255 corresponds to the pixel value A255. Accumulating the pixel values from the pixel value corresponding to the low gray level to the pixel value corresponding to the high gray level according to the gray level until the accumulated sum of the pixel values is larger than the cell nucleus pixel quantity, recording the maximum gray level corresponding to the accumulated sum larger than the cell nucleus pixel quantity as the current gray level GR, and continuing to accumulate until the accumulated sum is larger than the cell nucleus pixel quantity, wherein the gray level corresponding to the cell nucleus pixel quantity just larger than the cell nucleus pixel quantity is used as the current gray level GR, so that the updating of the current gray level GR is realized. Wherein, the current gray scale GR is the target binarization threshold value.
The nucleus pixel amount is calculated based on the cell density and the cell proportion from sub-steps S31 and S32, and can be calculated according to the following second expression:
[Formula image in the original publication: the second expression defines N in terms of w0, h0, P and X.]
In the second expression, N denotes the nucleus pixel amount to be found, w0 and h0 denote the width and height of the image at the first magnification, respectively (for example, the width and height of the first grayscale image), P denotes the cell proportion, and X denotes a parameter related to the cell density level. When the density level is 0, 1, 2, 3, 4 or 5, the corresponding X may be 100, 70, 45, 30, 25 or 12, respectively.
In S31-S33, when the target binarization threshold is calculated based on the cell proportion, the cell density and the first grayscale image, the participation of the cell proportion and the cell density allows the color corresponding to the nucleus pixel amount to be fully considered. The target binarization threshold determined from the nucleus pixel amount and the impurity-free first grayscale image can be used in the subsequent data filtering process, which facilitates the determination of a more reasonable candidate region.
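For illustration, a hedged sketch of S33 follows. The second expression's formula image is not reproduced in the text, so the form N = w0 × h0 × P / X used below is an assumption chosen to be consistent with the surrounding description.

```python
import numpy as np

DENSITY_TO_X = {0: 100, 1: 70, 2: 45, 3: 30, 4: 25, 5: 12}  # X per density level

def target_threshold(first_gray, cell_ratio, density_level):
    """Sketch of S33: choose GR so that the accumulated darkest pixels
    just exceed the nucleus pixel amount N."""
    h0, w0 = first_gray.shape[:2]
    # assumed form of the second expression (formula image not reproduced)
    nucleus_pixels = w0 * h0 * cell_ratio / DENSITY_TO_X[density_level]
    hist = np.bincount(first_gray.ravel(), minlength=256)
    cum = np.cumsum(hist)
    # first gray level whose cumulative count exceeds N is the threshold GR
    return int(np.searchsorted(cum, nucleus_pixels, side='right'))
```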
Alternatively, the value of the cell density may be one of a plurality of density levels. In one example, there may be 6 density levels; if cell density levels 0-5 represent increasing density, density level 0 may represent an extremely sparse cell distribution, density level 1 a relatively sparse distribution, and so on, with density level 4 representing a dense distribution and density level 5 an extremely dense distribution.
In order to calculate the cell density, the step S32 may specifically include the following sub-steps: S321-S325.
S321: and obtaining a third gray image according to the image at a third multiplying factor in the scanned image, wherein the third multiplying factor is lower than the first multiplying factor and the second multiplying factor.
S322: and obtaining a second binary image according to the third gray level image.
Taking the third magnification of 0.15625 times as an example, the scanned image of 0.15625 times can be grayed out to obtain a third grayscale image. The width and height of the third gray scale image may be within the pixel range of 250-500 pixels.
In order to accurately calculate the cell density, the width and the height of the third grayscale image may be normalized to obtain a size-normalized third grayscale image, where the size includes the width and the height. For example, it may be normalized to a pixel range of 300-400 pixels. If the size of the third grayscale image is a set uniform size, the normalization step can be omitted.
After the third gray level image with uniform size is obtained, a Gaussian filtering mode can be adopted to reduce noise, and an Otsu algorithm is adopted to calculate a binarization threshold value for calculating cell density. Note that this binarization threshold value for calculating the cell density is not the aforementioned target binarization threshold value. And performing binarization processing on the third gray level image according to the binarization threshold value for calculating the cell density to obtain a second binarization image. After the second binarized image is obtained, S323 is performed to determine a value of the cell density.
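A minimal sketch of S321-S322 under stated assumptions: the normalized size of 352 × 352 is a hypothetical value inside the 300-400 pixel range, and the inverted Otsu polarity (cells darker than background) is an assumption.

```python
import cv2

def second_binarized_image(image_third_mag, target_size=(352, 352)):
    """Sketch of S321-S322: normalize size, denoise, Otsu-binarize."""
    gray = cv2.cvtColor(image_third_mag, cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, target_size)      # size normalization
    gray = cv2.GaussianBlur(gray, (5, 5), 0)  # Gaussian noise reduction
    # Otsu threshold used only for density estimation, not the target threshold
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return binary
```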
S323: the envelope boundary of the second binarized image is detected by the current detection rule, and step S324 or S325 is performed.
S324: when the envelope boundary of the second binary image is detected, the current cell density is obtained by updating, and the step S323 is skipped to continue to execute.
S325: and when the envelope boundary of the second binary image is not detected, taking the current cell density as the cell density corresponding to the second binary image.
With respect to the detection rule in S323, each current detection rule includes: and processing the current second binary image to obtain a new second binary image, carrying out edge detection on the new second binary image by adopting a Canny detection algorithm, and detecting the envelope boundary of the new second binary image by adopting Hough transform.
In one embodiment, the current second binarized image is expanded, eroded and expanded according to the current iteration coefficient in the current detection rule to obtain a new second binarized image.
In one example, four sets of iteration coefficients are used to determine the lower density level cell density. The four sets of iteration coefficients are: [11, 14, 3], [7, 10, 3], [3, 6, 3], [0, 3, 3], the size of the kernel used was 3 × 3, and each set of iterative coefficients was used for the dilation-erosion-dilation operation on the second binarized image. 0 in the iteration coefficient means that no corresponding operation is performed, e.g. [0, 3, 3] this iteration coefficient means that only the erosion-dilation operation is performed.
A complete iterative detection process will be described below with reference to the four sets of iterative coefficients.
Under the initial condition the default cell density is 0. A dilation-erosion-dilation operation is performed with [11, 14, 3], edge detection is then performed with the Canny detection algorithm, and the envelope boundary of the second binarized image is detected with the Hough transform. If no envelope boundary is detected by the Hough transform, the current density level remains unchanged at 0. If an envelope boundary is detected, the density level is increased by 1 to update the current density level, a dilation-erosion-dilation operation is performed with the next set of iteration coefficients [7, 10, 3], edge detection is performed, and the envelope boundary is again detected by the Hough transform, updating the current density level whenever an envelope boundary is detected; iteration stops once no envelope boundary is detected. The envelope boundary may be a circular contour, as shown in the three sub-graphs in fig. 4.
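The iterative detection loop can be sketched as follows. Only the iteration coefficients, the 3 × 3 kernel and the default values come from the text; the Canny and Hough parameters are hypothetical, and the cell-proportion update assumes the circle-area form of the third expression given further below.

```python
import cv2
import numpy as np

ITER_COEFFS = [(11, 14, 3), (7, 10, 3), (3, 6, 3), (0, 3, 3)]

def estimate_density(binary):
    """Sketch of S323-S325: raise the density level while an envelope
    boundary is still detected; stop as soon as none is found."""
    kernel = np.ones((3, 3), np.uint8)
    h1, w1 = binary.shape[:2]
    density, cell_ratio = 0, 0.9  # defaults when nothing is detected
    current = binary
    for d1, e, d2 in ITER_COEFFS:  # one detection rule per coefficient set
        img = current.copy()
        if d1:
            img = cv2.dilate(img, kernel, iterations=d1)
        if e:
            img = cv2.erode(img, kernel, iterations=e)
        if d2:
            img = cv2.dilate(img, kernel, iterations=d2)
        edges = cv2.Canny(img, 50, 150)  # hypothetical Canny thresholds
        circles = cv2.HoughCircles(edges, cv2.HOUGH_GRADIENT, dp=1,
                                   minDist=min(w1, h1) // 4, param1=100,
                                   param2=30, minRadius=min(w1, h1) // 8,
                                   maxRadius=min(w1, h1) // 2)
        if circles is None:   # no envelope boundary: stop iterating (S325)
            break
        density += 1          # envelope found: update the density (S324)
        r = float(circles[0][0][2])
        cell_ratio = np.pi * r * r / (w1 * h1)  # assumed third-expression form
        current = img         # assumed: the processed image carries forward
    return density, cell_ratio
```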
With the foregoing example, the 5 cell density levels labeled 0-4 can be determined. Since the level labeled 5 corresponds to an extremely high density, an image with a cell density of 5 can be confirmed by checking whether the density level corresponding to the second binarized image satisfies the following two conditions: (1) the current density level obtained by updating through the detection rules is 4; (2) the target binarization threshold GR calculated with the density level at 4 falls within a preset interval.
As an embodiment, when the detection rules determine the density level to be 4, the nucleus pixel amount N is calculated according to the parameter X associated with density level 4, a temporary GR value is obtained based on N, and this temporary GR value is examined; when it lies in the interval from 30 (inclusive) to 70 (exclusive), the current density level is updated to 5, i.e. the cell density value is 5.
After the cell density value is determined to be 5, a new cell nucleus pixel quantity N can be calculated according to the parameter X associated with the density level of 5, and a new GR value is obtained based on the new cell nucleus pixel quantity N and serves as a new target binarization threshold.
The cell proportion P can be calculated each time an envelope boundary is detected with the Hough transform. If the cell density is 0, the cell proportion may take a default value, for example 0.9 or 0.8. When the cell density value is not 0, the cell proportion P can be calculated by the following third expression.
The third expression is:
P = π × r² / (w1 × h1)
In the third expression, r denotes the radius of the detected envelope boundary, e.g. the radius of the detected circular contour, and w1 and h1 denote the width and height of the image at the third magnification, respectively, e.g. the width and height of the second binarized image.
The cell density determined in this way has high accuracy, and the cell proportion is highly reliable; the nucleus pixel amount calculated from them is more reasonable, so a highly reliable target binarization threshold can be obtained.
Alternatively, in order to avoid missing part of the cell mass regions, the target binarization threshold GR obtained above may be fine-tuned.
As an implementation of this fine-tuning, the target binarization threshold GR may be adjusted through the following strategy:
when GR is less than or equal to 35 and the density level is 0, 1 or 2, the density level is raised by two levels and S33 is executed again to calculate a new target binarization threshold; when GR is less than or equal to 35 and the density level is 3, the level is raised by one and S33 is executed again;
when GR is greater than 35 and less than 70 and the density level is 0, the density level is raised by two levels and S33 is executed again; when GR is greater than 35 and less than 70 and the density level is 1, 2, 3 or 4, the level is raised by one and S33 is executed again.
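A sketch of this fine-tuning strategy, reusing the hypothetical target_threshold and DENSITY_TO_X helpers from the earlier sketch:

```python
def fine_tune(gr, density, first_gray, cell_ratio):
    """Sketch of the GR fine-tuning rules; re-running S33 is delegated to
    the target_threshold sketch above."""
    if gr <= 35 and density in (0, 1, 2):
        density += 2
    elif gr <= 35 and density == 3:
        density += 1
    elif 35 < gr < 70 and density == 0:
        density += 2
    elif 35 < gr < 70 and density in (1, 2, 3, 4):
        density += 1
    else:
        return gr, density  # no adjustment needed
    return target_threshold(first_gray, cell_ratio, density), density
```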
After the final target binarization threshold is determined, S4 is executed.
Alternatively, for the above S4, the method may include the sub-steps of: S41-S42.
S41: and performing primary filtering treatment on the second gray scale image based on the cell density of the cervical fluid-based cells to be detected to obtain a primary filtering image.
Wherein, the primary filtering treatment may include: expansion operation, corrosion operation, Gaussian filtering and bilateral filtering. The parameters used in the specific implementation of the primary filtration treatment may be different for different cell densities.
S42: and carrying out binarization processing on the preliminary filtering image according to a target binarization threshold value to obtain a target binary image.
The pixel value in the preliminary filtered image may be compared with a target binarization threshold GR, and the preliminary filtered image is binarized based on the comparison result, so as to obtain a target binary image containing 0 and 1. S42 may be regarded as a re-filtering process for the preliminary filtered image, which can remove non-clustered regions in the de-noised image.
By the method, the images with different cell densities can be denoised respectively through a primary filtering processing mode, non-agglomerated areas such as non-cell aggregates, single cells and fine impurities can be removed through binarization processing of the primary filtering image, the non-agglomerated areas can be deleted under the condition that more suspicious cell aggregate areas are reserved as far as possible, and identification interference caused by the non-agglomerated areas is avoided.
Optionally, for the above S41, in one example, different primary filtering processing modes are set for the three types of cell density grades. In the three primary filtering processing modes, parameters such as the diameter of the adopted filtering field, the standard deviation of the spatial Gaussian function, the similarity standard deviation of the gray value and the like are different.
In the first class, for images with a high density level, for example density levels 2, 3 and 4, the second grayscale image undergoes two iterations of the dilation-erosion operation, with a 3 × 3 kernel in each iteration; the next dilation-erosion operation is performed after the first one finishes. Smoothing and denoising are then performed by Gaussian filtering with a 5 × 5 kernel, and residual fine impurities are filtered out by bilateral filtering.
In the second class, for images with a very sparse density, for example density level 1, the second grayscale image undergoes three iterations of the dilation-erosion operation, with Gaussian filtering and bilateral filtering performed after the three iterations finish.
In the third class, for extremely sparse images, for example density level 0, the second grayscale image undergoes three dilation iterations, followed by Gaussian filtering and bilateral filtering. For this third class of images, the target binary image B obtained after the preliminary filtering and binarization is treated as a temporary binary image Bt; after three iterations of the erosion operation on Bt, the new target binary image B is obtained.
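A sketch of the three preliminary-filtering classes and the subsequent binarization (S41-S42). The 3 × 3 kernel, the 5 × 5 Gaussian kernel and the density groupings come from the text; the bilateral-filter parameters (d = 9, sigmas = 75) are hypothetical.

```python
import cv2
import numpy as np

def preliminary_filter(second_gray, density):
    """Sketch of S41: per-density-class dilation/erosion, then Gaussian
    and bilateral filtering."""
    kernel = np.ones((3, 3), np.uint8)
    img = second_gray.copy()
    if density >= 2:      # first class: two dilation-erosion iterations
        for _ in range(2):
            img = cv2.erode(cv2.dilate(img, kernel), kernel)
    elif density == 1:    # second class: three dilation-erosion iterations
        for _ in range(3):
            img = cv2.erode(cv2.dilate(img, kernel), kernel)
    else:                 # third class: three dilations only
        img = cv2.dilate(img, kernel, iterations=3)
    img = cv2.GaussianBlur(img, (5, 5), 0)
    return cv2.bilateralFilter(img, 9, 75, 75)  # hypothetical parameters

def target_binary(filtered, gr, density):
    """Sketch of S42: binarize at GR (dark pixels become candidate clumps);
    for the third class, erode the temporary binary image B_t three times."""
    binary = (filtered < gr).astype(np.uint8)
    if density == 0:
        binary = cv2.erode(binary, np.ones((3, 3), np.uint8), iterations=3)
    return binary
```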
Setting different preliminary filtering modes for different cell density levels allows each density level to be processed efficiently so that the non-clumped regions are removed quickly. Compared with a uniform processing mode, this avoids wasting processing time on images of some density levels and helps improve the identification efficiency.
Optionally, after the target binary image B is obtained, regions in it that satisfy any of the following conditions may be removed: (1) both the width and the height of the region are below 14 pixels; (2) the width is below 8 pixels or the height is below 8 pixels; (3) the ratio of width to height, or its inverse, exceeds 15. In this way invalid regions, such as tiny regions with too few pixels and regions with severely unbalanced aspect ratios, can be eliminated while the clumped regions are retained.
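These pruning conditions can be sketched as a connected-component filter:

```python
import cv2

def prune_regions(binary):
    """Sketch of the pruning of the target binary image B."""
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    out = binary.copy()
    for i in range(1, n):
        w = stats[i, cv2.CC_STAT_WIDTH]
        h = stats[i, cv2.CC_STAT_HEIGHT]
        ratio = w / h if h else float('inf')
        if (w < 14 and h < 14) or w < 8 or h < 8 or ratio > 15 or ratio < 1 / 15:
            out[labels == i] = 0  # drop tiny or severely elongated regions
    return out
```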
For convenience of description, the image after this pruning is still referred to as the target binary image B.
After the final target binary image is obtained, S5 is executed. There may be a plurality of irregular regions in the final target binary image B; these are the regions in B that are set to 1.
Alternatively, as an implementation of S5, the target binary image may undergo three iterations of the erosion operation. For the eroded target binary image, according to the position of each region in the image and the position mapping relationship between the images at the different scanning magnifications, the associated image Mi of each region in the target binary image is found in the first grayscale image. The average gray value of each associated image Mi is then calculated; after all the average gray values of the associated images Mi within the same first grayscale image are sorted, a specified number of associated images Mi are selected from the sorted result as candidate regions. The specified number can be set according to actual needs, for example 800, 1000, 1500 or 2000; the example here uses a specified number of 1000.
If the actual number of associated images Mi is less than the specified number, all the associated images Mi are taken as candidate regions.
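A sketch of this implementation of S5 under stated assumptions: scale is a hypothetical coordinate factor between the second-magnification binary image and the first grayscale image, and ranking the regions darkest-first is an assumption consistent with the nucleus-darker observation.

```python
import cv2
import numpy as np

def select_candidates(target_binary_img, first_gray, scale=2.0, specified=1000):
    """Sketch of S5: erode three times, map each region into the first
    grayscale image, rank by mean gray value, keep up to 1000 regions."""
    eroded = cv2.erode(target_binary_img, np.ones((3, 3), np.uint8), iterations=3)
    n, _, stats, _ = cv2.connectedComponentsWithStats(eroded, connectivity=8)
    scored = []
    for i in range(1, n):
        x, y, w, h = (stats[i, :4] / scale).astype(int)  # map to first magnification
        patch = first_gray[y:y + h, x:x + w]             # associated image M_i
        if patch.size:
            scored.append((patch.mean(), (x, y, w, h)))
    scored.sort(key=lambda t: t[0])  # darker (lower mean gray) regions first
    return [box for _, box in scored[:specified]]
```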
After the candidate region is determined, S6 is performed.
For S6, since the scanning magnification of the image in which the candidate region was determined in S5 is low, image data matching the candidate region can be mapped, through the coordinate mapping relationship between the images at the various scanning magnifications, from the image at the selected scanning magnification among the scanned images and used as the original data set for the identification stage. The selected scanning magnification may be the highest scanning magnification of the scanned images; in one example, it is 20 times.
In order to improve the identification accuracy and speed, the original data set can be input into a pre-established neural network model for identification so as to obtain a cell mass identification result.
The cell mass recognition result may include a plurality of cell mass categories including: background, normal glandular cell mass, normal squamous epithelial cell mass, abnormal glandular cell mass, abnormal squamous epithelial cell mass, neutrophil aggregation, impurities, squamous epithelial cell overlap, mucus thread, fuzzy acellular mass, fuzzy cell mass.
The cell mass recognition result may further include a cell mass location for identifying coordinates of the recognized cell mass.
In order to give the images in the original data set a uniform size for input to the neural network model, they may be size-processed before being input. For example, with 512 × 512 as the fixed input size, any image side smaller than 512 pixels is expanded, and a sliding window is used over any side larger than 512 pixels; the sliding step may be 128 pixels.
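A sketch of this fixed-size preparation; the white padding and simplified edge handling are assumptions, while the 512 × 512 size and 128-pixel stride come from the text.

```python
import cv2

def tile_512(patch, size=512, stride=128):
    """Sketch: pad sides below 512 pixels, slide a 512x512 window with
    stride 128 over larger sides."""
    h, w = patch.shape[:2]
    padded = cv2.copyMakeBorder(patch, 0, max(0, size - h), 0, max(0, size - w),
                                cv2.BORDER_CONSTANT, value=(255, 255, 255))
    tiles = []
    for y in range(0, padded.shape[0] - size + 1, stride):
        for x in range(0, padded.shape[1] - size + 1, stride):
            tiles.append(padded[y:y + size, x:x + size])
    return tiles
```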
By inputting the original data set into the convolutional neural network model, the predicted coordinates of the detection boxes and the cell mass categories can be obtained. In practice, elements in the same image may be marked by boxes of different sizes; to avoid repeatedly identifying the same element, non-maximum suppression can be used to remove duplicate detection boxes in the same area and obtain the final detection result.
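The non-maximum suppression referenced here can be sketched in the standard way; the IoU threshold of 0.5 is a hypothetical value.

```python
import numpy as np

def nms(boxes, scores, iou_thresh=0.5):
    """Sketch: keep the highest-scoring box, drop boxes overlapping it by
    more than iou_thresh, and repeat. boxes are (x1, y1, x2, y2) rows."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size:
        i = order[0]
        keep.append(int(i))
        rest = order[1:]
        xx1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        yy1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        xx2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        yy2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        order = rest[inter / (area_i + areas - inter) <= iou_thresh]
    return keep
```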
In one example, the convolutional neural network occupies 3 GB of video memory, one graphics card can run three processes simultaneously, and 55 images can be tested per second, so the identification stage needs less than 20 seconds to test the original data set corresponding to the candidate regions of one slice. From step S1 to the completion of step S6, the total running time is about 40 seconds; the running time is short, and the identification is efficient and reliable.
In the embodiment of the present application, the convolutional neural network architecture is shown in fig. 6. The arrows in fig. 6 may represent the data flow in the neural network.
After the raw data set is prepared, the raw data set is first fed into the multi-level convolutional layers in the base network for a series of convolution operations.
The multi-level convolution layers in the base network include Conv1_1, Conv1_2, Conv2_1, Conv2_2, Conv3_1, Conv3_2, Conv3_3, Conv4_1, Conv4_2, Conv4_3, Conv5_1, Conv5_2 and Conv5_3. Each convolution layer in the base network uses a 3 × 3 kernel with stride 1 and padding 1; the layers differ only in their number of channels. The number of channels is 32 for Conv1_1 and Conv1_2, 64 for Conv2_1 and Conv2_2, 128 for Conv3_1, Conv3_2 and Conv3_3, 256 for Conv4_1, Conv4_2 and Conv4_3, and 256 for Conv5_1, Conv5_2 and Conv5_3.
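For illustration (not part of the original disclosure), the base-network specification above can be written down directly in PyTorch; the ReLU activations and the absence of pooling between stages are assumptions, since this passage fixes only the kernel shape and the channel counts:

    import torch.nn as nn

    def conv3x3(cin, cout):
        # Every base-network conv: 3x3 kernel, stride 1, padding 1,
        # followed by ReLU (the activation is an assumption).
        return nn.Sequential(nn.Conv2d(cin, cout, 3, stride=1, padding=1),
                             nn.ReLU(inplace=True))

    base_network = nn.Sequential(
        conv3x3(3, 32),    conv3x3(32, 32),                       # Conv1_1, Conv1_2
        conv3x3(32, 64),   conv3x3(64, 64),                       # Conv2_1, Conv2_2
        conv3x3(64, 128),  conv3x3(128, 128), conv3x3(128, 128),  # Conv3_1..Conv3_3
        conv3x3(128, 256), conv3x3(256, 256), conv3x3(256, 256),  # Conv4_1..Conv4_3
        conv3x3(256, 256), conv3x3(256, 256), conv3x3(256, 256),  # Conv5_1..Conv5_3
    )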
The data convolved by the base network is fed into the test network. The test network has a special pooling layer, multi-level convolution layers and multi-level deconvolution layers.
The multi-level convolution layers in the test network include Conv5_4, Conv5_5, Conv6_1, Conv6_2, Conv7_1, Conv7_2, Conv8_1, Conv8_2, Conv9_1, Conv9_2, Conv10_1 and Conv10_2. The special pooling layer may be disposed between any two of the multi-level convolution layers of the test network.
The multi-level deconvolution layers in the test network include Deconv9_1, Deconv9_2, Deconv8_1, Deconv8_2, Deconv7_1, Deconv7_2, Deconv6_1, Deconv6_2, Deconv5_4, Deconv5_5, Deconv4_3_1 and Deconv4_3_2.
After the image data set at the highest scanning magnification in the scanned images corresponding to the candidate regions is input into the multi-level convolution layers of the neural network, first convolution results corresponding to part of the convolution layers are obtained. The partial convolution layers refer to those layers from Conv4_3 through Conv10_2 that correspond to subsequent deconvolution layers.
And sending the first convolution result to a multi-stage deconvolution layer in the neural network to obtain a second convolution result, wherein the second convolution result is obtained after deconvolution operation is carried out on the deconvolution layer.
After the second convolution result is obtained from the first convolution result, the confidence value group may be determined from the first and second convolution results. For example, the outputs of some convolution layers and deconvolution layers are concatenated to cascade their feature maps, and the cascaded multi-layer outputs are then combined to obtain the confidence value group.
And determining the cell mass recognition result according to each confidence value in the confidence value group.
If there are 11 classifications, a confidence array containing at least 11 elements may be obtained. Among the confidence elements within the same confidence value group, the category corresponding to the element with the highest confidence value may be taken as the cell mass category of the object under test. Each confidence value group may correspond to one detection box, which identifies the location of the measured cell mass.
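A sketch of decoding one confidence value group under the 11 categories listed earlier (the category order in the list is an assumption):

    import numpy as np

    CLASSES = [
        "background", "normal glandular cell mass",
        "normal squamous epithelial cell mass", "abnormal glandular cell mass",
        "abnormal squamous epithelial cell mass", "neutrophil aggregation",
        "impurities", "squamous epithelial cell overlap", "mucus thread",
        "fuzzy non-cell mass", "fuzzy cell mass",
    ]

    def decode(confidences):
        """Pick the category with the highest confidence from one
        confidence value group (one group per detection box)."""
        confidences = np.asarray(confidences)
        idx = int(confidences.argmax())
        return CLASSES[idx], float(confidences[idx])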
The following describes the special pooling layers, convolution layers, and deconvolution layers in the test network.
The special pooling layer uses max pooling with a 3 × 3 kernel, stride 1 and padding 1. The feature map obtained after the base network and the special pooling layer can be sent to Conv5_4, which is a dilated convolution layer with a 3 × 3 kernel, padding 6, dilation rate 6 and 512 channels. The data output by Conv5_4 is then sent to Conv5_5, which uses a 1 × 1 kernel and 512 channels.
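For illustration, the special pooling layer and Conv5_4/Conv5_5 as specified can be checked for shape consistency in PyTorch (the 64 × 64 input size is an assumed example; the specification preserves spatial size at every step):

    import torch
    import torch.nn as nn

    # Special pooling: max pool, 3x3 kernel, stride 1, padding 1 (size kept),
    # followed by the dilated Conv5_4 and the 1x1 Conv5_5 described above.
    special_pool = nn.MaxPool2d(kernel_size=3, stride=1, padding=1)
    conv5_4 = nn.Conv2d(256, 512, kernel_size=3, padding=6, dilation=6)
    conv5_5 = nn.Conv2d(512, 512, kernel_size=1)

    x = torch.randn(1, 256, 64, 64)        # assumed Conv5_3 output
    y = conv5_5(conv5_4(special_pool(x)))  # spatial size preserved
    print(y.shape)                         # torch.Size([1, 512, 64, 64])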
Convolution operations of convolution layers Conv6_1 and Conv6_2 are the same as those of subsequent Conv7_1, Conv7_2, Conv8_1, Conv8_2, Conv9_1 and Conv9_2, and are described by taking Conv6_1 and Conv6_2 as examples. The convolution kernel size of Conv6_1 is 1 × 1, the number of channels is 64, the convolution kernel size of Conv6_2 is 3 × 3, the step size is 2, the padding is 1, and the number of channels is 128.
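Since Conv7_1 through Conv9_2 follow the same pattern, the pair can be sketched as one reusable block (the ReLU placement is an assumption; the text fixes only kernel sizes, stride, padding and channel counts):

    import torch.nn as nn

    def down_block(cin):
        """The repeated test-network pattern: a 1x1 bottleneck (64 channels)
        followed by a 3x3, stride-2, padding-1 conv (128 channels), which
        halves the spatial size."""
        return nn.Sequential(
            nn.Conv2d(cin, 64, kernel_size=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )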
The convolution layer Conv10_1 uses convolution kernels of 1 × 1 size and 128 channels. The convolution kernel size of Conv10_2 is 4 × 4, step size is 1, padding is 1, and the number of channels is 256.
After the convolution operation of Conv10_2, the feature map size is 1 × 1. The feature map is then sent to Deconv9_1 for a deconvolution operation with a 3 × 3 kernel, stride 2 and 128 channels, which brings the feature map size to 2 × 2. The feature maps of Deconv9_1 and Conv9_2 are then concatenated, followed by the convolution operation of Deconv9_2 with a 3 × 3 kernel, stride 1, padding 1 and 128 channels. The processing of Deconv9_1 and Deconv9_2 is the same as that of the subsequent Deconv8_1, Deconv8_2, Deconv7_1, Deconv7_2, Deconv6_1 and Deconv6_2, and differs from Deconv5_4, Deconv5_5, Deconv4_3_1 and Deconv4_3_2 only in the number of output channels: Deconv5_4 and Deconv5_5 have 512 channels, and Deconv4_3_1 and Deconv4_3_2 have 256 channels.
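A shape-level sketch of the Deconv9_1 upsampling and the Deconv9_1/Conv9_2 concatenation (the padding and output_padding values are chosen here to realize the stated 1 × 1 to 2 × 2 size change; the text fixes only the kernel, stride and channel counts):

    import torch
    import torch.nn as nn

    # Upsample Conv10_2's 1x1 map to 2x2, then concatenate with Conv9_2's
    # feature map and fuse with the stride-1 Deconv9_2.
    deconv9_1 = nn.ConvTranspose2d(256, 128, 3, stride=2,
                                   padding=1, output_padding=1)
    deconv9_2 = nn.ConvTranspose2d(128 + 128, 128, 3, stride=1, padding=1)

    conv10_2_out = torch.randn(1, 256, 1, 1)  # assumed shapes for illustration
    conv9_2_out = torch.randn(1, 128, 2, 2)

    up = deconv9_1(conv10_2_out)              # -> (1, 128, 2, 2)
    fused = deconv9_2(torch.cat([up, conv9_2_out], dim=1))  # -> (1, 128, 2, 2)
    print(up.shape, fused.shape)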
The feature maps corresponding to Conv10_2, Deconv9_2, Deconv8_2, Deconv7_2, Deconv6_2, Deconv5_5 and Deconv4_3_2 all output detection box coordinates and category confidence values. The detection boxes with confidence values higher than 0.3 are aggregated, highly overlapping boxes are then removed by non-maximum suppression, and the remaining detection boxes and their categories are taken as the recognition result.
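A sketch of this final aggregation step, assuming torchvision's nms is available (the IoU threshold is an assumption; the text fixes only the 0.3 confidence cutoff, and per-class suppression is omitted for brevity):

    import torch
    from torchvision.ops import nms

    def merge_detections(per_scale_outputs, conf_thr=0.3, iou_thr=0.5):
        """Gather detections from all output feature maps, keep those above
        the confidence cutoff, and suppress highly overlapping boxes.
        per_scale_outputs: list of (boxes [N,4], scores [N], labels [N])."""
        boxes = torch.cat([b for b, _, _ in per_scale_outputs])
        scores = torch.cat([s for _, s, _ in per_scale_outputs])
        labels = torch.cat([l for _, _, l in per_scale_outputs])
        keep = scores > conf_thr
        boxes, scores, labels = boxes[keep], scores[keep], labels[keep]
        kept = nms(boxes, scores, iou_thr)
        return boxes[kept], scores[kept], labels[kept]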
Through this structural design of the convolutional neural network, shallow and deep features can be fully fused, so that the position and category of the measured object can be predicted more accurately. Here the features of Conv4_3 and Conv5_3 through Conv10_1 are considered shallow features, and the features of Conv10_2 and Deconv9_1 through Deconv4_3_2 are considered deep features. Because the convolutional neural network outputs detection box coordinates and category confidence values on feature maps at multiple scales, the detection boxes can better cover measured objects of various sizes, and the detection of small targets is notably improved.
Referring to fig. 7, a functional block diagram of a cell mass recognition device 200 according to an embodiment of the present disclosure is shown. The cell mass recognition device 200 may be stored in the memory 102 of the electronic device 100, and the cell mass recognition device 200 is configured to perform each step and any implementation of the cell mass recognition method.
As shown in fig. 7, the cell mass recognition apparatus 200 includes a scan image acquisition module 201, a preprocessing module 202, a threshold calculation module 203, a filtering module 204, a candidate region determination module 205, and a recognition module 206.
And a scanned image obtaining module 201, configured to obtain scanned images of the cervical fluid-based cells to be detected at different scanning magnifications.
The preprocessing module 202 is configured to obtain a first grayscale image and a second grayscale image based on a scanned image, where a scanning magnification corresponding to the first grayscale image is a first magnification, and a scanning magnification corresponding to the second grayscale image is a second magnification.
And a threshold calculation module 203, configured to obtain a target binarization threshold based on the first grayscale image.
And the filtering module 204 is configured to process the second grayscale image according to the target binarization threshold to obtain a target binary image.
A candidate region determining module 205, configured to determine a candidate region according to the target binary image and the first grayscale image.
And the identification module 206 is configured to identify the image with the highest scanning magnification in the scanned image according to the candidate region, so as to obtain a cell mass identification result.
The cell mass recognition apparatus 200 can perform the cell mass recognition method, and thus can rapidly recognize a cell mass.
Optionally, the preprocessing module 202 may be further configured to perform steps S21-S23 in the cell mass identification method.
Optionally, the preprocessing module 202 may also be used to perform substeps S211-S213 of the cell mass identification method described above.
Optionally, the threshold calculation module 203 may be further configured to perform steps S31-S33 in the cell mass identification method.
Optionally, the threshold calculation module 203 may also be configured to perform substeps S321-S325 of the cell mass identification method described above.
Optionally, the filtering module 204 may be specifically configured to perform steps S41-S42 in the cell mass identification method.
For other details of the cell mass recognition device 200 provided in the embodiment of the present application, please further refer to the related description of the cell mass recognition method, which is not repeated herein.
In addition to the above embodiments, the present application further provides a readable storage medium, on which a computer program is stored, and the computer program is executed by the processor 101 to perform the steps in the cell mass identification method.
In summary, with the cell mass identification method, the cell mass identification device, and the electronic device 100 provided in the embodiments of the present application, before recognition by the neural network, a target binarization threshold is calculated by processing scanned images at different magnifications. The calculation takes into account that cell nuclei are darker than the surrounding cytoplasm, so a highly reliable target binarization threshold is determined from the cell density, the cell proportion parameter and the grayscale image. The target binary image obtained from this threshold removes noise and non-clustered regions while retaining suspicious cell mass regions as far as possible. The candidate regions determined from the target binary image are therefore highly reliable and do not need complicated feature extraction and fusion before entering the recognition stage, which reduces recognition difficulty, improves the extraction efficiency of candidate regions, and accelerates the whole recognition process. Combined with the convolutional neural network provided in the embodiments of the present application to recognize the output of the candidate region extraction stage, the processing time of the whole recognition process can be shortened, and the recognition result is accurate and reliable.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of modules is only one division of logical functions, and there may be other divisions in actual implementation, and for example, multiple units or components may be combined or may be integrated, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
It should be noted that the functions, if implemented in the form of software functional modules and sold or used as independent products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device to perform all or part of the steps of the method of the embodiments of the present application. And the aforementioned storage medium includes: various media that can store program codes, such as a U disk, a removable hard disk, a memory, a magnetic disk, or an optical disk.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
The above embodiments are merely examples of the present application and are not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (9)

1. A method for identifying a cell mass, the method comprising:
obtaining scanning images of cervical fluid-based cells to be detected under different scanning multiplying powers;
obtaining a first gray level image and a second gray level image based on the scanned image, wherein the scanning multiplying power corresponding to the first gray level image is a first multiplying power, and the scanning multiplying power corresponding to the second gray level image is a second multiplying power;
calculating the cell proportion of the cervical fluid-based cells to be detected according to the image at a third magnification in the scanned images, and calculating the cell density of the cervical fluid-based cells to be detected according to the image at the third magnification in the scanned images; calculating a target binarization threshold value according to the cell density, the cell proportion and the first gray level image; the third magnification being lower than the first magnification and the second magnification;
processing the second gray level image according to the target binarization threshold value to obtain a target binary image;
determining a candidate region according to the target binary image and the first gray-scale image;
and identifying the image with the highest scanning magnification in the scanned image according to the candidate area to obtain a cell mass identification result.
2. The method of claim 1, wherein said deriving a first grayscale image based on the scanned image comprises:
identifying an impurity region of an image at a first magnification in the scanned image;
performing histogram statistics on the identified image of each impurity region to obtain a gray value corresponding to each impurity region;
and processing the gray values of the impurity regions according to the median of all the gray values of the impurity regions to obtain a first gray image.
3. The method of claim 2, wherein the step of identifying an impurity region of an image at a first magnification in the scanned image comprises:
graying an image under a first multiplying power in the scanned image to obtain first grayed data;
performing binarization processing on the first grayscale data according to a set first threshold value to obtain a first binarized image;
and screening the first binary image by using a set pixel threshold value to obtain an impurity region.
4. The method of claim 1, wherein the step of calculating the cell density of the cervical fluid-based cells to be tested from the image at the third magnification in the scanned images comprises:
obtaining a third gray image according to the image at the third magnification in the scanned images;
obtaining a second binary image according to the third gray level image;
detecting an envelope boundary of the second binary image through a current detection rule;
when the envelope boundary of the second binary image is detected, updating to obtain the current cell density, and skipping to the step of detecting the envelope boundary of the second binary image through the current detection rule to continue to execute;
and when the envelope boundary of the second binary image is not detected, taking the current cell density as the cell density corresponding to the second binary image.
5. The method according to claim 1, wherein the step of processing the second gray scale image according to the target binarization threshold to obtain a target binary image comprises:
performing primary filtering treatment on the second gray level image based on the cell density of the cervical fluid-based cells to be detected to obtain a primary filtering image;
and carrying out binarization processing on the preliminary filtering image according to the target binarization threshold value to obtain a target binary image.
6. The method of claim 1, wherein the cell mass recognition result comprises a plurality of cell mass categories;
the plurality of cell pellet categories include: background, glandular cell mass, squamous epithelial cell mass, neutrophil aggregation, squamous epithelial cell overlap, fuzzy non-cell mass, fuzzy cell mass.
7. The method according to claim 1, wherein the step of identifying the image with the highest scanning magnification in the scanned image according to the candidate region to obtain the cell mass identification result comprises:
inputting the image data set with the highest scanning magnification in the scanning image corresponding to the candidate area into the multi-level convolution layer in the neural network to obtain a first convolution result corresponding to part of convolution layers;
sending the first convolution result to a multi-stage deconvolution layer in a neural network to obtain a second convolution result;
determining a confidence value set according to the first convolution result and the second convolution result;
and determining the cell mass recognition result according to each confidence value in the confidence value group.
8. A cell mass identification device, the device comprising:
the scanning image acquisition module is used for acquiring scanning images of the cervical fluid-based cells to be detected under different scanning multiplying powers;
the preprocessing module is used for obtaining a first gray image and a second gray image based on the scanned image, wherein the scanning multiplying power corresponding to the first gray image is a first multiplying power, and the scanning multiplying power corresponding to the second gray image is a second multiplying power;
a threshold calculation module, configured to calculate a cell proportion of the cervical fluid-based cells to be detected according to the image at a third magnification in the scanned images, calculate a cell density of the cervical fluid-based cells to be detected according to the image at the third magnification in the scanned images, and calculate a target binarization threshold value according to the cell density, the cell proportion and the first gray level image; the third magnification being lower than the first magnification and the second magnification;
the filtering module is used for processing the second gray level image according to the target binarization threshold value to obtain a target binary image;
a candidate region determining module, configured to determine a candidate region according to the target binary image and the first grayscale image;
and the identification module is used for identifying the image with the highest scanning magnification in the scanned image according to the candidate region to obtain a cell mass identification result.
9. An electronic device, characterized in that the electronic device comprises: a processor, a memory storing machine-readable instructions executable by the processor, the machine-readable instructions when executed by the processor performing the steps of the method of any of claims 1-7 when the electronic device is run.
CN201910683872.5A 2019-07-26 2019-07-26 Cell cluster identification method and device and electronic equipment Active CN110378313B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910683872.5A CN110378313B (en) 2019-07-26 2019-07-26 Cell cluster identification method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910683872.5A CN110378313B (en) 2019-07-26 2019-07-26 Cell cluster identification method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN110378313A CN110378313A (en) 2019-10-25
CN110378313B true CN110378313B (en) 2021-05-18

Family

ID=68256501

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910683872.5A Active CN110378313B (en) 2019-07-26 2019-07-26 Cell cluster identification method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN110378313B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111462099B (en) * 2020-04-05 2024-01-23 中国人民解放军总医院 Image cell area positioning method based on rapid integral graph monitoring
CN112102277A (en) * 2020-09-10 2020-12-18 深圳市森盈生物科技有限公司 Device and method for detecting tumor cells in pleural fluid fluorescence image
CN112378837B (en) * 2020-09-15 2021-12-28 深圳市华中生物药械有限公司 Cervical exfoliated cell detection method and related device
CN112149561B (en) * 2020-09-23 2024-04-16 杭州睿琪软件有限公司 Image processing method and device, electronic equipment and storage medium
CN113936021B (en) * 2021-09-18 2022-07-05 广东石油化工学院 Soft aggregation molecule identification method and system based on SEM image
CN117705786A (en) * 2022-09-07 2024-03-15 上海睿钰生物科技有限公司 Automatic analysis method and system for cell monoclonal origins
CN115165710B (en) * 2022-09-08 2022-11-29 珠海圣美生物诊断技术有限公司 Rapid scanning method and device for cells
CN116580041A (en) * 2023-05-30 2023-08-11 山东第一医科大学附属眼科研究所(山东省眼科研究所、山东第一医科大学附属青岛眼科医院) Corneal endothelial cell boundary segmentation method and device based on voronoi diagram
CN117218139A (en) * 2023-09-12 2023-12-12 珠海横琴圣澳云智科技有限公司 Method and device for determining cell density of sample

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106033540A (en) * 2016-05-27 2016-10-19 北京大学第医院 Automatic analyzing method and system for vaginal microecological morphology
CN106709914A (en) * 2017-01-05 2017-05-24 北方工业大学 SAR image ship detection false alarm eliminating method based on two-stage DEM sea-land reservoir
CN108564567A (en) * 2018-03-15 2018-09-21 中山大学 A kind of ultrahigh resolution pathological image cancerous region method for visualizing
CN108846829A (en) * 2018-05-23 2018-11-20 平安科技(深圳)有限公司 Diseased region recognition methods and device, computer installation and readable storage medium storing program for executing
CN109190567A (en) * 2018-09-10 2019-01-11 哈尔滨理工大学 Abnormal cervical cells automatic testing method based on depth convolutional neural networks
EP3427183A1 (en) * 2016-03-10 2019-01-16 Genomic Vision Method of curvilinear signal detection and analysis and associated platform
CN109461495A (en) * 2018-11-01 2019-03-12 腾讯科技(深圳)有限公司 A kind of recognition methods of medical image, model training method and server

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2469843A1 (en) * 2001-06-04 2002-12-12 Ikonisys Inc. Method for detecting infectious agents using computer controlled automated image analysis

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3427183A1 (en) * 2016-03-10 2019-01-16 Genomic Vision Method of curvilinear signal detection and analysis and associated platform
CN106033540A (en) * 2016-05-27 2016-10-19 北京大学第医院 Automatic analyzing method and system for vaginal microecological morphology
CN106709914A (en) * 2017-01-05 2017-05-24 北方工业大学 SAR image ship detection false alarm eliminating method based on two-stage DEM sea-land reservoir
CN108564567A (en) * 2018-03-15 2018-09-21 中山大学 A kind of ultrahigh resolution pathological image cancerous region method for visualizing
CN108846829A (en) * 2018-05-23 2018-11-20 平安科技(深圳)有限公司 Diseased region recognition methods and device, computer installation and readable storage medium storing program for executing
CN109190567A (en) * 2018-09-10 2019-01-11 哈尔滨理工大学 Abnormal cervical cells automatic testing method based on depth convolutional neural networks
CN109461495A (en) * 2018-11-01 2019-03-12 腾讯科技(深圳)有限公司 A kind of recognition methods of medical image, model training method and server

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Cell nucleus segmentation for renal clear cell carcinoma based on deep convolutional neural networks; Lu Haoda et al.; Journal of Biomedical Engineering Research (生物医学工程研究); 2017-12-15; pp. 340-345 *

Also Published As

Publication number Publication date
CN110378313A (en) 2019-10-25

Similar Documents

Publication Publication Date Title
CN110378313B (en) Cell cluster identification method and device and electronic equipment
CN110060237B (en) Fault detection method, device, equipment and system
CN115861135B (en) Image enhancement and recognition method applied to panoramic detection of box body
CN109978839B (en) Method for detecting wafer low-texture defects
CN110619618A (en) Surface defect detection method and device and electronic equipment
WO2021139258A1 (en) Image recognition based cell recognition and counting method and apparatus, and computer device
CN111986183B (en) Chromosome scattered image automatic segmentation and identification system and device
CN116664559B (en) Machine vision-based memory bank damage rapid detection method
CN113962976B (en) Quality evaluation method for pathological slide digital image
JPWO2019026104A1 (en) Information processing apparatus, information processing program, and information processing method
CN111369523B (en) Method, system, equipment and medium for detecting cell stack in microscopic image
CN110648330B (en) Defect detection method for camera glass
CN114240978B (en) Cell edge segmentation method and device based on adaptive morphology
CN114495098B (en) Diaxing algae cell statistical method and system based on microscope image
CN116740728B (en) Dynamic acquisition method and system for wafer code reader
CN115375629A (en) Method for detecting line defect and extracting defect information in LCD screen
CN114972202A (en) Ki67 pathological cell rapid detection and counting method based on lightweight neural network
CN112991259B (en) Method and system for detecting defects of semiconductor manufacturing process
CN113689431A (en) Industrial product appearance defect detection method and device
CN109949245B (en) Cross laser detection positioning method and device, storage medium and computer equipment
WO2021139447A1 (en) Abnormal cervical cell detection apparatus and method
CN113963004A (en) Sampling method and device and electronic equipment
CN114155399A (en) Breast pathology whole-section classification method based on multi-feature fusion progressive discrimination
CN112069885A (en) Face attribute identification method and device and mobile terminal
CN116703927B (en) Cell counting method, device and storage medium based on bright field optical image processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: Room 601-605, 6th Floor, Building J, Building 5, Yunmi City, No. 19, Ningshuang Road, Yuhuatai District, Nanjing, Jiangsu Province, 210000

Patentee after: Jiuyisanluling medical technology Nanjing Co.,Ltd.

Address before: Room 305, building A3, Nanhai Science Park, 180 software Avenue, Yuhuatai District, Nanjing City, Jiangsu Province, 210000

Patentee before: Jiuyisanluling medical technology Nanjing Co.,Ltd.