CN112435259B - Cell distribution model construction and cell counting method based on single sample learning - Google Patents


Info

Publication number
CN112435259B
CN112435259B (application CN202110109382.1A)
Authority
CN
China
Prior art keywords
cell
picture
unit area
likelihood function
pixel point
Prior art date
Legal status
Active
Application number
CN202110109382.1A
Other languages
Chinese (zh)
Other versions
CN112435259A
Inventor
涂文玲
史育红
蒋胜
冯亚辉
张舒羽
江雪
Current Assignee
Nuclear Industry 416 Hospital
Original Assignee
Nuclear Industry 416 Hospital
Priority date
Filing date
Publication date
Application filed by Nuclear Industry 416 Hospital
Priority to CN202110109382.1A
Publication of CN112435259A
Application granted
Publication of CN112435259B
Legal status: Active
Anticipated expiration

Classifications

    • G06T 7/0012: Biomedical image inspection (G PHYSICS; G06 COMPUTING; CALCULATING OR COUNTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; G06T 7/00 Image analysis; G06T 7/0002 Inspection of images, e.g. flaw detection)
    • G01N 15/10: Investigating individual particles (G01 MEASURING; TESTING; G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES; G01N 15/00 Investigating characteristics of particles)
    • G01N 2015/1006: Investigating individual particles for cytology
    • G01N 2015/1024: Counting particles by non-optical means
    • G06T 2207/10056: Microscopic image (G06T 2207/00 Indexing scheme for image analysis or image enhancement; G06T 2207/10 Image acquisition modality)
    • G06T 2207/20081: Training; Learning (G06T 2207/20 Special algorithmic details)
    • G06T 2207/30024: Cell structures in vitro; Tissue sections in vitro (G06T 2207/30 Subject of image; G06T 2207/30004 Biomedical image processing)
    • G06T 2207/30242: Counting objects in image


Abstract

The invention belongs to the technical field of medicine and provides a cell distribution model construction and cell counting method based on single-sample learning, comprising the following steps: image the cell culture under a microscope to obtain a microscopic picture; obtain a cell distribution model F'(x, y) by the cell model construction method, traverse each pixel in the microscopic picture using the hyper-parameter Hum_pa, and find a suitable interception size and training hyper-parameters; centered at pixel (x, y), intercept the cell with the optimal interception length and count it. The counting method only requires taking a picture with a microscope and needs no excessive additional operation, so it is more economical; only a single cell needs to be marked manually, and automatic marking can proceed according to cell similarity, improving the convenience of cell counting.

Description

Cell distribution model construction and cell counting method based on single sample learning
Technical Field
The invention belongs to the field of medicine, and particularly relates to a cell distribution model construction and cell counting method based on single sample learning.
Background
In cell culture, accurate cell counts are crucial. Different cell types require different densities, and estimating the density depends on cell counting in order to determine the conditions best suited to culturing.
There are two main ways of cell counting available:
one, full-automatic cell counting instrument
A typical automatic cell counter is equipped with a reusable counting plate and a fluorescence detection function (one bright field and two fluorescence channels with changeable wavelengths) and can count cells. However, every count requires the sequence "add sample" - "insert into counter" - "read result" to obtain the final cell count value.
This approach has several major disadvantages: 1. the instruments are expensive, costing tens of thousands of yuan; 2. several manual operations are required each time; 3. cell counting plates are expensive consumables; 4. each count consumes cells from the culture, which affects it, and counting typically has to be interspersed throughout the culture process.
Second, artificial cell counting
In manual cell counting, a cell counting plate is used after distributing the sample uniformly. The specific steps are roughly as follows:
When the cells in a 10 cm culture dish reach about 85% confluence, digest them and pipette to mix evenly; transfer the mixed cells into a 15 ml centrifuge tube with a pipette and centrifuge for 5 min at 1000 rpm; add about 3 ml of culture medium and mix evenly; take out 100 µl, dilute 4-fold, mix, load into a cell counting plate, and count; take another 100 µl of the mixed cell suspension from the 15 ml tube, dilute 4-fold, mix, load into a chamber, and count twice; repeat the dilution and chamber loading, and finally take the average of the 6 counts.
The disadvantages of this method are: 1. the amount of manual work is extremely large and tedious; 2. uneven sampling easily makes the final count inaccurate; 3. each count consumes cells from the culture, which affects it, and counting typically has to be interspersed throughout the culture process.
Therefore, it is urgently required to provide a simple and economical cell counting method.
Disclosure of Invention
The object of the present invention is to provide a simple and economical cell counting method, comprising the construction of a cell distribution model and a cell counting method based on that model.
A construction method of a cell distribution model based on single sample learning is characterized in that,
1-1. Image the cell culture under a microscope to obtain a microscopic picture;
1-2. Manually mark the boundary of one complete cell on the microscopic picture; the longest distance across the boundary is L, defined as the interception length of the cell;
1-3. Taking the center point of the picture as the coordinate origin (0, 0), obtain the coordinate array [x, y] of the pixels (x, y) inside the manually marked complete cell boundary, where x and y are the horizontal and vertical coordinates of a pixel in the coordinate system with origin (0, 0);
1-4, establishing a cell distribution model F (x, y):
(Four equation images defining F(x, y) are not reproduced in the text.)
where f1(x, y) is a standard normal distribution function; f2(x, y) (equation image not reproduced) is a two-dimensional normal distribution function, with σ1, σ2 the standard deviations of the two-dimensional normal distribution, ρ its covariance, and μ1, μ2 its means; the deformed ellipse equation (image not reproduced) uses r, the radius of the circle in the standard circle equation, and θ (shown only in the image), with μ a compression ratio, i.e., the standard circle is compressed in equal proportion into an approximate ellipse; w1, w2 are weight coefficients with w1 + w2 = 1; σ is the standard deviation of a one-dimensional normal distribution;
1-5. Learn with a loss function; the loss function Loss is: (equation image not reproduced), where the quantity shown in the image is the floating-point value of pixel (x, y) after the picture is preprocessed. Solve for the minimum of Loss to obtain the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2, finally yielding the fitted cell distribution model F'(x, y). Those skilled in the art will appreciate that the Loss value is a negated probability: to make the probability as large as possible, the invention makes this negative value as small as possible.
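The Loss-minimization step can be sketched numerically. Since the patent's full F(x, y) (a weighted mixture including a two-dimensional normal term and a deformed-ellipse term) is shown only as images, this illustrative sketch fits just a two-dimensional normal component by minimizing a pixel-weighted negative log-probability; the function name `fit_cell_model` and the simplified parameterization are assumptions, not the patent's exact formulation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

def fit_cell_model(xs, ys, pix):
    """Fit a 2-D normal stand-in for F(x, y) by minimizing
    Loss = -sum(pix * log F), the negated pixel-weighted log-probability
    (hypothetical simplification of the patent's multi-term model)."""
    def loss(params):
        mu1, mu2, s1, s2 = params
        # small diagonal jitter keeps the covariance non-singular
        cov = [[s1 ** 2 + 1e-6, 0.0], [0.0, s2 ** 2 + 1e-6]]
        f = multivariate_normal.pdf(np.column_stack([xs, ys]),
                                    mean=[mu1, mu2], cov=cov)
        return -np.sum(pix * np.log(f + 1e-12))

    res = minimize(loss, x0=[0.0, 0.0, 1.0, 1.0], method="Nelder-Mead")
    return res.x  # learned (mu1, mu2, s1, s2)
```

In the patent, the analogous minimization would recover the full parameter set θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2.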
Further, preprocessing the picture comprises: whitening the microscopic picture so that background pixels become 0; the floating-point value of pixel (x, y) is computed by a formula (shown only as an image), where a is the pixel value at (x, y).
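The whitening step might look as follows. The exact pixel-to-float formula is shown only as an image, so this sketch assumes one plausible reading: estimate the background level with the median, subtract it so background maps to 0, and scale by 255; the name `preprocess` and the scaling choice are assumptions.

```python
import numpy as np

def preprocess(img):
    """Whiten a grayscale microscope image: background pixels map to 0,
    other pixels to a floating-point value (assumed here to be the
    background-subtracted pixel value a, scaled by 255)."""
    img = img.astype(np.float64)
    background = np.median(img)  # assumes background dominates the frame
    return np.clip(img - background, 0.0, None) / 255.0
```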
A cell counting method based on single sample learning is characterized by comprising the following steps:
3-1. Image the cell culture under a microscope to obtain a microscopic picture;
3-2. Obtain the fitted cell distribution model F'(x, y) by the construction method above, and define the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2 learned in step 1-5 as the manual-interception hyper-parameter set Hum_pa, recorded respectively as H_θ, H_r, H_σ, H_σ1, H_σ2, H_ρ, H_μ, H_μ1, H_μ2, H_w1, H_w2;
3-3, traversing each pixel point in the microscopic picture to find out a proper interception size and a proper training hyper-parameter;
3-3-1. Centered on pixel A, cut out a new picture with cell interception length L_A1 (a square picture with side length L_A1); taking the center of the new picture as the coordinate origin (0, 0), obtain the coordinate array [x_A1, y_A1] of the pixels in the new picture. Substitute [x_A1, y_A1] into steps 1-4 and 1-5 for model adaptation and learn the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2, defined as the hyper-parameter set F_pa_in_L_A_A1 and recorded respectively as L_θ_A_A1, L_r_A_A1, L_σ_A_A1, L_σ1_A_A1, L_σ2_A_A1, L_ρ_A_A1, L_μ_A_A1, L_μ1_A_A1, L_μ2_A_A1, L_w1_A_A1, L_w2_A_A1, where 0.5L < L_A1 < 1.5L. At the same time, F_pa_in_L_A_A1 is required to be similar to the manual-interception hyper-parameter Hum_pa, that is: (four constraint-equation images not reproduced).
Repeat the above steps to obtain the hyper-parameters F_pa_in_L_A_A2, F_pa_in_L_A_A3, F_pa_in_L_A_A4, …, F_pa_in_L_A_An of the pictures with cell interception lengths L_A2, L_A3, L_A4, L_A5, …, L_An respectively, where n is a natural number greater than or equal to 1;
3-3-2. Calculate the maximum likelihood function value Log_L_A_An of each picture cut in step 3-3-1 and the unit-area likelihood function value Log_L_A_An/(L_An)² of each picture;
3-3-3. Compare the unit-area likelihood function values Log_L_A_An/(L_An)² of the pictures cut in step 3-3-1, and select the cell interception length corresponding to the largest value as the cell interception length centered at pixel A, recorded as L'_A; the largest unit-area likelihood function value is recorded as the actual unit-area likelihood function value of L'_A;
3-4. Centered at pixel A, intercept the cell with interception length L'_A and count it;
3-5. Repeat the above steps, traversing every pixel in the microscopic picture.
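Steps 3-3-1 to 3-3-3 can be sketched as a scan over candidate interception lengths. Here `score_fn` is a hypothetical callback returning the maximum log-likelihood Log_L_A_An of the square crop of a given side length (i.e., the result of re-running steps 1-4/1-5 on that crop); the similarity check against Hum_pa is omitted for brevity.

```python
import numpy as np

def best_interception_length(score_fn, L, factors=None):
    """Return (L'_A, actual unit-area likelihood value): the candidate
    interception length maximizing Log_L_A_An / (L_An)**2."""
    if factors is None:
        factors = np.arange(0.5, 1.51, 0.1)  # 0.5L .. 1.5L, as in the text
    best = None
    for f in factors:
        length = f * L
        per_area = score_fn(length) / length ** 2  # unit-area likelihood
        if best is None or per_area > best[1]:
            best = (length, per_area)
    return best
```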
Further, step 3-3-3 also includes: when the same pixel is covered by several intercepted cells, compare the actual unit-area likelihood function values of those cells; the pixel is assigned to the cell with the largest value, and the remaining intercepted cells are not counted.
Further, in step 3-4, a threshold is set; all actual unit-area likelihood function values are compared with the threshold, and only cells whose value is larger than the threshold are counted. The threshold is the average of all actual unit-area likelihood function values.
Further, the threshold may also be set manually.
Further, the likelihood function value of a pixel is the pixel's floating-point value multiplied by the probability that the pixel follows the F(x, y) distribution.
Compared with the prior art, the cell counting method has the following beneficial effects:
(1) the counting method only needs to take a picture by using a microscope, and extra excessive operation is not needed;
(2) according to the counting method, only a single cell needs to be marked manually, and automatic marking can be carried out according to cell similarity;
(3) the counting method can be finally regulated according to the threshold value, and people can find a better result by regulating the threshold value;
(4) the counting method of the invention needs no consumables and only needs a small amount of labor, thereby greatly reducing the cost of experiment and measurement.
Detailed Description
The following description provides many different embodiments, or examples, for implementing different features of the invention. The particular examples set forth below are illustrative only and are not intended to be limiting.
A construction method of a cell distribution model based on single sample learning comprises the following steps:
1-1. Image the cell culture under a microscope to obtain a microscopic picture;
1-2. Manually mark the boundary of one complete cell on the microscopic picture; the longest distance across the boundary is L, defined as the interception length of the cell;
1-3. Taking the center point of the picture as the coordinate origin (0, 0), obtain the coordinate array [x, y] of the pixels inside the complete cell: (x1, y1), (x2, y2), (x3, y3), (x4, y4), (x5, y5), …, (xn, yn), where x and y are the horizontal and vertical coordinates of a pixel in the coordinate system with origin (0, 0);
At this point the image is usually whitened to ensure that the background color (i.e., cell-free positions) is black, i.e., the corresponding pixel value is 0;
1-4, establishing a cell distribution model:
(Four equation images defining the cell distribution model are not reproduced in the text.)
where f2(x, y) (equation image not reproduced) is a two-dimensional normal distribution function, with σ1, σ2 the standard deviations of the two-dimensional normal distribution, ρ its covariance, and μ1, μ2 its means; the deformed ellipse equation uses r, the radius of the circle in the standard circle equation, and θ (shown only in the image), with μ a compression ratio, i.e., the standard circle is compressed in equal proportion into an approximate ellipse shape; w1, w2 are weight coefficients with w1 + w2 = 1; σ is the standard deviation of a one-dimensional normal distribution;
1-5. Learn with a loss function; the loss function is: (equation image not reproduced), where the quantity shown in the image is the floating-point value of pixel (x, y) after the picture is preprocessed. Solve for the minimum of Loss to obtain the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2, finally yielding the fitted cell distribution model F'(x, y).
Substitute the coordinate array (x1, y1), (x2, y2), (x3, y3), (x4, y4), (x5, y5), …, (xn, yn) into the learning; after learning finishes by minimizing the loss, the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2 are obtained, finally yielding the cell-adapted model F'(x, y).
Further, preprocessing the picture comprises: whitening the microscopic picture so that background pixels become 0; the floating-point value of pixel (x, y) is computed by a formula (shown only as an image), where a is the pixel value at (x, y).
Further, the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2 learned in step 1-5 are defined as the manual-interception hyper-parameter set Hum_pa, recorded respectively as H_θ, H_r, H_σ, H_σ1, H_σ2, H_ρ, H_μ, H_μ1, H_μ2, H_w1, H_w2.
It should be noted that different cell types suit different models, and the calculated hyper-parameters differ; for spindle-shaped, spherical, and disc-shaped cells the calculated hyper-parameters are obviously different.
The invention also provides a cell counting method based on single sample learning, which applies the constructed cell distribution model and comprises the following steps:
3-1. Image the cell culture under a microscope to obtain a microscopic picture;
3-2. Obtain the cell distribution model F'(x, y) by the cell model construction method above, and define the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2 learned in step 1-5 as the manual-interception hyper-parameter set Hum_pa, recorded respectively as H_θ, H_r, H_σ, H_σ1, H_σ2, H_ρ, H_μ, H_μ1, H_μ2, H_w1, H_w2;
3-3, traversing each pixel point in the microscopic picture to find out a proper interception size and a proper training hyper-parameter;
3-3-1. Centered on pixel A, cut out a new picture with cell interception length L_A1 and obtain the coordinate array [x_A1, y_A1] of the pixels in the new picture, where x_A1, y_A1 are the horizontal and vertical coordinates of the new picture's pixels in the coordinate system of the original microscopic picture whose origin is the original picture's center point. Substitute [x_A1, y_A1] into steps 1-4 and 1-5 for model adaptation and learn the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2, defined as the hyper-parameter set F_pa_in_L_A_A1 and recorded respectively as L_θ_A_A1, L_r_A_A1, L_σ_A_A1, L_σ1_A_A1, L_σ2_A_A1, L_ρ_A_A1, L_μ_A_A1, L_μ1_A_A1, L_μ2_A_A1, L_w1_A_A1, L_w2_A_A1, where 0.5L < L_A1 < 1.5L. At the same time, F_pa_in_L_A_A1 is required to be similar to the manual-interception hyper-parameter Hum_pa, that is: (four constraint-equation images not reproduced).
Repeat the above steps to obtain the hyper-parameters F_pa_in_L_A_A2, F_pa_in_L_A_A3, F_pa_in_L_A_A4, …, F_pa_in_L_A_An of the pictures with cell interception lengths L_A2, L_A3, L_A4, L_A5, …, L_An respectively, where n is a natural number greater than or equal to 1;
For example, if point A has coordinates (x1, y1), then centered at (x1, y1), cut pictures with cell interception lengths 0.5L, 0.6L, 0.7L, …, 1.5L respectively, forming new pictures;
those skilled in the art will appreciate that the specific truncation length may be selected based on the actual circumstances of the calculation, and the truncation lengths listed herein are for illustrative purposes only.
3-3-2. Calculate the maximum likelihood function value Log_L_A_An of each picture cut in step 3-3-1 and the unit-area likelihood function value Log_L_A_An/(L_An)² of each picture;
The likelihood function is the probability that a pixel follows the F(x, y) distribution, multiplied by the pixel's floating-point value;
That is, obtain the maximum likelihood function values Log_L_A_0.5L, Log_L_A_0.6L, Log_L_A_0.7L, …, Log_L_A_1.5L of the pictures cut with cell interception lengths 0.5L, 0.6L, 0.7L, …, 1.5L, and the unit-area likelihood function values Log_L_A_0.5L/(L_A_0.5L)², Log_L_A_0.6L/(L_A_0.6L)², Log_L_A_0.7L/(L_A_0.7L)², …, Log_L_A_1.5L/(L_A_1.5L)² of each picture;
3-3-3. Compare the unit-area likelihood function values Log_L_A_An/(L_An)² of the pictures, and select the cell interception length corresponding to the largest value as the actual cell interception length centered at pixel A, recorded as L'_A; the largest unit-area likelihood function value is recorded as the actual unit-area likelihood function value of L'_A;
Compare Log_L_A_0.5L/(L_A_0.5L)², Log_L_A_0.6L/(L_A_0.6L)², Log_L_A_0.7L/(L_A_0.7L)², …, Log_L_A_1.5L/(L_A_1.5L)² and select the largest unit-area likelihood function value. Supposing Log_L_A_0.6L/(L_A_0.6L)² is the largest, select 0.6L as the actual interception length of the cell centered at pixel A and record it as L'_A; the largest unit-area likelihood function value Log_L_A_0.6L/(L_A_0.6L)² is recorded as the actual unit-area likelihood function value of L'_A;
3-4. Centered at pixel A, intercept the cell with interception length L'_A = 0.6L and count it;
3-5. Repeat the above steps, traversing every pixel in the microscopic picture.
For pixels B (x2, y2), C (x3, y3), D (x4, y4), … in the microscopic picture, likewise obtain intercepted cells centered at pixels B, C, D, …, e.g. L'_B = 0.8L, L'_C = 1.5L, L'_D = 0.9L, …; the corresponding actual unit-area likelihood function values are Log_L_B_0.8L/(L_B_0.8L)², Log_L_C_1.5L/(L_C_1.5L)², Log_L_D_0.9L/(L_D_0.9L)², …
To ensure that each pixel is not covered by several intercepted cells (when the same point is covered by several cells, cells have been intercepted repeatedly): when (xn, yn) is covered by pictures intercepted for several cells, for example by the intercepted cells centered at pixels B and C above, assign (xn, yn) to the cell with the largest actual unit-area likelihood function value.
For example, when (xn, yn) is intercepted both by cell 2 centered at pixel B and by cell 3 centered at pixel C, compare the actual unit-area likelihood function values of cell 2 and cell 3. Under the assumptions above, cell 2's value is Log_L_B_0.8L/(L_B_0.8L)² and cell 3's is Log_L_C_1.5L/(L_C_1.5L)², so:
If Log_L_B_0.8L/(L_B_0.8L)² < Log_L_C_1.5L/(L_C_1.5L)², then (xn, yn) is intercepted by cell 3 centered at pixel C and not by cell 2 centered at pixel B;
Similarly, if Log_L_B_0.8L/(L_B_0.8L)² > Log_L_C_1.5L/(L_C_1.5L)², then (xn, yn) is intercepted by cell 2 centered at pixel B and not by cell 3 centered at pixel C.
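The overlap rule above amounts to assigning each contested pixel to the intercepted cell with the larger actual unit-area likelihood value. A minimal sketch, where `cells` is a hypothetical list of (cell_id, pixel_set, unit_area_likelihood) triples standing in for the square crops:

```python
def resolve_overlaps(cells):
    """Assign every pixel covered by several intercepted cells to the one
    with the largest actual unit-area likelihood value."""
    owner = {}  # pixel -> (likelihood, cell_id)
    for cell_id, pixels, score in cells:
        for p in pixels:
            # keep the highest-scoring claimant for each pixel
            if p not in owner or score > owner[p][0]:
                owner[p] = (score, cell_id)
    return {p: cid for p, (_, cid) in owner.items()}
```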
Meanwhile, a pixel may be covered by only one intercepted "cell" that is not actually a cell.
To solve this, a threshold is set: when the actual unit-area likelihood function value of an intercepted cell is below the threshold, it is not counted, i.e., interceptions that are not cells are discarded.
The threshold is calculated as: the average of all actual unit area likelihood function values;
For example, with the actual unit-area likelihood function values of intercepted cells 1-4 being Log_L_A_0.6L/(L_A_0.6L)², Log_L_B_0.8L/(L_B_0.8L)², Log_L_C_1.5L/(L_C_1.5L)², and Log_L_D_0.9L/(L_D_0.9L)², the threshold a is:
a = (1/4) * (Log_L_A_0.6L/(L_A_0.6L)² + Log_L_B_0.8L/(L_B_0.8L)² + Log_L_C_1.5L/(L_C_1.5L)² + Log_L_D_0.9L/(L_D_0.9L)²)
when the actual unit area likelihood function value of any one of the cells 1, 2, 3 and 4 is less than a, the cell is discarded and is not counted.
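The thresholding step can be sketched as follows: compute the mean of all actual unit-area likelihood values and keep only cells at or above it (`scores` is a hypothetical mapping from cell id to its actual unit-area likelihood value; whether equality counts is not specified in the text, so the inclusive comparison here is an assumption):

```python
def filter_by_threshold(scores, threshold=None):
    """Drop intercepted cells whose actual unit-area likelihood value is
    below the threshold; by default the threshold is the mean of all
    values, as in the text (a manual threshold may also be passed)."""
    if threshold is None:
        threshold = sum(scores.values()) / len(scores)
    return sorted(cid for cid, s in scores.items() if s >= threshold)
```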
It will be understood by those skilled in the art that this embodiment is merely an example; in practical applications there are m intercepted cells, where m is a natural number greater than or equal to 1.
Of course, the threshold value may be set empirically, and the most suitable threshold value is selected by manually controlling the threshold value, so as to obtain the final cell interception result.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (6)

1. A construction method of a cell distribution model based on single sample learning is characterized in that,
1-1. Image the cell culture under a microscope to obtain a microscopic picture;
1-2. Manually mark the boundary of one complete cell on the microscopic picture; the longest distance across the boundary is L, defined as the interception length of the cell;
1-3. Taking the center point of the picture as the coordinate origin (0, 0), obtain the coordinate array [x, y] of the pixels (x, y) inside the manually marked complete cell boundary, where x and y are the horizontal and vertical coordinates of a pixel in the coordinate system with origin (0, 0);
1-4, establishing a cell distribution model F (x, y):
(Four equation images defining F(x, y) are not reproduced in the text.)
where f1(x, y) is a standard normal distribution function and the two-dimensional normal distribution function has standard deviations σ1, σ2, covariance ρ, and means μ1, μ2; in the deformed ellipse equation, r is the radius of the circle in the standard circle equation and θ is shown only in the equation image (not reproduced); μ is a compression ratio; w1, w2 are weight coefficients with w1 + w2 = 1; σ is the standard deviation of a one-dimensional normal distribution;
1-5. Learn with a loss function; the loss function Loss is: (equation image not reproduced), where the quantity shown in the image is the floating-point value of pixel (x, y) after the picture is preprocessed; solve for the minimum of Loss to obtain the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2, finally yielding the fitted cell distribution model F'(x, y).
2. The construction method according to claim 1, characterized in that preprocessing the picture comprises: whitening the microscopic picture so that background pixels become 0; the floating-point value of pixel (x, y) is computed by a formula (shown only as an image), where a is the pixel value at (x, y).
3. A cell counting method based on single sample learning is characterized by comprising the following steps:
3-1. Image the cell culture under a microscope to obtain a microscopic picture;
3-2. Obtain the fitted cell distribution model F'(x, y) by the construction method according to claim 1 or 2, and define the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2 learned in step 1-5 as the manual-interception hyper-parameter set Hum_pa, recorded respectively as H_θ, H_r, H_σ, H_σ1, H_σ2, H_ρ, H_μ, H_μ1, H_μ2, H_w1, H_w2;
3-3, traversing each pixel point in the microscopic picture to find out a proper interception size and a proper training hyper-parameter;
3-3-1, taking the pixel A as the center, taking L _ A1 as the cell intercepting length to intercept a new picture, and taking the center of the new picture as the origin of coordinates (0, 0) to obtain the coordinate array [ x ] of the pixel in the new pictureA1,yA1](ii) a To coordinate array [ x ]A1,yA1]Substituting the parameters into the steps 1-4 and 1-5 to carry out model adaptation, and learning to obtain the parameters theta, r, sigma and sigma1、σ2、ρ、μ、μ1、μ2、w1、w2And defined as the hyper-parameter F _ pa _ in _ L _ A _ A1, respectively recorded as Lθ_A_A1,Lr_A_A1,Lσ_A_A1,Lσ1_A_A1,Lσ2_A_A1,Lρ_A_A1,Lμ_A_A1,Lμ1_A_A1,Lμ2_A_A1,Lw1_A_A1,Lw2_A_A1(ii) a Wherein, 0.5L<L_A1<1.5L; meanwhile, the requirement F _ pa _ in _ L _ a1 has similarity with the manual intercept hyper-parameter Hum _ pa, that is:
[similarity conditions rendered only as images in the source (DEST_PATH_IMAGE012 through DEST_PATH_IMAGE015); not recoverable from this excerpt]
repeating the above steps to obtain the hyper-parameters F_pa_in_L_A_A2, F_pa_in_L_A_A3, F_pa_in_L_A_A4, ..., F_pa_in_L_A_An of pictures with cell interception lengths of L_A2, L_A3, L_A4, L_A5, ..., L_An, respectively; wherein n is a natural number greater than or equal to 1;
3-3-2. calculating, for each picture intercepted in step 3-3-1, its maximum likelihood function value Log_L_A_An and its unit-area likelihood function value Log_L_A_An/(L_An)^2;
3-3-3. comparing the unit-area likelihood function values Log_L_A_An/(L_An)^2 of the pictures intercepted in step 3-3-1, selecting the cell interception length corresponding to the largest unit-area likelihood function value as the cell interception length centered on the pixel point A, recording it as L'_A, and recording that largest unit-area likelihood function value as the actual unit-area likelihood function value of L'_A;
3-4. taking the pixel point A as the center and L'_A as the interception length, intercepting the cell and counting it;
3-5. repeating the above steps to traverse each pixel point in the microscopic picture.
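Steps 3-3-1 through 3-3-3 amount to, for each center pixel, trying several interception lengths and keeping the one whose crop scores the largest likelihood per unit area. A minimal sketch, assuming a stand-in log-likelihood (summed whitened intensity) since the patent's fitted model F'(x, y) is not reproduced in this excerpt:

```python
import numpy as np

def crop(picture, cx, cy, length):
    """Intercept a length x length window centered on pixel (cx, cy)."""
    h = length // 2
    return picture[cy - h:cy + h + 1, cx - h:cx + h + 1]

def log_likelihood(patch):
    """Stand-in for the fitted model's maximum likelihood value Log_L;
    here simply the summed whitened intensity (an assumption)."""
    return float(np.sum(patch))

def best_interception_length(picture, cx, cy, candidates):
    """Step 3-3-3: pick the length maximizing likelihood per unit area."""
    per_area = [log_likelihood(crop(picture, cx, cy, L)) / L**2
                for L in candidates]
    i = int(np.argmax(per_area))
    return candidates[i], per_area[i]

pic = np.zeros((31, 31))
pic[13:18, 13:18] = 1.0   # a bright 5x5 "cell" centered on pixel (15, 15)
L_best, score = best_interception_length(pic, 15, 15, [5, 9, 15, 21])
```

With this toy picture the length 5 window covers the cell exactly, so larger windows only dilute the per-area score.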
4. The cell counting method according to claim 3, characterized in that step 3-3-3 further comprises: when the same pixel point falls within a plurality of intercepted cells, comparing the actual unit-area likelihood function values corresponding to those intercepted cells, assigning the pixel point to the intercepted cell whose actual unit-area likelihood function value is the largest, and not counting the remaining intercepted cells.
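Claim 4's overlap rule can be sketched as follows: when two intercepted cells share a pixel, only the one with the larger actual unit-area likelihood value survives. The crop records below (ids, pixel sets, scores) are illustrative, not from the patent:

```python
def resolve_overlaps(crops):
    """crops: list of (crop_id, pixel_set, actual_unit_area_likelihood).
    Drop any crop that shares a pixel with a higher-scoring crop."""
    kept = set(c[0] for c in crops)
    for i, (id_a, px_a, s_a) in enumerate(crops):
        for id_b, px_b, s_b in crops[i + 1:]:
            if px_a & px_b:                       # the crops share a pixel
                loser = id_a if s_a < s_b else id_b
                kept.discard(loser)
    return kept

crops = [
    ("A", {(0, 0), (0, 1)}, 0.9),
    ("B", {(0, 1), (0, 2)}, 0.4),   # overlaps A with a lower score: dropped
    ("C", {(5, 5)},         0.7),   # no overlap: kept
]
kept = resolve_overlaps(crops)      # {"A", "C"}
```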
5. The cell counting method according to claim 3, characterized in that in step 3-4, a threshold is set, all the actual unit-area likelihood function values are compared with the threshold, and only cells whose actual unit-area likelihood function value is greater than the threshold are counted; the threshold is the average of all the actual unit-area likelihood function values.
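Claim 5's default threshold (the mean of all actual unit-area likelihood values) can be sketched directly; the example scores are illustrative:

```python
import numpy as np

def count_cells(unit_area_likelihoods):
    """Claim 5's rule: threshold at the mean of all actual unit-area
    likelihood values and count only cells strictly above it."""
    scores = np.asarray(unit_area_likelihoods, dtype=float)
    threshold = scores.mean()
    return int(np.sum(scores > threshold)), threshold

n, thr = count_cells([0.9, 0.8, 0.1, 0.2])  # mean is 0.5, so 2 cells counted
```

Claim 6 allows replacing this automatic mean with a manually chosen threshold.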
6. The method of claim 5, wherein the threshold is set manually.
CN202110109382.1A 2021-01-27 2021-01-27 Cell distribution model construction and cell counting method based on single sample learning Active CN112435259B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110109382.1A CN112435259B (en) 2021-01-27 2021-01-27 Cell distribution model construction and cell counting method based on single sample learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110109382.1A CN112435259B (en) 2021-01-27 2021-01-27 Cell distribution model construction and cell counting method based on single sample learning

Publications (2)

Publication Number Publication Date
CN112435259A CN112435259A (en) 2021-03-02
CN112435259B true CN112435259B (en) 2021-04-02

Family

ID=74697336

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110109382.1A Active CN112435259B (en) 2021-01-27 2021-01-27 Cell distribution model construction and cell counting method based on single sample learning

Country Status (1)

Country Link
CN (1) CN112435259B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114067315B (en) * 2021-10-23 2022-11-29 广州市艾贝泰生物科技有限公司 Cell counting method, cell counting device, computer device, and storage medium

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101275897A (en) * 2008-03-06 2008-10-01 上海交通大学 Medulla biopsy slice hematopoiesis cell characteristic recognizing method
AU2010315455B2 (en) * 2009-10-28 2015-01-29 Chevron U.S.A. Inc. Multiscale Finite Volume method for reservoir simulation
CN107527028B (en) * 2017-08-18 2020-03-24 深圳乐普智能医疗器械有限公司 Target cell identification method and device and terminal
US20200311465A1 (en) * 2017-11-14 2020-10-01 miDiagnostics NV Classification of a population of objects by convolutional dictionary learning with class proportion data
CN108346145B (en) * 2018-01-31 2020-08-04 浙江大学 Identification method of unconventional cells in pathological section
CN109765164B (en) * 2018-03-06 2021-02-26 中国科学院高能物理研究所 Absolute quantitative detection method of gold nanorods in single cells
CN109406768B (en) * 2018-11-06 2023-01-24 扬州大学 Method for observing three-dimensional distribution of cells in micro tissue and counting cell number
CN110501278B (en) * 2019-07-10 2021-04-30 同济大学 Cell counting method based on YOLOv3 and density estimation
CN110648309B (en) * 2019-08-12 2024-05-28 平安科技(深圳)有限公司 Method and related equipment for generating anti-network synthesized erythrocyte image based on condition
CN110516584B (en) * 2019-08-22 2021-10-08 杭州图谱光电科技有限公司 Cell automatic counting method based on dynamic learning for microscope
CN110647875B (en) * 2019-11-28 2020-08-07 北京小蝇科技有限责任公司 Method for segmenting and identifying model structure of blood cells and blood cell identification method
CN111520073B (en) * 2020-03-30 2021-07-13 西安石油大学 Quantitative characterization method for collision prevention risk of large well cluster infilled well
CN111723693B (en) * 2020-06-03 2022-05-27 云南大学 Crowd counting method based on small sample learning
CN111724381B (en) * 2020-06-24 2022-11-01 武汉互创联合科技有限公司 Microscopic image cell counting and posture identification method based on multi-view cross validation

Also Published As

Publication number Publication date
CN112435259A (en) 2021-03-02

Similar Documents

Publication Publication Date Title
US9292729B2 (en) Method and software for analysing microbial growth
CN110837870B (en) Sonar image target recognition method based on active learning
US8687879B2 (en) Method and apparatus for generating special-purpose image analysis algorithms
CN112435259B (en) Cell distribution model construction and cell counting method based on single sample learning
CN113947607B (en) Cancer pathological image survival prognosis model construction method based on deep learning
CN113723573A (en) Tumor tissue pathological classification system and method based on adaptive proportion learning
CN112102232B (en) Method and device for automatically evaluating colony quality of induced pluripotent stem cells
CN112949517B (en) Plant stomata density and opening degree identification method and system based on deep migration learning
CN108009567A (en) A kind of automatic discriminating conduct of the fecal character of combination color of image and HOG and SVM
CN114486646A (en) Cell analysis method and system, and quantification method and system
CN113610101A (en) Method for measuring germination rate of grains
CN116385374A (en) Cell counting method based on convolutional neural network
CN109147932A (en) cancer cell HER2 gene amplification analysis method and system
CN117152147B (en) Online chromosome collaborative analysis method, system and medium
CN115760957B (en) Method for analyzing substances in cell nucleus by three-dimensional electron microscope
CN108596840A (en) A kind of data set Enhancement Method for deep learning evaluation blood vessel network developmental level
CN116188489A (en) Wheat head point cloud segmentation method and system based on deep learning and geometric correction
Kbiri et al. Quantifying Meiotic Crossover Recombination in Arabidopsis Lines Expressing Fluorescent Reporters in Seeds Using SeedScoring Pipeline for CellProfiler
JP2021525890A (en) Computer implementation process for images of biological samples
US20240193968A1 (en) Analyzing microscope images of microalgae culture samples
CN111507234B (en) Cell flow detection method
CN112540039A (en) Method for directly calculating number of adherent living cells
Mendelsohn et al. Morphological analysis of cells and chromosomes by digital computer
Uttamatanin et al. Chromosome classification for metaphase selection
Qu Two algorithms of image segmentation and measurement method of particles parameters

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant