CN112435259A - Cell distribution model construction and cell counting method based on single sample learning - Google Patents


Info

Publication number
CN112435259A
CN112435259A (application CN202110109382.1A)
Authority
CN
China
Prior art keywords
cell
picture
unit area
likelihood function
pixel
Prior art date
Legal status
Granted
Application number
CN202110109382.1A
Other languages
Chinese (zh)
Other versions
CN112435259B (en)
Inventor
涂文玲
史育红
蒋胜
冯亚辉
张舒羽
江雪
Current Assignee
Nuclear Industry 416 Hospital
Original Assignee
Nuclear Industry 416 Hospital
Priority date
Filing date
Publication date
Application filed by Nuclear Industry 416 Hospital
Priority to CN202110109382.1A
Publication of CN112435259A
Application granted
Publication of CN112435259B
Legal status: Active
Anticipated expiration


Classifications

    • G06T 7/0012 — Biomedical image inspection (G Physics; G06 Computing, calculating or counting; G06T Image data processing or generation, in general; G06T 7/00 Image analysis; G06T 7/0002 Inspection of images, e.g. flaw detection)
    • G01N 15/10 — Investigating individual particles (G01N Investigating or analysing materials by determining their chemical or physical properties; G01N 15/00 Investigating characteristics of particles)
    • G01N 2015/1006 — Investigating individual particles for cytology
    • G01N 2015/1024 — Counting particles by non-optical means
    • G06T 2207/10056 — Microscopic image (G06T 2207/10 Image acquisition modality)
    • G06T 2207/20081 — Training; Learning (G06T 2207/20 Special algorithmic details)
    • G06T 2207/30024 — Cell structures in vitro; Tissue sections in vitro (G06T 2207/30 Subject of image; G06T 2207/30004 Biomedical image processing)
    • G06T 2207/30242 — Counting objects in image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Radiology & Medical Imaging (AREA)
  • Theoretical Computer Science (AREA)
  • Medical Informatics (AREA)
  • Dispersion Chemistry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Investigating Or Analysing Biological Materials (AREA)

Abstract

The invention relates to the technical field of medicine and provides a cell distribution model construction and cell counting method based on single-sample learning, comprising the following steps: illuminating the cell culture under a microscope to obtain a microscopic picture; obtaining a cell distribution model F′(x, y) by the cell model construction method; traversing each pixel point in the microscopic picture with the hyper-parameter Hum_pa to find a suitable interception size and training hyper-parameters; and, taking a pixel point (x, y) as the center, intercepting the cell at the optimal interception length and counting. The counting method only requires taking a picture with a microscope, needs no excessive additional operation, and is therefore more economical; only a single cell needs to be marked manually, and the remaining cells can be marked automatically according to cell similarity, improving the convenience of cell counting.

Description

Cell distribution model construction and cell counting method based on single sample learning
Technical Field
The invention belongs to the field of medicine, and particularly relates to a cell distribution model construction and cell counting method based on single sample learning.
Background
In cell culture, accurate statistics of cell numbers are crucial. Different cell types require different densities, and estimating cell density requires cell counting in order to determine the conditions best suited for culturing the cells.
There are currently two main approaches to cell counting:
One: fully automatic cell counting instrument
A typical automatic cell counter is equipped with a reusable counting plate and a fluorescence detection function (one bright field and two fluorescence channels with adjustable wavelengths) and can count cells. However, every count requires the sequence "add sample", "insert counting plate", "read result" to obtain the final cell count.
This approach has several major disadvantages: 1. the instruments are expensive, costing tens of thousands of yuan; 2. several manual operations are required each time; 3. the counting plates are expensive consumables; 4. each count consumes cells from the culture, which has a certain impact, and sampling must typically be interspersed throughout the cell culture.
Second, artificial cell counting
Manual cell counting mainly uses a cell counting plate, with the sample distributed uniformly into a multi-well plate. The specific steps are roughly as follows:
when the cells in a 10 cm culture dish have grown to about 85% confluence, digest the cells and pipette to mix them uniformly; transfer the mixed cells into a 15 ml centrifuge tube with a pipette and centrifuge at 1000 rpm for 5 min; add about 3 ml of culture medium and mix; take 100 µl, dilute 4-fold, mix, load into a cell counting plate, and count; take another 100 µl of the mixed cell suspension from the 15 ml centrifuge tube, dilute 4-fold, mix, load into a pool, and count; repeat the dilute-and-load steps, and finally take the average of the 6 counts.
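The arithmetic behind this manual protocol can be illustrated with the standard Neubauer-chamber formula, cells/ml = mean count × dilution × 10⁴. The 10⁴ chamber-volume factor and the example counts below are conventional assumptions for illustration, not stated in the text.

```python
# Hedged sketch: conventional hemocytometer arithmetic for the manual protocol
# above. The 1e4 factor (large-square volume = 0.1 ul) is a standard assumption
# about the counting plate; the patent itself does not state the formula.

def cells_per_ml(square_counts, dilution_factor=4):
    """Average the six repeated counts, then scale by dilution and chamber volume."""
    mean_count = sum(square_counts) / len(square_counts)
    return mean_count * dilution_factor * 1e4

counts = [52, 48, 50, 55, 47, 51]  # six repeated counts (illustrative values)
print(cells_per_ml(counts))
```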
The disadvantages of this method are: 1. the amount of manual work is extremely large and tedious; 2. uneven sample distribution easily makes the final cell count inaccurate; 3. each count consumes cells from the culture, which has a certain impact, and sampling must typically be interspersed throughout the cell culture.
Therefore, it is urgently required to provide a simple and economical cell counting method.
Disclosure of Invention
The object of the present invention is to provide a simple and economical cell counting method, comprising the construction of a cell distribution model and a cell counting method based on that model.
A construction method of a cell distribution model based on single sample learning is characterized in that,
1-1, carrying out microscope irradiation on cell culture to obtain a microscopic picture;
1-2, manually mark a complete cell boundary on the microscopic picture; the longest distance across the boundary is L, defined as the interception length of the cell;
1-3, taking the central point of the picture as the coordinate origin (0, 0), obtain the coordinate array [x, y] of the pixel points (x, y) inside the manually marked complete cell boundary, where x and y are the horizontal and vertical coordinates of a pixel point in the coordinate system with origin (0, 0);
1-4, establishing a cell distribution model F (x, y):
[The four equations defining F(x, y) are rendered as images in the original and are not reproduced here.]
where σ1, σ2 are the standard deviations of the two-dimensional normal distribution; ρ is the covariance of the two-dimensional normal distribution; μ1, μ2 are the means of the two-dimensional normal distribution; in the deformed ellipse equation, r is the radius of the circle in the standard circle equation [equation rendered as an image in the original]; μ is a compression ratio, i.e. the standard circle is compressed proportionally into an ellipse-like shape; w1, w2 are weight coefficients with w1 + w2 = 1; σ is the standard deviation of the one-dimensional normal distribution;
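The model equations themselves are image placeholders in this extraction and cannot be recovered verbatim. A speculative reconstruction, consistent with the parameter list above (a bivariate normal component mixed with a one-dimensional normal over an elliptical radial coordinate, weights w1 + w2 = 1), might read:

```latex
% Hypothetical reconstruction -- the original equations are unrecoverable images.
\begin{aligned}
F(x,y) &= w_1\,N(x,y) + w_2\,G\bigl(d(x,y)\bigr), \qquad w_1 + w_2 = 1,\\[4pt]
N(x,y) &= \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}
  \exp\!\left(-\frac{1}{2(1-\rho^2)}\left[
  \frac{(x-\mu_1)^2}{\sigma_1^2}
  -\frac{2\rho\,(x-\mu_1)(y-\mu_2)}{\sigma_1\sigma_2}
  +\frac{(y-\mu_2)^2}{\sigma_2^2}\right]\right),\\[4pt]
d(x,y) &= \sqrt{(x\cos\theta + y\sin\theta)^2
  +\left(\frac{-x\sin\theta + y\cos\theta}{\mu}\right)^{2}},\\[4pt]
G(d) &= \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left(-\frac{(d-r)^2}{2\sigma^2}\right).
\end{aligned}
```

Here θ would rotate the ellipse, μ would compress the standard circle of radius r, and G would place a one-dimensional normal ridge on the compressed circle; none of this form is confirmed by the source.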
1-5, learn using a loss function; the loss function Loss [rendered as an image in the original] takes as input f(x, y), the floating-point value of pixel point (x, y) after the picture is preprocessed.
Solving for the minimum of Loss yields the parameters θ, r, σ, σ1, σ2, ρ, μ1, μ2, w1, w2, and finally the cell distribution fitting model F′(x, y). Those skilled in the art will appreciate that the Loss value is a negated probability measure: maximizing the probability is equivalent to minimizing Loss.
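The minimization in step 1-5 can be sketched numerically. The sketch below assumes Loss = −Σ f(x, y)·log F(x, y) (the actual formula is an image in the original) and simplifies F to an isotropic two-dimensional Gaussian fitted by grid search over σ; the patent's full parameter set (θ, r, ρ, μ, w1, w2, …) would be handled the same way with a general-purpose optimizer.

```python
import math

def gaussian2d(x, y, sigma):
    """Isotropic 2-D normal density centered at the origin (simplified stand-in for F)."""
    return math.exp(-(x * x + y * y) / (2 * sigma * sigma)) / (2 * math.pi * sigma * sigma)

def loss(pixels, sigma):
    """Assumed loss: negative sum of pixel value times log-probability under the model."""
    return -sum(f * math.log(gaussian2d(x, y, sigma)) for (x, y), f in pixels.items())

# Synthetic "cell": pixel floats sampled from a Gaussian with sigma = 3.
truth = 3.0
pixels = {(x, y): gaussian2d(x, y, truth)
          for x in range(-12, 13) for y in range(-12, 13)}

# Coarse grid search over sigma, standing in for a gradient-based optimizer.
best_sigma = min((s / 10 for s in range(10, 61)), key=lambda s: loss(pixels, s))
print(best_sigma)
```

With the grid above, the recovered σ lands at the generating value up to grid resolution, which is the behavior the fitting step relies on.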
Further, preprocessing the picture comprises: whitening the microscopic picture so that background pixels are 0; the floating-point value f(x, y) of pixel point (x, y) is then computed by a formula [rendered as an image in the original] in which a is the pixel value of the pixel point (x, y).
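A minimal sketch of this whitening step. The exact floating-point formula is an image in the original, so mapping an 8-bit pixel value a to f = a/255 after zeroing near-background pixels is an assumption, as is the background threshold.

```python
# Hedged sketch of the preprocessing above. The exact formula is an image in the
# original; scaling the 8-bit pixel value a to a float f = a / 255 after zeroing
# the background is an assumption.

def whiten(pixels, background_threshold=10):
    """Zero out near-background pixels, then scale to floating point in [0, 1]."""
    out = {}
    for (x, y), a in pixels.items():
        a = 0 if a <= background_threshold else a   # background pixel -> 0
        out[(x, y)] = a / 255.0                     # assumed float conversion
    return out

patch = {(0, 0): 200, (0, 1): 5, (1, 0): 128}
print(whiten(patch))
```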
A cell counting method based on single sample learning is characterized by comprising the following steps:
3-1, carrying out microscope irradiation on the cell culture to obtain a microscopic picture;
3-2, obtain the cell distribution fitting model F′(x, y) by the construction method above; the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2 learned in step 1-5 are defined as the manual-interception hyper-parameters Hum_pa, recorded respectively as Hθ, Hr, Hσ, Hσ1, Hσ2, Hρ, Hμ, Hμ1, Hμ2, Hw1, Hw2;
3-3, traverse each pixel point in the microscopic picture to find a suitable interception size and training hyper-parameters;
3-3-1, taking pixel point A as the center and L_A1 as the cell interception length, intercept a new picture (a square picture with side length L_A1); taking the center of the new picture as the coordinate origin (0, 0), obtain the coordinate array [xA1, yA1] of the pixel points in the new picture; substitute the coordinate array [xA1, yA1] into steps 1-4 and 1-5 for model adaptation, learning the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2, defined as the hyper-parameters F_pa_in_L_A_A1 and recorded respectively as Lθ_A_A1, Lr_A_A1, Lσ_A_A1, Lσ1_A_A1, Lσ2_A_A1, Lρ_A_A1, Lμ1_A_A1, Lμ2_A_A1, Lw1_A_A1, Lw2_A_A1; where 0.5L < L_A1 < 1.5L. At the same time, F_pa_in_L_A_A1 is required to be similar to the manual-interception hyper-parameters Hum_pa, that is:
[The similarity conditions between F_pa_in_L_A_A1 and Hum_pa are rendered as images in the original and are not reproduced here.]
Repeat the above steps to obtain the hyper-parameters F_pa_in_L_A_A2, F_pa_in_L_A_A3, F_pa_in_L_A_A4, ..., F_pa_in_L_A_An for pictures with cell interception lengths L_A2, L_A3, L_A4, L_A5, ..., L_An respectively, where n is a natural number greater than or equal to 1;
3-3-2, for each picture intercepted in step 3-3-1, calculate the maximum likelihood function value Log_L_A_An and the likelihood function value per unit area Log_L_An/(L_An)²;
3-3-3, compare the per-unit-area likelihood values Log_L_An/(L_An)² of the pictures intercepted in step 3-3-1; select the cell interception length corresponding to the largest per-unit-area likelihood value as the cell interception length centered on pixel point A, denoted L′_A; the largest per-unit-area likelihood value is recorded as the actual per-unit-area likelihood value of L′_A;
3-4, taking pixel point A as the center and L′_A as the interception length, intercept the cell and count it;
3-5, repeat the above steps to traverse each pixel point in the microscopic picture.
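Steps 3-3-1 through 3-4 can be sketched as a multi-scale window search. This is a simplified stand-in: instead of re-fitting the full model at every candidate interception length, each square window is scored against a fixed isotropic Gaussian (sigma = length/4 is an arbitrary assumption), and the length with the largest per-unit-area log-likelihood wins, mirroring the Log_L_An/(L_An)² criterion.

```python
import math

# Hedged sketch of steps 3-3-1 to 3-4: for each candidate interception length
# (0.5L .. 1.5L), score the square window centered on a pixel and keep the
# length whose per-unit-area likelihood is largest. Scoring against a fixed
# Gaussian replaces the per-scale model re-fitting the patent describes.

def log_likelihood(pixels, center, length, sigma):
    """Sum of pixel float value times log-probability over the square window."""
    cx, cy = center
    half = int(length // 2)
    total = 0.0
    for dx in range(-half, half + 1):
        for dy in range(-half, half + 1):
            f = pixels.get((cx + dx, cy + dy), 0.0)
            prob = math.exp(-(dx * dx + dy * dy) / (2 * sigma * sigma)) / (2 * math.pi * sigma * sigma)
            total += f * math.log(prob)
    return total

def best_interception_length(pixels, center, L):
    """Pick the candidate length maximizing likelihood per unit area."""
    candidates = [round(L * k / 10, 6) for k in range(5, 16)]  # 0.5L .. 1.5L
    def per_area(length):
        return log_likelihood(pixels, center, length, sigma=length / 4) / (length * length)
    return max(candidates, key=per_area)

# Synthetic cell blob around the origin (illustrative data).
cell = {(x, y): math.exp(-(x * x + y * y) / 8.0) for x in range(-6, 7) for y in range(-6, 7)}
best = best_interception_length(cell, (0, 0), L=8)
print(best)
```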
Further, step 3-3-3 also includes: when the same pixel point is covered by several intercepted cells, compare the actual per-unit-area likelihood values of those intercepted cells; the pixel point is assigned to the cell with the largest actual per-unit-area likelihood value, and the remaining interceptions of it are not counted.
Further, in step 3-4 a threshold is set, all actual per-unit-area likelihood values are compared with the threshold, and only cells whose actual per-unit-area likelihood value exceeds the threshold are counted; the threshold is the average of all actual per-unit-area likelihood values.
Further, the threshold is set artificially.
Further, the likelihood function value of a pixel point is its floating-point value multiplied by the probability that the pixel point follows the F(x, y) distribution.
Compared with the prior art, the cell counting method of the invention has the following beneficial effects:
(1) the counting method only requires taking a picture with a microscope, without excessive additional operations;
(2) only a single cell needs to be marked manually; the rest can be marked automatically according to cell similarity;
(3) the final result can be tuned via the threshold, so a better result can be found by adjusting it;
(4) the method requires no consumables and only a small amount of labor, greatly reducing the cost of experiment and measurement.
Detailed Description
The following description provides many different embodiments, or examples, for implementing different features of the invention. The particular examples set forth below are illustrative only and are not intended to be limiting.
A construction method of a cell distribution model based on single sample learning comprises the following steps:
1-1, carrying out microscope irradiation on cell culture to obtain a microscopic picture;
1-2, manually mark a complete cell boundary on the microscopic picture; the longest distance across the boundary is L, defined as the interception length of the cell;
1-3, taking the central point of the picture as the coordinate origin (0, 0), obtain the coordinate array [x, y] of the pixel points (x, y) inside the complete cell: (x1, y1), (x2, y2), (x3, y3), (x4, y4), (x5, y5), ..., (xn, yn); where x and y are the horizontal and vertical coordinates of a pixel point in the coordinate system with origin (0, 0);
at this point the image is usually whitened to ensure that the background color (i.e. the cell-free positions) is black, i.e. the corresponding pixel value is 0;
1-4, establishing a cell distribution model:
[The equations defining the cell distribution model are rendered as images in the original and are not reproduced here.]
where σ1, σ2 are the standard deviations of the two-dimensional normal distribution; ρ is the covariance of the two-dimensional normal distribution; μ1, μ2 are the means of the two-dimensional normal distribution; in the deformed ellipse equation, r is the radius of the circle in the standard circle equation [equation rendered as an image in the original]; μ is a compression ratio, i.e. the standard circle is compressed proportionally into an ellipse-like shape; w1, w2 are weight coefficients with w1 + w2 = 1; σ is the standard deviation of the one-dimensional normal distribution;
1-5, learn using a loss function; the loss function [rendered as an image in the original] takes as input f(x, y), the floating-point value of pixel point (x, y) after the picture is preprocessed. Solving for the minimum of Loss yields the parameters θ, r, σ, σ1, σ2, ρ, μ1, μ2, w1, w2, and finally the cell distribution fitting model F′(x, y).
Substituting the coordinate array (x1, y1), (x2, y2), (x3, y3), (x4, y4), (x5, y5), ..., (xn, yn) into the learning and solving for the loss minimum yields, after learning, the parameters θ, r, σ1, σ2, ρ, μ1, μ2, w1, w2, and finally the cell-adapted model F′(x, y).
Further, preprocessing the picture comprises: whitening the microscopic picture so that background pixels are 0; the floating-point value f(x, y) of pixel point (x, y) is then computed by a formula [rendered as an image in the original] in which a is the pixel value of the pixel point (x, y).
Further, the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2 learned in step 1-5 are defined as the manual-interception hyper-parameters Hum_pa, recorded respectively as Hθ, Hr, Hσ, Hσ1, Hσ2, Hρ, Hμ, Hμ1, Hμ2, Hw1, Hw2.
It should be noted that different types of cells suit different models, and the computed hyper-parameters differ accordingly; spindle-shaped, spherical, and disc-shaped cells, for example, yield clearly different hyper-parameters.
The invention also provides a cell counting method based on single sample learning, which applies the constructed cell distribution model and comprises the following steps:
3-1, carrying out microscope irradiation on the cell culture to obtain a microscopic picture;
3-2, obtain the cell distribution model F′(x, y) by the cell model construction method above; the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2 learned in step 1-5 are defined as the manual-interception hyper-parameters Hum_pa, recorded respectively as Hθ, Hr, Hσ, Hσ1, Hσ2, Hρ, Hμ, Hμ1, Hμ2, Hw1, Hw2;
3-3, traversing each pixel point in the microscopic picture to find out a proper interception size and a proper training hyper-parameter;
3-3-1, taking pixel point A as the center and L_A1 as the cell interception length, intercept a new picture and obtain the coordinate array [xA1, yA1] of its pixel points, where xA1, yA1 are the horizontal and vertical coordinates of the new picture's pixel points in the coordinate system of the original microscopic picture, whose origin is the central point of the original microscopic picture; substitute the coordinate array [xA1, yA1] into steps 1-4 and 1-5 for model adaptation, learning the parameters θ, r, σ, σ1, σ2, ρ, μ1, μ2, w1, w2, defined as the hyper-parameters F_pa_in_L_A_A1 and recorded respectively as Lθ_A_A1, Lr_A_A1, Lσ_A_A1, Lσ1_A_A1, Lσ2_A_A1, Lρ_A_A1, Lμ1_A_A1, Lμ2_A_A1, Lw1_A_A1, Lw2_A_A1; where 0.5L < L_A1 < 1.5L. At the same time, F_pa_in_L_A_A1 is required to be similar to the manual-interception hyper-parameters Hum_pa, that is:
[The similarity conditions are rendered as images in the original and are not reproduced here.]
Repeat the above steps to obtain the hyper-parameters F_pa_in_L_A_A2, F_pa_in_L_A_A3, F_pa_in_L_A_A4, ..., F_pa_in_L_A_An for pictures with cell interception lengths L_A2, L_A3, L_A4, L_A5, ..., L_An respectively, where n is a natural number greater than or equal to 1;
For example, if point A has coordinates (x1, y1), then, centered on (x1, y1), pictures are intercepted with cell interception lengths of 0.5L, 0.6L, 0.7L, ..., 1.5L respectively to form new pictures;
those skilled in the art will appreciate that the specific truncation length may be selected based on the actual circumstances of the calculation, and the truncation lengths listed herein are for illustrative purposes only.
3-3-2, for each picture intercepted in step 3-3-1, calculate the maximum likelihood function value Log_L_A_An and the likelihood function value per unit area Log_L_An/(L_An)²;
the likelihood function value is the probability that a pixel point follows the F(x, y) distribution multiplied by the pixel point's floating-point value;
that is, obtain the maximum likelihood values Log_L_A_0.5L, Log_L_A_0.6L, Log_L_A_0.7L, ..., Log_L_A_1.5L of the pictures intercepted with 0.5L, 0.6L, 0.7L, ..., 1.5L as cell interception lengths, and the per-unit-area likelihood value of each picture: Log_L_A_0.5L/(L_A_0.5L)², Log_L_A_0.6L/(L_A_0.6L)², Log_L_A_0.7L/(L_A_0.7L)², ..., Log_L_A_1.5L/(L_A_1.5L)²;
3-3-3, compare the per-unit-area likelihood values Log_L_An/(L_An)² of the pictures; select the cell interception length corresponding to the largest per-unit-area likelihood value as the actual cell interception length centered on pixel point A, denoted L′_A; the largest per-unit-area likelihood value is recorded as the actual per-unit-area likelihood value of L′_A;
compare Log_L_A_0.5L/(L_A_0.5L)², Log_L_A_0.6L/(L_A_0.6L)², Log_L_A_0.7L/(L_A_0.7L)², ..., Log_L_A_1.5L/(L_A_1.5L)² and select the largest per-unit-area likelihood value; supposing Log_L_A_0.6L/(L_A_0.6L)² is the largest, select 0.6L as the actual interception length of the cell centered on pixel point A, denoted L′_A; the largest per-unit-area likelihood value Log_L_A_0.6L/(L_A_0.6L)² is recorded as the actual per-unit-area likelihood value of L′_A;
3-4, taking pixel point A as the center and L′_A = 0.6L as the interception length, intercept the cell and count it;
and 3-5, repeating the steps and traversing each pixel point in the microscopic picture.
For the pixel points B(x2, y2), C(x3, y3), D(x4, y4), ... in the microscopic picture, intercepted cells centered on B, C, D, ... are obtained likewise, e.g. L′_B = 0.8L, L′_C = 1.5L, L′_D = 0.9L, ...; the corresponding actual per-unit-area likelihood values are Log_L_B_0.8L/(L_B_0.8L)², Log_L_C_1.5L/(L_C_1.5L)², Log_L_D_0.9L/(L_D_0.9L)², ...
To ensure that no pixel point is covered by several intercepted cells (when the same point is covered by several cells, cells have been intercepted repeatedly), when (xn, yn) is covered by several intercepted cell images, e.g. by the intercepted cells centered on pixel points B and C above, then (xn, yn) is assigned to the cell with the largest actual per-unit-area likelihood value:
for example, when (xn, yn) is intercepted both by cell 2 centered on pixel point B and by cell 3 centered on pixel point C, compare the actual per-unit-area likelihood values of cell 2 and cell 3; by the assumption above, cell 2's value is Log_L_B_0.8L/(L_B_0.8L)² and cell 3's value is Log_L_C_1.5L/(L_C_1.5L)²; then:
if Log_L_B_0.8L/(L_B_0.8L)² < Log_L_C_1.5L/(L_C_1.5L)², (xn, yn) is intercepted by cell 3 centered on pixel point C and not by cell 2 centered on pixel point B;
similarly, if Log_L_B_0.8L/(L_B_0.8L)² > Log_L_C_1.5L/(L_C_1.5L)², (xn, yn) is intercepted by cell 2 centered on pixel point B and not by cell 3 centered on pixel point C.
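The overlap rule above amounts to a score-based assignment, similar in spirit to non-maximum suppression. A minimal sketch, with hypothetical cell IDs, pixel sets, and per-unit-area scores (all illustrative, not from real data):

```python
# Hedged sketch of the overlap rule: when a pixel falls inside several
# intercepted cells, assign it to the cell with the largest actual
# per-unit-area likelihood value. IDs and scores below are illustrative.

def assign_pixels(cells):
    """cells: {cell_id: (score, set_of_pixels)} -> {pixel: winning cell_id}."""
    owner = {}
    for cell_id, (score, pix) in cells.items():
        for p in pix:
            # Reassign only if this cell's score beats the current owner's.
            if p not in owner or cells[owner[p]][0] < score:
                owner[p] = cell_id
    return owner

cells = {
    "B": (-0.9, {(5, 5), (5, 6)}),   # cell centered on B, lower score
    "C": (-0.4, {(5, 6), (6, 6)}),   # cell centered on C, higher score
}
print(assign_pixels(cells))
```

Pixel (5, 6) is claimed by both cells; the higher score of "C" wins, matching the comparison of Log_L_B and Log_L_C above.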
Meanwhile, a pixel point may be covered by only one intercepted cell while that interception does not actually contain a cell.
To handle this, a threshold is set: when the actual per-unit-area likelihood value of an intercepted cell is below the threshold, the interception is not counted, i.e. interceptions that are not cells are discarded.
The threshold is calculated as: the average of all actual unit area likelihood function values;
For example, if the actual per-unit-area likelihood value of intercepted cell 1 is Log_L_A_0.6L/(L_A_0.6L)², that of cell 2 is Log_L_B_0.8L/(L_B_0.8L)², that of cell 3 is Log_L_C_1.5L/(L_C_1.5L)², and that of cell 4 is Log_L_D_0.9L/(L_D_0.9L)², then the threshold a is:
a = (1/4) × (Log_L_A_0.6L/(L_A_0.6L)² + Log_L_B_0.8L/(L_B_0.8L)² + Log_L_C_1.5L/(L_C_1.5L)² + Log_L_D_0.9L/(L_D_0.9L)²)
when the actual unit area likelihood function value of any one of the cells 1, 2, 3 and 4 is less than a, the cell is discarded and is not counted.
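The threshold computation and filtering can be sketched directly; the four scores below are illustrative placeholders, not values from real data.

```python
# Sketch of the threshold rule: threshold = mean of all actual per-unit-area
# likelihood values; cells scoring at or below it are discarded.

def count_cells(scores):
    """scores: {cell_id: per-unit-area likelihood} -> (kept ids, threshold)."""
    threshold = sum(scores.values()) / len(scores)
    kept = [c for c, s in scores.items() if s > threshold]
    return kept, threshold

scores = {"cell1": -0.6, "cell2": -0.8, "cell3": -1.5, "cell4": -0.9}
kept, a = count_cells(scores)
print(kept)
```

With these illustrative scores the mean is −0.95, so cell3 falls below the threshold and is discarded while the other three are counted.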
It will be understood by those skilled in the art that this embodiment is merely an example; in practical applications there are m intercepted cells, where m is a natural number greater than or equal to 1.
Of course, the threshold value may be set empirically, and the most suitable threshold value is selected by manually controlling the threshold value, so as to obtain the final cell interception result.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (6)

1. A construction method of a cell distribution model based on single sample learning is characterized in that,
1-1, carrying out microscope irradiation on cell culture to obtain a microscopic picture;
1-2, manually marking a complete cell boundary on a microscopic picture, wherein the longest distance of the boundary is L and is defined as the interception length of the cell;
1-3, taking the central point of the picture as a coordinate origin (0, 0) to obtain a coordinate array [ x, y ] of pixel points (x, y) in the boundary of the artificially marked complete cell, wherein x and y are respectively horizontal and vertical coordinates of the pixel points in a coordinate system taking the coordinate origin (0, 0) as the coordinate origin;
1-4, establishing a cell distribution model F (x, y):
[The model equations are rendered as images in the original and are not reproduced here.]
wherein σ1, σ2 are the standard deviations of the two-dimensional normal distribution; ρ is the covariance of the two-dimensional normal distribution; μ1, μ2 are the means of the two-dimensional normal distribution; in the deformed ellipse equation, r is the radius of the circle in the standard circle equation [equation rendered as an image in the original]; μ is a compression ratio; w1, w2 are weight coefficients with w1 + w2 = 1; σ is the standard deviation of the one-dimensional normal distribution;
1-5, learning using a Loss function [rendered as an image in the original], wherein f(x, y) is the floating-point value of pixel point (x, y) after the picture is preprocessed; solving for the minimum of Loss yields the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2, and finally the cell distribution fitting model F′(x, y).
2. The construction method according to claim 1, wherein preprocessing the picture comprises: whitening the microscopic picture so that background pixels are 0; the floating-point value f(x, y) of pixel point (x, y) is computed by a formula [rendered as an image in the original] in which a is the pixel value of the pixel point (x, y).
3. A cell counting method based on single sample learning is characterized by comprising the following steps:
3-1, carrying out microscope irradiation on the cell culture to obtain a microscopic picture;
3-2. obtaining a cell distribution fitting model F′(x, y) by the construction method according to claim 1 or 2; the parameters θ, r, σ, σ1, σ2, ρ, μ1, μ2, w1, w2 learned in step 1-5 are defined as the manual-interception hyper-parameters Hum_pa, recorded respectively as Hθ, Hr, Hσ, Hσ1, Hσ2, Hρ, Hμ, Hμ1, Hμ2, Hw1, Hw2;
3-3, traversing each pixel point in the microscopic picture to find out a proper interception size and a proper training hyper-parameter;
3-3-1, taking pixel point A as the center and L_A1 as the cell interception length, intercepting a new picture; taking the center of the new picture as the coordinate origin (0, 0), obtaining the coordinate array [xA1, yA1] of its pixel points; substituting the coordinate array [xA1, yA1] into steps 1-4 and 1-5 for model adaptation, learning the parameters θ, r, σ, σ1, σ2, ρ, μ, μ1, μ2, w1, w2, defined as the hyper-parameters F_pa_in_L_A_A1 and recorded respectively as Lθ_A_A1, Lr_A_A1, Lσ_A_A1, Lσ1_A_A1, Lσ2_A_A1, Lρ_A_A1, Lμ1_A_A1, Lμ2_A_A1, Lw1_A_A1, Lw2_A_A1; where 0.5L < L_A1 < 1.5L; at the same time, F_pa_in_L_A_A1 is required to be similar to the manual-interception hyper-parameters Hum_pa, that is:
[The similarity conditions are rendered as images in the original and are not reproduced here.]
repeating the above steps to obtain hyper-parameters F _ pa _ in _ L _ a2, F _ pa _ in _ L _ A3, F _ pa _ in _ L _ a _ 4, … …, F _ pa _ in _ L _ a _ An of pictures with cell truncation lengths of L _ a2, L _ A3, L _ a4, L _ a5, … …, and L _ An, respectively; wherein n is a natural number greater than or equal to 1;
3-3-2. calculating the maximum likelihood function value Log_L_A_An of each picture intercepted in step 3-3-1 and its unit-area likelihood function value Log_L_A_An/(L_An)²;
3-3-3. comparing the unit-area likelihood function values Log_L_A_An/(L_An)² of the pictures intercepted in step 3-3-1, and selecting the cell interception length corresponding to the maximum unit-area likelihood function value as the cell interception length centered on pixel point A, recorded as L'_A; the maximum unit-area likelihood function value is recorded as the actual unit-area likelihood function value of L'_A;
3-4. intercepting the cell with pixel point A as the center and L'_A as the interception length, and counting it;
3-5. repeating the above steps to traverse each pixel point in the microscopic picture.
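A minimal sketch of the length-selection loop in steps 3-3-1 to 3-3-3, in Python. The `fit_model` function here is a hypothetical stand-in for the model fitting of steps 1-4 and 1-5 (the patent's actual cell distribution model is defined in claims 1 and 2); it fakes the log-likelihood from pixel statistics, so the numbers are illustrative only:

```python
import numpy as np

def fit_model(crop):
    # Hypothetical stand-in for the model fitting of steps 1-4/1-5:
    # returns (parameters, maximized log-likelihood Log_L) for a square crop.
    # The log-likelihood is faked from the crop's pixel variance.
    return {}, -float(np.var(crop))

def best_interception_length(image, cx, cy, L, n_lengths=5):
    # For pixel point A = (cx, cy), try interception lengths within
    # (0.5L, 1.5L) and keep the one maximizing the unit-area likelihood
    # Log_L / L_An**2 (steps 3-3-2 and 3-3-3).
    best = None
    for L_a in np.linspace(0.6 * L, 1.4 * L, n_lengths).astype(int):
        half = int(L_a) // 2
        crop = image[max(cy - half, 0):cy + half, max(cx - half, 0):cx + half]
        if crop.size == 0:
            continue
        _, log_l = fit_model(crop)
        score = log_l / float(L_a ** 2)   # unit-area likelihood function value
        if best is None or score > best[1]:
            best = (int(L_a), score)      # (L'_A, actual unit-area likelihood)
    return best
```

Under this toy scoring, a crop whose pixels are uniform gets the highest (zero-variance) score, so the smallest length fully inside a uniform region wins; the real criterion would come from the fitted distribution model.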
4. The cell counting method according to claim 3, further comprising, in step 3-3-3: when the same pixel point is covered by a plurality of intercepted cells, comparing the actual unit-area likelihood function values corresponding to those intercepted cells, assigning the pixel point to the intercepted cell whose actual unit-area likelihood function value is the maximum, and not counting the remaining intercepted cells.
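The overlap rule of claim 4 can be sketched as follows, assuming each intercepted cell is represented by a hypothetical (center, half-length, score) tuple, where the score is its actual unit-area likelihood function value; the names are illustrative only:

```python
import numpy as np

def resolve_overlaps(shape, cells):
    # Sketch of claim 4: every pixel covered by several intercepted cells is
    # assigned to the cell with the largest actual unit-area likelihood value.
    # `cells` is a list of ((row, col) center, half_length, score) tuples.
    owner = np.full(shape, -1, dtype=int)    # winning cell index per pixel
    best = np.full(shape, -np.inf)           # best score seen per pixel
    for i, ((cy, cx), half, score) in enumerate(cells):
        ys = slice(max(cy - half, 0), cy + half + 1)
        xs = slice(max(cx - half, 0), cx + half + 1)
        win = score > best[ys, xs]
        owner[ys, xs][win] = i               # reassign contested pixels
        best[ys, xs][win] = score
    kept = sorted(set(owner[owner >= 0].tolist()))
    return owner, kept                       # only cells in `kept` are counted
```

Cells that lose every pixel they cover never appear in `kept`, matching the claim's rule that the remaining intercepted cells are not counted.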
5. The cell counting method according to claim 3, wherein in step 3-4 a threshold value is set, all the actual unit-area likelihood function values are compared with the threshold value, and only the cells whose actual unit-area likelihood function values are larger than the threshold value are counted; the threshold value is the average of all the actual unit-area likelihood function values.
6. The cell counting method according to claim 5, wherein the threshold value is set manually.
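A minimal sketch of the threshold rule in claims 5 and 6; the function name and signature are illustrative, not from the patent:

```python
def count_cells(scores, threshold=None):
    # Sketch of claims 5 and 6: `scores` are the actual unit-area likelihood
    # function values of the intercepted cells. By default (claim 5) the
    # threshold is the average of all scores; it may instead be supplied
    # manually (claim 6). Only cells strictly above the threshold are counted.
    if threshold is None:
        threshold = sum(scores) / len(scores)
    return sum(1 for s in scores if s > threshold)
```

With the default average threshold, low-likelihood crops (likely background or partial cells) are rejected automatically; a manual threshold lets an operator tighten or relax that cutoff.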
CN202110109382.1A 2021-01-27 2021-01-27 Cell distribution model construction and cell counting method based on single sample learning Active CN112435259B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110109382.1A CN112435259B (en) 2021-01-27 2021-01-27 Cell distribution model construction and cell counting method based on single sample learning

Publications (2)

Publication Number Publication Date
CN112435259A true CN112435259A (en) 2021-03-02
CN112435259B CN112435259B (en) 2021-04-02

Family

ID=74697336

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110109382.1A Active CN112435259B (en) 2021-01-27 2021-01-27 Cell distribution model construction and cell counting method based on single sample learning

Country Status (1)

Country Link
CN (1) CN112435259B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114067315A (en) * 2021-10-23 2022-02-18 广州市艾贝泰生物科技有限公司 Cell counting method, cell counting device, computer device, and storage medium

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101275897A (en) * 2008-03-06 2008-10-01 上海交通大学 Medulla biopsy slice hematopoiesis cell characteristic recognizing method
US20110098998A1 (en) * 2009-10-28 2011-04-28 Chevron U.S.A. Inc. Multiscale finite volume method for reservoir simulation
CN107527028A (en) * 2017-08-18 2017-12-29 深圳乐普智能医疗器械有限公司 Target cell recognition methods, device and terminal
CN108346145A (en) * 2018-01-31 2018-07-31 浙江大学 The recognition methods of unconventional cell in a kind of pathological section
CN109406768A (en) * 2018-11-06 2019-03-01 扬州大学 A method of the distribution of observation microtissue inner cell solid and statistics cell number
CN109765164A (en) * 2018-03-06 2019-05-17 中国科学院高能物理研究所 A kind of absolute quantitation detection method of unicellular middle gold nanorods
WO2019099592A1 (en) * 2017-11-14 2019-05-23 miDiagnostics NV Classification of a population of objects by convolutional dictionary learning with class proportion data
CN110501278A (en) * 2019-07-10 2019-11-26 同济大学 A kind of method for cell count based on YOLOv3 and density estimation
CN110516584A (en) * 2019-08-22 2019-11-29 杭州图谱光电科技有限公司 A kind of Auto-counting of Cells method based on dynamic learning of microscope
CN110648309A (en) * 2019-08-12 2020-01-03 平安科技(深圳)有限公司 Method for generating erythrocyte image complexed by antithetical net based on conditions and related equipment
CN110647875A (en) * 2019-11-28 2020-01-03 北京小蝇科技有限责任公司 Method for segmenting and identifying model structure of blood cells and blood cell identification method
CN111520073A (en) * 2020-03-30 2020-08-11 西安石油大学 Quantitative characterization method for collision prevention risk of large well cluster infilled well
CN111723693A (en) * 2020-06-03 2020-09-29 云南大学 Crowd counting method based on small sample learning
CN111724381A (en) * 2020-06-24 2020-09-29 武汉互创联合科技有限公司 Microscopic image cell counting and posture identification method based on multi-view cross validation

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
LLUÍS HURTADO-GIL et al.: "The best fit for the observed galaxy Counts-in-Cell distribution function", arXiv:1703.01087 *
WENLING TU et al.: "Testis-specific protein, Y-linked 1 activates PI3K/AKT and RAS signaling pathways through suppressing IGFBP3 expression during tumor progression", http://dx.doi.org/10.1111/cas.13984 *
CUI SHIGANG et al.: "Parameter optimization of a microalgae light distribution model based on a genetic algorithm", Jiangsu Agricultural Sciences *
QIN JIAO: "An optimized prediction model for poor prognosis of hepatitis-B-related acute-on-chronic liver failure based on red blood cell distribution width", China Master's Theses Full-text Database, Medicine & Health Sciences *


Also Published As

Publication number Publication date
CN112435259B (en) 2021-04-02

Similar Documents

Publication Publication Date Title
US9292729B2 (en) Method and software for analysing microbial growth
US8687879B2 (en) Method and apparatus for generating special-purpose image analysis algorithms
CN113723573B (en) Tumor tissue pathological classification system and method based on adaptive proportion learning
CN112435259B (en) Cell distribution model construction and cell counting method based on single sample learning
US20130183707A1 (en) Stem cell bioinformatics
CN108009567B (en) Automatic excrement character distinguishing method combining image color and HOG and SVM
WO2022100517A1 (en) Cell analysis method and system, and quantitative method and system
CN113610101A (en) Method for measuring germination rate of grains
CN112102232A (en) Method and device for automatically evaluating colony quality of induced pluripotent stem cells
CN112949517A (en) Plant stomata density and opening degree identification method and system based on deep migration learning
CN116385374A (en) Cell counting method based on convolutional neural network
CN109147932A (en) cancer cell HER2 gene amplification analysis method and system
CN114494739B (en) Toner mixing effect detection method based on artificial intelligence
CN117152147A (en) Online chromosome collaborative analysis method, system and medium
CN109978058A (en) Determine the method, apparatus, terminal and storage medium of image classification
CN115760957B (en) Method for analyzing substances in cell nucleus by three-dimensional electron microscope
CN112540039A (en) Method for directly calculating number of adherent living cells
Kbiri et al. Quantifying Meiotic Crossover Recombination in Arabidopsis Lines Expressing Fluorescent Reporters in Seeds Using Seed Scoring Pipeline for CellProfiler
JP2024513984A (en) Analysis of microscopic images of microalgae culture samples
CN110175531B (en) Attitude-based examinee position positioning method
Mendelsohn et al. Morphological analysis of cells and chromosomes by digital computer
Qu Two Algorithms of Image Segmentation and Measurement Method of Particle’s Parameters
CN109924147A (en) Information collection measurement system and measuring method in a kind of crucian hybrid seeding
CN108537244A (en) A kind of gradual deep learning method towards real-time system
US20240193968A1 (en) Analyzing microscope images of microalgae culture samples

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant