CN116609311A - Transfection identification method, transfection efficiency measuring and calculating method, device and microscope system - Google Patents
- Publication number
- CN116609311A (application number CN202310642599.8A)
- Authority
- CN
- China
- Prior art keywords
- image
- cell
- region
- fluorescence
- fluorescent
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/28—Measuring arrangements characterised by the use of optical techniques for measuring areas
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/62—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
- G01N21/63—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
- G01N21/64—Fluorescence; Phosphorescence
- G01N21/645—Specially adapted constructive features of fluorimeters
- G01N21/6456—Spatial resolved fluorescence measurements; Imaging
- G01N21/6458—Fluorescence microscopy
- G01N21/84—Systems specially adapted for particular applications
- G01N21/85—Investigating moving fluids or granular solids
- G01N2021/8592—Grain or other flowing solid samples
Landscapes
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
Abstract
The application discloses a transfection identification method, a transfection efficiency measuring and calculating method, a device and a microscope system. The transfection identification method comprises the following steps: acquiring data of a phase image and data of a fluorescence image of cells of a sample to be identified; identifying the cell region corresponding to each cell in the phase image, and identifying the fluorescence region in the fluorescence image; comparing the image data of each identified cell region with the image data of the fluorescence region; and determining the transfection result of the cell corresponding to each cell region according to the comparison result. By comparing the identified cell-region data with the fluorescence-region data, the application accurately identifies the transfection result of each cell and thereby improves the accuracy of the transfection efficiency measurement.
Description
Technical Field
The application relates to the technical field of cell analysis, in particular to a transfection identification method, a transfection efficiency measuring and calculating method, a device and a microscope system.
Background
Cell transfection refers to the technique of introducing foreign molecules into eukaryotic cells. As molecular biology and cell biology research has progressed, transfection has become a routine tool for studying and controlling gene function in eukaryotic cells. The introduction of recombinant DNA is increasingly widely applied in biological experiments such as studying gene function, regulating gene expression, mutation analysis, and protein production.
An ideal cell transfection method should offer high transfection efficiency, low cytotoxicity, and similar advantages. It is therefore necessary to measure the transfection efficiency of cells. In the prior art, whether the transfection efficiency is calculated from the number of cells or from the transfected area, the result is not accurate enough.
When transfection efficiency is calculated from cell numbers, the number of fluorescent-protein regions is taken as the number of transfected cells. In practice, however, one cell may form multiple fluorescent-protein regions across several transfected areas, and the fluorescent proteins of several cells may overlap and be counted as one. Transfection efficiency calculated from the number of fluorescent proteins is therefore inaccurate.
When transfection efficiency is calculated from the transfected area, the ratio of the fluorescent-protein area to the total cell area is used. However, the transfected area of each cell, that is, the size of the fluorescent-protein region corresponding to each transfected cell, varies: it may cover 1/2 of one cell, 3/4 of another, or even the whole cell, and fluorescent-protein regions may overlap. Calculation of transfection efficiency from the fluorescent-protein area is therefore also inaccurate.
The above background is provided only to aid understanding of the inventive concept and technical solution of the present application; it does not necessarily belong to the prior art of the present patent application, nor does it necessarily provide technical teaching. In the absence of clear evidence that the above content was disclosed before the filing date of the present patent application, the background should not be used to assess the novelty and inventiveness of the present application.
Disclosure of Invention
The invention aims to provide a transfection identification method, a transfection efficiency measuring and calculating method, a device and a microscope system. By comparing the identified cell-region data with the fluorescence-region data, the transfection result of each cell can be accurately identified, thereby improving the accuracy of the transfection efficiency measurement.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a cell transfection recognition method comprising the steps of:
acquiring data of a phase image and data of a fluorescence image of cells of a sample to be identified;
identifying cell areas corresponding to each cell in the phase image, and identifying fluorescent areas in the fluorescent image;
comparing the image data of each identified cell region with the image data of the fluorescent region;
and determining the transfection result of the cells corresponding to each cell area according to the comparison result.
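The four steps above can be sketched end to end. This is a minimal illustration, not the patent's implementation: the function and variable names are our own, and we assume the identification step has already produced a labelled cell mask and a binary fluorescence mask for the same field of view.

```python
import numpy as np

def classify_transfection(cell_labels: np.ndarray, fluo_mask: np.ndarray) -> dict:
    """For each labelled cell region, report True if it overlaps at least
    one fluorescence pixel (i.e. the cell is considered transfected)."""
    results = {}
    for label in np.unique(cell_labels):
        if label == 0:          # 0 = background, not a cell
            continue
        overlap = np.logical_and(cell_labels == label, fluo_mask)
        results[int(label)] = bool(overlap.any())
    return results

# Two cells (labels 1 and 2); fluorescence only inside cell 1.
cells = np.array([[1, 1, 0, 2, 2],
                  [1, 1, 0, 2, 2]])
fluo = np.array([[0, 1, 0, 0, 0],
                 [0, 0, 0, 0, 0]], dtype=bool)
print(classify_transfection(cells, fluo))  # {1: True, 2: False}
```

Because each labelled region is tested independently, one fluorescence spot cannot be double-counted across cells, which is the inaccuracy the background section describes for fluorescence-count methods.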
Further, in any one or a combination of the foregoing aspects, the fields of view of the phase image and the fluorescent image are the same.
Further, any one or a combination of the foregoing aspects, wherein determining the transfection result of the cells corresponding to each cell region according to the comparison result includes:
Identifying whether a cell region overlaps with the fluorescent region;
and determining cells corresponding to the cell regions overlapping the fluorescent regions as transfected cells.
Further, any one or a combination of the above, identifying whether a cell region overlaps with the fluorescent region by:
identifying cell areas corresponding to cells in the phase image, and respectively obtaining a first set of position coordinates in each cell area;
identifying a fluorescence region in the fluorescence image to obtain a second set of position coordinates within the fluorescence region;
judging whether any position coordinate in a first set also exists in the second set;
and, when at least one position coordinate of a first set exists in the second set, determining that the cell region corresponding to that first set overlaps with the fluorescence region.
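The coordinate-set comparison just described reduces to a set-intersection test; a small sketch with illustrative names:

```python
def overlaps_by_coordinates(cell_coords: set, fluo_coords: set) -> bool:
    """A cell region overlaps the fluorescence region if its set of
    (row, col) position coordinates shares at least one member with the
    fluorescence region's coordinate set."""
    return not cell_coords.isdisjoint(fluo_coords)

cell_a = {(0, 0), (0, 1), (1, 0)}   # first set for one cell region
cell_b = {(5, 5), (5, 6)}           # first set for another cell region
fluo   = {(0, 1), (9, 9)}           # second set (fluorescence region)
assert overlaps_by_coordinates(cell_a, fluo) is True   # shares (0, 1)
assert overlaps_by_coordinates(cell_b, fluo) is False
```

`set.isdisjoint` short-circuits on the first shared coordinate, so the test is cheap even for large regions.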
Further, any one or a combination of the above, identifying whether a cell region overlaps with the fluorescent region by:
positioning target areas in the fluorescence image in one-to-one relation with the cell areas so that the positions of the target areas in the fluorescence image coincide with the positions of the cell areas in the phase image;
And determining a target area which is at least partially overlapped with the fluorescent area identified in the fluorescent image, and determining that the cell area associated with the target area is overlapped with the fluorescent area.
Further, any one or a combination of the above, identifying whether a cell region overlaps with the fluorescent region by:
positioning a target region associated with the fluorescence region in the phase image such that a position of the target region in the phase image coincides with a position of the fluorescence region in the fluorescence image;
a region of the cell that at least partially coincides with the target region is determined to overlap with the fluorescent region.
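Because both of these variants rely on the two images sharing one field of view, "positioning a target region" amounts to reusing the same pixel positions in the other image. A hedged mask-based sketch (names and shapes are our own):

```python
import numpy as np

def target_region_overlaps(region_mask: np.ndarray, other_mask: np.ndarray) -> bool:
    """With an identical field of view, a region's pixel positions in one
    image directly locate the target region in the other image; the cell
    region overlaps the fluorescence region iff the two masks intersect."""
    assert region_mask.shape == other_mask.shape, "same field of view assumed"
    return bool(np.logical_and(region_mask, other_mask).any())

cell_mask = np.zeros((4, 4), dtype=bool); cell_mask[1:3, 1:3] = True
fluo_mask = np.zeros((4, 4), dtype=bool); fluo_mask[2, 2] = True
print(target_region_overlaps(cell_mask, fluo_mask))  # True
```

The same function serves both directions: mapping cell regions into the fluorescence image or mapping the fluorescence region into the phase image.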
Further, any one or a combination of the above, identifying whether a cell region overlaps with the fluorescent region by:
superposing the phase image and the fluorescent image to obtain a composite image;
comparing, for each cell region identified in the phase image, the image parameters at the same location region in the composite image, the image parameters being parameters related to fluorescence characteristics;
and determining that a cell region in which the change in the image parameters exceeds a preset threshold overlaps with the fluorescence region.
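One way to read the composite-image method (our illustrative interpretation; the patent does not fix the parameter or the threshold value): add the fluorescence channel onto the phase image, then flag each cell region whose fluorescence-related intensity change exceeds the preset threshold.

```python
import numpy as np

def transfected_by_composite(cell_labels, phase_gray, fluo_intensity, threshold=10.0):
    """Form a composite by superposing the fluorescence channel on the
    phase image; a cell is flagged when the mean intensity inside its
    region changes by more than `threshold` (illustrative value)."""
    composite = phase_gray.astype(float) + fluo_intensity.astype(float)
    flagged = []
    for label in np.unique(cell_labels):
        if label == 0:                        # background
            continue
        region = cell_labels == label
        change = composite[region].mean() - phase_gray[region].mean()
        if change > threshold:
            flagged.append(int(label))
    return flagged

cells = np.array([[1, 1, 2, 2]])
phase = np.array([[100, 100, 100, 100]])
fluo  = np.array([[80, 80, 0, 0]])            # fluorescence only over cell 1
print(transfected_by_composite(cells, phase, fluo))  # [1]
```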
Further, any one or a combination of the above, identifying whether a cell region overlaps with the fluorescent region by:
drawing contour lines of each cell area in the phase image by utilizing an image processing technology to obtain a processed phase image; and outlining the fluorescence area in the fluorescence image to obtain a processed fluorescence image;
superposing the processed phase image and the processed fluorescent image to obtain a synthetic image, wherein the synthetic image is provided with a first contour line corresponding to the cell area and a second contour line corresponding to the fluorescent area;
and determining a first contour line intersecting with or surrounding the second contour line in the synthetic image, and determining that a cell area corresponding to the first contour line overlaps with the fluorescence area.
Further, any one or a combination of the above, identifying whether a cell region overlaps with the fluorescent region by:
drawing the contour line of each cell region in the phase image by means of image processing technology to obtain a processed phase image; and applying a fill mark to the fluorescence region in the fluorescence image to obtain a processed fluorescence image;
superposing the processed phase image and the processed fluorescent image to obtain a synthetic image, wherein the synthetic image is provided with a contour line corresponding to the cell area and a filling mark corresponding to the fluorescent area;
and determining a contour line intersecting with or surrounding the filling mark in the synthetic image, and determining that a cell area corresponding to the contour line overlaps with the fluorescence area.
Further, any one or a combination of the above, identifying whether a cell region overlaps with the fluorescent region by:
applying a fill mark to each cell region in the phase image by means of image processing technology to obtain a processed phase image; and outlining the fluorescence region in the fluorescence image to obtain a processed fluorescence image;
superposing the processed phase image and the processed fluorescent image to obtain a synthetic image, wherein the synthetic image is provided with a filling mark corresponding to the cell area and a contour line corresponding to the fluorescent area;
And determining a filling mark intersected with the contour line in the synthetic image, and determining that a cell area corresponding to the filling mark overlaps with the fluorescent area.
Further, any one or a combination of the above, identifying whether a cell region overlaps with the fluorescent region by:
applying a fill mark to each cell region in the phase image by means of image processing technology to obtain a processed phase image; and applying a fill mark to the fluorescence region in the fluorescence image to obtain a processed fluorescence image;
superposing the processed phase image and the processed fluorescent image to obtain a composite image, wherein the composite image is provided with a first filling mark corresponding to the cell area and a second filling mark corresponding to the fluorescent area;
and determining a first filling mark intersected with the second filling mark in the synthetic image, and determining that a cell area corresponding to the first filling mark overlaps with the fluorescent area.
Further, the combination of any one or more of the foregoing aspects, wherein a color of a contour line of a cell region outlined in the phase image is different from a color of a contour line of a fluorescence region outlined in the fluorescence image;
Alternatively, the color of the contour lines of the cell region delineated in the phase image is different from the color of the filling mark made in the fluorescent region in the fluorescent image;
alternatively, the color of the filling mark made in the cell region in the phase image is different from the color of the outline of the fluorescence region outlined in the fluorescence image;
alternatively, the color or pattern of the fill mark made in the cell region in the phase image is different from the color or pattern of the fill mark made in the fluorescent region in the fluorescent image.
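The fill-mark variants above, including the different-colour requirement, can be sketched by rendering the two fill marks into separate colour channels of a single composite (a toy rendering under our own conventions; a real implementation would draw onto the microscope images themselves):

```python
import numpy as np

def composite_with_fill_marks(cell_fill: np.ndarray, fluo_fill: np.ndarray):
    """Render the cell fill mark in red and the fluorescence fill mark in
    green; pixels lit in both channels reveal the intersection."""
    h, w = cell_fill.shape
    rgb = np.zeros((h, w, 3), dtype=np.uint8)
    rgb[cell_fill, 0] = 255        # red channel: cell-region fill mark
    rgb[fluo_fill, 1] = 255        # green channel: fluorescence fill mark
    overlap = cell_fill & fluo_fill
    return rgb, bool(overlap.any())

cell_fill = np.array([[True, True, False]])
fluo_fill = np.array([[False, True, True]])
rgb, overlaps = composite_with_fill_marks(cell_fill, fluo_fill)
print(overlaps)  # True: pixel (0, 1) carries both marks
```

Using distinct channels keeps the two marks distinguishable in the composite, which is exactly why the text requires different colours or patterns.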
Further, in any one or a combination of the foregoing aspects, an AI neural network model is pretrained, the model being configured to identify the cell region in a phase image input into the model and to identify the fluorescence region in a fluorescence image input into the model.
According to another aspect of the present invention, there is provided a method for measuring and calculating transfection efficiency of cells, comprising the steps of:
determining the transfection result of cells corresponding to each cell region in the phase image of the sample by using the cell transfection identification method described in any one or a combination of the above aspects;
counting the number n1 of transfected cells therein, and counting the total number Ntotal of cells in the phase image;
cell transfection efficiency is calculated by the following formula: η = n1 / Ntotal.
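The count-based formula η = n1 / Ntotal follows directly from the per-cell transfection results of the identification step (the dictionary shape here is our own convention, not the patent's):

```python
def transfection_efficiency_by_count(results: dict) -> float:
    """eta = n1 / Ntotal, where `results` maps each cell region to its
    transfection result (True = transfected)."""
    n_total = len(results)                                   # Ntotal
    n1 = sum(1 for transfected in results.values() if transfected)
    return n1 / n_total

results = {1: True, 2: False, 3: True, 4: False}  # hypothetical outcome
print(transfection_efficiency_by_count(results))  # 0.5
```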
Further, any one or a combination of the foregoing aspects, including the steps of:
determining the transfection result of cells corresponding to each cell region in the phase image of the sample by using the cell transfection identification method described in any one or a combination of the above aspects;
counting the area s1 of the cell regions corresponding to the transfected cells, and counting the sum Stotal of the areas of all cell regions in the phase image;
cell transfection efficiency is calculated by the following formula: η = s1 / Stotal.
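Likewise the area-based formula η = s1 / Stotal, here with areas measured in pixels of a labelled cell mask (a sketch under our own data conventions):

```python
import numpy as np

def transfection_efficiency_by_area(cell_labels: np.ndarray, results: dict) -> float:
    """eta = s1 / Stotal: pixel area of the transfected cell regions over
    the total pixel area of all cell regions."""
    s_total = int((cell_labels > 0).sum())                   # Stotal
    s1 = sum(int((cell_labels == label).sum())
             for label, transfected in results.items() if transfected)
    return s1 / s_total

cells = np.array([[1, 1, 1, 0],
                  [2, 2, 0, 0]])      # cell 1: 3 px, cell 2: 2 px
print(transfection_efficiency_by_area(cells, {1: True, 2: False}))  # 0.6
```

Note that this measures the whole area of each transfected cell, not the fluorescent area, which is the distinction the background section draws.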
According to another aspect of the present invention, there is provided a cell transfection recognition device comprising the following modules:
an image acquisition module configured to acquire data of a phase image and data of a fluorescence image of cells of a sample to be identified;
an identification module configured to identify a cell region corresponding to each cell in the phase image, and to identify a fluorescence region in the fluorescence image;
a comparison module configured to compare the image data of the identified individual cell regions with the image data of the fluorescence regions;
A transfection property determination module configured to determine a transfection result of the cells corresponding to each cell region based on the comparison result.
According to another aspect of the present invention, there is provided a cell transfection efficiency measuring device comprising:
a cell transfection recognition device as described in any one or a combination of the above; and
a statistics module configured to receive the result of transfection of each cell region in the phase image from the transfection property determination module of the cell transfection recognition device, and to count the number or area of cell regions transfected therein, and to count the total number or area sum of all cell regions in the phase image;
a transfection efficiency calculation module configured to calculate a cell transfection efficiency based on the statistics of the statistics module.
According to another aspect of the present invention, there is provided an AI module including an image recognition unit and a data analysis unit;
wherein the image recognition unit is configured to recognize a cell region in the phase image and to recognize a fluorescence region in the fluorescence image;
the data analysis unit is configured to perform the steps of the method as described in any one or a combination of the above.
According to another aspect of the present invention, there is provided a microscope system comprising a microscope, a processor, and an AI module, the microscope configured with a phase imaging mode and a fluorescence imaging mode, the microscope configured to image cells in the phase imaging mode to obtain a phase image, and to image cells in the fluorescence imaging mode to obtain a fluorescence image;
the processor is configured to be communicatively coupled to a microscope to receive data of a phase image and data of a fluorescence image of the microscope;
the AI module is configured to identify a cell region in the phase image and to identify a fluorescence region in the fluorescence image;
the microscope system is configured to perform the steps of the method as described in any one or a combination of the above claims.
According to another aspect of the present invention there is provided an electronic device comprising a processor and a memory, wherein the memory is for storing program instructions, the processor being configured to execute the program instructions, the program instructions being executed to perform the steps of the method as described in any one or a combination of the above.
According to another aspect of the present invention, there is provided a computer readable storage medium storing program instructions configured to be invoked to perform the steps of the method of any one or a combination of the above claims.
According to a further aspect of the present invention there is provided a computer program product comprising a computer program stored on a readable medium, the computer program comprising program instructions which, when run on a computer device, perform the steps of the method described in any one or a combination of the above aspects.
The technical scheme provided by the invention has the following beneficial effects:
a. according to the application, the cell region corresponding to each cell is identified from the data of the phase image of the cells of the sample to be identified, the fluorescence region in the fluorescence image of those cells is identified, and the image data of each identified cell region is compared with the image data of the fluorescence region, so that the transfection result of each cell can be accurately identified according to whether that cell region overlaps with the fluorescence region;
b. the invention provides a plurality of methods for identifying whether a cell region overlaps with the fluorescence region, which allows a user to select a suitable method for cell transfection identification and transfection efficiency measurement; the methods can also be used to mutually verify the accuracy of the identification results;
c. by accurately identifying the transfected cells and calculating the transfection efficiency from the number or the area of the transfected cells, rather than from the number or the area of the fluorescent regions, the application can effectively improve the accuracy of the transfection efficiency measurement.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application or of the prior art, the drawings required in the description of the embodiments or of the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments described in the present application; other drawings may be obtained from them by those skilled in the art without inventive effort.
FIG. 1 is a basic flow chart of a cell transfection recognition method provided by an exemplary embodiment of the present application;
FIG. 2 is a flowchart showing a cell transfection identification method according to an exemplary embodiment of the present application;
FIG. 3 is a flow chart of a first method for identifying whether a cell region overlaps a fluorescent region according to an exemplary embodiment of the present application;
FIG. 4 is a flow chart of a second method for identifying whether a cell region overlaps a fluorescent region according to an exemplary embodiment of the present application;
FIG. 5 is a flow chart of a third method for identifying whether a cell region overlaps a fluorescent region according to an exemplary embodiment of the present invention;
FIG. 6 is a flowchart of a fourth method for identifying whether a cell region overlaps a fluorescent region according to an exemplary embodiment of the present invention;
FIG. 7 is a flowchart of a fifth method for identifying whether a cell region overlaps a fluorescent region according to an exemplary embodiment of the present invention;
FIG. 8 is a processed image of a phase image of cells of a sample to be identified;
FIG. 9 is a processed image of a fluorescence image of cells of a sample to be identified;
FIG. 10 is a composite image resulting from the superposition of FIGS. 8 and 9;
FIG. 11 is a flowchart of a sixth method for identifying whether a cell region overlaps a fluorescent region according to an exemplary embodiment of the present invention;
FIG. 12 is a flowchart of a seventh method for identifying whether a cell region overlaps a fluorescent region according to an exemplary embodiment of the present invention;
FIG. 13 is a flowchart of an eighth method for identifying whether a cell region overlaps a fluorescent region according to one exemplary embodiment of the present invention;
Fig. 14 is a schematic diagram of a microscope system provided by an exemplary embodiment of the invention for performing the methods described by the embodiments of the invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, apparatus, article, or device that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or device.
In one embodiment of the present invention, a cell transfection recognition method is provided, the basic flow of which is shown in FIG. 1, comprising the steps of:
acquiring data of a phase image and data of a fluorescence image of cells of a sample to be identified;
identifying the cell area corresponding to each cell in the phase image, and identifying the fluorescent region in the fluorescence image;
comparing the image data of each identified cell region with the image data of the fluorescent region;
and determining the transfection result of the cells corresponding to each cell area according to the comparison result.
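The four steps above can be sketched in code. The following is a minimal illustration, assuming (hypothetically) that the identified cell areas are given as an integer label image and the fluorescent region as a boolean mask of the same shape; the function name and data layout are not from the patent:

```python
import numpy as np

def identify_transfection(cell_labels, fluo_mask):
    """Report, for each labelled cell region, whether it overlaps the fluorescent region.

    cell_labels: int array from the phase image, 0 = background, 1..N = cell regions
    fluo_mask:   bool array of the same shape from the fluorescence image
    """
    results = {}
    for cell_id in np.unique(cell_labels):
        if cell_id == 0:
            continue  # skip background
        # a cell is transfected if any pixel of its region is fluorescent
        results[int(cell_id)] = bool(fluo_mask[cell_labels == cell_id].any())
    return results
```

The per-cell comparison is the key design choice: the transfection result is attached to each cell region rather than to each fluorescent spot.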
In one embodiment, a more specific flow is shown in FIG. 2:
capturing a phase image and a fluorescence image of the cells of the sample under the same imaging field of view using microscopic imaging technology, thereby obtaining the data of the phase image and the data of the fluorescence image, respectively. Specifically, the microscope is configured with a phase imaging mode and a fluorescence imaging mode, and is configured to image the cells in the phase imaging mode to obtain the phase image and to image the cells in the fluorescence imaging mode to obtain the fluorescence image. The time difference between capturing the phase image and the fluorescence image should be kept short, since too long an interval degrades the accuracy of the cell transfection identification result.
Identifying the cell area corresponding to each cell in the phase image and identifying the fluorescent region in the fluorescence image: existing artificial-intelligence cell recognition technology, or other technology, is used to process the phase image and the fluorescence image, so that the outline of each cell in the phase image can be accurately identified and thus the cell area corresponding to each cell; likewise, the extent of the fluorescent region in the fluorescence image is identified.
Comparing the image data of each identified cell region with the image data of the fluorescent region;
Determining the transfection result of the cell corresponding to each cell region according to the comparison result comprises: identifying whether a cell region overlaps the fluorescent region, and determining the cell corresponding to an overlapping cell region to be a transfected cell. For example, if cell region C is identified as overlapping the fluorescent region, the cell corresponding to cell region C is determined to be a transfected cell; a transfected cell here includes both a cell whose region fully overlaps the fluorescent region and a cell whose region only partially overlaps it. Note that in this embodiment, point or area regions at the same position in the phase image and the fluorescence image, respectively, are defined as overlapping, rather than overlapping in the conventional sense. Conversely, if cell region C contains 20 points and none of those 20 points overlaps the fluorescent region, the cell corresponding to cell region C is determined to be an untransfected cell.
Specific description will be given below regarding the recognition of whether a cell region overlaps with the fluorescent region:
in one embodiment of the invention, the manner of identifying whether a cell region overlaps a fluorescent region is described with reference to FIG. 3: identifying the cell areas corresponding to the cells in the phase image, and obtaining a first set of position coordinates for each cell area, the number of first sets being the number of identified cell areas;
identifying a fluorescence region in the fluorescence image to obtain a second set of position coordinates within the fluorescence region;
judging, for each first set, whether any position coordinate in the first set exists in the second set;
if at least one position coordinate of a first set exists in the second set, determining that the cell region corresponding to that first set overlaps the fluorescence region.
For example, suppose the cell region of one cell A in the sample to be identified includes 5 position coordinates, whose first set is P1 = {(x_i, y_i) | i ∈ N+ and i ≤ 5}, and the fluorescent region includes 500 position coordinates, whose second set is P2 = {(x_k, y_k) | k ∈ N+ and k ≤ 500}. If at least one element of set P1 is also an element of set P2, the cell region corresponding to the first set P1 is determined to overlap the fluorescent region (even if only one of the 5 coordinates in the first set is present in the second set), and cell A is identified as a transfected cell. If no element of set P1 is an element of set P2, i.e., none of the 5 coordinates matches any of the 500 position coordinates in the fluorescent region, the cell region corresponding to the first set P1 is determined not to overlap the fluorescent region, and cell A is identified as an untransfected cell. In this way, whether the cell region corresponding to each cell in the phase image overlaps the fluorescent region is judged one by one, so that whether each cell is transfected is accurately identified. This approach need not rely on visual presentation; the recognition result can be obtained directly by machine processing of the coordinate data.
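The set comparison just described reduces to a plain intersection test; a sketch with made-up coordinate values:

```python
# First set P1: the 5 position coordinates of cell A's region (values hypothetical).
p1 = {(10, 12), (10, 13), (11, 12), (11, 13), (12, 12)}
# Second set P2: position coordinates inside the fluorescent region (abbreviated).
p2 = {(11, 13), (40, 41), (40, 42), (41, 41)}

def overlaps(first_set, second_set):
    # Cell A is transfected if at least one coordinate is shared between the sets.
    return not first_set.isdisjoint(second_set)

print(overlaps(p1, p2))  # coordinate (11, 13) is shared, so True
```

Using `isdisjoint` avoids building the full intersection when only existence matters.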
In the present embodiment, an AI neural network model is trained in advance, configured to identify cell regions in a phase image input into the model and to identify fluorescent regions in a fluorescence image input into the model. AI models for identifying cell regions and fluorescent regions are known in the art; see, for example, Chinese patent application publication No. CN114972222A.
In one embodiment of the invention, the manner of identifying whether a cell region overlaps a fluorescent region is seen in FIG. 4:
positioning target areas in the fluorescence image in one-to-one correspondence with the cell areas, so that the position of each target area in the fluorescence image coincides with the position of the corresponding cell area in the phase image. The target areas in this embodiment can be identified and marked automatically by a computer, or marked manually; the target-area marking and the fluorescent-region marking use different colors, which makes the marking types easy to distinguish during manual marking, prevents the fluorescent-region marking color from being used inadvertently, and facilitates the subsequent identification of whether a target area overlaps the fluorescent region.
determining a target area that at least partially coincides with the fluorescent region identified in the fluorescence image, and determining that the cell area associated with that target area overlaps the fluorescent region.
In one embodiment of the invention, the manner of identifying whether a cell region overlaps a fluorescent region is seen in FIG. 5:
positioning a target region in the phase image associated with the fluorescent region, so that the position of the target region in the phase image coincides with the position of the fluorescent region in the fluorescence image. The target region in this embodiment can be identified and marked automatically by a computer, or marked manually; the target-region marking and the fluorescent-region marking use different colors, which makes the marking types easy to distinguish during manual marking, prevents the fluorescent-region marking color from being used inadvertently, and facilitates the subsequent overlap identification.
A cell area that at least partially coincides with the target region is determined to overlap the fluorescent region.
The principle of the embodiments shown in FIGS. 4 and 5 is to judge, within a single image, whether a cell area and a fluorescent area coincide: the embodiment corresponding to FIG. 4 maps the cell areas into the fluorescence image, while the embodiment corresponding to FIG. 5 maps the fluorescent region into the phase image. Determining whether two areas in one image overlap is known in the art of image processing and analysis, and is not repeated here.
In one embodiment of the invention, the manner of identifying whether a cell region overlaps a fluorescent region is seen in FIG. 6:
superposing the phase image and the fluorescent image to obtain a composite image;
comparing, for each cell area identified in the phase image, the image parameters at the same location in the composite image, the image parameters being parameters related to fluorescence characteristics. Taking brightness as an example, suppose a point B exists in the cell area with brightness x in the phase image, and the brightness at the same position B in the composite image is x';
if the difference between the image parameters of point B before and after synthesis (i.e., the brightness difference x' − x) exceeds a preset threshold, the cell area containing point B is determined to overlap the fluorescent region.
The image parameter in the present invention is not limited to brightness or to a single parameter, and the unit of comparison is not limited to a single point; for example, the image parameters of some or all points in the cell region may be compared, and the average of the parameter values over multiple points may be used as the object of comparison.
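A sketch of the parameter-comparison idea above, using the mean brightness over a cell region; the threshold value and array layout are illustrative assumptions, not values from the patent:

```python
import numpy as np

def cell_overlaps_fluorescence(phase, composite, cell_mask, threshold=30.0):
    """Compare the mean brightness of one cell region before and after compositing."""
    before = phase[cell_mask].astype(float).mean()
    after = composite[cell_mask].astype(float).mean()
    # the fluorescent contribution raises brightness in the composite image
    return bool((after - before) > threshold)

# Toy 1x4 images: the cell occupies the first two pixels.
phase = np.array([[10, 10, 10, 10]], dtype=float)
composite = np.array([[90, 90, 10, 10]], dtype=float)  # fluorescence added over the cell
mask = np.array([[True, True, False, False]])
print(cell_overlaps_fluorescence(phase, composite, mask))  # 80 > 30, so True
```

Averaging over the region, rather than testing a single point, makes the decision less sensitive to pixel noise.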
In one embodiment of the invention, the manner of identifying whether a cell region overlaps a fluorescent region is seen in FIG. 7:
outlining each cell region in the phase image using image processing technology to obtain a processed phase image, as shown in FIG. 8; and outlining the fluorescent region in the fluorescence image to obtain a processed fluorescence image, as shown in FIG. 9, wherein the color used to outline the cell regions in the phase image differs from the color used to outline the fluorescent region in the fluorescence image;
superimposing the processed phase image and the processed fluorescence image to obtain a composite image, as shown in FIG. 10, the composite image having first contour lines corresponding to the cell areas and second contour lines corresponding to the fluorescent regions;
judging, one by one in the composite image, whether a first contour line intersects or encloses a second contour line, and if so, determining that the cell area corresponding to that first contour line overlaps the fluorescent region. Referring to FIG. 4, there is a cell region in the upper right containing two independent fluorescent regions; the counting method in the background art would identify it as two transfected cells, whereas the transfected-cell identification method provided by the invention accurately identifies it as one transfected cell. Referring to the large connected fluorescent region at the lower right of FIG. 4, the counting method of the prior art would identify it as one transfected cell, whereas the method provided by the invention accurately identifies a plurality of transfected cells. Therefore, the transfected-cell identification method provided by the invention can accurately identify the transfection result of each cell.
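The contrast with the background-art counting method can be checked numerically. The toy field below is invented for illustration: one labelled cell contains two separate fluorescent spots, so counting spots would over-count, while the per-cell overlap test does not:

```python
import numpy as np

# Toy field: cell 1 (left half) contains two separate fluorescent spots,
# cell 2 (right half) contains none. Counting fluorescent spots would
# report 2 transfected cells; per-cell overlap reports 1.
cells = np.zeros((3, 8), dtype=int)
cells[:, :4] = 1
cells[:, 4:] = 2
fluo = np.zeros((3, 8), dtype=bool)
fluo[1, 0] = True  # spot A inside cell 1
fluo[1, 2] = True  # spot B inside cell 1, not adjacent to spot A

transfected = {cid for cid in (1, 2) if fluo[cells == cid].any()}
print(len(transfected))  # per-cell overlap counts 1 transfected cell, not 2
```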
In one embodiment of the invention, the manner of identifying whether a cell region overlaps a fluorescent region is seen in FIG. 11:
outlining each cell region in the phase image using image processing technology to obtain a processed phase image, as shown in FIG. 8; fill-marking the fluorescent region in the fluorescence image to obtain a processed fluorescence image, as shown in FIG. 9; the color of the cell-region contour lines in the phase image differs from the color of the fill marks applied to the fluorescent region in the fluorescence image;
superimposing the processed phase image and the processed fluorescence image to obtain a composite image, as shown in FIG. 10, the composite image having contour lines corresponding to the cell areas and fill marks corresponding to the fluorescent regions;
determining a contour line in the composite image that intersects or encloses a fill mark, and determining that the cell area corresponding to that contour line overlaps the fluorescent region.
In one embodiment of the invention, the manner of identifying whether a cell region overlaps a fluorescent region is seen in FIG. 12:
fill-marking each cell region in the phase image using image processing technology to obtain a processed phase image (not shown); outlining the fluorescent region in the fluorescence image to obtain a processed fluorescence image; the color of the fill marks applied to the cell regions in the phase image differs from the color of the fluorescent-region outline in the fluorescence image;
superimposing the processed phase image and the processed fluorescence image to obtain a composite image, the composite image having fill marks corresponding to the cell areas and contour lines corresponding to the fluorescent regions;
determining a fill mark in the composite image that intersects a contour line, and determining that the cell area corresponding to that fill mark overlaps the fluorescent region.
In one embodiment of the invention, the manner of identifying whether a cell region overlaps a fluorescent region is seen in FIG. 13:
fill-marking each cell region in the phase image using image processing technology to obtain a processed phase image (not shown); fill-marking the fluorescent region in the fluorescence image to obtain a processed fluorescence image; the color or pattern of the fill marks applied to the cell regions in the phase image differs from the color or pattern of the fill marks applied to the fluorescent region in the fluorescence image;
superimposing the processed phase image and the processed fluorescence image to obtain a composite image, the composite image having first fill marks corresponding to the cell areas and second fill marks corresponding to the fluorescent regions;
determining a first fill mark in the composite image that intersects a second fill mark, and determining that the cell area corresponding to that first fill mark overlaps the fluorescent region.
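When both regions carry fill marks, the overlap test reduces to a logical AND of the two masks; a minimal sketch (array contents are illustrative):

```python
import numpy as np

def fill_marks_overlap(cell_fill, fluo_fill):
    # The first fill mark (cell area) intersects the second fill mark
    # (fluorescent area) if any pixel carries both marks.
    return bool(np.logical_and(cell_fill, fluo_fill).any())

cell = np.array([[True, True, False], [True, True, False]])
fluo = np.array([[False, True, True], [False, False, False]])
print(fill_marks_overlap(cell, fluo))  # shared pixel at (0, 1), so True
```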
The present invention thus proposes multiple methods for identifying whether a cell region overlaps the fluorescent region; among them, the methods shown in FIGS. 7, 11, 12 and 13 also allow cell transfection to be identified by manual observation.
The methods for identifying whether a cell region overlaps the fluorescent region in the different embodiments above can be freely combined, and the combined methods can cross-verify one another to improve the accuracy of the identification result.
In one embodiment of the present invention, a method for measuring cell transfection efficiency is provided, comprising the steps of:
determining the transfection result of cells corresponding to each cell region in the phase image of the sample by using the cell transfection identification method according to any one or more of the embodiments above;
counting the number n1 of transfected cells, and counting the total number of cells Ntotal in the phase image;
calculating the cell transfection efficiency by the following formula: η = n1 / Ntotal.
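With made-up counts, the formula η = n1 / Ntotal evaluates as:

```python
n_transfected = 42   # n1: number of cells identified as transfected (made-up value)
n_total = 120        # Ntotal: total number of cells in the phase image (made-up value)
efficiency = n_transfected / n_total
print(f"transfection efficiency: {efficiency:.1%}")  # 42/120 gives 35.0%
```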
The cell transfection efficiency measuring method above first accurately identifies the transfected cells and then calculates the transfection efficiency from the number of transfected cells, rather than from a count of fluorescent regions; evidently, the transfection efficiency obtained by the measuring method provided by the invention is more accurate, and the measurement precision of the transfection efficiency is effectively improved.
In one embodiment of the present invention, a method for measuring cell transfection efficiency is provided, comprising the steps of:
determining the transfection result of cells corresponding to each cell region in the phase image of the sample by using the cell transfection identification method according to any one or more of the embodiments above;
counting the area s1 of the cell regions corresponding to the transfected cells, and counting the sum Stotal of the areas of all cell regions in the phase image;
calculating the cell transfection efficiency by the following formula: η = s1 / Stotal.
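The area-based formula η = s1 / Stotal can be computed directly from a label image; a sketch assuming a hypothetical integer label image (0 = background), with areas measured in pixels:

```python
import numpy as np

def area_efficiency(cell_labels, transfected_ids):
    """Return s1 / Stotal, with areas counted in pixels of the label image."""
    total_area = int((cell_labels > 0).sum())                     # Stotal: all cell regions
    s1 = int(np.isin(cell_labels, list(transfected_ids)).sum())   # transfected regions only
    return s1 / total_area

cells = np.array([[1, 1, 0], [2, 2, 2]])  # cell 1: 2 px, cell 2: 3 px
print(area_efficiency(cells, {2}))  # 3 / 5 = 0.6
```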
This embodiment first accurately identifies the transfected cells and then counts the areas of the corresponding cell regions, rather than areas in the fluorescence image, and calculates the transfection efficiency from the areas of the cell regions of the transfected cells. Provided the identified cell regions coincide closely with the actual cell regions, even a small transfected area within a cell does not affect the accuracy of the area-based transfection-efficiency value in this embodiment. Evidently, compared with the transfection-area calculation in the background art, the transfection efficiency obtained by the cell transfection efficiency measuring method provided by the invention is more accurate, and the calculation precision of the transfection efficiency is effectively improved.
In one embodiment of the invention, there is provided a cell transfection recognition device comprising the following modules:
an image acquisition module configured to acquire data of a phase image and data of a fluorescence image of cells of a sample to be identified;
an identification module configured to identify a cell region corresponding to each cell in the phase image, and to identify a fluorescence region in the fluorescence image;
a comparison module configured to compare the image data of the identified individual cell regions with the image data of the fluorescence regions;
a transfection property determination module configured to determine a transfection result of the cells corresponding to each cell region based on the comparison result.
In one embodiment of the present invention, there is provided a cell transfection efficiency measuring device including:
the cell transfection recognition device as described in the above embodiment; and
a statistics module configured to receive the transfection result of each cell region in the phase image from the transfection property determination module of the cell transfection recognition device, to count the number of transfected cell regions or the area thereof, and to count the total number of all cell regions in the phase image or the sum of their areas;
A transfection efficiency calculation module configured to calculate a cell transfection efficiency based on the statistics of the statistics module.
In one embodiment of the present invention, an AI module is provided, including an image recognition unit and a data analysis unit;
wherein the image recognition unit is configured to recognize a cell region in the phase image and to recognize a fluorescence region in the fluorescence image;
the data analysis unit is configured to perform the steps of the method as described in any of the embodiments above.
In one embodiment of the present invention, a microscope system is provided that includes a microscope, a processor, and an AI module;
the processor is configured to be communicatively coupled to a microscope to receive data of a phase image and data of a fluorescence image of the microscope;
the AI module is configured to identify a cell region in the phase image and to identify a fluorescence region in the fluorescence image;
the microscope system is configured to perform the steps of the method as described in any of the embodiments above.
In one embodiment of the invention, an electronic device is provided comprising a processor and a memory, wherein the memory is for storing program instructions, the processor being configured to execute the program instructions, the program instructions being executed to perform the steps performed by the method embodiments above.
In one embodiment of the present invention, a computer readable storage medium is provided for storing program instructions configured to be invoked to perform the steps performed by the method embodiments above.
In one embodiment of the invention, a computer program product is provided, comprising a computer program stored on a computer-readable medium, the computer program comprising program instructions which, when run on a computer device, perform the steps performed by the method embodiments described above.
The cell transfection efficiency measuring and calculating method, the cell transfection identification device, the cell transfection efficiency measuring and calculating device, the AI module, the electronic device, the computer readable storage medium, the computer program product embodiment and the cell transfection identification method embodiment described above belong to the same inventive concept, and the whole content of the cell transfection identification method embodiment is incorporated into the cell transfection efficiency measuring and calculating method, the cell transfection identification device, the cell transfection efficiency measuring and calculating device, the AI module, the electronic device, the computer readable storage medium, the computer program product embodiment by reference.
Some embodiments relate to a microscope configured to perform the cell transfection identification method or the cell transfection efficiency measuring method, or incorporating the cell transfection identification device or the cell transfection efficiency measuring device. Alternatively, the microscope may be part of, or connected to, a system that performs one or more of the method flows described in connection with FIGS. 1-7 and 11-13. FIG. 14 shows a schematic diagram of a system 100 configured to perform the methods described in the present invention. The system 100 includes a microscope 110 and a computer system 120. The microscope 110 is configured to capture images and is connected to the computer system 120. The computer system 120 is configured to perform at least a portion of the methods described herein, and may be configured to execute a machine learning algorithm. The computer system 120 and the microscope 110 may be separate entities, but may also be integrated in a common housing. The computer system 120 may be part of a central processing system of the microscope 110, and/or may be part of a sub-assembly of the microscope 110, such as a sensor, actuator, camera, or illumination unit of the microscope 110.
The computer system 120 may be a local computer device (e.g., a personal computer, notebook, tablet, or mobile phone) having one or more processors and one or more storage devices, or may be a distributed computer system (e.g., having one or more processors and one or more storage devices distributed at various locations, such as local clients and/or one or more remote server sites and/or data centers). Computer system 120 may include any circuit or combination of circuits. In one embodiment, computer system 120 may include one or more processors, which may be any type of processor. As used herein, a processor may refer to any type of computational circuitry, such as, but not limited to, a microprocessor, a microcontroller, a Complex Instruction Set Computing (CISC) microprocessor, a Reduced Instruction Set Computing (RISC) microprocessor, a Very Long Instruction Word (VLIW) microprocessor, a graphics processor, a Digital Signal Processor (DSP), a multi-core processor, a field-programmable gate array (FPGA), for example of a microscope or a microscope component (e.g., a camera), or any other type of processor or processing circuitry. Other types of circuitry that may be included in computer system 120 may be custom circuits, application-specific integrated circuits (ASICs), and the like, such as one or more circuits (e.g., communications circuits) used in wireless devices such as mobile phones, tablet computers, notebook computers, two-way radios, and similar electronic systems. Computer system 120 may include one or more storage devices that may include one or more storage elements suitable for a particular application, such as main memory in the form of Random Access Memory (RAM), one or more hard disk drives, and/or one or more drives that handle removable media such as Compact Discs (CDs), flash memory cards, Digital Video Discs (DVDs), and the like.
Computer system 120 may also include a display device, one or more speakers, and a keyboard and/or controller, which may include a mouse, a trackball, a touch screen, a voice recognition device, or any other device that allows a system user to input information to computer system 120 or receive information from computer system 120.
Some or all of the method steps may be performed by (or using) hardware devices (e.g., processors, microprocessors, programmable computers, or electronic circuits). In some embodiments, such an apparatus may perform one or more of the most important method steps.
Embodiments of the invention may be implemented in hardware or software, depending on certain implementation requirements. The implementation may be performed using a non-transitory storage medium (such as a digital storage medium, e.g., a floppy disk, DVD, blu-ray, CD, ROM, PROM, and EPROM, EEPROM, or FLASH) having stored thereon electronically readable control signals, which cooperate (or are capable of cooperating) with a programmable computer system, such that the corresponding method is performed. Thus, the digital storage medium may be computer readable.
Some embodiments of the invention include a data carrier having electronically readable control signals capable of cooperating with a programmable computer system to perform one of the methods described herein.
In general, embodiments of the invention may be implemented as a computer program product having a program code which, when the computer program product is run on a computer, is operative for performing one of the methods. The program code may be stored on a machine readable carrier, for example.
Other embodiments include a computer program stored on a machine readable carrier for performing one of the methods of the present invention.
In other words, an embodiment of the invention is thus a computer program having a program code for performing one of the methods of the invention when the computer program runs on a computer.
Thus, a further embodiment of the invention is a storage medium (or data carrier or computer readable medium) comprising a computer program stored thereon for performing one of the methods of the invention when it is executed by a processor. The data carrier, digital storage medium or recording medium is typically tangible and/or non-transitory. Yet another embodiment of the invention is an apparatus as described in the present invention that includes a processor and a storage medium.
Thus, a further embodiment of the invention is a data stream or signal sequence representing a computer program for executing one of the methods of the invention. The data stream or signal sequence may for example be configured to be transmitted via a data communication connection, for example via the internet.
Yet another embodiment includes a processing device, such as a computer or programmable logic device, configured or adapted to perform one of the methods of the present invention.
A further embodiment comprises a computer on which a computer program for performing one of the methods according to the invention is installed.
Yet another embodiment of the invention includes an apparatus or system configured to transmit a computer program (e.g., electronically or optically) for performing one of the methods of the invention to a receiver. The receiver may be, for example, a computer, a mobile device, a storage device, etc. The device or system may for example comprise a file server for transmitting the computer program to the receiver.
In some embodiments, programmable logic devices (e.g., field programmable gate arrays) may be used to perform some or all of the functions of the methods described herein. In some embodiments, a field programmable gate array may cooperate with a microprocessor to perform one of the methods of the present invention. In general, the method is preferably performed by any hardware device.
Embodiments may be based on using a machine learning model or a machine learning algorithm. Machine learning may refer to algorithms and statistical models that a computer system may use to perform a particular task without using explicit instructions, but rather relying on models and reasoning. For example, in machine learning, instead of rule-based data conversion, data conversion derived from analysis of historical and/or training data may be used. For example, the content of the image may be analyzed using a machine learning model or using a machine learning algorithm. In order for the machine learning model to analyze the content of the image, the machine learning model may be trained using the training image as input and training content information as output. By training a machine learning model with a large number of training images and/or training sequences (e.g., words or sentences) and associated training content information (e.g., tags or notes), the machine learning model "learns" the content of the recognition images so that the machine learning model can be used to recognize image content that is not contained in the training data. The same principle can be used for other types of sensor data as well: by training a machine learning model using the training sensor data and the desired output, the machine learning model may "learn" transitions between the sensor data and the output, which may be used to provide an output based on the non-training sensor data provided to the machine learning model. The provided data (e.g., sensor data, metadata, and/or image data) may be preprocessed to obtain feature vectors that are used as inputs to the machine learning model.
The machine learning model may be trained using training input data. The example specified above uses a training method called "supervised learning". In supervised learning, the machine learning model is trained using a plurality of training samples, where each sample may include a plurality of input data values and a plurality of desired output values, i.e., each training sample is associated with a desired output value. By specifying both the training samples and the desired output values, the machine learning model "learns" which output values to provide for input samples similar to those provided during training. In addition to supervised learning, semi-supervised learning may be used; in semi-supervised learning, some training samples lack a corresponding desired output value. Supervised learning may be based on a supervised learning algorithm (e.g., a classification algorithm, a regression algorithm, or a similarity learning algorithm). A classification algorithm may be used when the outputs are restricted to a limited set of values (categorical variables), i.e., the input is classified as one of the limited set of values. A regression algorithm may be used when the output may take any value (within a range). A similarity learning algorithm may be similar to both classification and regression algorithms, but is based on learning from examples using a similarity function that measures how similar or related two objects are. In addition to supervised or semi-supervised learning, unsupervised learning may be used to train the machine learning model. In unsupervised learning, (only) input data may be supplied, and an unsupervised learning algorithm may be used to find structure in the input data (e.g., by grouping or clustering the input data, finding commonalities in the data).
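A minimal sketch of the supervised training scheme described above: labelled training samples, a "trained" model (here a simple nearest-centroid classifier, chosen for brevity and not taken from the patent), and prediction on unseen input:

```python
import numpy as np

# Training samples: input values paired with desired output values (labels).
X_train = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])

# "Training": derive one centroid per class from the labelled samples.
centroids = {int(c): X_train[y_train == c].mean(axis=0) for c in np.unique(y_train)}

def classify(x):
    # Predict the class whose centroid lies nearest to the new input.
    return min(centroids, key=lambda c: np.linalg.norm(np.asarray(x) - centroids[c]))

print(classify([0.1, 0.0]))  # input resembles the class-0 training samples
```

The same pattern generalizes: any model that maps inputs to outputs can be fitted on (sample, desired output) pairs and then applied to data not contained in the training set.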
Clustering is the assignment of input data comprising a plurality of input values into subsets (clusters), such that input values within the same cluster are similar according to one or more (predefined) similarity criteria, while being dissimilar to input values that are included in other clusters.
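A minimal sketch of such clustering, assuming a simple 1-D k-means with absolute distance as the similarity criterion (illustrative only; the disclosure does not prescribe a particular clustering algorithm):

```python
# Hypothetical sketch: unsupervised k-means clustering of 1-D input values
# into k clusters, using absolute distance as the (predefined) similarity
# criterion.

def kmeans_1d(values, k=2, iterations=10):
    centroids = values[:k]                         # naive initialisation
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return clusters

data = [0.1, 0.2, 0.15, 9.8, 10.1, 9.9]
print(kmeans_1d(data))   # two groups of mutually similar values
```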
Reinforcement learning is a third group of machine learning algorithms. In other words, reinforcement learning may be used to train the machine learning model. In reinforcement learning, one or more software actors (called "software agents") are trained to take actions in an environment. Based on the actions taken, a reward is calculated. Reinforcement learning is based on training the one or more software agents to choose their actions such that the cumulative reward is increased, leading to software agents that become better at the task they are given (as evidenced by increasing rewards).
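As a toy illustration of an agent, an environment, and a cumulative reward, the following Q-learning sketch (all of it hypothetical; the disclosure does not specify a reinforcement learning setup) trains an agent to walk along a five-cell corridor to a rewarded goal cell:

```python
import random

# Hypothetical sketch: a software agent learns via Q-learning to reach the
# rewarded goal cell of a 5-cell corridor; rewards drive its action choices.

random.seed(0)
n_states, actions = 5, (-1, +1)                     # move left / move right
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.2               # learning rate, discount, exploration

for episode in range(200):
    s = 0
    while s != n_states - 1:
        # Epsilon-greedy action selection: mostly exploit, sometimes explore.
        a = random.choice(actions) if random.random() < epsilon \
            else max(actions, key=lambda a: Q[(s, a)])
        s_next = min(max(s + a, 0), n_states - 1)
        reward = 1.0 if s_next == n_states - 1 else 0.0   # reward only at the goal
        Q[(s, a)] += alpha * (reward + gamma *
                              max(Q[(s_next, b)] for b in actions) - Q[(s, a)])
        s = s_next

# After training, the greedy policy in every non-goal state is "move right".
print([max(actions, key=lambda a: Q[(s, a)]) for s in range(n_states - 1)])
```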
Furthermore, some techniques may be applied to some of the machine learning algorithms. For example, feature learning may be used. In other words, the machine learning model may at least partially be trained using feature learning, and/or the machine learning algorithm may comprise a feature learning component. Feature learning algorithms, which may be called representation learning algorithms, may preserve the information in their input but transform it in a way that makes it useful, often as a pre-processing step before performing classification or prediction. Feature learning may be based on principal component analysis or cluster analysis, for example.
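Principal component analysis as a feature-learning pre-processing step can be sketched as follows (a hypothetical illustration with synthetic data, not an implementation from the disclosure):

```python
import numpy as np

# Hypothetical sketch: principal component analysis (PCA) as feature learning —
# correlated 2-D inputs are transformed into a single informative feature.

rng = np.random.default_rng(0)
x = rng.normal(size=100)
# Two correlated measurements: almost all variance lies along one direction.
data = np.column_stack([x, 2.0 * x + rng.normal(scale=0.1, size=100)])

centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)
principal = eigenvectors[:, np.argmax(eigenvalues)]   # top principal component

features = centered @ principal    # learned 1-D representation of each input
print(features.shape)              # one feature per input sample
```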
In some examples, anomaly detection (i.e., outlier detection) may be used, which aims at identifying input values that raise suspicion by differing significantly from the majority of the input or training data. In other words, the machine learning model may at least partially be trained using anomaly detection, and/or the machine learning algorithm may comprise an anomaly detection component.
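A minimal anomaly-detection sketch, assuming a simple z-score criterion (values more than two standard deviations from the mean are flagged; the threshold and data are hypothetical):

```python
# Hypothetical sketch: outlier detection by z-score — inputs that differ from
# the bulk of the data by more than 2 standard deviations are flagged.

def detect_anomalies(values, threshold=2.0):
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [v for v in values if abs(v - mean) > threshold * std]

readings = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 55.0]   # one suspect value
print(detect_anomalies(readings))
```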
In some examples, the machine learning algorithm may use a decision tree as a predictive model. In other words, the machine learning model may be based on a decision tree. In a decision tree, observations about an item (e.g., a set of input values) may be represented by the branches of the decision tree, and an output value corresponding to the item may be represented by the leaves of the decision tree. Decision trees may support both discrete values and continuous values as output values. If discrete values are used, the decision tree may be denoted a classification tree; if continuous values are used, the decision tree may be denoted a regression tree.
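The branch/leaf structure can be shown with the smallest possible tree, a depth-1 "decision stump" (a hypothetical illustration; names and data are invented):

```python
# Hypothetical sketch: a depth-1 decision tree ("decision stump"). The branch
# tests an observation (value <= threshold?); each leaf holds an output class.

def majority(labels):
    return max(set(labels), key=labels.count)

def build_stump(samples):
    """samples: list of (value, label); picks the threshold with fewest errors."""
    best = None
    for threshold in sorted(v for v, _ in samples)[:-1]:   # keep right leaf non-empty
        left = [l for v, l in samples if v <= threshold]
        right = [l for v, l in samples if v > threshold]
        errors = sum(l != majority(left) for l in left) \
               + sum(l != majority(right) for l in right)
        if best is None or errors < best[0]:
            best = (errors, threshold, majority(left), majority(right))
    _, threshold, leaf_left, leaf_right = best
    return threshold, leaf_left, leaf_right

def predict(stump, value):
    threshold, leaf_left, leaf_right = stump
    return leaf_left if value <= threshold else leaf_right

data = [(1.0, "dark"), (1.5, "dark"), (8.0, "bright"), (9.0, "bright")]
stump = build_stump(data)
print(predict(stump, 0.5), predict(stump, 10.0))
```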
Association rules are a further technique that may be used in machine learning algorithms. In other words, the machine learning model may be based on one or more association rules. Association rules are created by identifying relationships between variables in large amounts of data. The machine learning algorithm may identify and/or utilize one or more relational rules that represent the knowledge derived from the data. The rules may, for example, be used to store, manipulate, or apply this knowledge.
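Mining one-to-one association rules "A → B" by confidence can be sketched as below; the transactions and item names are purely illustrative and are not drawn from the disclosure:

```python
# Hypothetical sketch: mining one-to-one association rules "A -> B" from
# transactions, keeping rules whose confidence (how often B appears given A)
# reaches a minimum value.

transactions = [
    {"phase_image", "fluorescence_image"},
    {"phase_image", "fluorescence_image"},
    {"phase_image"},
    {"overlay"},
]

def mine_rules(transactions, min_confidence=0.75):
    items = set().union(*transactions)
    rules = []
    for a, b in ((x, y) for x in items for y in items if x != y):
        support_a = sum(a in t for t in transactions)
        support_ab = sum(a in t and b in t for t in transactions)
        if support_a and support_ab / support_a >= min_confidence:
            rules.append((a, b, support_ab / support_a))
    return rules

for a, b, conf in mine_rules(transactions):
    print(f"{a} -> {b} (confidence {conf:.2f})")
```

Here the only rule found is "fluorescence_image → phase_image": every transaction containing a fluorescence image also contained a phase image.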
Machine learning algorithms are usually based on a machine learning model. In other words, the term "machine learning algorithm" may denote a set of instructions that may be used to create, train, or use a machine learning model. The term "machine learning model" may denote a data structure and/or a set of rules that represents the learned knowledge (e.g., based on the training performed by the machine learning algorithm). In embodiments, the usage of a machine learning algorithm may imply the usage of an underlying machine learning model (or of a plurality of underlying machine learning models). The usage of a machine learning model may imply that the machine learning model and/or the data structure/set of rules that is the machine learning model is trained by a machine learning algorithm.
For example, the machine learning model may be an artificial neural network (ANN). ANNs are systems inspired by biological neural networks, such as those found in a retina or a brain. An ANN comprises a plurality of interconnected nodes and a plurality of connections, so-called edges, between the nodes. There are usually three types of nodes: input nodes that receive input values, hidden nodes that are (only) connected to other nodes, and output nodes that provide output values. Each node may represent an artificial neuron. Each edge may transmit information from one node to another. The output of a node may be defined as a (non-linear) function of its inputs (e.g., of the sum of its inputs). The inputs of a node may be used in the function based on a "weight" of the edge or of the node that provides the input. The weights of nodes and/or edges may be adjusted in the learning process. In other words, training an artificial neural network may comprise adjusting the weights of the nodes and/or edges of the artificial neural network so as to achieve a desired output for a given input.
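A forward pass through such a network can be sketched in a few lines (a hypothetical untrained network with random weights; training would adjust `w_hidden` and `w_output`):

```python
import numpy as np

# Hypothetical sketch: a minimal ANN with 2 input nodes, 3 hidden nodes and
# 1 output node. Each edge carries a weight; a node's output is a non-linear
# function (here, a sigmoid) of the weighted sum of its inputs.

rng = np.random.default_rng(42)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w_hidden = rng.normal(size=(2, 3))   # edge weights: input nodes -> hidden nodes
w_output = rng.normal(size=(3, 1))   # edge weights: hidden nodes -> output node

def forward(x):
    hidden = sigmoid(x @ w_hidden)       # hidden node activations
    return sigmoid(hidden @ w_output)    # output node value

print(forward(np.array([0.5, -0.2])))
```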
Alternatively, the machine learning model may be a support vector machine, a random forest model, or a gradient boosting model. A support vector machine (i.e., a support vector network) is a supervised learning model with associated learning algorithms that may be used to analyze data (e.g., in classification or regression analysis). A support vector machine may be trained by providing an input with a plurality of training input values that each belong to one of two categories. The support vector machine may then be trained to assign a new input value to one of the two categories. Alternatively, the machine learning model may be a Bayesian network, which is a probabilistic directed acyclic graphical model. A Bayesian network may represent a set of random variables and their conditional dependencies using a directed acyclic graph. Alternatively, the machine learning model may be based on a genetic algorithm, which is a search algorithm and heuristic technique that mimics the process of natural selection.
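The two-category behaviour described for support vector machines can be illustrated with a perceptron, used here only as a minimal stand-in (a real SVM additionally maximises the margin between the categories; this sketch and its data are hypothetical):

```python
# Hypothetical sketch: like a support vector machine, this perceptron learns a
# separating line from training inputs labelled with one of two categories and
# then assigns new inputs to a category. (An actual SVM would additionally
# maximise the margin; the perceptron is only a minimal stand-in.)

def train_perceptron(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:              # label in {-1, +1}
            if label * (w[0] * x1 + w[1] * x2 + b) <= 0:   # misclassified
                w[0] += lr * label * x1
                w[1] += lr * label * x2
                b += lr * label
    return w, b

def classify(model, point):
    (w, b), (x1, x2) = model, point
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1

samples = [((0.0, 0.0), -1), ((0.2, 0.1), -1), ((1.0, 1.0), 1), ((0.9, 1.1), 1)]
model = train_perceptron(samples)
print(classify(model, (0.1, 0.0)), classify(model, (1.2, 0.9)))
```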
It is noted that relational terms such as first and second, and the like, are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items, and may be abbreviated as "/".
The foregoing is merely illustrative of embodiments of this application. Those skilled in the art will appreciate that variations and modifications may be made without departing from the principles of the application, and all such modifications and variations are intended to fall within the scope of the application.
Claims (22)
1. A method for identifying cell transfection, comprising the steps of:
acquiring data of a phase image and data of a fluorescence image of cells of a sample to be identified;
identifying cell areas corresponding to each cell in the phase image, and identifying fluorescent areas in the fluorescent image;
comparing the image data of each identified cell region with the image data of the fluorescent region;
and determining the transfection result of the cells corresponding to each cell area according to the comparison result.
2. The method of claim 1, wherein the phase image and the fluorescent image have the same field of view.
3. The method of claim 1, wherein determining the result of transfection of cells corresponding to each cell region based on the comparison result comprises:
identifying whether a cell region overlaps with the fluorescent region;
and determining cells corresponding to the cell regions overlapping the fluorescent regions as transfected cells.
4. The method of claim 3, wherein the step of identifying whether a cell region overlaps with the fluorescent region is performed by:
identifying cell areas corresponding to cells in the phase image, and respectively obtaining a first set of position coordinates within each cell area;
identifying a fluorescence region in the fluorescence image to obtain a second set of position coordinates within the fluorescence region;
judging whether any position coordinate in a first set also exists in the second set;
and when at least one position coordinate of a first set exists in the second set, determining that the cell region corresponding to that first set overlaps with the fluorescence region.
5. The method of claim 3, wherein the step of identifying whether a cell region overlaps with the fluorescent region is performed by:
positioning target areas in the fluorescence image in one-to-one relation with the cell areas so that the positions of the target areas in the fluorescence image coincide with the positions of the cell areas in the phase image;
and determining a target area which is at least partially overlapped with the fluorescent area identified in the fluorescent image, and determining that the cell area associated with the target area is overlapped with the fluorescent area.
6. The method of claim 3, wherein the step of identifying whether a cell region overlaps with the fluorescent region is performed by:
positioning a target region associated with the fluorescence region in the phase image such that a position of the target region in the phase image coincides with a position of the fluorescence region in the fluorescence image;
and determining that a cell region which at least partially coincides with the target region overlaps with the fluorescent region.
7. The method of claim 3, wherein the step of identifying whether a cell region overlaps with the fluorescent region is performed by:
superposing the phase image and the fluorescent image to obtain a composite image;
comparing the image parameters at the same location area in the composite image with each cell area identified in the phase image, the image parameters being parameters related to fluorescence characteristics;
and determining the cell area with the variation difference value of the image parameters exceeding a preset threshold value as overlapping with the fluorescence area.
8. The method of claim 3, wherein the step of identifying whether a cell region overlaps with the fluorescent region is performed by:
drawing contour lines of each cell area in the phase image by utilizing an image processing technology to obtain a processed phase image; and outlining the fluorescence area in the fluorescence image to obtain a processed fluorescence image;
superposing the processed phase image and the processed fluorescent image to obtain a synthetic image, wherein the synthetic image is provided with a first contour line corresponding to the cell area and a second contour line corresponding to the fluorescent area;
and determining a first contour line intersecting with or surrounding the second contour line in the synthetic image, and determining that a cell area corresponding to the first contour line overlaps with the fluorescence area.
9. The method of claim 3, wherein the step of identifying whether a cell region overlaps with the fluorescent region is performed by:
drawing contour lines of each cell area in the phase image by utilizing an image processing technology to obtain a processed phase image; filling marks are carried out on the fluorescence areas in the fluorescence images, and the processed fluorescence images are obtained;
superposing the processed phase image and the processed fluorescent image to obtain a synthetic image, wherein the synthetic image is provided with a contour line corresponding to the cell area and a filling mark corresponding to the fluorescent area;
and determining a contour line intersecting with or surrounding the filling mark in the synthetic image, and determining that a cell area corresponding to the contour line overlaps with the fluorescence area.
10. The method of claim 3, wherein the step of identifying whether a cell region overlaps with the fluorescent region is performed by:
filling marks are carried out on each cell area in the phase image by utilizing an image processing technology, so that a processed phase image is obtained; and outlining the fluorescence area in the fluorescence image to obtain a processed fluorescence image;
superposing the processed phase image and the processed fluorescent image to obtain a synthetic image, wherein the synthetic image is provided with a filling mark corresponding to the cell area and a contour line corresponding to the fluorescent area;
and determining a filling mark intersected with the contour line in the synthetic image, and determining that a cell area corresponding to the filling mark overlaps with the fluorescent area.
11. The method of claim 3, wherein the step of identifying whether a cell region overlaps with the fluorescent region is performed by:
filling marks are carried out on each cell area in the phase image by utilizing an image processing technology, so that a processed phase image is obtained; filling marks are carried out on the fluorescence areas in the fluorescence images, and the processed fluorescence images are obtained;
superposing the processed phase image and the processed fluorescent image to obtain a composite image, wherein the composite image is provided with a first filling mark corresponding to the cell area and a second filling mark corresponding to the fluorescent area;
and determining a first filling mark intersected with the second filling mark in the synthetic image, and determining that a cell area corresponding to the first filling mark overlaps with the fluorescent area.
12. The cell transfection recognition method of any one of claims 8 to 11, wherein the color of the contour line of the cell region delineated in the phase image is different from the color of the contour line of the fluorescent region delineated in the fluorescent image;
Alternatively, the color of the contour lines of the cell region delineated in the phase image is different from the color of the filling mark made in the fluorescent region in the fluorescent image;
alternatively, the color of the filling mark made in the cell region in the phase image is different from the color of the outline of the fluorescence region outlined in the fluorescence image;
alternatively, the color or pattern of the fill mark made in the cell region in the phase image is different from the color or pattern of the fill mark made in the fluorescent region in the fluorescent image.
13. The cell transfection identification method of any one of claims 1 to 11, wherein an AI neural network model is pre-trained, the AI neural network model being configured to identify a cell region in a phase image input into the model, and to identify a fluorescence region in a fluorescence image input into the model.
14. A method for measuring and calculating the transfection efficiency of cells, comprising the steps of:
determining the transfection result of cells corresponding to each cell region in the phase image of the sample using the cell transfection identification method of any one of claims 1 to 13;
counting the number n1 of transfected cells therein, and counting the total number Ntotal of cells in the phase image;
calculating the cell transfection efficiency by the following formula: η = n1/Ntotal.
15. A method for measuring and calculating the transfection efficiency of cells, comprising the steps of:
determining the transfection result of cells corresponding to each cell region in the phase image of the sample using the cell transfection identification method of any one of claims 1 to 13;
counting the area s1 of the cell regions corresponding to the transfected cells, and counting the sum Stotal of the areas of all cell regions in the phase image;
calculating the cell transfection efficiency by the following formula: η = s1/Stotal.
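The two efficiency formulas of claims 14 and 15 can be sketched as follows (an illustrative aside, not part of the claims; the function names and example values are hypothetical):

```python
# Hypothetical sketch of the two efficiency formulas: count-based efficiency
# n1/Ntotal (claim 14) and area-based efficiency s1/Stotal (claim 15).

def count_based_efficiency(transfected):
    """transfected[i] is True if cell region i overlaps a fluorescence region."""
    return sum(transfected) / len(transfected)          # n1 / Ntotal

def area_based_efficiency(areas, transfected):
    """areas[i] is the area (e.g., in pixels) of cell region i."""
    s1 = sum(a for a, t in zip(areas, transfected) if t)
    return s1 / sum(areas)                              # s1 / Stotal

flags = [True, False, True, True]          # example transfection results
areas = [120.0, 80.0, 150.0, 100.0]        # example cell-region areas
print(count_based_efficiency(flags))       # 3 of 4 cells -> 0.75
print(area_based_efficiency(areas, flags))
```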
16. A cell transfection recognition device, comprising the following modules:
an image acquisition module configured to acquire data of a phase image and data of a fluorescence image of cells of a sample to be identified;
an identification module configured to identify a cell region corresponding to each cell in the phase image, and to identify a fluorescence region in the fluorescence image;
a comparison module configured to compare the image data of the identified individual cell regions with the image data of the fluorescence regions;
a transfection property determination module configured to determine a transfection result of the cells corresponding to each cell region based on the comparison result.
17. A cell transfection efficiency measuring device, comprising:
the cell transfection recognition device of claim 16; and
a statistics module configured to receive the result of transfection of each cell region in the phase image from the transfection property determination module of the cell transfection recognition device, and to count the number or area of cell regions transfected therein, and to count the total number or area sum of all cell regions in the phase image;
a transfection efficiency calculation module configured to calculate a cell transfection efficiency based on the statistics of the statistics module.
18. An AI module, characterized by comprising an image recognition unit and a data analysis unit;
wherein the image recognition unit is configured to recognize a cell region in the phase image and to recognize a fluorescence region in the fluorescence image;
the data analysis unit is configured to perform the steps of the method of any one of claims 1 to 15.
19. A microscope system comprising a microscope, a processor, and an AI module, the microscope configured with a phase imaging mode and a fluorescence imaging mode, the microscope configured to image cells in the phase imaging mode to obtain a phase image, and to image cells in the fluorescence imaging mode to obtain a fluorescence image;
the processor is configured to be communicatively coupled to the microscope to receive data of a phase image and data of a fluorescence image from the microscope;
the AI module is configured to identify a cell region in the phase image and to identify a fluorescence region in the fluorescence image;
the microscope system is configured to perform the steps of the method of any one of claims 1 to 15.
20. An electronic device comprising a processor and a memory, wherein the memory is for storing program instructions and the processor is configured to execute the program instructions, characterized in that the program instructions, when executed, perform the steps of the method of any one of claims 1 to 15.
21. A computer readable storage medium storing program instructions configured to be invoked to perform the steps of the method of any one of claims 1 to 15.
22. A computer program product comprising a readably stored computer program, the computer program comprising program instructions, characterized in that, when said program instructions are run on a computer device, the computer device performs the steps of the method according to any one of claims 1 to 15.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310642599.8A CN116609311A (en) | 2023-06-01 | 2023-06-01 | Transfection identification method, transfection efficiency measuring and calculating method, device and microscope system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116609311A true CN116609311A (en) | 2023-08-18 |
Family
ID=87679800
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||