CN116012838B - Artificial intelligence-based organoid activity recognition method and system - Google Patents
- Publication number: CN116012838B (application CN202211733272.3A)
- Authority: CN (China)
- Prior art keywords: organoid, map, bright field, training, fluorescence
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The invention relates to an artificial intelligence-based organoid activity recognition method and system. The method comprises the following steps: obtaining a fluorescence-stained organoid culture well plate, the organoid culture well plate being placed in a cell imaging system; collecting a fluorescence map and a bright field map of the organoid culture well plate through the cell imaging system; adding a tag file to the fluorescence map, and constructing a training sample set based on the bright field map and the tag file; and training a recognition model on the training samples in the training sample set, and performing organoid activity recognition analysis with the trained recognition model. The technical scheme provided by the invention can effectively identify organoid activity without affecting organoid culture.
Description
Technical Field
The invention relates to the technical field of data processing, in particular to an artificial intelligence-based organoid activity recognition method and system.
Background
The microscope is a precision optical instrument for observing biological sections, biological cells, organoids, living tissue cultures, fluid sediments, and the like, as well as other transparent or semitransparent objects, powders, and fine particles. A microscope can be used to obtain bright field maps and fluorescence maps of organoids, from which organoid activity can be identified and organoid growth evaluated.
At present, there is no dedicated software for microscope-based organoid activity recognition and detection. Organoid activity recognition with a microscope therefore requires staining the sample and manually framing the target objects; only some of the organoids can be identified this way, and the organoid culture is affected to a certain extent.
Disclosure of Invention
In view of the above, the present invention provides an artificial intelligence-based method and system for identifying organoid activity, which can effectively identify organoid activity without affecting organoid culture.
To achieve the above object, in one aspect, the present invention provides an artificial intelligence-based organoid activity recognition method, which comprises:
obtaining a fluorescence-stained organoid culture well plate, the organoid culture well plate being placed in a cell imaging system;
collecting a fluorescence image and a bright field image of the organoid culture well plate through the cell imaging system;
adding a tag file to the fluorescence map, and constructing a training sample set based on the bright field map and the tag file;
and training the recognition model through the training samples in the training sample set, and performing organoid activity recognition analysis based on the recognition model after training.
In one embodiment, obtaining a fluorescence-stained organoid culture well plate comprises:
and adding a Calcein-AM and PI fluorescent reagent into the staining culture solution in proportion, and replacing the culture solution in the organoid culture pore plate after proportioning so as to carry out fluorescent staining on living cells and dead cells in the organoid culture pore plate.
In one embodiment, adding a tag file to the fluorescence map includes:
loading the fluorescence map in Labelme software, and, in response to a drawing instruction, framing the fluorescent organoids in the fluorescence map with the drawing toolbox;
and after all organoids generating fluorescence in the visual field have been framed, generating the corresponding tag file through Labelme software.
In one embodiment, training the recognition model with the training samples in the training sample set includes:
inputting the bright field map and the corresponding tag file into the recognition model in sequence, normalizing the tag file into a binary map through the recognition model, and extracting bright field map features;
restoring the image size of the bright field map features by an upsampling operation to obtain a feature map of the bright field map;
and performing feature matching between the feature map of the bright field map and the binary map of the tag file, so as to iteratively learn the organoid features in the tag file.
In one embodiment, during the iterative learning of the organoid features in the tag file, a loss function is adopted to express the deviation (degree of difference) between the feature map of the bright field map and the binary map of the tag file, and an optimizer is used to update the network weights in the recognition model based on the deviation;
and the loss function, the optimizer and the recognition model obtained by iterative training are sent to a graphics processor for further iterative training, and training is stopped when the loss function reaches a target value.
In one embodiment, the loss function is calculated based on cross entropy, which reflects the error between the predicted value and the actual value, according to the following formula:

L = −(1/N) Σ_{i=1}^{N} [ y_i·log(p_i) + (1 − y_i)·log(1 − p_i) ]

where L is the error between the predicted values and the actual values over the N samples, y_i is the organoid judgment value (1 if the pixel belongs to an organoid, 0 if it belongs to the background), p_i is the model predicted value, and N is the total number of samples.
In another aspect, the present invention provides an artificial intelligence based organoid activity recognition system, comprising:
a well plate obtaining unit for obtaining a fluorescence-stained organoid culture well plate, the well plate being placed in a cell imaging system;
an image acquisition unit for acquiring a fluorescence map and a bright field map of the organoid culture well plate through the cell imaging system;
an organoid activity recognition unit containing the trained organoid activity recognition model, the unit being used for organoid activity recognition on the bright field map;
and an activity recognition analysis unit for analyzing the organoids recognized by the organoid activity recognition unit to obtain parameters such as organoid size, area ratio, and number.
In one embodiment, the well plate obtaining unit is specifically configured to add Calcein-AM and PI fluorescent reagents to the staining culture solution in proportion, and to replace the culture solution in the organoid culture well plate with the proportioned solution, so as to fluorescently stain the living cells and dead cells in the organoid culture well plate.
In one embodiment, the organoid activity recognition unit is specifically configured to perform the recognition of the active organoid on the bright field map using a trained recognition model.
In one embodiment, the activity recognition analysis unit is specifically configured to perform a parametric analysis on the organoid after recognizing the active organoid by using the organoid activity recognition unit, where the parametric analysis includes a number, an area, a diameter distribution, and the like.
The beneficial effects of the invention are as follows:
the method takes the organoid image as data, the data is visual and vivid, the organoid is not required to be dyed, compared with other modes, the method ensures that the living organoid condition in the sample is detected under the condition that the organoid is not dyed, and an important ring is established for quality control and drug sensitivity analysis in organoid culture, so that the detected organoid is beneficial to subsequent culture and experiment.
Drawings
FIG. 1 is a schematic diagram showing steps of an artificial intelligence-based organoid activity recognition method in accordance with one embodiment of the present invention;
FIG. 2 (a) is an image acquired by a microscope;
FIG. 2 (b) is an optical image of organoids obtained by microscope scanning;
FIG. 2 (c) shows the organoids identified in the image;
FIG. 3 is a schematic diagram of functional blocks of an artificial intelligence based organoid activity recognition system in accordance with one embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be further clearly and completely described in the following in conjunction with the embodiments of the present invention. It should be noted that the described embodiments are only some embodiments of the present invention, and not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, the present invention provides an artificial intelligence based organoid activity recognition method, which may include the following steps.
S1: obtaining a fluorescence-stained organoid culture well plate, the organoid culture well plate being placed in a cell imaging system;
S2: collecting a fluorescence map and a bright field map of the organoid culture well plate through the cell imaging system;
S3: adding a tag file to the fluorescence map, and constructing a training sample set based on the bright field map and the tag file;
S4: training the recognition model through the training samples in the training sample set, and performing organoid activity recognition analysis based on the trained recognition model.
In one embodiment, obtaining a fluorescence-stained organoid culture well plate comprises:
and adding a Calcein-AM and PI fluorescent reagent into the staining culture solution in proportion, and replacing the culture solution in the organoid culture pore plate after proportioning so as to carry out fluorescent staining on living cells and dead cells in the organoid culture pore plate.
In one embodiment, adding a tag file to the fluorescence map includes:
loading the fluorescence map in Labelme software, and, in response to a drawing instruction, framing the fluorescent organoids in the fluorescence map with the drawing toolbox;
and after all organoids generating fluorescence in the visual field have been framed, generating the corresponding tag file through Labelme software.
In one embodiment, training the recognition model with the training samples in the training sample set includes:
inputting the bright field map and the corresponding tag file into the recognition model in sequence, normalizing the tag file into a binary map through the recognition model, and extracting bright field map features;
restoring the image size of the bright field map features by an upsampling operation to obtain a feature map of the bright field map;
and performing feature matching between the feature map of the bright field map and the binary map of the tag file, so as to iteratively learn the organoid features in the tag file.
In one embodiment, during the iterative learning of the organoid features in the tag file, a loss function is adopted to express the deviation (degree of difference) between the feature map of the bright field map and the binary map of the tag file, and an optimizer is used to update the network weights in the recognition model based on the deviation;
and the loss function, the optimizer and the recognition model obtained by iterative training are sent to a graphics processor for further iterative training, and training is stopped when the loss function reaches a target value.
In one embodiment, the loss function is calculated based on cross entropy, which reflects the error between the predicted value and the actual value, according to the following formula:

L = −(1/N) Σ_{i=1}^{N} [ y_i·log(p_i) + (1 − y_i)·log(1 − p_i) ]

where L is the error between the predicted values and the actual values over the N samples, y_i is the organoid judgment value (1 if the pixel belongs to an organoid, 0 if it belongs to the background), p_i is the model predicted value, and N is the total number of samples.
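The cross-entropy formula above transcribes directly into code. A hedged NumPy sketch follows; the `eps` clipping is an implementation detail added here to avoid log(0), not part of the formula.

```python
import numpy as np

def binary_cross_entropy(y, p, eps=1e-12):
    """L = -(1/N) * sum_i [ y_i*log(p_i) + (1-y_i)*log(1-p_i) ]."""
    y = np.asarray(y, dtype=float)
    p = np.clip(np.asarray(p, dtype=float), eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
```

For a confident correct prediction the loss is near zero; for a confident wrong one it grows without bound, which is what drives the weight updates.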
In a specific application example, the technical scheme of the invention can be realized by the following steps:
step 1, fluorescently staining an organoid culture well plate, with Calcein-AM as the living-cell dye and PI as the dead-cell dye, staining according to the staining principle and the usage instructions;
step 2, after step 1 is completed, placing the organoid culture well plate into a cell imaging system, setting the shooting parameters (focal length, layer spacing, exposure, gain, and the like) after the image acquisition field of view is fixed, and acquiring a fluorescence map and a bright field map;
step 3, preparing the serial organoid fluorescent images obtained in the step 2 into mask tag files by using Labelme software;
step 4, the mask label file in the step 3 and the bright field diagram in the step 2 are manufactured into a training set of the organoid activity recognition model, and the organoid activity recognition model is trained by using the training set;
and 5, performing organoid activity recognition analysis by using the trained organoid activity recognition model.
Further, the fluorescent staining in step 1 specifically comprises: adding the Calcein-AM and PI fluorescent reagents to the staining culture solution in proportion, and replacing the culture solution in the organoid culture well plate with the proportioned solution to stain the organoids.
Further, the making of the mask tag file in step 3 specifically comprises: opening the target fluorescence map with Labelme software, framing the organoids generating fluorescence in the image with the drawing tool, and, after all organoids generating fluorescence in the visual field have been framed, saving the result as a mask tag file.
Preferably, the organoid activity recognition model in step 4 is a neural network model; the neural network adopts FCN, U-Net, wide U-Net or UNet++, its inputs are the bright field map from step 2 and the mask tag file from step 3, and the neural network model is trained iteratively by supervised learning.
Further, the specific iterative training process of the neural network model in the step 4 includes:
(1) The organoid bright field maps and the mask tag files made from the corresponding fluorescence maps are input in sequence, and each tag file is normalized into a binary map. Inside the model, the bright field map is downsampled and its features are extracted by the convolutional network; an upsampling operation then restores the image size, yielding the feature map of the bright field map. This feature map is matched against the tag file, so that the organoid features in the tag file are learned, and the process repeats as iterative training. During iterative training, a loss function expresses the deviation (degree of difference) between the feature map of the bright field map and the tag file, and an optimizer updates the network weights in the neural network model, so that the feature recognition accuracy on each bright field map converges as training proceeds;
(2) And sending the loss function, the optimizer and the neural network model obtained by iterative training into a Graphic Processor (GPU) for iterative training again, and stopping training until the loss function reaches a target value.
Further, the feature extraction in the step (2) is calculated using formula I:

Y_j = Σ_{i=1}^{n} Pixel_{i+j} · X_i    (formula I)

In formula I, Y is the feature value after convolution; Pixel_1, Pixel_2, Pixel_3, ... Pixel_n are the pixel values of the bright field map; j is the translation parameter of the convolution operation; and X_1, X_2, X_3, ... X_n are the convolution kernel weights.
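Formula I is an ordinary sliding-window correlation. A one-dimensional NumPy sketch is shown below for illustration; in the actual model the same operation is applied in two dimensions over image patches, and the function name is ours, not the patent's.

```python
import numpy as np

def conv_feature(pixels, kernel):
    """Y_j = sum_i Pixel_{i+j} * X_i for each valid translation j (formula I)."""
    n = len(kernel)
    return np.array([np.dot(pixels[j:j + n], kernel)
                     for j in range(len(pixels) - n + 1)])
```

With the difference kernel [1, 0, -1], each output value is the signed change across a three-pixel window, a simple edge-like feature.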
Further, the loss function adopted in the step (2) is calculated using formula II, based on cross entropy, which reflects the error between the predicted value and the actual value:

L = −(1/N) Σ_{i=1}^{N} [ y_i·log(p_i) + (1 − y_i)·log(1 − p_i) ]    (formula II)

In formula II, L is the error between the predicted values and the actual values over the N samples, y_i is the organoid judgment value (1 if organoid, 0 if background), and p_i is the model predicted value.
Referring to fig. 2 (a) to 2 (c), fig. 2 (a) is a microscope-captured image, fig. 2 (b) is an optical image of an organoid obtained by scanning with a microscope, and fig. 2 (c) is an organoid identified in the image.
Referring to fig. 3, the present invention also provides an artificial intelligence based organoid activity recognition system, the system comprising:
a well plate obtaining unit for obtaining a fluorescence-stained organoid culture well plate, the well plate being placed in a cell imaging system;
an image acquisition unit for acquiring a fluorescence map and a bright field map of the organoid culture well plate through the cell imaging system;
an organoid activity recognition unit containing the trained organoid activity recognition model, the unit being used for organoid activity recognition on the bright field map;
and an activity recognition analysis unit for analyzing the organoids recognized by the organoid activity recognition unit to obtain parameters such as organoid size, area ratio, and number.
In one embodiment, the well plate obtaining unit is specifically configured to add Calcein-AM and PI fluorescent reagents to the staining culture solution in proportion, and to replace the culture solution in the organoid culture well plate with the proportioned solution, so as to fluorescently stain the living cells and dead cells in the organoid culture well plate.
In one embodiment, the organoid activity recognition unit is specifically configured to perform active organoid recognition on the bright field map using the trained recognition model.
In one embodiment, the activity recognition analysis unit is specifically configured to perform a parametric analysis of the organoids after the active organoids have been recognized by the organoid activity recognition unit, where the parametric analysis includes number, area, diameter distribution, and the like.
The beneficial effects of the invention are as follows:
the method takes the organoid image as data, the data is visual and vivid, the organoid is not required to be dyed, compared with other modes, the method ensures that the living organoid condition in the sample is detected under the condition that the organoid is not dyed, and an important ring is established for quality control and drug sensitivity analysis in organoid culture, so that the detected organoid is beneficial to subsequent culture and experiment.
It will be appreciated by those skilled in the art that all or part of the above-described embodiment methods may be implemented by a computer program instructing related hardware, where the program may be stored in a computer-readable storage medium and, when executed, may include the flows of the above-described method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the storage medium may also comprise a combination of the above kinds of memories.
In this specification, the embodiments are described in a progressive manner; for identical or similar parts between the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the others. In particular, the system embodiments are described relatively briefly because they are substantially similar to the method embodiments; for relevant details, see the corresponding parts of the method embodiments.
The foregoing examples illustrate only a few embodiments of the invention and are described in detail without thereby limiting its scope. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the invention, all of which fall within the scope of the invention. Accordingly, the scope of protection of the present invention is determined by the appended claims.
Claims (8)
1. A method for identifying organoid activity based on artificial intelligence, the method comprising:
obtaining a fluorescence-stained organoid culture well plate, the organoid culture well plate being placed in a cell imaging system;
collecting a fluorescence image and a bright field image of the organoid culture well plate through the cell imaging system;
adding a tag file to the fluorescence map, and constructing a training sample set based on the bright field map and the tag file;
training the recognition model through training samples in the training sample set, and performing organoid activity recognition analysis based on the recognition model after training;
training the recognition model with the training samples in the training sample set includes:
inputting the bright field map and the corresponding tag file into the recognition model in sequence, normalizing the tag file into a binary map through the recognition model, and extracting bright field map features;
restoring the image size of the bright field map features by an upsampling operation to obtain a feature map of the bright field map;
performing feature matching between the feature map of the bright field map and the binary map of the tag file, so as to iteratively learn the organoid features in the tag file;
during the iterative learning of the organoid features in the tag file, adopting a loss function to express the deviation (degree of difference) between the feature map of the bright field map and the binary map of the tag file, and using an optimizer to update the network weights in the recognition model based on the deviation;
sending the loss function, the optimizer and the recognition model obtained by iterative training to a graphics processor for further iterative training, and stopping training when the loss function reaches a target value;
the organoid activity recognition model is a neural network model, and the neural network adopts FCN, U-Net, wide U-Net or UNet++.
2. The method of claim 1, wherein obtaining a fluorescence-stained organoid culture well plate comprises:
and adding a Calcein-AM and PI fluorescent reagent into the staining culture solution in proportion, and replacing the culture solution in the organoid culture pore plate after proportioning so as to carry out fluorescent staining on living cells and dead cells in the organoid culture pore plate.
3. The method of claim 1, wherein adding a tag file to the fluorescence map comprises:
loading the fluorescence map in Labelme software, and, in response to a drawing instruction, framing the fluorescent organoids in the fluorescence map with the drawing toolbox;
and after all organoids generating fluorescence in the visual field have been framed, generating the corresponding tag file through Labelme software.
4. The method of claim 1, wherein the loss function is calculated based on cross entropy, which reflects the error between the predicted value and the actual value, according to the following formula:

L = −(1/N) Σ_{i=1}^{N} [ y_i·log(p_i) + (1 − y_i)·log(1 − p_i) ]

where L is the error between the predicted values and the actual values over the N samples, y_i is the organoid judgment value (1 if organoid, 0 if background), p_i is the model predicted value, and N is the total number of samples.
5. An artificial intelligence based organoid activity recognition system for implementing the method of any of claims 1-4, said system comprising:
a well plate obtaining unit for obtaining a fluorescence-stained organoid culture well plate, the well plate being placed in a cell imaging system;
an image acquisition unit for acquiring a fluorescence map and a bright field map of the organoid culture well plate through the cell imaging system;
an organoid activity recognition unit containing the trained organoid activity recognition model, the unit being used for organoid activity recognition on the bright field map;
and an activity recognition analysis unit for analyzing the organoids recognized by the organoid activity recognition unit to obtain parameters such as organoid size, area ratio, and number.
6. The system of claim 5, wherein the well plate obtaining unit is specifically configured to add Calcein-AM and PI fluorescent reagents to the staining culture solution in proportion, and to replace the culture solution in the organoid culture well plate with the proportioned solution, so as to fluorescently stain the living cells and dead cells in the organoid culture well plate.
7. The system according to claim 5, wherein the organoid activity recognition unit is configured to perform the recognition of the active organoid on the bright field map using a trained recognition model.
8. The system of claim 5, wherein the activity recognition analysis unit is configured to perform a parametric analysis of the organoid, including a number, an area, a diameter distribution, etc., after the organoid is recognized by the organoid activity recognition unit.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202211733272.3A (CN116012838B) | 2022-12-30 | 2022-12-30 | Artificial intelligence-based organoid activity recognition method and system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202211733272.3A (CN116012838B) | 2022-12-30 | 2022-12-30 | Artificial intelligence-based organoid activity recognition method and system |
Publications (2)
| Publication Number | Publication Date |
| --- | --- |
| CN116012838A | 2023-04-25 |
| CN116012838B | 2023-11-07 |
Family
ID=86022657
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202211733272.3A (CN116012838B, Active) | Artificial intelligence-based organoid activity recognition method and system | 2022-12-30 | 2022-12-30 |
Country Status (1)
| Country | Link |
| --- | --- |
| CN | CN116012838B (en) |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2818514A1 (en) * | 2009-11-17 | 2011-05-26 | Harvard Bioscience, Inc. | Bioreactors, systems, and methods for producing and/or analyzing organs |
CN109871735A (en) * | 2017-11-17 | 2019-06-11 | Sysmex Corporation | Image analysis method, image analysis apparatus, program, and method for producing a trained deep learning algorithm |
CN112102323A (en) * | 2020-09-17 | 2020-12-18 | Shaanxi Normal University | Adherent nucleus segmentation method based on generative adversarial network and Caps-Unet |
CN113151149A (en) * | 2021-03-10 | 2021-07-23 | Anhui University | Economical and simple method for inducing lung organoids and establishing an experimental model |
CN113283353A (en) * | 2021-05-31 | 2021-08-20 | Chuangxin International Biotechnology (Guangzhou) Co., Ltd. | Organoid cell counting method and system based on microscopic images |
CN113283352A (en) * | 2021-05-31 | 2021-08-20 | Chuangxin International Biotechnology (Guangzhou) Co., Ltd. | Organoid viability evaluation method and system based on microscopic images |
CN113688939A (en) * | 2021-09-08 | 2021-11-23 | Huazhong Agricultural University | Pollen activity recognition model training method and system, and recognition method and system |
CN113920108A (en) * | 2021-10-29 | 2022-01-11 | Beihang University | Method for training a U-Net model for processing cell images |
CN114332855A (en) * | 2021-12-24 | 2022-04-12 | Hangzhou Dianzi University | Label-free three-class leukocyte classification method based on bright-field microscopic imaging |
CN114463290A (en) * | 2022-01-20 | 2022-05-10 | Chuangxin International Biotechnology (Guangzhou) Co., Ltd. | Intelligent organoid type identification method and system based on microscopic images |
CN114494217A (en) * | 2022-01-29 | 2022-05-13 | Hangzhou Regenovo Biotechnology Co., Ltd. | Method and device for detecting artificial tissues and organoids |
CN114926562A (en) * | 2022-05-20 | 2022-08-19 | Wenzhou Medical University | Hyperspectral image virtual staining method based on deep learning |
CN115063360A (en) * | 2022-06-09 | 2022-09-16 | Chengdu West China Precision Medicine Industry Technology Research Institute Co., Ltd. | Intelligent interpretation method and system based on virtual staining |
EP4060552A1 (en) * | 2021-03-15 | 2022-09-21 | Universiteit Antwerpen | Label-free analysis of brightfield microscope images |
CN115358973A (en) * | 2022-07-22 | 2022-11-18 | Chuangxin International Biotechnology (Guangzhou) Co., Ltd. | Organoid ATP analysis method and system based on artificial intelligence |
CN115485714A (en) * | 2020-04-21 | 2022-12-16 | Sartorius BioAnalytical Instruments, Inc. | Image processing and segmentation of sets of Z-stack images of three-dimensional biological samples |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11415571B2 (en) * | 2019-12-05 | 2022-08-16 | Tempus Labs, Inc. | Large scale organoid analysis |
US11561178B2 (en) * | 2020-04-20 | 2023-01-24 | Tempus Labs, Inc. | Artificial fluorescent image systems and methods |
- 2022-12-30: CN application CN202211733272.3A filed; published as CN116012838B (status: Active)
Non-Patent Citations (3)
Title |
---|
Yan Yang et al.; In vitro reconstitution of the hormone-responsive testicular organoids from murine primary testicular cells; Biofabrication; pp. 1-18 *
Guo Licheng; Cao Xuewei; Fu Longyun; Wang Fujun; Zhao Jian; Development of a bifunctional tag for affinity purification and transmembrane transport of therapeutic proteins; China Biotechnology; No. 06; full text *
Ma Tingzheng et al.; Optimization strategies for detecting membrane receptor-extracellular ligand interactions using FRET; Journal of Nanjing Medical University (Natural Sciences); pp. 1049-1054 *
Also Published As
Publication number | Publication date |
---|---|
CN116012838A (en) | 2023-04-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Ouyang et al. | Deep learning massively accelerates super-resolution localization microscopy | |
US20230127698A1 (en) | Automated stereology for determining tissue characteristics | |
Zieliński et al. | Deep learning approach to describe and classify fungi microscopic images | |
Pennekamp et al. | Implementing image analysis in laboratory‐based experimental systems for ecology and evolution: a hands‐on guide | |
CA3138959C (en) | Image diagnostic system, and methods of operating thereof | |
JP2021503666A (en) | Systems and methods for single-channel whole-cell segmentation | |
Sun et al. | Deep learning‐based single‐cell optical image studies | |
CN106204642B | Cell tracking method based on deep neural network | |
Hollandi et al. | A deep learning framework for nucleus segmentation using image style transfer | |
CN111161272B | Embryo tissue segmentation method based on generative adversarial network | |
US20230186659A1 (en) | Machine learning models for cell localization and classification learned using repel coding | |
Wang et al. | Biological image analysis using deep learning-based methods: literature review | |
Shang et al. | Identifying rumen protozoa in microscopic images of ruminant with improved YOLACT instance segmentation | |
Wang et al. | Fossil brachiopod identification using a new deep convolutional neural network | |
US20210264130A1 (en) | Method and apparatus for training a neural network classifier to classify an image depicting one or more objects of a biological sample | |
CN116012838B (en) | Artificial intelligence-based organoid activity recognition method and system | |
US20200372652A1 (en) | Calculation device, calculation program, and calculation method | |
WO2018128091A1 (en) | Image analysis program and image analysis method | |
CN116757998A (en) | Screening method and device for CTC cells and CTC-like cells based on AI | |
Ekman et al. | Task based semantic segmentation of soft X-ray CT images using 3D convolutional neural networks | |
CN115760957A (en) | Method for analyzing substance in three-dimensional electron microscope cell nucleus | |
CN112669288B (en) | Cell target expression prediction method, system and device based on digital pathological image | |
Gatti et al. | Deep learning strategies for differential expansion microscopy | |
US20240054640A1 (en) | System, Method, and Computer Program Product for Classification of Diseases Based on Expansion Microscopic Images | |
EP4273608A1 (en) | Automatic acquisition of microscopy image sets |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||