WO2020188814A1 - Cell analyzer - Google Patents

Cell analyzer

Info

Publication number
WO2020188814A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
cells
cell
culture
unit
Prior art date
Application number
PCT/JP2019/011855
Other languages
English (en)
Japanese (ja)
Inventor
倫誉 山川
藤原 直也
克利 木村
隆二 澤田
Original Assignee
Shimadzu Corporation (株式会社島津製作所)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shimadzu Corporation
Priority to PCT/JP2019/011855 (WO2020188814A1)
Priority to JP2021506110A (JP7006833B2)
Publication of WO2020188814A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25: Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/27: Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands, using photo-electric detection; circuits for computing concentration
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00: Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48: Biological material, e.g. blood, urine; Haemocytometers
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00: Microscopes
    • G02B21/24: Base structure
    • G02B21/26: Stages; Adjusting means therefor
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00: Microscopes
    • G02B21/36: Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements

Definitions

  • the present invention relates to a cell analyzer.
  • pluripotent stem cells such as iPS cells and ES cells
  • In the field of cell culture for regenerative medicine, a phase contrast microscope has traditionally been used for morphological observation of cells in culture.
  • A phase contrast microscope is used because cells are generally transparent and are difficult to observe well with an ordinary optical microscope.
  • When culturing cells for regenerative medicine, it is very important to know the number of cells present in the culture vessel and the area occupied by the cells on the medium in the culture vessel (hereinafter sometimes referred to as the "cell area"). This is because the number of cells and the cell area are important indicators for judging whether the culture conditions are appropriate and for determining the timing of subculture (passage).
  • When cells aggregate into a mass and form a cell colony (hereinafter simply a "colony"), it is difficult to separate and count individual cells, so the cell area is often used as an index instead.
  • To detect cells in such images, it is necessary to distinguish the region where cells exist (hereinafter the "cell region") from the background region where no cells exist. Making this distinction by hand is burdensome and inefficient, so techniques using a discriminator that automatically distinguishes the cell region from the background region have been developed. Various identification algorithms can be considered; one known approach is an identification method using machine learning such as deep learning.
  • Culture vessels of various forms are used, such as one or more wells formed on a culture plate, dishes (Petri dishes), and flasks.
  • The many wells formed on one culture plate are often used to evaluate the growth of cells while varying culture conditions such as the medium used and the type of differentiation inducer.
  • In such cases, cell culture is performed under different culture conditions in each well, so even if the cell type and strain are the same, the cell size, shape, density, and so on differ from well to well.
  • Cells of different strains or different types are also often cultured in the many wells formed on one culture plate; in that case, naturally, the size, shape, and density of the cells differ for each well.
  • Depending on the state of the cells appearing in the image, the cell region could be identified with high accuracy in some cases but not in others, and it was difficult to improve the identification accuracy overall.
  • The present invention has been made to solve these problems. Its main purpose is to provide a cell analysis device that, when automatically identifying regions in which cells cultured in a plurality of wells (culture vessels) exist, can identify target regions such as cell regions and regions of characteristic cells (e.g., undifferentiated deviant cells) with high accuracy, regardless of differences in cell state or culture conditions.
  • The cell analyzer of one aspect of the present invention is a cell analyzer for analyzing cells cultured in culture vessels, comprising:
  • a plurality of discriminators of mutually different types, each created by machine learning on different teacher data, for extracting a specific region where cells exist from an observation image of the cells;
  • an image acquisition unit that acquires observation images of cells over a predetermined imaging target area including a plurality of culture vessels;
  • a discriminator selection unit that independently selects at least one of the plurality of discriminators for each of the plurality of culture vessels; and
  • a specific region extraction unit that extracts the specific region from the whole or a part of the observation image obtained by the image acquisition unit, using the discriminator selected for each culture vessel by the discriminator selection unit.
  • The cell analyzer of another aspect of the present invention is a cell analyzer for analyzing cells cultured in culture vessels, comprising:
  • a plurality of discriminators of mutually different types, each created by machine learning on different teacher data, for extracting a specific region where cells exist from an observation image of the cells;
  • an image acquisition unit that acquires observation images of cells over a predetermined imaging target area including a plurality of culture vessels;
  • a discriminator selection unit that selects at least one of the plurality of discriminators for each type of cultured cells and/or for each culture condition; and
  • a specific region extraction unit that extracts the specific region from the whole or a part of the observation image obtained by the image acquisition unit, using the discriminator selected by the discriminator selection unit according to the type of cells and/or the culture conditions in the culture vessel appearing in that image.
  • The cell analysis device of still another aspect of the present invention is a cell analysis device for analyzing cells cultured in a culture vessel, comprising:
  • a plurality of discriminators of mutually different types, each created by machine learning on different teacher data, for extracting a specific region where cells exist from an observation image of the cells;
  • an image acquisition unit that acquires observation images of cells over a predetermined imaging target area including one or a plurality of culture vessels;
  • a discriminator selection unit that independently selects at least one of the plurality of discriminators for each of a plurality of analysis ranges set by the user in the one or the plurality of culture vessels; and
  • a specific region extraction unit that extracts the specific region from the whole or a part of the observation image obtained by the image acquisition unit, using the discriminator selected for each analysis range by the discriminator selection unit.
  • the type of cell to be analyzed in the cell analyzer of each of the above embodiments is not particularly limited, but is typically a pluripotent stem cell such as a human iPS cell or an ES cell.
  • the characteristic cells can include undifferentiated cells, undifferentiated deviant cells, and the like.
  • In the cell analyzer of the above aspect, the specific region extraction unit inputs to a discriminator the data constituting the whole or a part of the observation image of cells over the predetermined imaging target region including a plurality of culture vessels. As the output of the discriminator, for example, an image is obtained in which the specific region where cells exist is segmented from the other regions.
  • The discriminator used at this time is at least one of the plurality of discriminators selected for each culture vessel by the discriminator selection unit, so a different discriminator can be used for each culture vessel when extracting the specific region.
  • The type of region a discriminator identifies, and its discrimination performance, depend on its teacher data.
  • An appropriate discriminator is therefore selected according to the type of cells cultured in each well, the culture conditions, and so on. As a result, even if the cell size, shape, density, etc. differ from well to well, a specific region such as the cell region can be extracted with high accuracy. The cell area, for example, can then be obtained accurately, and based on such information the user can easily and accurately judge whether the culture conditions are appropriate and decide the timing of passage. Likewise, in the cell analyzers of the other aspects of the present invention, a specific region such as the cell region can be extracted with high accuracy even when the size, shape, density, etc. of the cultured cells differ.
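The per-vessel selection described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the registry, the discriminator names, and the threshold "discriminators" (which stand in for trained multi-layer neural networks) are all assumptions.

```python
# Sketch of a discriminator registry plus a per-well selection table,
# mirroring the discriminator selection unit (291) and the specific region
# extraction unit (25). The discriminators here are placeholder callables.
from typing import Callable, Dict

import numpy as np

# A discriminator maps an IHM phase image to a label image (segmentation mask).
Discriminator = Callable[[np.ndarray], np.ndarray]

def cell_vs_background(phase_image: np.ndarray) -> np.ndarray:
    # Placeholder: a threshold stands in for a learned binary segmenter.
    return (phase_image > 0.5).astype(np.uint8)

def undifferentiated_vs_deviant(phase_image: np.ndarray) -> np.ndarray:
    # Placeholder: three labels (0=background, 1=undifferentiated, 2=deviant).
    labels = np.zeros_like(phase_image, dtype=np.uint8)
    labels[phase_image > 0.5] = 1
    labels[phase_image > 0.8] = 2
    return labels

registry: Dict[str, Discriminator] = {
    "Cell Analysis_01": cell_vs_background,
    "Cell Analysis_02": undifferentiated_vs_deviant,
}

# One discriminator chosen independently for each well (cf. Fig. 6).
well_selection = {1: "Cell Analysis_01", 2: "Cell Analysis_02"}

def extract_specific_region(well_no: int, phase_image: np.ndarray) -> np.ndarray:
    """Run the discriminator selected for this well on its phase image."""
    return registry[well_selection[well_no]](phase_image)
```

The point of the structure is that the mapping from well to discriminator is data, not code, so wells carrying different cell lines or culture conditions simply point at different registry entries.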
  • A schematic block configuration diagram of the cell analysis apparatus that is one embodiment of the present invention.
  • A flowchart showing the work procedure and process flow of the cell state evaluation work using the cell analysis apparatus of this embodiment.
  • A figure showing an example of the extraction result of a cell region when the inside of a well is designated as the imaging target area.
  • A figure showing an example of the extraction result of a cell region by a discriminator, and an example of the extraction result of undifferentiated cell regions and undifferentiated deviant cell regions by a discriminator.
  • A figure showing an example of a display screen that includes a graph of the change of cell area over time.
  • A figure showing an example of the discriminator selection screen in a cell analysis apparatus of another embodiment.
  • FIG. 1 is a schematic block configuration diagram of the cell analysis device of the present embodiment.
  • This cell analysis device includes a microscopic observation unit 1, a control/processing unit 2, and an input unit 3 and display unit 4 that serve as user interfaces.
  • The microscopic observation unit 1 is an in-line holographic microscope having a light source unit 10, which includes a laser diode and the like, and an image sensor 11. Between the light source unit 10 and the image sensor 11 is placed a culture plate 12 or other culture vessel in which cells (or colonies) 13 are cultured.
  • The control/processing unit 2 controls the operation of the microscopic observation unit 1 and processes the data it acquires. As functional blocks, it includes an imaging control unit 20, a hologram data storage unit 21, a phase information calculation unit 22, an image reconstruction unit 23, a reconstructed image data storage unit 24, a specific region extraction unit 25, an index value calculation unit 26, a display processing unit 28, and a measurement/analysis condition setting unit 29.
  • The specific region extraction unit 25 includes a plurality of discriminators 251, as described later, and the measurement/analysis condition setting unit 29 includes a discriminator selection unit 291.
  • In practice, the control/processing unit 2 is a personal computer or higher-performance workstation on which predetermined software (a computer program) has been installed, or a computer system including a high-performance computer connected to such a computer via a communication line. The function of each block in the control/processing unit 2 is realized by executing that software and processing the various data stored on the computer or computer system. Of course, some of these functions can also be realized by dedicated hardware circuits.
  • The cells to be observed and analyzed by the cell analyzer of the present embodiment may be of any type as long as they are cultured in a culture vessel, but they are typically pluripotent stem cells such as iPS cells and ES cells.
  • The culture vessel is a culture plate in which a number of wells are formed, a dish, a flask, or the like. The size and shape of these culture vessels vary with the type and manufacturer; for example, culture plates with 6, 12, 24, 48, 96, or 384 wells on one plate are all common.
  • Each well can be regarded as an individual culture vessel; a culture plate with 6 wells, for example, can be regarded as 6 culture vessels connected and integrated into one.
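The standard plate formats mentioned above can be captured in a small lookup table. This helper is hypothetical, not part of the patent; the row-by-column layouts are the standard microplate grids for these well counts.

```python
# Hypothetical registry of standard culture-plate formats. The (rows, cols)
# values are the conventional microplate grids for each well count.
WELL_LAYOUTS = {
    6: (2, 3),      # rows x columns
    12: (3, 4),
    24: (4, 6),
    48: (6, 8),
    96: (8, 12),
    384: (16, 24),
}

def well_grid(n_wells: int):
    """Return the (rows, cols) grid for a standard plate with n_wells wells."""
    rows, cols = WELL_LAYOUTS[n_wells]
    assert rows * cols == n_wells   # sanity check on the registry
    return rows, cols
```

Treating each well as an independent culture vessel then amounts to iterating over this grid and attaching per-well settings (such as the selected discriminator) to each (row, col) position.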
  • FIG. 2 is a flowchart showing a work procedure and a flow of processing in a cell analysis work using the cell analysis device of this embodiment.
  • FIG. 3 is a conceptual diagram for explaining a photographing operation in the microscopic observation unit 1 and an image reconstruction process performed in the control / processing unit 2.
  • FIG. 3A is a schematic top view of an example of the culture plate 12. Six wells 12a having a circular shape in the top view are formed on the culture plate 12, and cells are cultured on the medium contained in each well 12a.
  • In this example, the entire culture plate 12 shown in FIG. 3A, that is, the entire rectangular area including the six wells 12a, is the region imageable by the microscopic observation unit 1.
  • The microscopic observation unit 1 includes four sets of a light source unit 10 and an image sensor 11, and each set is responsible for collecting hologram data of one of the four quarter ranges 81 into which the entire culture plate 12 is divided, as shown in FIG. 3A. That is, the four sets of light source unit 10 and image sensor 11 share the collection of hologram data over the entire culture plate 12 (the entire imageable region).
  • A substantially square range 82 including one well 12a is set within the quarter range 81, and the range that one set of light source unit 10 and image sensor 11 can photograph at one time corresponds to one imaging unit 83, obtained by dividing this range 82 into 10 equal parts in the X-axis direction and 12 equal parts in the Y-axis direction.
  • The four light source units 10 and four image sensors 11 are arranged, in the XY plane, near the four vertices of a rectangle having the same size as the quarter range 81.
  • In one shot, therefore, holograms of four different imaging units 83 on the culture plate 12 are acquired at the same time. Then, as described later, by moving the light source units 10 and image sensors 11 stepwise by a predetermined distance in the XY plane, each set acquires in order the hologram data of the ranges corresponding to its 180 imaging units 83.
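The step-scan bookkeeping above can be sketched as follows. The description gives 180 imaging units per quarter range; the 15 × 12 factorization of those 180 positions, the 2 × 2 arrangement of the quarters, and the raster traversal order are illustrative assumptions, not stated in the patent.

```python
# Sketch of the lockstep step-scan: four light-source/sensor sets each cover
# one quarter range, and all four move together, so one step yields four
# holograms at once.
NX, NY = 15, 12          # assumed factorization of 180 imaging units/quarter

def quarter_origin(quarter):
    """Imaging-unit index of the corner of each quarter (assumed 2x2 layout)."""
    row, col = divmod(quarter, 2)
    return col * NX, row * NY

def step_positions(k):
    """Absolute imaging-unit indices acquired simultaneously at step k."""
    ix, iy = k % NX, k // NX          # raster order within a quarter
    return [(ox + ix, oy + iy) for ox, oy in map(quarter_origin, range(4))]

def full_scan():
    """All imaging-unit indices over the whole plate, in acquisition order."""
    return [pos for k in range(NX * NY) for pos in step_positions(k)]
```

Because the four sets share one mechanical stage, the scan needs only NX × NY steps to cover 4 × 180 imaging units.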
  • The sizes and shapes of culture vessels are very diverse, and even inside a culture vessel, not all of the interior is necessarily a region where cells are cultured.
  • Moreover, the region the user actually wants to analyze is not necessarily the whole region where cells exist; it may be only a part of it. The cell analysis apparatus of the present embodiment therefore allows the imaging target area actually photographed and the analysis target area actually analyzed to be restricted as follows.
  • Before executing a measurement (imaging), the user enters from the input unit 3 information such as the measurement date and time, information about the culture vessel to be used such as its manufacturer and model number, the scan conditions that determine the area to be photographed, and the analysis conditions that determine the analysis target area.
  • The culture plate 12 in which the cells 13 to be observed and analyzed are being cultured is then set at a predetermined position in the microscopic observation unit 1, and execution of the measurement is instructed (step S1).
  • The "manufacturer model number" 101 is information indicating the manufacturer and model number of the culture plate, entered by the user.
  • The "plate shape" 102 is information indicating the shape of the entire culture plate.
  • The "well shape" 103 is information indicating the shape of one well; both are uniquely determined by the manufacturer model number and the like.
  • The "plate reference point" 104 is information about the installation position of the culture plate.
  • The "well position" 105 is information indicating the centre position coordinates of each of the six wells, calculated from an optical image taken by a camera attached to the microscopic observation unit 1.
  • The "scan condition" 106 is information indicating ranges to be excluded from imaging.
  • The "analysis condition" 107 is information indicating ranges to be excluded from analysis, such as the cell region extraction described later; both are set by user input as necessary.
  • The imaging control unit 20 has a container information storage unit in which information including the shapes and sizes of various culture vessels is registered in advance.
  • The imaging control unit 20 collates the culture vessel information (manufacturer model number) set from the input unit 3 against the container information storage unit and reads out information such as the plate shape and well shape shown in FIG. 4. It also identifies the plate reference point, the position coordinates of the wells, and so on from an optical photograph of the mounted culture plate 12. From the entered information and the position information obtained from the optical image, it then calculates the region within the imageable region where cells can be cultured, and determines the imaging target area accordingly (step S2). The details of the imaging target area are described later.
  • Next, the imaging control unit 20 controls each part of the microscopic observation unit 1 to perform imaging and collect hologram data (step S3). One light source unit 10 irradiates a predetermined region (one imaging unit 83) of the culture plate 12 with coherent light spreading over a minute angle of about 10°.
  • The coherent light transmitted through the culture plate 12 and the cells 13 (the object light 15) reaches the image sensor 11 while interfering with the light transmitted through regions of the culture plate 12 close to, but not through, the cells 13 (the reference light 14).
  • The object light 15 is light whose phase changes as it passes through a cell 13, while the reference light 14 does not pass through a cell 13 and therefore does not undergo the phase change caused by the cell.
  • In this way, the four image sensors 11 simultaneously acquire hologram data of regions corresponding to different imaging units 83 on the culture plate 12.
  • Each light source unit 10 and image sensor 11 is then moved stepwise in the X-axis and Y-axis directions within the XY plane, by a distance corresponding to one imaging unit 83, by a moving mechanism (not shown). As a result, measurements are performed at the up to 180 imaging units 83 included in each quarter range 81, and the four sets of light source unit 10 and image sensor 11 together measure the entire culture plate 12 or a specified imaging target area.
  • Since a holographic microscope requires no focusing at the time of imaging, the time required for imaging (measurement) at each position is short, and imaging can be completed in a relatively short time even when the imaging target area is wide.
  • the hologram data obtained by the four image sensors 11 of the microscopic observation unit 1 is sent to the control / processing unit 2 and stored in the hologram data storage unit 21 together with the attribute information such as the measurement date and time.
  • The microscopic observation unit 1 can acquire hologram data over the entire rectangular imageable area including the six wells 12a, but some imaging units 83 lie entirely outside the wells 12a, where no cells are present, and hologram data acquired for such units would be useless. Therefore, in step S2, the imaging control unit 20 determines, based on the shape of the wells 12a and their position information, whether each imaging unit 83 includes part of the inner region of a well 12a or contains only the region outside the wells. Only the imaging units 83 determined to include part of the inner region of a well 12a are included in the imaging target area.
  • FIG. 5 shows an example of dividing the 10 × 12 imaging units 83 covering one well 12a into the imaging target area and the other regions: the 48 imaging units 83B shown hatched are outside the imaging target area, and the remaining 72 imaging units 83A form the imaging target area.
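The inclusion test behind this division can be sketched as a square-circle intersection check. This is an illustrative version of the step-S2 decision only; the geometry values used in the example are arbitrary, not taken from the patent.

```python
# An imaging unit is kept in the imaging target area only if its square
# footprint overlaps the circular inner region of the well.
def unit_intersects_well(ix, iy, unit, cx, cy, r):
    """True if the unit square [ix*unit,(ix+1)*unit) x [iy*unit,(iy+1)*unit)
    overlaps the circle of radius r centred at (cx, cy)."""
    # Clamp the circle centre to the rectangle, then test the distance.
    nearest_x = min(max(cx, ix * unit), (ix + 1) * unit)
    nearest_y = min(max(cy, iy * unit), (iy + 1) * unit)
    return (nearest_x - cx) ** 2 + (nearest_y - cy) ** 2 <= r ** 2

def imaging_target_units(nx, ny, unit, cx, cy, r):
    """Indices of imaging units that contain part of the well's inner region."""
    return [(ix, iy)
            for iy in range(ny)
            for ix in range(nx)
            if unit_intersects_well(ix, iy, unit, cx, cy, r)]
```

Units failing the test (the 83B units in FIG. 5) are simply skipped during the scan, which is where the measurement-time and data-volume savings come from.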
  • The imaging units 83B outside the imaging target area are skipped during the imaging operation described above, and no hologram data is acquired for them.
  • This shortens the measurement time and reduces the amount of hologram data for the imaging target area, which saves memory capacity in the hologram data storage unit 21 and reduces the amount of data transferred from the microscopic observation unit 1 to the control/processing unit 2.
  • Alternatively, the entire imageable area, including the portions where no significant hologram data can be obtained, may be photographed (measured) to collect hologram data, and the imaging units outside the imaging target area may instead be excluded from the analysis described later.
  • After the measurement, the phase information calculation unit 22 sequentially reads hologram data from the hologram data storage unit 21 and performs a light-wave propagation calculation to restore the phase information and amplitude information at each two-dimensional position. Since the spatial distribution of this information is obtained for each imaging unit 83, once the phase and amplitude information of all imaging units 83 has been obtained, the image reconstruction unit 23 forms a phase image of the entire imaging target region, that is, an IHM phase image, based on that information (step S4).
  • Specifically, the image reconstruction unit 23 reconstructs the IHM phase image of each imaging unit 83 from the spatial distribution of the phase information calculated for that unit, and then performs a tiling process (see FIG. 3D) that joins these narrow-range IHM phase images to form the IHM phase image of the imaging target region, that is, the entire culture plate 12 or a specified imaging target area within it.
  • The data constituting the IHM phase image is stored in the reconstructed image data storage unit 24. During tiling, it is advisable to apply an appropriate correction so that the IHM phase images join smoothly at the boundaries of the imaging units 83.
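The patent does not name the propagation algorithm used in the light-wave propagation calculation; the angular spectrum method below is one standard choice, shown only to make the "restore phase and amplitude by propagation" step concrete. The wavelength, pixel pitch, and distance values are arbitrary.

```python
# Hedged sketch: angular spectrum propagation of a recorded hologram back to
# the sample plane, yielding phase and amplitude maps.
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel, distance):
    """Propagate a complex field by `distance` using the angular spectrum method."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel)
    fy = np.fft.fftfreq(ny, d=pixel)
    FX, FY = np.meshgrid(fx, fy)
    # Radicand of the longitudinal spatial frequency; negative values are
    # evanescent components, which are suppressed.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * distance) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

def reconstruct(hologram, wavelength, pixel, distance):
    """Phase and amplitude of the field propagated to the reconstruction plane."""
    field = angular_spectrum_propagate(hologram.astype(complex),
                                       wavelength, pixel, distance)
    return np.angle(field), np.abs(field)
```

Repeating the reconstruction at several `distance` values is what allows an in-focus IHM phase image to be picked out numerically, without mechanical focusing at imaging time.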
  • The reconstructed image obtained by normal processing is the highest-resolution IHM phase image obtainable in principle from the acquired hologram data, but in addition, binning or similar processing based on that highest-resolution image can produce IHM phase images of various lower resolutions, and the data constituting these images is also stored in the reconstructed image data storage unit 24.
  • The display processing unit 28 can then quickly form an image to display by reading out reconstructed image data of the required resolution from the reconstructed image data storage unit 24.
  • The phase information calculation unit 22 can calculate from the hologram data not only phase information but also intensity information, and pseudo-phase information obtained by merging the phase and intensity information; the image reconstruction unit 23 can likewise create reproduced images based on these (an IHM intensity image and an IHM pseudo-phase image), which may be used in place of the IHM phase image. Further, when creating an IHM phase image or the like from hologram data, images can be created at a plurality of focal positions, so that an IHM phase image at the in-focus position can be obtained without focusing at the time of imaging.
  • Next, the specific region extraction unit 25 extracts, from the IHM phase image corresponding to the imaging target region, the cell region where cells exist, a region where specific cells exist, or another characteristic region.
  • This region extraction is performed using a discriminator trained with a machine learning algorithm such as deep learning using a multi-layer neural network, or, put more simply, using image segmentation by deep learning.
  • To create such a discriminator, a person in charge labels a large number of IHM phase images with binary values distinguishing the background region, where no cells exist, from the cell region, where cells exist, and these labelled images are prepared as teacher data.
  • The resulting discriminator takes the data constituting an IHM phase image as input and outputs, as its discrimination result, an image segmented according to the binary labels (a segmentation image).
  • The actual substance of the discriminator is a multi-layer neural network whose parameters have been learned so as to execute this discrimination process.
  • The specific region extraction unit 25 holds such discriminators and feeds the IHM phase image of the analysis target, created as described above, to a discriminator as input data. As the discriminator's output, a segmentation image is obtained in which the pixels belonging to the background region and to the cell region are labelled.
  • In the present embodiment, the specific region extraction unit 25 includes a plurality of discriminators 251 of different types. Before the region extraction based on the IHM phase image is performed, the user individually designates, for each of the six wells 12a, the discriminator to be used for that well, as follows (step S5). The specific region extraction unit 25 then extracts the cell region or characteristic region from the IHM phase image of each well using the discriminator designated for that well (step S6).
  • FIG. 6 is a diagram showing an example of the discriminator selection screen 110.
  • A well-by-well discriminator list 111 is arranged on the discriminator selection screen 110.
  • The analysis parameter indicates the type of discriminator; in the example of FIG. 6, it means that a discriminator named "Cell Analysis_01" is used for the well whose well number is "1".
  • When the down-arrow symbol 113 at the right end of the well-by-well discriminator list 111 is clicked, a list of registered discriminators is displayed in a drop-down menu, and the user selects the desired one from the list by clicking it.
  • When the mouse cursor is moved over any discriminator name in the drop-down list, a comment registered in advance for that discriminator is displayed in a pop-up.
  • By registering appropriate information in advance as a comment, such as the cell types and culture conditions for which a discriminator's region extraction is suited, the user can refer to it when selecting a discriminator.
  • A comment column may also be provided in each row of the well-by-well discriminator list 111 so that the comment registered for the currently selected discriminator is displayed.
  • Some of the mutually different discriminators share the same discrimination algorithm (for example, deep learning with the same multi-layer neural network) but were trained on different teacher data. Even with the same labelling scheme (for example, binary labels for cell region and background region), discriminators trained on different IHM phase images are different discriminators. Cell shape and the like can differ not only between cell types but also between cell lines of the same type and between culture conditions. Therefore, by creating discriminators whose teacher data are IHM phase images of cells cultured under different culture conditions together with their labelled images (segmentation images), and selecting the discriminator that matches the culture conditions when performing an analysis, the influence of differences in culture conditions can be reduced and more accurate identification achieved.
  • Discriminators with different labelling schemes are, of course, also different discriminators. Specifically, for example, in addition to a discriminator trained on IHM images labelled with two regions (background region and cell region) as teacher data, a discriminator trained on IHM images labelled with three (or more) regions (background region, undifferentiated cell region, and undifferentiated deviant cell region) can be used. Furthermore, discriminators can be created that label as characteristic regions not only cell regions but also regions containing foreign matter such as dust, or regions showing some characteristic pattern in the image, such as interference fringes. Such different discriminators can be used selectively according to the purpose of the analysis, the state of the cells, and so on.
  • A plurality of discriminators can also be selected for one well; when extracting a cell region or a characteristic region from the IHM phase image of a certain well or wells, a plurality of mutually different classifiers may be used in combination.
  • For example, the region extraction result obtained from a first discriminator that discriminates between the cell region and the background region may be combined with the region extraction result obtained from a second discriminator that discriminates between the undifferentiated deviant cell region and other regions.
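As an illustrative sketch (not taken from the specification), combining the outputs of two such discriminators could look like the following, where both masks are boolean arrays over the same image and the combination rule is an assumption:

```python
import numpy as np

def combine_extractions(cell_mask, deviant_mask):
    """Combine results of two discriminators (assumed rule for illustration).

    cell_mask:    first discriminator, True where cells are present (vs. background)
    deviant_mask: second discriminator, True where undifferentiated-deviant
    Deviant labels are only kept where the first discriminator found cells.
    """
    deviant_in_cells = cell_mask & deviant_mask
    undifferentiated = cell_mask & ~deviant_mask
    return undifferentiated, deviant_in_cells
```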
  • FIG. 8 is a diagram showing an example in which a cell region was extracted (segmentation was performed) using a discriminator capable of distinguishing between a background region and a cell region. In this extraction result, the region where cells and colonies are present is well extracted.
  • FIG. 9 is a diagram showing an example in which segmentation was performed using a discriminator capable of discriminating between a background region, an undifferentiated cell region, and an undifferentiated deviant cell region. In this extraction result, the cell region and the background region are well distinguished, and even in the same cell region, undifferentiated cells and undifferentiated deviant cells are well distinguished.
  • FIG. 7 is a diagram showing an IHM phase image (right side) near the peripheral edge of a well and an image (left side) of the result of extracting the cell region using the entire IHM phase image as the analysis target region. From FIG. 7, it can be seen that the inside of the substantially circular well periphery is defined as the imaging target region, and that the cells (colonies) in the imaging target region are appropriately detected.
  • Next, the index value calculation unit 26 performs predetermined arithmetic processing on the cell region or characteristic region extracted as described above, and calculates index values related to, for example, the cell area in each well or in the entire culture plate including a plurality of wells (step S7). The calculated index values are stored in the calculation result storage unit 27.
  • That is, the number of pixels in the cell region extracted on the IHM phase image is integrated for each well, and by multiplying this pixel count by the area corresponding to one pixel, the cell area within the region set as the analysis range in the well (for example, a circular range with a diameter of 15 mm from the center of the well) can be calculated for each well.
  • the ratio of the area occupied by cells in the analysis range and the cell area per unit area can be calculated from the cell area in the analysis range in one well and the area of the analysis range in the well obtained in this way. Further, it is possible to calculate statistics such as the total cell area of one culture plate and the average and dispersion of cell areas in each of a plurality of wells in one culture plate.
  • the ratio of the area occupied by the undifferentiated deviation cells to the area occupied by the undifferentiated cells can be calculated for each well.
  • When characteristic regions other than cell regions are extracted, the area of those characteristic regions, the ratio of that area to the internal area of the well, or the ratio of that area to the area of the cell region can likewise be calculated.
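The index-value computations described above can be sketched as follows; the function name and the label convention (0 = background, 1 = undifferentiated, 2 = undifferentiated-deviant) are illustrative assumptions, not from the specification:

```python
import numpy as np

def well_area_metrics(label_map, pixel_area, analysis_area):
    """Index values from a per-well segmentation restricted to the analysis range.

    label_map:     0 = background, 1 = undifferentiated cells,
                   2 = undifferentiated deviant cells (assumed convention)
    pixel_area:    real area represented by one pixel
    analysis_area: area of the analysis range (e.g. a 15 mm-diameter circle)
    """
    cell_pixels = np.count_nonzero(label_map > 0)
    cell_area = cell_pixels * pixel_area        # integrated pixel count x pixel area
    confluency = cell_area / analysis_area      # ratio of area occupied by cells
    undiff = np.count_nonzero(label_map == 1)
    deviant = np.count_nonzero(label_map == 2)
    deviant_ratio = deviant / undiff if undiff else 0.0
    return {"cell_area": cell_area, "confluency": confluency,
            "deviant_ratio": deviant_ratio}
```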
  • The index value calculation unit 26 can also calculate various index values related to the size and shape of the cell region other than its area. Specifically, since the extracted cell region has the shape of an individual cell or of a colony, which is a mass of a plurality of cells, characteristic index values reflecting size and shape, such as the length in the major-axis direction, the length in the minor-axis direction, the perimeter, and the roundness, can be calculated for each region. Thereby, for each well, statistics such as the average and dispersion of the perimeter of the cell regions can be calculated.
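One common definition of roundness (circularity) is 4πA/P², which equals 1 for a perfect circle; the specification does not state which formula the device uses, so the following is only a sketch of that standard definition:

```python
import math

def roundness(area, perimeter):
    # Circularity: 4*pi*A / P^2 — 1.0 for a perfect circle, smaller for
    # elongated or irregular cell/colony outlines.
    return 4.0 * math.pi * area / perimeter ** 2
```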
  • It is also possible to calculate numerical values indicating image features not directly related to the size and shape of the cells, such as the brightness values on the image in the cell region or characteristic region, the gradient of the brightness values, and differential values indicating spatial changes of the brightness values, and to calculate feature amounts of the cell (or colony) region or the characteristic region based on these. Further, a new index value may be calculated by combining different index values, such as the above-mentioned area values.
  • FIG. 10 is an example of displaying a graph showing the temporal change of the cell density (ratio of the cell area to the internal area of the well) for each well.
  • a culture container selection instruction field 121 for selecting the culture plates and wells to be displayed, and a time-lapse graph 122, are arranged.
  • In the culture vessel selection instruction field 121, the user selects the culture plates and wells to be displayed on the time-lapse graph 122 and puts check marks on them. Further, the index value used for the displayed graph is selected in the display graph selection field 123. In the example of FIG. 10, the cell density (Confluency) is selected.
  • The display processing unit 28 reads out, from the calculation result storage unit 27, the calculation results of the specified index value associated with the specified plate name, assigns the cell culture period to the horizontal axis (or vertical axis) and the index value to the vertical axis (or horizontal axis), plots the values at each measurement time point for each designated well to create a graph, and displays it as the time-lapse graph 122 in the analysis result display screen 120.
  • the lines on the graph 122 corresponding to each well are displayed in different colors.
  • When the user changes the selected index value, the display processing unit 28 creates a new graph according to the change operation and updates the displayed graph. Thereby, the changes over time of various index values can be confirmed in order.
  • The time-lapse graph 122 can show not only the index values for a plurality of wells on one culture plate but also the change over time of the index values for wells on another culture plate. Thereby, for example, the difference from a reference culture plate or well can be evaluated easily.
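The per-well series behind such a time-lapse graph might be assembled as in the sketch below; the record layout (`plate`, `well`, `elapsed_hours` keys) is a hypothetical stand-in for the contents of the calculation result storage unit:

```python
def build_time_course(records, plate, wells, index_name):
    """Assemble the per-well series plotted on the time-lapse graph.

    records: stored calculation results, each a dict with keys
             'plate', 'well', 'elapsed_hours', and index values (assumed layout)
    Returns {well: [(elapsed_hours, value), ...]} sorted by time.
    """
    series = {w: [] for w in wells}
    for rec in records:
        if rec["plate"] == plate and rec["well"] in series:
            series[rec["well"]].append((rec["elapsed_hours"], rec[index_name]))
    for w in series:
        series[w].sort()  # chronological order for plotting
    return series
```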
  • When the user indicates a plot point, the display processing unit 28 identifies the IHM phase image associated with that plot point, that is, with the corresponding measurement time point, plate, and well. Then, the data constituting the IHM phase image is read from the reconstructed image data storage unit 24 to form an image, and the image is displayed on the screen of the display unit 4. The image may be displayed on a part of the analysis result display screen 120, or in a newly opened window. The displayed image allows the user to confirm and examine the state of the cells in any well at any measurement time point.
  • A switching button 124 for selecting whether or not to superimpose the segmentation image, which is the region extraction result, on the IHM phase image is also arranged. Depending on the selection made with the switching button 124, it is possible to switch between displaying only the IHM phase image and displaying a composite image in which a translucent segmentation image is superimposed on the IHM image. By making such a selection on the analysis result display screen 120 and then selecting an arbitrary plot point on the time-lapse graph 122, the user can display the target cell observation image efficiently.
  • When the user clicks an output instruction button 125 such as a PC output button, the time-lapse graph 122 confirmed on the analysis result display screen 120 is output as either a CSV file (text file) or a screen capture saved as a screen file (for example, a JPEG file).
  • As described above, in the cell analyzer of this embodiment, the type of classifier used for region extraction and the type of feature amount calculation can be specified for each well.
  • Thus, a different type of classifier can be used for each well, so that even when various conditions such as cell type, culture conditions (medium type, temperature, time, type of differentiation inducer, etc.), and cell density differ from well to well, cell regions and the like can be satisfactorily extracted from the IHM image of each well.
  • Index values for grasping the state of cell culture, such as the area occupied by the cell region, can therefore be accurately calculated for each well, so that, for example, the suitability of the culture conditions and the timing of culture passage can be determined appropriately and easily.
  • FIG. 11 is a diagram showing a classifier selection screen 130 in the cell analyzer of another embodiment.
  • The classifier selection screen 130 corresponds to one well; for example, when a well for which a classifier is to be specified is selected, the classifier selection screen 130 corresponding to that well is displayed on the display unit 4.
  • In the embodiment described above, the user directly selects the type of classifier; in the cell analyzer of this embodiment, by contrast, the user can specify either automatic or manual selection of the classifier.
  • The classifier selection screen 130 shown in FIG. 11 is provided with a check box 131 for designating automatic selection of the classifier and a check box 132 for designating manual selection; the user clicks one of the boxes to put a check mark in it.
  • When automatic selection of the classifier is specified, the user inputs the cell type and the culture time as the classifier selection conditions in the condition specification field 133.
  • Then, the classifier selection unit 291 selects the classifier that best suits these selection conditions from the plurality of classifiers 251 registered in advance, and displays it in the classifier display field 134.
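A possible matching rule for this automatic selection is sketched below; the `Classifier` fields and the nearest-culture-time criterion are illustrative assumptions, not the device's actual rule:

```python
from dataclasses import dataclass

@dataclass
class Classifier:
    name: str
    cell_type: str
    culture_hours: int  # culture time of the teacher-data images

def auto_select(registered, cell_type, culture_hours):
    """Pick the registered classifier that best matches the entered conditions:
    same cell type, then closest culture time (assumed rule)."""
    candidates = [c for c in registered if c.cell_type == cell_type]
    if not candidates:
        return None
    return min(candidates, key=lambda c: abs(c.culture_hours - culture_hours))
```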
  • When the user wants to select the classifier directly, he or she checks the check box 132 for specifying manual selection of the classifier, and then selects the desired classifier from a drop-down menu showing a list of registered classifier names.
  • the target area for executing the area extraction process in the well can be specified by the diameter from the center of the well.
  • For example, when a diameter of 30 mm is specified, the region extraction process using the designated discriminator is performed only on the circular region with a diameter of 30 mm centered on the well center, and not on the region outside that circle.
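Limiting the region extraction to a circle of the specified diameter can be sketched with a boolean mask; the pixel-coordinate convention and the helper name are illustrative:

```python
import numpy as np

def circular_mask(shape, center, diameter_px):
    """Boolean mask that is True inside a circle of the given diameter (in
    pixels) centered on the well center, given as (row, col) coordinates."""
    yy, xx = np.ogrid[:shape[0], :shape[1]]
    r = diameter_px / 2.0
    return (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= r ** 2
```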
  • the type of the classifier used for the region extraction processing and the size of the region to be the target region of the region extraction processing can be specified as a set for each well. After such specification, when the user clicks the "parameter setting" button 136, the discriminator and analysis range specified at that time are determined and registered for the well.
  • FIG. 12 is a diagram showing a classifier selection screen 140 in the cell analysis device of still another embodiment.
  • the same well-corresponding classifier list 141 as the well-corresponding classifier list 111 in FIG. 6 is arranged on the left side thereof.
  • a segmentation image showing the region extraction result for a part of the IHM images is displayed as a preview image 147.
  • The user selects the type of classifier for a certain well number in the well-compatible classifier list 141, designates that well by clicking the well number, and then clicks the "preview selected well number" button 146. In response, the specific region extraction unit 25 selects, as a partial observation image, the IHM image corresponding to one imaging unit located at the center of the well from among the IHM images corresponding to the well with the designated number. The region extraction process using the designated discriminator is then executed on the partial observation image, and a segmentation image for that one imaging unit is acquired. The display processing unit 28 displays this segmentation image as the preview image 147 in the classifier selection screen 140. Since the region extraction process is executed only on the IHM phase image of one imaging unit rather than on the entire IHM phase image, it can be completed in a short time and the segmentation image obtained quickly.
  • Instead of the specific region extraction unit 25 selecting the IHM image corresponding to one imaging unit located at the center of the well, the IHM image corresponding to the imaging unit at a position specified by the user may be selected from among the IHM images corresponding to the well.
  • The user confirms the displayed preview image 147 and determines whether or not the selected classifier is appropriate, changing the classifier selection as necessary and confirming the preview image again. Since the user can confirm the segmentation image resulting from the region extraction process with the selected classifier, an appropriate classifier can be selected easily and reliably. When classifiers have been selected for all the wells and the "Analysis execution" button 145 is clicked, the specific region extraction unit 25 executes, for each well, the region extraction process based on the IHM image of the entire well and acquires the segmentation images.
  • an image in which the segmentation image and the IHM image are superimposed may be displayed.
  • FIG. 13 is a diagram showing a classifier selection screen 150 in the cell analyzer of still another embodiment.
  • the contents of the classifier selection screen 130 shown in FIG. 11 are arranged on the left side of the classifier selection screen 150 shown in FIG. 13.
  • a partial IHM phase image 158 and a segmentation image 159 showing the region extraction result for the IHM image are displayed side by side as a preview image.
  • the user selects the classifier corresponding to one well by automatic selection or manual selection as described above, and then clicks the "preview” button 157.
  • the specific region extraction unit 25 selects the IHM image corresponding to one imaging unit located at the center of the well from the IHM images corresponding to the well at that time as the partial observation image.
  • the region extraction process using the designated discriminator is executed for the partially observed image, and the segmentation image in one imaging unit is acquired.
  • the display processing unit 28 displays the segmentation image 159 and the partial IHM image 158, which is the original partial observation image, side by side in the classifier selection screen 150.
  • The user compares the two displayed images 158 and 159 to determine whether the selected classifier is appropriate, changing the classifier selection as necessary and confirming the preview image again. Since the user can confirm the segmentation image resulting from the region extraction process with the selected classifier, an appropriate classifier can be selected easily and reliably.
  • Of course, the IHM phase image and the segmentation image may be displayed superimposed on each other rather than side by side.
  • the cell analyzer of the above embodiment can be modified in various ways as described below.
  • <Modification example 1> In the cell analyzer of the above embodiment, the specific region extraction unit 25 executes the process of extracting the cell region and the characteristic region only for the analysis target region. At that time, the region extraction process may be performed using only the data of the range included in the analysis target region in the IHM phase image, but the following alternatives are also possible.
  • Mask processing is performed on the part of the IHM phase image outside the analysis target region, and the data in the masked part is treated as invalid data or zero data. In this way, even if the region extraction process is executed including the region outside the analysis target region, the cell region and the characteristic region are substantially extracted only from the analysis target region.
  • Alternatively, the region extraction process may be performed even outside the analysis target region in the IHM phase image (for example, on the entire imaging target region), and mask processing is then applied to the part of the extraction result outside the analysis target region, or only the data of the analysis target region is used. This also makes it possible to obtain extraction results for the cell region and the characteristic region substantially only for the analysis target region.
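The two alternatives, masking the input image before extraction versus masking the extraction result afterwards, can be sketched as follows; `extract` stands in for any segmentation method, and the function name is illustrative:

```python
import numpy as np

def restrict_to_analysis_region(phase_image, analysis_mask, extract):
    """Two ways to limit extraction to the analysis region.

    extract:       segmentation function, image -> boolean cell mask
    analysis_mask: True inside the analysis target region
    """
    # (a) mask the input: pixels outside the region become zero data
    masked_input = np.where(analysis_mask, phase_image, 0.0)
    result_a = extract(masked_input) & analysis_mask
    # (b) extract everywhere, then mask the extraction result
    result_b = extract(phase_image) & analysis_mask
    return result_a, result_b
```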
  • In the above embodiment, a discriminator created by deep learning, which is one method of machine learning, is used; however, a discriminator based on another machine learning method, such as a support vector machine or a random forest, may be used instead, and multivariate analysis methods such as discriminant analysis or regression analysis may also be used. Instead of a classifier created by machine learning, a method of extracting a characteristic region using the brightness values of the IHM phase image, a texture image, or the like can also be employed. In any case, any method may be used as long as it can detect and identify a characteristic region on the image from the image information.
  • The cell analyzer of the above embodiment is configured so that a discriminator is selected for each well (that is, for each culture container); however, when the user specifies the discriminator after selecting culture conditions and/or a cell type, the region extraction process using the specified discriminator may be performed on the one or more wells associated with those culture conditions and/or that cell type. That is, instead of selecting a classifier for each well, a classifier may be selected for each culture condition and/or cell type.
  • In that case, a barcode storing, or associated with, the type of cells cultured in each well of one culture plate and their culture conditions is attached to the culture plate in advance. By reading the barcode when the analysis is executed, the type of cells cultured in each well and their culture conditions are grasped, and which discriminator to assign to which well can then be decided based on the culture conditions set by the user.
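One way to turn the barcode contents into a per-well discriminator assignment is sketched below; the payload layout and the rule table are hypothetical, for illustration only:

```python
def assign_classifiers(barcode_payload, rules):
    """Map each well to a discriminator name.

    barcode_payload: {well_id: {"cell_type": ..., "conditions": ...}}
                     as read (or looked up) from the culture plate's barcode
    rules: {(cell_type, conditions): classifier_name} set by the user
    """
    return {well: rules[(info["cell_type"], info["conditions"])]
            for well, info in barcode_payload.items()}
```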
  • <Modification 5> In the cell analyzer of the above embodiment, it is assumed that cells are cultured using a culture plate in which a plurality of wells is formed. However, there may be no wells, as in a culture flask, or even when there are wells, the user may want to analyze only a small region inside a well. In such cases, the user may specify an analysis range of a rectangular, circular, or other shape, and a discriminator suited to that analysis range may be selected. It is then preferable to allow a plurality of analysis ranges to be set in one large culture vessel, such as a culture flask, and a different discriminator to be selected for each analysis range. As a result, for example, in a large culture vessel, a high-cell-density portion and a low-cell-density portion can each be accurately extracted using different discriminators.
  • The first aspect of the present invention is a cell analyzer for analyzing cells cultured in a culture vessel, comprising a plurality of discriminators of different types created by machine learning using different teacher data in order to extract a specific region where cells exist from an observation image of the cells,
  • an image acquisition unit that acquires observation images of the cells over a predetermined imaging target region including a plurality of culture containers,
  • a classifier selection unit that independently selects at least one of the plurality of classifiers for each of the plurality of culture vessels.
  • and a specific region extraction unit that extracts a specific region from the whole or a part of the observation image obtained by the image acquisition unit, using the discriminator selected for each culture vessel by the discriminator selection unit.
  • The machine learning method is not particularly limited as long as it is supervised machine learning, and various well-known methods such as deep learning using a multi-layer neural network and support vector machines can be used. Further, multivariate analysis methods that can be included in machine learning in a broad sense, such as discriminant analysis and regression analysis, may be used.
  • the plurality of culture vessels are, for example, a plurality of wells formed on one culture plate.
  • In the first aspect, a plurality of classifiers of different types is prepared in advance, and the classifier selection unit independently selects at least one of them for each of the plurality of culture vessels, for example, wells.
  • Typically, one discriminator is selected for each culture vessel, but a plurality of discriminators can also be selected for one culture vessel; in that case, the final extraction result of a specific region may be obtained by combining the region extraction results of the plurality of discriminators, or the region extraction results of the individual discriminators may be output in a comparable manner.
  • Even when the plurality of classifiers perform a discriminating process that extracts specific regions of the same type, for example cell regions, they can be created by learning teacher data that differ in cell type, culture conditions, cell density, or position in the culture vessel. It is also possible to have them perform identification processes that extract specific regions of different types, such as undifferentiated cell regions, undifferentiated deviant cell regions, and regions in which foreign substances other than cells are present.
  • an appropriate discriminator is selected according to the type of cells cultured in the well, the culture conditions, and the like.
  • The image acquisition unit can include a holographic microscope that acquires hologram data over the imaging target region, and an image reconstruction unit that, by a predetermined arithmetic process using the hologram data, creates as an observation image of the cells an image showing the spatial distribution of phase information, intensity information, or both.
  • With this configuration, the measurement can be completed efficiently in a short time even when the imaging target region is wide, for example, an entire culture plate in which a large number of wells is formed or an entire large-capacity flask. Therefore, the time during which the culture vessel containing the cells is taken out of the incubator can be shortened, reducing the influence on the cell culture, and the throughput of the cell culture work can be improved.
  • the plurality of discriminators were created by machine learning using teacher data of a plurality of different conditions for at least one item of cell type, culture condition, cell density, or specific region type.
  • and a selection condition setting unit is further provided for the user to input, for each culture vessel, conditions for at least one item of cell type, culture condition, cell density, or specific region type,
  • the classifier selection unit may select at least one of the plurality of classifiers for each culture vessel based on the conditions set by the selection condition setting unit.
  • the region identification process using the classifier most suitable for the conditions is executed.
  • a specific region such as a cell region can be accurately extracted.
  • the plurality of discriminators were created by machine learning using teacher data of a plurality of different conditions for at least one item of cell type, culture condition, cell density, or specific region type.
  • the classifier selection unit displays on the display unit a display screen for the user to select one of the plurality of classifiers for each culture container and selects a classifier according to an instruction on that screen; in addition, reference information for the user to select the classifier is displayed on the display screen.
  • the reference information can be, for example, comment information that the user can freely describe in advance, such as the cell type and culture conditions suitable for using the discriminator. According to this fourth aspect, when the user selects the classifier for each culture vessel, the reference information can be used to select an appropriate classifier.
  • Alternatively, the discriminator selection unit displays on the display unit a display screen for the user to select one of the plurality of discriminators for each culture container and selects a discriminator according to an instruction on that screen, and the display screen can include at least a part of the observation image of the cells corresponding to the culture vessel for which the discriminator is being selected.
  • According to this configuration, the user can select an appropriate classifier for each culture vessel according to the shape, size, density, and the like of the cells while checking the observation image of the cells being cultured in each culture vessel.
  • the classifier selection unit displays a display screen on the display unit for the user to select one of the plurality of classifiers for each culture container, and selects the classifier according to an instruction on the display screen.
  • the display screen is provided with an operator for creating a preview image.
  • the specific region extraction unit executes, for a predetermined culture container, the process of extracting a specific region using the selected classifier on a part of the observation image corresponding to that culture container within the observation image obtained by the image acquisition unit, and the discriminator selection unit may display a preview image showing the extraction result of the specific region within the display screen or on a screen different from the display screen.
  • the observation image of the cell over the predetermined imaging target region is created by joining a plurality of different partial observation images.
  • the preview image may show the extraction result of a specific region for one of the plurality of partially observed images.
  • According to this configuration, the user can immediately determine, by checking the preview image, whether or not the selected discriminator can appropriately extract the cell region or the like. Thereby, a classifier better suited to the region extraction can be selected easily and reliably. Further, by limiting the preview image to a part of the observation image, the preview image can be created and displayed quickly, improving the efficiency of the analysis work.
  • The preview image can be an image obtained by superimposing a segmentation image showing the extraction result of the specific region on the observation image of the same range.
  • the preview image is a segmentation image showing an extraction result of a specific region, and the segmentation image and an observation image in the same range thereof can be displayed side by side.
  • According to the eighth and ninth aspects, the observation image of the cells can easily be compared with the segmentation image that is the region extraction result for that image, so whether or not the selected discriminator is appropriate can be judged easily and accurately. As a result, the classifier can be selected correctly.
  • The image acquisition unit may include a holographic microscope that acquires hologram data over the entire imaging target region by repeatedly performing imaging in units of a predetermined imaging range while moving the imaging position, and an image reconstruction unit that creates an observation image of the cells; in that case, the preview image shows the extraction result of a specific region for one of the plurality of partial images corresponding to the imaging units.
  • According to this configuration, the user can immediately determine, by checking the preview image, whether or not the selected discriminator can appropriately extract the cell region or the like. Thereby, a classifier better suited to the region extraction can be selected easily and reliably. Further, by limiting the preview image to one partial image corresponding to one imaging unit, the preview image can be created and displayed quickly, improving the efficiency of the analysis work.
  • The specific region in which cells are present can be a region in which any one of undifferentiated cells, undifferentiated deviant cells, and dead cells is present, or a region in which a plurality of them is present.
  • The twelfth aspect of the present invention is a cell analyzer for analyzing cells cultured in a culture vessel, comprising a plurality of discriminators of different types created by machine learning using different teacher data in order to extract a specific region where cells exist from an observation image of the cells,
  • an image acquisition unit that acquires observation images of the cells over a predetermined imaging target region including a plurality of culture containers,
  • a classifier selection unit that selects at least one of the plurality of classifiers for each type of cultured cells and/or for each culture condition, and a specific region extraction unit that extracts a specific region from the whole or a part of the observation image obtained by the image acquisition unit, using the classifier selected by the classifier selection unit according to the type of cells and/or the culture conditions in the culture vessel shown in that image.
  • According to this twelfth aspect, even if the user does not know the correspondence between the wells in the culture plate and the culture conditions, a specific region, such as the region occupied by cells whose size and shape differ depending on the cell type, culture conditions, and the like, can be accurately extracted.
  • the thirteenth aspect of the present invention is a cell analyzer for analyzing cells cultured in a culture vessel.
  • an image acquisition unit that acquires observation images of the cells over a predetermined imaging target region including one or a plurality of culture containers,
  • a discriminator selection unit that independently selects at least one of the plurality of discriminators for each of the plurality of analysis ranges set by the user in the one or the plurality of culture vessels.
  • and a specific region extraction unit that extracts a specific region from the whole or a part of the observation image obtained by the image acquisition unit, using the discriminator selected for each analysis range by the discriminator selection unit.
  • According to the cell analyzer of the thirteenth aspect of the present invention, even when the shape and density of the cells differ among a plurality of sites in one culture vessel, a specific region such as the cell region at each site can be accurately extracted.
  • The cell analyzer can also be used to extract a region in which a foreign substance such as dust is present, instead of a cell region.
  • In that case, the cell analyzer is a cell analyzer for analyzing cells cultured in a culture vessel, comprising: a plurality of discriminators of mutually different types, created by machine learning with different teacher data in order to extract, from an observation image of cells, a specific region in which a foreign substance is present; and an image acquisition unit that acquires observation images of cells over a predetermined imaging target area including a plurality of culture containers;
  • a classifier selection unit that independently selects at least one of the plurality of classifiers for each of the plurality of culture vessels; and
  • a specific region extraction unit that extracts a specific region from the whole or a part of the observation image obtained by the image acquisition unit, using the discriminator selected for each culture vessel by the discriminator selection unit.
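The select-then-extract pipeline described in these aspects can be illustrated with a short sketch. This is a hypothetical illustration only, not the patented implementation: the well-to-condition mapping, the `select_classifier` and `extract_cell_regions` names, and the threshold-based stand-in "classifiers" are all assumptions made for the example.

```python
# Hypothetical sketch of per-well classifier selection (not the patented
# implementation). Each well of a culture plate is mapped to a culture
# condition, and a segmenter trained on matching teacher data is chosen
# before extracting the cell region from that well's image.

def select_classifier(condition, classifiers):
    """Pick the classifier associated with this culture condition, with a fallback."""
    return classifiers.get(condition, classifiers["default"])

def extract_cell_regions(well_images, well_conditions, classifiers):
    """Run the selected classifier on each well image; return a binary mask per well."""
    masks = {}
    for well_id, image in well_images.items():
        clf = select_classifier(well_conditions.get(well_id, "default"), classifiers)
        masks[well_id] = clf(image)  # classifier returns a binary cell mask
    return masks

# Toy stand-ins for trained models: simple intensity-threshold "segmenters".
classifiers = {
    "sparse":  lambda img: [[1 if px > 50 else 0 for px in row] for row in img],
    "dense":   lambda img: [[1 if px > 120 else 0 for px in row] for row in img],
    "default": lambda img: [[1 if px > 80 else 0 for px in row] for row in img],
}

well_images = {"A1": [[40, 60], [130, 200]], "A2": [[40, 60], [130, 200]]}
well_conditions = {"A1": "sparse", "A2": "dense"}

masks = extract_cell_regions(well_images, well_conditions, classifiers)
print(masks["A1"])  # sparse classifier uses the lower threshold
print(masks["A2"])  # dense classifier uses the higher threshold
```

The same dispatch structure applies whether the key is a culture condition (twelfth aspect), a user-set analysis range (thirteenth aspect), or a per-vessel choice (foreign-substance variant); only the lookup key changes.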

Landscapes

  • Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Urology & Nephrology (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Hematology (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Food Science & Technology (AREA)
  • Medicinal Chemistry (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Apparatus Associated With Microorganisms And Enzymes (AREA)
  • Microscopes, Condensers (AREA)
  • Investigating Or Analysing Biological Materials (AREA)

Abstract

According to one embodiment of the present invention, a cell analyzer comprises: a plurality of classifiers (251) of mutually different types, generated by machine learning with different teacher data for extracting, from an observation image of cells, a specific region in which cells or characteristic cells are present; image acquisition units (1, 20-23) for acquiring an observation image of cells over a predetermined target area including a plurality of culture vessels; a classifier selection unit (291) for selecting at least one classifier from the plurality of classifiers independently for each of the plurality of culture vessels; and a specific region extraction unit (25) for extracting a specific region from the whole or a part of the observation image acquired by the image acquisition units, using the classifiers selected by the classifier selection unit for the respective culture vessels. Using a classifier suited to the cell type, culture conditions, cell density, and the like for each culture vessel enables accurate extraction of a cell region and, as a result, accurate determination of the cell area and the like.
PCT/JP2019/011855 2019-03-20 2019-03-20 Analyseur de cellule WO2020188814A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2019/011855 WO2020188814A1 (fr) 2019-03-20 2019-03-20 Analyseur de cellule
JP2021506110A JP7006833B2 (ja) 2019-03-20 2019-03-20 細胞解析装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/011855 WO2020188814A1 (fr) 2019-03-20 2019-03-20 Analyseur de cellule

Publications (1)

Publication Number Publication Date
WO2020188814A1 true WO2020188814A1 (fr) 2020-09-24

Family

ID=72520806

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/011855 WO2020188814A1 (fr) 2019-03-20 2019-03-20 Analyseur de cellule

Country Status (2)

Country Link
JP (1) JP7006833B2 (fr)
WO (1) WO2020188814A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023209853A1 (fr) * 2022-04-27 2023-11-02 ローツェ株式会社 Procédé pour déterminer le moment de la sous-culture, système pour déterminer le moment de la sous-culture et système de culture cellulaire

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013137627A (ja) * 2011-12-28 2013-07-11 Olympus Corp 細胞輪郭線形成装置及びその方法、細胞輪郭線形成プログラム
JP2014022837A (ja) * 2012-07-13 2014-02-03 Nippon Hoso Kyokai <Nhk> 学習装置、及びプログラム
WO2018101004A1 (fr) * 2016-12-01 2018-06-07 富士フイルム株式会社 Système d'évaluation d'image de cellule et programme de commande d'évaluation d'image de cellule
WO2018105432A1 (fr) * 2016-12-06 2018-06-14 富士フイルム株式会社 Dispositif d'évaluation d'image de cellules et programme de commande d'évaluation d'image de cellules
WO2018158901A1 (fr) * 2017-03-02 2018-09-07 株式会社島津製作所 Dispositif et système d'analyse de cellules
WO2018180206A1 (fr) * 2017-03-30 2018-10-04 富士フイルム株式会社 Dispositif, procédé et programme d'évaluation d'images cellulaires
WO2018235251A1 (fr) * 2017-06-23 2018-12-27 株式会社ニコン Dispositif d'analyse, programme d'analyse et procédé d'analyse
WO2019016169A1 (fr) * 2017-07-16 2019-01-24 Imec Vzw Reconnaissance de cellules
WO2019039035A1 (fr) * 2017-08-25 2019-02-28 富士フイルム株式会社 Dispositif, procédé et programme d'apprentissage d'un dispositif de détermination et dispositif de détermination


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
KATAFUCHI, SAYA; YOSHIMURA, MOTOHIDE: "Cell Contour Extraction Using CNN", IEE JAPAN, 27 March 2017 (2017-03-27), pages 67 - 71 *
SASAKI, KEI ET AL.: "Segmentation and feature extraction of cell images with phase-contrast microscope", IEICE TECHNICAL REPORT, vol. 111, no. 389, 2012, pages 343 - 348 *
SATO, KAZUKI ET AL.: "A machine Learning Algorithm for the Automatic and Non-invasive Quality Assessment of Confluent Cells", IEICE TECHNICAL REPORT, vol. 116, no. 393, 2017, pages 101 - 106, XP009522563 *
SATO, KAZUKI ET AL.: "An Image-based Classification Algorithm for the Non-invasive Quality Evaluation of Confluent Cells", IEICE TECHNICAL REPORT, vol. 115, no. 401, 2016, pages 277 - 282 *
TANAKA, KOJIRO ET AL.: "Detection of Differentiated vs. Undifferentiated Colonies of iPS Cells Using CNN for Regression on Class Probability", IEICE TECHNICAL REPORT, vol. 117, no. 442, 2018, pages 109 - 114 *


Also Published As

Publication number Publication date
JPWO2020188814A1 (ja) 2021-12-02
JP7006833B2 (ja) 2022-01-24

Similar Documents

Publication Publication Date Title
JP4558047B2 (ja) 顕微鏡システム、画像生成方法、及びプログラム
JP6981533B2 (ja) 細胞画像解析装置、細胞画像解析システム、学習データの生成方法、学習モデルの生成方法、学習データの生成プログラム、および、学習データの製造方法
JP6348504B2 (ja) 生体試料の分割画面表示及びその記録を取り込むためのシステム及び方法
Racine et al. Visualization and quantification of vesicle trafficking on a three‐dimensional cytoskeleton network in living cells
JP5783043B2 (ja) 細胞塊の状態判別手法、この手法を用いた画像処理プログラム及び画像処理装置、並びに細胞塊の製造方法
JP7428173B2 (ja) 細胞観察装置及び細胞観察方法
JP6949875B2 (ja) 試料中に存在する粒子を取得するためのデバイスおよび方法
JP7006832B2 (ja) 細胞解析装置
CN103885168B (zh) 用于显微镜装置的自校准的方法
CN111837157A (zh) 细胞图像解析方法、细胞图像解析装置、及学习模型创建方法
KR20150095646A (ko) 바이오마커 발현의 선택 및 디스플레이
JP7439165B2 (ja) 敵対的生成ネットワークを用いたホログラフィック顕微鏡画像中の細胞の仮想染色
JPWO2010146802A1 (ja) 細胞塊の状態判別方法、この方法を用いた画像処理プログラム及び画像処理装置、並びに細胞塊の製造方法
JPH09509487A (ja) 細胞試料自動分類装置及び方法
CN114127307A (zh) 细胞图像解析方法和细胞图像解析装置
JPWO2019171453A1 (ja) 細胞画像解析方法、細胞画像解析装置、及び学習モデル作成方法
JP7006833B2 (ja) 細胞解析装置
JPWO2019180810A1 (ja) 細胞観察装置
SK500702016A3 (sk) Spôsob interaktívnej kvantifikácie digitalizovaných 3D objektov pomocou kamery snímajúcej pohľad
JP6947288B2 (ja) 細胞観察装置及び細胞観察用プログラム
WO2019043953A1 (fr) Dispositif d&#39;observation de cellules
WO2020261607A1 (fr) Procédé d&#39;évaluation de forme de cellule tridimensionnelle
JP7036192B2 (ja) 細胞観察装置
US20230129540A1 (en) Cell analyzer device
WO2023195405A1 (fr) Dispositif de détection de cellule, dispositif de support de diagnostic de cellule, procédé de détection de cellule et programme de détection de cellule

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19920098

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021506110

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19920098

Country of ref document: EP

Kind code of ref document: A1