WO2014203322A1 - Cell image processing device, cell image recognition device, and cell image recognition method - Google Patents

Cell image processing device, cell image recognition device, and cell image recognition method

Info

Publication number
WO2014203322A1
WO2014203322A1 (PCT/JP2013/066672, JP2013066672W)
Authority
WO
WIPO (PCT)
Prior art keywords
cell
image
contour
background
noise component
Prior art date
Application number
PCT/JP2013/066672
Other languages
English (en)
Japanese (ja)
Inventor
利昇 三好
亮太 中嶌
広斌 周
豊茂 小林
Original Assignee
Hitachi, Ltd. (株式会社日立製作所)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi, Ltd. (株式会社日立製作所)
Priority to PCT/JP2013/066672 priority Critical patent/WO2014203322A1/fr
Priority to JP2015522397A priority patent/JP6138935B2/ja
Publication of WO2014203322A1 publication Critical patent/WO2014203322A1/fr

Classifications

    • CCHEMISTRY; METALLURGY
    • C12BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
    • C12MAPPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
    • C12M41/00Means for regulation, monitoring, measurement or control, e.g. flow regulation
    • C12M41/30Means for regulation, monitoring, measurement or control, e.g. flow regulation of concentration
    • C12M41/36Means for regulation, monitoring, measurement or control, e.g. flow regulation of concentration of biomass, e.g. colony counters or by turbidity measurements

Definitions

  • The present invention relates to a cell image processing technique.
  • The present invention proposes a technique that can automatically monitor the growth state of cells by applying image processing to microscopic images taken at each stage of cell culture.
  • The present application includes a plurality of means for solving the above-mentioned problems.
  • One example is a cell image processing apparatus comprising: a cell recognition unit that generates a background separation image and a cell contour image from an image of cells cultured in a cell culture device, removes the noise component contained in the background separation image by using the cell contour image, removes the noise component contained in the cell contour image by using the background separation image, and recognizes the region of each individual cell by using the background separation image and the cell contour image from which the noise components have been removed; a cell growth stage determination unit that determines the cell growth stage based on the recognition result; an abnormality determination unit that determines the presence or absence of an abnormality based on the number of cells and the cell positions in the first stage of the cell growth stage, and based on the cell occupancy rate in a later stage; an abnormal state processing unit that issues a control operation or an alarm when an abnormality is found; and a completion determination unit that determines whether the cell culture by the cell culture device is completed based on the cell occupancy rate and/or outputs a quality evaluation value based on the cell occupancy rate.
  • According to the present invention, it is possible to automatically and quantitatively monitor the growth state of cells in the cell culture device. This reduces costs (labor costs), dependence on the skill level of engineers, and the risk of biological contamination compared with checking cell growth manually at regular intervals. Problems, configurations, and effects other than those described above will become apparent from the following description of the embodiments.
  • A figure showing the main components of the cell culture device according to an embodiment; an overall view of the cell culture device according to an example.
  • FIG. 5 is a flowchart showing a specific example of the background separation processing 602.
  • A figure showing an example of the background separation image after applying the background isolated point removal processing to the image shown in the preceding figure; a figure showing an example of the background separation image A after applying the intracellular hole-filling processing to that image.
  • A figure showing an example of the cell contour image A; a figure in which the portions of the input image corresponding to the black pixels of the background separation image A are changed to black pixels.
  • A figure showing an example of the cell contour image C.
  • A figure explaining the image of the cell segmentation processing; FIG. 12 is a flowchart illustrating another specific example of the background separation processing 602.
  • FIG. 1 shows the main components of the cell culture device 101 according to the present embodiment.
  • The cell culture device 101 shown in FIG. 1 includes an input device 102, a display device 103, an image acquisition device 104, a communication device 105, a computing device (CPU) 106, an external storage device 107, a culture vessel 108, and an environment control device 109.
  • The input device 102 is a keyboard, a mouse, or the like for inputting commands.
  • The input device 102 is used to input commands for controlling programs executed by the computing device (CPU) 106 and devices connected to the cell culture device 101.
  • The display device 103 is a device such as a display that shows processing contents as appropriate.
  • The image acquisition device 104 is a device such as a microscope or a camera.
  • The image acquisition device 104 is used to acquire magnified images of the cells in the culture vessel 108 through a microscope.
  • The acquired images may be stored in the external storage device or the like.
  • The communication device 105 is used to exchange data with an external device such as a PC or a server.
  • The communication device 105 is used for purposes such as receiving execution commands issued by a user from an external device and acquiring data such as images and text from an external device.
  • The communication device 105 is also used to transmit processing contents and images in the cell culture device 101 to an external device.
  • The computing device (CPU) 106 executes the cell image recognition processing and other processing.
  • The external storage device 107 is a storage device such as an HDD or a memory.
  • The external storage device 107 stores various data such as images captured by the image acquisition device 104, images being processed, and processing results.
  • The external storage device 107 is also used for temporary storage of data generated during processing executed by the computing device (CPU) 106.
  • The cell culture device 101 does not have to include the input device 102, the display device 103, the image acquisition device 104, and the communication device 105. For example, when the input device 102 is not provided, the start of processing is instructed from an external device through the communication device 105, or is triggered automatically by time designation or the like. If the display device 103 is not provided, the processing result is transmitted to an external device through the communication device 105 or stored in the external storage device 107.
  • The output and input of the modules that execute processing may be performed via the external storage device 107. That is, when a first module outputs a processing result to a second module and the second module receives that result as input, the first module may actually write the result to the external storage device 107, and the second module may then read the output of the first module from the external storage device 107.
  • The cell culture device 101 is also equipped with a mechanism unit and a control unit (not shown) for seeding cell seeds in the culture container and proliferating the cells to produce a cell sheet. Further, the cell culture device 101 according to the present embodiment is equipped with an image processing function (cell image processing function), described later, so that the growth state of the cells can be monitored automatically and quantitatively.
  • The cell culture device 101 periodically and automatically checks the cell growth status, and if the check finds a problem, performs a corresponding operation such as automatically adjusting the culture environment.
  • The cell culture device 101 is required to have a function of checking whether the cells are growing along a normal growth curve. In this case, a normal growth curve given in advance is used as the criterion.
  • The cell culture device 101 is also required to have a function of checking whether the holes (portions of the sheet where there are no cells) are small, in order to maintain the quality of the cell sheet.
  • The image processing function (cell image processing function) proposed in this embodiment therefore provides a function of counting the number of cells in the early and middle stages of cell amplification, and of recognizing the portions having no cells in the later stage.
  • The shapes of the cells appearing in the cell images to be processed are not limited to relatively simple shapes such as ellipses or lines; cells having complicated shapes and different sizes must also be handled.
  • The image processing function (cell image processing function) described later is therefore provided with a mechanism for accurately recognizing individual cells even in cell images that are generally difficult to recognize.
  • The automatic culture apparatus 210 is one example of the cell culture device 101 shown in FIG. 1. FIG. 2 is an overall view of the automatic culture apparatus 210, FIG. 3 is a front view, and FIG. 4 is a side view. In FIGS. 2 to 4, the same reference numerals denote the same parts.
  • A cell culture chamber door 211 and an intermediate chamber door 217 are attached to the front side of the casing of the automatic culture apparatus 210 so as to be openable and closable.
  • A cell culture chamber 213 is provided behind the cell culture chamber door 211, and an intermediate chamber 216 is disposed behind the intermediate chamber door 217.
  • These doors are provided with a stopper and a heat insulating mechanism for sealing the inside when they are closed. By closing the cell culture chamber door 211 and the intermediate chamber door 217, the airtightness and temperature inside the automatic culture apparatus 210 can be maintained.
  • A refrigerator 214 is arranged inside the intermediate chamber 216, and a refrigerator door 212 that can be opened and closed is attached to its front side.
  • The refrigerator door 212 is provided with the same mechanism as the doors described above.
  • A control unit 215 is disposed below the automatic culture apparatus 210. The control unit 215 is independent of the other compartments and is isolated from the temperature, humidity, and carbon dioxide environment of the cell culture chamber 213.
  • The cell culture chamber 213 houses a culture vessel 220, a culture vessel base 221 that holds the culture vessel, a rotation mechanism 222 that rotationally drives the culture vessel base 221, a pump 223, a valve 224, a tank 225, a drive base 227, a microscope 228, a stage 229, a connector board 230, a connector 231, a flow path 240, a tube 241, a syringe 243, and a syringe pump 244.
  • The culture vessel 220 corresponds to the culture vessel 108 (FIG. 1).
  • The microscope 228 corresponds to the image acquisition device 104 (FIG. 1) as an imaging device.
  • A medium base 232 is disposed in the refrigerator 214. Between the cell culture chamber 213 and the refrigerator 214, a seal 233 that seals both spaces is disposed. A claw portion 234 for installing the seal 233 is provided on the walls of the cell culture chamber 213 and the refrigerator 214.
  • The automatic culture apparatus 210 also includes a fan (for the intermediate chamber) 250, a filter (for the intermediate chamber) 251, a fan (for the control unit) 252, an intake filter 253, and an exhaust filter 254, which are used for adjusting temperature, humidity, and the like.
  • The air flow in the intermediate chamber 216 is indicated by an arrow 255, and the air flow in the control unit 215 is indicated by an arrow 256.
  • The microscope 228 may incorporate the computing device 106 (FIG. 1) and the external storage device 107 (FIG. 1).
  • The processing result of the computing device 106 may be transmitted from the cell culture device 101 to an external computer through the communication device 105 and displayed on the external computer.
  • Alternatively, an image captured by the microscope 228 may be transmitted through the communication device 105 to an external computer (a computer including the computing device 106, the external storage device 107, the input device 102, and the display device 103), and the image may be processed in the external computer.
  • The automatic culture apparatus 210 periodically images the cells in culture, recognizes the positions of the cells through image processing of the captured images, counts the number of cells, and calculates the cell occupancy rate.
  • The automatic culture apparatus 210 determines the growth stage of the cells in culture based on these processing results, changes the report contents and the abnormality determination method according to the growth stage, and, when an abnormality is found, outputs an alarm and executes environmental control.
  • The following processing is realized through execution of a program in the computing device 106.
  • First, the automatic culture apparatus 210 seeds cell seeds in the culture vessel 220 (501).
  • The automatic culture apparatus 210 then captures a magnified image of the cells in the culture vessel 220 using the microscope 228, and the computing device 106 recognizes the positions of the cells in the image (502).
  • The computing device 106 also counts the number of recognized cells, calculates the area occupied by the cells, and calculates the ratio occupied by the cells (occupancy rate).
  • This calculation process is programmed so that its first execution time is preset, and the process (502) is then started after a certain time from the first execution, or periodically according to the growth state of the cells.
  • The computing device 106 determines the cell growth stage based on the calculated number of cells, occupancy rate, elapsed time, and the like (503).
  • The determination criteria are as follows. For example, when the cell type and the seeding density are determined in advance, the growth stage can be determined based on the elapsed time. If this information is not available, it may be determined that the later growth stage has been entered when the cell occupancy rate exceeds a certain threshold. Alternatively, the occupancy rate and the number of cells may be recorded every hour, and when the increase in the number of cells or the rate of increase in the occupancy rate slows down (when the increase rate falls below a certain threshold value), it may be judged that the later growth stage has been entered. In other cases, the stage is determined to be the initial or middle stage.
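The stage decision described above can be sketched as a small function. The threshold values, the slowdown rule, and the helper name `determine_growth_stage` are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch of the growth-stage decision; thresholds are assumptions.
LATE_OCCUPANCY_THRESHOLD = 0.8   # occupancy above which growth is judged "late"
SLOWDOWN_THRESHOLD = 0.01        # per-interval occupancy increase below which growth has slowed

def determine_growth_stage(occupancy_history):
    """Return 'early', 'middle', or 'late' from a list of hourly occupancy rates."""
    current = occupancy_history[-1]
    if current >= LATE_OCCUPANCY_THRESHOLD:
        return "late"
    if len(occupancy_history) >= 2:
        increase = occupancy_history[-1] - occupancy_history[-2]
        # a well-grown culture whose increase rate has fallen below the
        # threshold is judged to have entered the later stage
        if current > 0.5 and increase < SLOWDOWN_THRESHOLD:
            return "late"
    # otherwise split early vs middle on the current occupancy (assumed cut)
    return "early" if current < 0.25 else "middle"
```

When the cell type and seeding density are known in advance, the elapsed time could replace these occupancy rules entirely.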
  • The computing device 106 records the number of cells counted up to the current time and the recognized cell positions as a determination history (504).
  • The computing device 106 refers to the determination history of the number of cells and the cell positions up to the current time, and determines whether there is an abnormality in the cell growth (506).
  • The computing device 106 evaluates the difference from a preset normal growth curve (number of cells), the difference in the shape of the growth curve, and the like, and determines that an abnormality has occurred when the difference exceeds a certain threshold value.
  • For this comparison, the scale of the actual growth curve up to the current time is adjusted so that it approximates the shape of the normal growth curve, and the curves are compared at the same time points based on the normal growth curve.
  • The computing device 106 can also track individual cells based on the recorded information of individual cell positions in the past. In this case, the computing device 106 can determine that a cell with little movement is dead. The number of dead cells is also a criterion for determining whether there is an abnormality.
  • The computing device 106 thus determines whether there is an abnormality in cell growth based on the difference from the normal growth curve as described above, the number of dead cells, and the like.
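The comparison against a preset normal growth curve might look like the following sketch. The point-by-point relative-deviation rule, the threshold, and the function name are assumptions for illustration; the patent also mentions comparing curve shapes, which is not shown here:

```python
def is_growth_abnormal(observed_counts, normal_counts, rel_threshold=0.3):
    """Judge growth abnormal when any observed cell count deviates from the
    preset normal growth curve by more than rel_threshold (relative)."""
    for obs, ref in zip(observed_counts, normal_counts):
        if ref > 0 and abs(obs - ref) / ref > rel_threshold:
            return True
    return False
```

The dead-cell criterion from cell tracking would be combined with this check in an OR fashion before raising an abnormality.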
  • When it is determined that the growth stage is late, the computing device 106 records the size, number, and other attributes of the cell non-occupied areas as a determination history (505). The computing device 106 then determines whether the quality is abnormal based on the size, number, and so on of the cell non-occupied areas up to the current time (506).
  • For example, when a connected component of the non-occupied area has an area of a certain size or more, when the non-occupancy rate is equal to or greater than a certain threshold value, or when the number of connected components of the non-occupied area is equal to or greater than a certain threshold, the computing device 106 determines that the quality is abnormal.
  • When an abnormality is found, the computing device 106 executes two operations. As one operation, it executes a control operation (environment control) that automatically adjusts the culture environment, such as temperature and humidity (507). The contents of the control operation are programmed in advance based on experience. As the other operation, the computing device 106 executes alarm notification processing (508). The alarm notification is executed when it is determined that manual control is necessary, for example when automatic environmental control is judged to be impossible. Automatic environmental control is judged impossible, for example, when the degree of abnormality is significant or when the history of past control results of process 507 shows that the situation cannot be improved. The alarm is notified to the worker through the display device or the communication device.
  • The computing device 106 then proceeds to one of the processes 509 and 510 depending on the growth stage.
  • When the growth stage is the initial or middle stage, the computing device 106 outputs the information useful at this stage, such as the number of cells, the number of dead cells, and the growth curve so far (509).
  • After a predetermined time has elapsed since the execution of process 509, the computing device 106 returns to process 502.
  • When the growth stage is late, the computing device 106 outputs the information useful at this stage, such as the occupancy rate and the quality evaluation values (510).
  • The quality evaluation values include the number of connected components in the non-occupied region, the size of the largest of those connected components, the non-occupancy rate, the history of the non-occupancy rate of each connected component, and the like.
  • The computing device 106 determines that the culture is completed when it determines, with reference to the past history, that the quality will not improve even if more time elapses (for example, when the rate at which the non-occupied area shrinks has fallen). When it is determined that the culture is completed, the computing device 106 notifies the completion of the culture through the display device, the communication device, and so on (512). Otherwise, the computing device 106 returns to process 502 a predetermined time after the completion determination.
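The completion decision based on the non-occupancy history can be sketched as follows. The target value, the minimum-improvement threshold, and the function name are illustrative assumptions:

```python
def culture_completed(unoccupied_history, target_unoccupied=0.05,
                      min_improvement=0.005):
    """Culture is judged complete when the non-occupancy rate is low enough,
    or when the past history shows it is no longer shrinking meaningfully."""
    current = unoccupied_history[-1]
    if current <= target_unoccupied:
        return True          # quality target reached
    if len(unoccupied_history) >= 2:
        improvement = unoccupied_history[-2] - unoccupied_history[-1]
        if improvement < min_improvement:
            return True      # quality will not improve with more time
    return False
```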
  • Cell image recognition processing: the flow of the cell image processing according to the present embodiment will be described with reference to FIG.
  • This cell image processing is executed by the computing device 106 in process 502 (FIG. 5).
  • The computing device 106 acquires, as the input image 601, a microscopic image of the cultured cells taken in the culture device.
  • The position of each cell is recognized from the microscope image, and the number of cells is counted.
  • The shapes of the cells handled by the cell culture device are not limited to relatively simple shapes such as ellipses or lines; cells having complicated shapes must also be handled.
  • Consequently, methods such as pattern matching against shapes assumed in advance cannot be used for cell recognition.
  • Moreover, the size of the cells to be handled is not constant but varies, and a plurality of cells may overlap or contact each other.
  • The microscopic image may also have poor color and brightness characteristics for separating the interior of a cell from the background, so that binarization based on color and brightness cannot always separate cell and background accurately.
  • To improve the generation accuracy of the background separation image and the cell contour image even in situations where it is difficult to separate the cell portion from the background portion as described above, a method is adopted in which the background separation processing and the cell contour extraction processing complement each other, each exploiting the other's strengths.
  • The background separation image may contain areas recognized as cell interior although they are actually outside the cells. Therefore, the cell image processing of the present embodiment adopts a method of removing this noise component of the background separation image by using the contours of the cell contour image to strip away components lying outside the cell contours.
  • The cell contour image, in turn, contains noise components in the background and noise components caused by the internal structure of the cells.
  • The background noise components are removed by deleting contours located in the background area of the background separation image, and the noise components inside the cells are removed by deleting connected components that are not in contact with the background of the background separation image.
  • Then, using the cell contour image from which the noise components have been removed, the cell clusters in the background separation image from which the noise components have been removed are separated into individual cells, the position of each cell is recognized, and the number of cells is counted.
  • The input image 601 is a microscopic image of the cells cultured in the culture vessel 220 of the automatic culture apparatus 210.
  • The image 701 shown in FIG. 7 represents the input image 601 converted to gray scale and further converted to a halftone (newspaper printing) format.
  • The actual input image is in color, and there is unevenness of light due to uneven illumination. Distinct color features separating the cell interior from the cell exterior are generally poor.
  • The boundary between the background and a cell has high brightness where light hits the convex part, but this covers only part of the boundary and does not surround the whole cell.
  • The intensity of the boundary brightness also varies with location. Further, the shapes of the cells are generally complicated, the shape differs from cell to cell, and cells contact each other. Calculating the cell occupancy rate alone is therefore not sufficient to count the number of cells accurately.
  • With the cell image processing of this embodiment, the positions of individual cells can be recognized accurately from the cell image even under such conditions that are disadvantageous for image processing. Moreover, once the position of each cell can be recognized, the division status and movement of each cell can be tracked. The cell image processing of the present embodiment can thereby obtain additional information for automatically determining the cell growth state.
  • The background separation processing 602 separates the cell interior areas from the background area in the input image 601 and generates the “background separated image A” 603.
  • The “background separated image A” 603 is an image in which the interior areas of the cells are expressed by white pixels and the background area is expressed by black pixels.
  • The interior area of a cell must be distinguished from the background area by some method, but background portions at a short distance from a cell may also be erroneously determined to be inside the cell.
  • Various factors, such as those caused by the imaging system and those caused by the image processing method, can produce such erroneous determinations.
  • In this processing, the input image 601 is separated into cell-interior and cell-exterior regions by extracting the irregularities in the image caused by the internal structure of the cells.
  • First, the computing device 106 converts the input image 601 into an HSV representation in the HSV conversion process 901.
  • The RGB representation expresses a color with the three components red (R), green (G), and blue (B); the HSV representation expresses it with hue (H), saturation (S), and lightness (V).
  • Next, the computing device 106 converts the image into a gray image based on the lightness.
  • The lightness (V) is given as a value from 0.0 to 1.0.
  • A gray image is generated by re-expressing this value as one of 256 gradation values from 0 to 255: when the lightness is v, the integer part of the value calculated by 255 * v is output. Note that “*” is the multiplication operator.
  • When the input image 601 is already a gray image, these processes are unnecessary.
  • Other graying methods may also be used, such as graying based on light intensity instead of lightness, or graying based on a weighted combination of color components.
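The graying rule above (integer part of 255 * v, with v the HSV lightness) can be reproduced with the Python standard library; `to_gray_256` is a hypothetical helper name:

```python
import colorsys

def to_gray_256(rgb_pixel):
    """Re-express the HSV lightness v (0.0-1.0) of one RGB pixel
    (0-255 integers) as a 0-255 gray level: integer part of 255 * v."""
    r, g, b = (c / 255.0 for c in rgb_pixel)
    _h, _s, v = colorsys.rgb_to_hsv(r, g, b)
    return int(255 * v)
```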
  • Next, the computing device 106 extracts the uneven portions caused by the internal structure of the cells from the image.
  • For this, a method of calculating the smoothness of the image can be used.
  • For example, a “Sobel filter” can be used. With fx(i, j) and fy(i, j) denoting the horizontal and vertical Sobel filter outputs at pixel (i, j), the calculation by the “Sobel filter” is performed as:
  • g(i, j) = fx(i, j) * fx(i, j) + fy(i, j) * fy(i, j)
  • In the binarization, the computing device 106 outputs a white pixel if the value of g(i, j) is equal to or greater than a predetermined threshold value, and a black pixel otherwise.
  • As a result, a background separated image 1001 as shown in FIG. 10 is obtained.
  • However, some white pixels may protrude from the cells into the background, producing white pixel areas outside the cells.
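A minimal sketch of the Sobel-based binarization described above, assuming the standard 3×3 Sobel kernels and edge-replicating padding (details the text does not fix):

```python
import numpy as np

def sobel_binarize(gray, threshold):
    """Binarize a 2-D gray image by g = fx*fx + fy*fy, the squared Sobel
    gradient magnitude from the formula above; 1 (white) where g >= threshold."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # vertical Sobel kernel is the transpose
    h, w = gray.shape
    g = np.zeros((h, w))
    padded = np.pad(gray.astype(float), 1, mode="edge")
    for i in range(h):
        for j in range(w):
            win = padded[i:i + 3, j:j + 3]
            fx = np.sum(win * kx)
            fy = np.sum(win * ky)
            g[i, j] = fx * fx + fy * fy
    return (g >= threshold).astype(np.uint8)
```

Smooth (background) regions yield g near zero and come out black, while the uneven cell interiors come out white.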
  • The image after binarization is shown in the schematic diagram 802 of FIG. 8: the portions determined to be inside cells are represented in white, the portions determined to be background are represented in black, and the hatched portions represent areas determined to be inside cells that are actually background.
  • One cause of the white pixels in the hatched areas is the imaging system: depending on the resolution at the time of shooting, pseudo color, and so on, irregularities of about several pixels from the actual boundary can occur.
  • The image processing itself also contributes, because filters such as the Sobel filter must take surrounding pixel values into account, so unevenness of several pixels may occur.
  • FIG. 11 shows the background separated image 1101 obtained by removing the background isolated points (the isolated white pixels visible in the background of FIG. 10) from the background separated image 1001 (FIG. 10).
  • In the background separated image 1101, some areas that are clearly inside cells are still represented by black pixels, owing to locally smooth portions inside the cells.
  • These black pixels remaining inside the cells appear as black dots.
  • In the intracellular hole-filling process, the computing device 106 therefore calculates the black pixel connected components and removes black pixel blocks of a certain size or less (converting them to white pixels), the converse of the background isolated point removal process 905, thereby filling the insides of the cells. As a result, the background separated image 1201 shown in FIG. 12 is obtained. This is the “background separated image A” 603.
  • The “background separated image A” 603 is thus obtained from the input image 601 by the series of processes described above (the background separation processing 602). However, as described above, the “background separated image A” includes areas where white pixels protrude somewhat from the cells into the background, i.e. white pixels that are actually background.
  • The schematic diagram 803 of FIG. 8 shows an image of the “background separated image A”; the hatched portion in the schematic diagram 803 indicates an area that is white pixels although it is background.
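Both the background isolated point removal (905) and the intracellular hole filling are small connected-component operations of the kind described above. The following sketch implements them with one hypothetical helper; 4-connectivity and the list-of-lists image representation are assumptions:

```python
from collections import deque

def fill_small_components(img, target, max_size):
    """Flip connected components of `target`-valued pixels (4-connectivity)
    whose size is <= max_size. With target=1 this removes isolated white
    points in the background; with target=0 it fills small black holes
    inside cells. `img` is a list of rows of 0/1; a new image is returned."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    seen = [[False] * w for _ in range(h)]
    for si in range(h):
        for sj in range(w):
            if seen[si][sj] or img[si][sj] != target:
                continue
            # collect one connected component by breadth-first search
            comp, queue = [], deque([(si, sj)])
            seen[si][sj] = True
            while queue:
                i, j = queue.popleft()
                comp.append((i, j))
                for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                    if 0 <= ni < h and 0 <= nj < w and not seen[ni][nj] \
                            and img[ni][nj] == target:
                        seen[ni][nj] = True
                        queue.append((ni, nj))
            if len(comp) <= max_size:      # small component: flip its pixels
                for i, j in comp:
                    out[i][j] = 1 - target
    return out
```

A large background component exceeds `max_size` and is left untouched, so only isolated specks and small intracellular holes are flipped.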
  • For contour extraction, the computing device 106 grays the input image 601 on the basis of lightness, as in the HSV conversion process 901 (FIG. 9) and the graying process 902 (FIG. 9), and then applies a Canny filter.
  • FIG. 13 shows the cell contour image 1301 obtained by applying the Canny filter to the gray image.
  • This image is the “cell contour image A” 605.
  • The edge portions (white pixels) are the contour portions.
  • The schematic diagram 804 in FIG. 8 is an image of the “cell contour image A” 605.
  • Cell exterior contour removal processing 606 / cell contour image B 607: In the cell exterior contour removal processing 606, the noise components generated in the background portion, among the noise components mixed into the “cell contour image A” 605, are filtered out using the background region of the “background separated image A” 603. That is, the white pixels in the background portion of the “cell contour image A” 605 are removed (set to black pixels). Specifically, the computing device 106 converts each white pixel of the “cell contour image A” 605 that corresponds to a black pixel of the “background separated image A” 603 into a black pixel. As a result, the “cell contour image B” 607, from which the contour noise in the background portion has been removed, is generated.
  • The “cell contour image B” 607 is the image obtained by blackening, in the cell contour image 1301 shown in FIG. 13, the portions corresponding to the black pixels of the background separated image 1201 shown in FIG. 12.
  • Schematically, the diagram 805 in FIG. 8 is obtained: the white pixels corresponding to the background portion have been removed.
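Since processing 606 blackens every contour pixel that lies over a background (black) pixel, it reduces to a pixel-wise AND of the two binary images. This sketch assumes both images are 0/1 integer arrays:

```python
import numpy as np

def remove_background_contours(contour_img, background_img):
    """Processing 606 as a pixel-wise mask: keep a contour pixel (1 = white)
    only where the background separated image also marks cell interior (1);
    contour pixels lying over the background (0 = black) are erased."""
    return (contour_img & background_img).astype(np.uint8)
```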
  • Extracellular component removal processing 608 (Extracellular component removal processing 608 / background separation image 609)
  • In the extracellular component removal processing 608, the noise components mixed into “background separated image A” 603 that are recognized as cell-interior regions although they actually lie outside the cells are removed.
  • The region outside the contours of “cell contour image B” 607 is regarded as lying outside the cells and is removed.
  • That is, using the contours of “cell contour image B” 607, white pixels of “background separated image A” 603 that actually lie outside the cells are removed (made black pixels).
  • Specifically, for each white pixel of “background separated image A” 603 that is adjacent to a black pixel, the arithmetic unit 106 checks whether the corresponding pixel of “cell contour image B” 607 is a white (contour) pixel.
  • If it is not a contour pixel, the white pixel of “background separated image A” 603 under examination is changed to a black pixel.
  • In other words, among the white pixels adjacent to the background (black pixels), those that are not contour pixels are changed to black pixels.
  • By repeating this, extracellular components located outside the contours of “cell contour image B” 607 are removed.
  • However, the contours in “cell contour image B” 607 may not completely surround each cell; in that case pixels inside the cell may also be turned black.
  • Therefore, the arithmetic unit 106 is configured in advance to repeat the black-pixel conversion processing only a fixed number of times, corresponding to the average number of protruding pixels.
  • Since the cell segmentation processing 612 described later does not regard a small isolated component as a cell, the extracellular components need not be removed completely; it suffices that the remaining portions are small. Conversely, even if the inside of a cell is made somewhat black by the extracellular component removal processing 608, the final output result is not affected.
  • Note that the above black-pixel conversion processing may be executed after the contours of “cell contour image B” 607 have been thickened. That is, the processing of turning white each black pixel in contact with a contour portion (white pixel) of “cell contour image B” 607 may be repeated a number of times.
  • In this way, white pixels wrongly mixed into “background separated image A” 603 are converted to black pixels, from the background inward up to the cell contours.
  • As a result, “background separated image B” 609 is obtained.
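The extracellular component removal 608 can thus be pictured as a contour-bounded peeling of the foreground. The toy NumPy implementation below is an illustrative sketch, not the patent's exact procedure: `iterations` stands in for the preset repeat count matched to the average protrusion width, a 4-neighbourhood is assumed, and 1 = white, 0 = black:

```python
import numpy as np

def peel_extracellular(background, contour, iterations=3):
    """Repeatedly turn black any white pixel of the background-separated
    image that touches the background, unless the contour image marks
    the same position as a contour pixel."""
    bg = background.copy()
    h, w = bg.shape
    for _ in range(iterations):
        to_black = []
        for y in range(h):
            for x in range(w):
                if bg[y, x] == 1 and contour[y, x] == 0:
                    # white, non-contour pixel: black it if it touches background
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and bg[ny, nx] == 0:
                            to_black.append((y, x))
                            break
        for y, x in to_black:
            bg[y, x] = 0
    return bg

bg = np.array([[0, 1, 1, 1, 1, 1, 0]], dtype=np.uint8)       # white strip
contour = np.array([[0, 0, 1, 0, 1, 0, 0]], dtype=np.uint8)  # cell boundary
result = peel_extracellular(bg, contour, iterations=3)
# result: [[0, 0, 1, 1, 1, 0, 0]] -- the protrusions at columns 1 and 5
# are peeled away, and peeling stops at the contour pixels (columns 2, 4).
```

The peeling halts at contour pixels, which is why, as the text notes, contours that do not fully enclose a cell would let the peeling eat into the cell interior unless the iteration count is capped.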
  • An image of “background separated image B” 609 is shown in a schematic diagram 806 of FIG.
  • FIG. 14 shows the image obtained by converting to black pixels those pixels of the input image 601 that correspond to the black (background) pixels of “background separated image A” 603.
  • FIG. 15 shows the corresponding image for “background separated image B” 609. Both are originally color images, but they have been rendered here in grayscale halftone (newspaper-style dot) form. In FIG. 14 there are portions outside the cells that are not black pixels, whereas in FIG. 15 these portions have been removed.
  • Cell inner contour removal processing 610 / cell contour image C 611
  • In the cell inner contour removal processing 610, connected components of contour pixels that are not in contact with the background of “background separated image B” 609 are treated as noise components generated inside the cells and are removed.
  • That is, contours other than the cell boundaries, namely contours present inside the cells, are removed from the white pixels included in “cell contour image B” 607.
  • By executing the cell inner contour removal processing 610, “cell contour image C” 611 is generated.
  • FIG. 16 schematically shows the generation of “cell contour image C” 611.
  • The schematic diagram 1601 illustrates “cell contour image C” 611.
  • FIG. 17 shows the cell contour image 1701 corresponding to “cell contour image C” 611. Comparing FIG. 17 with FIG. 13, it can be seen that, relative to “cell contour image A” 605, the contours (white pixels) in the background portion and inside the cells have been removed in “cell contour image C” 611.
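The inner contour removal 610 amounts to keeping only those connected components of contour pixels that touch the background. A self-contained sketch (8-connectivity and the 1 = white, 0 = black encoding are illustrative assumptions):

```python
import numpy as np
from collections import deque

def remove_inner_contours(contour, background):
    """Keep only connected components of contour pixels that touch the
    background (black region) of the background-separated image;
    components fully inside a cell are treated as noise and erased."""
    h, w = contour.shape
    seen = np.zeros((h, w), dtype=bool)
    out = np.zeros_like(contour)
    for y in range(h):
        for x in range(w):
            if contour[y, x] and not seen[y, x]:
                # collect one 8-connected component of contour pixels
                comp, touches_bg, q = [], False, deque([(y, x)])
                seen[y, x] = True
                while q:
                    cy, cx = q.popleft()
                    comp.append((cy, cx))
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if 0 <= ny < h and 0 <= nx < w:
                                if background[ny, nx] == 0:
                                    touches_bg = True
                                if contour[ny, nx] and not seen[ny, nx]:
                                    seen[ny, nx] = True
                                    q.append((ny, nx))
                if touches_bg:
                    for cy, cx in comp:
                        out[cy, cx] = 1
    return out

bg = np.ones((5, 5), dtype=np.uint8)
bg[:, 0] = 0          # background column on the left
contour = np.zeros((5, 5), dtype=np.uint8)
contour[:, 1] = 1     # cell boundary: touches the background, kept
contour[2, 3] = 1     # isolated contour noise inside the cell, removed
cleaned = remove_inner_contours(contour, bg)
```

In a production pipeline a library labeling routine (e.g. `scipy.ndimage.label` or `skimage.measure.label`) would replace the hand-rolled flood fill; the keep/discard criterion is the same.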
  • Cell segmentation processing 612: When “background separated image B” 609 and “cell contour image C” 611 have been obtained by the processing described above, the arithmetic unit 106 executes the cell segmentation processing 612, that is, processing that divides or separates individual cell regions from the cell region (cell cluster) given by “background separated image B” 609. For this division or separation, the arithmetic unit 106 uses the contours of “cell contour image C” 611: each cell-interior area enclosed by a contour is regarded as an individual cell region.
  • However, as described above, the cell contours extracted by the image processing may be missing at parts of the cell boundaries. This can also be seen in “cell contour image C” 611 shown in FIG. 17.
  • FIG. 18 illustrates the cell segmentation processing 612.
  • First, the arithmetic unit 106 thickens the contours of “cell contour image C” 1802, generating the bolded “cell contour image C” 1803.
  • Next, among the white pixels of “background separated image B” 1801, the arithmetic unit 106 converts to black pixels those corresponding to the white pixels of the bolded “cell contour image C” 1803.
  • As a result, the contour-blackened “background separated image B” 1804 is generated.
  • The arithmetic unit 106 then extracts the connected components of white pixels from the contour-blackened “background separated image B” 1804 and recognizes as one cell each connected component whose size is equal to or larger than a predetermined threshold. Further, the arithmetic unit 106 calculates the position of each cell as, for example, the position of the center of gravity of the connected component. The arithmetic unit 106 also calculates, according to the growth stage, the number of cells included in the input image 601, the area occupied by the cells, the ratio of the input image 601 occupied by cells, and so on. The area and ratio occupied by the cells can be calculated from the white pixel portion of “background separated image B” 609.
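The segmentation steps above (thicken the contour, cut it out of the foreground, then take sufficiently large connected components as cells) can be sketched as follows. The `min_size` threshold, the one-pixel dilation, and 4-connectivity are illustrative assumptions:

```python
import numpy as np
from collections import deque

def segment_cells(foreground, contour, min_size=3):
    """Black out the thickened contour inside the foreground, then report
    each remaining connected white component of at least `min_size`
    pixels as one cell, with its centroid (center of gravity) and area."""
    h, w = foreground.shape
    # thicken the contour by one pixel (dilation with a 3x3 block)
    thick = contour.copy()
    for y, x in zip(*np.nonzero(contour)):
        thick[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2] = 1
    work = foreground.astype(bool) & ~thick.astype(bool)
    seen = np.zeros((h, w), dtype=bool)
    cells = []
    for y in range(h):
        for x in range(w):
            if work[y, x] and not seen[y, x]:
                comp, q = [], deque([(y, x)])
                seen[y, x] = True
                while q:
                    cy, cx = q.popleft()
                    comp.append((cy, cx))
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w and work[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(comp) >= min_size:   # small fragments are not cells
                    mean_y = sum(p[0] for p in comp) / len(comp)
                    mean_x = sum(p[1] for p in comp) / len(comp)
                    cells.append({"centroid": (mean_y, mean_x),
                                  "area": len(comp)})
    return cells

# A 5x7 cell cluster split into two cells by one vertical contour line.
foreground = np.ones((5, 7), dtype=np.uint8)
contour = np.zeros((5, 7), dtype=np.uint8)
contour[:, 3] = 1
cells = segment_cells(foreground, contour, min_size=3)  # two cells found
```

The per-cell areas and the overall white-pixel count of “background separated image B” give the occupancy figures mentioned in the output processing.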
  • Output processing 613: After the cell segmentation processing 612, the arithmetic unit 106 outputs an image in which the positions of the cells are marked, together with the number of cells included in the input image 601, the area occupied by the cells, the cell occupancy ratio, and so on.
  • As described above, in the present embodiment the noise component included in each of the background separated image and the cell contour image is removed using the other image.
  • Both the background separation accuracy and the cell contour extraction accuracy can therefore be improved. As a result, the recognition accuracy of the cell positions in the input image and of the cell occupancy region can be improved.
  • In the present embodiment, the extracellular component removal processing 608 is executed after the cell outer contour removal processing 606, and the cell inner contour removal processing 610 is executed after that.
  • The reason is as follows. For the cell inner contour removal processing 610 to work, the background of “background separated image A” 603 needs to be in contact with the cell boundaries. However, as described above, the portion of “background separated image A” 603 recognized as cell interior includes region portions (extracellular component noise) that are not actually inside the cells. If such regions remain, the cell inner contour removal processing 610 becomes difficult, and therefore the extracellular component removal processing 608 is executed first.
  • In addition, the extracellular component removal processing 608 removes extracellular components from “background separated image A” 603 by relying on the contours. For this processing, contour noise inside the cells does not matter, but contour noise outside the cells is a problem. Therefore, prior to the extracellular component removal processing 608, the cell outer contour removal processing 606 is executed to remove contour noise outside the cells.
  • As described above, the automatic culture apparatus 210 according to the present embodiment periodically photographs the cells in the culture container, recognizes the cell positions and the cell occupancy region from the acquired images, and automatically executes determination of the cell growth stage, growth abnormality determination according to the growth stage, quality evaluation, and the like. The automatic culture apparatus 210 according to the present embodiment can thereby monitor the growth condition of the cells quantitatively. As a result, compared with regularly checking cell growth manually, costs (labor costs), dependence on the skill level of engineers, and the risk of biological contamination can all be reduced. Moreover, appropriate environmental control and information provision according to the cell growth stage can be executed automatically.
  • In practice, the shapes of the cells to be cultured are not limited to relatively simple ones (for example, ellipses or lines) but are complex, and cell sizes vary.
  • Nevertheless, with the cell image recognition processing installed in the automatic culture apparatus 210 according to the present embodiment, cells with complicated shapes and cells of different sizes can be handled. With this cell image recognition processing, images with unfavorable characteristics, such as cells in contact with each other or poor brightness and color separation between the inside and outside of the cells, can also be handled as the input image 601. As a result, the cell growth state can be monitored even for input images that conventional methods cannot handle, and the cell occupancy ratio and cell count can be calculated quantitatively with high accuracy. In addition, if moving image processing is used, cell tracking is facilitated and useful information for monitoring the cell growth status can be acquired.
  • This example proposes an automatic culture apparatus 210 that generates a “background separated image A” 603 by a procedure different from the background separation process 602 (FIG. 9) of the first embodiment.
  • The other configurations and processing contents of the automatic culture apparatus 210 in the present embodiment are the same as those in the first embodiment.
  • The background separation processing 602 in the present embodiment discriminates cell regions from background regions by machine learning, and can be separated into a learning phase and an identification phase.
  • First, in the learning phase, a database of labeled input images 1901 is created.
  • Specifically, the arithmetic unit 106 cuts out parts of cell images and labels each cut-out image according to whether or not a cell appears in it. It is convenient to process the cut-outs into rectangular images of a predetermined size. As many image-label pairs as possible (for example, several hundred or more) are prepared.
  • In the feature extraction processing 1902, the arithmetic unit 106 converts each image into a vector value.
  • For this, a feature extraction method used in the field of image recognition, such as HOG features or SIFT features, can be used.
  • The simplest method is to use, as the feature, the vector in which all pixel values are arranged. In this case, a grayscale image yields a vector whose dimension equals the number of pixels, and an RGB image yields a (number of pixels * 3)-dimensional vector.
  • Here, “*” is the multiplication operator. Let n be the number of dimensions of the vector.
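The simplest feature described above, arranging all pixel values into one vector, is a one-liner; the 8x8 patch size below is an illustrative assumption:

```python
import numpy as np

def pixel_feature(patch):
    """Concatenate all pixel values into a feature vector.
    A grayscale h x w patch gives an (h*w)-dimensional vector;
    an RGB patch of shape (h, w, 3) gives h*w*3 dimensions."""
    return patch.reshape(-1).astype(float)

gray_patch = np.zeros((8, 8))      # -> 64-dimensional feature
rgb_patch = np.zeros((8, 8, 3))    # -> 192-dimensional feature
```

In the notation of the text, the length of this vector is the dimension n of the argument of f(x).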
  • In the identification learning processing 1903, the arithmetic unit 106 generates a parameterized real-valued function f(x) that takes an n-dimensional vector as its argument.
  • The parameters are learned so that, as far as possible, the value of f(x) is positive (> 0) for vectors x extracted from images showing a cell and negative (< 0) for vectors x extracted from images showing no cell.
  • For this learning, a neural network, an SVM (Support Vector Machine), or the like can be used.
  • The function f(x) generated in the identification learning processing 1903 is stored in the identification dictionary 1904.
  • In the identification phase, the feature extraction processing 1905 is first executed on the input image 601.
  • Specifically, the arithmetic unit 106 divides the input image 601 into rectangles of the same size as the labeled input images 1901 and converts each rectangle into a vector by the same method as the feature extraction processing 1902. The rectangles may overlap each other.
  • In the identification processing 1906, the value of f(x) is calculated for each vector x extracted in the feature extraction processing 1905. If f(x) > 0, the rectangle is determined to be a cell region; otherwise it is determined to be a background region.
  • In this way, a “background separated image A” 603 is generated in which the cell regions are white pixels and the background regions are black pixels.
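The learning and identification phases can be sketched end to end. Everything concrete below is an illustrative assumption, not the patent's procedure: the 4x4 patch size, the synthetic bright ("cell") vs. dark ("background") training data, and the perceptron used to fit the linear f(x) = w·x + b (the text itself only requires some learner, such as a neural network or SVM, producing a real-valued f(x)):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy labeled patches: "cell" patches are bright, "background" patches dark.
def make_patch(bright):
    base = 0.8 if bright else 0.2
    return np.clip(base + 0.05 * rng.standard_normal((4, 4)), 0.0, 1.0)

X = np.array([make_patch(True).ravel() for _ in range(50)]
             + [make_patch(False).ravel() for _ in range(50)])
y = np.array([1.0] * 50 + [-1.0] * 50)       # +1 = cell, -1 = background

# Learning phase: fit f(x) = w.x + b with a simple perceptron.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(20):
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:           # misclassified -> update
            w, b = w + yi * xi, b + yi

# Identification phase: tile an input image into 4x4 rectangles and
# label each one by the sign of f(x).
image = np.full((4, 8), 0.2)                 # left half: dark background
image[:, 4:] = 0.8                           # right half: bright cells
labels = {x0: ("cell" if w @ image[:, x0:x0 + 4].ravel() + b > 0
               else "background")
          for x0 in (0, 4)}
```

Rectangles labeled "cell" would be painted white and the rest black to form “background separated image A” 603.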
  • In Example 1 and Example 2, the case where the cell culture apparatus 101 is also provided with the cell image processing function was described.
  • However, the cell culture apparatus 101 may be equipped with only an image acquisition function, with part or all of the cell image processing function implemented on a personal computer connected externally to the cell culture apparatus 101. Alternatively, the cell image processing function alone may be provided as standalone software or as an image processing apparatus.
  • The present invention is not limited to the configurations of the embodiments described above and includes various modifications.
  • For example, in the above embodiments, the case where “background separated image B” 609 is generated using “background separated image A” 603 and “cell contour image B” 607 was described.
  • However, “background separated image B” 609 may be generated by a method different from the method illustrated above.
  • For example, “background separated image B” 609 may be generated using “background separated image A” 603 and “cell contour image A” 605.
  • Alternatively, “background separated image B” 609 may be generated using “background separated image A” 603 and a cell contour image obtained by removing the background noise component from “cell contour image A” 605.
  • In the above embodiments, an image captured by the cell culture device 101 is used as the input image 601.
  • However, the input image 601 is arbitrary; for example, an image from which noise components have been removed by the method described in the above embodiments may be used.


Abstract

The invention concerns a cell image processing device comprising: a cell recognition unit that generates, from a captured image of cells, a background separated image and a cell contour image, removes the noise component of the background separated image using the cell contour image, removes the noise component of the cell contour image using the background separated image, and recognizes each cell region using the background separated image and the cell contour image from which the noise components have been removed; a cell growth stage determination unit that determines the cell growth stage on the basis of the above recognition results; an abnormality determination unit that, in the early period of the cell growth stage, performs abnormality detection on the basis of the cell count and cell positions and, in the later period of the cell growth stage, detects the presence or absence of abnormalities on the basis of the cell occupancy ratio; an abnormal state processing unit that, if an abnormality is found by the above abnormality detection, initiates environmental control or an alarm; and a completion determination unit that determines whether or not the culture of the cells by the cell culture device is complete on the basis of the cell occupancy ratio and/or outputs a quality evaluation value on the basis of the cell occupancy ratio.
PCT/JP2013/066672 2013-06-18 2013-06-18 Dispositif de traitement d'image cellulaire, dispositif de reconnaissance d'image cellulaire et procédé de reconnaissance d'image cellulaire WO2014203322A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2013/066672 WO2014203322A1 (fr) 2013-06-18 2013-06-18 Dispositif de traitement d'image cellulaire, dispositif de reconnaissance d'image cellulaire et procédé de reconnaissance d'image cellulaire
JP2015522397A JP6138935B2 (ja) 2013-06-18 2013-06-18 細胞画像処理装置、細胞画像認識装置及び細胞画像認識方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2013/066672 WO2014203322A1 (fr) 2013-06-18 2013-06-18 Dispositif de traitement d'image cellulaire, dispositif de reconnaissance d'image cellulaire et procédé de reconnaissance d'image cellulaire

Publications (1)

Publication Number Publication Date
WO2014203322A1 true WO2014203322A1 (fr) 2014-12-24

Family

ID=52104091

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/066672 WO2014203322A1 (fr) 2013-06-18 2013-06-18 Dispositif de traitement d'image cellulaire, dispositif de reconnaissance d'image cellulaire et procédé de reconnaissance d'image cellulaire

Country Status (2)

Country Link
JP (1) JP6138935B2 (fr)
WO (1) WO2014203322A1 (fr)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001275659A (ja) * 2000-03-31 2001-10-09 Masahito Taya 細胞培養方法、細胞培養装置及び記録媒体
JP2007155982A (ja) * 2005-12-02 2007-06-21 Kawasaki Heavy Ind Ltd 位相物体検出装置及び位相物体検出方法
JP2007222073A (ja) * 2006-02-23 2007-09-06 Yamaguchi Univ 画像処理により細胞運動特性を評価する方法、そのための画像処理装置及び画像処理プログラム
JP2010263872A (ja) * 2009-05-18 2010-11-25 Olympus Corp 細胞画像解析装置
WO2011132587A1 (fr) * 2010-04-23 2011-10-27 浜松ホトニクス株式会社 Procédé et appareil pour l'observation de cellules
JP2012039927A (ja) * 2010-08-18 2012-03-01 Univ Of Fukui 細胞画像解析装置及び方法


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016149997A (ja) * 2015-02-19 2016-08-22 大日本印刷株式会社 識別装置、識別方法、プログラム
CN110392732A (zh) * 2017-03-02 2019-10-29 株式会社岛津制作所 细胞分析方法和细胞分析装置
US11609537B2 (en) 2017-03-02 2023-03-21 Shimadzu Corporation Cell analysis method and cell analysis system using a holographic microscope
CN110392732B (zh) * 2017-03-02 2023-07-28 株式会社岛津制作所 细胞分析方法和细胞分析装置
JPWO2018199326A1 (ja) * 2017-04-28 2019-08-08 国立研究開発法人海洋研究開発機構 統合システム及び統合方法
US11837331B2 (en) 2017-04-28 2023-12-05 Japan Agency For Marine-Earth Science And Technology Integration system and integration method
WO2020189218A1 (fr) * 2019-03-19 2020-09-24 富士フイルム株式会社 Système de culture cellulaire et procédé de culture cellulaire
JPWO2020189218A1 (fr) * 2019-03-19 2020-09-24
JP7209079B2 (ja) 2019-03-19 2023-01-19 富士フイルム株式会社 細胞培養システム及び細胞培養方法
WO2023195490A1 (fr) * 2022-04-06 2023-10-12 富士フイルム株式会社 Système d'imagerie et procédé de réglage de concentration de cellules

Also Published As

Publication number Publication date
JPWO2014203322A1 (ja) 2017-02-23
JP6138935B2 (ja) 2017-05-31


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13887032

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015522397

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13887032

Country of ref document: EP

Kind code of ref document: A1