WO2004042392A1 - Image analysis support method, image analysis support program, and image analysis support device - Google Patents
Image analysis support method, image analysis support program, and image analysis support device
- Publication number
- WO2004042392A1 (PCT/JP2002/011624)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- cell
- information
- class
- recognition result
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/695—Preprocessing, e.g. image segmentation
Definitions
- the present invention relates to an image analysis support method, an image analysis support program, and an image analysis support device for supporting the analysis of cell images photographed with a microscope or the like in the field of genome analysis technology, such as research and development of new drugs and drug discovery. Background art
- the microscope image capture device HTS-50 of Matsushita Electric Industrial Co., Ltd. has a function to output quantitative numerical values from cell images, for example cell flatness, neurite length, and analysis of stained nuclei. Individual cells are not extracted and analyzed; instead, the texture of the entire image and the total length of elongated structures are calculated. The system as a whole is also designed on the assumption that the examiner visually checks all raw images. In addition, Beckman of the United States has commercialized equipment and applications using microscope images, but, like the HTS-50, these assume that the experimenter visually confirms the images.
- Patent documents include, for example, Japanese Patent Application Laid-Open No. H5-249102.
- the conventional technology applies uniform processing to the entire image, and in some cases the result is displayed only in units of whole images.
- experiments with cDNA had a low probability of successful injection, and it was necessary to distinguish between injected and non-injected cells.
- there are many types of fluorescent patterns, and weak patterns such as background noise make it difficult to distinguish between the background and the cell pattern.
- in some cases the whole cell fluoresces, and in other cases parts of the cell fluoresce granularly. If all adjacent cells are fluorescent, they need to be separated. In the case of granular fluorescence, it is necessary to recognize that the dark areas between the grains are not background.
- an object of the present invention is to provide an image analysis support method, an image analysis support program, and an image analysis support device capable of improving the efficiency of experiments and analysis. Disclosure of the invention
- an image analysis support method, an image analysis support program, and an image analysis support device accept input of information on a fluorescent image obtained by capturing cells attached to a plurality of wells, and input of information on at least one of a bright-field image and a phase difference image in the same visual field as the fluorescent image; detect the positions of cells in the fluorescent image based on the received fluorescent image information; determine the class of the cell present at each detected position; and output, in a form that can be displayed as a list, information on the determined class together with image information of at least one of the bright-field image and the phase difference image of the cell extracted based on the detected position.
- the recognition result for each image or for each test unit may be determined based on the determined class, and information on the determined recognition result may be output in a list-displayable manner. The determined recognition result can also be changed.
- the positions of the cells may be detected by comparing information on cell candidates in the fluorescence image with at least one of information on an average cell model and information on an average background model.
- the projection amount may be obtained by calculating the inner product of the principal component vector of at least one of the average cell model and the average background model with the feature vector of a cell candidate in the fluorescence image. An evaluation value may then be calculated by normalizing the projection amount, and the positions of the cells in the fluorescence image detected based on the calculated evaluation value.
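The projection-and-normalization step described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation; all names (`evaluation_value`, `proj_mean`, `proj_var`) are illustrative.

```python
import numpy as np

def evaluation_value(feature_vec, principal_component, proj_mean, proj_var):
    """Project a candidate's feature vector onto a model's principal
    component and normalize by that model's projection statistics.
    A smaller value means the candidate is closer to the model."""
    projection = float(np.dot(principal_component, feature_vec))  # inner product = projection amount
    return (projection - proj_mean) ** 2 / proj_var               # squared distance scaled by variance
```

Each cell candidate would be scored this way against the average cell model (and optionally the average background model), and positions chosen where the cell-class evaluation value is best.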
- the projection amount may be calculated as the inner product of the principal component vector of each protein localization class model and the feature vector of the cell existing at the position detected in the position detecting step. An evaluation value may be calculated by normalizing the projection amount, and the class of the cell determined based on the calculated evaluation value.
- FIG. 1 is an explanatory diagram showing an example of a hardware configuration of an image analysis support device according to an embodiment of the present invention,
- FIG. 2 is a block diagram showing an example of a hardware configuration of the image storage PC and the image processing PC,
- FIG. 3 is an explanatory diagram showing a functional configuration of the image analysis support device according to the embodiment of the present invention,
- FIG. 4 is a flowchart showing a processing procedure of the image analysis support device according to the embodiment of the present invention,
- FIG. 5 is an explanatory diagram showing an example of a fluorescent image,
- FIG. 6 is an explanatory diagram showing an example of a bright-field image,
- FIG. 7 is an explanatory diagram showing an example of a phase difference image,
- FIG. 8 is an explanatory diagram showing an example of a plate for culturing cells.
- FIG. 9 is an explanatory diagram showing an example of a photographing area when a plurality of images are obtained from one well,
- FIG. 10 is a flowchart showing the contents of the image acquisition processing,
- FIG. 11 is an explanatory diagram showing the contents of the cell position detection processing,
- FIG. 12 is another explanatory diagram showing the contents of the cell position detection processing,
- FIG. 13 is yet another explanatory diagram showing the contents of the cell position detection processing,
- FIG. 14 is an explanatory diagram showing an example of detecting a cell position,
- FIG. 15 is an explanatory diagram showing the contents of the class determination process,
- FIG. 16 is an explanatory diagram showing an example of the result of the class determination process,
- FIG. 17 is a flowchart showing the contents of the recognition result determination process.
- FIG. 18 is an explanatory diagram showing an example of the report screen.
- FIG. 19 is an explanatory diagram showing another example of a report screen, and
- FIG. 20 is an explanatory diagram showing an example of an image of a protein localization class in a cell.
- FIG. 1 is an explanatory diagram illustrating an example of a hardware configuration of the image analysis support device according to an embodiment of the present invention.
- 100 is an optical microscope
- 101 is a microscope control module
- 102 is an image storage personal computer (PC)
- 103 is an image processing personal computer.
- 104 is a monitor
- 105 is a printer.
- the microscope control module 101 controls the optical microscope 100 and captures fluorescence, bright-field, and phase contrast images. The cells that have been successfully injected are set so as to respond with fluorescence.
- the optical microscope 100 is set so that images of fluorescence, bright field, and phase difference can be captured in the same field of view.
- the microscope control module 101 is equipped with an XYZ stage, a fluorescent (mercury lamp) shutter, a halogen lamp shutter, etc., and has functions such as replacing the fluorescent filter and controlling the CCD camera, including determining the exposure time.
- the image storage PC 102 and the image processing PC 103 are personal computers given as an example of an information processing apparatus; they may also be servers or workstations. A high processing speed is particularly preferable for the image processing PC 103.
- FIG. 2 is a block diagram showing an example of a hardware configuration of the image storage PC 102 and the image processing PC 103.
- the image storage PC 102 and the image processing PC 103 each include a CPU 201, a ROM 202, a RAM 203, an HDD 204, an HD 205, an FDD (flexible disk drive) 206, an FD (flexible disk) 207 as an example of a removable recording medium, an I/F (interface) 208, a keyboard 210, a mouse 211, and a scanner 212.
- Each component is connected by a bus 200.
- a display (monitor) 104 and a printer 105 are connected.
- the CPU 201 governs overall control of the image analysis support device.
- the ROM 202 stores programs such as a boot program.
- the RAM 203 is used as a work area of the CPU 201.
- the HDD 204 controls read / write of data to / from the HD 205 under the control of the CPU 201.
- the HD 205 stores data written under the control of the HDD 204.
- the FDD 206 controls reading / writing of data from / to the FD 207 under the control of the CPU 201.
- the FD 207 stores the data written under the control of the FDD 206 and causes the information processing device to read the data recorded in the FD 207.
- besides the FD 207, examples of the removable recording medium include a CD-ROM (CD-R, CD-RW), an MO, and a DVD (Digital Versatile Disk).
- the display 104 displays a cursor, icons, tool boxes, and data such as documents, images, and function information. Examples include CRTs, TFT LCDs, and plasma displays.
- the I / F (interface) 208 is connected to a network 209 such as a LAN or the Internet via a communication line, and is connected to other servers and information processing devices via the network 209.
- the I/F 208 manages the interface between the network and the inside of the device, and controls the input and output of data from other servers and information terminal devices.
- the I/F 208 also connects the image storage PC 102 and the image processing PC 103.
- the I/F 208 is, for example, a modem.
- the keyboard 210 is provided with keys for inputting characters, numerals, various instructions, and the like, and performs data input.
- a touch panel, an input pad, or a numeric keypad may also be used.
- the mouse 211 moves the cursor, selects a region, or moves and resizes a window.
- a trackball, joystick, or the like may be used as long as the pointing device has the same function.
- the scanner 212 optically reads an image and takes the image data into the information processing device. It also has an OCR function that can read printed information and convert it into data.
- the printer 105 prints image data such as contour image information and document data. Examples include laser printers and inkjet printers.
- FIG. 3 is a block diagram showing an example of a functional configuration of the image analysis support device according to the embodiment of the present invention.
- the image analysis support device includes a fluorescent image information input unit 301, a fluorescent image information storage unit 302, a bright field image information / phase difference image information input unit 303, a bright field image information / phase difference image information storage unit 304, a cell position detection unit 305, a class determination unit 306, a recognition result determination unit 307, a report creation unit 308, a report information storage unit 309, a display control unit 310, and an output control unit 311.
- the functional configuration of the image storage PC 102 comprises the fluorescent image information input unit 301, the fluorescent image information storage unit 302, the bright field image information / phase difference image information input unit 303, the bright field image information / phase difference image information storage unit 304, and the report information storage unit 309.
- the functional configuration of the image processing PC 103 comprises the cell position detection unit 305, the class determination unit 306, the recognition result determination unit 307, the report creation unit 308, the display control unit 310, and the output control unit 311.
- the image storage PC 102 and the image processing PC 103 may also be realized as one PC that provides the functions of both.
- the fluorescent image information input unit 301 receives an input of information on a fluorescent image obtained by capturing cells attached to a plurality of wells, which will be described later. Further, the fluorescent image information storage unit 302 stores information on the fluorescent image whose input has been accepted by the fluorescent image information input unit 301.
- the bright-field image information / phase difference image information input unit 303 accepts input of information on a bright-field image or a phase difference image in the same field of view as the fluorescent image input by the fluorescent image information input unit 301. Both the bright-field image information and the phase difference image information may also be received. Whether to input information on the bright-field image or the phase-contrast image can be selected according to the state of the cells to be photographed.
- the bright-field image information / phase difference image information storage unit 304 stores the information on the bright-field image and the phase difference image accepted by the bright-field image information / phase difference image information input unit 303.
- the cell position detection unit 305 detects the positions of cells in the fluorescence image based on the fluorescence image information stored in the fluorescence image information storage unit 302. That is, it compares information on cell candidates in the fluorescence image with at least one of the information on the average cell model and the information on the average background model described later, and evaluates the similarity between them to detect the positions of the cells in the fluorescence image. More specifically, the cell position detection unit 305 calculates the projection amount by taking the inner product of the principal component vector of at least one of the average cell model and the average background model with the feature vector of a cell candidate in the fluorescence image, normalizes the calculated projection amount to obtain an evaluation value, and detects the positions of the cells in the fluorescence image based on the calculated evaluation value.
- the information on the average cell model consists of the principal component vector of the cell class of the sample images, and the variance and average values of the cell class.
- the information on the average background model likewise consists of the principal component vector of the background class of the sample images, and the variance and average values of the background class. The details of the cell position detection processing will be described later.
- the class determination unit 306 determines the class of the cell existing at the position detected by the cell position detection unit 305.
- the class determination unit 306 calculates the projection amount by taking the inner product of the principal component vector of the model of each protein localization class and the feature vector of the cell existing at the position detected by the cell position detection unit 305, normalizes the calculated projection amount to obtain an evaluation value, and determines the class of the cell based on the calculated evaluation value. The details of the class determination processing will be described later.
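The per-class projection step can be sketched as a loop over protein-localization class models; the `class_models` structure and the "smallest normalized score wins" choice are illustrative assumptions, not the patent's exact rule.

```python
import numpy as np

def determine_class(feature_vec, class_models):
    """class_models: {name: (principal_component, proj_mean, proj_var)} -- a
    hypothetical structure. For each class, project the cell's feature vector
    onto that class's principal component, normalize by the class's projection
    statistics, and return the best-scoring class."""
    best_name, best_score = None, float("inf")
    for name, (pc, mean, var) in class_models.items():
        proj = float(np.dot(pc, feature_vec))      # projection amount
        score = (proj - mean) ** 2 / var           # normalized evaluation value
        if score < best_score:
            best_name, best_score = name, score
    return best_name, best_score
```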
- the recognition result determination unit 307 determines a recognition result for each image or each test unit based on the class determined by the class determination unit 306. The details of the recognition result determination process will also be described later.
- the report creation unit 308 creates report information, for example as shown in FIG. 18 or FIG. 19 described later, based on the information on the class determined by the class determination unit 306 and on the image information of at least one of the bright-field image and the phase difference image, stored in the bright field image information / phase difference image information storage unit 304, of the cells extracted based on the positions detected by the cell position detection unit 305.
- the report creation unit 308 can create report information including information on the recognition result determined by the recognition result determination unit 307. Furthermore, the recognition result determined by the recognition result determination unit 307 can be changed based on an instruction input by the operator.
- the report information storage unit 309 stores the report information created by the report creation unit 308.
- the display control unit 310 controls the monitor 104 so that the report information created by the report creation unit 308, or the report information stored in the report information storage unit 309, is displayed.
- the output control unit 311 controls the printer 105 so that the report information created by the report creation unit 308 or stored in the report information storage unit 309 is printed. Alternatively, the output control unit 311 transmits the report information via the I/F 208 to another information processing device connected through the network 209.
- the fluorescence image information input unit 301 and the bright field image information / phase difference image information input unit 303 specifically realize their functions by, for example, the keyboard 210, the mouse 211, the scanner 212, or the I/F 208. The cell position detection unit 305, the class determination unit 306, the recognition result determination unit 307, the report creation unit 308, the display control unit 310, and the output control unit 311 realize their functions by the CPU 201 executing a program stored in, for example, the ROM 202, RAM 203, HD 205, or FD 207.
- the fluorescent image information storage unit 302, the bright field image information / phase difference image information storage unit 304, and the report information storage unit 309 specifically realize their functions by, for example, the RAM 203, the HD 205 and HDD 204, or the FD 207 and FDD 206.
- FIG. 4 is a flowchart showing a processing procedure of the image analysis support apparatus according to the embodiment of the present invention.
- a fluorescent image, a bright-field image, or a phase difference image is captured according to the procedure of FIG. 10 described later (step S401).
- FIG. 5 is an example of a captured fluorescent image.
- in the fluorescent image, cells in which the gene has been expressed can be identified by attaching a fluorescent marker to the cDNA so that a fluorescent substance is emitted in the cells. Injecting cDNA does not mean that all cells will carry it; as shown in FIG. 5, only the nuclei of successfully injected cells glow.
- FIG. 6 is an example of a captured bright-field image.
- a bright-field image is an image taken using a normal lens, and can recognize the shape (contour) of a cell.
- FIG. 7 is an example of the photographed phase difference image.
- the phase contrast image is taken by exploiting differences in refractive index, and even the contents of cells can be recognized. A bright-field image therefore has the advantage of being easy to focus, while a phase contrast image requires time for focusing but has the advantage that the contour of a cell is displayed more clearly. Which image to use, or whether to use both, can be selected as appropriate according to the circumstances of the experiment and analysis.
- for example, a method is conceivable in which, after focusing on a bright-field image, a fluorescence image is taken, the lens is then changed with the same focus to take a phase difference image, and only the fluorescence image and the phase difference image are stored in the image storage PC 102.
- the cell position is detected from the fluorescence image (recognition sample (unknown class) 1111 shown in FIG. 11 described later) captured in step S401 (step S402).
- next, the cell area in the bright field of each cell whose position was detected in step S402 is obtained (step S403).
- fluorescent portions outside the cell regions are excluded (step S404).
- the class of each cell of interest specified by the cell regions remaining after the exclusion in step S404 is then determined, thereby classifying the cells of interest (step S405). Finally, a report is created based on the above results (step S406), and the series of processing ends.
- FIG. 8 is an explanatory diagram showing a 96-well plate as an example of a plate for culturing cells.
- the 96-well plate has the wells arranged in a lattice.
- each well is labeled with the letters A to H in the vertical direction and the numbers 01 to 12 in the horizontal direction; for example, the top-left well is "A01".
- FIG. 9 is an explanatory diagram showing an example of a photographing area when a plurality of images are obtained from one well ("G12"), shown enlarged.
- a fluorescence image, a bright field image, and a phase difference image are photographed at a plurality of positions (01 to 08) in the G12 well.
- the position is changed concentrically; that is, images are taken in the order 01 → 02 → ... → 08.
- the image is not limited to concentric circles, and may be changed in a spiral shape.
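One plausible reading of the concentric shooting order is positions laid out on a circle around the well centre. The helper below is only a sketch of that idea; the patent does not give coordinates, so the geometry and names are assumptions.

```python
import math

def concentric_positions(cx, cy, radius, n=8):
    """Return n shooting positions (01..08) evenly spaced on a circle of the
    given radius around the well centre (cx, cy) -- one illustrative way to
    realize the 'concentric' capture order."""
    return [(cx + radius * math.cos(2 * math.pi * k / n),
             cy + radius * math.sin(2 * math.pi * k / n))
            for k in range(n)]
```

A spiral variant would simply grow the radius with `k` instead of holding it fixed.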
- FIG. 10 is a flow chart showing the content of the image acquisition processing using a 96-well plate.
- first, the stage is moved to the first well (A01) so that it is ready for photographing (step S1001).
- the N counter and the M counter are reset (step S1002).
- the first image is acquired by photographing (step S1003), and the N counter is incremented by one (step S1004).
- next, it is determined whether or not the N counter has reached a preset value, for example "8" (step S1005).
- if the N counter has not reached the value (step S1005: No), it is determined whether or not gene-expressing cells are present in the image taken in step S1003 (step S1006).
- Whether or not the gene-expressing cell is present can be determined by, for example, whether or not a fluorescent portion is present in the fluorescent image.
- if gene-expressing cells are present (step S1006: Yes), a bright-field image or a phase-contrast image of the same visual field, or both, are acquired.
- then the M counter is incremented by one (step S1007).
- if it is determined that no gene-expressing cells exist (step S1006: No), the process proceeds to step S1009 without doing anything.
- next, it is determined whether or not the M counter has reached a preset value, for example "3" (step S1008). If it has not (step S1008: No), the image at the next position is taken and acquired (step S1009), after which the process returns to step S1004 and the processes of steps S1004 to S1009 are repeated. If the N counter reaches "8" in step S1005 (step S1005: Yes), or the M counter reaches "3" in step S1008 (step S1008: Yes), the process moves to step S1010.
- in step S1010, it is determined whether or not the current well is the last well. If it is not (step S1010: No), the stage moves to the next well (step S1011), the process returns to step S1002, and steps S1002 to S1009 are repeated for the new well. If it is the last well (step S1010: Yes), the series of processing ends.
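The acquisition loop of steps S1001 to S1011 can be approximated in plain Python. `capture` and `has_expressing_cells` are stand-ins for the microscope control and the fluorescence check, and the counter limits 8 and 3 come from the example values above; this is a sketch, not the control module's actual API.

```python
def acquire_plate(wells, capture, has_expressing_cells, n_max=8, m_max=3):
    """Per well: shoot up to n_max positions (N counter), keeping at most
    m_max fields that contain gene-expressing cells (M counter). Returns the
    kept fluorescence images per well."""
    kept = {}
    for well in wells:
        n = m = 0                          # reset counters (step S1002)
        images = []
        while n < n_max and m < m_max:     # stop at N = 8 or M = 3
            img = capture(well, n)         # fluorescence image at position n
            n += 1
            if has_expressing_cells(img):  # fluorescent portion present?
                images.append(img)         # bright-field/phase images would be taken here too
                m += 1
        kept[well] = images
    return kept
```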
- FIG. 11 is an explanatory diagram showing the contents of the cell position detection processing.
- the learning sample 1101 includes both background learning images and cell learning images.
- each image is divided (step S1102), features are extracted from the divided image information (step S1103), and a feature vector is generated by normalizing the extracted feature information (step S1104). A learning process is then performed based on the generated feature vectors (step S1105), thereby creating the average cell model 1100.
- the recognition sample 1111 of the captured fluorescent image relates to an unknown class of cells whose class has not yet been determined.
- this recognition sample 1111 is handled in the same manner as the learning samples: the image is divided (step S1112), features are extracted from the divided image information (step S1113), and a feature vector is generated by normalizing the extracted features (step S1114). Identification processing is then performed using the average cell model (step S1115), and the positions of the cells present in the fluorescence image are extracted.
- FIG. 12 is an explanatory diagram showing the contents of the cell position detection process, and specifically shows the process of generating an average cell model.
- the cell learning image 1101 of the target learning sample is divided into regions of a predetermined size (step S1102). For example, when dividing into 4*4 pixel areas, as preprocessing, a maximum value filter 1202 that replaces each 4*4 pixel block with its maximum value and a minimum value filter 1201 that replaces each block with its minimum value are applied to the image-divided learning sample image information (original image) 1101. Data compression is then performed on the resulting maximum value image and minimum value image, each of 1/4 * 1/4 size.
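The 4*4 maximum/minimum-value filtering can be sketched with NumPy block reductions. This reproduces only the pooling idea (each 4*4 block replaced by its max and min, giving 1/4 * 1/4 size images); function and variable names are illustrative.

```python
import numpy as np

def block_max_min(img, block=4):
    """Replace each block*block region by its maximum and minimum values,
    yielding two images of 1/block the size in each dimension."""
    h, w = img.shape
    h, w = h - h % block, w - w % block            # crop to a block multiple
    tiles = img[:h, :w].reshape(h // block, block, w // block, block)
    return tiles.max(axis=(1, 3)), tiles.min(axis=(1, 3))
```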
- a principal component vector is obtained from the compressed feature vectors (step S1204), and the variance and average of the projections of all the learning samples of the cell class and the background class are calculated (step S1205).
- FIG. 13 is an explanatory diagram showing the contents of the cell position detection process, and specifically shows the identification processing of the recognition sample.
- the detected image 1111 of the target recognition sample is divided into a predetermined number of regions in the same manner as the image 1101 of the learning sample (step S1112).
- Data compression is performed on the maximum value image and minimum value image.
- a feature vector Xj = (xj1, xj2, ..., xjm) is generated (step S1114).
- the processing is the same as the processing of the learning sample 1101.
- the projection (amount) of the feature vector onto the principal component vector is calculated (step S1303). The similarity is then evaluated by normalization; that is, the distance between the projection (amount) and the average projection m(c) of the cell class is divided by the variance σ(c)² of the cell class, and the resulting value is used as the evaluation value of the cell class at that coordinate point (step S1304).
- the minimum value in each local area is detected based on the predetermined average cell diameter (step S1305), and as a result the cell positions can be detected.
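Step S1305's local-minimum search can be illustrated with a brute-force window scan sized by the average cell diameter. This is a deliberately simple sketch (a real implementation would use an optimized minimum filter); names and the window shape are assumptions.

```python
import numpy as np

def local_minima(eval_map, cell_diameter):
    """Return positions whose evaluation value is the minimum within a
    square window of roughly one average cell diameter."""
    r = cell_diameter // 2
    h, w = eval_map.shape
    peaks = []
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            # keep (y, x) only if it attains the window minimum
            if eval_map[y, x] == eval_map[y0:y1, x0:x1].min():
                peaks.append((y, x))
    return peaks
```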
- FIG. 14 shows the state in which the cell positions have been specified and marked in the fluorescence image; each marked position is a detected cell position.
- FIG. 15 is an explanatory diagram showing the contents of the class determination process.
- 1500 denotes the cell models of the respective classes, that is, protein localizations, each composed of the average cell model information of that class. Examples of the types of protein localization are shown in FIG. 20.
- the method of generating the cell model is the same as the method of generating the average cell model.
- a feature vector of a recognition sample to be subjected to class determination is generated (step S1501).
- the feature vector is generated using some of various features, such as those related to density statistics, edge element features, shape features, texture features, and run-length statistics.
- those relating to the density statistics specifically include the luminance average of the entire image, the luminance variance of the entire image, the total variance of the averages in the non-background rectangular blocks, and the like.
- the edge element features specifically include the average of the neighborhood density difference, the variance of the neighborhood density difference, the brightness average of the phase difference edge image, the horizontal dark-run-length average of the phase difference edge image, the horizontal bright-run-length average of the phase difference edge image, the dark-run-length average of the phase difference image, the bright-run-length average of the phase difference image, the covariance of the density co-occurrence matrix of the phase difference edge image, the horizontal standard deviation of the density co-occurrence matrix of the phase difference edge image, the vertical standard deviation of the density co-occurrence matrix of the phase difference edge image, the sample correlation coefficient of the density co-occurrence matrix of the phase difference edge image, and the ratio of the horizontal and vertical standard deviations of the density co-occurrence matrix of the phase difference edge image.
- as shape features, there are the number of labels, the total area of labeled items, the total perimeter of labeled items, the luminance average ratio of the entire image to the non-background parts, the average area of labeled items, the average perimeter of labeled items, the average circularity of labeled items, and the average complexity of labeled items.
- as texture features, there are the horizontal average of the density co-occurrence matrix, the vertical average of the density co-occurrence matrix, the contrast of the density co-occurrence matrix, the covariance of the density co-occurrence matrix, the horizontal standard deviation of the density co-occurrence matrix, the vertical standard deviation of the density co-occurrence matrix, the power of the density co-occurrence matrix, the sample correlation coefficient of the density co-occurrence matrix, and the ratio of the horizontal and vertical averages of the density co-occurrence matrix.
- as run-length statistics, there are the vertical and horizontal dark-run-length averages, the vertical and horizontal bright-run-length averages, and the dark-to-bright ratio of the run-length averages.
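A small subset of the listed features, the luminance mean and variance and a horizontal dark-run-length average, can be computed as follows; the threshold and the names are illustrative assumptions, not the patent's definitions.

```python
import numpy as np

def simple_features(img, thresh=128):
    """Compute a few illustrative features of a grayscale image: luminance
    mean/variance (density statistics) and the average length of horizontal
    runs of dark pixels (a run-length statistic)."""
    feats = {"mean": img.mean(), "var": img.var()}
    runs = []
    for row in img < thresh:        # boolean mask of dark pixels, row by row
        run = 0
        for px in row:
            if px:
                run += 1
            elif run:
                runs.append(run)
                run = 0
        if run:                     # run that reaches the end of the row
            runs.append(run)
    feats["dark_run_avg"] = float(np.mean(runs)) if runs else 0.0
    return feats
```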
- next, the projection amount is calculated by taking the inner product of the eigenvector of the cell model 1500 of each class and the feature vector generated in step S1501 (step S1502), and the similarity is evaluated by normalizing using the variance and average values (step S1503).
- since this is the same as steps S1303 and S1304 in FIG. 13, a detailed description is omitted.
- as eigenvectors, the top n (n is a natural number) eigenvectors that best classify the learning patterns are used.
- the recognition result indicates into which protein localization class the recognition sample subjected to class determination falls.
- FIG. 16 shows the class that is the recognition result for each cell whose position has been identified in the fluorescence image, indicated by an abbreviated name.
- For example, "MEM" indicates that the protein in this cell is localized to the cell membrane.
- FIG. 17 is a flowchart showing the contents of the recognition result determination process.
- In the recognition result determination process, the recognition result for each image and for each inspection unit is determined from the recognition results of the individual cells.
- First, the recognition results for each cell are collected for each image (step S1701).
- Next, the recognition result for each collected image unit is determined, taking into account the confidence obtained from the recognition algorithm and the importance (causal relation) of each class (step S1702).
- For example, suppose the respective confidence thresholds of classes A, B, C, and D are TA, TB, TC, and TD.
- The confidence obtained for each class is xA, xB, xC, and xD.
- Classes B, C, and D have a subordinate relationship: there is a time-course relationship from B to C, and from C to D (B → C → D). If every confidence is smaller than its threshold, the result is output as an unknown pattern. If the confidences of classes A and B are both equal to or greater than their thresholds, the class with the greater confidence becomes the result. If the confidences of classes B, C, and D are all equal to or greater than their thresholds, class B, the origin of the subordinate relationship, becomes the result.
- Class A includes, for example, mitochondrial localization.
- Finally, the results for each image are collected for each inspection (step S1703). Then, the result for the inspection unit is determined by majority vote (step S1704), and the series of processing ends.
- As described above, the recognition result (gene function) for each image or inspection unit can be estimated by exploiting the prior knowledge that a single correct answer is contained in each image or each inspection.
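Under the rules stated above, a minimal sketch of the per-image decision and the per-inspection majority vote might look like this in Python. The class names, tie-breaking, and the ordering of rules beyond what the text specifies are assumptions:

```python
from collections import Counter

def decide_image_class(conf, thresh):
    """Per-image decision rule described in the text (illustrative).

    conf, thresh: dicts mapping class names 'A'..'D' to confidences and
    thresholds.  B -> C -> D is the time-course subordinate chain.
    """
    above = {c for c in conf if conf[c] >= thresh[c]}
    if not above:
        return "unknown"                       # no class reaches its threshold
    if {"B", "C", "D"} <= above:
        return "B"                             # origin of the subordinate chain
    if {"A", "B"} <= above:
        return "A" if conf["A"] >= conf["B"] else "B"
    return max(above, key=lambda c: conf[c])   # fall back to most confident

def decide_inspection_class(image_results):
    """Majority vote over the per-image results (steps S1703-S1704)."""
    return Counter(image_results).most_common(1)[0][0]
```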
- FIG. 18 and FIG. 19 are explanatory diagrams showing an example of a report screen.
- FIG. 20 shows examples of images of protein localization in cells, together with a list of the class names and class abbreviations corresponding to the image examples.
- FIG. 18 is a report screen created based on the recognition results obtained by the class determination processing described earlier. "A0101FA" in the "view image name" indicates the well "A01" and the shooting position "01". This fluorescence image shows that there are 15 gene-expressing cells. Each of the cut-out bright-field or phase-difference images is pasted in, and the abbreviation of the determined class is displayed.
- FIG. 19 is a report screen created based on the results obtained by the recognition result determination processing described with reference to FIG. 17. Reference numeral 1900 denotes a representative image for each cell.
- Displayed as a list are the "ratio of the fluorescent part" 1901, which indicates the ratio of the fluorescent portion to the entire fluorescence image captured for one cell; the "number of cells" 1902 in the fluorescence image captured for one well; the "exposure state" 1903; the "automatic classification type" 1904, indicating the class obtained by the recognition result determination processing; the "similarity" 1905 with the model; and the "manual classification type" 1906. Items can be sorted and displayed by each of the items 1901 to 1906. The "manual classification type" can also be changed by changing the display in the manual classification type display column 1907.
- When the image display button 1908 is pressed, the representative image can be changed to a desired image. In addition, when the "Print Report" button is pressed, the displayed contents are printed.
- When the "OK" button 1910 is pressed, the contents are finalized as displayed and stored in the report information storage unit 309. It is also possible to read a report already stored in the report information storage unit 309, change its contents, and store it again in the report information storage unit 309.
- As described above, cells in which a gene has been expressed are detected, only those portions are cut out, and protein localization and morphological changes are recognized from those images; everything from selecting, arranging, displaying, printing, and saving the results based on information such as the confidence of the recognition results through to report creation can be performed automatically.
- That is, cells of interest that have expressed the gene are automatically detected, only those parts are cut out from the fluorescence, phase-contrast, and bright-field images, localization and morphological changes are recognized, and reports selected and arranged based on information such as the confidence of the recognition results can be created automatically. Since the brightness range of each image is adjusted so that it is easy to see on the screen, displaying, printing, and saving these reports significantly improves the efficiency of experiments.
- Further, the fluorescent portion of each cell's region is extracted from the fluorescence image.
- the fluorescent pattern and the background pattern are learned.
- Further, an "average cell model" is created using the prior knowledge that the target is a cell, and an estimated region for one cell is extracted by a normalized similarity evaluation method.
- Further, the contour of one cell can be extracted with reference to the bright-field image or the phase-difference image at the same position. In this way, it is possible to cut out the regions of individual cells rather than the entire image, and to analyze and display the results.
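One common way to realize a normalized similarity evaluation between an "average cell model" template and an image is zero-mean normalized cross-correlation. The patent does not fix the exact formula, so the following Python sketch is only illustrative:

```python
import numpy as np

def normalized_similarity_map(image, model):
    """Slide an 'average cell model' template over the image and score
    each position with zero-mean normalized cross-correlation (one
    possible normalized similarity measure; an assumption here)."""
    mh, mw = model.shape
    m = (model - model.mean()) / (model.std() + 1e-9)   # standardized template
    h, w = image.shape
    out = np.full((h - mh + 1, w - mw + 1), -1.0)
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = image[y:y + mh, x:x + mw]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            out[y, x] = (m * p).mean()                  # score in [-1, 1]
    return out   # argmax gives the estimated one-cell region
```

The position with the highest score is taken as the estimated region for one cell.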
- Further, a learning/recognition process is performed to recognize the fluorescence pattern or morphological pattern of each cell.
- For this purpose, the eigenspace method, the subspace method, neural networks, and other methods can be used. In this way, by providing the computer with a learning function, judgment results can be output automatically.
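As one example of such a learning function, the eigenspace method can be sketched as a per-class PCA subspace with nearest-reconstruction classification. Everything beyond the method's name here is an illustrative assumption, not the patent's implementation:

```python
import numpy as np

class EigenspaceClassifier:
    """Minimal eigenspace-method sketch: one PCA subspace per class;
    a sample is assigned to the class whose subspace reconstructs it
    with the smallest error."""

    def fit(self, samples_by_class, k=2):
        self.models = {}
        for label, X in samples_by_class.items():        # X: (n, d) samples
            mean = X.mean(axis=0)
            # top-k eigenvectors of the class scatter via SVD
            _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
            self.models[label] = (mean, Vt[:k])
        return self

    def predict(self, x):
        def residual(model):
            mean, V = model
            proj = V.T @ (V @ (x - mean))                # subspace projection
            return np.linalg.norm((x - mean) - proj)     # reconstruction error
        return min(self.models, key=lambda c: residual(self.models[c]))
```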
- Note that the applicant actually produced a trial report. About 700 MB of image data was generated per plate, and because the brightness range had to be adjusted each time an image was viewed, it took several hours (three hours or more) just to look through the images, leaving no time to judge the results. In contrast, the created catalog was about 6 MB per plate, the entire catalog could be viewed in only a few minutes, and judgment results could be attached.
- The image analysis support method according to the present embodiment is realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
- This program is recorded on a computer-readable recording medium such as an HD, FD, CD-ROM, MO, or DVD, and is executed by being read from the recording medium by the computer.
- the program may be a transmission medium that can be distributed via a network such as the Internet.
- As described above, the image analysis support method, image analysis support program, and image analysis support device according to the present invention quickly and efficiently extract the cell positions and cell types from an image taken with a microscope, and automatically display a list of the extraction results for each cell region of the image, making them suitable for improving the efficiency of experiments and analysis.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP02780044A EP1586897B1 (en) | 2002-11-07 | 2002-11-07 | Image analysis supporting method, image analysis supporting program, and image analysis supporting device |
AU2002344483A AU2002344483A1 (en) | 2002-11-07 | 2002-11-07 | Image analysis supporting method, image analysis supporting program, and image analysis supporting device |
PCT/JP2002/011624 WO2004042392A1 (ja) | 2002-11-07 | 2002-11-07 | 画像解析支援方法、画像解析支援プログラムおよび画像解析支援装置 |
DE60237242T DE60237242D1 (de) | 2002-11-07 | 2002-11-07 | Hilfsverfahren, hilfsprogramm und hilfsvorrichtung zur bildanalyse |
JP2004549566A JP4037869B2 (ja) | 2002-11-07 | 2002-11-07 | 画像解析支援方法、画像解析支援プログラムおよび画像解析支援装置 |
US11/089,223 US7050613B2 (en) | 2002-11-07 | 2005-03-25 | Method for supporting cell image analysis |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2002/011624 WO2004042392A1 (ja) | 2002-11-07 | 2002-11-07 | 画像解析支援方法、画像解析支援プログラムおよび画像解析支援装置 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/089,223 Continuation US7050613B2 (en) | 2002-11-07 | 2005-03-25 | Method for supporting cell image analysis |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004042392A1 true WO2004042392A1 (ja) | 2004-05-21 |
Family
ID=32310236
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2002/011624 WO2004042392A1 (ja) | 2002-11-07 | 2002-11-07 | 画像解析支援方法、画像解析支援プログラムおよび画像解析支援装置 |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP1586897B1 (ja) |
JP (1) | JP4037869B2 (ja) |
AU (1) | AU2002344483A1 (ja) |
DE (1) | DE60237242D1 (ja) |
WO (1) | WO2004042392A1 (ja) |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005029413A1 (en) * | 2003-09-23 | 2005-03-31 | Iatia Imaging Pty Ltd | Method and apparatus for determining the area or confluency of a sample |
JP2005192485A (ja) * | 2004-01-07 | 2005-07-21 | Hitachi Medical Corp | 細胞の培養状態検出装置 |
JP2006105695A (ja) * | 2004-10-01 | 2006-04-20 | Sysmex Corp | 細胞画像表示方法、細胞画像表示システム、細胞画像表示装置、及びコンピュータプログラム |
WO2006104201A1 (ja) * | 2005-03-29 | 2006-10-05 | Olympus Corporation | 細胞画像解析方法、細胞画像解析プログラム、細胞画像解析装置、スクリーニング方法およびスクリーニング装置 |
JP2006275771A (ja) * | 2005-03-29 | 2006-10-12 | Olympus Corp | 細胞画像解析装置 |
JP2007113961A (ja) * | 2005-10-18 | 2007-05-10 | Olympus Corp | 生物学的活性を長期的に評価する装置または自動解析する方法 |
JP2007278985A (ja) * | 2006-04-11 | 2007-10-25 | Hamamatsu Photonics Kk | 光測定装置、光測定方法、及び光測定プログラム |
JP2008268027A (ja) * | 2007-04-20 | 2008-11-06 | Nikon Corp | 画像解析方法と、蛍光検出装置 |
JP2010518486A (ja) * | 2007-02-05 | 2010-05-27 | シーメンス・ヘルスケア・ダイアグノスティックス・インコーポレーテッド | 顕微鏡検査における細胞分析システムおよび方法 |
JP2011505954A (ja) * | 2007-12-17 | 2011-03-03 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 混濁媒体の内部における不均質性の存在を検出するための方法及び、混濁媒体の内部を画像化するための装置 |
JP2012002949A (ja) * | 2010-06-15 | 2012-01-05 | Nikon Corp | 観察装置 |
JPWO2011132586A1 (ja) * | 2010-04-23 | 2013-07-18 | 浜松ホトニクス株式会社 | 細胞観察装置および細胞観察方法 |
WO2013146843A1 (ja) * | 2012-03-30 | 2013-10-03 | コニカミノルタ株式会社 | 医用画像処理装置及びプログラム |
WO2015133193A1 (ja) * | 2014-03-05 | 2015-09-11 | 富士フイルム株式会社 | 細胞画像評価装置および方法並びにプログラム |
JP2016118428A (ja) * | 2014-12-19 | 2016-06-30 | コニカミノルタ株式会社 | 画像処理装置、画像処理システム、画像処理プログラム及び画像処理方法 |
JP2017524196A (ja) * | 2014-08-04 | 2017-08-24 | ヴェンタナ メディカル システムズ, インク. | コンテキストフィーチャを用いた画像解析システム |
KR101847509B1 (ko) | 2012-02-17 | 2018-04-10 | 광주과학기술원 | 막 오염도 측정 방법 및 장치 |
WO2018143406A1 (ja) * | 2017-02-06 | 2018-08-09 | コニカミノルタ株式会社 | 画像処理装置及びプログラム |
JP2020530637A (ja) * | 2017-08-09 | 2020-10-22 | アレン インスティテュート | 予測タグ付けを有する画像を生成するための画像処理のためのシステム、デバイス、および方法 |
CN111826423A (zh) * | 2019-03-29 | 2020-10-27 | 希森美康株式会社 | 荧光图像分析装置及荧光图像分析方法 |
WO2022004318A1 (ja) * | 2020-07-02 | 2022-01-06 | 富士フイルム株式会社 | 情報処理装置、情報処理方法、及びプログラム |
WO2022249583A1 (ja) * | 2021-05-26 | 2022-12-01 | ソニーグループ株式会社 | 情報処理装置、生体試料観察システム及び画像生成方法 |
CN111826423B (zh) * | 2019-03-29 | 2024-04-26 | 希森美康株式会社 | 荧光图像分析装置及荧光图像分析方法 |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8014577B2 (en) * | 2007-01-29 | 2011-09-06 | Institut National D'optique | Micro-array analysis system and method thereof |
CA2675103C (en) * | 2007-01-29 | 2013-10-01 | Institut National D'optique | Micro-array analysis system and method thereof |
JP5446868B2 (ja) * | 2007-10-19 | 2014-03-19 | 株式会社ニコン | プログラム、コンピュータおよび培養状態解析方法 |
DE102007062067A1 (de) * | 2007-12-21 | 2009-07-02 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Optisches Verfahren und Vorrichtung zur Klassifizierung von Partikeln |
WO2021206686A1 (en) * | 2020-04-07 | 2021-10-14 | Hewlett-Packard Development Company, L.P. | Microfluidic chip cell sorting and transfection |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5352494A (en) * | 1976-10-25 | 1978-05-12 | Hitachi Ltd | Discrimination method of leucocyte |
JPS6447948A (en) * | 1987-08-19 | 1989-02-22 | Hitachi Ltd | Automatic cell sorter |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02165388A (ja) * | 1988-12-20 | 1990-06-26 | Toshiba Corp | パターン認識方式 |
US5162990A (en) * | 1990-06-15 | 1992-11-10 | The United States Of America As Represented By The United States Navy | System and method for quantifying macrophage phagocytosis by computer image analysis |
US5798262A (en) * | 1991-02-22 | 1998-08-25 | Applied Spectral Imaging Ltd. | Method for chromosomes classification |
IL140851A0 (en) * | 1998-07-27 | 2002-02-10 | Applied Spectral Imaging Ltd | In situ method of analyzing cells |
2002
- 2002-11-07 AU AU2002344483A patent/AU2002344483A1/en not_active Abandoned
- 2002-11-07 WO PCT/JP2002/011624 patent/WO2004042392A1/ja active Application Filing
- 2002-11-07 DE DE60237242T patent/DE60237242D1/de not_active Expired - Lifetime
- 2002-11-07 EP EP02780044A patent/EP1586897B1/en not_active Expired - Lifetime
- 2002-11-07 JP JP2004549566A patent/JP4037869B2/ja not_active Expired - Lifetime
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5352494A (en) * | 1976-10-25 | 1978-05-12 | Hitachi Ltd | Discrimination method of leucocyte |
JPS6447948A (en) * | 1987-08-19 | 1989-02-22 | Hitachi Ltd | Automatic cell sorter |
Non-Patent Citations (1)
Title |
---|
See also references of EP1586897A4 * |
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005029413A1 (en) * | 2003-09-23 | 2005-03-31 | Iatia Imaging Pty Ltd | Method and apparatus for determining the area or confluency of a sample |
JP4565845B2 (ja) * | 2004-01-07 | 2010-10-20 | 株式会社カネカ | 細胞の培養状態検出装置 |
JP2005192485A (ja) * | 2004-01-07 | 2005-07-21 | Hitachi Medical Corp | 細胞の培養状態検出装置 |
JP2006105695A (ja) * | 2004-10-01 | 2006-04-20 | Sysmex Corp | 細胞画像表示方法、細胞画像表示システム、細胞画像表示装置、及びコンピュータプログラム |
WO2006104201A1 (ja) * | 2005-03-29 | 2006-10-05 | Olympus Corporation | 細胞画像解析方法、細胞画像解析プログラム、細胞画像解析装置、スクリーニング方法およびスクリーニング装置 |
JP2006275771A (ja) * | 2005-03-29 | 2006-10-12 | Olympus Corp | 細胞画像解析装置 |
JP4728025B2 (ja) * | 2005-03-29 | 2011-07-20 | オリンパス株式会社 | 細胞画像解析装置 |
JP2007113961A (ja) * | 2005-10-18 | 2007-05-10 | Olympus Corp | 生物学的活性を長期的に評価する装置または自動解析する方法 |
JP2007278985A (ja) * | 2006-04-11 | 2007-10-25 | Hamamatsu Photonics Kk | 光測定装置、光測定方法、及び光測定プログラム |
JP2010518486A (ja) * | 2007-02-05 | 2010-05-27 | シーメンス・ヘルスケア・ダイアグノスティックス・インコーポレーテッド | 顕微鏡検査における細胞分析システムおよび方法 |
JP2008268027A (ja) * | 2007-04-20 | 2008-11-06 | Nikon Corp | 画像解析方法と、蛍光検出装置 |
JP2011505954A (ja) * | 2007-12-17 | 2011-03-03 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 混濁媒体の内部における不均質性の存在を検出するための方法及び、混濁媒体の内部を画像化するための装置 |
JPWO2011132586A1 (ja) * | 2010-04-23 | 2013-07-18 | 浜松ホトニクス株式会社 | 細胞観察装置および細胞観察方法 |
JP5775068B2 (ja) * | 2010-04-23 | 2015-09-09 | 浜松ホトニクス株式会社 | 細胞観察装置および細胞観察方法 |
JP2012002949A (ja) * | 2010-06-15 | 2012-01-05 | Nikon Corp | 観察装置 |
KR101847509B1 (ko) | 2012-02-17 | 2018-04-10 | 광주과학기술원 | 막 오염도 측정 방법 및 장치 |
US9189678B2 (en) | 2012-03-30 | 2015-11-17 | Konica Minolta, Inc. | Medical image processor and storage medium |
WO2013146843A1 (ja) * | 2012-03-30 | 2013-10-03 | コニカミノルタ株式会社 | 医用画像処理装置及びプログラム |
JP5804194B2 (ja) * | 2012-03-30 | 2015-11-04 | コニカミノルタ株式会社 | 医用画像処理装置及びプログラム |
WO2015133193A1 (ja) * | 2014-03-05 | 2015-09-11 | 富士フイルム株式会社 | 細胞画像評価装置および方法並びにプログラム |
JP2015170047A (ja) * | 2014-03-05 | 2015-09-28 | 富士フイルム株式会社 | 細胞画像評価装置および方法並びにプログラム |
US10360676B2 (en) | 2014-03-05 | 2019-07-23 | Fujifilm Corporation | Cell image evaluation device, method, and program |
JP2020205063A (ja) * | 2014-08-04 | 2020-12-24 | ヴェンタナ メディカル システムズ, インク. | コンテキストフィーチャを用いた画像解析システム |
JP2017524196A (ja) * | 2014-08-04 | 2017-08-24 | ヴェンタナ メディカル システムズ, インク. | コンテキストフィーチャを用いた画像解析システム |
JP7335855B2 (ja) | 2014-08-04 | 2023-08-30 | ヴェンタナ メディカル システムズ, インク. | コンテキストフィーチャを用いた画像解析システム |
JP2016118428A (ja) * | 2014-12-19 | 2016-06-30 | コニカミノルタ株式会社 | 画像処理装置、画像処理システム、画像処理プログラム及び画像処理方法 |
WO2018143406A1 (ja) * | 2017-02-06 | 2018-08-09 | コニカミノルタ株式会社 | 画像処理装置及びプログラム |
JPWO2018143406A1 (ja) * | 2017-02-06 | 2019-12-26 | コニカミノルタ株式会社 | 画像処理装置及びプログラム |
JP2020530637A (ja) * | 2017-08-09 | 2020-10-22 | アレン インスティテュート | 予測タグ付けを有する画像を生成するための画像処理のためのシステム、デバイス、および方法 |
US11614610B2 (en) | 2017-08-09 | 2023-03-28 | Allen Institute | Systems, devices, and methods for image processing to generate an image having predictive tagging |
JP7459336B2 (ja) | 2017-08-09 | 2024-04-01 | アレン インスティテュート | 予測タグ付けを有する画像を生成するための画像処理のためのシステム、デバイス、および方法 |
CN111826423A (zh) * | 2019-03-29 | 2020-10-27 | 希森美康株式会社 | 荧光图像分析装置及荧光图像分析方法 |
CN111826423B (zh) * | 2019-03-29 | 2024-04-26 | 希森美康株式会社 | 荧光图像分析装置及荧光图像分析方法 |
WO2022004318A1 (ja) * | 2020-07-02 | 2022-01-06 | 富士フイルム株式会社 | 情報処理装置、情報処理方法、及びプログラム |
WO2022249583A1 (ja) * | 2021-05-26 | 2022-12-01 | ソニーグループ株式会社 | 情報処理装置、生体試料観察システム及び画像生成方法 |
Also Published As
Publication number | Publication date |
---|---|
AU2002344483A1 (en) | 2004-06-07 |
JPWO2004042392A1 (ja) | 2006-03-09 |
EP1586897A1 (en) | 2005-10-19 |
DE60237242D1 (de) | 2010-09-16 |
JP4037869B2 (ja) | 2008-01-23 |
EP1586897B1 (en) | 2010-08-04 |
EP1586897A4 (en) | 2007-08-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4037869B2 (ja) | 画像解析支援方法、画像解析支援プログラムおよび画像解析支援装置 | |
US7050613B2 (en) | Method for supporting cell image analysis | |
US11681418B2 (en) | Multi-sample whole slide image processing in digital pathology via multi-resolution registration and machine learning | |
Silva-Rodríguez et al. | Going deeper through the Gleason scoring scale: An automatic end-to-end system for histology prostate grading and cribriform pattern detection | |
Huang et al. | From quantitative microscopy to automated image understanding | |
Sertel et al. | Computer-aided prognosis of neuroblastoma on whole-slide images: Classification of stromal development | |
JP4688954B2 (ja) | 特徴量選択方法、特徴量選択装置、画像分類方法、画像分類装置、コンピュータプログラム、及び記録媒体 | |
CN102687007B (zh) | 利用分层标准化切割的高处理量生物标志物分割 | |
US9646376B2 (en) | System and method for reviewing and analyzing cytological specimens | |
US8306284B2 (en) | Manually-assisted automated indexing of images using facial recognition | |
JP5022174B2 (ja) | 欠陥分類方法及びその装置 | |
CN109829467A (zh) | 图像标注方法、电子装置及非暂态电脑可读取储存媒体 | |
US20020159642A1 (en) | Feature selection and feature set construction | |
JP2012181209A (ja) | 欠陥分類方法及びその装置 | |
WO2001078009A2 (en) | Quantitative analysis of biological images | |
JPH08315144A (ja) | パターン分類装置及びそのパターン分類方法 | |
Sertel et al. | Computer-aided prognosis of neuroblastoma: classification of stromal development on whole-slide images | |
WO2020208955A1 (ja) | 情報処理装置、情報処理装置の制御方法及びプログラム | |
Rifkin | Identifying fluorescently labeled single molecules in image stacks using machine learning | |
Hu et al. | Application of temporal texture features to automated analysis of protein subcellular locations in time series fluorescence microscope images | |
Zanotelli et al. | A flexible image segmentation pipeline for heterogeneous multiplexed tissue images based on pixel classification | |
SOUAHI | Analytic study of the preprocessing methods impact on historical document analysis and classification | |
CN117635629A (zh) | 利用部分注释图像训练实例分割算法 | |
CN116229435A (zh) | 一种基于多尺度特征融合的溺死硅藻检测方法 | |
Blunsden et al. | Investigating the effects of scale in MRF texture classification |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004549566 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2002780044 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11089223 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 2002780044 Country of ref document: EP |