WO2023026742A1 - Dye image acquisition method, dye image acquisition device, and dye image acquisition program - Google Patents
- Publication number
- WO2023026742A1 (PCT/JP2022/028660)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/62—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
- G01N21/63—Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
- G01N21/64—Fluorescence; Phosphorescence
- G01N21/645—Specially adapted constructive features of fluorimeters
- G01N21/6456—Spatial resolved fluorescence measurements; Imaging
- G01N2021/6417—Spectrofluorimetric devices
- G01N2021/6419—Excitation at two or more wavelengths
- G01N2021/6421—Measuring at two or more wavelengths
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G01N2201/00—Features of devices classified in G01N21/00
- G01N2201/12—Circuits of general importance; Signal processing
- G01N2201/129—Using chemometrical methods
Definitions
- One aspect of the embodiments relates to a dye image acquisition method, a dye image acquisition device, and a dye image acquisition program.
- NMF: Non-negative Matrix Factorization
- Non-Patent Document 2 discloses, as a method of unmixing fluorescence images, clustering the fluorescence images, extracting the maximum fluorescence intensity in each clustered pixel group, and generating separated images using those maximum values.
- An object of one aspect of the embodiments is to provide a dye image acquisition method, a dye image acquisition device, and a dye image acquisition program.
- According to one aspect, a dye image acquisition method irradiates a sample with excitation light of each of C (C is an integer of 2 or more) wavelength distributions, and generates a dye image showing the distribution of dyes in the sample based on C fluorescence images, each composed of N (N is an integer of 2 or more) pixels.
- According to another aspect, a dye image acquisition device irradiates the sample with excitation light of each of C (C is an integer of 2 or more) wavelength distributions, acquires C fluorescence images each composed of N (N is an integer of 2 or more) pixels, and includes an image processing device that generates a dye image showing the distribution of the dye in the sample.
- Based on the intensity value of each pixel of the C fluorescence images, the N pixels are clustered into L pixel groups (L is an integer of 2 or more and N−1 or less), and L cluster matrices are generated in which the C fluorescence images are arranged for each clustered pixel group. For each of the L cluster matrices, statistical values of the intensity values of the pixel group constituting the C fluorescence images are calculated; using these statistical values, the C fluorescence images are unmixed to generate K dye images showing the distribution of each of K dyes (K is an integer of 2 or more and C or less).
- According to a further aspect, a dye image acquisition program generates a dye image showing the distribution of the dye in the sample based on C fluorescence images, each composed of N (N is an integer of 2 or more) pixels, acquired by irradiating the sample with excitation light of each of C (C is an integer of 2 or more) wavelength distributions. The program causes a computer to function as: a clustering unit that clusters the N pixels into L pixel groups (L is an integer of 2 or more and N−1 or less) based on the intensity value of each pixel of the C fluorescence images, and generates L cluster matrices in which the C fluorescence images are arranged for each clustered pixel group; a calculation unit that calculates statistical values of the intensity values of the pixel groups constituting the C fluorescence images for each of the L cluster matrices; and an image generation unit that unmixes the C fluorescence images using the statistical values for each of the L cluster matrices and generates K dye images showing the distribution of each of K dyes (K is an integer of 2 or more and C or less).
- According to these aspects, C fluorescence images of the sample are captured using excitation light with different wavelength distributions, the N pixels of these C fluorescence images are clustered into L pixel groups based on the intensity value of each pixel, and L cluster matrices are generated in which the C fluorescence images are arranged for each of the L pixel groups. Statistical values of the intensity values of the pixel groups constituting the C fluorescence images are then calculated for each of the L cluster matrices, and the C fluorescence images are unmixed using these statistical values to produce K dye images.
- FIG. 1 is a schematic configuration diagram of a dye image acquisition system 1 according to an embodiment.
- FIG. 2 is a perspective view showing the configuration of the image acquisition device 3 of FIG. 1.
- FIG. 3 is a block diagram showing an example of a hardware configuration of the image processing device 5 of FIG. 1.
- FIG. 4 is a block diagram showing a functional configuration of the image processing device 5 of FIG. 1.
- FIG. 5 is a diagram showing an image of a group of pixels clustered by the first clustering function of the clustering unit 203 of FIG. 4.
- FIG. 6 is a graph showing wavelength characteristics of the excitation-light absorptance of a plurality of dyes contained in the sample S.
- FIG. 7 is a graph showing the distribution of centroid fluorescence wavelengths identified by the clustering unit 203 of FIG. 4.
- FIG. 8 is a diagram showing an image of a group of pixels clustered by the second clustering function of the clustering unit 203 of FIG. 4.
- FIG. 9 is a diagram showing an image of the matrix data Y' and the dye matrix data X' regenerated by the statistical value calculation unit 204 of FIG. 4.
- FIG. 10 is a flow chart showing the procedure of a dye image acquisition method according to an embodiment.
- FIG. 11 is a diagram showing an example of a dye image generated by the dye image acquisition system 1 according to the embodiment.
- FIG. 1 is a schematic configuration diagram of a dye image acquisition system 1, which is a dye image acquisition device according to an embodiment.
- The dye image acquisition system 1 is a device for generating a dye image that identifies the distribution of dyes in a sample to be observed, such as biological tissue.
- the image generated by the dye image acquisition system 1 is used for purposes such as drug development and treatment study through analysis of the image. Therefore, the dye image acquisition system 1 is required to generate an image capable of quantitatively identifying the distribution of many substances (dyes) contained in the sample with high throughput.
- The dye image acquisition system 1 includes an image acquisition device 3 that irradiates a sample S with excitation light and acquires an image of the fluorescence generated accordingly, and an image processing device 5 that performs data processing on the image acquired by the image acquisition device 3.
- The image acquisition device 3 and the image processing device 5 may be configured so that image data can be transmitted and received between them using wired or wireless communication, or so that image data can be input and output via a recording medium.
- FIG. 2 is a perspective view showing the configuration of the image acquisition device 3 of FIG.
- the optical path of excitation light is indicated by a dotted line with arrows
- the optical path of fluorescence is indicated by a solid line with arrows.
- The image acquisition device 3 includes an excitation light source 7, a light source side filter set 9a, a dichroic mirror 11, a camera side filter set 9b, a wavelength information acquisition optical system 13, a first camera 15a, and a second camera 15b.
- The excitation light source 7 is a light source that can switch among and emit excitation light in a plurality of wavelength bands (wavelength distributions), for example a light source combined with a wavelength-selective optical element.
- the light source side filter set 9a is a multi-bandpass filter that is provided on the optical path of the excitation light from the excitation light source 7 and has the property of transmitting light in a plurality of predetermined wavelength bands.
- the transmission wavelength band of this light source side filter set 9a is set according to a plurality of wavelength bands of excitation light that can be used.
- the dichroic mirror 11 is provided between the light source side filter set 9a and the sample S, and is an optical member having the property of reflecting the excitation light toward the sample S and transmitting the fluorescence emitted from the sample S accordingly.
- the camera-side filter set 9b is a multi-bandpass filter that is provided on the optical path of fluorescence transmitted by the dichroic mirror 11 and has the property of transmitting light in a plurality of predetermined wavelength bands.
- the transmission wavelength band of this camera-side filter set 9b is set according to the wavelength band of fluorescence generated in the dye that may be contained in the sample S to be observed.
- the wavelength information acquisition optical system 13 is provided on the optical path of fluorescence transmitted by the camera-side filter set 9b, and is an optical system for acquiring fluorescence wavelength information. That is, the wavelength information acquisition optical system 13 separates the fluorescence from the sample S into two optical paths with different wavelength characteristics.
- For the wavelength information acquisition optical system 13, a dichroic mirror whose transmittance increases linearly as the wavelength increases is used.
- The wavelength information acquisition optical system 13 using such a dichroic mirror separates the fluorescence into components with different wavelength characteristics: it reflects part of the fluorescence with a reflectance that decreases as the wavelength increases, and transmits part of the fluorescence with a transmittance that increases as the wavelength increases.
- the wavelength information acquisition optical system 13 is provided with a support mechanism (not shown) that detachably supports the wavelength information acquisition optical system 13 on the optical path of fluorescence from the camera-side filter set 9b.
- The first camera 15a is an imaging device that captures a two-dimensional image composed of N pixels (N is an integer of 2 or more, for example 2048 × 2048). When the wavelength information acquisition optical system 13 is supported on the optical path of the fluorescence, the first camera 15a captures one component of the fluorescence separated by the wavelength information acquisition optical system 13 and acquires one separated fluorescence image. When the wavelength information acquisition optical system 13 is removed from the optical path of the fluorescence, the first camera 15a captures the fluorescence transmitted through the camera-side filter set 9b and acquires a fluorescence image. The first camera 15a outputs the acquired separated fluorescence image or fluorescence image to the image processing device 5 using communication or via a recording medium.
- The second camera 15b is an imaging device that captures a two-dimensional image composed of the same N pixels as the first camera 15a. When the wavelength information acquisition optical system 13 is supported on the optical path of the fluorescence, it captures the other component of the fluorescence separated by the wavelength information acquisition optical system 13 and acquires the other separated fluorescence image.
- the second camera 15b outputs the acquired separated fluorescence image to the image processing device 5 using communication or via a recording medium.
- Alternatively, the fluorescence image may be obtained by having the image processing device 5 add the one separated fluorescence image acquired by the first camera 15a and the other separated fluorescence image acquired by the second camera 15b. In that case, the support mechanism in the wavelength information acquisition optical system 13 may be omitted.
- FIG. 3 is a block diagram showing an example of the hardware configuration of the image processing device 5
- FIG. 4 is a block diagram showing the functional configuration of the image processing device 5.
- The image processing device 5 is physically a computer or the like that includes a CPU (Central Processing Unit) 101 as a processor, a RAM (Random Access Memory) 102 and a ROM (Read Only Memory) 103 as recording media, a communication module 104, and an input/output module 106, which are electrically connected to each other.
- the image processing apparatus 5 may include a display, keyboard, mouse, touch panel display, etc. as input/output devices, and may include a data recording device such as a hard disk drive, a semiconductor memory, and the like.
- the image processing device 5 may be configured by a plurality of computers.
- The image processing device 5 includes, as functional components, an image acquisition unit 201, a wavelength information acquisition unit 202, a clustering unit 203, a statistical value calculation unit 204, and an image generation unit 205.
- By executing a dye image acquisition program (computer program), the CPU 101 of the image processing device 5 causes each functional unit shown in FIG. 4 to operate and sequentially executes processing corresponding to the dye image acquisition method described later.
- The CPU 101 may be a single piece of hardware, or may be implemented as a soft processor in programmable logic such as an FPGA.
- The RAM and ROM may be stand-alone hardware, or may be built into programmable logic such as an FPGA.
- Various data necessary for executing this computer program, and various data generated by its execution, are stored in a built-in memory such as the ROM 103 or RAM 102, or in a storage medium such as a hard disk drive.
- The image acquisition unit 201 acquires, from the image acquisition device 3, C fluorescence images (C is an integer of 2 or more) specified in advance for the sample S. These C fluorescence images are each composed of N pixels, generated by irradiating the sample S with excitation light in each of C wavelength bands, with the wavelength information acquisition optical system 13 removed from the optical path of the fluorescence, and imaging the resulting fluorescence. The number C of fluorescence images to be acquired (the number C of excitation wavelength bands with which the sample S is irradiated) is specified in advance to be equal to or greater than the maximum number of dyes that the sample S can contain.
- The intensity of the excitation light is the same when the image acquisition device 3 captures the C fluorescence images, or the image acquisition unit 201 relatively corrects the luminance values of the C fluorescence images so that the excitation light intensities can be regarded as equivalent.
- The image acquisition unit 201 also acquires, from the image acquisition device 3, C sets of separated fluorescence images specified in advance for the sample S. These C sets of separated fluorescence images are obtained by irradiating the sample S with excitation light in each of the C wavelength bands, with the wavelength information acquisition optical system 13 supported on the optical path of the fluorescence, and separately imaging the two separated fluorescence components; each set consists of two separated fluorescence images composed of N pixels.
- For each of the C sets of separated fluorescence images, the wavelength information acquisition unit 202 calculates the ratio of the fluorescence intensity (luminance value) of one separated fluorescence image to that of the other separated fluorescence image, and from this ratio estimates a centroid fluorescence wavelength indicating the centroid of the fluorescence wavelength distribution. Specifically, for a pixel group clustered by the clustering unit 203 described below, the wavelength information acquisition unit 202 calculates the average fluorescence intensity of one separated fluorescence image and that of the other, and takes the ratio of these averages. The wavelength information acquisition unit 202 acquires the estimated centroid fluorescence wavelength as wavelength information about the fluorescence wavelength.
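The centroid estimate above can be sketched as follows, assuming (as a hypothetical model, not the patented calibration) a dichroic whose transmittance rises linearly from 0 at `lam_min` to 1 at `lam_max`, so the transmitted fraction of the total intensity maps linearly to a wavelength; the band limits are made-up values:

```python
import numpy as np

def centroid_wavelength(i_trans, i_refl, lam_min=500.0, lam_max=700.0):
    """Estimate the centroid fluorescence wavelength of a pixel group from
    the two separated fluorescence images, assuming the dichroic's
    transmittance rises linearly from 0 at lam_min to 1 at lam_max
    (a hypothetical model with made-up band limits)."""
    i_trans = np.asarray(i_trans, dtype=float)
    i_refl = np.asarray(i_refl, dtype=float)
    # Ratio of mean transmitted intensity to mean total intensity.
    ratio = i_trans.mean() / (i_trans.mean() + i_refl.mean())
    return lam_min + ratio * (lam_max - lam_min)

# A pixel group whose light splits 3:1 toward the transmitted arm maps
# to a centroid 75% of the way through the band: 650 nm here.
print(centroid_wavelength([3.0, 3.0], [1.0, 1.0]))  # -> 650.0
```

The key point is that the ratio of the two separated intensities alone, without a spectrometer, pins down a single wavelength statistic per pixel group.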
- The clustering unit 203 performs clustering of the N pixels constituting the C fluorescence images, based on the C fluorescence images acquired by the image acquisition unit 201 and the wavelength information acquired by the wavelength information acquisition unit 202. Prior to the clustering process, the clustering unit 203 generates matrix data Y in which the fluorescence intensity values of the N pixels constituting each of the C fluorescence images are arranged one-dimensionally in parallel.
- The clustering unit 203 has a function (first clustering function) of clustering the N pixels into C pixel groups based on the distribution of fluorescence intensity for each of the C excitation wavelength bands. Specifically, the clustering unit 203 clusters pixels whose fluorescence intensity is highest in the same excitation wavelength band into the same pixel group.
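The first clustering function amounts to an argmax over excitation bands. A minimal NumPy sketch with illustrative data (not the patented implementation):

```python
import numpy as np

# Matrix data Y: row c holds the fluorescence intensities of all N pixels
# under excitation band c (illustrative values, C=3 bands, N=6 pixels).
Y = np.array([
    [9.0, 8.5, 0.3, 0.2, 0.1, 0.4],   # band EW1
    [0.5, 0.7, 7.2, 6.9, 0.2, 0.3],   # band EW2
    [0.2, 0.1, 0.4, 0.5, 5.1, 4.8],   # band EW3
])

# First clustering: assign each pixel to the band of its highest intensity.
labels = np.argmax(Y, axis=0)   # shape (N,), values in 0..C-1
print(labels)                   # -> [0 0 1 1 2 2]
```

Each distinct label value corresponds to one of the pixel groups PGr1 to PGrC described below.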
- FIG. 5 shows an image of a group of pixels clustered by the first clustering function of the clustering unit 203, and FIG. 6 shows the wavelength characteristics of the excitation-light absorptance of the dyes. As shown in FIG. 5, the sample S contains three types of dyes, namely dye C1, dye C2, and dye C3, and six fluorescence images are obtained using excitation light in six different wavelength bands.
- the clustering unit 203 clusters N pixels included in the six fluorescence images GC 1 to GC 6 into six pixel groups PGr 1 to PGr 6 .
- In general, different types of dyes have different wavelength characteristics of absorptance.
- The three types of dyes C1, C2, and C3 likewise have absorptance wavelength characteristics CW1, CW2, and CW3 with different peak wavelengths. Therefore, in each of the six excitation wavelength bands EW1, EW2, EW3, EW4, EW5, and EW6, one of the three dyes C1, C2, and C3 has the highest absorptance. For example, dye C1 has the highest excitation-light absorptance in wavelength band EW1 and in wavelength band EW2, and another dye has the highest excitation-light absorptance in wavelength band EW3.
- By the first clustering function, the clustering unit 203 can cluster the N pixels into pixel groups, each within a range in which the same dye is distributed.
- However, the six pixel groups PGr1 to PGr6 clustered by the first clustering function do not correspond one-to-one to the three types of dyes C1, C2, and C3.
- The clustering unit 203 also has a function (second clustering function) of further clustering the C pixel groups clustered by the first clustering function into L pixel groups (L is an integer of 2 or more and N−1 or less) based on the wavelength information.
- the number L of pixel groups to be clustered is set in advance as a parameter stored in the image processing device 5 so as to correspond to the number of types of dyes that may exist in the sample S.
- Specifically, the clustering unit 203 identifies the centroid fluorescence wavelength estimated for the excitation wavelength band corresponding to each of the C pixel groups clustered by the first clustering function. That is, the clustering unit 203 acquires from the wavelength information acquisition unit 202 the wavelength information for each group of pixels clustered as having the highest absorptance in a given wavelength band, and identifies the centroid fluorescence wavelength based on the acquired wavelength information. The wavelength information acquisition unit 202 derives this wavelength information using the average fluorescence intensity, within that pixel group, of the set of separated fluorescence images obtained for the corresponding wavelength band.
- Then, the clustering unit 203 clusters the C pixel groups into L pixel groups by evaluating the distance (closeness of values) between the centroid fluorescence wavelengths identified for the C pixel groups. The clustering unit 203 then divides the matrix data Y, in which the fluorescence intensity values of the N pixels of the C fluorescence images are arranged one-dimensionally in parallel, and regenerates it as cluster matrices, one for each of the L pixel groups.
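The second clustering step can be sketched by merging centroid wavelengths at their largest gaps; this gap-splitting rule is a hypothetical stand-in for the distance-based grouping the text describes, not the patented criterion:

```python
import numpy as np

def merge_by_centroid(centroids, L):
    """Merge C first-stage pixel groups into L groups by splitting the
    sorted centroid wavelengths at the L-1 largest gaps (a hypothetical
    stand-in for the distance-based grouping in the text; assumes L >= 2)."""
    centroids = np.asarray(centroids, dtype=float)
    order = np.argsort(centroids)
    gaps = np.diff(centroids[order])                   # C-1 adjacent gaps
    cuts = set(np.argsort(gaps)[-(L - 1):].tolist())   # largest-gap positions
    labels = np.zeros(len(centroids), dtype=int)
    group = 0
    for rank, idx in enumerate(order):
        if rank > 0 and (rank - 1) in cuts:
            group += 1    # a large gap separates two wavelength clusters
        labels[idx] = group
    return labels

# Six first-stage groups PGr1..PGr6 with illustrative centroid wavelengths;
# the close pairs merge into L=3 groups, mirroring FIG. 8.
print(merge_by_centroid([520, 525, 580, 585, 640, 645], L=3))  # -> [0 0 1 1 2 2]
```

Groups sharing a label here play the role of the merged pixel groups PGr01, PGr02, and PGr03.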
- FIG. 7 shows the distribution of centroid fluorescence wavelengths specified by the clustering unit 203
- FIG. 8 shows an image of a group of pixels clustered by the second clustering function of the clustering unit 203.
- As shown in FIGS. 7 and 8, centroid fluorescence wavelengths FW1 to FW6 are identified for each of the six pixel groups PGr1 to PGr6 clustered by the first clustering function. Pixel groups PGr1 and PGr2, whose centroid fluorescence wavelengths are close to each other, are clustered into a new pixel group PGr01; similarly, pixel groups PGr3 and PGr4 are clustered into pixel group PGr02, and pixel groups PGr5 and PGr6 are clustered into pixel group PGr03.
- As a result, the N pixels of the fluorescence images can be divided into L pixel groups corresponding to the distribution of the dyes assumed to be contained in the sample S.
- The division number L by the second clustering function is set to the number C of fluorescence images (the number C of excitation wavelength bands) or less.
- Next, the statistical value calculation unit 204 obtains, from the C fluorescence images, a mixing matrix A for generating K dye images showing the distributions of K dyes (K is an integer of 2 or more and C or less).
- The relationship between the matrix data Y, which is the observed value matrix, and the dye matrix data X, in which the K dye images are arranged one-dimensionally in parallel for each pixel, is expressed using the mixing matrix A as Y = AX, where Y is matrix data of C rows and N columns, A is matrix data of C rows and K columns, and X is matrix data of K rows and N columns.
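The stated dimensions can be checked with a small NumPy sketch (random illustrative values; the sizes C, K, N are made up):

```python
import numpy as np

C, K, N = 4, 3, 5            # excitation bands, dyes, pixels (illustrative)
A = np.random.rand(C, K)     # mixing matrix: band response of each dye
X = np.random.rand(K, N)     # dye matrix: abundance of each dye per pixel
Y = A @ X                    # observed value matrix of fluorescence images

print(Y.shape)               # -> (4, 5)
```

Each column of Y is one pixel's fluorescence response across all C bands, written as a mixture of the K dye signatures in the columns of A.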
- The statistical value calculation unit 204 regenerates matrix data Y' by compressing the matrix data Y generated by the clustering unit 203 in units of the pixel groups clustered by the clustering unit 203. Specifically, for each row of the matrix data Y, the statistical value calculation unit 204 calculates a statistical value for each pixel group of the clustered cluster matrix and compresses the pixel group in that row into one element having the calculated statistical value. As a result, the statistical value calculation unit 204 regenerates matrix data Y' of C rows and L columns. As the statistical value, the statistical value calculation unit 204 may calculate an average value based on the integrated fluorescence intensity, the mode of the fluorescence intensity, or the median of the fluorescence intensity.
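The compression of Y (C × N) into Y' (C × L) can be sketched as follows (illustrative values; the mean statistic is shown, though the text also allows the mode or median):

```python
import numpy as np

def compress(Y, labels, L, stat=np.mean):
    """Compress the C x N matrix Y into C x L matrix data Y' by replacing
    each clustered pixel group with a single statistic per row (mean
    shown; the text also allows the mode or median)."""
    Yp = np.empty((Y.shape[0], L))
    for g in range(L):
        Yp[:, g] = stat(Y[:, labels == g], axis=1)
    return Yp

Y = np.array([[4.0, 6.0, 1.0, 3.0],
              [2.0, 2.0, 8.0, 8.0]])   # C=2 bands, N=4 pixels (illustrative)
labels = np.array([0, 0, 1, 1])        # two clustered pixel groups
print(compress(Y, labels, L=2).tolist())  # -> [[5.0, 2.0], [2.0, 8.0]]
```

Working with the L-column Y' instead of the N-column Y is what keeps the subsequent mixing-matrix estimation cheap, since L is far smaller than the pixel count N.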
- A mixing matrix A is then derived based on the matrix data Y', utilizing the property that the same relationship holds for the compressed data, i.e., Y' = AX' with the correspondingly compressed dye matrix data X'.
- FIG. 9 shows an image of the matrix data Y' regenerated by the statistical value calculation unit 204 and the corresponding dye matrix data X'.
- One square shown in FIG. 9 represents one element of the matrix data.
- As shown in FIG. 9, the dye matrix data X and the matrix data Y divided into the three pixel groups PGr01 to PGr03 are compressed, using the statistical values of the pixel groups PGr01 to PGr03 as representative values, into the three-column dye matrix data X' and matrix data Y'.
- As one method, the statistical value calculation unit 204 derives the mixing matrix A based on the matrix data Y' as follows. That is, the statistical value calculation unit 204 sets an initial value for the mixing matrix A, calculates a loss function (loss value) Los while sequentially changing the values of the mixing matrix A, and derives a mixing matrix A that reduces the value of the loss function Los. A regularization term such as an L1 norm term may be added to the loss function.
- In the loss function, j is a parameter indicating the row position of the matrix data (corresponding to the excitation wavelength band); the matrix subscript 1j indicates the j-th row of the first cluster matrix, the subscript 2j the j-th row of the second cluster matrix, and the subscript 3j the j-th row of the third cluster matrix. The parameters a, b, and c indicate the average values of the statistical values in the respective columns of the matrix data Y'.
- In other words, the statistical value calculation unit 204 calculates a loss function for each of the L cluster matrices divided by the clustering unit 203, referring to the statistical values of the C rows of the matrix data Y'; computes the overall loss function Los based on the sum of these L loss functions; and obtains the mixing matrix A based on the loss function Los.
- Here, the statistical value calculation unit 204 corrects the loss function calculated for each of the L cluster matrices by dividing it by the average value a, b, or c of the statistical values of the matrix data Y', and obtains the loss function Los by summing the corrected loss functions.
- Alternatively, the statistical value calculation unit 204 may calculate the loss function for each of the L cluster matrices by dividing (correcting) the row component of the difference value Y' − AX' for each excitation wavelength band by C statistical values corresponding to the respective excitation wavelength bands.
- As another method, the statistical value calculation unit 204 derives the mixing matrix A and the dye matrix data X' based on the matrix data Y' as follows. That is, the statistical value calculation unit 204 sets initial values for the mixing matrix A and the dye matrix data X', calculates the loss function (loss value) Los while sequentially changing the values of the mixing matrix A and the dye matrix data X', and derives a mixing matrix A and dye matrix data X' that reduce the value of the loss function Los. A regularization term such as an L1 norm term may be added to the loss function, and the calculation may be performed under the constraint that the mixing matrix A and the dye matrix data X' are non-negative.
- j is a parameter indicating the row position of the matrix data (corresponding to the wavelength band of the excitation light)
- i is a parameter indicating the column position of the matrix data (corresponding to the i-th cluster).
- wij represents the weight of each element of the matrix data and may be calculated from the value of each element or from its standard deviation. It is also possible to set all wij to the same value so that the weight of each element is not taken into account.
- when the average values of the statistical values of the columns of the matrix data Y′ in the above formula are denoted a, b, c, …, and the weights are replaced by w1j = 1/a, w2j = 1/b, and w3j = 1/c, the formula becomes identical to the formula for the loss function Los shown above.
- the statistical value calculation unit 204 thus calculates a loss function Losi by referring to the C statistical values of the matrix data Y′ for each of the L cluster matrices divided by the clustering unit 203, calculates the loss function Los based on the sum of the L loss functions Losi, and obtains the mixing matrix A based on the loss function Los. Note that the statistical value calculation unit 204 may also calculate the loss function Losi for each of the L cluster matrices by dividing the row components of the difference value Y′ − AX′ for the respective wavelength bands of the excitation light by the C statistical values corresponding to those wavelength bands, thereby correcting them.
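The derivation above can be sketched numerically. The following is a minimal illustration, not the embodiment's actual implementation: it factors a compressed matrix Y′ (C × L) into non-negative A (C × K) and X′ (K × L) using weighted multiplicative updates, with per-element weights wij playing the role of the correction that divides each cluster's loss by a column statistic. The function name, the default weighting, and the choice of multiplicative updates are assumptions made for illustration.

```python
import numpy as np

def weighted_nmf(Yp, K, w=None, n_iter=200, seed=0):
    """Sketch of the NMF step: factor Y' (C x L) into a mixing matrix A
    (C x K) and dye matrix data X' (K x L), minimising the weighted loss
    sum_ij w_ij * (Y'_ij - (AX')_ij)^2 under non-negativity.
    Multiplicative updates keep A and X' non-negative as long as the
    initial values are non-negative."""
    rng = np.random.default_rng(seed)
    C, L = Yp.shape
    if w is None:
        # default weight: inverse of each column's mean statistic, so that
        # clusters with small fluorescence intensities are not dominated
        w = np.tile(1.0 / Yp.mean(axis=0, keepdims=True), (C, 1))
    A = rng.random((C, K)) + 1e-3
    X = rng.random((K, L)) + 1e-3
    eps = 1e-12
    for _ in range(n_iter):
        WY = w * Yp
        A *= (WY @ X.T) / ((w * (A @ X)) @ X.T + eps)   # update A
        X *= (A.T @ WY) / (A.T @ (w * (A @ X)) + eps)   # update X'
    loss = np.sum(w * (Yp - A @ X) ** 2)
    return A, X, loss
```

With `n_iter=0` the function returns the loss at the random initial point, which is convenient for checking that the iterations actually decrease the loss.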
- the image generation unit 205 unmixes the C fluorescence images obtained from the sample S under observation using the mixing matrix A derived by the statistical value calculation unit 204, thereby acquiring K dye images. Specifically, the image generation unit 205 calculates the dye matrix data X by applying the inverse matrix A⁻¹ of the mixing matrix A to the matrix data Y generated by the clustering unit 203 from the C fluorescence images. The image generation unit 205 then reconstructs K dye images from the dye matrix data X and outputs the reconstructed K dye images.
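The unmixing step itself reduces to a single matrix product. A minimal sketch follows; the function names are illustrative, and the text's A⁻¹ is realised here with the Moore–Penrose pseudo-inverse, which coincides with the ordinary inverse when C = K and also handles the rectangular case C > K:

```python
import numpy as np

def unmix(Y, A):
    """Apply the (pseudo-)inverse of the mixing matrix A (C x K) to the
    full-resolution matrix data Y (C x N), recovering the dye matrix
    data X (K x N)."""
    return np.linalg.pinv(A) @ Y

def to_dye_images(X, height, width):
    """Reshape each of the K rows of X back into a 2-D dye image."""
    K = X.shape[0]
    return X.reshape(K, height, width)
```

When A has full column rank, `pinv(A) @ A` is the identity, so exact mixtures are recovered exactly.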
- the output destination at this time may be an output device of the image processing device 5, such as a display or a touch panel display, or an external device connected to the image processing device 5 so as to be capable of data communication.
- FIG. 10 is a flowchart showing the procedure of the observation processing performed by the dye image acquisition system 1. As shown in FIG. 10, first, the image acquisition unit 201 of the image processing device 5 acquires C fluorescence images and C sets of separated fluorescence images of the sample S (step S1; image acquisition step).
- next, the clustering unit 203 of the image processing device 5 generates matrix data Y in which the N pixels of the C fluorescence images are arranged side by side (step S2).
- next, the clustering unit 203 of the image processing device 5 executes the first clustering function, and the N pixels of the fluorescence images are clustered into C pixel groups using the distribution information of the fluorescence intensity for each excitation wavelength band (step S3; clustering step).
- further, the wavelength information acquisition unit 202 of the image processing device 5 acquires wavelength information indicating the centroid fluorescence wavelength by referring to the set of separated fluorescence images for each of the C pixel groups (step S4; wavelength information acquisition step).
- next, the clustering unit 203 of the image processing device 5 executes the second clustering function, and the pixels of the C pixel groups are further clustered into L pixel groups based on the distances between the centroid fluorescence wavelengths specified from the wavelength information for each of the C pixel groups (step S5; clustering step).
- further, the statistical value calculation unit 204 of the image processing device 5 calculates the statistical values of the L pixel groups, thereby regenerating the matrix data Y′ from the matrix data Y generated by the clustering unit 203 (step S6; calculation step). Then, the statistical value calculation unit 204 of the image processing device 5 derives the mixing matrix A based on the matrix data Y′ (step S7; image generation step). Further, the image generation unit 205 of the image processing device 5 unmixes the matrix data Y generated from the C fluorescence images of the sample S using the mixing matrix A, thereby reconstructing K dye images (step S8; image generation step). Finally, the image generation unit 205 of the image processing device 5 outputs the reconstructed K dye images (step S9). The observation processing for the sample S is thus completed.
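Steps S2, S3, and S5 above can be sketched as follows. This is an illustrative simplification, assuming plain k-means on the C-dimensional intensity vectors; the embodiment's two-stage clustering, which additionally uses centroid-wavelength information per pixel group, is omitted, and the function names are assumptions:

```python
import numpy as np

def build_matrix_data(images):
    """Step S2 sketch: stack C fluorescence images (each H x W) into
    matrix data Y of shape (C, N), one row per excitation band, with the
    N = H*W pixels arranged in the same row-major order in every row."""
    return np.stack([img.ravel() for img in images], axis=0)

def kmeans_labels(Y, L, n_iter=50, seed=0):
    """Steps S3/S5, simplified: cluster the N pixel columns of Y into L
    groups by plain k-means on their C-dimensional intensity vectors."""
    rng = np.random.default_rng(seed)
    pts = Y.T  # (N, C): one intensity vector per pixel
    centers = pts[rng.choice(len(pts), size=L, replace=False)]
    for _ in range(n_iter):
        # assign each pixel to the nearest cluster center
        d = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned pixels
        for k in range(L):
            if np.any(labels == k):
                centers[k] = pts[labels == k].mean(axis=0)
    return labels
```

The labels produced here are what the later steps use to group the columns of Y into the L cluster matrices.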
- as described above, in the dye image acquisition system 1, C fluorescence images are acquired by imaging the fluorescence from the sample S using excitation light in different wavelength bands, the N pixels of these C fluorescence images are clustered into L pixel groups based on the fluorescence intensity of each pixel, and L cluster matrices are generated in which the C fluorescence images are arranged for each of the L pixel groups. Then, statistical values of the fluorescence intensities of the pixel groups constituting the C fluorescence images are calculated for each of the L cluster matrices, and the C fluorescence images are unmixed using the respective statistical values of the C fluorescence images to generate K dye images.
- N pixels are clustered based on the distribution information of the fluorescence intensity for each of the C excitation lights.
- the pixels can be clustered according to the fluorescence intensity of each excitation light, and unmixing is performed based on the clustering result, thereby improving the separation accuracy of the dye image.
- in addition, a wavelength information acquisition step of acquiring wavelength information about the fluorescence wavelength for each pixel group clustered by the first clustering function is further executed, and in the clustering step, the C pixel groups are further clustered into L pixel groups based on the wavelength information corresponding to each pixel group.
- thereby, the pixels can be clustered using the fluorescence wavelength corresponding to each pixel group of the fluorescence images, and the accuracy of dye image separation can be improved.
- furthermore, the fluorescence from the sample S generated using the excitation light in one of the C wavelength bands is separated into two components via the wavelength information acquisition optical system 13, a set of separated fluorescence images is obtained by imaging each of the two separated fluorescence components, and the wavelength information is obtained based on the set of separated fluorescence images.
- in this way, the fluorescence from the sample S generated by the irradiation of the excitation light is separated with different wavelength characteristics, and the wavelength information is acquired based on the set of separated fluorescence images obtained by imaging the separated fluorescence, so that the wavelength information can be analyzed accurately. As a result, the accuracy of dye image separation can be improved.
- conventionally, a plurality of fluorescence images have been acquired by sequentially irradiating the sample with excitation light in a plurality of wavelength bands while switching between a plurality of excitation light filters (band-pass filters) provided in the light source, and by capturing the fluorescence emitted from the sample while switching between a plurality of fluorescence filters (band-pass filters) accordingly. It is therefore necessary to switch the excitation light filter and to switch the fluorescence filter in accordance with it. As a result, the time required to switch the filters, together with the time required to adjust the position of the stage holding the sample S so that the irradiation range of the excitation light matches the switched filters, tends to lengthen the acquisition time of the fluorescence images.
- in contrast, in the present embodiment, multi-bandpass filters are used as the light-source-side filter set 9a and the camera-side filter set 9b. Since the wavelength band can be switched without physically exchanging the filters, the acquisition time of the fluorescence images can be greatly shortened. Note that, because the fluorescence images are captured through a multiband filter in the present embodiment, fluorescence of a plurality of fluorescence bands is mixed into each fluorescence image; even so, the unmixing described above makes it possible to separate the images into the dye images with high accuracy.
- in the present embodiment, the statistical value calculated when compressing the cluster matrices is based on the integrated value, the mode, or the median of the fluorescence intensities of the pixel group.
- unmixing can be performed based on the overall tendency of the fluorescence intensity of the L pixel groups of the fluorescence image in the cluster matrix, and the accuracy of the generated dye image can be improved.
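Compressing a cluster matrix with these statistics can be sketched as follows. This is a simplified illustration; the function names and the row-wise mode computed via `np.unique` (adequate for quantised intensities) are assumptions:

```python
import numpy as np

def _row_mode(row):
    """Most frequent value in a row; ties resolve to the smallest value."""
    vals, counts = np.unique(row, return_counts=True)
    return vals[counts.argmax()]

def compress_to_Yprime(Y, labels, stat="median"):
    """Compress matrix data Y (C x N) into Y' (C x L): each of the L pixel
    groups is replaced, per excitation band, by a single representative --
    the integrated value (sum), the mode, or the median."""
    L = int(labels.max()) + 1
    cols = []
    for k in range(L):
        grp = Y[:, labels == k]          # pixels belonging to cluster k
        if stat == "sum":
            cols.append(grp.sum(axis=1))
        elif stat == "mode":
            cols.append(np.array([_row_mode(r) for r in grp]))
        else:                             # default: median
            cols.append(np.median(grp, axis=1))
    return np.stack(cols, axis=1)
```

The median variant is robust to a few outlier pixels within a group.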
- in the present embodiment, the statistical values of the C fluorescence images for each of the L cluster matrices are used to obtain the mixing matrix A, and the C fluorescence images are unmixed using the mixing matrix A. Thereby, the mixing matrix A can be obtained based on the statistical values of the L pixel groups of the clustered fluorescence images, and unmixing can be performed using the mixing matrix A, so that the accuracy of the generated dye images can be improved.
- conventionally, the mixing matrix A is derived by calculation based on reference information (fluorescence spectra, absorption spectra, etc.) of each dye. In the present embodiment, the mixing matrix A can be estimated from the matrix data Y obtained from the fluorescence images even if such reference information is unknown.
- in the present embodiment, the NMF calculation method is used to calculate a loss function based on the statistical values of the C fluorescence images for each of the L cluster matrices, and the mixing matrix A is obtained based on the loss function Los, which is the sum of the L loss functions.
- thereby, the loss functions are calculated based on the statistical values of the L pixel groups of the clustered fluorescence images, the mixing matrix A is obtained based on the sum of these loss functions, and unmixing can be performed using the mixing matrix A. As a result, the accuracy of the generated dye images can be further improved.
- if the loss functions were simply summed, the separation accuracy of dyes with relatively large distribution areas would be emphasized, and the separation accuracy of dyes with relatively small distribution areas would tend to deteriorate; by correcting the loss functions, the separation accuracy of a plurality of dyes can be improved uniformly.
- in the present embodiment, the loss function for each of the L cluster matrices is corrected by dividing it by the coefficients a, b, and c based on the statistical values of the cluster matrix, and the mixing matrix A may be obtained based on the sum of the corrected loss functions.
- thereby, the loss functions are calculated based on the statistical values of the L pixel groups of the clustered fluorescence images, and each loss function is corrected based on the statistical values when the sum of the loss functions is obtained. As a result, the dye images can be generated with high accuracy.
- in the present embodiment, the calculation time for generating the dye images can be greatly reduced by performing clustering and unmixing using the matrix data Y′ based on the statistical values. For example, when a fluorescence image comprising 660 tiles of 2048 × 2048 pixels per tile was processed with the number C of excitation light bands set to 10, the calculation took about 17,000 seconds when the matrix data Y was used as it was for unmixing. In contrast, with the method of the present embodiment, the amount of data to be processed is greatly reduced, and the calculation time with the clustering number L set to 6 was reduced to about 0.06 seconds. Moreover, according to the method of the present embodiment, the accuracy of the dye images can be maintained even though unmixing is performed using statistical values.
- FIG. 11 shows an example of a dye image generated by the dye image acquisition system 1 according to this embodiment.
- as shown in FIG. 11, the distribution of rhodamine, which is a dye contained in the sample S, can be obtained with high accuracy. It was found that fluorescence images of comparable accuracy can be obtained even when compared with conventional fluorescence imaging techniques using band-pass filters.
- the number L of pixel groups to be clustered and the number K of dye images are set in advance as parameters according to the number of dyes contained in the sample S.
- the dye image acquisition system 1 of the present embodiment may also perform unmixing using, as they are, the C sets of separated fluorescence images obtained by the first camera 15a and the second camera 15b to generate the dye images. In this case, the number of fluorescence images to be unmixed is 2 × C.
- alternatively, the dye image acquisition system 1 may switch between M band-pass filters each transmitting one fluorescence band as the camera-side filter set 9b, and perform unmixing on the M × C fluorescence images obtained as a result.
- the dye image acquisition system 1 may also include a plurality of excitation light sources 7 for irradiating the sample S with excitation light in a plurality of wavelength bands at the same time, and the C fluorescence images may be acquired while irradiating excitation light having C types of wavelength distributions. Also in these cases, a dye image for each of a plurality of dyes can be obtained with high accuracy.
- when generating the matrix data Y, the image processing apparatus 5 may arrange the N pixels forming each image in the row direction according to a prescribed rule, or according to a random rule. However, the C rows of fluorescence image data forming one set of matrix data Y must be arranged with the N pixels in the same order. Even when the matrix data Y is generated by a random arrangement, the same matrix data Y′ can be regenerated by clustering.
- the image processing apparatus 5 may also generate the matrix data Y before clustering after excluding background pixels (pixels containing no dye) included in the fluorescence images.
- as methods for clustering pixels in the image processing device 5, machine learning methods such as the K-means method, decision trees, support vector machines, KNN (K nearest neighbors), self-organizing maps, spectral clustering, Gaussian mixture models, DBSCAN, Affinity Propagation, MeanShift, Ward, Agglomerative Clustering, OPTICS, and BIRCH, or methods using deep learning, may be adopted.
- the matrix data Y may also be preprocessed before clustering is applied. For example, the C-dimensional data of each pixel may be dimensionally reduced by Phasor Analysis, Principal Component Analysis, Singular Value Decomposition, Independent Component Analysis, Linear Discriminant Analysis, t-SNE, UMAP, or other machine learning methods.
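As one example of such preprocessing, principal component analysis can be realised with a singular value decomposition of the mean-centred pixel vectors. The following is a minimal sketch (illustrative only; the other listed methods such as Phasor Analysis, t-SNE, and UMAP would require their own implementations):

```python
import numpy as np

def pca_reduce(Y, d):
    """Preprocessing sketch: reduce the C-dimensional intensity vector of
    each pixel to d dimensions by PCA, computed via the SVD of the
    mean-centred data, before clustering is applied."""
    pts = Y.T                          # (N, C): one C-dim vector per pixel
    centred = pts - pts.mean(axis=0)   # centre each intensity channel
    U, S, Vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ Vt[:d].T          # (N, d) scores on the top-d components
```

When the intensity vectors actually lie in a d-dimensional subspace, the projection loses no variance, so the subsequent clustering is unaffected.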
- as the wavelength information acquisition optical system 13, the image acquisition device 3 of the present embodiment includes an inclined dichroic mirror having a wavelength characteristic in which the transmittance changes linearly with respect to the wavelength, as described above; alternatively, a single-band dichroic mirror that transmits one wavelength band or a multi-band dichroic mirror that transmits multiple wavelength bands may be employed.
- the multi-bandpass filter described above may be employed as the camera-side filter set 9b, or a single bandpass filter that transmits one wavelength band may be employed.
- in that case, the wavelength information acquisition unit 202 of the image processing device 5 calculates, using the following formula, the wavelength information WLC indicating the centroid fluorescence wavelength, where x1 is the fluorescence intensity of one separated fluorescence image and x2 is the fluorescence intensity of the other separated fluorescence image.
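The per-pixel wavelength information can be sketched as follows. WLratio = x2/x1 is the formula the text gives for the single-bandpass configuration; the centroid-wavelength formula WLC for the inclined dichroic mirror is not reproduced in the text, so the normalised ratio x2/(x1 + x2) below is only an illustrative assumption for a transmittance that varies linearly with wavelength:

```python
import numpy as np

def wavelength_info(x1, x2, eps=1e-12):
    """Per-pixel wavelength information from a pair of separated
    fluorescence images x1 and x2 (same shape).
    - wl_ratio: x2 / x1, the formula given in the text.
    - wl_norm:  x2 / (x1 + x2), a hypothetical WLC-style value that stays
      in [0, 1]; NOT the text's actual WLC formula."""
    x1 = np.asarray(x1, dtype=float)
    x2 = np.asarray(x2, dtype=float)
    wl_ratio = x2 / (x1 + eps)        # formula from the text
    wl_norm = x2 / (x1 + x2 + eps)    # illustrative assumption only
    return wl_ratio, wl_norm
```

The small `eps` guards against division by zero in background pixels.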
- the wavelength information acquisition optical system 13 is not limited to a dichroic mirror; a filter set having similar wavelength characteristics may be used, or a beam splitter (a polarization beam splitter, etc.) that splits the fluorescence may be used. Alternatively, a set of separated fluorescence images may be obtained by capturing multiple images with one camera while switching between filters having different wavelength characteristics. Alternatively, the fluorescence of the two components separated by the wavelength information acquisition optical system 13 may be imaged with one camera by dividing its field of view.
- in order to acquire the wavelength information about the fluorescence wavelengths, the image acquisition device 3 of the present embodiment may also use fluorescence images acquired with a camera capable of detecting at least two or more fluorescence wavelengths.
- for example, the wavelength information acquisition unit 202 can calculate and acquire information about the fluorescence wavelengths by comparing the three intensity values of the R, G, and B pixels obtained from a color sensor. Similarly, the wavelength information acquisition unit 202 can calculate and acquire information about the fluorescence wavelengths by comparing the intensity values for different wavelengths obtained from a multiband sensor.
- in this case as well, since the wavelength information is acquired based on a fluorescence image obtained by capturing the fluorescence, the wavelength information can be analyzed with high accuracy. As a result, the accuracy of dye image separation can be improved.
- the clustering step preferably clusters N pixels based on the distribution information of the intensity values for each of the C excitation lights.
- the image processing device preferably clusters N pixels based on distribution information of intensity values for each of C excitation lights. Thereby, the pixels can be clustered according to the fluorescence intensity of each excitation light, and unmixing is performed based on the clustering result, thereby improving the separation accuracy of the dye image.
- the dye image acquisition method of the present disclosure preferably further includes a wavelength information acquisition step of acquiring wavelength information about the fluorescence wavelength corresponding to each pixel of the fluorescence images, and in the clustering step, the N pixels are preferably clustered based on the wavelength information corresponding to each pixel.
- in the dye image acquisition device of the present disclosure, it is also preferable that the image processing device further acquires wavelength information about the fluorescence wavelength corresponding to each pixel of the fluorescence images and clusters the N pixels based on the wavelength information corresponding to each pixel. Thereby, the pixels can be clustered using the wavelength information corresponding to each pixel of the fluorescence images, and unmixing is performed based on the clustering result, improving the separation accuracy of the dye images.
- it is also preferable that, in the wavelength information acquisition step, the sample is irradiated with excitation light having any one of the C wavelength distributions, the fluorescence from the sample is separated via a wavelength information acquisition optical system that separates the fluorescence with different wavelength characteristics, the separated fluorescence components are imaged to acquire a plurality of separated fluorescence images, and the wavelength information is acquired based on the plurality of separated fluorescence images. Further, in the dye image acquisition device of the present disclosure, it is preferable that the image acquisition device irradiates the sample with excitation light having one of the C wavelength distributions, separates the fluorescence from the sample via a wavelength information acquisition optical system with different wavelength characteristics, and images the separated fluorescence to acquire a plurality of separated fluorescence images, and that the image processing device acquires the wavelength information based on the plurality of separated fluorescence images.
- thereby, the fluorescence from the sample generated by the irradiation of the excitation light is separated with different wavelength characteristics, and the wavelength information is acquired based on the plurality of separated fluorescence images obtained by imaging the separated fluorescence, so that the wavelength information can be analyzed accurately. As a result, the accuracy of dye image separation can be improved.
- it is also preferable that, in the wavelength information acquisition step, the sample is irradiated with excitation light having any one of the C wavelength distributions, the fluorescence from the sample is captured by a camera capable of detecting at least two or more fluorescence wavelengths to obtain a fluorescence image, and the wavelength information is acquired based on the fluorescence image. Further, in the dye image acquisition device of the present disclosure, it is preferable that the image acquisition device irradiates the sample with excitation light having one of the C wavelength distributions and captures the fluorescence from the sample with a camera capable of detecting at least two or more fluorescence wavelengths.
- the wavelength information can be accurately analyzed because the wavelength information is acquired based on the fluorescence image obtained by capturing the fluorescence from the sample caused by the irradiation of the excitation light. As a result, the accuracy of separation of dye images can be improved.
- in the dye image acquisition method of the present disclosure, it is preferable that the calculation step calculates the statistical value based on the integrated value, the mode, or the median of the intensity values of the pixel group.
- the image processing device preferably calculates the statistical value based on the integrated value, the mode value, or the median value of the intensity values of the pixel group. In this case, unmixing can be performed based on the overall trend of the intensity values of the L pixel groups of the clustered fluorescence image, and the accuracy of the generated dye image can be improved.
- outliers or some pixels may be excluded before calculating statistical values.
- each pixel may be redundantly included in different clusters. In that case, it is used for statistical value calculation in each cluster.
- in the image generation step, it is also preferable to obtain a mixing matrix using the statistical values of the C fluorescence images for each of the L cluster matrices and to perform unmixing using the mixing matrix. Likewise, in the dye image acquisition device of the present disclosure, it is also preferable that the image processing device obtains a mixing matrix using the statistical values of the C fluorescence images for each of the L cluster matrices and performs unmixing using the mixing matrix. In this way, the mixing matrix is obtained based on the statistical values of the L pixel groups of the clustered fluorescence images, and unmixing can be performed using the mixing matrix, so that the accuracy of the generated dye images can be improved.
- in the image generation step, it is also preferable to use non-negative matrix factorization to calculate loss values based on the statistical values of the C fluorescence images for each of the L cluster matrices, and to obtain the mixing matrix based on the sum of the loss values. Likewise, in the dye image acquisition device of the present disclosure, it is also preferable that the image processing device uses non-negative matrix factorization to calculate loss values based on the statistical values of the C fluorescence images for each of the L cluster matrices, and obtains the mixing matrix based on the sum of the loss values.
- thereby, the loss values are calculated based on the statistical values of the L pixel groups of the clustered fluorescence images, the mixing matrix is obtained based on the sum of the loss values, and unmixing is performed using the mixing matrix.
- in the image generation step, it is also preferable that the loss value for each of the L cluster matrices is corrected based on the statistical values and the mixing matrix is obtained based on the sum of the corrected loss values.
- likewise, in the dye image acquisition device of the present disclosure, it is also preferable that the image processing device corrects the loss value for each of the L cluster matrices based on the statistical values and obtains the mixing matrix based on the sum of the corrected loss values. Thereby, the loss values are calculated based on the statistical values of the L pixel groups of the clustered fluorescence images, and each loss value is corrected based on the statistical values when the sum of the loss values is obtained.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Life Sciences & Earth Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
Abstract
Description
Y=AX
The relation is expressed by the above formula. Here, Y is matrix data with C rows and N columns, A is matrix data with C rows and K columns, and X is matrix data with K rows and N columns. Conversely, once the mixing matrix A is obtained, the dye matrix data X can be derived using the inverse matrix A⁻¹ of the mixing matrix A and the matrix data Y by the following formula:
X = A⁻¹Y
The dye matrix data X can thus be derived (this processing is called unmixing).
Y’=AX’
Utilizing the property that this relation holds, the mixing matrix A is derived based on the matrix data Y′. FIG. 9 shows an image of the matrix data Y′ regenerated by the statistical value calculation unit 204 and of the corresponding dye matrix data X′. Each cell illustrated in FIG. 9 represents one element of the matrix data. In this way, the dye matrix data X and the matrix data Y divided into the three pixel groups PGr01 to PGr03 are compressed into three-column dye matrix data X′ and matrix data Y′, with the statistical value of each of the pixel groups PGr01 to PGr03 serving as a representative value.
In the above formula, j is a parameter indicating the row position of the matrix data (corresponding to the wavelength band of the excitation light); the matrix subscript 1j indicates the matrix data of the j-th row of the first cluster matrix, the subscript 2j indicates the matrix data of the j-th row of the second cluster matrix, and the subscript 3j indicates the matrix data of the j-th row of the third cluster matrix. The parameters a, b, and c indicate the average values of the statistical values in the respective columns of the matrix data Y′.
In the above formula, j is a parameter indicating the row position of the matrix data (corresponding to the wavelength band of the excitation light), and i is a parameter indicating the column position of the matrix data (corresponding to the i-th cluster). wij represents the weight of each element of the matrix data and may be calculated from the value of each element or from its standard deviation. It is also possible to set all wij to the same value so that the weight of each element is not taken into account. When the average values of the statistical values of the columns of the matrix data Y′ in the above formula are denoted a, b, c, …, and the weights are replaced by w1j = 1/a, w2j = 1/b, and w3j = 1/c, the formula becomes identical to the formula for the loss function Los shown above.
is used to calculate the wavelength information WLC indicating the centroid fluorescence wavelength. Here, x1 is the fluorescence intensity of one separated fluorescence image, and x2 is the fluorescence intensity of the other separated fluorescence image. On the other hand, when a single bandpass filter is employed as the camera-side filter set 9b and a single-band dichroic mirror or a multi-band dichroic mirror is employed as the wavelength information acquisition optical system 13, the wavelength information acquisition unit 202 of the image processing device 5 uses the following formula:
WLratio=x2/x1
to calculate the wavelength information WLratio.
Claims (19)
- 1. A dye image acquisition method comprising: an image acquisition step of irradiating a sample with excitation light of each of C wavelength distributions (C is an integer of 2 or more) and acquiring C fluorescence images each composed of N pixels (N is an integer of 2 or more); a clustering step of clustering the N pixels into L pixel groups (L is an integer from 2 to N−1) based on the intensity values of the pixels of the C fluorescence images, and generating L cluster matrices in which the C fluorescence images are arranged for each of the clustered pixel groups; a calculation step of calculating, for each of the L cluster matrices, statistical values of the intensity values of the pixel groups constituting the C fluorescence images; and an image generation step of performing unmixing on the C fluorescence images using the statistical values of the C fluorescence images for each of the L cluster matrices, and generating K dye images (K is an integer from 2 to C) showing the distribution of each of K dyes.
- 2. The dye image acquisition method according to claim 1, wherein, in the clustering step, the N pixels are clustered based on distribution information of the intensity values for each of the C excitation lights.
- 3. The dye image acquisition method according to claim 1 or 2, further comprising a wavelength information acquisition step of acquiring wavelength information about the fluorescence wavelength corresponding to each pixel of the fluorescence images, wherein, in the clustering step, the N pixels are clustered based on the wavelength information corresponding to each pixel.
- 4. The dye image acquisition method according to claim 3, wherein, in the wavelength information acquisition step, the sample is irradiated with excitation light of any one of the C wavelength distributions, the fluorescence from the sample is separated via a wavelength information acquisition optical system that separates the fluorescence with different wavelength characteristics, the separated fluorescence components are each imaged to acquire a plurality of separated fluorescence images, and the wavelength information is acquired based on the plurality of separated fluorescence images.
- 5. The dye image acquisition method according to claim 3, wherein, in the wavelength information acquisition step, the sample is irradiated with excitation light of any one of the C wavelength distributions, the fluorescence from the sample is imaged by a camera capable of detecting at least two or more fluorescence wavelengths to acquire a fluorescence image, and the wavelength information is acquired based on the fluorescence image.
- 6. The dye image acquisition method according to any one of claims 1 to 5, wherein, in the calculation step, the statistical value is calculated based on the integrated value, the mode, or the median of the intensity values of the pixel group.
- 7. The dye image acquisition method according to any one of claims 1 to 6, wherein, in the image generation step, a mixing matrix is obtained using the statistical values of the C fluorescence images for each of the L cluster matrices, and unmixing is performed using the mixing matrix.
- 8. The dye image acquisition method according to claim 7, wherein, in the image generation step, loss values are calculated based on the statistical values of the C fluorescence images for each of the L cluster matrices using non-negative matrix factorization, and the mixing matrix is obtained based on the sum of the loss values.
- 9. The dye image acquisition method according to claim 8, wherein, in the image generation step, the loss value for each of the L cluster matrices is corrected based on the statistical values, and the mixing matrix is obtained based on the sum of the corrected loss values.
- 10. A dye image acquisition device comprising: an image acquisition device that irradiates a sample with excitation light of each of C wavelength distributions (C is an integer of 2 or more) and acquires C fluorescence images each composed of N pixels (N is an integer of 2 or more); and an image processing device that generates dye images showing the distribution of dyes in the sample, wherein the image processing device clusters the N pixels into L pixel groups (L is an integer from 2 to N−1) based on the intensity values of the pixels of the C fluorescence images, generates L cluster matrices in which the C fluorescence images are arranged for each of the clustered pixel groups, calculates, for each of the L cluster matrices, statistical values of the intensity values of the pixel groups constituting the C fluorescence images, and performs unmixing on the C fluorescence images using the statistical values of the C fluorescence images for each of the L cluster matrices to generate K dye images (K is an integer from 2 to C) showing the distribution of each of K dyes.
- 11. The dye image acquisition device according to claim 10, wherein the image processing device clusters the N pixels based on distribution information of the intensity values for each of the C excitation lights.
- 12. The dye image acquisition device according to claim 10 or 11, wherein the image processing device further acquires wavelength information about the fluorescence wavelength corresponding to each pixel of the fluorescence images and clusters the N pixels based on the wavelength information corresponding to each pixel.
- 13. The dye image acquisition device according to claim 12, wherein the image acquisition device irradiates the sample with excitation light of any one of the C wavelength distributions, separates the fluorescence from the sample via a wavelength information acquisition optical system that separates the fluorescence with different wavelength characteristics, and images each of the separated fluorescence components to acquire a plurality of separated fluorescence images, and the image processing device acquires the wavelength information based on the plurality of separated fluorescence images.
- 14. The dye image acquisition device according to claim 12, wherein the image acquisition device irradiates the sample with excitation light of any one of the C wavelength distributions and images the fluorescence from the sample with a camera capable of detecting at least two or more fluorescence wavelengths to acquire a fluorescence image, and the wavelength information is acquired based on the fluorescence image.
- 15. The dye image acquisition device according to any one of claims 10 to 14, wherein the image processing device calculates the statistical value based on the integrated value, the mode, or the median of the intensity values of the pixel group.
- 16. The dye image acquisition device according to any one of claims 10 to 15, wherein the image processing device obtains a mixing matrix using the statistical values of the C fluorescence images for each of the L cluster matrices and performs unmixing using the mixing matrix.
- 17. The dye image acquisition device according to claim 16, wherein the image processing device calculates loss values based on the statistical values of the C fluorescence images for each of the L cluster matrices using non-negative matrix factorization, and obtains the mixing matrix based on the sum of the loss values.
- 18. The dye image acquisition device according to claim 17, wherein the image processing device corrects the loss value for each of the L cluster matrices based on the statistical values and obtains the mixing matrix based on the sum of the corrected loss values.
- 19. A dye image acquisition program for generating, based on C fluorescence images (C is an integer of 2 or more) acquired by irradiating a sample with excitation light of each of C wavelength distributions, each fluorescence image being composed of N pixels (N is an integer of 2 or more), dye images showing the distribution of dyes in the sample, the program causing a computer to function as: a clustering unit that clusters the N pixels into L pixel groups (L is an integer from 2 to N−1) based on the intensity values of the pixels of the C fluorescence images and generates L cluster matrices in which the C fluorescence images are arranged for each of the clustered pixel groups; a calculation unit that calculates, for each of the L cluster matrices, statistical values of the intensity values of the pixel groups constituting the C fluorescence images; and an image generation unit that performs unmixing on the C fluorescence images using the statistical values of the C fluorescence images for each of the L cluster matrices and generates K dye images (K is an integer from 2 to C) showing the distribution of each of K dyes.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22861033.3A EP4379359A1 (en) | 2021-08-25 | 2022-07-25 | Dye image acquisition method, dye image acquisition device, and dye image acquisition program |
JP2023543753A JPWO2023026742A1 (ja) | 2021-08-25 | 2022-07-25 | |
CN202280057084.5A CN117917999A (zh) | 2021-08-25 | 2022-07-25 | 色素图像获取方法、色素图像获取装置及色素图像获取程序 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-137177 | 2021-08-25 | ||
JP2021137177 | 2021-08-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023026742A1 true WO2023026742A1 (ja) | 2023-03-02 |
Family
ID=85323059
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/028660 WO2023026742A1 (ja) | 2021-08-25 | 2022-07-25 | 色素画像取得方法、色素画像取得装置、及び色素画像取得プログラム |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4379359A1 (ja) |
JP (1) | JPWO2023026742A1 (ja) |
CN (1) | CN117917999A (ja) |
WO (1) | WO2023026742A1 (ja) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010243970A (ja) * | 2009-04-10 | 2010-10-28 | Nikon Corp | 共焦点顕微鏡、画像処理装置、および、プログラム |
JP2013213756A (ja) * | 2012-04-03 | 2013-10-17 | Institute Of Physical & Chemical Research | 多重蛍光画像の画像解析のための装置、システム、方法、およびプログラム |
US8660360B1 (en) * | 2012-08-03 | 2014-02-25 | Raytheon Company | System and method for reduced incremental spectral clustering |
JP2017502286A (ja) * | 2013-12-31 | 2017-01-19 | ベンタナ メディカル システムズ, インコーポレイテッド | ピクセルグループ化を使用して微視的画像をスペクトル純化するためのシステムおよび方法 |
JP2018185453A (ja) * | 2017-04-27 | 2018-11-22 | オリンパス株式会社 | 顕微鏡システム |
JP2020020791A (ja) * | 2018-07-24 | 2020-02-06 | ソニー株式会社 | 情報処理装置、情報処理方法、情報処理システム、およびプログラム |
CN110992273A (zh) * | 2019-11-04 | 2020-04-10 | 中国科学院西安光学精密机械研究所 | 一种自相似性约束的高光谱影像解混方法 |
JP2021036224A (ja) * | 2019-08-23 | 2021-03-04 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラム及び情報処理システム |
JP2021512406A (ja) * | 2018-01-31 | 2021-05-13 | 日本電気株式会社 | 画像処理装置、画像処理方法及び画像処理プログラム |
JP2021081342A (ja) * | 2019-11-20 | 2021-05-27 | ソニーグループ株式会社 | 情報処理システムおよび情報処理装置 |
CN113221992A (zh) * | 2021-04-30 | 2021-08-06 | 西安交通大学 | 一种基于l2,1范数的大规模数据快速聚类方法 |
- 2022
- 2022-07-25 CN CN202280057084.5A patent/CN117917999A/zh active Pending
- 2022-07-25 JP JP2023543753A patent/JPWO2023026742A1/ja active Pending
- 2022-07-25 WO PCT/JP2022/028660 patent/WO2023026742A1/ja active Application Filing
- 2022-07-25 EP EP22861033.3A patent/EP4379359A1/en active Pending
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010243970A (ja) * | 2009-04-10 | 2010-10-28 | Nikon Corp | Confocal microscope, image processing apparatus, and program |
JP2013213756A (ja) * | 2012-04-03 | 2013-10-17 | Institute Of Physical & Chemical Research | Apparatus, system, method, and program for image analysis of multiplexed fluorescence images |
US8660360B1 (en) * | 2012-08-03 | 2014-02-25 | Raytheon Company | System and method for reduced incremental spectral clustering |
JP2017502286A (ja) * | 2013-12-31 | 2017-01-19 | Ventana Medical Systems, Inc. | Systems and methods for spectral unmixing of microscopic images using pixel grouping |
JP2018185453A (ja) * | 2017-04-27 | 2018-11-22 | Olympus Corp | Microscope system |
JP2021512406A (ja) * | 2018-01-31 | 2021-05-13 | NEC Corp | Image processing device, image processing method, and image processing program |
JP2020020791A (ja) * | 2018-07-24 | 2020-02-06 | Sony Corp | Information processing device, information processing method, information processing system, and program |
JP2021036224A (ja) * | 2019-08-23 | 2021-03-04 | Sony Corp | Information processing device, information processing method, program, and information processing system |
CN110992273A (zh) * | 2019-11-04 | 2020-04-10 | Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences | Self-similarity constrained hyperspectral image unmixing method |
JP2021081342A (ja) * | 2019-11-20 | 2021-05-27 | Sony Group Corp | Information processing system and information processing device |
CN113221992A (zh) * | 2021-04-30 | 2021-08-06 | Xi'an Jiaotong University | Fast clustering method for large-scale data based on the L2,1 norm |
Non-Patent Citations (4)
Title |
---|
ANONYMOUS: "Introduction to Spectral Imaging and Linear Unmixing. Education in Microscopy and Digital Imaging (online)", ZEISS, 27 April 2021 (2021-04-27), XP093039583, Retrieved from the Internet <URL:https://zeiss-campus.magnet.fsu.edu/articles/spectralimaging/introduction.html> [retrieved on 20230417] * |
BINJIE QIN ET AL.: "Target/Background Classification Regularized Nonnegative Matrix Factorization for Fluorescence Unmixing", IEEE TRANSACTIONS ON |
MCRAE TRISTAN D., OLEKSYN DAVID, MILLER JIM, GAO YU-RONG: "Robust blind spectral unmixing for fluorescence microscopy using unsupervised learning", BIORXIV, 9 October 2019 (2019-10-09), XP055879880, Retrieved from the Internet <URL:https://www.biorxiv.org/content/10.1101/797993v1.full.pdf> [retrieved on 20220117], DOI: 10.1101/797993 * |
TRISTAN D. MCRAE ET AL.: "Robust blind spectral unmixing for fluorescence microscopy using unsupervised learning", PLOS ONE, 2 December 2019 (2019-12-02) |
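The non-patent literature above centers on linear spectral unmixing: decomposing multi-channel fluorescence images into per-dye abundance maps given reference emission spectra. Below is a minimal illustrative sketch of that general technique (not the patent's own algorithm); the spectra, dye counts, and the crude non-negativity clipping are assumptions for demonstration only.

```python
import numpy as np

def unmix(pixels, spectra):
    """Linear unmixing sketch: solve pixels ~= abundances @ spectra
    by least squares per pixel, then clip to enforce non-negativity.

    pixels:  (n_pixels, n_channels) measured fluorescence intensities
    spectra: (n_dyes, n_channels)   known per-dye reference spectra
    returns: (n_pixels, n_dyes)     estimated dye abundances
    """
    # Solve spectra.T @ X = pixels.T for X (shape: n_dyes x n_pixels)
    abundances, *_ = np.linalg.lstsq(spectra.T, pixels.T, rcond=None)
    # Crude non-negativity constraint (real unmixing would use NNLS/NMF,
    # as in the cited NMF-based reference)
    return np.clip(abundances.T, 0.0, None)

# Two synthetic dye spectra over four detection channels (assumed values)
S = np.array([[1.0, 0.5, 0.0, 0.0],
              [0.0, 0.2, 0.8, 1.0]])
true_A = np.array([[2.0, 1.0],
                   [0.5, 3.0]])   # two pixels, two dyes
P = true_A @ S                    # noiseless mixed measurements
A_hat = unmix(P, S)
```

In this noiseless case the least-squares solution recovers the abundances exactly; with real data, noise and spectral overlap make constrained solvers (non-negative least squares, NMF) the usual choice.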
Also Published As
Publication number | Publication date |
---|---|
CN117917999A (zh) | 2024-04-23 |
JPWO2023026742A1 (ja) | 2023-03-02 |
EP4379359A1 (en) | 2024-06-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11542461B2 (en) | Analysis device | |
US11644395B2 (en) | Multi-spectral imaging including at least one common stain | |
US7953264B2 (en) | Classifying image features | |
US20170131200A1 (en) | Method for huanglongbing (hlb) detection | |
JP2024019444A (ja) | Information processing device and information processing system | |
WO2022004500A1 (ja) | Information processing device, information processing method, program, microscope system, and analysis system | |
WO2023026742A1 (ja) | Dye image acquisition method, dye image acquisition device, and dye image acquisition program | |
JP7404906B2 (ja) | Information processing device and microscope system | |
Lu et al. | A programmable optical filter with arbitrary transmittance for fast spectroscopic imaging and spectral data post-processing | |
AU2020281945A1 (en) | Hyperspectral quantitative imaging cytometry system | |
CN117274236B (zh) | Urine component abnormality detection method and system based on hyperspectral images | |
Gorlini et al. | Hyperspectral imaging and deep learning for automatic food quality inspection | |
CN117546007A (zh) | Information processing device, biological sample observation system, and image generation method | |
Rabinovich et al. | Quantitative spectral decomposition for stained tissue analysis | |
Kelman | Techniques for capture and analysis of hyperspectral data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22861033 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023543753 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280057084.5 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2022861033 Country of ref document: EP Effective date: 20240301 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |