WO2020262117A1 - Image processing system, image processing method, and program - Google Patents

Image processing system, image processing method, and program

Info

Publication number
WO2020262117A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
region
cell
tissue
analysis
Prior art date
Application number
PCT/JP2020/023608
Other languages
French (fr)
Japanese (ja)
Inventor
北斗 田中
Original Assignee
コニカミノルタ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority to JP2021528267A (JPWO2020262117A1)
Publication of WO2020262117A1

Links

Images

Classifications

    • C CHEMISTRY; METALLURGY
    • C12 BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
    • C12M APPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
    • C12M1/00 Apparatus for enzymology or microbiology
    • C12M1/34 Measuring or testing with condition measuring or sensing means, e.g. colony counters
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/27 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection; circuits for computing concentration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48 Biological material, e.g. blood, urine; Haemocytometers
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48 Biological material, e.g. blood, urine; Haemocytometers
    • G01N33/483 Physical analysis of biological material
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • the present invention relates to an image processing system, an image processing method and a program.
  • the virtual microscope is a system that digitizes an image observed by an optical microscope and allows the tissue sample to be observed on a display as if the optical microscope was actually used (see, for example, Patent Document 1).
  • the entire tissue sample on the slide glass is photographed, the obtained image is converted into digital data and saved in a database, and observation is performed using viewer software installed on a personal computer or the like.
  • An advantage of a virtual microscope is that the image can be observed while performing operations such as moving up and down, left and right, and scaling, as in observation with an optical microscope.
  • Digital image data of the entire tissue specimen is called a whole slide image (WSI).
  • the present invention has been made in view of the above problems, and an object of the present invention is to provide an image processing system, an image processing method, and a program capable of efficiently performing high-precision analysis.
  • The image processing system of the present invention includes acquisition means for acquiring a tissue image of a tissue sample, and selection means for selecting, from the tissue image acquired by the acquisition means, an image of a region of interest to be re-imaged by the acquisition means.
  • the acquisition means acquires a cell type identification image for identifying one or more cell types in the tissue specimen and one or more cell analysis images for analyzing the identified cells.
  • The selection means selects the region of interest based on the feature amount related to the cell type calculated from the cell type identification image and the feature amount related to one or more target substances calculated from the cell analysis image.
  • The image processing method of the present invention includes an acquisition step of acquiring a tissue image of a tissue sample, and a selection step of selecting, from the tissue image acquired in the acquisition step, an image of a region of interest to be re-photographed in the acquisition step.
  • In the selection step, the region of interest is selected based on the feature amount related to one or more cell types calculated from the cell type identification image and the feature amount related to one or more target substances calculated from the cell analysis image.
  • The program of the present invention causes a computer to function as acquisition means for acquiring a tissue image of a tissue specimen, and as selection means for selecting, from the tissue image acquired by the acquisition means, an image of a region of interest to be re-imaged by the acquisition means.
  • the acquisition means acquires a cell type identification image for identifying one or more cell types in the tissue specimen and one or more cell analysis images for analyzing the identified cells.
  • The selection means selects the region of interest based on the feature amount related to one or more cell types calculated from the cell type identification image and the feature amount related to one or more target substances calculated from the cell analysis image.
  • FIG. 1 shows a schematic configuration of the image processing system 1 (WSI creation system) in the present invention.
  • the image processing system 1 includes a microscope device 10, a control device 60, a display device 70, and a database 80.
  • the microscope device 10 includes a first image acquisition unit 20, a second image acquisition unit 30, and a stage 40.
  • the tissue specimen 50 after immunostaining is placed on the stage 40.
  • the tissue sample 50 is an example of a biological sample.
  • FIG. 2 shows a schematic configuration of the first image acquisition unit 20.
  • the first image acquisition unit 20 acquires a bright field image of the tissue sample 50.
  • the first image acquisition unit 20 includes a bright field light source 21, a first image sensor 22, and a light guide lens 23.
  • the bright-field light source 21 is a light source that irradiates the tissue sample 50 with light for generating a light image for acquiring a bright-field image, and is installed so as to irradiate the light from below the stage 40.
  • When the tissue sample 50 is irradiated by the bright-field light source 21, an optical image is generated; the optical image is guided to the first image sensor 22 via the light guide lens 23, and a bright-field image of the tissue sample 50 is captured by the first image sensor 22.
  • the first image sensor 22 is an image sensor such as a two-dimensional CCD sensor capable of acquiring a two-dimensional image of the optical image of the tissue specimen 50.
  • FIG. 3 shows a schematic configuration of the second image acquisition unit 30.
  • the second image acquisition unit 30 acquires a fluorescence image of the tissue sample 50.
  • the second image acquisition unit 30 includes a transmission light source 31, an excitation light source 32, a second image sensor 33, an objective lens 34, a fluorescence cube 35, and an imaging lens 36.
  • the fluorescent cube 35 includes an excitation filter 351, a dichroic mirror 352, and an absorption filter 353.
  • the transmission light source 31 is a light source used when acquiring a transmission observation image of the tissue specimen 50, and is installed so as to irradiate light from below the stage 40.
  • the excitation light source 32 is a lamp that emits excitation light by a light source such as a discharge tube.
  • the excitation filter 351 is a filter that transmits only excitation light.
  • the dichroic mirror 352 is a mirror that reflects or transmits light having a predetermined wavelength as a boundary, and here, it reflects excitation light and transmits fluorescence.
  • the absorption filter 353 is a filter that blocks excitation light and transmits only fluorescence.
  • the excitation light passes through the excitation filter 351 and is reflected by the dichroic mirror 352, passes through the objective lens 34, and irradiates the tissue sample 50.
  • fluorescence is emitted from the tissue sample 50, and the fluorescence is focused by the objective lens 34 and transmitted through the dichroic mirror 352 and the absorption filter 353.
  • the fluorescence is guided to the second image sensor 33 as a fluorescence image via the image pickup lens 36, and is imaged by the second image sensor 33.
  • the objective lens 34 includes a low-magnification objective lens (for example, 20 times) and a high-magnification objective lens (for example, 40 times).
  • The second image sensor 33 is an image sensor such as a one-dimensional CCD camera capable of acquiring a one-dimensional image, or a two-dimensional image whose longitudinal direction is a predetermined direction, and can acquire a high-resolution fluorescence image of the tissue sample 50.
  • a control device 60 for controlling these is connected to the microscope device 10.
  • the control device 60 includes a control unit (acquisition means, selection means, identification means, analysis means) 61, a storage unit 62, an image processing unit 63, and a communication unit 64.
  • The control unit 61 is configured to include a CPU (Central Processing Unit), a RAM (Random Access Memory), and the like, and comprehensively controls the operation of the microscope device 10 by executing various processes in cooperation with the programs stored in the storage unit 62.
  • the control unit 61 is connected to the stage 40 and can control the ascent and descent of the stage 40 to control the focusing position (Z coordinate) of the tissue sample 50 installed on the stage 40. Further, the control unit 61 is connected to the first image acquisition unit 20 and controls the bright field light source 21 and the first image sensor 22 to take a bright field image. Further, the control unit 61 is connected to the second image acquisition unit 30 and controls the transmission light source 31, the excitation light source 32, and the second image sensor 33 to take a fluorescence image.
  • the storage unit 62 is composed of, for example, an HDD (Hard Disk Drive), a semiconductor non-volatile memory, or the like.
  • the storage unit 62 stores a program for taking a bright field image and a fluorescent image.
  • The image processing unit 63 performs image processing on the fluorescence image taken by the microscope device 10 to create a whole slide image (WSI). As will be described later, according to the instructions of the control unit 61, the captured partial images are combined to create an entire image of the tissue sample 50, and the image data is A/D converted into a digital image to create the WSI. In addition, based on the created WSI, a fluorescence brightness map used for quantitative analysis of the target substance is created.
  • the communication unit 64 is an interface for transmitting and receiving data to and from an external device such as a personal computer.
  • a user who wants to refer to the WSI can read the WSI stored in the database 80 into a personal computer or the like via the communication unit 64 and observe it on the display.
  • a display device 70 is connected to the control device 60.
  • the display device 70 is configured to include, for example, a monitor such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display), and displays various screens according to an instruction of a display signal input from the control unit 61. ..
  • the display device 70 functions as an output means for outputting a captured fluorescent image or the like.
  • a database 80 is further connected to the control device 60.
  • the database 80 includes, for example, an HDD (Hard Disk Drive) and stores the WSI synthesized by the image processing unit 63.
  • the WSI is stored in the database 80 as described above.
  • The storage destination is not limited to the database 80; a configuration without the database 80 is also possible.
  • the WSI may be stored in the storage unit 62, or may be stored in an external server (not shown) to form a database.
  • tissue sample 50 is a tissue section containing the target substance and is stained with an immunostaining agent, and the stained tissue sample 50 is placed on the stage 40.
  • Target substance is a substance that is present in a tissue section and is intended for immunostaining using a fluorescent label, mainly for detection or quantification from the viewpoint of pathological diagnosis.
  • Typical target substances include biological substances, such as proteins (antigens) and RNA, that are expressed in the cell membranes of various cancer tissues and can be used as biomarkers.
  • the target substance may be a substance introduced from outside the body, such as a drug.
  • units smaller than proteins such as peptides can be immunostained.
  • As the immunostaining agent, an antibody-fluorescent nanoparticle conjugate is used.
  • It is preferable to use a complex in which the primary antibody and the fluorescent nanoparticles are linked indirectly, that is, by binding other than a covalent bond, such as an antigen-antibody reaction.
  • a complex in which fluorescent nanoparticles are directly linked to the primary antibody or the secondary antibody can also be used as the immunostaining agent.
  • Examples of the immunostaining agent include [primary antibody against the target substance] ... [antibody against the primary antibody (secondary antibody)] ~ [fluorescent nanoparticles].
  • Here, "..." indicates binding by an antigen-antibody reaction, and the mode of binding indicated by "~" is not particularly limited.
  • Examples thereof include a covalent bond, an ionic bond, a hydrogen bond, a coordination bond, an antigen-antibody bond, a biotin-avidin reaction, physical adsorption, and chemisorption, and the binding may be mediated by a linker molecule if necessary.
  • an antibody (IgG) that specifically recognizes and binds to the target substance as an antigen can be used.
  • For example, when HER2 is the target substance an anti-HER2 antibody can be used, and when HER3 is the target substance an anti-HER3 antibody can be used.
  • an antibody (IgG) that specifically recognizes and binds to the primary antibody as an antigen can be used.
  • Both the primary antibody and the secondary antibody may be polyclonal antibodies, but monoclonal antibodies are preferable from the viewpoint of quantitative stability.
  • the type of animal (immune animal) that produces an antibody is not particularly limited, and may be selected from mice, rats, guinea pigs, rabbits, goats, sheep, and the like as in the conventional case.
  • Fluorescent nanoparticles are nano-sized particles that fluoresce when irradiated with excitation light and that emit fluorescence of sufficient intensity to represent the target substance as individual bright spots, one particle at a time.
  • fluorescent nanoparticles in the present invention, fluorescent substance integrated nanoparticles (PID: Phosphor Integrated Dot nanoparticles) are used.
  • Fluorescent substance-accumulated nanoparticles are nano-sized particles based on a mother body made of an organic or inorganic substance, with a plurality of fluorescent substances (for example, quantum dots or organic fluorescent dyes) contained therein and/or adsorbed on the surface.
  • The fluorescent substance used for the PID preferably emits visible to near-infrared light with a wavelength in the range of 400 to 900 nm when excited by ultraviolet to near-infrared light with a wavelength in the range of 200 to 700 nm. It is preferable that the mother body and the fluorescent substance have substituents or sites with opposite charges so that an electrostatic interaction acts between them.
  • the average particle size of the PID used in the present invention is not particularly limited, but one having a diameter of about 30 to 800 nm can be used.
  • the average particle size is more preferably in the range of 40 to 500 nm.
  • The reason the average particle size is set to 40 to 500 nm is that if it is less than 40 nm an expensive detection system is required, and if it exceeds 500 nm the quantification range is narrowed due to the physical size.
  • The average particle size can be obtained by taking an electron micrograph with a scanning electron microscope (SEM), measuring the cross-sectional area of a sufficient number of particles, and converting each measured value into the diameter of a circle having the same area.
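  • The area-to-diameter conversion mentioned above can be written compactly; the sketch below is a minimal illustration of treating each measured cross-sectional area as the area of a circle (the numeric area values are hypothetical, not taken from the patent).

```python
import numpy as np

def equivalent_circle_diameters(cross_section_areas_nm2):
    """Treat each measured cross-sectional area A (nm^2) as the area of a
    circle and return the equivalent diameter d = 2 * sqrt(A / pi) in nm."""
    areas = np.asarray(cross_section_areas_nm2, dtype=float)
    return 2.0 * np.sqrt(areas / np.pi)

# Hypothetical areas measured from an SEM micrograph (nm^2)
areas = [1.2e4, 1.8e4, 2.5e4]
diameters = equivalent_circle_diameters(areas)
print("mean particle size: %.1f nm" % diameters.mean())
```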
  • the organic substances are resins generally classified as thermosetting resins such as melamine resin, urea resin, aniline resin, guanamine resin, phenol resin, xylene resin, and furan resin.
  • Resins generally classified as thermoplastic resins, such as styrene resin, acrylic resin, acrylonitrile resin, AS resin (acrylonitrile-styrene copolymer), and ASA resin (acrylonitrile-styrene-methyl acrylate copolymer); other resins such as polystyrene; and polysaccharides can also be exemplified.
  • Examples of the inorganic substance used for the mother body include silica and glass.
  • Quantum dot integrated nanoparticles have a structure in which the quantum dots are contained in the mother body and / or adsorbed on the surface thereof. When the quantum dots are contained in the mother body, the quantum dots need only be dispersed inside the mother body, and may or may not be chemically bonded to the mother body itself.
  • As the quantum dots, semiconductor nanoparticles containing a group II-VI compound, a group III-V compound, or a group IV element are used.
  • Examples include CdSe, CdS, CdTe, ZnSe, ZnS, ZnTe, InP, InN, InAs, InGaP, GaP, GaAs, Si, and Ge.
  • A quantum dot having the above quantum dot as a core and a shell provided on the core can also be used.
  • For example, when the core is CdSe and the shell is ZnS, the quantum dot is denoted as CdSe/ZnS.
  • CdSe / ZnS, CdS / ZnS, InP / ZnS, InGaP / ZnS, Si / SiO2, Si / ZnS, Ge / GeO2, Ge / ZnS and the like can be used, but are not limited thereto.
  • the quantum dots may be surface-treated with an organic polymer or the like.
  • Examples include CdSe/ZnS having a surface carboxy group (manufactured by Invitrogen) and CdSe/ZnS having a surface amino group (manufactured by Invitrogen).
  • Quantum dot integrated nanoparticles can be produced by a known method.
  • the silica nanoparticles containing quantum dots can be synthesized with reference to the synthesis of CdTe-encapsulating silica nanoparticles described in New Journal of Chemistry, Vol. 33, p. 561 (2009).
  • Silica nanoparticles encapsulating quantum dots can also be synthesized with reference to the synthesis, described in Chemical Communication, p. 2670 (2009), of silica nanoparticles on whose surface CdSe/ZnS particles capped with 5-amino-1-pentanol and APS are integrated.
  • Polymer nanoparticles encapsulating quantum dots can be produced by using the method of impregnating polystyrene nanoparticles with quantum dots described in Nature Biotechnology, Vol. 19, p. 631 (2001).
  • the fluorescent dye-accumulated nanoparticles have a structure in which a fluorescent dye is contained in the mother body and / or is adsorbed on the surface thereof.
  • Examples of the fluorescent dye include organic fluorescent dyes such as rhodamine-based dye molecules, squarylium-based dye molecules, cyanine-based dye molecules, aromatic-ring-based dye molecules, oxazine-based dye molecules, carbopyronine-based dye molecules, and pyrromethene-based dye molecules.
  • Alternatively, Alexa Fluor (registered trademark, manufactured by Invitrogen), BODIPY (registered trademark, manufactured by Invitrogen), Cy (registered trademark, manufactured by GE Healthcare), HiLyte (registered trademark, manufactured by AnaSpec), DyLight (registered trademark, manufactured by Thermo Scientific), ATTO (registered trademark, manufactured by ATTO-TEC), MFP (registered trademark, manufactured by Mobitec), CF (registered trademark, manufactured by Biotium), DY (registered trademark, manufactured by DYOMICS), and CAL (registered trademark, manufactured by BioSearch Technologies) dye molecules can also be used.
  • When the fluorescent dye is contained in the mother body, the fluorescent dye may or may not be chemically bonded to the mother body itself, as long as it is dispersed inside the mother body.
  • Fluorescent dye-accumulated nanoparticles can be produced by a known method.
  • silica nanoparticles containing a fluorescent dye can be synthesized with reference to the synthesis of FITC-encapsulating silica particles described in Langmuir Vol. 8, p. 2921 (1992).
  • By using a desired fluorescent dye instead of FITC, various fluorescent dye-accumulated nanoparticles can be synthesized.
  • Polystyrene nanoparticles containing a fluorescent dye can be prepared by the copolymerization method using an organic dye having a polymerizable functional group described in US Pat. No. 4,326,008 (1982), or by the method of impregnating polystyrene nanoparticles with a fluorescent dye described in US Pat. No. 5,326,692 (1992).
  • the method for preparing a tissue section to which this staining method can be applied (also referred to simply as a “section” and including a section such as a pathological section) is not particularly limited, and a tissue section prepared by a known procedure can be used.
  • (5.1) Specimen preparation step, (5.1.1) Deparaffinization treatment
  • the section is immersed in a container containing xylene to remove paraffin.
  • the temperature is not particularly limited, but it can be carried out at room temperature.
  • the immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, xylene may be replaced during immersion.
  • the section is immersed in a container containing ethanol to remove xylene.
  • the temperature is not particularly limited, but it can be carried out at room temperature.
  • the immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, ethanol may be replaced during immersion.
  • Next, the section is immersed in a container containing water to remove the ethanol. The temperature is not particularly limited, but the treatment can be carried out at room temperature. The immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, the water may be replaced during immersion.
  • the activation treatment of the target substance is performed according to a known method.
  • The activation conditions are not particularly specified, but as the activation liquid, a 0.01 M citrate buffer (pH 6.0), a 1 mM EDTA solution (pH 8.0), 5% urea, a 0.1 M Tris-hydrochloride buffer, or the like can be used.
  • As the pH condition, a pH in the range of 2.0 to 13.0 is selected, depending on the tissue section used, such that a signal is output and the tissue is not roughened to the extent that the signal cannot be evaluated. Normally the treatment is performed at pH 6.0 to 8.0, but for special tissue sections it may also be performed at, for example, pH 3.0.
  • As the heating device, an autoclave, a microwave, a pressure cooker, a water bath, or the like can be used.
  • the temperature is not particularly limited, but it can be carried out at room temperature.
  • The temperature can be 50 to 130°C and the time can be 5 to 30 minutes.
  • the section after activation treatment is immersed in a container containing PBS and washed.
  • the temperature is not particularly limited, but it can be carried out at room temperature.
  • the immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, PBS may be replaced during immersion.
  • (5.2) Immunostaining step: In order to stain the target substance, a solution of an immunostaining agent containing fluorescent nanoparticles having a site capable of directly or indirectly binding to the target substance is placed on the section and allowed to react with the target substance.
  • the solution of the immunostaining agent used in the immunostaining step may be prepared in advance before this step.
  • immunostaining is performed with a plurality of immunostaining agents corresponding to the target substances.
  • The plurality of immunostaining agents used in this case need only include at least one immunostaining agent using PID (PID staining agent). Provided that the antibodies and the fluorescent substances (fluorescence wavelengths) differ from one another, a plurality of target substances can be detected by multiple staining using a plurality of PID staining agents, or by a combination of a PID staining agent and an immunostaining agent using another fluorescent label such as an organic fluorescent substance or a quantum dot.
  • a solution of each immunostaining agent is prepared, placed on a section, and reacted with the target substance.
  • The solutions of the individual immunostaining agents may be mixed in advance before being placed on the section, or they may be placed on the section sequentially, one at a time.
  • The conditions for performing the immunostaining step can be appropriately adjusted, in accordance with conventional immunostaining methods, so as to obtain an appropriate signal.
  • the temperature is not particularly limited, but it can be carried out at room temperature.
  • The reaction time is preferably 30 minutes or more and 24 hours or less. It is preferable to drop a known blocking agent such as PBS containing BSA, or a surfactant such as Tween 20, onto the section before performing the above treatment.
  • tissue specimen after the immunostaining step is subjected to treatments such as immobilization / dehydration, permeation, and encapsulation so as to be suitable for observation.
  • For immobilization and dehydration, the tissue section may be immersed in a fixation treatment solution (a crosslinking agent such as formalin, paraformaldehyde, or glutaraldehyde, or acetone, ethanol, methanol, etc.).
  • the tissue section that has been immobilized and dehydrated may be immersed in a permeation solution (xylene or the like).
  • the tissue section that has undergone the permeation treatment may be immersed in the encapsulation liquid.
  • The conditions for performing these treatments, for example the temperature and the immersion time when immersing the tissue section in a predetermined treatment solution, can be appropriately adjusted, in accordance with conventional immunostaining methods, so as to obtain an appropriate signal.
  • Morphological observation staining step: Separately from the immunostaining step, morphological observation staining is performed so that the morphology of cells, tissues, organs, and the like can be observed.
  • the morphological observation staining is not particularly limited as long as it can express the morphology of cells, tissues, organs and the like.
  • a dyeing method using a fluorescent dye or the like can also be used.
  • the morphological observation dyeing step can be performed according to a conventional method.
  • Staining with eosin, in which the cytoplasm, interstitium, various fibers, erythrocytes, and keratinocytes are stained red to deep red, and staining with hematoxylin, in which cell nuclei, calcified areas, cartilage tissue, bacteria, and mucus are stained blue to pale blue, are also standardly used (the method of performing these two stainings at the same time is known as hematoxylin-eosin staining (HE staining)).
  • FIG. 4 shows the overall flow of the operation of the image processing system 1 in the present invention.
  • The operation of the image processing system 1 includes a focusing step (step S101), a low-magnification image acquisition step (step S102), a region of interest selection step (step S103), an analysis image acquisition step (step S104), and an analysis step (step S105).
  • In the focusing step, the first image acquisition unit 20 acquires a bright-field image of the tissue sample 50, an imaging region for WSI creation is set based on the bright-field image, and focusing is performed based on the bright-field image. Further, the second image acquisition unit 30 irradiates the tissue specimen 50, which has been fluorescently labeled with PID, with excitation light, and more rigorous focusing is performed with reference to the fluorescent bright spots of the detected PID.
  • the control unit 61 controls the first image acquisition unit 20 to acquire a bright field image for focusing of the entire slide glass (step S1).
  • This bright-field image is used for setting high-resolution imaging conditions such as a fluorescence image described later, and is a low-magnification image using a low-magnification objective lens.
  • the control unit 61 sets an imaging region R including the tissue sample 50 as shown in FIG. 6 (step S2). Specifically, the control unit 61 binarizes the entire image of the tissue sample 50 depending on the presence or absence of the tissue sample 50, and detects the region where the tissue sample 50 exists in each of the X-axis direction and the Y-axis direction.
  • In this way, the imaging region R is determined.
  • the imaging region R may be manually set on the display device 70 while the user is observing the bright field image of the entire tissue specimen 50, but it is preferably set automatically.
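  • As a rough illustration of this binarization-based setting of the imaging region R, the sketch below thresholds an 8-bit bright-field image and takes the extent of tissue pixels along the X and Y axes as the region. The use of Otsu's method and the assumption that tissue is darker than the empty slide background are mine and are not stated in the patent.

```python
import cv2
import numpy as np

def set_imaging_region(bright_field_gray):
    """Binarize a low-magnification bright-field image by tissue presence and
    return the rectangle (x, y, width, height) that encloses the tissue,
    i.e. the extent of tissue pixels along the X and Y axes."""
    # Assumes an 8-bit image in which tissue is darker than the background;
    # Otsu's method picks the binarization threshold automatically.
    _, mask = cv2.threshold(bright_field_gray, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no tissue detected
    x, y = int(xs.min()), int(ys.min())
    return (x, y, int(xs.max()) - x + 1, int(ys.max()) - y + 1)
```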
  • the tissue sample 50 is focused based on the bright field image. Focusing based on the bright-field image may be performed manually by the user, but it is preferably performed automatically under the control of the control unit 61.
  • a method of automatically creating a focus map under the control of the control unit 61 and performing focusing will be described.
  • the first focus measurement position P1 is set on the imaging region R (step S3).
  • the control unit 61 divides the imaging region R into the X-axis direction and the Y-axis direction to set a small region, and obtains the XY coordinates of each small region.
  • the XY coordinates are the center coordinates of each subregion, but the coordinates are not limited to this, and for example, the coordinates of the upper left end of each subregion can be the XY coordinates.
  • the control unit 61 assigns numbers such as 1, 2, 3, ... In each of the X-axis direction and the Y-axis direction to each small area, and sets the array number.
  • the control unit 61 sets the first focus measurement position P1 for each small area.
  • P1 is the center coordinate position of each small area, but the present invention is not limited to this, and for example, the upper left end of each small area can be set as the first focus measurement position P1.
  • the tissue sample 50 may not exist on the center coordinates, for example, as in the region of the sequence number (1, 1) in FIG. In this case, it is possible to move the first focus measurement position P1 to an arbitrary coordinate on the tissue sample 50.
  • Next, focusing is performed on the first focus measurement position P1 in each small area (step S4).
  • The control unit 61 adjusts the optical axis position to the first focus measurement position P1 while moving the stage 40 in the XY directions, and obtains the bright-field focusing position (Z coordinate) by actual measurement for each first focus measurement position P1.
  • Based on the bright-field focusing positions obtained in this way, the control unit 61 creates a focus map as shown in FIG. 8 (step S5).
  • the focus map stores the array number of each small area and the corresponding stage coordinates.
  • the stage coordinates correspond to the center coordinates of each small area for the X-axis and the Y-axis, and the bright-field focusing position for the Z-axis. This completes the focusing of the tissue sample 50 based on the bright-field image.
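  • A minimal sketch of building such a focus map is shown below. The small-area division, array numbering, and storage of (X, Y, Z) stage coordinates follow the description above; the function `measure_bright_field_focus` is a hypothetical placeholder for the actual stage movement and focus measurement, which the patent does not describe as code.

```python
import numpy as np

def build_focus_map(imaging_region, tile_size, measure_bright_field_focus):
    """Divide the imaging region R into small areas, give each an array
    number (i, j), and record stage coordinates: X, Y = centre of the area,
    Z = bright-field focusing position measured at that centre."""
    x0, y0, width, height = imaging_region
    nx = int(np.ceil(width / tile_size))
    ny = int(np.ceil(height / tile_size))
    focus_map = {}
    for j in range(1, ny + 1):        # array number along the Y axis
        for i in range(1, nx + 1):    # array number along the X axis
            cx = x0 + (i - 0.5) * tile_size   # centre coordinates of the area
            cy = y0 + (j - 0.5) * tile_size
            z = measure_bright_field_focus(cx, cy)  # actual focus measurement
            focus_map[(i, j)] = (cx, cy, z)
    return focus_map

# Example with a dummy focus measurement callback
focus_map = build_focus_map((0, 0, 3000, 2000), 500, lambda x, y: 0.0)
```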
  • From step S6 onward, focusing is performed on the PID bright spots based on the focusing information obtained from the bright-field image. That is, the bright-field focusing position of the tissue sample 50 is specified by the processing of steps S1 to S3 as described above, and by further focusing on the PID bright spots based on this, a more precisely focused fluorescence image can be obtained.
  • a fluorescence image of the PID for focusing is acquired (step S6). That is, the control unit 61 controls the excitation light source 32 to irradiate the tissue sample 50 with the excitation light of the PID, and acquires the fluorescence image of the PID by the second image sensor 33.
  • the fluorescence bright spot of an arbitrary PID on the obtained fluorescence image is selected, and the second focus measurement position P2 is set (step S7).
  • the user manually sets the second focus measurement position P2, and as shown in FIG. 9, one or a plurality of second focus measurement positions P2 on the tissue sample 50 are set.
  • the second focus measurement position P2 may be automatically set.
  • Next, focusing is performed on the set second focus measurement position P2 (step S8).
  • The control unit 61 adjusts the optical axis position to the second focus measurement position P2 while moving the stage 40 in the XY directions, refers to the focusing position in the focus map created in step S3, and finely adjusts in the Z-coordinate direction to obtain the fluorescence focusing position (Z coordinate) for the second focus measurement position P2.
  • The control unit 61 modifies the focus map created in step S3 using the newly obtained focusing position (step S9). With this, focusing on the tissue sample 50 is completed.
  • the focusing method described above is merely an example, and the method applicable to the present invention is not limited to this.
  • Next, the fluorescent label on the tissue specimen 50 is excited (step S10).
  • the control unit 61 controls the excitation light source 32 to irradiate the tissue specimen 50 with excitation light that excites the labeled PID.
  • a partial image of the tissue sample 50 is acquired (step S11).
  • the control unit 61 moves and controls the stage 40, and controls the second image acquisition unit 30 to acquire a partial fluorescence image. That is, the optical axis position and the focusing position are moved to the XYZ coordinates indicated by the stage coordinates stored in the focus map, and the second image sensor 33 is controlled to capture an image for each small area.
  • By using a high-magnification objective lens as the objective lens 34, a high-resolution image can be acquired.
  • a strip-shaped scan image as shown in FIG. 11 is acquired as a partial image.
  • imaging of the tissue specimen 50 is started from the upper left corner.
  • The control unit 61 irradiates the excitation light and scans the imaging position of the second image sensor 33 while moving it in the positive direction of the Y axis of the tissue section 51, thereby acquiring a partial image A.
  • Next, the control unit 61 moves the imaging position of the second image sensor 33 in the positive direction of the X axis and acquires a partial image B.
  • Partial images are acquired in the same manner in the order of partial image C, ..., partial image N, and the imaging is completed.
  • The control unit 61 controls the image processing unit 63, as the creating means, to synthesize the captured partial images and create the entire fluorescence image of the imaging region R (step S12). That is, by arranging and pasting the partial images A to N in the X-axis direction, a high-resolution fluorescence image of the entire tissue sample 50 is obtained. Further, the image processing unit 63 A/D-converts the entire fluorescence image of the obtained imaging region R into a digital image (step S13). With the above, creation of the WSI is completed. The created WSI is stored in the database 80 as the storage means. A user who wants to refer to the WSI can read the image data into a personal computer or the like via the communication unit 64 and observe it on a display.
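  • A minimal sketch of this compositing step is shown below, assuming the strip-shaped partial images share the same height and are already aligned (real stitching may additionally require registration or overlap handling, which the patent does not detail). The 8-bit conversion is likewise only an assumed linear scaling; the actual digital conversion scheme is not specified.

```python
import numpy as np

def stitch_strip_images(strips):
    """Arrange strip-shaped partial images A..N side by side along the X axis
    to obtain the whole fluorescence image of the imaging region R.
    `strips` is a list of 2-D arrays of equal height, ordered by X position."""
    return np.concatenate(strips, axis=1)     # axis 1 corresponds to X

def to_digital_8bit(image):
    """Reduce the composited image to an 8-bit digital image for storage
    (simple linear scaling; the actual conversion is not specified)."""
    img = image.astype(np.float32)
    return (img / max(float(img.max()), 1.0) * 255.0).astype(np.uint8)
```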
  • WSI is not particularly limited to a fluorescent image as long as the target for creating the WSI (for example, the entire tissue sample 50) can be grasped.
  • WSI may be acquired as a bright field image.
  • the WSI of the bright field image can be created in the same manner as the creation of the WSI of the fluorescence image described above.
  • A WSI of an image for specifying the cell type (cell type identification image) and a WSI of an image for analyzing the specified cells (cell analysis image) are created.
  • The target cell types include, for example, classification by differentiation, such as hepatocytes, glial cells, and T cells; pathological classification, such as canceration and inflammation; classification by specific conditions, such as cell cycle and necrosis; and classification by spatial arrangement and shape features, such as infiltration and protrusions.
  • the region of interest selection step is a step of selecting a region of interest (FOV: Field Of View) using the created WSI (image for specifying cell type, image for cell analysis).
  • the “region of interest” is not limited as long as it is a region useful for pathological diagnosis, and may include a normal tissue region as well as a lesion portion such as a cancer region.
  • a region in which the effect of the drug appears or a region in which the drug is present can also be included in the region of interest because it is considered to be useful for pathological diagnosis.
  • A region specified by the color of the bright-field image or a region specified by the shading of the fluorescence image may also be useful. The size of each region can be specified as appropriate, and is preferably about 100 µm square.
  • Specifying a region using the number of fluorescent bright spots as an index may also be useful for pathological diagnosis, and regions with a large number, a medium number, and a small number of bright spots can each be identified and extracted. Regions with a large number are sometimes called hot spots, and regions with a small number are sometimes called cold spots.
  • The control unit 61 reads two predetermined WSIs (a cell type identification image and a cell analysis image) from the database 80 with image processing software, performs preprocessing on the read images, for example noise removal and required-area setting (exclusion of non-specific fluorescence), and acquires two images for selecting the region of interest (step S21).
  • FIG. 13A is an H-stained bright field image.
  • FIG. 13B is a PD-L1 stained fluorescent image.
  • the lower right image in FIGS. 13A and 13B is an enlarged image obtained by enlarging a part of each image.
  • the noise removal process is, for example, a process of suppressing the autofluorescence brightness of the read fluorescence image.
  • Specifically, the control unit 61 generates a frequency spectrum by DFT processing and multiplies the frequency spectrum by a high-pass filter (HPF) image.
  • Then, an image with suppressed low-frequency components is generated by IDFT. Since the fluorescence image is huge, it is preferable to divide the fluorescence image and process it in parts according to the memory load when the DFT processing is executed; that is, the above processing may be executed for each divided image and the results combined at the end.
  • the fluorescence signal by PID and the fluorescence signal by autofluorescence have different spatial frequency profiles.
  • Autofluorescence contains a large amount of low-frequency components, whereas PID fluorescence has a steep peak shape and contains fewer low-frequency components than autofluorescence.
  • The frequency spectrum can be reversibly converted back into a spatial signal by IDFT/IFFT; by inversely transforming the frequency spectrum in which the brightness components due to autofluorescence have been selectively suppressed, a fluorescence image with suppressed autofluorescence is obtained.
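  • The sketch below illustrates this frequency-domain suppression using NumPy's FFT as a stand-in for the DFT/IDFT processing described above. The binary high-pass mask and the cutoff value are my assumptions (the patent does not specify the filter shape), and a huge WSI would be processed in tiles as noted above.

```python
import numpy as np

def suppress_autofluorescence(fluor, cutoff=0.02):
    """Suppress autofluorescence by attenuating low spatial frequencies:
    DFT -> multiply by a high-pass mask -> inverse DFT."""
    spectrum = np.fft.fftshift(np.fft.fft2(fluor.astype(np.float32)))
    h, w = fluor.shape
    yy, xx = np.ogrid[:h, :w]
    # normalized radial distance from the centre of the spectrum
    r = np.sqrt(((yy - h / 2.0) / h) ** 2 + ((xx - w / 2.0) / w) ** 2)
    hpf_mask = (r > cutoff).astype(np.float32)   # simple binary high-pass filter
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * hpf_mask))
    return np.clip(filtered.real, 0.0, None)     # keep non-negative luminance
```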
  • The required-area setting is a process for excluding the peripheral edge of the tissue, where the fluorescence may become stronger due to uneven staining by the automatic staining machine. Specifically, if uneven staining density occurs in the tissue section due to factors such as non-uniform flow of the staining solution in the automatic staining machine or dominant dye adsorption at the edges due to the thickness of the tissue section, the frequency of artifact-like non-specific fluorescence becomes high in the affected parts and the fluorescence brightness cannot be analyzed equally, so these parts need to be excluded from the required region.
  • The control unit 61 specifies a search start point at the edge of the tissue section image by a conventionally known method and generates ROI information for the edge of the section with a contour tracking algorithm that allows pixel value changes within a specific range. A conventionally known method is, for example, ImageJ's Wand tool (National Institutes of Health, MD, USA). Next, the control unit 61 performs a reduction process of several hundred pixels (about one microscope field of view) on the generated ROI information (that is, the designated pixel area is shrunk inward). Next, the control unit 61 excludes the area outside the required region by superimposing the generated ROI information on the HPF image and deleting the fluorescence pixel values outside the region (setting the luminance to zero).
  • In this way, the required region is narrowed down.
  • narrowing down the area is not essential in cases where non-specific uneven staining around the tissue does not occur.
  • The autofluorescent regions of cells without nuclei, such as erythrocytes, may also be excluded from the required region. Further, in order to narrow down the required region, various known methods can be applied in addition to the above method of deleting the fluorescence pixel values outside the required region (setting the luminance to zero).
  • the control unit 61 cuts out a necessary region from the autofluorescence suppression image and generates an image for selecting a region of interest.
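  • The patent's required-area narrowing uses a contour-tracking ROI (for example, ImageJ's Wand tool) followed by an inward reduction of several hundred pixels. The sketch below approximates that with morphological erosion of a binary tissue mask, which is a simplification of mine; the 300-pixel shrink value is only a placeholder.

```python
import cv2
import numpy as np

def exclude_tissue_edge(fluor, tissue_mask, shrink_px=300):
    """Shrink the tissue region inward by `shrink_px` pixels and set the
    fluorescence luminance outside the shrunken region to zero, so that
    non-specific staining at the section edge is excluded."""
    kernel = np.ones((3, 3), np.uint8)
    inner = cv2.erode(tissue_mask.astype(np.uint8), kernel,
                      iterations=shrink_px)        # reduce the ROI inward
    result = fluor.copy()
    result[inner == 0] = 0                         # luminance outside -> zero
    return result
```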
  • The control unit 61 specifies the cell type in the cell type identification image for region-of-interest selection acquired in step S21 by machine learning or by an image feature classification method, and creates region information indicating the analysis target region (step S22).
  • Specific examples of the machine learning or image feature classification method include cell type recognition by machine learning, dye extraction from a bright-field H-stained image, and tumor marker-stained cell extraction.
  • the analysis target region is a region that is a specific type of cell and is the target of analysis, and in step S23 described later, feature amount data regarding the target substance in the analysis target region is calculated.
  • The analysis target region is a specific site in a specific type of cell, for example the cell nucleus, cytoplasm, cell membrane, or a specific organelle of a specific type of cell, or an arbitrary region defined by image processing, calculation, or the like.
  • An example of the region information is an image in which the shape of the cells can be understood, as shown in FIG. 14.
  • The control unit 61 identifies the analysis target region in the fluorescence image for region-of-interest selection acquired in step S21 based on the region information created in step S22, and calculates feature amount data related to the target substance in each region (first feature amount data) (step S23). For example, the average luminance value in each region is calculated as the first feature amount data.
  • Next, the control unit 61 identifies the analysis target region in the bright-field image for region-of-interest selection acquired in step S21 based on the region information created in step S22, and calculates feature amount data related to the cell type in each region (second feature amount data) (step S24). For example, the cell density in each region is calculated as the second feature amount data.
  • the user can recognize the distribution of the cells in the bright-field image by creating a heat map or a histogram in which the value of the cell density is represented by a shade of color or the like.
  • The control unit 61 performs arithmetic processing using the first feature amount data calculated in step S23 and the second feature amount data calculated in step S24 to calculate composite feature amount data (step S25). Specifically, for each analysis target region, the control unit 61 performs arithmetic processing such as multiplication of the average luminance value (first feature amount data) and the cell density (second feature amount data), and the result of the calculation is the composite feature amount. Instead of using the first and second feature amount data as they are, the arithmetic processing may be performed after adding an offset value, normalizing, or converting to another dimension such as n-step stratification.
  • The control unit 61 selects a region of interest based on the calculated composite feature amount data (step S26). For example, a region in which the value of the composite feature amount data is equal to or higher than a predetermined first threshold value can be selected as a region of interest (hot spot). A region whose value is below a second threshold value smaller than the first threshold value may be selected as a region of interest (cold spot). Alternatively, the top N regions with the highest composite feature amount values may be selected as regions of interest. Thereby, for example, as shown in FIG. 15, result data in which the region of interest SP is selected can be obtained.
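  • A minimal sketch of steps S23 to S26 is given below: the composite feature amount is formed by multiplying the per-region mean luminance by the per-region cell density, and hot spots are regions at or above the first threshold. The dictionary representation, region identifiers, and numeric values are hypothetical illustrations, not values from the patent.

```python
def select_hot_spots(mean_luminance, cell_density, first_threshold):
    """Multiply the first feature amount (mean fluorescence luminance per
    analysis target region) by the second feature amount (cell density per
    region) and keep regions whose composite value is at or above the
    first threshold (hot spots). Inputs are dicts keyed by region id."""
    composite = {rid: mean_luminance[rid] * cell_density[rid]
                 for rid in mean_luminance}
    hot_spots = [rid for rid, value in composite.items()
                 if value >= first_threshold]
    return composite, hot_spots

# Hypothetical per-region feature values
luminance = {"r1": 120.0, "r2": 40.0, "r3": 200.0}
density = {"r1": 0.8, "r2": 0.3, "r3": 0.9}
print(select_hot_spots(luminance, density, first_threshold=100.0))
```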
  • In the above, composite feature amount data is calculated from the first feature amount data and the second feature amount data in step S25, and the region of interest is selected from those values in step S26. Instead, a predetermined spot may be selected (specified) from the fluorescence image using the first feature amount data and a predetermined spot may be selected (specified) from the bright-field image using the second feature amount data (step S27), and the region of interest may then be selected from the result of superimposing these results (step S28).
  • a region in which the average brightness value or the cell density is equal to or higher than a predetermined first threshold value is selected as a predetermined spot.
  • A region whose value is below a second threshold value smaller than the first threshold value may be selected as the predetermined spot, or a region between the first threshold value and the second threshold value may be selected as the predetermined spot.
  • the top N spots having a high average brightness value or cell density value may be selected as predetermined spots.
  • Then, a region in which these spots overlap is selected as the region of interest. For example, as shown in FIG. 17, the region SP3 in which the predetermined spot SP1 (solid line) obtained using the first feature amount data and the predetermined spot SP2 (broken line) obtained using the second feature amount data overlap is selected as the region of interest.
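  • A minimal sketch of this overlap-based selection (steps S27 and S28) is shown below, assuming the two feature maps are pixel-aligned 2-D arrays; the thresholds are hypothetical, and cold-spot or top-N selection would simply replace the comparisons.

```python
import numpy as np

def overlap_region_of_interest(first_feature_map, second_feature_map, t1, t2):
    """Threshold each feature map separately to obtain the predetermined
    spots SP1 and SP2, then take their overlap as the region of interest SP3.
    Both maps are pixel-aligned 2-D arrays."""
    sp1 = first_feature_map >= t1     # spots from the cell analysis image
    sp2 = second_feature_map >= t2    # spots from the cell type identification image
    return sp1 & sp2                  # overlapping region SP3
```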
  • Modification 2: Next, modification 2 of the region of interest selection process will be described with reference to the flowchart of FIG. In the region of interest selection process described above, the case where one cell type identification image and one cell analysis image are acquired in step S21 was described as an example, but a plurality of each of these images can also be used. Hereinafter, a region of interest selection process using two cell type identification images and two cell analysis images will be described.
  • Examples of the cell type identification image include, although not shown, an Epcam-stained fluorescence image and a DAB (CD8) -stained bright-field image.
  • examples of the cell analysis image include a PD-L1 stained fluorescent image and a PD-1 stained fluorescent image.
  • the control unit 61 performs preprocessing on such an image in the same manner as in step S21 to acquire an image for selecting a region of interest (step S31).
  • control unit 61 identifies different analysis target regions from each of the two cell type identification images in the same manner as in step S22, and creates region information for each region (step S32). For example, region information for identifying a cancer cell region is created from an Epcam-stained fluorescent image. Further, from the DAB (CD8) stained bright field image, region information for identifying the CD8-positive T cell region is created.
  • control unit 61 calculates the first feature amount data from each of the two cell analysis images in the same manner as in step S23 (step S33).
  • control unit 61 calculates the second feature amount data from each of the two cell type identification images in the same manner as in step S24 (step S34).
  • control unit 61 calculates the related information between the two cell types using the two second feature amount data calculated in step S34 (step S35). Specifically, the density ratio between the two cell types and the distance between the dense regions of the two cell types are calculated as related information.
  • control unit 61 narrows down the area for each of the two cell analysis images based on the related information calculated in step S35, and uses the first feature amount data in the narrowed down area. To select a predetermined spot (step S36). As a result, a predetermined spot is selected in the narrowed area in each of the two cell analysis images.
  • the selection of the predetermined spot using the first feature amount data can be performed in the same manner as in step S27.
  • control unit 61 selects a region of interest from the result of superimposing the results of selecting a predetermined spot in step S36 (step S37).
  • For example, when regions with a high cancer cell / T cell density ratio are narrowed down, a region of interest in the (non-infiltrated) cancer cell region can be selected according to the PD-L1 expression level of the cancer cells. When regions with a low cancer cell / T cell density ratio are narrowed down, a region of interest in the (non-infiltrating) T cell region can be selected according to the PD-1 expression level of the T cells. In addition, when regions where the cancer cell / T cell density ratio is within a threshold range are narrowed down, (1) a region of interest according to the PD-L1 expression level of cancer cells in the region where T cells infiltrate the cancer cells, (2) a region of interest according to the PD-1 expression level of T cells, and (3) a region of interest according to co-expression, such as the product of the expression levels of (1) and (2), can be selected.
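  • A minimal sketch of this narrowing by density ratio (steps S35 and S36) is shown below, assuming the two density maps and the PD-L1 feature map are aligned per region or per pixel; the ratio thresholds are hypothetical, and similar narrowing could instead use the distance between the dense regions of the two cell types mentioned above.

```python
import numpy as np

def narrow_by_density_ratio(cancer_density, t_cell_density, pd_l1_luminance,
                            ratio_min, ratio_max, eps=1e-6):
    """Keep only regions whose cancer-cell / T-cell density ratio lies in
    [ratio_min, ratio_max] (the narrowed-down area), then score the remaining
    regions with the PD-L1 feature amount; all inputs are aligned arrays."""
    ratio = cancer_density / (t_cell_density + eps)
    keep = (ratio >= ratio_min) & (ratio <= ratio_max)
    scores = np.where(keep, pd_l1_luminance, 0.0)
    return keep, scores
```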
  • the analysis image acquisition step is a step of acquiring an analysis image (re-photographed image) by photographing the area of interest selected as described above at a magnification higher than that of the tissue image. Specifically, the position of the selected region of interest is observed with the high-magnification objective lens of the second image acquisition unit 30, and an image is acquired again.
  • a high-magnification microscope such as BX63 + DP80 (Olympus) can also be used. By taking an enlarged image of the selected region of interest in this way, more accurate analysis becomes possible.
  • In addition to acquiring an analysis image by photographing at a magnification higher than that of the tissue image as described above, an image with an increased bit depth or a three-dimensional stack image (re-photographed image) of the selected region of interest may be acquired as the analysis image.
  • the analysis step is a step in which the control unit 61 analyzes using an image (re-photographed image) photographed at a high magnification as described above.
  • Examples of the analysis include a step of evaluating the fluorescent bright spots derived from PID, described below, and a step of quantitatively evaluating the target substance.
  • Examples of the step of evaluating the fluorescent bright spots derived from PID include a step of measuring the number of fluorescent bright spots and a step of measuring the number of PID particles corresponding to the number of fluorescent bright spots.
  • Examples of the step of quantitatively evaluating the target substance include a step of calculating a PID score.
  • the analysis is performed using a plurality of images (re-photographed images) corresponding to the cell type identification image used for selecting the region of interest and the cell analysis image.
  • By using a plurality of images, it is possible to obtain more detailed analysis data, for example statistical values of the expression level of the target substance per cell and analysis data such as a position image of the target substance.
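  • As a minimal sketch of per-cell evaluation of fluorescent bright spots, the code below simply thresholds a re-photographed fluorescence image, finds connected bright-spot components, and counts them inside each cell region. This is only an illustrative bright-spot count, not the PID particle-number estimation or PID score described above; the threshold and the cell label image are assumed to be available from the preceding steps.

```python
import cv2
import numpy as np

def count_bright_spots_per_cell(fluor_8bit, cell_labels, spot_threshold):
    """Count fluorescent bright spots inside each cell region.
    `fluor_8bit` is an 8-bit re-photographed fluorescence image,
    `cell_labels` is a label image (0 = background, 1..N = cells), and
    `spot_threshold` separates bright spots from background fluorescence."""
    _, spot_mask = cv2.threshold(fluor_8bit, spot_threshold, 255,
                                 cv2.THRESH_BINARY)
    n, _, _, centroids = cv2.connectedComponentsWithStats(
        spot_mask.astype(np.uint8))
    counts = {}
    for k in range(1, n):                      # component 0 is the background
        cx, cy = centroids[k]
        cell = int(cell_labels[int(round(cy)), int(round(cx))])
        if cell > 0:
            counts[cell] = counts.get(cell, 0) + 1
    return counts
```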
  • As described above, the control unit 61 acquires a tissue image obtained by photographing the tissue sample 50 and selects, from the acquired tissue image, an image of a region of interest to be re-imaged. Further, the control unit 61 acquires a cell type identification image for identifying one or more cell types in the tissue sample 50 and one or more cell analysis images for analyzing the identified cells, and selects the region of interest based on the feature amount related to the cell type calculated from the cell type identification image and the feature amount related to one or more target substances calculated from the cell analysis image.
  • Therefore, a region of interest suited to the purpose can be selected automatically based on descriptive criteria. As a result, compared with the case where an observer visually selects the region of interest, analysis can be performed without arbitrariness, so that the fluorescent bright spots and the target substance can be evaluated with higher accuracy.
• The tissue specimen is subjected to morphological observation staining so that one or more cell types can be identified by the control unit 61. This makes it possible to identify one or more cell types using a single tissue specimen.
• The control unit 61 identifies the analysis target region in the cell type identification image by machine learning or an image feature classification method. As a result, a more accurate image for identifying the cell type can be obtained.
• The control unit 61 calculates the feature amount related to the cell type from the analysis target region of the cell type identification image and calculates the feature amount related to the target substance from the region of the cell analysis image corresponding to that analysis target region. Therefore, since the feature amounts are calculated only from the analysis target region in the cell type identification image and the cell analysis image, highly accurate analysis can be performed efficiently.
• The control unit 61 selects the region of interest based on a composite feature amount calculated from the feature amount related to the cell type and the feature amount related to the target substance. By using the composite feature amount, it is possible to quantitatively evaluate only the target substance present in the target cell type, rather than any target substance present anywhere in the tissue specimen, so that the distribution of the target substance and the like can be grasped more accurately and analysis can be performed with high accuracy.
• The control unit 61 identifies a first spot from the cell type identification image based on the feature amount related to the cell type, identifies a second spot from the cell analysis image based on the feature amount related to the target substance, and selects a region where the first spot and the second spot overlap as the region of interest (see the illustrative sketch after this list). Therefore, the available methods for selecting the region of interest are expanded, and the region of interest desired by the user can be selected.
• The control unit 61 acquires a plurality of cell type identification images for identifying different cell types and a plurality of cell analysis images for analyzing the identified cell types. Therefore, more detailed analysis becomes possible.
• The tissue image is a whole slide image obtained by photographing the entire tissue specimen 50. Therefore, highly accurate analysis can be performed efficiently in pathological diagnosis using a whole slide image.
• The acquisition means includes a first photographing means for acquiring the tissue image (such as a low-magnification objective lens of the first image acquisition unit 20 or the second image acquisition unit 30) and a second photographing means (such as a high-magnification objective lens of the second image acquisition unit 30) having a higher photographing magnification than the first photographing means, and the re-photographing of the selected region of interest is performed by the second photographing means. Therefore, an enlarged image of the selected region of interest can be taken, and more accurate analysis becomes possible.
• As the first photographing means, a low-magnification microscope may be used in addition to a whole slide scanner, as long as a WSI can be obtained at low magnification.
• The first photographing means and the second photographing means may be configured as different photographing means within the same photographing device, or as two different photographing devices.
• An analysis means for analyzing the re-photographed image obtained by re-photographing the region of interest with the second photographing means is further provided. Therefore, the enlarged image of the selected region of interest can be analyzed.
• A tissue section is targeted as the biological sample, and the tissue specimen 50 is stained with an immunostaining agent containing fluorescent substance-accumulated nanoparticles as a fluorescent label.
• The biological sample may instead be cultured cells or a gene (DNA).
• The present invention can be used in an image processing system, an image processing method, and a program that enable highly accurate analysis to be performed efficiently.
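The overlap-based selection of the region of interest and the per-cell analysis data mentioned above can be illustrated with a minimal Python sketch. It is not taken from the patent: the feature maps, thresholds, and bright-spot counts are hypothetical stand-ins for the feature amounts described in the text, and since the PID score itself is not defined in this passage, only generic per-cell statistics are computed.

import numpy as np

def select_overlap_roi(cell_type_feature, target_feature,
                       cell_type_threshold=0.5, target_threshold=0.5):
    """Derive a 'first spot' mask from a cell-type feature map and a 'second
    spot' mask from a target-substance feature map, then keep their overlap
    as the region of interest to re-photograph."""
    first_spot = cell_type_feature >= cell_type_threshold
    second_spot = target_feature >= target_threshold
    roi_mask = first_spot & second_spot
    ys, xs = np.nonzero(roi_mask)
    if xs.size == 0:
        return roi_mask, None
    # Bounding box (x_min, y_min, x_max, y_max) of the overlapping region.
    return roi_mask, (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))

def per_cell_statistics(bright_spots_per_cell):
    """Summarize per-cell PID bright-spot counts into simple statistics, as one
    example of 'analysis data' for the expression level of the target substance."""
    counts = np.asarray(bright_spots_per_cell, dtype=float)
    return {"cells": int(counts.size),
            "mean_spots_per_cell": float(counts.mean()),
            "median_spots_per_cell": float(np.median(counts)),
            "stdev_spots_per_cell": float(counts.std())}

# Example with synthetic inputs standing in for real feature maps and counts.
rng = np.random.default_rng(0)
mask, bbox = select_overlap_roi(rng.random((64, 64)), rng.random((64, 64)))
print(bbox, per_cell_statistics([3, 12, 7, 25, 0]))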
• 1 Image processing system
• 10 Microscope device
• 20 First image acquisition unit
• 21 Bright field light source
• 22 First image sensor
• 30 Second image acquisition unit (first imaging means, second imaging means)
• 31 Transmission light source
• 32 Excitation light source
• 33 Second image sensor
• 40 Stage
• 50 Tissue specimen
• 60 Control device
• 61 Control unit (acquisition means, selection means, identification means, analysis means)
• 62 Storage unit
• 63 Image processing unit
• 64 Communication unit
• 70 Display device
• 80 Database

Abstract

According to this image processing system 1, a control unit 61 acquires a histological image captured of a tissue specimen and selects, from the acquired histological image, an image of an area of interest for re-imaging. The control unit 61 acquires a cell-type identification image for identifying at least one cell type in the tissue specimen 50 and at least one cell analysis image for analyzing the identified cells, and then selects the area of interest on the basis of a feature quantity pertaining to the cell type calculated from the cell-type identification image and a feature quantity pertaining to at least one target substance calculated from the cell analysis image.

Description

Image processing system, image processing method and program
The present invention relates to an image processing system, an image processing method, and a program.
In recent years, virtual microscopes have been attracting attention as an advanced technology in fields such as pathology. A virtual microscope is a system that converts an image observed with an optical microscope into digital data and allows a tissue specimen to be observed on a display as if an optical microscope were actually being used (see, for example, Patent Document 1).
Specifically, the entire tissue specimen on a slide glass is photographed, the obtained image is converted into digital data and stored in a database, and observation is performed using viewer software installed on a personal computer or the like. It is called a virtual microscope because, as with observation using an optical microscope, the image can be observed while performing operations such as moving up, down, left, and right and zooming in and out. Digital image data of the entire tissue specimen, called a whole slide image (WSI), is stored in a database. Since the stored digital image data can be accessed via the Internet or the like, it enables, for example, rapid pathological diagnosis by a pathologist at a remote location, and rare tissue specimens can be made viewable by anyone.
Patent Document 1: JP-A-2018-72253
Incidentally, in order to perform analysis for diagnosis or the like, image quality that can withstand the analysis must be ensured. Therefore, in order to perform highly accurate analysis, the tissue specimen must be photographed at high magnification.
However, as described in Patent Document 1, when the entire pathological specimen is photographed in a high-magnification field of view, the size of the acquired image becomes enormous, analysis using the image takes an enormous amount of time, and the efficiency of the analysis decreases.
The present invention has been made in view of the above problems, and an object of the present invention is to provide an image processing system, an image processing method, and a program that make it possible to perform highly accurate analysis efficiently.
In order to solve the above problems, the image processing system of the present invention includes:
an acquisition means for acquiring a tissue image obtained by photographing a tissue specimen; and
a selection means for selecting, from the tissue image acquired by the acquisition means, an image of a region of interest for re-photographing by the acquisition means,
wherein the acquisition means acquires a cell type identification image for identifying one or more cell types in the tissue specimen and one or more cell analysis images for analyzing the identified cells, and
the selection means selects the region of interest based on a feature amount related to the cell type calculated from the cell type identification image and a feature amount related to one or more target substances calculated from the cell analysis image.
The image processing method of the present invention includes:
an acquisition step of acquiring a tissue image obtained by photographing a tissue specimen; and
a selection step of selecting, from the tissue image acquired in the acquisition step, an image of a region of interest for re-photographing in the acquisition step,
wherein the acquisition step acquires a cell type identification image for identifying one or more cell types in the tissue specimen and one or more cell analysis images for analyzing the identified cells, and
the selection step selects the region of interest based on a feature amount related to one or more cell types calculated from the cell type identification image and a feature amount related to one or more target substances calculated from the cell analysis image.
The program of the present invention causes a computer to function as:
an acquisition means for acquiring a tissue image obtained by photographing a tissue specimen; and
a selection means for selecting, from the tissue image acquired by the acquisition means, an image of a region of interest for re-photographing by the acquisition means,
wherein the acquisition means acquires a cell type identification image for identifying one or more cell types in the tissue specimen and one or more cell analysis images for analyzing the identified cells, and
the selection means selects the region of interest based on a feature amount related to one or more cell types calculated from the cell type identification image and a feature amount related to one or more target substances calculated from the cell analysis image.
According to the present invention, highly accurate analysis can be performed efficiently; that is, higher accuracy and higher efficiency of analysis can be achieved at the same time.
FIG. 1 is a diagram showing the schematic configuration of the image processing system according to the present invention.
FIG. 2 is a diagram showing the schematic configuration of the first image acquisition unit.
FIG. 3 is a diagram showing the schematic configuration of the second image acquisition unit.
FIG. 4 is a flowchart showing the overall flow of the operation of the image processing system.
FIG. 5 is a flowchart showing the control during focusing.
FIG. 6 is a diagram showing a method of setting the imaging region.
FIG. 7 is a diagram showing a method of setting focus measurement positions in a bright-field image.
FIG. 8 is a diagram showing an example of a focus map.
FIG. 9 is a diagram showing a method of setting focus measurement positions in a fluorescence image.
FIG. 10 is a flowchart showing the control during fluorescence image acquisition.
FIG. 11 is a diagram showing an example of a partial image acquisition method.
FIG. 12 is a flowchart showing the control of the region-of-interest selection process.
FIG. 13 is an example of a cell type identification image.
FIG. 14 is an example of a cell analysis image.
FIG. 15 is an example of an image showing region information.
FIG. 16 is a diagram showing an example of a selected region of interest.
FIG. 17 is a flowchart showing the control of the region-of-interest selection process of Modification 1.
FIG. 18 is a diagram showing an example of a selected region of interest.
FIG. 19 is a flowchart showing the control of the region-of-interest selection process of Modification 2.
Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings.
[Image processing system]
FIG. 1 shows the schematic configuration of the image processing system 1 (WSI creation system) according to the present invention. As shown in FIG. 1, the image processing system 1 includes a microscope device 10, a control device 60, a display device 70, and a database 80.
The microscope device 10 includes a first image acquisition unit 20, a second image acquisition unit 30, and a stage 40.
A tissue specimen 50 after immunostaining is placed on the stage 40. The tissue specimen 50 is an example of a biological sample.
FIG. 2 shows the schematic configuration of the first image acquisition unit 20.
The first image acquisition unit 20 acquires a bright-field image of the tissue specimen 50. The first image acquisition unit 20 includes a bright-field light source 21, a first image sensor 22, and a light guide lens 23.
The bright-field light source 21 is a light source that irradiates the tissue specimen 50 with light for generating an optical image for bright-field image acquisition, and is installed so as to irradiate light from below the stage 40. When the tissue specimen 50 is illuminated by the bright-field light source 21 and an optical image is generated, the optical image is guided to the first image sensor 22 via the light guide lens 23, and a bright-field image of the tissue specimen 50 is captured by the first image sensor 22.
The first image sensor 22 is an image sensor, such as a two-dimensional CCD sensor, capable of acquiring a two-dimensional image from the optical image of the tissue specimen 50.
FIG. 3 shows the schematic configuration of the second image acquisition unit 30.
The second image acquisition unit 30 acquires a fluorescence image of the tissue specimen 50.
The second image acquisition unit 30 includes a transmission light source 31, an excitation light source 32, a second image sensor 33, an objective lens 34, a fluorescence cube 35, and an imaging lens 36. The fluorescence cube 35 includes an excitation filter 351, a dichroic mirror 352, and an absorption filter 353.
The transmission light source 31 is a light source used when acquiring a transmission observation image of the tissue specimen 50, and is installed so as to irradiate light from below the stage 40.
The excitation light source 32 is a lamp, such as a discharge tube, that emits excitation light. The excitation filter 351 is a filter that transmits only the excitation light. The dichroic mirror 352 is a mirror that reflects or transmits light with a predetermined wavelength as the boundary; here, it reflects the excitation light and transmits fluorescence. The absorption filter 353 is a filter that blocks the excitation light and transmits only fluorescence.
In the second image acquisition unit 30, when the excitation light source 32 is turned on, the excitation light passes through the excitation filter 351, is reflected by the dichroic mirror 352, passes through the objective lens 34, and irradiates the tissue specimen 50. As a result, fluorescence is emitted from the tissue specimen 50; the fluorescence is collected by the objective lens 34 and passes through the dichroic mirror 352 and the absorption filter 353. The fluorescence is then guided as a fluorescence image to the second image sensor 33 via the imaging lens 36 and captured by the second image sensor 33. The objective lens 34 includes a low-magnification objective lens (for example, 20x) and a high-magnification objective lens (for example, 40x).
The second image sensor 33 is an image sensor, such as a one-dimensional CCD camera, capable of acquiring a one-dimensional image or a two-dimensional image whose longitudinal direction is a predetermined direction, and can acquire a high-resolution fluorescence image of the tissue specimen 50.
A control device 60 that controls these components is connected to the microscope device 10. The control device 60 includes a control unit (acquisition means, selection means, identification means, analysis means) 61, a storage unit 62, an image processing unit 63, and a communication unit 64.
The control unit 61 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), and the like, executes various kinds of processing in cooperation with various programs stored in the storage unit 62, and comprehensively controls the operation of the microscope device 10.
The control unit 61 is connected to the stage 40 and can control the raising and lowering of the stage 40 to control the focusing position (Z coordinate) of the tissue specimen 50 placed on the stage 40. The control unit 61 is also connected to the first image acquisition unit 20 and controls the bright-field light source 21 and the first image sensor 22 to capture bright-field images. Furthermore, the control unit 61 is connected to the second image acquisition unit 30 and controls the transmission light source 31, the excitation light source 32, and the second image sensor 33 to capture fluorescence images.
The storage unit 62 is composed of, for example, an HDD (Hard Disk Drive) or a semiconductor non-volatile memory. The storage unit 62 stores programs for bright-field image capture and fluorescence image capture.
The image processing unit 63 performs image processing on the fluorescence images captured by the microscope device 10 to create a whole slide image (WSI). As described later, in accordance with instructions from the control unit 61, the captured partial images are combined to create an entire image of the tissue specimen 50, and the image data is A/D converted into a digital image to create the WSI. In addition, based on the created WSI, a fluorescence brightness map used for quantitative analysis of the target substance is created.
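The patent does not specify how the fluorescence brightness map is computed; the following is a minimal Python sketch under the assumption that it is a block-averaged intensity map of a single-channel fluorescence WSI, with the block size chosen arbitrarily.

import numpy as np

def brightness_map(wsi_gray, block=64):
    """Average the intensities of a single-channel fluorescence WSI over
    fixed-size blocks to obtain a coarse brightness map."""
    h, w = wsi_gray.shape
    h_crop, w_crop = (h // block) * block, (w // block) * block
    img = wsi_gray[:h_crop, :w_crop].astype(np.float64)
    # Reshape into (rows, block, cols, block) and average each block.
    return img.reshape(h_crop // block, block, w_crop // block, block).mean(axis=(1, 3))

# Example with a synthetic image standing in for a real WSI.
print(brightness_map(np.random.default_rng(1).random((300, 400))).shape)  # (4, 6)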
The communication unit 64 is an interface for transmitting and receiving data to and from an external device such as a personal computer. A user who wants to refer to a WSI can load the WSI stored in the database 80 onto a personal computer or the like via the communication unit 64 and observe it on a display.
A display device 70 is connected to the control device 60.
The display device 70 includes a monitor such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display), and displays various screens in accordance with display signals input from the control unit 61. In the present embodiment, the display device 70 functions as an output means for outputting the captured fluorescence images and the like.
A database 80 is further connected to the control device 60. The database 80 includes, for example, an HDD (Hard Disk Drive) and stores the WSI synthesized by the image processing unit 63.
In the present embodiment, the WSI is described as being stored in the database 80 as above; however, as long as the WSI can be stored, the storage area is not limited to the database 80, and a configuration without the database 80 is also possible.
For example, the WSI may be stored in the storage unit 62, or may be stored on an external server (not shown) to form a database.
[Tissue specimen]
Next, the tissue specimen 50 will be described.
The tissue specimen 50 is a tissue section containing a target substance and is stained with an immunostaining agent; the stained tissue specimen 50 is placed on the stage 40.
(1) Target substance
The target substance is a substance that is present in a tissue section and is the target of immunostaining using a fluorescent label, mainly for detection or quantification from the viewpoint of pathological diagnosis; in particular, it is a protein (antigen).
Typical target substances include biological substances that are expressed in the cell membranes of various cancer tissues and can be used as biomarkers, such as proteins and RNA.
The target substance may also be a substance introduced into the body from outside, such as a drug. Units smaller than proteins, such as peptides, can also be immunostained.
(2) Immunostaining agent (antibody-fluorescent nanoparticle conjugate)
As the immunostaining agent, in order to improve the efficiency of fluorescent labeling and suppress as much as possible the passage of time that leads to deterioration of fluorescence, it is preferable to use a complex in which the primary antibody and the fluorescent nanoparticles are linked indirectly, that is, by a bond other than a covalent bond, such as one utilizing an antigen-antibody reaction. To simplify the staining operation, a complex in which the fluorescent nanoparticles are directly bound to the primary antibody or a secondary antibody can also be used as the immunostaining agent.
An example of the immunostaining agent is [primary antibody against the target substance]…[antibody against the primary antibody (secondary antibody)]~[fluorescent nanoparticle].
Here, “…” indicates binding by an antigen-antibody reaction, and the mode of binding indicated by “~” is not particularly limited; examples include covalent bonds, ionic bonds, hydrogen bonds, coordination bonds, antigen-antibody binding, the biotin-avidin reaction, physical adsorption, and chemical adsorption, and a linker molecule may be interposed if necessary.
(3) Antibody
As the primary antibody, an antibody (IgG) that specifically recognizes and binds to the target substance as an antigen can be used. For example, an anti-HER2 antibody can be used when HER2 is the target substance, and an anti-HER3 antibody can be used when HER3 is the target substance.
As the secondary antibody, an antibody (IgG) that specifically recognizes and binds to the primary antibody as an antigen can be used.
Both the primary antibody and the secondary antibody may be polyclonal antibodies, but monoclonal antibodies are preferable from the viewpoint of quantitative stability. The type of animal that produces the antibody (immunized animal) is not particularly limited, and may be selected from mouse, rat, guinea pig, rabbit, goat, sheep, and the like, as in conventional practice.
(4) Fluorescent nanoparticles
Fluorescent nanoparticles are nano-sized particles that emit fluorescence when irradiated with excitation light, and are particles capable of emitting fluorescence of sufficient intensity to represent the target substance, molecule by molecule, as bright spots.
In the present invention, fluorescent substance-accumulated nanoparticles (PID: Phosphor Integrated Dot nanoparticles) are used as the fluorescent nanoparticles.
(4.1) Fluorescent substance-accumulated nanoparticles
A PID is a nano-sized particle whose matrix is a particle made of an organic or inorganic material, and which has a structure in which a plurality of fluorescent substances (for example, the quantum dots or organic fluorescent dyes described below) are encapsulated in the matrix and/or adsorbed on its surface. Quantum dot-accumulated nanoparticles, fluorescent dye-accumulated nanoparticles, and the like are used as PIDs.
The fluorescent substance used for the PID preferably emits visible to near-infrared light with a wavelength in the range of 400 to 900 nm when excited by ultraviolet to near-infrared light with a wavelength in the range of 200 to 700 nm, and it is preferable that the matrix and the fluorescent substance have substituents or sites with mutually opposite charges so that an electrostatic interaction acts between them.
The average particle size of the PIDs used in the present invention is not particularly limited, but particles of about 30 to 800 nm can be used. When the average particle size is less than 30 nm, the amount of fluorescent substance contained in the accumulated particle is small and quantitative evaluation of the target substance becomes difficult; when it exceeds 800 nm, binding to the target substance in the pathological tissue becomes difficult. The average particle size is more preferably in the range of 40 to 500 nm. This range is specified because a size of less than 40 nm requires an expensive detection system, while a size exceeding 500 nm narrows the quantification range because of the physical size of the particles.
The coefficient of variation (= (standard deviation / average value) × 100%) indicating the variation in particle size is not particularly limited, but it is desirable to use particles with a value of 15% or less. The smaller the variation in particle size, the smaller the variation in the brightness of the fluorescent bright spots, and, as described later, the expression level of the target substance can be quantitatively evaluated based on the fluorescence brightness. The average particle size can be obtained by taking electron micrographs with a scanning electron microscope (SEM), measuring the cross-sectional areas of a sufficient number of particles, and taking as the particle size the diameter of the circle whose area equals each measured value.
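As a minimal sketch of the two quantities just described, the following Python snippet converts SEM-measured cross-sectional areas into equivalent circle diameters and computes the coefficient of variation; the numerical values in the example are hypothetical.

import math
import statistics

def equivalent_circle_diameters(cross_section_areas_nm2):
    """Convert particle cross-sectional areas (nm^2) into the diameters (nm)
    of circles with the same area, as described for the average particle size."""
    return [2.0 * math.sqrt(a / math.pi) for a in cross_section_areas_nm2]

def coefficient_of_variation(diameters_nm):
    """CV = (standard deviation / average value) x 100%, per the text."""
    return statistics.pstdev(diameters_nm) / statistics.mean(diameters_nm) * 100.0

# Example with hypothetical cross-sectional areas for a few particles.
d = equivalent_circle_diameters([7850.0, 8100.0, 7500.0, 8300.0])
print(round(statistics.mean(d), 1), "nm average,",
      round(coefficient_of_variation(d), 1), "% CV")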
(4.1.1) Matrix
Among matrix materials, examples of organic materials include resins generally classified as thermosetting resins, such as melamine resin, urea resin, aniline resin, guanamine resin, phenol resin, xylene resin, and furan resin; resins generally classified as thermoplastic resins, such as styrene resin, acrylic resin, acrylonitrile resin, AS resin (acrylonitrile-styrene copolymer), and ASA resin (acrylonitrile-styrene-methyl acrylate copolymer); other resins such as polylactic acid; and polysaccharides.
Examples of inorganic matrix materials include silica and glass.
(4.1.2) Quantum dot-accumulated nanoparticles
Quantum dot-accumulated nanoparticles have a structure in which quantum dots are encapsulated in the matrix and/or adsorbed on its surface.
When the quantum dots are encapsulated in the matrix, the quantum dots only need to be dispersed inside the matrix, and may or may not be chemically bonded to the matrix itself.
As the quantum dots, semiconductor nanoparticles containing a group II-VI compound, a group III-V compound, or a group IV element are used. Examples include CdSe, CdS, CdTe, ZnSe, ZnS, ZnTe, InP, InN, InAs, InGaP, GaP, GaAs, Si, and Ge.
Quantum dots having one of the above quantum dots as a core and a shell provided on the core can also be used. Hereinafter, as the notation for a quantum dot having a shell, when the core is CdSe and the shell is ZnS, it is written as CdSe/ZnS. For example, CdSe/ZnS, CdS/ZnS, InP/ZnS, InGaP/ZnS, Si/SiO2, Si/ZnS, Ge/GeO2, and Ge/ZnS can be used, but the quantum dots are not limited to these.
If necessary, quantum dots surface-treated with an organic polymer or the like may be used. Examples include CdSe/ZnS having surface carboxy groups (manufactured by Invitrogen) and CdSe/ZnS having surface amino groups (manufactured by Invitrogen).
Quantum dot-accumulated nanoparticles can be prepared by known methods. For example, silica nanoparticles encapsulating quantum dots can be synthesized with reference to the synthesis of CdTe-encapsulating silica nanoparticles described in New Journal of Chemistry, Vol. 33, p. 561 (2009).
Silica nanoparticles carrying quantum dots on their outer surface can be synthesized with reference to the synthesis, described in Chemical Communications, p. 2670 (2009), of silica nanoparticles on whose surface CdSe/ZnS particles capped with 5-amino-1-pentanol and APS are accumulated.
Polymer nanoparticles encapsulating quantum dots can be prepared using the method of impregnating polystyrene nanoparticles with quantum dots described in Nature Biotechnology, Vol. 19, p. 631 (2001).
(4.1.3) Fluorescent dye-accumulated nanoparticles
Fluorescent dye-accumulated nanoparticles have a structure in which a fluorescent dye is encapsulated in the matrix and/or adsorbed on its surface.
Examples of the fluorescent dye include organic fluorescent dyes such as rhodamine-based dye molecules, squarylium-based dye molecules, cyanine-based dye molecules, aromatic-ring-based dye molecules, oxazine-based dye molecules, carbopyronine-based dye molecules, and pyrromethene-based dye molecules.
Specifically, Alexa Fluor (registered trademark, manufactured by Invitrogen) dye molecules, BODIPY (registered trademark, manufactured by Invitrogen) dye molecules, Cy (registered trademark, manufactured by GE Healthcare) dye molecules, HiLyte (registered trademark, manufactured by AnaSpec) dye molecules, DyLight (registered trademark, manufactured by Thermo Scientific) dye molecules, ATTO (registered trademark, manufactured by ATTO-TEC) dye molecules, MFP (registered trademark, manufactured by Mobitec) dye molecules, CF (registered trademark, manufactured by Biotium) dye molecules, DY (registered trademark, manufactured by DYOMICS) dye molecules, CAL (registered trademark, manufactured by BioSearch Technologies) dye molecules, and the like can be used.
When the fluorescent dye is encapsulated in the matrix, the fluorescent dye only needs to be dispersed inside the matrix, and may or may not be chemically bonded to the matrix itself.
Fluorescent dye-accumulated nanoparticles can be prepared by known methods. For example, silica nanoparticles encapsulating a fluorescent dye can be synthesized with reference to the synthesis of FITC-encapsulating silica particles described in Langmuir, Vol. 8, p. 2921 (1992). Various fluorescent dye-accumulated nanoparticles can be synthesized by using a desired fluorescent dye instead of FITC.
Polystyrene nanoparticles encapsulating a fluorescent dye can be prepared using the copolymerization method using an organic dye having a polymerizable functional group described in U.S. Patent No. 4,326,008 (1982), or the method of impregnating polystyrene nanoparticles with a fluorescent dye described in U.S. Patent No. 5,326,692 (1992).
(5) Method of staining a tissue section
An example of the staining method will be described.
The method of preparing a tissue section to which this staining method can be applied (also referred to simply as a "section", including pathological sections and the like) is not particularly limited, and a section prepared by a known procedure can be used.
(5.1) Specimen preparation step
(5.1.1) Deparaffinization treatment
The section is immersed in a container containing xylene to remove the paraffin. The temperature is not particularly limited, but the treatment can be performed at room temperature. The immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, the xylene may be replaced during immersion.
Next, the section is immersed in a container containing ethanol to remove the xylene. The temperature is not particularly limited, but the treatment can be performed at room temperature. The immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, the ethanol may be replaced during immersion.
The section is then immersed in a container containing water to remove the ethanol. The temperature is not particularly limited, but the treatment can be performed at room temperature. The immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, the water may be replaced during immersion.
(5.1.2) Activation treatment
The target substance is subjected to activation treatment in accordance with a known method.
The activation conditions are not particularly specified, but as the activation liquid, 0.01 M citrate buffer (pH 6.0), 1 mM EDTA solution (pH 8.0), 5% urea, 0.1 M Tris-HCl buffer, or the like can be used.
The pH condition is selected from the range of pH 2.0 to 13.0 depending on the tissue section used, under conditions where a signal is obtained and tissue degradation remains at a level at which the signal can still be evaluated. The treatment is usually performed at pH 6.0 to 8.0, but for special tissue sections it may also be performed at, for example, pH 3.0.
As the heating device, an autoclave, a microwave oven, a pressure cooker, a water bath, or the like can be used. The treatment can be performed at a temperature of 50 to 130°C for 5 to 30 minutes.
Next, the section after the activation treatment is immersed in a container containing PBS and washed. The temperature is not particularly limited, but washing can be performed at room temperature. The immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, the PBS may be replaced during immersion.
(5.2) Immunostaining step
In the immunostaining step, in order to stain the target substance, a solution of an immunostaining agent containing fluorescent nanoparticles having a site capable of binding directly or indirectly to the target substance is placed on the section and allowed to react with the target substance. The solution of the immunostaining agent used in this step may be prepared in advance before this step.
When a plurality of target substances are to be detected, immunostaining is performed with a plurality of immunostaining agents corresponding to the target substances. The plurality of immunostaining agents used in this case only need to include at least one immunostaining agent using PIDs (PID staining agent); as long as the antibodies and fluorescent substances (fluorescence wavelengths) differ from one another, a plurality of target substances can also be detected by multiple staining using several PID staining agents, or by multiple staining combining a PID staining agent with an immunostaining agent using a fluorescent label such as an organic fluorescent substance or a quantum dot. In this case, a solution of each immunostaining agent is prepared, placed on the section, and allowed to react with the target substances; the solutions of the respective immunostaining agents may be mixed in advance before being placed on the section, or they may be placed separately in sequence.
When a plurality of immunostaining agents are used, it is desirable that the excitation/emission wavelengths of the PID and those of the fluorescent labels of the other immunostaining agents are separated to such an extent that crosstalk can be ignored.
The conditions for performing the immunostaining step, that is, the temperature and immersion time when immersing the tissue section in the solution of the immunostaining agent, can be adjusted as appropriate so that an appropriate signal is obtained, in accordance with conventional immunostaining methods.
The temperature is not particularly limited, and the reaction can be performed at room temperature. The reaction time is preferably 30 minutes or more and 24 hours or less.
Before performing the treatment described above, it is preferable to apply a known blocking agent such as BSA-containing PBS or a surfactant such as Tween 20 dropwise.
(5.3) Specimen post-treatment step
After the immunostaining step, the tissue specimen is preferably subjected to treatments such as fixation/dehydration, clearing, and mounting so that it is suitable for observation.
For the fixation/dehydration treatment, the tissue section may be immersed in a fixation solution (a crosslinking agent such as formalin, paraformaldehyde, glutaraldehyde, acetone, ethanol, or methanol). For the clearing treatment, the fixed and dehydrated tissue section may be immersed in a clearing liquid (such as xylene). For the mounting treatment, the cleared tissue section may be immersed in a mounting medium.
The conditions for performing these treatments, for example the temperature and immersion time when immersing the tissue section in a given treatment liquid, can be adjusted as appropriate so that an appropriate signal is obtained, in accordance with conventional immunostaining methods.
(5.4) Morphological observation staining step
Separately from the immunostaining step, morphological observation staining is performed so that the morphology of cells, tissues, organs, and the like can be observed. The staining method is not particularly limited as long as it can represent the morphology of cells, tissues, organs, and the like. For example, in addition to the conventional staining methods mentioned below, staining methods using fluorescent dyes can also be used.
The morphological observation staining step can be performed according to conventional methods.
For morphological observation of the tissue specimen 50, staining with eosin, in which cytoplasm, stroma, various fibers, erythrocytes, and keratinocytes are stained red to deep red, is used as standard. Staining with hematoxylin, in which cell nuclei, calcified areas, cartilage tissue, bacteria, and mucus are stained blue-violet to pale blue, is also used as standard (the method of performing these two stains at the same time is known as hematoxylin-eosin staining (HE staining)).
When the morphological observation staining step is included, it may be performed after the immunostaining step or before the immunostaining step.
[Operation of the image processing system]
FIG. 4 shows the overall flow of the operation of the image processing system 1 in the present invention.
As shown in FIG. 4, the operation of the image processing system 1 includes a focusing step (step S101), a low-magnification image acquisition step (step S102), a region-of-interest selection step (step S103), an analysis image acquisition step (step S104), and an analysis step (step S105).
Each step will be described below.
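As an orientation aid only, the overall flow can be pictured as a short Python driver. Every helper function below is a hypothetical placeholder; only the ordering of the five steps comes from the description of FIG. 4.

# Hypothetical placeholder implementations; only the step ordering (S101-S105)
# follows the description of FIG. 4.
def focusing_step(specimen):
    return {"focus_map": "..."}

def acquire_low_magnification_wsi(specimen, focus_map):
    return "wsi"

def select_regions_of_interest(wsi):
    return [(0, 0, 512, 512)]

def acquire_analysis_image(specimen, roi):
    return {"roi": roi}

def analyze(image):
    return {"result": "..."}

def run_pipeline(specimen):
    """High-level sketch of the flow shown in FIG. 4."""
    focus_map = focusing_step(specimen)                                    # step S101
    wsi = acquire_low_magnification_wsi(specimen, focus_map)               # step S102
    rois = select_regions_of_interest(wsi)                                 # step S103
    analysis_images = [acquire_analysis_image(specimen, r) for r in rois]  # step S104
    return [analyze(img) for img in analysis_images]                       # step S105

print(run_pipeline("tissue specimen 50"))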
[Focusing step]
First, the focusing step will be described.
In the present embodiment, a bright-field image of the tissue specimen 50 is acquired by the first image acquisition unit 20, the imaging region to be used for creating the WSI is set based on this image, and focusing is performed with the bright-field image as a reference. Furthermore, the second image acquisition unit 30 irradiates the PIDs fluorescently labeling the tissue specimen 50 with excitation light, and more precise focusing is performed with reference to the fluorescent bright spots of the detected PIDs.
The specific control will be described with reference to the flowchart of FIG. 5.
First, the control unit 61 controls the first image acquisition unit 20 to acquire a bright-field image of the entire slide glass for focusing (step S1). This bright-field image is used to set the imaging conditions for high-resolution images such as the fluorescence images described later, and is a low-magnification image acquired using a low-magnification objective lens. Based on the obtained bright-field image, the control unit 61 sets an imaging region R including the tissue specimen 50, as shown in FIG. 6 (step S2).
Specifically, the control unit 61 binarizes the entire image of the tissue specimen 50 according to the presence or absence of the tissue specimen 50 and detects the region where the tissue specimen 50 exists in each of the X-axis and Y-axis directions, thereby determining the imaging region R. The imaging region R may also be set manually by the user on the display device 70 while observing the bright-field image of the entire tissue specimen 50, but it is preferably set automatically.
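A minimal sketch of this binarize-and-take-extent determination of the imaging region R is shown below; the intensity threshold and the assumption that tissue appears darker than the empty slide in bright field are illustrative and not taken from the patent.

import numpy as np

def imaging_region(bright_field, background_threshold=220):
    """Binarize a low-magnification bright-field image by the presence/absence
    of tissue and return the extent of tissue pixels along the X and Y axes
    as the imaging region R (x_min, y_min, x_max, y_max)."""
    tissue = bright_field < background_threshold   # True where tissue is assumed present
    ys, xs = np.nonzero(tissue)
    if xs.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# Example: a synthetic slide image with a darker rectangle standing in for tissue.
slide = np.full((200, 300), 255, dtype=np.uint8)
slide[50:150, 80:220] = 120
print(imaging_region(slide))  # (80, 50, 219, 149)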
Next, focusing of the tissue specimen 50 is performed based on the bright-field image. Focusing based on the bright-field image may be performed manually by the user, but it is preferably performed automatically under the control of the control unit 61. A method of automatically creating a focus map and performing focusing under the control of the control unit 61 will be described below.
First, first focus measurement positions P1 are set on the imaging region R (step S3).
As shown in FIG. 7, the control unit 61 divides the imaging region R in each of the X-axis and Y-axis directions to set small regions, and obtains the XY coordinates of each small region. Here, the XY coordinates are the center coordinates of each small region, but this is not limiting; for example, the coordinates of the upper left corner of each small region may be used as the XY coordinates.
Further, as shown in FIG. 7, the control unit 61 assigns numbers such as 1, 2, 3, ... to each small region in each of the X-axis and Y-axis directions, and sets array numbers. For example, the array number of the small region at the upper left corner of the imaging region R is (X-axis, Y-axis) = (1, 1).
Next, the control unit 61 sets a first focus measurement position P1 for each small region. In the present embodiment, P1 is the center coordinate position of each small region, but this is not limiting; for example, the upper left corner of each small region may be used as the first focus measurement position P1. In some cases, as in the region with array number (1, 1) in FIG. 7, the tissue specimen 50 does not exist at the center coordinates. In such a case, the first focus measurement position P1 can be moved to arbitrary coordinates on the tissue specimen 50.
Subsequently, focusing is performed for the first focus measurement position P1 of each small region (step S4). Here, the control unit 61 aligns the optical axis position with each first focus measurement position P1 while controlling the movement of the stage 40 in the XY directions, and obtains a measured bright-field in-focus position (Z coordinate) for each first focus measurement position P1.
Based on the bright-field in-focus positions obtained in this way, the control unit 61 creates a focus map as shown in FIG. 8 (step S5). The focus map stores the array number of each small region and the corresponding stage coordinates. For the X and Y axes, the stage coordinates correspond to the center coordinates of each small region; for the Z axis, they correspond to the bright-field in-focus position.
This completes the focusing of the tissue specimen 50 based on the bright-field image.
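The focus map of FIG. 8 can be pictured as a lookup table keyed by array number; the following sketch assumes hypothetical inputs (small-region centre coordinates and measured in-focus Z values) and is not the patent's actual data format.

def build_focus_map(region_grid, measured_z):
    """For each small region, store its array number together with the stage
    coordinates: X and Y are the region centre, Z is the measured bright-field
    in-focus position."""
    return {
        array_no: {"x": x, "y": y, "z": measured_z[array_no]}
        for array_no, (x, y) in region_grid.items()
    }

# Example: a 2 x 2 grid of small regions with measured in-focus positions.
grid = {(1, 1): (500.0, 500.0), (2, 1): (1500.0, 500.0),
        (1, 2): (500.0, 1500.0), (2, 2): (1500.0, 1500.0)}
z = {(1, 1): 102.5, (2, 1): 103.1, (1, 2): 101.8, (2, 2): 102.9}
focus_map = build_focus_map(grid, z)
print(focus_map[(2, 1)])  # stage coordinates used when imaging that small region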
From step S6 onward, focusing is performed on the PID bright spots using the focusing information obtained from the bright-field image as a reference. That is, although the bright-field in-focus positions of the tissue specimen 50 are specified based on the bright-field image by the processing of steps S1 to S3 as described above, performing further focusing on the PID bright spots with these positions as a reference makes it possible to obtain a more precisely focused fluorescence image.
First, a fluorescence image of the PIDs for focusing is acquired (step S6). That is, the control unit 61 controls the excitation light source 32 to irradiate the tissue specimen 50 with PID excitation light, and acquires a fluorescence image of the PIDs with the second image sensor 33.
Next, arbitrary PID fluorescent bright spots on the obtained fluorescence image are selected, and second focus measurement positions P2 are set (step S7). Here, the user manually sets the second focus measurement positions P2; as shown in FIG. 9, one or more second focus measurement positions P2 are set on the tissue specimen 50. The second focus measurement positions P2 may also be set automatically.
Next, focusing is performed for the set second focus measurement positions P2 (step S8). Specifically, the control unit 61 aligns the optical axis position with the second focus measurement position P2 while controlling the movement of the stage 40 in the XY directions and, referring to the in-focus positions in the focus map created earlier, makes fine adjustments in the Z-coordinate direction to obtain the fluorescence in-focus position (Z coordinate) for the second focus measurement position P2.
When this has been performed for all the second focus measurement positions P2, the control unit 61 corrects the focus map using the newly obtained in-focus positions (step S9).
This completes the focusing on the tissue specimen 50.
The focusing method described above is merely an example, and the methods applicable to the present invention are not limited to it.
[Low-magnification image acquisition step]
 Next, the low-magnification image acquisition step, in which a WSI is created by low-magnification imaging, will be described.
 When the focusing is completed as described above, the process proceeds to creation of a WSI of the entire tissue sample 50.
 The method for creating a WSI of a fluorescence image will be described below with reference to the flowchart of FIG. 10.
 First, the fluorescent labels bound to the tissue sample 50 are excited (step S10). Specifically, the control unit 61 controls the excitation light source 32 to irradiate the tissue sample 50 with excitation light that excites the PID labels.
 Next, partial images of the tissue sample 50 are acquired (step S11).
 Here, based on the information in the focus map completed in step S9, the control unit 61 controls the movement of the stage 40 and controls the second image acquisition unit 30 to acquire partial fluorescence images. That is, the optical axis position and the in-focus position are moved to the XYZ coordinates indicated by the stage coordinates stored in the focus map, and the second image sensor 33 is controlled to capture an image of each small region. By using a high-magnification objective lens as the objective lens 34, high-resolution images can be acquired.
 Specifically, strip-shaped scan images as shown in FIG. 11 are acquired as the partial images. First, imaging of the tissue sample 50 is started from the upper left corner. The control unit 61 irradiates the excitation light and scans the imaging position of the second image sensor 33 while moving it in the positive Y-axis direction of the tissue section 51 to acquire partial image A. Subsequently, the control unit 61 moves the imaging position of the second image sensor 33 in the positive X-axis direction and acquires partial image B. Similarly, partial images are acquired in the order of partial image C, ..., partial image N, at which point imaging is complete.
 Next, the control unit 61 controls the image processing unit 63, as a creating means, to combine the captured partial images and create a fluorescence image of the entire imaging region R (step S12). That is, by arranging and stitching the partial images A to N in the X-axis direction, a high-resolution fluorescence image of the entire tissue sample 50 is obtained.
 Further, the image processing unit 63 A/D-converts the obtained fluorescence image of the entire imaging region R into a digital image (step S13). With this, the creation of the WSI is complete.
 The created WSI is stored in the database 80 as a storage means. A user who wants to refer to the WSI can load the image data into a personal computer or the like via the communication unit 64 and observe it on a display.
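 As a rough illustration of step S12, the strip-shaped partial images can be stitched by concatenating them along the X axis. The sketch below assumes equal strip heights and no overlap between strips; a real scanner would typically blend overlapping margins.

```python
import numpy as np

def stitch_strips(strips):
    """strips: list of 2-D arrays (partial images A..N) acquired in X order,
    each scanned along Y so that all strips share the same height."""
    height = strips[0].shape[0]
    assert all(s.shape[0] == height for s in strips), "all strips must share a height"
    return np.concatenate(strips, axis=1)  # place the strips side by side along X
```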
 The method for creating a WSI of a fluorescence image has been described above, but the WSI is not limited to a fluorescence image as long as it allows the target of WSI creation (for example, the entire tissue sample 50) to be grasped. For example, the WSI may be acquired as a bright-field image. A WSI of a bright-field image can be created in the same manner as the WSI of the fluorescence image described above.
 Further, in the present invention, WSIs are created for an image for specifying cell types (cell type identification image) and an image for analyzing the specified cells (cell analysis image).
 The target cell types include, for example, classification by differentiation such as hepatocytes, glial cells, and T cells, pathological classification such as canceration and inflammation, classification under specific conditions such as cell cycle and necrosis, and classification by spatial arrangement and shape features such as infiltration and protrusions.
[Region-of-interest selection step]
 Next, the region-of-interest selection step will be described.
 The region-of-interest selection step is a step of selecting a region of interest (FOV: Field Of View) using the created WSIs (the cell type identification image and the cell analysis image).
 The "region of interest" is not limited as long as it is a region useful for pathological diagnosis, and may include normal tissue regions as well as lesions such as cancer regions. Regions where the effect of a drug appears, or where a drug is present, may also be included in the region of interest because they are considered useful for pathological diagnosis.
 Further, a region specified by the colors of a bright-field image or by the shading of a fluorescence image may also be useful. The size of each region can be specified as appropriate, and is preferably about 100 um square.
 In particular, specifying regions using the number of fluorescent bright spots as an index can be useful for pathological diagnosis; regions with a large number, a medium number, and a small number of bright spots can each be identified and extracted. Regions with a large number are sometimes defined as hot spots, and regions with a small number as cold spots.
 The region-of-interest selection process will be described below with reference to the flowchart of FIG. 12.
 First, the control unit 61 reads two predetermined WSIs (the cell type identification image and the cell analysis image) from the database 80 with image processing software, performs preprocessing on the read images, for example noise removal and region setting (non-specific fluorescence exclusion), and acquires two images for region-of-interest selection (step S21).
 As an example of the cell type identification image, FIG. 13A shows an H-stained bright-field image. As an example of the cell analysis image, FIG. 13B shows a PD-L1-stained fluorescence image. The lower right image in each of FIGS. 13A and 13B is an enlarged view of part of the respective image.
(Noise removal processing)
 The noise removal processing is, for example, processing that suppresses autofluorescence brightness in the read fluorescence image. Specifically, the control unit 61 generates a frequency spectrum by DFT processing and multiplies the frequency spectrum by a high-pass filter image. A low-frequency-suppressed image is then generated by IDFT.
 Since fluorescence images are very large, it is preferable to divide the fluorescence image for processing according to the memory load when executing the DFT. That is, the above processing may be executed on each of the divided sub-images, which are finally recombined.
 Here, the fluorescence signal from the PID and the fluorescence signal from autofluorescence have different spatial frequency profiles. In general, autofluorescence contains many low-frequency components, whereas PID fluorescence has a steep peak shape and contains fewer low-frequency components than autofluorescence. For example, by obtaining the frequency information of the fluorescence image through frequency spectrum analysis such as DFT or FFT, the brightness signal of the PID and the brightness signal due to autofluorescence can be separated, and the brightness component due to autofluorescence can be selectively suppressed.
 The frequency spectrum can be reversibly converted back into a spatial signal by IDFT/IFFT; by inverse-transforming the frequency spectrum in which the autofluorescence brightness component has been selectively suppressed as described above, a fluorescence image with suppressed autofluorescence is obtained.
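 A minimal sketch of this DFT/IDFT high-pass suppression follows, assuming a hard circular cutoff in the frequency domain; the actual filter shape and cutoff value are not specified in the disclosure.

```python
import numpy as np

def suppress_autofluorescence(img: np.ndarray, cutoff: float = 0.05) -> np.ndarray:
    """Frequency-domain high-pass filtering: low-frequency (autofluorescence-like)
    components below the normalized radius `cutoff` are removed."""
    spectrum = np.fft.fftshift(np.fft.fft2(img.astype(np.float64)))
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot((yy - h / 2) / h, (xx - w / 2) / w)   # normalized frequency radius
    high_pass = (radius > cutoff).astype(np.float64)        # hard mask; could be smoothed
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * high_pass)).real
    return np.clip(filtered, 0.0, None)                     # keep brightness non-negative
```

 For very large WSIs, the same function can be applied tile by tile and the results recombined, as noted above.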
(Region setting (non-specific fluorescence exclusion) processing)
 In the region setting (non-specific fluorescence exclusion) processing, because fluorescence at the peripheral edge of the tissue can become stronger due to staining unevenness from the automatic stainer, for large sections only a region one size smaller than the peripheral edge is treated as the required region for subsequent processing.
 In detail, when uneven staining density occurs in a tissue section due to factors such as non-uniform flow of the staining solution in the automatic stainer or preferential dye adsorption at the edges caused by the thickness of the section, the portions with such artifacts have a high incidence of non-specific fluorescence and their fluorescence brightness cannot be analyzed on an equal footing, so they need to be placed outside the required region.
 Therefore, the control unit 61 specifies a search start point at the edge of the tissue section image by a conventionally known method and generates ROI information tracing the outline of the section with a contour-tracking algorithm that tolerates pixel value changes within a specific range. As a conventionally known method, for example, the Wand Tool of ImageJ (National Institutes of Health, MD, USA) can be mentioned.
 Next, the control unit 61 shrinks the generated ROI by several hundred pixels (roughly one microscope field of view), that is, contracts the designated pixel region inward.
 Next, the control unit 61 superimposes the generated ROI information on the HPF image and deletes the fluorescence pixel values outside the region (sets their brightness to zero), thereby excluding everything outside the required region. In other words, the required region is narrowed down. However, narrowing down the region is not essential in cases where non-specific staining unevenness around the tissue does not occur.
 The autofluorescence regions of cells without nuclei, such as erythrocytes, may also be excluded from the required region. In addition to the above method of deleting fluorescence pixel values outside the required region (setting brightness to zero), various known methods can be applied to narrow down the required region.
 Next, the control unit 61 cuts out the required region from the autofluorescence-suppressed image and generates an image for region-of-interest selection.
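 As one possible realization of the inward shrink and masking, binary erosion of a tissue mask can be used. The sketch below assumes the tissue outline has already been obtained as a binary mask (for example from the contour-tracking step) and uses an illustrative shrink width.

```python
import numpy as np
from scipy.ndimage import binary_erosion

def exclude_tissue_edge(fluor_img: np.ndarray, tissue_mask: np.ndarray,
                        shrink_px: int = 300) -> np.ndarray:
    """Shrink the tissue ROI inward by `shrink_px` pixels (roughly one microscope
    field of view) and set the fluorescence outside the shrunken ROI to zero."""
    shrunk = binary_erosion(tissue_mask.astype(bool), iterations=shrink_px)
    masked = fluor_img.copy()
    masked[~shrunk] = 0   # pixels outside the required region are given zero brightness
    return masked
```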
 Returning to FIG. 12, the control unit 61 next applies machine learning, or an image feature classification method that identifies characteristic portions of an image, to the cell type identification image for region-of-interest selection acquired in step S21; by identifying one or more analysis target regions within cells, it specifies the cell types and creates region information indicating the analysis target regions (step S22).
 Specific examples of machine learning or image feature classification methods include cell type recognition by machine learning, dye extraction from a bright-field H-stained image, and extraction of tumor-marker-stained cells.
 An analysis target region is a region that belongs to a specific type of cell and is subject to analysis; in step S23 described later, feature amount data relating to the target substance is calculated within the analysis target region. Specifically, the analysis target region is a specific site in a specific type of cell, for example the cell nucleus, cytoplasm, cell membrane, or a specific organelle of that cell type, or an arbitrary region defined by image processing, calculation, or the like.
 The region information may be, for example, an image in which the shape of the cells can be seen, as shown in FIG. 14.
 Next, based on the region information created in step S22, the control unit 61 identifies the analysis target regions in the fluorescence image for region-of-interest selection acquired in step S21, and calculates feature amount data relating to the target substance in each region (first feature amount data) (step S23).
 For example, the average brightness value in each region is calculated as the first feature amount data. By creating a heat map in which the average brightness value is represented by color shading, a histogram, or the like, the user can recognize the distribution of the target substance in the fluorescence image.
 Next, based on the region information created in step S22, the control unit 61 identifies the analysis target regions in the bright-field image for region-of-interest selection acquired in step S21, and calculates feature amount data relating to the cell type in each region (second feature amount data) (step S24).
 For example, the cell density in each region is calculated as the second feature amount data. By creating a heat map in which the cell density value is represented by color shading, a histogram, or the like, the user can recognize the distribution of those cells in the bright-field image.
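 The per-region calculation of the first and second feature amounts can be sketched as follows, assuming the region information of step S22 is given as a label image (0 = background, 1..N = analysis target regions) and that cell centroids have already been detected; the pixel size is an assumed value.

```python
import numpy as np

def mean_brightness_per_region(fluor_img: np.ndarray, labels: np.ndarray) -> dict:
    """First feature amount: average fluorescence brightness of each labeled region."""
    return {int(r): float(fluor_img[labels == r].mean())
            for r in np.unique(labels) if r != 0}

def cell_density_per_region(cell_centroids, labels: np.ndarray,
                            um_per_px: float = 0.5) -> dict:
    """Second feature amount: detected cells per square micrometre in each region.
    cell_centroids: list of (row, col) pixel coordinates of detected cells."""
    counts = {int(r): 0 for r in np.unique(labels) if r != 0}
    for row, col in cell_centroids:
        region = int(labels[int(row), int(col)])
        if region != 0:
            counts[region] += 1
    return {r: counts[r] / (float((labels == r).sum()) * um_per_px ** 2)
            for r in counts}
```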
 Next, the control unit 61 performs arithmetic processing using the first feature amount data calculated in step S23 and the second feature amount data calculated in step S24 to calculate composite feature amount data (step S25).
 Specifically, for each analysis target region, the control unit 61 performs arithmetic processing such as multiplying the average brightness value (first feature amount data) by the cell density (second feature amount data). The result of this calculation is the composite feature amount.
 Instead of using the first and second feature amount data as they are, the arithmetic processing may be performed after certain transformations, such as adding a fixed offset, normalization, or conversion to another dimension such as n-level stratification.
 Next, the control unit 61 selects a region of interest based on the calculated composite feature amount data (step S26).
 For example, a region whose composite feature amount value is equal to or greater than a predetermined first threshold can be selected as a region of interest (hot spot). A region whose value is less than a second threshold, smaller than the first threshold, may also be selected as a region of interest (cold spot). Alternatively, the top N regions with the highest composite feature amount values may be selected as regions of interest.
 As a result, for example, result data in which regions of interest SP are selected can be obtained, as shown in FIG. 15.
 By using the composite feature amount in this way, quantitative evaluation can be performed only for the target substance present in the cell type of interest rather than for any target substance present in the tissue sample, so the distribution of the target substance and the like can be grasped more accurately and the analysis can be performed with high precision.
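 A minimal sketch of steps S25 and S26, combining the two feature amounts by multiplication and selecting hot spots, cold spots, or the top-N regions, is shown below; the thresholds are illustrative parameters, not values from the disclosure.

```python
def composite_feature(mean_brightness: dict, cell_density: dict) -> dict:
    """Composite feature amount per region: here simply the product of the first and
    second feature amounts (offsets, normalization or stratification could be applied
    beforehand, as noted in the text)."""
    return {r: mean_brightness[r] * cell_density[r]
            for r in mean_brightness if r in cell_density}

def select_hot_and_cold_spots(composite: dict, hot_threshold: float,
                              cold_threshold: float, top_n: int = None):
    """Return (hot spots, cold spots) selected from the composite feature values."""
    ranked = sorted(composite, key=composite.get, reverse=True)
    hot = ranked[:top_n] if top_n else [r for r in ranked if composite[r] >= hot_threshold]
    cold = [r for r in ranked if composite[r] < cold_threshold]
    return hot, cold
```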
(Modification 1)
 Next, Modification 1 of the region-of-interest selection process will be described with reference to the flowchart of FIG. 16.
 In the region-of-interest selection process described above, composite feature amount data is calculated from the first and second feature amount data in step S25, and the region of interest is selected from those values in step S26.
 However, as shown in FIG. 16, predetermined spots may instead be selected (specified) from the fluorescence image using the first feature amount data and from the bright-field image using the second feature amount data (step S27), and the region of interest may then be selected from the result of superimposing these selections (step S28).
 As a spot selection method using the first or second feature amount data, for example, a region whose average brightness value or cell density is equal to or greater than a predetermined first threshold can be selected as a predetermined spot. Alternatively, a region whose value is less than a second threshold, smaller than the first threshold, may be selected as a predetermined spot, or a region between the first and second thresholds may be selected as a predetermined spot. The top N regions with the highest average brightness value or cell density may also be selected as predetermined spots.
 Then, after the spots have been selected using the first and second feature amount data, the regions where they overlap each other are selected as regions of interest.
 For example, as shown in FIG. 17, the region SP3 where a predetermined spot SP1 (solid line) based on the first feature amount data and a predetermined spot SP2 (broken line) based on the second feature amount data overlap is selected as the region of interest.
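 A sketch of the overlap rule of Modification 1, assuming the two spot selections have been rendered as binary masks of the same size:

```python
import numpy as np

def overlap_region_of_interest(spot_mask_first: np.ndarray,
                               spot_mask_second: np.ndarray) -> np.ndarray:
    """Keep only the pixels where a spot selected with the first feature amount
    (fluorescence image) and a spot selected with the second feature amount
    (bright-field image) overlap."""
    return np.logical_and(spot_mask_first.astype(bool), spot_mask_second.astype(bool))
```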
(Modification 2)
 Next, Modification 2 of the region-of-interest selection process will be described with reference to the flowchart of FIG. 18.
 In the region-of-interest selection process described above, the case where one cell type identification image and one cell analysis image are acquired in step S21 was described as an example, but a plurality of each of these images can also be used.
 The region-of-interest selection process using two cell type identification images and two cell analysis images is described below.
 Although not shown, examples of the cell type identification images include an Epcam-stained fluorescence image and a DAB (CD8)-stained bright-field image.
 Also not shown, examples of the cell analysis images include a PD-L1-stained fluorescence image and a PD-1-stained fluorescence image.
 The control unit 61 performs preprocessing on these images in the same manner as in step S21 and acquires images for region-of-interest selection (step S31).
 Next, in the same manner as in step S22, the control unit 61 identifies mutually different analysis target regions from each of the two cell type identification images and creates region information for each (step S32).
 For example, region information for identifying cancer cell regions is created from the Epcam-stained fluorescence image, and region information for identifying CD8-positive T cell regions is created from the DAB (CD8)-stained bright-field image.
 Next, in the same manner as in step S23, the control unit 61 calculates first feature amount data from each of the two cell analysis images (step S33).
 Next, in the same manner as in step S24, the control unit 61 calculates second feature amount data from each of the two cell type identification images (step S34).
 Next, the control unit 61 calculates information relating the two cell types using the two sets of second feature amount data calculated in step S34 (step S35).
 Specifically, the density ratio between the two cell types, the distance between dense regions of the two cell types, and the like are calculated as the related information.
 Next, based on the related information calculated in step S35, the control unit 61 narrows down the regions in each of the two cell analysis images and, within the narrowed regions, selects predetermined spots using the first feature amount data (step S36).
 As a result, predetermined spots are selected in the narrowed regions of each of the two cell analysis images. The selection of predetermined spots using the first feature amount data can be performed in the same manner as in step S27.
 Next, the control unit 61 selects a region of interest from the result of superimposing the spot selection results of step S36 (step S37).
 From the above, for example, when regions with a high cancer cell / T cell density ratio are narrowed down, regions of interest can be selected in the (non-infiltrated) cancer cell regions according to the PD-L1 expression level of the cancer cells.
 When regions with a low cancer cell / T cell density ratio are narrowed down, regions of interest can be selected in the (non-infiltrated) T cell regions according to the PD-1 expression level of the T cells.
 When regions whose cancer cell / T cell density ratio is within a threshold are narrowed down, it is possible to select, within regions of T cell infiltration into cancer cells, (1) regions of interest according to the cancer cell PD-L1 expression level, (2) regions of interest according to the T cell PD-1 expression level, and (3) regions of interest according to co-expression, such as the product of the expression levels in (1) and (2).
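 A sketch of the narrowing and spot selection of Modification 2, assuming per-region densities and first feature amounts have been computed as in the earlier sketches; the thresholds are illustrative assumptions.

```python
def narrow_by_density_ratio(cancer_density: dict, tcell_density: dict,
                            low: float, high: float) -> set:
    """Keep only the regions whose cancer-cell / T-cell density ratio lies in [low, high]."""
    kept = set()
    for r, c in cancer_density.items():
        t = tcell_density.get(r, 0.0)
        if t > 0.0 and low <= c / t <= high:
            kept.add(r)
    return kept

def spots_in_narrowed_regions(first_feature: dict, narrowed: set,
                              threshold: float) -> list:
    """Select spots with the first feature amount, restricted to the narrowed regions."""
    return [r for r in narrowed if first_feature.get(r, 0.0) >= threshold]
```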
[Analysis image acquisition step and analysis step]
 The analysis image acquisition step is a step of capturing the region of interest selected as described above at a higher magnification than that used for the tissue image, to obtain an analysis image (re-captured image). Specifically, the position of the selected region of interest is observed with the high-magnification objective lens of the second image acquisition unit 30, and an image is acquired again. Besides the high-magnification objective lens of the second image acquisition unit 30, a high-magnification microscope such as a BX63 + DP80 (Olympus) can also be used.
 By capturing an enlarged image of the selected region of interest in this way, more accurate analysis becomes possible. Such an embodiment is applicable, for example, in the case of a low-magnification (low-sensitivity) whole-slide scanner. Accurate analysis can be realized by automatically passing the position coordinates of the region of interest to the high-magnification microscope, and alignment can also be achieved by driving the tissue specimen slide on a motorized stage.
 In the analysis image acquisition step, it suffices to acquire an analysis image of the selected region of interest; besides capturing it at a higher magnification than the tissue image as described above, other approaches may be used, such as acquiring an image at an increased bit depth or acquiring a three-dimensional stack image (re-captured image) of the selected region of interest.
 The analysis step is a step in which the control unit 61 performs analysis using the image captured at high magnification (the re-captured image) as described above. Examples include a step of evaluating the fluorescent bright spots derived from the PID, described later, and a step of quantitatively evaluating the target substance. Steps for evaluating the fluorescent bright spots derived from the PID include counting the number of fluorescent bright spots and counting the number of PID particles corresponding to the number of fluorescent bright spots. Steps for quantitatively evaluating the target substance include calculating a PID score.
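 Counting the fluorescent bright spots in the re-captured image could, for example, be done with a simple threshold-and-label approach; the sketch below is only an illustration with assumed parameters, not the scoring method of the disclosure.

```python
import numpy as np
from scipy.ndimage import label

def count_bright_spots(fluor_img: np.ndarray, intensity_threshold: float,
                       min_pixels: int = 2) -> int:
    """Count connected bright regions whose brightness exceeds the threshold and
    whose size reaches min_pixels."""
    labeled, _ = label(fluor_img > intensity_threshold)   # connected-component labeling
    sizes = np.bincount(labeled.ravel())[1:]              # drop the background bin
    return int((sizes >= min_pixels).sum())
```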
 Here, in order to perform analysis for diagnosis or the like, image quality sufficient for the analysis must be ensured, so to perform highly accurate analysis the tissue sample must be imaged at high magnification. However, if the entire pathological specimen is imaged in a high-magnification field of view, the volume of the acquired images becomes enormous and analysis using those images takes an enormous amount of time, reducing the efficiency of the analysis.
 This problem is particularly serious when fluorescence images are used, because it is difficult to acquire a high-quality fluorescence image (one that can withstand high-precision analysis) as a WSI.
 For this reason, acquiring fluorescence images only for the selected regions of interest makes it comparatively easy to obtain fluorescence images of good quality (able to withstand high-precision analysis); even when fluorescence images are used, good-quality fluorescence images can be acquired efficiently, so the effect of the present invention is particularly pronounced. In other words, by going through each step of the above embodiment, unnecessary high-magnification imaging can be suppressed, and highly accurate and highly efficient analysis becomes possible.
 It is also preferable that the analysis be performed using a plurality of images (re-captured images) corresponding to the cell type identification image and the cell analysis image used for selecting the region of interest. Using a plurality of images makes it possible to obtain more detailed analysis data, specifically statistics of the target substance expression level per cell, positional images of the target substance, and the like.
[Effects of the embodiment]
 As described above, according to the image processing system 1 of the present embodiment, the control unit 61 acquires a tissue image of the tissue sample 50 and selects, from the acquired tissue image, the image of a region of interest to be re-captured. The control unit 61 also acquires a cell type identification image for identifying one or more cell types in the tissue sample 50 and one or more cell analysis images for analyzing the identified cells, and selects the region of interest based on the feature amount relating to the cell type calculated from the cell type identification image and the feature amounts relating to one or more target substances calculated from the cell analysis images.
 Therefore, by suppressing unnecessary high-magnification imaging through the low-magnification image acquisition step, the specific-region extraction step, and the analysis image acquisition step, highly accurate analysis can be performed efficiently; that is, higher accuracy and higher efficiency of the analysis can be achieved at the same time.
 In addition, by selecting the region of interest based on the feature amount relating to the cell type calculated from the cell type identification image and the feature amount relating to the target substance calculated from the cell analysis image, a region of interest suited to the purpose can be selected automatically according to a describable criterion.
 This makes it possible to perform analysis that excludes arbitrariness, compared with, for example, the case where an observer visually selects the region of interest, so evaluation of fluorescent bright spots, target substances, and the like can be performed with higher precision.
 According to the present embodiment, the tissue specimen has been subjected to morphological staining so that one or more cell types can be identified by the control unit 61.
 This makes it possible to identify one or more cell types using a single tissue specimen.
 According to the present embodiment, the control unit 61 identifies the analysis target regions in the cell type identification image by machine learning or an image feature classification method.
 This makes it possible to obtain a more accurate cell type identification image.
 According to the present embodiment, the control unit 61 calculates the feature amount relating to the cell type from the analysis target region of the cell type identification image, and calculates the feature amount relating to the target substance from the region of the cell analysis image corresponding to the analysis target region.
 Since the feature amounts are calculated only from the analysis target regions of the cell type identification image and the cell analysis image, more accurate analysis can be performed efficiently.
 According to the present embodiment, the control unit 61 selects the region of interest based on the composite feature amount calculated from the feature amount relating to the cell type and the feature amount relating to the target substance.
 Using the composite feature amount, quantitative evaluation can be performed only for the target substance present in the cell type of interest rather than for any target substance in the tissue sample, so the distribution of the target substance and the like can be grasped more accurately and the analysis can be performed with high precision.
 According to the present embodiment, the control unit 61 specifies first spots from the cell type identification image based on the feature amount relating to the cell type, specifies second spots from the cell analysis image based on the feature amount relating to the target substance, and selects the regions where the first and second spots overlap each other as the regions of interest.
 This broadens the methods for selecting the region of interest and makes it possible to select the region of interest the user desires.
 According to the present embodiment, the control unit 61 acquires a plurality of cell type identification images for identifying mutually different cell types and a plurality of cell analysis images for analyzing the identified plurality of cell types.
 This enables more detailed analysis.
 According to the present embodiment, the tissue image is a whole slide image obtained by imaging the entire tissue sample 50.
 Therefore, in pathological diagnosis using a whole slide image, highly accurate analysis can be performed efficiently.
 According to the present embodiment, the acquisition means includes a first imaging means for acquiring the tissue image (the first image acquisition unit 20 or the low-magnification objective lens of the second image acquisition unit 30) and a second imaging means with a higher imaging magnification than the first imaging means (such as the high-magnification objective lens of the second image acquisition unit 30), and the re-imaging of the extracted specific region is performed by the second imaging means.
 Therefore, an enlarged image of the selected region of interest can be captured, enabling more accurate analysis.
 Besides a whole-slide scanner, a low-magnification microscope may be used as long as a WSI can be acquired at low magnification.
 The first imaging means and the second imaging means may be configured as different imaging means within the same imaging device, or as two different imaging devices.
 According to the present embodiment, an analysis means is further provided that performs analysis on the re-captured image obtained by re-imaging the region of interest with the second imaging means.
 This makes it possible to perform analysis on the enlarged image of the selected region of interest.
[Others]
 In addition, the detailed configuration and detailed operation of each device constituting the image processing system 1 can be modified as appropriate without departing from the gist of the present invention.
 For example, in the above embodiment, tissue sections are targeted as the biological sample, and the tissue sample 50 is stained with an immunostaining agent containing fluorescent-substance-accumulated nanoparticles as the fluorescent label. The biological sample may instead be cultured cells or genes (DNA).
 The present invention can be used in image processing systems, image processing methods, and programs that enable highly accurate analysis to be performed efficiently.
1 Image processing system
10 Microscope device
20 First image acquisition unit (first imaging means)
21 Bright-field light source
22 First image sensor
30 Second image acquisition unit (first imaging means, second imaging means)
31 Transmission light source
32 Excitation light source
33 Second image sensor
40 Stage
50 Tissue specimen
60 Control device
61 Control unit (acquisition means, selection means, identification means, analysis means)
62 Storage unit
63 Image processing unit
64 Communication unit
70 Display device
80 Database

Claims (14)

  1.  An image processing system comprising:
     an acquisition means that acquires a tissue image of a tissue specimen; and
     a selection means that selects, from the tissue image acquired by the acquisition means, an image of a region of interest to be re-captured by the acquisition means,
     wherein the acquisition means acquires a cell type identification image for identifying one or more cell types in the tissue specimen and one or more cell analysis images for analyzing the identified cells, and
     the selection means selects the region of interest based on a feature amount relating to the cell type calculated from the cell type identification image and feature amounts relating to one or more target substances calculated from the cell analysis images.
  2.  The image processing system according to claim 1, wherein the tissue specimen has been subjected to morphological staining so that the one or more cell types can be identified by the acquisition means.
  3.  The image processing system according to claim 1 or 2, further comprising an identification means that identifies an analysis target region in the cell type identification image by machine learning or an image feature classification method.
  4.  The image processing system according to claim 3, wherein the selection means calculates the feature amount relating to the cell type from the analysis target region of the cell type identification image.
  5.  The image processing system according to claim 4, wherein the selection means calculates the feature amount relating to the target substance from a region of the cell analysis image corresponding to the analysis target region.
  6.  The image processing system according to any one of claims 1 to 5, wherein the selection means selects the region of interest based on a composite feature amount calculated from the feature amount relating to the cell type and the feature amount relating to the target substance.
  7.  The image processing system according to any one of claims 1 to 5, wherein the selection means specifies a first spot from the cell type identification image based on the feature amount relating to the cell type, specifies a second spot from the cell analysis image based on the feature amount relating to the target substance, and selects a region where the first spot and the second spot overlap each other as the region of interest.
  8.  The image processing system according to any one of claims 1 to 7, wherein the acquisition means acquires a plurality of the cell type identification images for identifying mutually different cell types and a plurality of cell analysis images for analyzing the identified plurality of cell types.
  9.  The image processing system according to any one of claims 1 to 8, wherein the tissue image is a whole slide image obtained by imaging the entire tissue specimen.
  10.  The image processing system according to any one of claims 1 to 9, wherein the acquisition means comprises:
     a first imaging means for acquiring the tissue image; and
     a second imaging means for re-capturing the region of interest by a method different from that of the first imaging means to acquire a re-captured image.
  11.  The image processing system according to claim 10, wherein the second imaging means performs the re-capturing at a higher imaging magnification than the first imaging means.
  12.  The image processing system according to claim 10 or 11, further comprising an analysis means that performs analysis on the re-captured image acquired by the second imaging means.
  13.  An image processing method comprising:
     an acquisition step of acquiring a tissue image of a tissue specimen; and
     a selection step of selecting, from the tissue image acquired in the acquisition step, an image of a region of interest to be re-captured in the acquisition step,
     wherein the acquisition step acquires a cell type identification image for identifying one or more cell types in the tissue specimen and one or more cell analysis images for analyzing the identified cells, and
     the selection step selects the region of interest based on feature amounts relating to one or more cell types calculated from the cell type identification image and feature amounts relating to one or more target substances calculated from the cell analysis images.
  14.  A program that causes a computer to function as:
     an acquisition means that acquires a tissue image of a tissue specimen; and
     a selection means that selects, from the tissue image acquired by the acquisition means, an image of a region of interest to be re-captured by the acquisition means,
     wherein the acquisition means acquires a cell type identification image for identifying one or more cell types in the tissue specimen and one or more cell analysis images for analyzing the identified cells, and
     the selection means selects the region of interest based on feature amounts relating to one or more cell types calculated from the cell type identification image and feature amounts relating to one or more target substances calculated from the cell analysis images.
PCT/JP2020/023608 2019-06-28 2020-06-16 Image processing system, image processing method, and program WO2020262117A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021528267A JPWO2020262117A1 (en) 2019-06-28 2020-06-16

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-121466 2019-06-28
JP2019121466 2019-06-28

Publications (1)

Publication Number Publication Date
WO2020262117A1 true WO2020262117A1 (en) 2020-12-30

Family

ID=74061627

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/023608 WO2020262117A1 (en) 2019-06-28 2020-06-16 Image processing system, image processing method, and program

Country Status (2)

Country Link
JP (1) JPWO2020262117A1 (en)
WO (1) WO2020262117A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010078983A (en) * 2008-09-26 2010-04-08 Olympus Corp Microscope system, program, and method thereof
JP2011215061A (en) * 2010-04-01 2011-10-27 Sony Corp Apparatus and method for processing image, and program
WO2016093090A1 (en) * 2014-12-09 2016-06-16 コニカミノルタ株式会社 Image processing apparatus and image processing program
JP2016125913A (en) * 2015-01-05 2016-07-11 キヤノン株式会社 Image acquisition device and control method of image acquisition device
WO2018003063A1 (en) * 2016-06-30 2018-01-04 株式会社ニコン Analysis device, analysis method, analysis program and display device
JP2018503906A (en) * 2014-12-30 2018-02-08 ベンタナ メディカル システムズ, インコーポレイテッド System and method for co-expression analysis in immunoscore calculation
JP2018072240A (en) * 2016-11-01 2018-05-10 株式会社日立ハイテクノロジーズ Image diagnosis support device, system, and image diagnosis support method
JP2018533116A (en) * 2015-09-02 2018-11-08 ベンタナ メディカル システムズ, インコーポレイテッド Image processing system and method for displaying a plurality of images of a biological sample


Also Published As

Publication number Publication date
JPWO2020262117A1 (en) 2020-12-30

Similar Documents

Publication Publication Date Title
JP6350527B2 (en) Image processing apparatus, pathological diagnosis support system, image processing program, and pathological diagnosis support method
US9057701B2 (en) System and methods for rapid and automated screening of cells
JP6074427B2 (en) System and method for generating bright field images using fluorescent images
JP6911855B2 (en) Biomaterial quantification method, image processing device, pathological diagnosis support system and program
JP6635108B2 (en) Image processing apparatus, image processing method, and image processing program
JP7173034B2 (en) Image processing device, focus position specifying method and focus position specifying program
WO2017126420A1 (en) Image processing device and program
JP2020173204A (en) Image processing system, method for processing image, and program
JP6493398B2 (en) Diagnosis support information generation method, image processing apparatus, diagnosis support information generation system, and image processing program
JP6547424B2 (en) Fluorescent image focusing system, focusing method and focusing program
US11423533B2 (en) Image processing method and image processing system
WO2015146938A1 (en) Tissue evaluation method, image processing device, pathological diagnosis support system, and program
JP7235036B2 (en) Image processing method, image processing apparatus and program
JP6375925B2 (en) Image processing apparatus, image processing system, image processing program, and image processing method
WO2020262117A1 (en) Image processing system, image processing method, and program
JP6702339B2 (en) Image processing device and program
JP6578928B2 (en) Focus position specifying system of fluorescent image, focus position specifying method, and focus position specifying program
JP7160047B2 (en) Biological substance quantification method, image processing device and program
WO2020209217A1 (en) Image processing system, image processing method, and program
WO2021124866A1 (en) Image processing method, image processing system, and program
EP3327427A1 (en) Target biological substance analysis method and analysis system
WO2021192910A1 (en) Image generation method, image generation device, and program
JP2016118428A (en) Image processing device, image processing system, image processing program and image processing method
US11600020B2 (en) Biological substance quantification method, image processing device, pathological diagnosis support system, and recording medium storing computer readable program
WO2022059312A1 (en) Focused image selection method, focused image selection device, and focused image selection program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20832550

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021528267

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20832550

Country of ref document: EP

Kind code of ref document: A1