WO2020262117A1 - Image processing system, image processing method, and program


Publication number: WO2020262117A1
Authority: WIPO (PCT)
Prior art keywords: image, region, cell, tissue, analysis
Application number: PCT/JP2020/023608
Other languages: English (en), Japanese (ja)
Inventor: Hokuto Tanaka (田中 北斗)
Original Assignee: Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Konica Minolta, Inc.
Priority to JP2021528267A (publication JPWO2020262117A1)
Publication of WO2020262117A1

Classifications

    • C: CHEMISTRY; METALLURGY
    • C12: BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
    • C12M: APPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
    • C12M1/00: Apparatus for enzymology or microbiology
    • C12M1/34: Measuring or testing with condition measuring or sensing means, e.g. colony counters
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25: Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/27: Colour; Spectral properties using photo-electric detection; circuits for computing concentration
    • G01N33/00: Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48: Biological material, e.g. blood, urine; Haemocytometers
    • G01N33/483: Physical analysis of biological material
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis

Definitions

  • The present invention relates to an image processing system, an image processing method, and a program.
  • A virtual microscope is a system that digitizes the image observed through an optical microscope and allows the tissue sample to be examined on a display as if the optical microscope were actually being used (see, for example, Patent Document 1).
  • The entire tissue sample on the slide glass is photographed, the obtained image is converted into digital data and saved in a database, and observation is performed using viewer software installed on a personal computer or the like.
  • An advantage of a virtual microscope is that the sample can be observed while performing operations such as moving up, down, left, and right, and zooming, just as in observation with an optical microscope.
  • Digital image data of the entire tissue specimen is called a whole slide image (WSI).
  • The present invention has been made in view of the above problems, and an object of the present invention is to provide an image processing system, an image processing method, and a program capable of performing high-precision analysis efficiently.
  • The image processing system of the present invention comprises: an acquisition means for acquiring a tissue image of a tissue specimen; and a selection means for selecting, from the tissue image acquired by the acquisition means, an image of a region of interest to be re-imaged by the acquisition means.
  • The acquisition means acquires a cell type identification image for identifying one or more cell types in the tissue specimen and one or more cell analysis images for analyzing the identified cells.
  • The selection means selects the region of interest based on a feature amount related to the cell type calculated from the cell type identification image and a feature amount related to one or more target substances calculated from the cell analysis image.
  • The image processing method of the present invention includes: an acquisition step of acquiring a tissue image of a tissue specimen; and a selection step of selecting, from the tissue image acquired in the acquisition step, an image of a region of interest to be re-imaged in the acquisition step.
  • In the selection step, the region of interest is selected based on a feature amount related to one or more cell types calculated from the cell type identification image and a feature amount related to one or more target substances calculated from the cell analysis image.
  • The program of the present invention causes a computer to function as: an acquisition means for acquiring a tissue image of a tissue specimen; and a selection means for selecting, from the tissue image acquired by the acquisition means, an image of a region of interest to be re-imaged by the acquisition means.
  • The acquisition means acquires a cell type identification image for identifying one or more cell types in the tissue specimen and one or more cell analysis images for analyzing the identified cells.
  • The selection means selects the region of interest based on a feature amount related to one or more cell types calculated from the cell type identification image and a feature amount related to one or more target substances calculated from the cell analysis image.
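  • For illustration, the claimed selection can be sketched in Python as follows. The tile-level features (cell density and fluorescent bright-spot count), the equal weighting, and all names are illustrative assumptions; the patent does not prescribe a concrete scoring rule.

```python
import numpy as np

def select_regions_of_interest(cell_density, bright_spots, top_k=3):
    """Score each candidate tile by combining a cell-type feature amount
    (e.g. cell density from the cell type identification image) with a
    target-substance feature amount (e.g. bright-spot count from the cell
    analysis image), then return the indices of the top-scoring tiles
    for high-magnification re-imaging."""
    cell_density = np.asarray(cell_density, dtype=float)
    bright_spots = np.asarray(bright_spots, dtype=float)

    def normalize(x):
        # Scale each feature to [0, 1] so neither dominates the score.
        rng = x.max() - x.min()
        return (x - x.min()) / rng if rng > 0 else np.zeros_like(x)

    score = 0.5 * normalize(cell_density) + 0.5 * normalize(bright_spots)
    # Highest-scoring tiles first.
    return list(np.argsort(score)[::-1][:top_k])

# Example: 6 candidate tiles; tiles 4 and 1 score highest.
tiles = select_regions_of_interest(
    cell_density=[10, 80, 55, 5, 90, 40],
    bright_spots=[2, 50, 60, 1, 70, 10],
    top_k=2)
print(tiles)  # [4, 1]
```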
  • FIG. 1 shows a schematic configuration of the image processing system 1 (WSI creation system) in the present invention.
  • the image processing system 1 includes a microscope device 10, a control device 60, a display device 70, and a database 80.
  • the microscope device 10 includes a first image acquisition unit 20, a second image acquisition unit 30, and a stage 40.
  • The tissue specimen 50 after immunostaining is placed on the stage 40.
  • the tissue sample 50 is an example of a biological sample.
  • FIG. 2 shows a schematic configuration of the first image acquisition unit 20.
  • the first image acquisition unit 20 acquires a bright field image of the tissue sample 50.
  • the first image acquisition unit 20 includes a bright field light source 21, a first image sensor 22, and a light guide lens 23.
  • the bright-field light source 21 is a light source that irradiates the tissue sample 50 with light for generating a light image for acquiring a bright-field image, and is installed so as to irradiate the light from below the stage 40.
  • When the tissue sample 50 is irradiated by the bright-field light source 21, an optical image is generated; the optical image is guided to the first image sensor 22 via the light guide lens 23, and a bright-field image of the tissue sample 50 is captured by the first image sensor 22.
  • the first image sensor 22 is an image sensor such as a two-dimensional CCD sensor capable of acquiring a two-dimensional image of the optical image of the tissue specimen 50.
  • FIG. 3 shows a schematic configuration of the second image acquisition unit 30.
  • the second image acquisition unit 30 acquires a fluorescence image of the tissue sample 50.
  • the second image acquisition unit 30 includes a transmission light source 31, an excitation light source 32, a second image sensor 33, an objective lens 34, a fluorescence cube 35, and an imaging lens 36.
  • the fluorescent cube 35 includes an excitation filter 351, a dichroic mirror 352, and an absorption filter 353.
  • the transmission light source 31 is a light source used when acquiring a transmission observation image of the tissue specimen 50, and is installed so as to irradiate light from below the stage 40.
  • the excitation light source 32 is a lamp that emits excitation light by a light source such as a discharge tube.
  • the excitation filter 351 is a filter that transmits only excitation light.
  • the dichroic mirror 352 is a mirror that reflects or transmits light having a predetermined wavelength as a boundary, and here, it reflects excitation light and transmits fluorescence.
  • the absorption filter 353 is a filter that blocks excitation light and transmits only fluorescence.
  • the excitation light passes through the excitation filter 351 and is reflected by the dichroic mirror 352, passes through the objective lens 34, and irradiates the tissue sample 50.
  • fluorescence is emitted from the tissue sample 50, and the fluorescence is focused by the objective lens 34 and transmitted through the dichroic mirror 352 and the absorption filter 353.
  • The fluorescence is guided to the second image sensor 33 as a fluorescence image via the imaging lens 36 and is imaged by the second image sensor 33.
  • the objective lens 34 includes a low-magnification objective lens (for example, 20 times) and a high-magnification objective lens (for example, 40 times).
  • The second image sensor 33 is an image sensor, such as a one-dimensional CCD camera, capable of acquiring a one-dimensional image or a two-dimensional image with a predetermined direction as its longitudinal direction, and can acquire a high-resolution fluorescence image of the tissue sample 50.
  • a control device 60 for controlling these is connected to the microscope device 10.
  • the control device 60 includes a control unit (acquisition means, selection means, identification means, analysis means) 61, a storage unit 62, an image processing unit 63, and a communication unit 64.
  • The control unit 61 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), and the like, and comprehensively controls the operation of the microscope device 10 by executing various processes in cooperation with the programs stored in the storage unit 62.
  • the control unit 61 is connected to the stage 40 and can control the ascent and descent of the stage 40 to control the focusing position (Z coordinate) of the tissue sample 50 installed on the stage 40. Further, the control unit 61 is connected to the first image acquisition unit 20 and controls the bright field light source 21 and the first image sensor 22 to take a bright field image. Further, the control unit 61 is connected to the second image acquisition unit 30 and controls the transmission light source 31, the excitation light source 32, and the second image sensor 33 to take a fluorescence image.
  • the storage unit 62 is composed of, for example, an HDD (Hard Disk Drive), a semiconductor non-volatile memory, or the like.
  • the storage unit 62 stores a program for taking a bright field image and a fluorescent image.
  • The image processing unit 63 performs image processing on the fluorescence images taken by the microscope device 10 to create a whole slide image (WSI). As will be described later, according to the instructions of the control unit 61, the captured partial images are combined to create an entire image of the tissue sample 50, and the image data is A/D-converted into a digital image to create the WSI. In addition, based on the created WSI, a fluorescence brightness map used for quantitative analysis of the target substance is created.
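  • The tile-combining step can be sketched minimally in Python as follows. The non-overlapping row-major grid layout and all names are illustrative assumptions; the patent does not specify the stitching algorithm, and real systems typically blend overlapping seams.

```python
import numpy as np

def stitch_tiles(tiles, grid_shape):
    """Assemble partial images (tiles), captured in row-major order,
    into one whole-slide mosaic.  Assumes all tiles share the same
    height/width and abut without overlap (a simplification)."""
    rows, cols = grid_shape
    th, tw = tiles[0].shape[:2]
    mosaic = np.zeros((rows * th, cols * tw), dtype=tiles[0].dtype)
    for idx, tile in enumerate(tiles):
        r, c = divmod(idx, cols)           # tile's grid position
        mosaic[r*th:(r+1)*th, c*tw:(c+1)*tw] = tile
    return mosaic

# Four 2x2 tiles combined into one 4x4 image.
tiles = [np.full((2, 2), v, dtype=np.uint8) for v in (1, 2, 3, 4)]
wsi = stitch_tiles(tiles, (2, 2))
print(wsi.shape)  # (4, 4)
```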
  • the communication unit 64 is an interface for transmitting and receiving data to and from an external device such as a personal computer.
  • a user who wants to refer to the WSI can read the WSI stored in the database 80 into a personal computer or the like via the communication unit 64 and observe it on the display.
  • a display device 70 is connected to the control device 60.
  • The display device 70 includes, for example, a monitor such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display), and displays various screens according to display signals input from the control unit 61.
  • the display device 70 functions as an output means for outputting a captured fluorescent image or the like.
  • a database 80 is further connected to the control device 60.
  • the database 80 includes, for example, an HDD (Hard Disk Drive) and stores the WSI synthesized by the image processing unit 63.
  • the WSI is stored in the database 80 as described above.
  • The storage area is not limited to the database 80; a configuration without the database 80 is also possible.
  • the WSI may be stored in the storage unit 62, or may be stored in an external server (not shown) to form a database.
  • tissue sample 50 is a tissue section containing the target substance and is stained with an immunostaining agent, and the stained tissue sample 50 is placed on the stage 40.
  • The target substance is a substance, present in a tissue section, that is to be detected or quantified by immunostaining using a fluorescent label, mainly from the viewpoint of pathological diagnosis; a typical example is a protein (antigen).
  • Typical target substances include biological substances, such as proteins and RNA, that are expressed in the cell membranes of various cancer tissues and can be used as biomarkers.
  • the target substance may be a substance introduced from outside the body, such as a drug.
  • Units smaller than proteins, such as peptides, can also be immunostained.
  • The immunostaining agent is an antibody-fluorescent nanoparticle conjugate.
  • As the immunostaining agent, it is preferable to use a complex in which the primary antibody and the fluorescent nanoparticles are linked indirectly, that is, by binding other than a covalent bond, such as an antigen-antibody reaction.
  • a complex in which fluorescent nanoparticles are directly linked to the primary antibody or the secondary antibody can also be used as the immunostaining agent.
  • An example of the immunostaining agent is [primary antibody against the target substance] … [antibody (secondary antibody) against the primary antibody] ~ [fluorescent nanoparticles].
  • Here, "…" indicates binding by an antigen-antibody reaction, and the mode of binding indicated by "~" is not particularly limited; examples include covalent bonds, ionic bonds, hydrogen bonds, coordination bonds, antigen-antibody binding, the biotin-avidin reaction, physical adsorption, and chemisorption, and the binding may be mediated by a linker molecule if necessary.
  • As the primary antibody, an antibody (IgG) that specifically recognizes and binds to the target substance as an antigen can be used: for example, an anti-HER2 antibody when HER2 is the target substance, or an anti-HER3 antibody when HER3 is the target substance.
  • As the secondary antibody, an antibody (IgG) that specifically recognizes and binds to the primary antibody as an antigen can be used.
  • Both the primary antibody and the secondary antibody may be polyclonal antibodies, but monoclonal antibodies are preferable from the viewpoint of quantitative stability.
  • the type of animal (immune animal) that produces an antibody is not particularly limited, and may be selected from mice, rats, guinea pigs, rabbits, goats, sheep, and the like as in the conventional case.
  • Fluorescent nanoparticles are nano-sized particles that fluoresce when irradiated with excitation light, each emitting fluorescence of sufficient intensity to represent the target substance as an individual bright spot.
  • As the fluorescent nanoparticles in the present invention, phosphor-integrated dot nanoparticles (PID) are used.
  • Phosphor-integrated nanoparticles are nano-sized particles based on a mother body made of an organic or inorganic substance, with a plurality of fluorescent substances (for example, quantum dots or organic fluorescent dyes) contained therein and/or adsorbed on the surface.
  • The fluorescent substance used for the PID preferably emits visible to near-infrared light with a wavelength in the range of 400 to 900 nm when excited by ultraviolet to near-infrared light with a wavelength in the range of 200 to 700 nm. It is also preferable that the mother body and the fluorescent substance have substituents or sites with opposite charges so that an electrostatic interaction acts between them.
  • the average particle size of the PID used in the present invention is not particularly limited, but one having a diameter of about 30 to 800 nm can be used.
  • the average particle size is more preferably in the range of 40 to 500 nm.
  • The reason the average particle size is set to 40 to 500 nm is that a size below 40 nm requires an expensive detection system, while a size above 500 nm narrows the quantification range because of the particles' physical size.
  • The average particle size can be obtained by taking electron micrographs with a scanning electron microscope (SEM), measuring the cross-sectional areas of a sufficient number of particles, and converting each measured area to the diameter of the circle of equal area.
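  • The equal-area-circle conversion used for the average particle size can be sketched as follows (Python; function names and the example radii are illustrative):

```python
import math

def equivalent_circle_diameter(area):
    """Diameter of the circle with the same area as the measured
    particle cross-section: A = pi * (d/2)**2  =>  d = 2*sqrt(A/pi)."""
    return 2.0 * math.sqrt(area / math.pi)

def average_particle_size(areas_nm2):
    """Average particle size from SEM cross-sectional areas (nm^2),
    treating each measured value as the area of a circle."""
    diameters = [equivalent_circle_diameter(a) for a in areas_nm2]
    return sum(diameters) / len(diameters)

# Example: three measured cross-sections with radii 50, 100, 150 nm,
# i.e. diameters 100, 200, 300 nm -> average 200 nm.
areas = [math.pi * 50**2, math.pi * 100**2, math.pi * 150**2]
print(round(average_particle_size(areas)))  # 200
```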
  • Examples of the organic substance for the mother body include resins generally classified as thermosetting resins, such as melamine resin, urea resin, aniline resin, guanamine resin, phenol resin, xylene resin, and furan resin; resins generally classified as thermoplastic resins, such as styrene resin, acrylic resin, acrylonitrile resin, AS resin (acrylonitrile-styrene copolymer), and ASA resin (acrylonitrile-styrene-methyl acrylate copolymer); other resins such as polystyrene; and polysaccharides.
  • Examples of the inorganic substance for the mother body include silica and glass.
  • Quantum dot integrated nanoparticles have a structure in which the quantum dots are contained in the mother body and / or adsorbed on the surface thereof. When the quantum dots are contained in the mother body, the quantum dots need only be dispersed inside the mother body, and may or may not be chemically bonded to the mother body itself.
  • As the quantum dots, semiconductor nanoparticles containing a group II-VI compound, a group III-V compound, or a group IV element are used.
  • Examples include CdSe, CdS, CdTe, ZnSe, ZnS, ZnTe, InP, InN, InAs, InGaP, GaP, GaAs, Si, and Ge.
  • A quantum dot having the above quantum dot as a core and a shell provided on the core can also be used.
  • Hereinafter, when the core is CdSe and the shell is ZnS, the quantum dot is denoted CdSe/ZnS.
  • Specifically, CdSe/ZnS, CdS/ZnS, InP/ZnS, InGaP/ZnS, Si/SiO2, Si/ZnS, Ge/GeO2, Ge/ZnS, and the like can be used, but the quantum dots are not limited thereto.
  • the quantum dots may be surface-treated with an organic polymer or the like.
  • Examples of quantum dots surface-treated with an organic polymer include CdSe/ZnS with surface carboxy groups (manufactured by Invitrogen) and CdSe/ZnS with surface amino groups (manufactured by Invitrogen).
  • Quantum dot integrated nanoparticles can be produced by a known method.
  • the silica nanoparticles containing quantum dots can be synthesized with reference to the synthesis of CdTe-encapsulating silica nanoparticles described in New Journal of Chemistry, Vol. 33, p. 561 (2009).
  • Silica nanoparticles encapsulating quantum dots can also be synthesized with reference to the synthesis, described in Chemical Communication, p. 2670 (2009), of silica nanoparticles having CdSe/ZnS particles capped with 5-amino-1-pentanol and APS integrated on their surface.
  • Polymer nanoparticles encapsulating quantum dots can be produced by using the method of impregnating polystyrene nanoparticles with quantum dots described in Nature Biotechnology, Vol. 19, p. 631 (2001).
  • the fluorescent dye-accumulated nanoparticles have a structure in which a fluorescent dye is contained in the mother body and / or is adsorbed on the surface thereof.
  • Examples of the fluorescent dye include organic fluorescent dyes such as rhodamine-based, squarylium-based, cyanine-based, aromatic-ring-based, oxazine-based, carbopyronine-based, and pyrromethene-based dye molecules.
  • Alexa Fluor (registered trademark, manufactured by Invitrogen), BODIPY (registered trademark, manufactured by Invitrogen), Cy (registered trademark, manufactured by GE Healthcare), HiLyte (registered trademark, manufactured by AnaSpec), DyLight (registered trademark, manufactured by Thermo Scientific), ATTO (registered trademark, manufactured by ATTO-TEC), MFP (registered trademark, manufactured by MoBiTec), CF (registered trademark, manufactured by Biotium), DY (registered trademark, manufactured by DYOMICS), and CAL (registered trademark, manufactured by BioSearch Technologies) dye molecules can also be used.
  • When the fluorescent dye is contained in the mother body, it may or may not be chemically bonded to the mother body itself, as long as it is dispersed inside the mother body.
  • Fluorescent dye-accumulated nanoparticles can be produced by a known method.
  • silica nanoparticles containing a fluorescent dye can be synthesized with reference to the synthesis of FITC-encapsulating silica particles described in Langmuir Vol. 8, p. 2921 (1992).
  • By using a desired fluorescent dye instead of FITC, various fluorescent dye-accumulated nanoparticles can be synthesized.
  • Polystyrene nanoparticles containing a fluorescent dye can be prepared by the copolymerization method using an organic dye having a polymerizable functional group described in US Pat. No. 4,326,008 (1982), or by the method of impregnating polystyrene nanoparticles with a fluorescent dye described in US Pat. No. 5,326,692 (1992).
  • the method for preparing a tissue section to which this staining method can be applied (also referred to simply as a “section” and including a section such as a pathological section) is not particularly limited, and a tissue section prepared by a known procedure can be used.
  • (5.1) Specimen preparation step; (5.1.1) Deparaffinization treatment
  • the section is immersed in a container containing xylene to remove paraffin.
  • the temperature is not particularly limited, but it can be carried out at room temperature.
  • the immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, xylene may be replaced during immersion.
  • the section is immersed in a container containing ethanol to remove xylene.
  • the temperature is not particularly limited, but it can be carried out at room temperature.
  • the immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, ethanol may be replaced during immersion.
  • The section is then immersed in a container containing water to remove the ethanol. The temperature is not particularly limited, but the treatment can be carried out at room temperature.
  • The immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, the water may be replaced during immersion.
  • the activation treatment of the target substance is performed according to a known method.
  • The activation conditions are not particularly specified, but as the activation liquid, 0.01 M citrate buffer (pH 6.0), 1 mM EDTA solution (pH 8.0), 5% urea, 0.1 M Tris-hydrochloride buffer, or the like can be used.
  • The pH is set, within the range of pH 2.0 to 13.0, to a condition under which a signal is output and tissue damage remains mild enough for the signal to be evaluated, depending on the tissue section used. Normally the treatment is performed at pH 6.0 to 8.0, but for special tissue sections it may be performed at, for example, pH 3.0.
  • As the heating device, an autoclave, a microwave oven, a pressure cooker, a water bath, or the like can be used.
  • The heating temperature can be 50 to 130 °C and the heating time 5 to 30 minutes.
  • the section after activation treatment is immersed in a container containing PBS and washed.
  • the temperature is not particularly limited, but it can be carried out at room temperature.
  • the immersion time is preferably 3 minutes or more and 30 minutes or less. If necessary, PBS may be replaced during immersion.
  • (5.2) Immunostaining step: To stain the target substance, a solution of an immunostaining agent containing fluorescent nanoparticles having a site capable of directly or indirectly binding to the target substance is placed on the section and allowed to react with the target substance.
  • the solution of the immunostaining agent used in the immunostaining step may be prepared in advance before this step.
  • When a plurality of target substances are detected, immunostaining is performed with a plurality of immunostaining agents corresponding to those target substances.
  • The plurality of immunostaining agents used in this case need only include at least one immunostaining agent using PID (a PID staining agent). Provided that the antibodies and the fluorescent substances (fluorescence wavelengths) differ from one another, a plurality of target substances can be detected by multiple staining using a plurality of PID staining agents, or by combining a PID staining agent with an immunostaining agent using another fluorescent label such as an organic fluorescent substance or a quantum dot.
  • a solution of each immunostaining agent is prepared, placed on a section, and reacted with the target substance.
  • When placing the solutions on the section, the solutions of the immunostaining agents may be mixed in advance, or may be placed separately and sequentially.
  • The conditions for the immunostaining step can be adjusted appropriately, in accordance with conventional immunostaining methods, so as to obtain an appropriate signal.
  • the temperature is not particularly limited, but it can be carried out at room temperature.
  • The reaction time is preferably 30 minutes or more and 24 hours or less. Before this treatment, it is preferable to drop a known blocking agent, such as PBS containing BSA, or a surfactant such as Tween 20, onto the section.
  • The tissue specimen after the immunostaining step is subjected to treatments such as fixation and dehydration, permeation, and encapsulation so as to be suitable for observation.
  • For fixation and dehydration, the tissue section may be immersed in a fixation treatment liquid (a crosslinking agent such as formalin, paraformaldehyde, or glutaraldehyde, or acetone, ethanol, or methanol).
  • the tissue section that has been immobilized and dehydrated may be immersed in a permeation solution (xylene or the like).
  • the tissue section that has undergone the permeation treatment may be immersed in the encapsulation liquid.
  • The conditions for performing these treatments, for example the temperature and the immersion time when immersing the tissue section in a given treatment liquid, can be adjusted appropriately, in accordance with conventional immunostaining methods, so as to obtain an appropriate signal.
  • Morphological observation staining step: Separately from the immunostaining step, morphological observation staining is performed so that the morphology of cells, tissues, organs, and the like can be observed.
  • the morphological observation staining is not particularly limited as long as it can express the morphology of cells, tissues, organs and the like.
  • a dyeing method using a fluorescent dye or the like can also be used.
  • the morphological observation dyeing step can be performed according to a conventional method.
  • Eosin staining, in which cytoplasm, interstitium, various fibers, erythrocytes, and keratinocytes are stained red to deep red, is standardly used for morphological observation.
  • Hematoxylin staining, in which cell nuclei, calcified material, cartilage tissue, bacteria, and mucus are stained blue to pale blue, is also standardly used (the method of performing these two stainings simultaneously is known as hematoxylin-eosin (HE) staining).
  • FIG. 4 shows the overall flow of the operation of the image processing system 1 in the present invention.
  • The operation of the image processing system 1 includes a focusing step (step S101), a low-magnification image acquisition step (step S102), a region-of-interest selection step (step S103), an analysis image acquisition step (step S104), and an analysis step (step S105).
  • First, the first image acquisition unit 20 acquires a bright-field image of the tissue sample 50; an imaging region for WSI creation is set based on the bright-field image, and focusing is performed based on the bright-field image. The second image acquisition unit 30 then irradiates the tissue specimen 50, which is fluorescently labeled with PID, with excitation light, and performs more rigorous focusing with reference to the detected PID fluorescent bright spots.
  • the control unit 61 controls the first image acquisition unit 20 to acquire a bright field image for focusing of the entire slide glass (step S1).
  • This bright-field image is a low-magnification image taken with the low-magnification objective lens and is used to set the imaging conditions for high-resolution images such as the fluorescence images described later.
  • Next, the control unit 61 sets an imaging region R including the tissue sample 50 as shown in FIG. 6 (step S2). Specifically, the control unit 61 binarizes the entire image according to the presence or absence of the tissue sample 50 and detects the region where the tissue sample 50 exists in each of the X-axis and Y-axis directions; the imaging region R is thereby determined.
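  • The binarization and region detection described above can be sketched in Python as follows. The threshold value, the assumption that tissue is darker than the empty slide, and all names are illustrative; the patent does not specify the binarization method.

```python
import numpy as np

def set_imaging_region(bright_field, threshold=200):
    """Binarize a grayscale bright-field image (tissue assumed darker
    than the empty slide) and return the bounding box (x0, y0, x1, y1)
    of the region where tissue is present."""
    tissue = bright_field < threshold          # True where tissue exists
    ys, xs = np.nonzero(tissue)                # detect extent in X and Y
    if xs.size == 0:
        return None                            # no tissue detected
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# Toy 6x6 slide with a dark tissue patch at rows 2-4, columns 1-3.
img = np.full((6, 6), 255, dtype=np.uint8)
img[2:5, 1:4] = 50
print(set_imaging_region(img))  # (1, 2, 3, 4)
```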
  • the imaging region R may be manually set on the display device 70 while the user is observing the bright field image of the entire tissue specimen 50, but it is preferably set automatically.
  • Next, the tissue sample 50 is focused based on the bright-field image. This focusing may be performed manually by the user, but is preferably performed automatically under the control of the control unit 61.
  • a method of automatically creating a focus map under the control of the control unit 61 and performing focusing will be described.
  • the first focus measurement position P1 is set on the imaging region R (step S3).
  • the control unit 61 divides the imaging region R into the X-axis direction and the Y-axis direction to set a small region, and obtains the XY coordinates of each small region.
  • the XY coordinates are the center coordinates of each subregion, but the coordinates are not limited to this, and for example, the coordinates of the upper left end of each subregion can be the XY coordinates.
  • The control unit 61 assigns numbers such as 1, 2, 3, ... in each of the X-axis and Y-axis directions to each small region, and sets the array numbers.
  • the control unit 61 sets the first focus measurement position P1 for each small area.
  • The first focus measurement position P1 is the center coordinate position of each small region, but the present invention is not limited to this; for example, the upper left end of each small region can also be set as the first focus measurement position P1.
  • The tissue sample 50 may not exist on the center coordinates, as in the region with array number (1, 1) in FIG. In this case, the first focus measurement position P1 can be moved to an arbitrary coordinate on the tissue sample 50.
  • In step S4, focusing is performed on the first focus measurement position P1 in each small region.
  • The control unit 61 adjusts the optical axis position to the first focus measurement position P1 while moving the stage 40 in the XY directions, and obtains the bright-field focus position (Z coordinate) for each first focus measurement position P1 by actual measurement.
  • Based on the bright-field focus positions obtained in this way, the control unit 61 creates a focus map as shown in FIG. 8 (step S5).
  • the focus map stores the array number of each small area and the corresponding stage coordinates.
  • the stage coordinates correspond to the center coordinates of each small area for the X-axis and the Y-axis, and the bright-field focusing position for the Z-axis. This completes the focusing of the tissue sample 50 based on the bright-field image.
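The per-tile autofocus and focus-map construction described above could be sketched as follows (a contrast metric based on image gradients is a common autofocus heuristic; the metric, data layout, and function names here are illustrative assumptions, not the patent's actual implementation):

```python
import numpy as np

def focus_measure(img):
    """Simple contrast metric: mean squared image gradient.
    The in-focus Z plane gives the sharpest image, i.e. the largest value."""
    gy, gx = np.gradient(img.astype(float))
    return (gx ** 2 + gy ** 2).mean()

def build_focus_map(z_stack_per_tile, tile_centers):
    """For each small region (array number -> list of images over Z),
    pick the Z index that maximizes the focus measure and store the
    stage coordinates (X, Y, Z) in the focus map."""
    focus_map = {}
    for array_no, stack in z_stack_per_tile.items():
        scores = [focus_measure(img) for img in stack]
        z_best = int(np.argmax(scores))
        x, y = tile_centers[array_no]
        focus_map[array_no] = (x, y, z_best)
    return focus_map

# Toy Z-stack for one tile: index 1 is "sharp", indices 0 and 2 are flat
rng = np.random.default_rng(0)
sharp = rng.integers(0, 255, (16, 16)).astype(float)
blurred = np.full((16, 16), 128.0)
stack = {(1, 1): [blurred, sharp, blurred]}
fmap = build_focus_map(stack, {(1, 1): (100.0, 200.0)})
print(fmap)  # {(1, 1): (100.0, 200.0, 1)}
```

A real system would sweep the stage in Z and use calibrated stage units rather than stack indices, but the map structure (array number to XYZ stage coordinates) mirrors the one in FIG. 8.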
  • From step S6 onward, focusing is performed on the PID bright spots based on the focusing information obtained from the bright-field image. That is, as described above, the bright-field focus position of the tissue sample 50 is specified by the processing of steps S1 to S5, and by further performing focusing on the PID bright spots based on this, a more precisely focused fluorescence image can be obtained.
  • a fluorescence image of the PID for focusing is acquired (step S6). That is, the control unit 61 controls the excitation light source 32 to irradiate the tissue sample 50 with the excitation light of the PID, and acquires the fluorescence image of the PID by the second image sensor 33.
  • the fluorescence bright spot of an arbitrary PID on the obtained fluorescence image is selected, and the second focus measurement position P2 is set (step S7).
  • The user manually sets one or a plurality of second focus measurement positions P2 on the tissue sample 50, as shown in FIG. 9.
  • the second focus measurement position P2 may be automatically set.
  • In step S8, focusing is performed on the set second focus measurement position P2.
  • The control unit 61 adjusts the optical axis position to the second focus measurement position P2 while moving the stage 40 in the XY directions, refers to the focus position in the focus map created in step S5, and finely adjusts in the Z-coordinate direction to obtain the fluorescence focus position (Z coordinate) for the second focus measurement position P2.
  • The control unit 61 modifies the focus map created in step S5 using the newly obtained focus position (step S9). As described above, the focusing on the tissue sample 50 is completed.
  • the focusing method described above is merely an example, and the method applicable to the present invention is not limited to this.
  • The fluorescent label applied to the tissue specimen 50 is excited (step S10).
  • the control unit 61 controls the excitation light source 32 to irradiate the tissue specimen 50 with excitation light that excites the labeled PID.
  • a partial image of the tissue sample 50 is acquired (step S11).
  • The control unit 61 controls movement of the stage 40 and controls the second image acquisition unit 30 to acquire partial fluorescence images. That is, the optical axis position and the focus position are moved to the XYZ coordinates indicated by the stage coordinates stored in the focus map, and the second image sensor 33 is controlled to capture an image for each small region.
  • By using a high-magnification objective lens as the objective lens 34, a high-resolution image can be acquired.
  • a strip-shaped scan image as shown in FIG. 11 is acquired as a partial image.
  • imaging of the tissue specimen 50 is started from the upper left corner.
  • the control unit 61 irradiates the excitation light and scans the image pickup position by the second image pickup element 33 while moving in the positive direction of the Y axis of the tissue section 51 to acquire a partial image A.
  • the control unit 61 moves the image pickup position by the second image pickup element 33 in the positive direction of the X axis to acquire the partial image B.
  • The partial images are acquired in order (partial image C, ..., partial image N), and the imaging is completed.
  • The control unit 61 controls the image processing unit 63 as the creating means to synthesize the captured partial images and create the entire fluorescence image of the imaging region R (step S12). That is, by arranging and pasting the partial images A to N in the X-axis direction, a high-resolution fluorescence image of the entire tissue sample 50 is obtained. Further, the image processing unit 63 A/D-converts the entire fluorescence image of the obtained imaging region R into a digital image (step S13). With the above, the creation of the WSI is completed. The created WSI is stored in the database 80 as a storage means. A user who wants to refer to the WSI can read the image data into a personal computer or the like via the communication unit 64 and observe it on the display.
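The pasting of the strip-shaped partial images A to N along the X axis can be sketched as a simple concatenation (assuming, for illustration, perfectly registered strips with no overlap; a real scanner would also need overlap blending and registration):

```python
import numpy as np

def stitch_strips(strips):
    """Paste strip-shaped partial images A..N side by side along the
    X axis to form the whole fluorescence image of imaging region R.
    Assumes the strips share the same height and abut without overlap."""
    return np.concatenate(strips, axis=1)

a = np.zeros((4, 3), dtype=np.uint16)  # partial image A (height 4, width 3)
b = np.ones((4, 2), dtype=np.uint16)   # partial image B (height 4, width 2)
whole = stitch_strips([a, b])
print(whole.shape)  # (4, 5)
```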
  • WSI is not particularly limited to a fluorescent image as long as the target for creating the WSI (for example, the entire tissue sample 50) can be grasped.
  • WSI may be acquired as a bright field image.
  • the WSI of the bright field image can be created in the same manner as the creation of the WSI of the fluorescence image described above.
  • a WSI of an image for specifying the cell type (image for specifying the cell type) and an image for analyzing the specified cell (image for cell analysis) is created.
  • The target cell types include, for example, classification by differentiation such as hepatocytes, glial cells, and T cells; pathological classification such as canceration and inflammation; classification under specific conditions such as cell cycle and necrosis; and classification by spatial arrangement and shape features such as infiltration and protrusions.
  • the region of interest selection step is a step of selecting a region of interest (FOV: Field Of View) using the created WSI (image for specifying cell type, image for cell analysis).
  • the “region of interest” is not limited as long as it is a region useful for pathological diagnosis, and may include a normal tissue region as well as a lesion portion such as a cancer region.
  • a region in which the effect of the drug appears or a region in which the drug is present can also be included in the region of interest because it is considered to be useful for pathological diagnosis.
  • A region specified by the color of the bright-field image or a region specified by the shading of the fluorescence image may also be useful, and the size of each region can be specified as appropriate, preferably about 100 μm square.
  • Specifying regions using the number of fluorescent bright spots as an index may also be useful for pathological diagnosis, and regions with large, medium, and small numbers of bright spots can each be identified and extracted. Regions with a large number are sometimes called hot spots, and regions with a small number are sometimes called cold spots.
  • The control unit 61 reads two predetermined WSIs (the image for cell type identification and the image for cell analysis) from the database 80 with image processing software, performs preprocessing such as noise removal and region setting (exclusion of non-specific fluorescence) on the read images, and acquires two images for selecting a region of interest (step S21).
  • FIG. 13A is an H-stained bright field image.
  • FIG. 13B is a PD-L1 stained fluorescent image.
  • the lower right image in FIGS. 13A and 13B is an enlarged image obtained by enlarging a part of each image.
  • The noise removal process is, for example, a process of suppressing the autofluorescence luminance of the read fluorescence image.
  • The control unit 61 generates a frequency spectrum by DFT processing, multiplies the frequency spectrum by a high-pass filter image, and generates a low-frequency-component-suppressed image by IDFT. Since the fluorescence image is huge, it is preferable to divide it according to the memory load incurred when the DFT is executed; that is, the above processing may be executed for each divided image of the fluorescence image, and the results finally combined.
  • The fluorescence signal from the PID and the autofluorescence signal have different spatial frequency profiles: autofluorescence contains a large amount of low-frequency components, whereas PID fluorescence has a steep peak shape and contains fewer low-frequency components than autofluorescence.
  • The frequency spectrum can be reversibly converted back into a spatial signal by IDFT/IFFT. As described above, by inversely transforming the frequency spectrum, with the luminance component due to autofluorescence selectively suppressed, back into spatial information by IDFT/IFFT, a fluorescence image with suppressed autofluorescence is obtained.
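The DFT → high-pass filter → IDFT pipeline described above can be sketched with NumPy's FFT (the ideal circular high-pass mask and the cutoff value are illustrative choices; the patent does not specify the filter shape):

```python
import numpy as np

def suppress_autofluorescence(img, cutoff=0.05):
    """DFT -> multiply by a high-pass filter image -> inverse DFT.
    Low spatial frequencies (broad autofluorescence background) are
    attenuated while steep PID bright spots are largely preserved."""
    f = np.fft.fftshift(np.fft.fft2(img.astype(float)))
    h, w = img.shape
    yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    radius = np.sqrt((yy / h) ** 2 + (xx / w) ** 2)   # normalized frequency
    hpf = (radius > cutoff).astype(float)             # ideal high-pass mask
    return np.fft.ifft2(np.fft.ifftshift(f * hpf)).real

# Flat background (pure low frequency) is removed; a bright spot survives
img = np.full((64, 64), 100.0)
img[32, 32] += 1000.0
out = suppress_autofluorescence(img)
print(round(out.mean(), 3))  # near 0: the DC background is suppressed
```

Because the spot occupies only high frequencies, `out[32, 32]` remains close to its original excess luminance while the uniform background is driven to zero, matching the spectral argument above.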
  • The fluorescence at the peripheral edge of the tissue may become stronger due to uneven staining by the automatic stainer, and the region setting is a process for restricting analysis to the required region. Specifically, if uneven staining density occurs in the tissue section due to factors such as non-uniform flow of the staining solution in the automatic stainer or dominant dye adsorption at the edges due to the thickness of the tissue section, non-specific fluorescence occurs frequently in the affected parts as an artifact and the fluorescence luminance cannot be analyzed uniformly, so such parts must be excluded from the required region.
  • The control unit 61 specifies a search start point at the edge of the tissue section image by a conventionally known method, and generates ROI information for the edge of the section using a contour tracking algorithm that allows pixel value changes within a specific range. An example of a conventionally known method is ImageJ's Wand tool (National Institutes of Health, MD, USA). Next, the control unit 61 performs a reduction process of several hundred pixels (about one microscope field of view) on the generated ROI information (that is, the designated pixel area is shrunk inward). Next, the control unit 61 excludes the outside of the required region by superimposing the generated ROI information on the HPF image and deleting the fluorescence pixel values outside the region (setting the luminance to zero).
  • the required area is narrowed down.
  • narrowing down the area is not essential in cases where non-specific uneven staining around the tissue does not occur.
  • the autofluorescent region of cells without nuclei such as erythrocytes may be excluded from the required region. Further, in order to narrow down the required region, various known methods can be applied in addition to the above-mentioned method of deleting the fluorescent pixel value outside the required region (luminance is zero).
  • the control unit 61 cuts out a necessary region from the autofluorescence suppression image and generates an image for selecting a region of interest.
  • The control unit 61 applies machine learning or an image feature classification method to the cell type identification image for region-of-interest selection acquired in step S21, identifies the cell type, and creates region information indicating the analysis target region (step S22).
  • Specific examples of the machine learning or image feature classification method include cell type recognition by machine learning, dye extraction from a bright-field H-stained image, and tumor marker-stained cell extraction.
  • the analysis target region is a region that is a specific type of cell and is the target of analysis, and in step S23 described later, feature amount data regarding the target substance in the analysis target region is calculated.
  • the analysis target region is a specific site in a specific type of cell, for example, the cell nucleus, cytoplasm, cell membrane, specific organelle of a specific type of cell, or an arbitrary defined by image processing, calculation, or the like. Area of.
  • An example of the region information is an image in which the shape of each cell can be seen, as shown in FIG. 14.
  • The control unit 61 identifies the analysis target region in the fluorescence image for region-of-interest selection acquired in step S21 based on the region information created in step S22, and calculates feature amount data (first feature amount data) relating to the target substance in each region (step S23). For example, the average luminance value in each region is calculated as the first feature amount data.
  • The control unit 61 identifies the analysis target region in the bright-field image for region-of-interest selection acquired in step S21 based on the region information created in step S22, and calculates feature amount data (second feature amount data) relating to the cell type in each region (step S24). For example, the cell density in each region is calculated as the second feature amount data.
  • the user can recognize the distribution of the cells in the bright-field image by creating a heat map or a histogram in which the value of the cell density is represented by a shade of color or the like.
  • The control unit 61 performs arithmetic processing using the first feature amount data calculated in step S23 and the second feature amount data calculated in step S24, and calculates composite feature amount data (step S25). Specifically, the control unit 61 performs arithmetic processing such as multiplying the average luminance value (first feature amount data) by the cell density (second feature amount data) for each analysis target region; the calculation result is the composite feature amount. Instead of using the first and second feature amount data as they are, the arithmetic processing may be performed after processing such as adding certain offset values, normalizing, or converting to another dimension such as n-step stratification.
  • The control unit 61 selects a region of interest based on the calculated composite feature amount data (step S26). For example, a region in which the value of the composite feature amount data is equal to or higher than a predetermined first threshold value can be selected as a region of interest (hot spot). Further, a region whose value is less than a second threshold value smaller than the first threshold value may be selected as a region of interest (cold spot). Alternatively, the top N regions with the highest composite feature amount values may be selected as regions of interest. Thereby, for example, as shown in FIG. 15, result data in which the region of interest SP is selected can be obtained.
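The composite feature computation and the threshold-based hot/cold spot selection can be sketched per analysis target region (the multiplication follows the example in the text; the threshold values and function name are illustrative):

```python
import numpy as np

def select_regions(mean_luminance, cell_density, hot_thr, cold_thr):
    """Composite feature = first feature amount (mean PID luminance) x
    second feature amount (cell density), per analysis target region.
    Regions at or above `hot_thr` are hot spots; below `cold_thr`, cold
    spots (cold_thr is assumed smaller than hot_thr)."""
    composite = np.asarray(mean_luminance) * np.asarray(cell_density)
    hot = np.flatnonzero(composite >= hot_thr)
    cold = np.flatnonzero(composite < cold_thr)
    return composite, hot, cold

lum = [10.0, 50.0, 5.0]     # first feature amount data per region
dens = [2.0, 4.0, 1.0]      # second feature amount data per region
comp, hot, cold = select_regions(lum, dens, hot_thr=100.0, cold_thr=10.0)
print(list(comp), list(hot), list(cold))  # [20.0, 200.0, 5.0] [1] [2]
```

The offset/normalization/stratification variants mentioned above would simply transform `mean_luminance` and `cell_density` before the multiplication.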
  • In steps S25 and S26 described above, the composite feature amount data is calculated from the first feature amount data and the second feature amount data, and the region of interest is selected from its values.
  • Alternatively, a predetermined spot may be selected (specified) from the fluorescence image using the first feature amount data, and a predetermined spot may be selected (specified) from the bright-field image using the second feature amount data (step S27), and the region of interest may be selected from the result of superimposing these results (step S28).
  • a region in which the average brightness value or the cell density is equal to or higher than a predetermined first threshold value is selected as a predetermined spot.
  • A region whose value is less than a second threshold value smaller than the first threshold value may be selected as the predetermined spot, or a region between the first threshold value and the second threshold value may be selected as the predetermined spot.
  • the top N spots having a high average brightness value or cell density value may be selected as predetermined spots.
  • A region where these spots overlap each other is selected as the region of interest. For example, as shown in FIG. 17, the region SP3 in which the predetermined spot SP1 (solid line) based on the first feature amount data and the predetermined spot SP2 (broken line) based on the second feature amount data overlap is selected as the region of interest.
  • Modification 2: Next, Modification 2 of the region of interest selection process will be described with reference to the flowchart of FIG. In the region of interest selection process described above, the case where one image for cell type identification and one image for cell analysis are acquired in step S21 was described as an example, but a plurality of these images can also be used. Hereinafter, the region of interest selection process using two images for cell type identification and two images for cell analysis will be described.
  • Examples of the cell type identification image include, although not shown, an Epcam-stained fluorescence image and a DAB (CD8) -stained bright-field image.
  • examples of the cell analysis image include a PD-L1 stained fluorescent image and a PD-1 stained fluorescent image.
  • the control unit 61 performs preprocessing on such an image in the same manner as in step S21 to acquire an image for selecting a region of interest (step S31).
  • control unit 61 identifies different analysis target regions from each of the two cell type identification images in the same manner as in step S22, and creates region information for each region (step S32). For example, region information for identifying a cancer cell region is created from an Epcam-stained fluorescent image. Further, from the DAB (CD8) stained bright field image, region information for identifying the CD8-positive T cell region is created.
  • control unit 61 calculates the first feature amount data from each of the two cell analysis images in the same manner as in step S23 (step S33).
  • control unit 61 calculates the second feature amount data from each of the two cell type identification images in the same manner as in step S24 (step S34).
  • control unit 61 calculates the related information between the two cell types using the two second feature amount data calculated in step S34 (step S35). Specifically, the density ratio between the two cell types and the distance between the dense regions of the two cell types are calculated as related information.
  • The control unit 61 narrows down the area for each of the two cell analysis images based on the related information calculated in step S35, and selects a predetermined spot using the first feature amount data in the narrowed-down area (step S36). As a result, a predetermined spot is selected in the narrowed area of each of the two cell analysis images.
  • the selection of the predetermined spot using the first feature amount data can be performed in the same manner as in step S27.
  • control unit 61 selects a region of interest from the result of superimposing the results of selecting a predetermined spot in step S36 (step S37).
  • When regions with a high cancer cell / T cell density ratio are narrowed down, a region of interest in the (non-infiltrating) cancer cell region can be selected according to the cancer cell PD-L1 expression level. Further, when regions with a low cancer cell / T cell density ratio are narrowed down, a region of interest in the (non-infiltrating) T cell region can be selected according to the T cell PD-1 expression level. In addition, when regions where the cancer cell / T cell density ratio is within the thresholds are narrowed down, it is possible to select (1) a region of interest according to the cancer cell PD-L1 expression level in regions of T cell infiltration into cancer cells, (2) a region of interest according to the T cell PD-1 expression level, and (3) a region of interest according to co-expression, such as the product of the expression levels of (1) and (2) above.
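The density-ratio narrowing just described can be sketched as a per-region classification (the ratio thresholds and labels are illustrative assumptions; the patent leaves the concrete values unspecified):

```python
import numpy as np

def classify_by_density_ratio(cancer_density, t_cell_density,
                              low=0.5, high=2.0, eps=1e-9):
    """Per-region cancer cell / T cell density ratio: above `high` is
    treated as a (non-infiltrating) cancer cell region, below `low` as a
    (non-infiltrating) T cell region, and in between as an infiltration
    region where co-expression analysis applies."""
    ratio = np.asarray(cancer_density) / (np.asarray(t_cell_density) + eps)
    labels = np.where(ratio > high, "cancer",
                      np.where(ratio < low, "t_cell", "infiltration"))
    return ratio, labels

ratio, labels = classify_by_density_ratio([10.0, 1.0, 3.0], [2.0, 10.0, 2.0])
print(list(labels))  # ['cancer', 't_cell', 'infiltration']
```

Spot selection by the first feature amount data (step S36) would then run only inside the regions carrying the label of interest.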
  • the analysis image acquisition step is a step of acquiring an analysis image (re-photographed image) by photographing the area of interest selected as described above at a magnification higher than that of the tissue image. Specifically, the position of the selected region of interest is observed with the high-magnification objective lens of the second image acquisition unit 30, and an image is acquired again.
  • a high-magnification microscope such as BX63 + DP80 (Olympus) can also be used. By taking an enlarged image of the selected region of interest in this way, more accurate analysis becomes possible.
  • In addition to acquiring an analysis image for the selected region of interest by photographing at a magnification higher than that of the tissue image as described above, an image with an increased bit depth or a three-dimensional stack image (re-photographed image) of the selected region of interest may be acquired.
  • the analysis step is a step in which the control unit 61 analyzes using an image (re-photographed image) photographed at a high magnification as described above.
  • a step of evaluating the fluorescence bright spot derived from PID which will be described later, and a step of quantitatively evaluating the target substance can be mentioned.
  • the step of evaluating the fluorescent bright spots derived from PID include a step of measuring the number of fluorescent bright spots and a step of measuring the number of PID particles corresponding to the number of fluorescent bright spots.
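As a minimal stand-in for the bright-spot counting step (connected-component counting above a luminance threshold is one simple approach; the threshold, connectivity choice, and function name are illustrative, not the patent's method):

```python
import numpy as np

def count_bright_spots(img, threshold):
    """Count fluorescent bright spots as 4-connected groups of pixels
    above `threshold`. The PID particle number can then be estimated,
    e.g., by dividing total spot luminance by that of one particle."""
    mask = img > threshold
    visited = np.zeros_like(mask)
    count = 0
    for y, x in zip(*np.nonzero(mask)):
        if visited[y, x]:
            continue
        count += 1
        stack = [(y, x)]                      # flood-fill one spot
        while stack:
            cy, cx = stack.pop()
            if (0 <= cy < mask.shape[0] and 0 <= cx < mask.shape[1]
                    and mask[cy, cx] and not visited[cy, cx]):
                visited[cy, cx] = True
                stack += [(cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)]
    return count

img = np.zeros((6, 6))
img[1, 1] = 500.0                 # one isolated spot
img[3:5, 3:5] = 400.0             # one 2x2 spot
print(count_bright_spots(img, threshold=100.0))  # 2
```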
  • Examples of the step of quantitatively evaluating the target substance include a step of calculating a PID score.
  • The analysis is performed using a plurality of images (re-photographed images) corresponding to the cell type identification images and the cell analysis images used for selecting the region of interest. By using a plurality of images, more detailed analysis data can be obtained, such as statistical values of the expression level of the target substance per cell and a position image of the target substance.
  • The control unit 61 acquires a tissue image obtained by photographing the tissue sample 50, and selects, from the acquired tissue image, an image of a region of interest for re-imaging. Further, the control unit 61 acquires a cell type identification image for identifying one or more cell types in the tissue sample 50 and one or more cell analysis images for analyzing the identified cells, and selects the region of interest based on the feature amount related to the cell type calculated from the cell type identification image and the feature amount related to one or more target substances calculated from the cell analysis image.
  • Therefore, a region of interest suitable for the purpose can be selected automatically based on descriptive criteria. As a result, compared with the case where an observer selects the region of interest visually, analysis excluding arbitrariness becomes possible, so the fluorescent bright spots and the target substance can be evaluated with higher accuracy.
  • the tissue specimen is morphologically observed and stained so that one or more cell types can be identified by the control unit 61. This makes it possible to identify one or more cell types using one tissue specimen.
  • control unit 61 identifies the analysis target region in the cell type identification image by machine learning or an image feature classification method. As a result, a more accurate image for identifying the cell type can be obtained.
  • The control unit 61 calculates the feature amount related to the cell type from the analysis target region of the cell type identification image, and calculates the feature amount related to the target substance from the region corresponding to the analysis target region in the cell analysis image. Therefore, since the feature amounts are calculated only from the analysis target region in the cell type identification image and the cell analysis image, more accurate analysis can be performed efficiently.
  • control unit 61 selects a region of interest based on the complex feature amount calculated from the feature amount related to the cell type and the feature amount related to the target substance. Therefore, by using the complex feature amount, it is possible to quantitatively evaluate only the target substance existing in the target cell type, not the arbitrary target substance existing in the tissue sample, and the distribution of the target substance, etc. It becomes possible to grasp the above more accurately and perform analysis and the like with high accuracy.
  • The control unit 61 identifies the first spot from the cell type identification image based on the feature amount related to the cell type, identifies the second spot from the cell analysis image based on the feature amount related to the target substance, and selects the region where the first spot and the second spot overlap as the region of interest. Therefore, the range of methods for selecting the region of interest is expanded, and the region of interest desired by the user can be selected.
  • The control unit 61 acquires a plurality of cell type identification images for identifying different cell types and a plurality of cell analysis images for analyzing the identified plurality of cell types. Therefore, more detailed analysis becomes possible.
  • The tissue image is a whole slide image obtained by photographing the entire tissue sample 50. Therefore, in pathological diagnosis using a whole slide image, highly accurate analysis can be performed efficiently.
  • The acquisition means includes a first photographing means for acquiring a tissue image (such as the low-magnification objective lens of the first image acquisition unit 20 or the second image acquisition unit 30) and a second photographing means having a higher photographing magnification than the first photographing means (such as the high-magnification objective lens of the second image acquisition unit 30), and the re-photographing of the extracted specific region is performed by the second photographing means. Therefore, an enlarged image of the selected region of interest can be taken, and more accurate analysis becomes possible.
  • a low-magnification microscope may be used in addition to the whole slide scanner as long as the WSI can be obtained at a low magnification.
  • the first photographing means and the second photographing means may be configured as different photographing means in the same photographing device, or may be configured as two different photographing devices.
  • an analysis means for analyzing the re-photographed image obtained by re-photographing the region of interest by the second photographing means is further provided. Therefore, it is possible to analyze the enlarged image of the selected region of interest.
  • a tissue section is targeted as a biological sample, and the tissue sample 50 is stained with an immunostaining agent containing fluorescent substance-accumulated nanoparticles as a fluorescent label.
  • the target of the biological sample may be a cultured cell or a gene (DNA).
  • the present invention can be used in an image processing system, an image processing method, and a program that enable highly accurate analysis to be performed efficiently.
  • 1 Image processing system
  • 10 Microscope device
  • 20 First image acquisition unit
  • 21 Bright field light source
  • 22 First image sensor
  • 30 Second image acquisition unit (first imaging means, second imaging means)
  • 31 Transmission light source
  • 32 Excitation light source
  • 33 Second image sensor
  • 40 Stage
  • 50 Tissue specimen
  • 60 Control device
  • 61 Control unit
  • 62 Storage unit
  • 63 Image processing unit
  • 64 Communication unit
  • 70 Display device
  • 80 Database

Abstract

In the image processing system (1), a control unit (61) is configured to acquire tissue images captured from a tissue sample and to select, from the acquired tissue images, an image of a region of interest for re-imaging. The control unit (61) acquires a cell type identification image for identifying at least one cell type in a tissue sample (50) and at least one cell analysis image for analyzing the identified cells, and then selects a region of interest based on a feature amount relating to the cell type calculated from the cell type identification image and a feature amount relating to at least one target substance calculated from the cell analysis image.
PCT/JP2020/023608 2019-06-28 2020-06-16 Image processing system, image processing method, and program WO2020262117A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021528267A JPWO2020262117A1 (fr) 2019-06-28 2020-06-16

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-121466 2019-06-28
JP2019121466 2019-06-28

Publications (1)

Publication Number Publication Date
WO2020262117A1 true WO2020262117A1 (fr) 2020-12-30

Family

ID=74061627

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/023608 WO2020262117A1 (fr) Image processing system, image processing method, and program

Country Status (2)

Country Link
JP (1) JPWO2020262117A1 (fr)
WO (1) WO2020262117A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010078983A (ja) * Microscope system, program, and method
JP2011215061A (ja) * Image processing apparatus, image processing method, and program
WO2016093090A1 (fr) * Image processing apparatus and image processing program
JP2016125913A (ja) * Image acquisition apparatus and control method of image acquisition apparatus
WO2018003063A1 (fr) * Analysis device, analysis method, analysis program, and display device
JP2018503906A (ja) * Systems and methods for co-expression analysis in immunoscore computation
JP2018072240A (ja) * Image diagnosis support apparatus and system, and image diagnosis support method
JP2018533116A (ja) * Image processing systems and methods for displaying multiple images of a biological specimen

Also Published As

Publication number Publication date
JPWO2020262117A1 (fr) 2020-12-30

Similar Documents

Publication Publication Date Title
JP6350527B2 Image processing device, pathological diagnosis support system, image processing program, and pathological diagnosis support method
US9057701B2 System and methods for rapid and automated screening of cells
JP6074427B2 Systems and methods for generating bright-field images using fluorescence images
JP6911855B2 Biological substance quantification method, image processing device, pathological diagnosis support system, and program
JP6635108B2 Image processing device, image processing method, and image processing program
JP7173034B2 Image processing device, focus position specifying method, and focus position specifying program
WO2017126420A1 Image processing device and program
JP2020173204A Image processing system, image processing method, and program
JP6493398B2 Diagnosis support information generation method, image processing device, diagnosis support information generation system, and image processing program
JP6547424B2 Fluorescence image focusing system, focusing method, and focusing program
US11423533B2 Image processing method and image processing system
WO2015146938A1 Tissue evaluation method, image processing device, pathological diagnosis support system, and program
JP7235036B2 Image processing method, image processing device, and program
EP3327427A1 Analysis method and analysis system for target biological substances
JP6375925B2 Image processing device, image processing system, image processing program, and image processing method
WO2020262117A1 Image processing system, image processing method, and program
JP6702339B2 Image processing device and program
JP6578928B2 Fluorescence image focus position specifying system, focus position specifying method, and focus position specifying program
JP7160047B2 Biological substance quantification method, image processing device, and program
WO2020209217A1 Image processing system, image processing method, and program
WO2021124866A1 Image processing method, image processing system, and program
WO2021192910A1 Image generation method, image generation device, and program
JP2016118428A Image processing device, image processing system, image processing program, and image processing method
US11600020B2 Biological substance quantification method, image processing device, pathological diagnosis support system, and recording medium storing computer readable program
WO2022059312A1 Focused image selection method, focused image selection device, and focused image selection program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20832550

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021528267

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20832550

Country of ref document: EP

Kind code of ref document: A1