WO2022209443A1 - Medical image analysis device, medical image analysis method, and medical image analysis system - Google Patents
- Publication number
- WO2022209443A1 (PCT/JP2022/007460)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/0012—Biomedical image inspection
- G06T7/11—Region-based segmentation
- G06T2207/10056—Microscopic image
- G06T2207/20021—Dividing image into blocks, subimages or windows
- G06T2207/20084—Artificial neural networks [ANN]
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
Definitions
- the present disclosure relates to a medical image analysis device, a medical image analysis method, and a medical image analysis system.
- Information to be extracted includes the number of cell types, the cell density for each cell type, and the positional relationship of cells for each cell type. Since atypia is used to distinguish tumor cells, it is necessary to accurately extract information such as cell shape. One example of cell shape information is the NC ratio (nuclear-cytoplasmic ratio). Pathological image analysis that extracts the shapes of all cells from a digital pathological image (sometimes called a WSI (Whole Slide Image)) is therefore important.
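As an illustration of the NC ratio mentioned above, a minimal sketch in Python (the function name and area values are hypothetical, not part of the disclosure):

```python
# Hypothetical sketch of the NC (nuclear-cytoplasmic) ratio: the ratio of
# nuclear area to cytoplasmic (non-nuclear) area for one cell.
# All names and values here are illustrative assumptions.

def nc_ratio(nucleus_area: float, cell_area: float) -> float:
    """Nuclear area divided by cytoplasmic area."""
    cytoplasm_area = cell_area - nucleus_area
    if cytoplasm_area <= 0:
        raise ValueError("cell area must exceed nucleus area")
    return nucleus_area / cytoplasm_area

# A cell with an enlarged nucleus yields a higher NC ratio:
print(nc_ratio(40.0, 120.0))  # 0.5
print(nc_ratio(80.0, 120.0))  # 2.0
```

This is only meant to show why accurate per-cell shape extraction matters: the ratio is sensitive to errors in either area.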
- Patent Document 1 discloses a method for extracting cell nucleus boundaries from digital pathological images. This method improves the performance of cell shape detection by setting optimum parameters (threshold values) for each of a plurality of small regions obtained by dividing a digital pathological image.
- In the method of Patent Document 1, however, there is a problem that the accuracy of detecting cells present at the boundaries of the small regions (cells crossing the boundaries) is degraded. This problem does not occur if the shapes of cells and the like are extracted from the entire digital pathological image without dividing it into small regions.
- the present disclosure provides a medical image analysis device, a medical image analysis method, and a medical image analysis system that accurately detect information on tissue existing at the boundary of a region in which tissue is to be detected.
- a medical image analysis apparatus of the present disclosure includes a region setting unit that sets a first region and a second region partially overlapping with the first region in an image to be processed obtained by imaging a biological tissue.
- a specifying unit that specifies a first tissue region, which is a tissue region included in the first region, and a second tissue region, which is a tissue region included in the second region; and an overlap processing unit that performs processing on the first tissue region and the second tissue region, which at least partially overlaps the first tissue region, to set a third tissue region.
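The overlap processing described in the claim above can be pictured with a minimal sketch (the bounding-box representation and the union-merge rule are assumptions for illustration; the disclosure does not limit the processing to this form):

```python
# Illustrative sketch: tissue regions detected in two partially
# overlapping windows are merged into a single "third" tissue region
# when they overlap. Boxes are (x0, y0, x1, y1); values are assumed.

def boxes_overlap(a, b):
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

def merge(a, b):
    """Union bounding box of two overlapping detections."""
    return (min(a[0], b[0]), min(a[1], b[1]),
            max(a[2], b[2]), max(a[3], b[3]))

# first/second tissue regions found in the first/second windows;
# the tissue crosses the window boundary at x = 100:
first_tissue = (90, 10, 110, 30)
second_tissue = (95, 12, 130, 28)
third_tissue = (merge(first_tissue, second_tissue)
                if boxes_overlap(first_tissue, second_tissue) else None)
print(third_tissue)  # (90, 10, 130, 30)
```

Because the two windows overlap, a cell straddling the boundary is seen whole by at least one window, and the merge removes the duplicate detection.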
- FIG. 4 is a diagram showing an example of an image in which a processing target region is set in a pathological image
- FIG. 4 is a diagram showing an example of dividing an image to be processed into a plurality of small areas and setting a target area for each small area
- FIG. 4 is a diagram showing a margin area in a target area
- FIG. 10 is a diagram showing an example of repeating target region setting and tissue region specifying processing
- FIG. 4 illustrates an overlap region where two regions of interest overlap each other;
- FIG. 4 illustrates an overlap region where two regions of interest overlap each other;
- FIG. 4 is a diagram showing an example in which tissue regions overlap; a diagram showing an example of eliminating the overlap between overlapping tissue regions
- FIG. 5 is a diagram showing an example of a detection result image displayed by a detection result display unit; a flowchart of an example of the operation of the medical image analysis apparatus according to this embodiment
- FIG. 10 is a diagram showing an example of duplicate elimination processing;
- FIG. 10 is a diagram showing an example of duplicate elimination processing;
- FIG. 11 is a diagram showing a specific example of modification 2;
- FIG. 11 is a diagram showing a specific example of modification 3;
- FIG. 1 is an example of the configuration of a microscope system 100 as one embodiment of the medical image analysis system of the present disclosure.
- a microscope system 100 shown in FIG. 1 includes a microscope device 110 , a control section (control device) 120 , and an information processing section (information processing device) 130 .
- the microscope device 110 includes a light irradiation section 111 , an optical section 112 and a signal acquisition section 113 .
- the microscope device 110 may further include a sample placement section 114 on which the biological sample S is placed. Note that the configuration of the microscope device 110 is not limited to that shown in FIG. 1; for example, a light source located outside the microscope device 110 may be used as the light irradiation section 111.
- the light irradiation section 111 may be arranged such that the sample placement section 114 is sandwiched between the light irradiation section 111 and the optical section 112, or may be arranged on the side where the optical section 112 exists, for example.
- the microscope apparatus 110 may be configured for one or more of bright field observation, phase contrast observation, differential interference observation, polarization observation, fluorescence observation, and dark field observation.
- the microscope system 100 may be configured as a so-called WSI (Whole Slide Imaging) system or a digital pathology system, and can be used for pathological diagnosis.
- Microscope system 100 may also be configured as a fluorescence imaging system, in particular a multiplex fluorescence imaging system.
- the microscope system 100 may be used to perform intraoperative pathological diagnosis or remote pathological diagnosis.
- the microscope device 110 can acquire data of the biological sample S obtained from the subject of the surgery and transmit the data to the information processing unit 130.
- the microscope device 110 can transmit the acquired data of the biological sample S to the information processing unit 130 located in a place (another room, building, or the like) away from the microscope device 110 .
- the information processing section 130 receives and outputs the data.
- a user of the information processing unit 130 can make a pathological diagnosis based on the output data.
- the biological sample S may be a sample containing a biological component.
- the biological components may be tissues, cells, liquid components of a living body (blood, urine, etc.), cultures, or living cells (cardiomyocytes, nerve cells, fertilized eggs, etc.).
- the biological sample S may be a solid, a specimen fixed with a fixing reagent such as paraffin, or a solid formed by freezing.
- the biological sample S can be a section of the solid.
- a specific example of the biological sample S is a section of a biopsy sample.
- the biological sample S may be one that has undergone processing such as staining or labeling.
- the treatment may be staining for indicating the morphology of biological components or for indicating substances (surface antigens, etc.) possessed by biological components; examples include HE (Hematoxylin-Eosin) staining and immunohistochemistry staining.
- the biological sample S may have been subjected to the treatment with one or more reagents, and the reagents may be fluorescent dyes, coloring reagents, fluorescent proteins, or fluorescently labeled antibodies.
- the specimen may be prepared from a specimen or tissue sample collected from the human body for the purpose of pathological diagnosis or clinical examination. Moreover, the specimen is not limited to the human body, and may be derived from animals, plants, or other materials.
- the properties of the specimen may differ depending on the type of tissue used (such as an organ or cells), the type of disease targeted, the subject's attributes (such as age, sex, blood type, or race), or the subject's lifestyle habits (for example, eating habits, exercise habits, or smoking habits).
- the specimens may be managed with identification information (bar code information, QR code (trademark) information, etc.) that allows each specimen to be identified.
- the light irradiation section 111 includes a light source for illuminating the biological sample S and an optical section that guides the light emitted from the light source to the specimen.
- the light source may irradiate the biological sample with visible light, ultraviolet light, or infrared light, or a combination thereof.
- the light source may be one or more of halogen lamps, laser light sources, LED lamps, mercury lamps, and xenon lamps. A plurality of types and/or wavelengths of light sources may be used in fluorescence observation, and may be appropriately selected by those skilled in the art.
- the light irradiator may have a transmissive, reflective, or episcopic (coaxial or lateral) configuration.
- the optical section 112 is configured to guide the light from the biological sample S to the signal acquisition section 113 .
- the optical unit 112 can be configured to allow the microscope device 110 to observe or image the biological sample S.
- the optical section 112 can include an objective lens.
- the type of objective lens may be appropriately selected by those skilled in the art according to the observation method.
- the optical unit 112 may include a relay lens for relaying the image magnified by the objective lens to the signal acquisition unit 113 .
- the optical unit 112 may further include optical components other than the objective lens and the relay lens, an eyepiece lens, a phase plate, a condenser lens, and the like.
- the optical section 112 may further include a wavelength separation section configured to separate light having a predetermined wavelength from the light from the biological sample S.
- the wavelength separation section can be configured to selectively allow light of a predetermined wavelength or wavelength range to reach the signal acquisition section.
- the wavelength separator may include, for example, one or more of a filter that selectively transmits light, a polarizing plate, a prism (Wollaston prism), and a diffraction grating.
- the optical components included in the wavelength separation section may be arranged, for example, on the optical path from the objective lens to the signal acquisition section.
- the wavelength separation unit is provided in the microscope apparatus when fluorescence observation is performed, particularly when an excitation light irradiation unit is included.
- the wavelength separator may be configured to separate fluorescent light from each other or white light and fluorescent light.
- the signal acquisition unit 113 can be configured to receive light from the biological sample S and convert the light into an electrical signal, particularly a digital electrical signal.
- the signal acquisition unit 113 may be configured to acquire data regarding the biological sample S based on the electrical signal.
- the signal acquisition unit 113 may be configured to acquire data of an image of the biological sample S (in particular, a still image, a time-lapse image, or a moving image), and may in particular be configured to acquire data of the image magnified by the optical section 112.
- the signal acquisition unit 113 has an imaging device that includes one or more imaging elements, such as CMOS or CCD sensors, each having a plurality of pixels arranged one-dimensionally or two-dimensionally.
- the signal acquisition unit 113 may include an imaging element for acquiring a low-resolution image and an imaging element for acquiring a high-resolution image, or an imaging element for sensing (such as AF) and an imaging element for image output for observation.
- the imaging element includes a signal processing unit (including one, two, or three of a CPU, a DSP, and memory) that performs signal processing using pixel signals from each pixel, and an output control unit that controls the output of image data generated from the pixel signals and of processed data generated by the signal processing unit.
- the imaging device may include an asynchronous event detection sensor that detects, as an event, a change in brightness of a pixel that photoelectrically converts incident light exceeding a predetermined threshold.
- An imaging device including the plurality of pixels, the signal processing section, and the output control section may preferably be configured as a one-chip semiconductor device.
- the control unit 120 controls imaging by the microscope device 110 .
- the control unit 120 can drive the movement of the optical unit 112 and/or the sample placement unit 114 to adjust the positional relationship between the optical unit 112 and the sample placement unit.
- the control unit 120 can move the optical unit and/or the sample placement unit in a direction toward or away from each other (for example, the optical axis direction of the objective lens).
- the control section may move the optical section and/or the sample placement section in any direction on a plane perpendicular to the optical axis direction.
- the control unit may control the light irradiation unit 111 and/or the signal acquisition unit 113 for imaging control.
- the sample mounting section 114 may be configured such that the position of the biological sample S on the sample mounting section can be fixed, and may be a so-called stage.
- the sample mounting section 114 can be configured to move the position of the biological sample S in the optical axis direction of the objective lens and/or in a direction perpendicular to the optical axis direction.
- the information processing section 130 can acquire data (imaging data, etc.) acquired by the microscope device 110 from the microscope device 110 .
- the information processing section 130 can perform image processing on captured data.
- the image processing may include color separation processing.
- the color separation process can include, for example, processing that extracts data of light components of a predetermined wavelength or wavelength range from the captured data to generate image data, and processing that removes data of light components of a predetermined wavelength or wavelength range from the captured data.
- the image processing may include autofluorescence separation processing for separating the autofluorescence component and the dye component of a tissue section, and fluorescence separation processing for separating the wavelengths of dyes having different fluorescence wavelengths. In the autofluorescence separation processing, among a plurality of specimens having the same or similar properties, autofluorescence signals extracted from one specimen may be used to remove autofluorescence components from the image information of another specimen.
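Color separation and fluorescence separation of the kind described above are commonly modeled as per-pixel linear unmixing; a minimal sketch under that assumption (the dye signatures and pixel values are illustrative, and this is not stated to be the patent's method):

```python
import numpy as np

# Assumed linear mixing model: a pixel's measured channel values are a
# linear combination of known dye signatures; unmixing solves for the
# per-dye abundances. Signature values are illustrative assumptions.
signatures = np.array([[0.9, 0.2],   # channel 1 response of dye A, dye B
                       [0.1, 0.8]])  # channel 2 response (columns = dyes)
measured = np.array([2.0, 1.0])      # one pixel's measured channel values

# Solve signatures @ abundances = measured for the dye abundances:
abundances = np.linalg.solve(signatures, measured)
print(np.round(abundances, 3))
```

Autofluorescence removal can be treated the same way, with the autofluorescence spectrum included as one more column of the signature matrix.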
- the information processing section 130 may transmit data for imaging control to the control section 120, and the control section 120 receiving the data may control imaging by the microscope device 110 according to the data.
- the information processing unit 130 may be configured as an information processing device such as a general-purpose computer, and may include a CPU, RAM, and ROM.
- the information processing section 130 may be included in the housing of the microscope device 110 or may be outside the housing. Also, various processes or functions by the information processing unit may be realized by a server computer or cloud connected via a network.
- a method of imaging the biological sample S by the microscope device 110 may be appropriately selected by a person skilled in the art according to the type of the biological sample S, the purpose of imaging, and the like. An example of the imaging method will be described below.
- the microscope device 110 can first identify an imaging target region (processing target region).
- the imaging target region may be specified so as to cover the entire region in which the biological sample S exists, or may be specified so as to cover a target portion of the biological sample S (a portion where a target tissue section, target cell, or target lesion exists). Next, the microscope device 110 divides the imaging target region into a plurality of divided regions of a predetermined size and sequentially images each divided region. As a result, an image of each divided region is obtained.
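The division into fixed-size divided regions can be sketched as follows (a minimal illustration; the function name and sizes are assumptions):

```python
# Illustrative sketch: divide an imaging target region into divided
# regions of a predetermined size. Regions are (x0, y0, x1, y1) boxes;
# edge tiles are clipped to the region bounds.

def divide_region(width, height, tile):
    tiles = []
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            tiles.append((x, y, min(x + tile, width), min(y + tile, height)))
    return tiles

tiles = divide_region(400, 400, 100)
print(len(tiles))      # 16 divided regions, as in the example of FIG. 2(A)
print(tiles[0])        # (0, 0, 100, 100)
```

Each returned box is one divided region to be imaged in turn.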
- FIG. 2(A) is an explanatory diagram of one example of an imaging method.
- the microscope device 110 identifies an imaging target region R that covers the entire biological sample S.
- the microscope device 110 divides the imaging target region R into 16 divided regions.
- the microscope device 110 can image the divided region R1, and then image any region included in the imaging target region R, such as a region adjacent to the divided region R1.
- imaging of the divided regions is performed until there are no unimaged divided regions. Regions other than the imaging target region R may also be imaged based on the captured image information of the divided regions.
- the positional relationship between the microscope device 110 and the sample placement section is adjusted in order to image the next divided area.
- the adjustment may be performed by moving the microscope device 110, moving the sample placement section 114, or moving both of them.
- the image capturing device that captures each divided area may be a two-dimensional image sensor (area sensor) or a one-dimensional image sensor (line sensor).
- the signal acquisition unit 113 may capture an image of each divided area via the optical unit.
- the imaging of each divided region may be performed continuously while moving the microscope device 110 and/or the sample mounting section 114, or the movement of the microscope device 110 and/or the sample mounting section 114 may be stopped when imaging each divided region.
- the imaging target area may be divided so that the divided areas partially overlap each other, or the imaging target area may be divided so that the divided areas do not overlap.
- Each divided area may be imaged multiple times by changing imaging conditions such as focal length and/or exposure time.
- the information processing section 130 can generate image data of a wider area by synthesizing a plurality of adjacent divided areas. By performing the synthesizing process over the entire imaging target area, it is possible to obtain an image of a wider area of the imaging target area. Also, image data with lower resolution can be generated from the image of the divided area or the image subjected to the synthesis processing.
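The synthesis and resolution reduction described above can be sketched minimally (pure-Python lists stand in for image data; the grid layout and non-overlapping equal-size tiles are assumptions):

```python
# Illustrative sketch: paste adjacent divided-area images at their grid
# offsets to form a wider image, then subsample it for a lower-resolution
# version. Tiles are assumed non-overlapping and of equal size.

def stitch(tiles, tile_size, cols, rows):
    w, h = tile_size * cols, tile_size * rows
    canvas = [[0] * w for _ in range(h)]
    for (cx, cy), tile in tiles.items():
        for ty in range(tile_size):
            for tx in range(tile_size):
                canvas[cy * tile_size + ty][cx * tile_size + tx] = tile[ty][tx]
    return canvas

def downsample(img, factor):
    """Naive lower-resolution image by subsampling every `factor` pixels."""
    return [row[::factor] for row in img[::factor]]

tiles = {(0, 0): [[1, 1], [1, 1]], (1, 0): [[2, 2], [2, 2]]}
whole = stitch(tiles, 2, 2, 1)
print(whole)                 # [[1, 1, 2, 2], [1, 1, 2, 2]]
print(downsample(whole, 2))  # [[1, 2]]
```

A production system would blend overlapping tile borders rather than paste; this sketch only shows the data flow from divided areas to a wider, then lower-resolution, image.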
- the microscope device 110 can first identify an imaging target region.
- the imaging target region may be specified so as to cover the entire region in which the biological sample S exists, or may be specified so as to cover a target portion of the biological sample S (a portion containing a target tissue section or target cells).
- the microscope apparatus 110 scans a partial region of the imaging target region (also referred to as a "divided region" or "divided scan region") in one direction (also referred to as the "scanning direction") within a plane perpendicular to the optical axis, and captures an image.
- when the scanning of one divided region is completed, the next divided region adjacent to it is scanned. These scanning operations are repeated until the entire imaging target region is imaged.
- FIG. 2(B) is an explanatory diagram of another example of the imaging method.
- the microscope device 110 identifies a region (gray portion) in which a tissue section exists in the biological sample S as an imaging target region Sa. Then, the microscope device 110 scans the divided area Rs in the imaging target area Sa in the Y-axis direction. After completing the scanning of the divided region Rs, the microscope device scans the next divided region in the X-axis direction. This operation is repeated until scanning is completed for the entire imaging target area Sa.
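The scan order described above (scan each strip along the Y axis, then step to the next strip along the X axis) can be sketched as:

```python
# Illustrative sketch of the line-scan order: each divided (strip) region
# is scanned along Y, then the scanner steps along X to the next strip.
# Strip/line counts and coordinates are assumptions for illustration.

def scan_order(n_strips, n_lines):
    order = []
    for x in range(n_strips):        # step strips along the X axis
        for y in range(n_lines):     # scan within a strip along the Y axis
            order.append((x, y))
    return order

print(scan_order(2, 3))
# [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)]
```

This contrasts with the tile-by-tile method of FIG. 2(A), where each divided region is a two-dimensional block rather than a strip.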
- the positional relationship between the microscope device 110 and the sample placement section 114 is adjusted for scanning each divided area and for imaging the next divided area after imaging a certain divided area.
- the adjustment may be performed by moving the microscope device 110, moving the sample placement unit, or moving both of them.
- the image capturing device that captures each divided area may be a one-dimensional image sensor (line sensor) or a two-dimensional image sensor (area sensor).
- the signal acquisition unit 113 may capture an image of each divided area via an enlarging optical system.
- the imaging of each divided region may be performed continuously while moving the microscope device 110 and/or the sample placement section 114 .
- the imaging target area may be divided so that the divided areas partially overlap each other, or the imaging target area may be divided so that the divided areas do not overlap.
- Each divided area may be imaged multiple times by changing imaging conditions such as focal length and/or exposure time.
- the information processing section 130 can generate image data of a wider area by synthesizing a plurality of adjacent divided areas. By performing the synthesizing process over the entire imaging target area, it is possible to obtain an image of a wider area of the imaging target area. Also, image data with lower resolution can be generated from the image of the divided area or the image subjected to the synthesis processing.
- FIG. 3 is a block diagram of the medical image analysis device 10 according to the embodiment of the present disclosure.
- the medical image analysis apparatus 10 is one aspect of the information processing section 130 in FIG. 1 and can be connected to the microscope apparatus 110 and the control section 120 .
- the operation of the medical image analysis apparatus 10 may be realized by a computer (including, for example, a processor such as a CPU (Central Processing Unit)). In this case, the operation of the present embodiment is realized by the CPU or the information processing unit 130 reading a program from a storage unit that stores the program and executing it.
- the storage unit may be a memory such as RAM or ROM, a magnetic recording medium, or an optical recording medium.
- the medical image analysis system 1 includes a medical image analysis device 10, an image database (DB) 20, an operation device 30, and a detection result database (DB) 40.
- the medical image analysis apparatus 10 includes a processing target region setting section 200 , a region setting section 300 , a tissue detection section 400 , an overlap processing section 500 and an output section 600 .
- the output section 600 has a pathological image display section 610 and a detection result display section 620 .
- the output unit 600 is an example of a display unit that displays images or text.
- the identification unit according to this embodiment includes a tissue detection unit 400 and an overlap processing unit 500 .
- the medical image analysis apparatus 10 executes an analysis application (hereinafter sometimes referred to as this application) used by the user of the medical image analysis apparatus 10.
- a user of the medical image analysis apparatus 10 is typically a doctor such as a pathologist, but the user is not limited to a doctor.
- the output unit 600 displays data read by this application and data generated by this application on a display (for example, a liquid crystal display device, an organic EL display device, etc.).
- the data includes image data, text data, and the like.
- although the display is included in the output unit 600 in this embodiment, the display may instead be connected to the medical image analysis apparatus 10 externally by wire or wirelessly.
- the output unit 600 has a function of performing wired or wireless communication, and the output unit 600 may transmit display data to the display.
- the medical image analysis apparatus 10 is connected by wire or wirelessly to the image database 20 (image DB 20) and the detection result database 40 (detection result DB 40).
- the medical image analysis apparatus 10 can read or acquire information from the image DB 20 and the detection result DB 40 . Also, the medical image analysis apparatus 10 can write or transmit information to the image DB 20 and the detection result DB 40 .
- the image DB 20 and the detection result DB 40 may be configured integrally.
- the medical image analysis apparatus 10 may be connected to the image DB 20 and the detection result DB 40 via a communication network such as the Internet or an intranet, or may be connected via a cable such as a USB cable.
- the image DB 20 and the detection result DB 40 may be included inside the medical image analysis device 10 as part of the medical image analysis device 10.
- the medical image analysis device 10 is connected to the operation device 30 by wire or wirelessly.
- the operating device 30 is operated by the user of the medical image analysis device 10 .
- a user inputs various instructions as input information to the medical image analysis apparatus 10 using the operation device 30 .
- the operating device 30 may be any device such as a keyboard, mouse, touch panel, voice input device, or gesture input device.
- the image DB 20 stores pathological images of one or more subjects.
- a pathological image is saved as, for example, a WSI (whole slide image) file.
- a pathological image is an image obtained by imaging a sample (biological sample S) collected from a subject.
- the image DB 20 may store information related to the case of the subject, such as clinical information of the subject.
- the image DB 20 is composed of, for example, a memory device, hard disk, optical recording medium, or magnetic recording medium.
- a pathological image is obtained by imaging the biological sample S with the signal obtaining unit 113 described above.
- any method, such as the method described with reference to FIG. 2(A) or FIG. 2(B), may be used for imaging the biological sample S. That is, a case is assumed in which the imaging target region of the biological sample S is divided into a plurality of regions (unit regions or divided regions), and the unit regions are imaged sequentially, adjusting the positional relationship between the microscope device 110 and the sample mounting portion 114 until no unimaged unit region remains. The image of each unit region (unit image) is stored in the image DB 20, and a set of a plurality of unit images corresponds to a pathological image (whole image).
- a unit image may be compressed by any compression method. The compression method may differ depending on the unit image.
- the pathological image display unit 610 displays, on the screen of this application, part or all of the pathological image specified by the user via the operation device 30.
- a screen that displays part or all of a pathological image is called a pathological image viewing screen.
- the medical image analysis apparatus 10 reads the pathological image specified by the user from the image DB 20 and displays it on the pathological image viewing screen within the window of this application.
- the WSI file is read out and decoded to expand the pathological image, and the pathological image is displayed on the pathological image browsing screen.
- the user may be able to change the magnification of the pathological image while viewing it. In this case, an image with the magnification specified by the user may be read out from the image DB 20 and displayed again.
- the processing target region setting unit 200 sets a region (processing target region) to be processed for the pathological image displayed on the pathological image viewing screen.
- the processing target region setting unit 200 may set the processing target region based on user instruction information. For example, an area surrounded by a rectangle or the like by the user using a mouse operation or the like may be set as the processing target area.
- alternatively, a predetermined range (for example, a certain range from the center of the display area, or the entire display area) may be automatically determined as the processing target area.
- the entire pathological image read from the image DB 20 may be set as the processing target area.
- FIG. 4 shows an example of a pathological image in which a processing target region is set (referred to as a processing target image).
- a processing target image 1001 is part or all of the pathological image read from the image DB 20 .
- a processing target image 1001 includes a large number of various types of tissues (micro-tissues such as cells).
- the region setting unit 300 sets target regions for the tissue detection processing in the processing target image 1001.
- An image to be processed usually has a large data size, and processing all the images to be processed at once requires a large memory capacity, which is not realistic. For this reason, the image to be processed is divided into a plurality of regions (called small regions), and tissue detection processing is performed for each small region.
- the area setting unit 300 sequentially selects each small area and sets a target area for the selected small area.
- the target area is an area that includes the entire small area plus a constant-width area (called a margin area) around it. The margin area therefore includes parts of the small areas adjacent to the selected small area.
- the width of the margin area is at least as large as the size of the tissue to be detected. This makes it possible to accurately detect information (for example, the region or shape) of tissue located on the boundary of the small region.
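The tiling described above (small regions of fixed size, each widened into a target area by a margin and clipped to the image) can be sketched as follows. This is a minimal illustration in Python; the tuple-based `(x0, y0, x1, y1)` region representation and the function names are assumptions for illustration, not part of the embodiment:

```python
def target_area(img_w, img_h, x0, y0, tile_w, tile_h, margin):
    """Return the target area (small region plus margin), clipped to the image."""
    return (max(0, x0 - margin), max(0, y0 - margin),
            min(img_w, x0 + tile_w + margin), min(img_h, y0 + tile_h + margin))

def iter_target_areas(img_w, img_h, tile_w, tile_h, margin):
    """Yield (small region, target area) pairs in raster order
    (left to right, top to bottom), as in FIG. 7."""
    for y0 in range(0, img_h, tile_h):
        for x0 in range(0, img_w, tile_w):
            tile = (x0, y0, min(img_w, x0 + tile_w), min(img_h, y0 + tile_h))
            yield tile, target_area(img_w, img_h, x0, y0, tile_w, tile_h, margin)
```

The margin argument would be chosen at least as large as the tissue to be detected, per the text above.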
- FIG. 5 shows an example in which the processing target image 1001 is divided into a plurality of small regions 1002, and the target region TA is set at the top (upper leftmost small region) of the plurality of small regions.
- the target area TA includes the leading small area 1002 and a margin area surrounding it. This margin area includes portions of the three adjacent small areas to the right of, below, and diagonally below right of the leading small area 1002.
- FIG. 6 is a diagram showing the margin area in detail.
- a target area TA covers the small area 1002 .
- the area outside the small area 1002 in the target area TA is the margin area MA.
- the tissue detection unit 400 performs tissue detection processing on the image included in the target area TA.
- the image of the target area TA includes the image of the small area 1002 in which the target area TA is set and the image portion of the surrounding small areas 1002 included in the margin area MA of the target area TA.
- the tissue detection process can be performed using a model such as a trained neural network, which receives an image and outputs information such as a tissue region included in the image.
- a classical method such as the watershed method or the region growing method may also be used.
- a technique using a general image segmentation technique may be used.
- the tissue detection unit 400 may delete tissue existing only in the margin area MA (tissue that does not straddle a boundary of the small area 1002) at this point. Since such tissue is detected from the adjacent small regions, deleting it here can reduce the processing of the overlap processing unit 500, which will be described later.
- the tissue detection unit 400 temporarily stores the information on the detected tissue in a storage unit such as a memory in association with the information on the target area TA.
- the information on the target area TA may be, for example, positional information on the target area TA, or may be positional information or identification information on a small area in which the target area TA is set.
- Information about the detected tissue includes information such as the region, position, shape, type, etc. of the detected tissue.
- the tissue detection unit 400 performs tissue detection processing every time the area setting unit 300 sets the target area TA in each small area. In this manner, the setting of the target area TA by the area setting unit 300 and the tissue detection processing by the tissue detection unit 400 are repeatedly performed.
- FIG. 7 shows an example in which setting of the target area TA by the area setting unit 300 and tissue detection processing by the tissue detection unit 400 are repeatedly performed.
- An example of the order of setting the target area TA will be described. The target area TA is first set for the upper leftmost small area 1002 of the processing target image 1001, and then the small area 1002 to its right is selected and the target area TA is set. After the target area TA is set for the rightmost small area 1002 of the first row, processing moves to the second row, where the target area TA is set for the first small area 1002, and the small areas 1002 are then selected sequentially to the right. The process is repeated in the same manner until the target area TA is finally set for the lower rightmost small area 1002.
- the target area TA set by the area setting unit 300 partially overlaps with the target area TA set in the adjacent small area.
- the margin area MA (assumed to be MA1) of the target area TA (assumed to be TA1) set in a certain small area (assumed to be small area JA1) includes a part of the adjacent small area (small area JA2), extending by the width of the margin area MA from the boundary between the small areas JA1 and JA2.
- likewise, the margin area MA (referred to as MA2) of the target area TA2 set in the adjacent small area JA2 includes a part of the small area JA1, extending by the width of the margin area MA2 from the boundary between the small areas JA2 and JA1. Therefore, in the tissue detection processing for the target area TA1 and for the target area TA2, the same tissue may be detected twice in the area where the target areas TA1 and TA2 overlap. That is, the areas of detected tissue may overlap.
- FIG. 8 is a diagram showing an area where the target area TA1 set for the small area JA1 and the target area TA2 set for the small area JA2 overlap.
- detection processing is performed in duplicate for the target area TA1 and the target area TA2, so tissue may also be detected in duplicate.
- the tissue area detected from the target area TA1 and the tissue area detected from the target area TA2 do not necessarily match for the same tissue. Differences may occur in the location, shape, area, etc. of the detected tissue.
- when the target area extends beyond the processing target image, the detection process may be performed after setting a predetermined pixel value indicating invalidity in the invalid (out-of-image) area.
- the target area TA1 corresponds to the first area, and the target area TA2 corresponds to the second area.
- the tissue area detected from the target area TA1 corresponds to the first tissue area
- the tissue area detected from the target area TA2 corresponds to the second tissue area.
- the third tissue region is set by performing the overlap elimination, described later, on the first tissue region and the second tissue region.
- the small area JA1 and the small area JA2 located to its right were used as an example, but the same processing is performed between the small area JA1 and the small areas located below and diagonally below right of it.
- the same processing can be performed for a total of eight small areas: upper, lower, left, right, diagonally upper right, diagonally lower right, diagonally upper left, and diagonally lower left.
- the overlap processing unit 500 identifies areas of tissue that are overlapped and detected between target areas set as small areas adjacent to each other.
- the overlap processing unit 500 performs a process of canceling or removing overlap between tissue regions, and sets a tissue region that represents the overlapping tissue regions. That is, the first tissue region and the second tissue region that at least partially overlap each other are processed to set a third tissue region (representative tissue region) representing the first tissue region and the second tissue region.
- the number of overlapping tissue regions may be two or more.
- Overlap takes various forms, such as two or more tissue regions overlapping one tissue region, or three or more tissue regions overlapping in a chain. Tissue regions whose boundaries merely touch may be treated either as non-overlapping or as overlapping.
- In FIG. 9(A), two tissue regions (tissue regions 1 and 2) overlap each other.
- In FIG. 9(B), two tissue regions (tissue regions 1 and 2) overlap one tissue region (tissue region 3).
- In FIG. 9(C), three tissue regions (tissue regions 1, 2, and 3) overlap in a chain. Forms of overlap other than those shown in FIG. 9 are possible.
- as a process of eliminating overlap between tissue regions, the overlap processing unit 500 may, for example, select one of the tissue regions as the representative tissue region. For example, the largest or smallest tissue region is selected as the representative tissue region. A tissue region of intermediate size may also be used.
- FIG. 10(A) shows an example of selecting the larger tissue region 1 from the overlapping tissue regions 1 and 2 in FIG. 9(A) and using it as a representative tissue region.
- alternatively, the overlapping tissue regions may be merged and the merged region designated as the representative tissue region. The merge may be, for example, the logical sum (union) of the overlapping tissue regions.
- In FIG. 10(B), the overlapping tissue regions 1 and 2 in FIG. 9(A) are merged, and the merged region (tissue regions 1 & 2) is used as the representative tissue region.
- alternatively, the overlapping tissue regions may be divided into multiple regions, and each divided region designated as a representative tissue region.
- the overlapping tissue regions may also be merged and the merged region divided into a plurality of regions.
- the ratio of the overlapping area of the tissue regions to the merged region is calculated; if the ratio is less than a threshold, the merged region is divided and each divided region is used as a representative tissue region.
- if the ratio is equal to or greater than the threshold, the merged region may be used as the representative tissue region. This method is based on the idea that when the ratio is low, the regions likely belong to distinct tissues that were reported as overlapping only because of the limited accuracy of the tissue detection algorithm.
- FIG. 10(C) shows an example in which the overlapping tissue regions 1 and 2 in FIG. 9(A) are merged (see FIG. 10(B)) and the merged region is divided in two. Each of the divided regions D1 and D2 is used as a representative tissue region.
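The three overlap-elimination strategies above can be sketched as follows, representing each tissue region as a set of (x, y) pixel coordinates. This representation and the function names are illustrative assumptions; the split in the third strategy (assigning shared pixels to the larger claiming region) is one possible way to divide the merged region, not the one prescribed by the embodiment:

```python
def rep_largest(regions):
    """Strategy 1: keep the largest overlapping region as the representative."""
    return max(regions, key=len)

def rep_union(regions):
    """Strategy 2: merge (take the logical sum of) the overlapping regions."""
    return set().union(*regions)

def rep_split_or_union(regions, threshold=0.5):
    """Strategy 3: split the merged region when the overlap ratio is small."""
    merged = rep_union(regions)
    overlap = sum(len(r) for r in regions) - len(merged)
    if overlap / len(merged) >= threshold:
        return [merged]                     # likely the same tissue: merge
    # low overlap ratio: likely distinct tissues, so split the merged region,
    # giving each shared pixel to the largest region that claims it
    out, taken = [], set()
    for r in sorted(regions, key=len, reverse=True):
        out.append(r - taken)
        taken |= r
    return out
```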
- after the overlap processing unit 500 has performed the overlap elimination of tissue regions between all mutually overlapping (adjacent) target areas, it generates detection result data from the tissue region detection results for each target area and the results of the overlap elimination processing (the representative tissue regions set in place of the overlapping tissue regions).
- the detection result data includes, for example, information on the tissue regions determined not to overlap among the tissue regions specified in each target region, and information on the representative tissue regions generated by the overlap elimination process.
- the information on the tissue region includes the tissue type, shape, position, size, and other feature amounts (density of tissue, positional relationship of tissue for each type of tissue, etc.) in the tissue region.
- the detection result data may include an image (detection result image) including each identified tissue region (including the representative tissue region). That is, the detection result image is an image in which each tissue region is arranged (see FIG. 11 described later).
- the detection result data may be data in which tissue information (for example, tissue region information) detected from each small region is associated with an image of each small region.
- the detection result data may be data that associates tissue information (for example, tissue region information) detected from each unit region with the image of that unit region (unit image).
- the detection result data may contain first data in which the region information of tissue detected from a first small region (or first unit region) is associated with the image of the first small region (or first unit region).
- the detection result data may contain second data in which the region information of tissue detected from a second small region (or second unit region) is associated with the image of the second small region (or second unit region).
- the detection result data may further contain third data in which the tissue information of the representative tissue region is associated with at least one of the image of the first small region (or first unit region) and the image of the second small region (or the second unit region adjacent to the first unit region).
- Information on the tissue in the representative tissue region may be determined from the information on the source tissues from which it was generated. For example, the type of the representative tissue region may be set to the type of one of its source tissues, for example the most frequent type among them.
- when a tissue region straddles the boundary between two small regions, its region information may be associated with both of the two small regions, or the region information may be divided in two and each part associated with the corresponding small region.
- alternatively, it may be determined to which small region a tissue (including a representative tissue region) spanning a small region boundary belongs, and the tissue treated as contained in that small region. For example, the fraction of the tissue region's area falling in each of the two small regions may be calculated, and the tissue assigned to the small region containing the larger fraction.
- alternatively, the centroid of the tissue region may be calculated and the tissue treated as contained in the small region to which the centroid belongs. Other methods of determining the small region to which a tissue belongs may also be used.
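Both assignment criteria (area fraction and centroid) can be sketched as follows, with tissue regions as sets of (x, y) pixels and small regions as `(x0, y0, x1, y1)` tuples. The names and representations are illustrative assumptions:

```python
def assign_by_area(region, tile_a, tile_b):
    """Assign a boundary-straddling tissue region to whichever of the two
    small regions contains the larger fraction of its area."""
    def inside(px, tile):
        x0, y0, x1, y1 = tile
        return x0 <= px[0] < x1 and y0 <= px[1] < y1
    in_a = sum(1 for p in region if inside(p, tile_a))
    in_b = sum(1 for p in region if inside(p, tile_b))
    return tile_a if in_a >= in_b else tile_b

def assign_by_centroid(region, tiles):
    """Assign a tissue region to the small region containing its centroid."""
    cx = sum(p[0] for p in region) / len(region)
    cy = sum(p[1] for p in region) / len(region)
    for tile in tiles:
        x0, y0, x1, y1 = tile
        if x0 <= cx < x1 and y0 <= cy < y1:
            return tile
    return None  # centroid outside all given small regions
```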
- the detection result DB 40 stores the detection result data generated by the overlap processing unit 500.
- the detection result DB 40 may associate the detection result data with information about the subject from whom the biological sample S was collected.
- the detection result DB 40 is composed of, for example, a memory device, a hard disk, an optical recording medium, or a magnetic recording medium.
- the detection result display unit 620 displays the detection result data generated by the overlap processing unit 500.
- FIG. 11 shows an example of a detection result image displayed by the detection result display unit 620.
- the detection result image in FIG. 11 is obtained by processing the image of the processing target area in FIG. 4 according to the present embodiment. A large number of tissues have been detected, and tissue present on the boundaries of the small regions is also detected with the correct shape (e.g., without a single tissue region being cut off at a small region boundary).
- a line indicating a small area may be superimposed on the detection result image displayed on the detection result display unit 620 based on the user's instruction information. Data indicating statistical information and analysis results for each type of tissue may also be displayed. In addition, different line types or different colors may be used for display for each type of tissue. In the example of FIG. 11, each type of tissue is displayed with a different color or different brightness.
- the detection result image may be displayed superimposed on the image to be processed (see FIG. 4).
- a detection result image may be superimposed and displayed only in a small area (or unit area) specified by the user in the image to be processed.
- the magnification of the detection result screen may also be changed.
- FIG. 12 is a flow chart of an example of the operation of the medical image analysis apparatus 10 according to this embodiment.
- the pathological image display unit 610 reads the pathological image selected by the user from the image DB 20, decodes it, and displays it on the application screen (pathological image viewing screen) (S101).
- the processing target region setting unit 200 sets a processing target region for the pathological image displayed on the pathological image viewing screen (S101).
- the area setting unit 300 divides the processing target area into a plurality of areas (small areas) (S102).
- the area setting unit 300 selects a small area and sets a target area for the selected small area (S103).
- the target area includes the entire small area and has a constant width area (margin area) around the small area.
- the margin has a width at least greater than the size of tissue to be detected.
- the area setting unit 300 acquires an image of the set target area (S103).
- the tissue detection unit 400 performs tissue detection processing from the image of the target region, and identifies the detected tissue region (S104).
- Steps S103 and S104 are repeated until target area setting and tissue detection processing are performed for all small areas (NO in S105).
- the overlap processing unit 500 selects a set of target regions that partially overlap each other (S106).
- the overlap processing unit 500 performs overlap detection processing on the selected pair (S107). Specifically, the overlap processing unit 500 detects groups of tissue regions that at least partially overlap in the region where the target regions overlap. Then, the overlap processing unit 500 performs processing (overlap elimination processing) for eliminating overlap within the detected tissue region groups (S107).
- the detected tissue region group is processed to set a representative tissue region representing the tissue region group. In the overlap elimination process, for example, the largest or smallest tissue region in the detected tissue region group is set as the representative tissue region.
- alternatively, a merged region obtained by merging the group of tissue regions is set as the representative tissue region.
- alternatively, the merged tissue region is divided into a plurality of regions, and each divided region is set as a representative tissue region.
- the overlap processing unit 500 repeats steps S106 and S107 until it selects all sets of target areas that partially overlap each other and performs overlap detection processing and overlap elimination processing (NO in S108).
- when the overlap processing unit 500 has performed the overlap detection processing and overlap elimination processing for all pairs, it generates detection result data based on the tissue information (region information, etc.) detected from the image of each target region and the representative tissue information generated by the overlap elimination processing (S109). For example, an image showing the tissue regions determined not to overlap among the tissue regions identified from each target region, together with the representative tissue regions set by the overlap elimination processing in place of the overlapping tissue regions, is generated as the detection result image.
- the overlap processing unit 500 stores the detection result data in the detection result DB 40 (S109). The overlap processing unit 500 also displays the detection result data on the detection result display unit 620 (S109).
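The flow of steps S103 through S109 can be sketched end to end as follows. The `detect` callback standing in for the tissue detection unit 400 and the pixel-set region representation are assumptions for illustration; the overlap elimination here simply keeps the largest of any overlapping regions, which is one of the strategies described above:

```python
def analyze(img_w, img_h, tile_w, tile_h, margin, detect):
    """Sketch of steps S103-S109: tile the processing target area, detect
    tissue per target area, then resolve overlaps between detections."""
    # S103-S105: set a target area (small region + margin) and detect in it
    detections = []
    for y0 in range(0, img_h, tile_h):
        for x0 in range(0, img_w, tile_w):
            ta = (max(0, x0 - margin), max(0, y0 - margin),
                  min(img_w, x0 + tile_w + margin),
                  min(img_h, y0 + tile_h + margin))
            detections.extend(detect(ta))  # each detection: a set of pixels
    # S106-S108: overlap elimination -- keep the largest overlapping region
    result = []
    for region in detections:
        for i, kept in enumerate(result):
            if kept & region:              # regions at least partially overlap
                result[i] = kept if len(kept) >= len(region) else region
                break
        else:
            result.append(region)
    # S109: the surviving (representative) regions form the detection result
    return result
```

A tissue straddling a small region boundary is then detected once by each of the two adjacent target areas, and only one representative copy survives.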
- in the above, the tissue region overlap detection processing (S106 onward) was performed after the target area setting and tissue detection processing had been completed for all small areas. Alternatively, the overlap detection and overlap elimination processing of tissue regions may be performed in parallel with the target area setting and tissue detection processing.
- resources can be used effectively when, for example, the target area setting and tissue detection processing are performed on a GPU (Graphics Processing Unit), as in machine learning, while the overlap detection and overlap elimination processing are performed on a CPU (Central Processing Unit).
- the processing target region is divided into a plurality of small regions serving as tissue detection target units, and a target region including a margin region surrounding the small region is set for each small region.
- tissue detection processing is performed for each target region.
- tissue regions located on the boundaries of small regions can also be specified with high accuracy.
- overlapping tissue regions are detected in the overlapping regions between target regions, but the overlap elimination processing (for example, setting a tissue region that represents a group of overlapping tissue regions in place of the group) also eliminates the problem of the same tissue being detected more than once.
- Modification 1 shows a method by which the overlap processing unit 500 determines the presence or absence of overlap between tissue regions in a simplified manner. This reduces the amount of computation of the overlap determination processing and speeds it up. A specific example is described below.
- FIG. 13(A) shows tissue regions 1 to 7 detected by the overlap processing section 500 from the overlapping region between two target regions.
- the overlap processing unit 500 calculates the circumscribed shape of each of the tissue regions 1-7.
- Various shapes such as a rectangle and a circle are possible for the circumscribed shape.
- an example of a circumscribing rectangle is shown here, it is not limited to a rectangle.
- the overlap processing unit 500 determines whether or not the circumscribed rectangles overlap between the tissue regions 1 to 7 (simple hit determination). Simple hit determination can be performed by determining whether circumscribing rectangles include the same coordinates. Since the object of calculation is a rectangle, the amount of calculation is small, and determination can be made at high speed.
- the leftmost part of the upper part of FIG. 14 is a table showing the results of the simple hit determination.
- a triangular mark is stored between tissue regions where the circumscribing rectangles overlap.
- the bounding rectangle of tissue region 1 partially overlaps the bounding rectangles of tissue regions 2 , 4 and 7 .
- the bounding rectangle of tissue region 2 partially overlaps the bounding rectangle of tissue region 5 .
- the bounding rectangle of tissue region 3 partially overlaps the bounding rectangles of tissue regions 4 and 7 .
- the bounding rectangle of tissue region 5 partially overlaps the bounding rectangle of tissue region 6 .
- tissue regions 1 to 7 are processed in order, and the same pair of tissue regions is not tested twice. For example, since the pair with tissue region 2 has already been examined while processing tissue region 1, tissue region 1 is not examined again while processing tissue region 2. That is, the lower-left half of the table is invalidated. This omits the mirrored calculations and saves computation.
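The simple hit determination on circumscribed rectangles, restricted to the upper-left half of the pair table, can be sketched as follows (regions again as sets of (x, y) pixels; names are illustrative assumptions):

```python
def bbox(region):
    """Circumscribed (axis-aligned) rectangle of a tissue region."""
    xs = [p[0] for p in region]
    ys = [p[1] for p in region]
    return min(xs), min(ys), max(xs), max(ys)

def bboxes_overlap(a, b):
    """True if two circumscribed rectangles share at least one coordinate."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1

def simple_hits(regions):
    """Pairwise rectangle test over the upper triangle only, so each pair is
    examined once (the mirrored lower-left half is skipped)."""
    boxes = [bbox(r) for r in regions]
    return [(i, j) for i in range(len(regions))
            for j in range(i + 1, len(regions))
            if bboxes_overlap(boxes[i], boxes[j])]
```

Only the pairs surviving this cheap test would go on to the more expensive contour-level hit determination described next.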
- the overlap processing unit 500 performs hit determination on the outline (boundary) of the tissue for the tissue group determined to overlap in the simple hit determination (hit determination on the outline). That is, it is determined whether or not the tissues belonging to the tissue group overlap each other based on whether or not the tissues belonging to the tissue group contain the same coordinates.
- Fig. 14 shows the results of the hit determination on contours. Circle marks are stored between tissues that overlap each other. Tissue region 1 partially overlaps tissue region 4, but does not overlap tissue regions 2 and 7. Tissue region 2 partially overlaps tissue region 5. Tissue region 3 overlaps neither tissue region 4 nor tissue region 7. Tissue region 5 partially overlaps tissue region 6.
- the third table from the left in the upper part of FIG. 14 shows the contour hit determination result with the lower-left half of the table filled in (bidirectional information).
- the overlap processing unit 500 identifies groups of tissue regions that overlap each other from the bidirectional information in FIG. 14. Four tissue region groups are identified in this example: a group of tissue regions 1 and 4, a group of tissue regions 2, 5, and 6, a group including only tissue region 3, and a group including only tissue region 7. As an example, as shown on the left side of the lower part of FIG. 14, a linked list may be generated for each group.
- the overlap processing unit 500 sets a representative tissue region for each group (or each linked list) using a method described in the above embodiment.
- for example, the largest tissue region in each group is taken as the representative tissue region. Since tissue region 1 is the largest in the group of tissue regions 1 and 4, tissue region 1 becomes the representative tissue region. Since tissue region 6 is the largest in the group of tissue regions 2, 5, and 6, tissue region 6 becomes the representative tissue region. In the group containing only tissue region 3, tissue region 3 becomes the representative tissue region, and in the group containing only tissue region 7, tissue region 7 becomes the representative tissue region.
- the overlap processing unit 500 holds information on each representative tissue region.
- FIG. 13(C) shows an example of the representative tissue regions generated from each group (each linked list). Tissue regions 1, 6, 3, and 7 are selected as representative tissue regions.
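The grouping of mutually overlapping regions and the selection of the largest region per group can be sketched as follows. A union-find structure is used here in place of the linked lists of FIG. 14 (an implementation choice, not prescribed by the embodiment); the pixel-set representation and names are illustrative assumptions:

```python
def group_regions(regions):
    """Group mutually (including transitively) overlapping regions,
    using union-find over pairs that share at least one pixel."""
    parent = list(range(len(regions)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i
    for i in range(len(regions)):
        for j in range(i + 1, len(regions)):
            if regions[i] & regions[j]:    # contour-level hit: shared pixels
                parent[find(i)] = find(j)
    groups = {}
    for i in range(len(regions)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

def representatives(regions):
    """Index of the largest region of each group (its representative)."""
    return [max(g, key=lambda i: len(regions[i])) for g in group_regions(regions)]
```

In practice the pairwise loop would run only over the candidate pairs produced by the simple hit determination on circumscribed rectangles.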
- in this way, overlapping tissue region groups can be detected with a small amount of computation. If the contour hit determination were performed for all pairs of the tissue regions 1 to 7, the amount of computation would be large; narrowing the candidates down first with the simple hit determination on circumscribed rectangles reduces it.
- in Modification 2, when setting the target region, the target region may be set in units of the regions (unit regions) in which the biological sample S was imaged. As a result, the boundaries of the target area coincide with the boundaries of the imaged unit areas (the margin area of the target area is composed of unit areas), so the image of the target area can be obtained easily: the image can be decoded in units of unit areas, and the image of the target area can therefore be acquired at high speed with low memory consumption.
- FIG. 15 is a diagram showing a specific example of modification 2.
- the small area JA1 includes 24 unit areas UA, and the small area JA2 similarly includes 24 unit areas UA. Although only two small areas are shown in the figure, there are actually small areas below each of the small areas JA1 and JA2 and also to the right of the small area JA2.
- the unit area UA is an area that becomes an imaging unit when the imaging unit (signal acquisition unit 113) images the biological sample S.
- the left and right widths MW1 of the margin area MA1 are the same size as the width UW1 of the unit area.
- the vertical width MW2 above and below the margin area MA1 is the same size as the vertical width UW2 of the unit area. Therefore, the image of the target area TA1 can be acquired by reading and decoding the image (unit image) of the unit area UA included in the target area TA1.
- If the margin area were not configured in units of unit areas, then in order to obtain the image of the margin area included in the target area TA1, the portion of the small area JA2 overlapping the target area TA1 would have to be identified and its image acquired. The portions overlapping the target area TA1 would likewise have to be identified for the small areas (not shown) adjacent to the small area JA1 in the downward and lower-right directions, and their images acquired. The process of identifying these overlapping portions requires a large memory capacity and delays processing. In this modification, by contrast, only the previously captured images (unit images) of the unit areas need to be read out, so the image of the target area TA1 can be obtained quickly with a small memory capacity.
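A minimal sketch of why unit-aligned target regions are cheap to assemble: when the target boundaries coincide with tile boundaries, the tiles to decode can be listed by integer division alone, with no sub-tile cropping. Tile sizes and names below are assumptions, not taken from the document.

```python
# Hedged sketch: for a tile-aligned target region, the unit images to
# read and decode are enumerated by integer division; no overlap
# computation or sub-tile cropping is needed.

def tiles_for_target(target, tile_w, tile_h):
    """target: (x0, y0, x1, y1) in pixels, aligned to tile boundaries.
    Returns the (col, row) indices of every unit image to decode."""
    x0, y0, x1, y1 = target
    assert x0 % tile_w == 0 and x1 % tile_w == 0, "target must be tile-aligned"
    assert y0 % tile_h == 0 and y1 % tile_h == 0, "target must be tile-aligned"
    return [(c, r)
            for r in range(y0 // tile_h, y1 // tile_h)
            for c in range(x0 // tile_w, x1 // tile_w)]
```

Only these tiles need to be fetched and decoded to reconstruct the target-region image.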
- The image of each unit area may be compressed with different compression parameters, such as the image quality value (JPEG quality value), for each unit area.
- For example, an image with many cells may be compressed with high-quality compression parameters, while an image with few or no cells may be compressed with low-quality compression parameters.
- When a machine learning model is used for tissue detection, models corresponding to the compression quality may be prepared, and the model (tissue detection method) to be used may be switched for each small region. For example, when only a single unit area exists in the small region, the model corresponding to the compression parameters of the image of that unit area (unit image) is used.
- When the small region contains unit images with different compression parameters, the model corresponding to the compression parameter used by the largest number of unit images, or the model corresponding to the highest-quality compression parameter, is used.
- Alternatively, detection parameters may be switched instead of switching models. Switching models or switching detection parameters is an example of switching the tissue detection method.
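The per-region model selection described above might look like the following sketch. The quality values, model names, and the `prefer` switch are illustrative assumptions; the document only states that the majority-quality or the highest-quality model is chosen.

```python
# Illustrative sketch of switching the detection method per small
# region based on the JPEG quality of its unit images: with uniform
# quality, use the matching model; with mixed qualities, fall back to
# the most common (or the highest) quality. Model names are placeholders.
from collections import Counter

def pick_model(tile_qualities, models, prefer="majority"):
    """tile_qualities: JPEG quality values of the unit images in a
    small region. models: dict mapping quality -> detector."""
    if len(set(tile_qualities)) == 1:
        return models[tile_qualities[0]]
    if prefer == "majority":
        quality = Counter(tile_qualities).most_common(1)[0][0]
    else:  # prefer the model trained for the highest quality present
        quality = max(tile_qualities)
    return models[quality]
```

Switching detection parameters instead of models would replace the `models` dict with a dict of parameter sets, selected the same way.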
- As described in the above embodiment, when the region of the biological sample S is divided and imaged, the regions to be imaged may partially overlap (that is, the divided regions (unit regions) may be imaged with a margin added).
- In this modification, an imaging region is set in the biological sample S with the same imaging size as the target region, and imaging is performed. At this time, each imaging region is made to partially overlap the next imaging region to be set, by the same size as the margin area included in the target region.
- In this way, a first imaging region is set in the tissue derived from a living body and imaged; a second imaging region is then set so as to partially overlap the first imaging region, and the second imaging region is imaged.
- The image captured from the first imaging region is used as the image of the first target region, and the image captured from the second imaging region is used as the image of the second target region. Similar processing is repeated thereafter.
- Such an operation makes it possible to acquire images of the target regions easily before the imaging of the biological sample S is complete, so the processing of the above-described embodiment can start soon after imaging begins. That is, the tissue detection process (S104) and the overlap detection/overlap elimination process (S107) can be performed while the biological sample S is being imaged. This makes it easy to hide the latency of imaging, image processing, slide position movement, and the like, and can reduce the overall processing time.
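The overlapped scan described in this modification can be illustrated with a one-dimensional sketch: each imaging region has the size of a target region and is advanced by (region size − margin), so consecutive regions overlap by exactly the margin. Names and coordinates are illustrative; two-dimensional scanning applies the same rule per axis.

```python
# Minimal sketch of the overlapped scan: imaging regions of width
# region_w are placed so that consecutive regions overlap by margin_w.
# Each captured region can be used directly as a target-region image.

def scan_regions(sample_w, region_w, margin_w):
    """Yield the 1-D x-extents of successive imaging regions that
    overlap by margin_w."""
    step = region_w - margin_w
    regions = []
    x = 0
    while x + region_w <= sample_w:
        regions.append((x, x + region_w))
        x += step
    return regions
```

Because each region is available as soon as it is captured, tissue detection can begin on region (0, 400) while (300, 700) is still being scanned.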
- the imaging unit can extract images of divided regions (unit regions) from an image captured from an imaging region including a margin, and acquire a set of extracted images as a pathological image or an image to be processed.
- The acquired pathological image or image to be processed is sent to and stored in the image DB 20.
- FIG. 16 is a diagram showing a specific example of modification 3.
- the imaging unit (signal acquisition unit 113) divides and scans the biological sample S (part of the biological sample S is shown in the drawing).
- The region containing the biological sample S is divided by the size of the divided scans into divided regions (unit regions) Rs1, Rs2, Rs3, and so on. A margin area surrounding each divided region Rs is added to it, and the divided region with the margin area added is used as an imaging region (referred to as a scan region with margin, Ts) for imaging.
- a scan region Ts1 with a margin including the divided region Rs1 is set as an imaging region, and imaging is performed.
- a scan area Ts2 with a margin including the divided area Rs2 is set as an imaging area, and imaging is performed. Scanning continues in the same way.
- The image captured in each scan region with margin is sent to the tissue detection unit 400.
- the image of the scan area with the margin sequentially sent from the signal acquisition unit 113 is used as the image of the target area as it is, and the processing of the above-described embodiment is performed.
- the image of the portion corresponding to the divided area in the image of the scan area with margins corresponds to the image of the small area.
- the image of the portion other than the divided area in the image of the scan area with margins corresponds to the image of the margin area in the target area.
- the image DB 20 may extract and store images of divided areas (unit areas) of the scan area with margins.
- a pathological image can be obtained from the set of extracted images. In this way, acquisition of a pathological image and the above-described processing of the present embodiment (tissue detection processing, overlap detection, and overlap elimination processing) can be performed at the same time.
- the divided scan method in Modification 3 may be any of the methods shown in FIGS. 2(A) and 2(B) shown in the above embodiment.
- The present disclosure can also take the following configurations.
- [Item 1] A medical image analysis device comprising: a region setting unit that sets, in an image to be processed obtained by imaging a tissue derived from a living body, a first region and a second region partially overlapping the first region; and an identifying unit that identifies a first tissue region, which is a tissue region included in the first region, and a second tissue region, which is a tissue region included in the second region, wherein the identifying unit comprises an overlap processing unit that performs processing on the first tissue region and the second tissue region at least partially overlapping the first tissue region, and sets a third tissue region.
- [Item 2] The medical image analysis device according to item 1, wherein the overlap processing unit selects one of the at least partially overlapping first tissue region and second tissue region, and takes the selected tissue region as the third tissue region.
- [Item 3] The medical image analysis device according to item 2, wherein the overlap processing unit selects one of the first tissue region and the second tissue region based on the sizes of the first tissue region and the second tissue region.
- [Item 4] The medical image analysis device according to any one of items 1 to 3, wherein the overlap processing unit merges the at least partially overlapping first tissue region and second tissue region into the third tissue region.
- [Item 5] The medical image analysis device according to any one of items 1 to 4, wherein the overlap processing unit divides the at least partially overlapping first tissue region and second tissue region into a plurality of tissue regions, and takes the divided tissue regions as the third tissue region.
- [Item 6] The medical image analysis device according to any one of items 1 to 5, wherein the overlap processing unit generates a detection result image showing the first tissue region, the second tissue region, and the third tissue region.
- [Item 7] The medical image analysis device according to any one of items 1 to 6, wherein the overlap processing unit calculates a first circumscribed shape circumscribing the first tissue region and a second circumscribed shape circumscribing the second tissue region, determines whether the first circumscribed shape and the second circumscribed shape at least partially overlap, and determines, when the first circumscribed shape and the second circumscribed shape do not at least partially overlap, that the first tissue region included in the first circumscribed shape and the second tissue region included in the second circumscribed shape do not overlap.
- [Item 8] The medical image analysis device according to item 7, wherein, when the first circumscribed shape and the second circumscribed shape at least partially overlap, the overlap processing unit determines whether the first tissue region included in the first circumscribed shape and the second tissue region included in the second circumscribed shape at least partially overlap.
- [Item 9] The medical image analysis device according to item 7 or 8, wherein the first circumscribed shape and the second circumscribed shape are rectangles.
- [Item 10] The medical image analysis device according to any one of items 1 to 9, wherein the image to be processed is divided into a plurality of unit regions, the image to be processed includes a plurality of unit images corresponding to the plurality of unit regions, and the region setting unit sets the first region and the second region in units of the unit regions.
- [Item 11] The medical image analysis device according to item 10, wherein the overlap processing unit generates first data associating information on the first tissue region with the unit image that includes the first tissue region, second data associating information on the second tissue region with the unit image that includes the second tissue region, and third data associating information on the third tissue region with the unit image that includes the third tissue region.
- [Item 12] The medical image analysis device according to item 10 or 11, wherein the identification includes performing processing for identifying the first tissue region according to compression parameters of each unit image included in the first region, and performing processing for identifying the second tissue region according to compression parameters of each unit image included in the second region.
- [Item 13] The medical image analysis device according to any one of items 1 to 12, comprising an imaging unit that sets a first imaging region in the tissue derived from a living body with the same imaging size as the first region or the second region, images the first imaging region, sets a next, second imaging region so as to partially overlap the first imaging region, and images the second imaging region, wherein the image of the first region is an image captured from the first imaging region, and the image of the second region is an image captured from the second imaging region.
- [Item 14] The medical image analysis device according to item 6, comprising a display unit that displays the detection result image.
- [Item 15] The medical image analysis device according to any one of items 1 to 14, comprising: a display unit that displays an image containing the tissue derived from a living body; and a processing target region setting unit that receives instruction information specifying a region in the image displayed on the display unit, and sets the image included in the region specified by the instruction information as the image to be processed.
- [Item 16] A medical image analysis method comprising: setting, in an image to be processed obtained by imaging a tissue derived from a living body, a first region and a second region partially overlapping the first region; identifying a first tissue region, which is a tissue region included in the first region, and a second tissue region, which is a tissue region included in the second region; and performing processing on the first tissue region and the second tissue region at least partially overlapping the first tissue region, to set a third tissue region.
- [Item 17] A medical image analysis system comprising: an imaging unit that images a tissue derived from a living body and acquires an image to be processed; and an information processing unit, wherein the information processing unit sets, in the image to be processed, a first region and a second region partially overlapping the first region, identifies a first tissue region, which is a tissue region included in the first region, and a second tissue region, which is a tissue region included in the second region, and performs processing on the first tissue region and the second tissue region at least partially overlapping the first tissue region, to set a third tissue region.
- [Item 18] The medical image analysis system according to item 17, wherein the information processing unit executes a program to perform the setting of the first region and the second region, the processing of identifying the first tissue region and the second tissue region, and the processing of setting the third tissue region.
- [Item 19] The medical image analysis system according to item 18, comprising a storage unit that stores the program, wherein the information processing unit reads the program from the storage unit and executes it.
- medical image analysis system 10 medical image analysis device 20 image database 30 operation device 40 detection result database 100 microscope system 110 microscope device 111 light irradiation unit 112 optical unit 113 signal acquisition unit 114 sample placement unit 120 control unit (control device) 130 information processing unit (information processing device) 200 processing target region setting unit 300 region setting unit 400 tissue detection unit (identifying unit) 500 duplication processing unit (identifying unit) 600 output unit (display unit) 610 pathological image display unit 620 detection result display unit 1001 image to be processed 1002 small region
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Quality & Reliability (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
Description
The microscope system 100 shown in FIG. 1 includes a microscope device 110, a control unit (control device) 120, and an information processing unit (information processing device) 130. The microscope device 110 includes a light irradiation unit 111, an optical unit 112, and a signal acquisition unit 113. The microscope device 110 may further include a sample placement unit 114 on which the biological sample S is placed. The configuration of the microscope device 110 is not limited to that shown in FIG. 1; for example, the light irradiation unit 111 may exist outside the microscope device 110, and a light source not included in the microscope device 110 may be used as the light irradiation unit 111. The light irradiation unit 111 may be arranged so that the sample placement unit 114 is sandwiched between the light irradiation unit 111 and the optical unit 112, and may, for example, be arranged on the side where the optical unit 112 is located. The microscope device 110 may be configured for one or more of bright-field observation, phase-contrast observation, differential interference contrast observation, polarization observation, fluorescence observation, and dark-field observation.
The biological sample S may be a sample containing biological components. The biological components may be tissues or cells of a living body, liquid components of a living body (such as blood and urine), cultures, or living cells (such as cardiomyocytes, nerve cells, and fertilized eggs).
The light irradiation unit 111 comprises a light source for illuminating the biological sample S and an optical section that guides the light emitted from the light source to the specimen. The light source can irradiate the biological sample with visible light, ultraviolet light, infrared light, or a combination thereof. The light source may be one or more of a halogen lamp, a laser light source, an LED lamp, a mercury lamp, and a xenon lamp. The type and/or wavelength of the light source in fluorescence observation may be plural, and may be appropriately selected by those skilled in the art. The light irradiation unit may have a transmissive, reflective, or epi-illumination (coaxial epi-illumination or side illumination) configuration.
The optical unit 112 is configured to guide the light from the biological sample S to the signal acquisition unit 113. The optical unit 112 can be configured to enable the microscope device 110 to observe or image the biological sample S.
The signal acquisition unit 113 can be configured to receive the light from the biological sample S and convert the light into electrical signals, in particular digital electrical signals. The signal acquisition unit 113 may be configured to acquire data on the biological sample S based on the electrical signals. The signal acquisition unit 113 may be configured to acquire data of an image of the biological sample S (an image, in particular a still image, a time-lapse image, or a moving image), and in particular may be configured to acquire data of an image magnified by the optical unit 112. The signal acquisition unit 113 has an imaging apparatus including one or more imaging elements, such as CMOS or CCD sensors, each having a plurality of pixels arranged one-dimensionally or two-dimensionally. The signal acquisition unit 113 may include an imaging element for acquiring low-resolution images and an imaging element for acquiring high-resolution images, or may include a sensing imaging element for AF and the like and an image-output imaging element for observation and the like. In addition to the plurality of pixels, the imaging element may include a signal processing unit (including one, two, or three of a CPU, a DSP, and a memory) that performs signal processing using the pixel signals from each pixel, and an output control unit that controls the output of the image data generated from the pixel signals and of the processed data generated by the signal processing unit. Furthermore, the imaging element may include an asynchronous event detection sensor that detects, as an event, that a change in luminance of a pixel that photoelectrically converts incident light exceeds a predetermined threshold. The imaging element including the plurality of pixels, the signal processing unit, and the output control unit can preferably be configured as a one-chip semiconductor device.
The control unit 120 controls imaging by the microscope device 110. For imaging control, the control unit 120 can drive the movement of the optical unit 112 and/or the sample placement unit 114 to adjust the positional relationship between the optical unit 112 and the sample placement unit. The control unit 120 can move the optical unit and/or the sample placement unit toward or away from each other (for example, along the optical axis of the objective lens). The control unit may also move the optical unit and/or the sample placement unit in any direction in a plane perpendicular to the optical axis. For imaging control, the control unit may control the light irradiation unit 111 and/or the signal acquisition unit 113.
The sample placement unit 114 may be configured so that the position of the biological sample S on the sample placement unit can be fixed, and may be a so-called stage. The sample placement unit 114 can be configured to move the position of the biological sample S in the optical axis direction of the objective lens and/or in a direction perpendicular to that optical axis.
The information processing unit 130 can acquire data (such as imaging data) acquired by the microscope device 110 from the microscope device 110. The information processing unit 130 can execute image processing on the imaging data. The image processing may include color separation processing. The color separation processing may include processing for extracting data of light components of a predetermined wavelength or wavelength range from the imaging data to generate image data, or processing for removing data of light components of a predetermined wavelength or wavelength range from the imaging data. The image processing may also include autofluorescence separation processing for separating the autofluorescence component and the dye component of a tissue section, and fluorescence separation processing for separating the wavelengths of dyes having mutually different fluorescence wavelengths. In the autofluorescence separation processing, an autofluorescence signal extracted from one of a plurality of specimens that are identical or similar in nature may be used to remove the autofluorescence component from the image information of the other specimen.
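As a rough illustration of the fluorescence separation mentioned above, dye and autofluorescence components can be modeled as a linear unmixing problem: each pixel's measured spectrum is treated as a mixture of known reference spectra, and the mixing coefficients are recovered, for example by least squares. This is a generic sketch with hypothetical spectra, not the document's actual separation algorithm, which may use constrained or non-negative solvers.

```python
# Hedged illustration of spectral unmixing: solve m ≈ S @ a for the
# abundance vector a, where the columns of S are reference emission
# spectra (e.g., dye 1, dye 2, autofluorescence) and m is the measured
# per-channel intensity of one pixel.
import numpy as np

def unmix(pixel_spectrum, reference_spectra):
    """pixel_spectrum: (n_channels,) measured intensities.
    reference_spectra: (n_channels, n_components) matrix S.
    Returns the per-component abundances."""
    a, *_ = np.linalg.lstsq(reference_spectra, pixel_spectrum, rcond=None)
    return a
```

Removing the autofluorescence component then amounts to subtracting its abundance times its reference spectrum from the measured spectrum.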
The pathological image display unit 610 displays, on the screen of this application, part or all of a pathological image specified with the operation device 30 by a user of the application. The screen that displays part or all of the pathological image is called the pathological image viewing screen. The medical image analysis device 10 reads the pathological image specified by the user from the diagnosis DB 40 and displays it on the pathological image viewing screen in the window of this application. For example, the pathological image is expanded by reading and decoding a WSI file, and is displayed on the pathological image viewing screen. The user may be able to change the magnification of the pathological image while viewing it; in that case, an image at the magnification specified by the user may be read from the diagnosis DB 40 and displayed again.
The processing target region setting unit 200 sets a region (processing target region) on which processing is to be performed for the pathological image displayed on the pathological image viewing screen. The processing target region setting unit 200 may set the processing target region based on the user's instruction information. For example, a region enclosed by the user with a rectangle or the like by mouse operation may be used as the processing target region. Alternatively, a predetermined range in the pathological image viewing screen (for example, a fixed range from the center of the display region, or the entire display region) may be used as the processing target region. For example, if there is no user operation for a certain period of time after the pathological image is displayed, the predetermined range may be automatically fixed as the processing target region. The entire pathological image read from the image DB 20 may also be used as the processing target region.
As an example, the target region TA1 corresponds to the first region, and the target region TA2 corresponds to the second region. The tissue region detected from the target region TA1 corresponds to the first tissue region, and the tissue region detected from the target region TA2 corresponds to the second tissue region. When the first tissue region and the second tissue region overlap, the third tissue region is set by performing the overlap elimination described later on the first tissue region and the second tissue region.
As an example, the detection result data may include first data in which the region information of the tissue detected from the first small region (or first unit region) is associated with the image of the first small region (or the image of the first unit region). The detection result data may also include second data in which the region information of the tissue detected from the second small region (or second unit region) is associated with the image of the second small region (or the image of the second unit region). Further, the detection result data may include third data in which the tissue information in the representative tissue region is associated with at least one of the image of the first small region (or the image of the first unit region) and the image of the second small region (or the image of the second unit region adjacent to the first unit region). The tissue information in the representative tissue region may be determined from the information of the tissues from which it was generated. For example, the type of one of the source tissues may be used as the tissue type of the representative tissue region; for instance, the most common type among the source tissues may be used as the tissue type of the representative tissue region.
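One way to picture the detection result data described here, with hypothetical field and function names: each record ties a tissue region's information to the unit image(s) it spans, and a representative region's tissue type is taken as, for example, the most common type among its source regions.

```python
# Illustrative sketch (field and function names are assumptions):
# detection records link tissue-region information to unit images,
# and a representative record takes the most common tissue type among
# the overlapping regions it replaces.
from collections import Counter
from dataclasses import dataclass

@dataclass
class DetectionRecord:
    unit_image_ids: list   # unit images the region spans
    contour: list          # region outline (omitted in this sketch)
    tissue_type: str

def make_representative(records):
    """Build the third-data record for a group of overlapping regions:
    the majority tissue type, associated with all source unit images."""
    types = [r.tissue_type for r in records]
    majority = Counter(types).most_common(1)[0][0]
    ids = sorted({i for r in records for i in r.unit_image_ids})
    return DetectionRecord(unit_image_ids=ids, contour=[], tissue_type=majority)
```

A representative record built from two "tumor" regions and one "stroma" region would carry the type "tumor" and reference every unit image the three regions touched.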
In this modification 1, a method is shown by which the duplication processing unit 500 performs the processing of determining whether tissue regions overlap in a simple manner. This reduces the amount of computation of the overlap determination processing and speeds up the processing. A specific example is described below.
[Item 1]
A medical image analysis device comprising:
a region setting unit that sets, in an image to be processed obtained by imaging a tissue derived from a living body, a first region and a second region partially overlapping the first region; and
an identifying unit that identifies a first tissue region, which is a tissue region included in the first region, and a second tissue region, which is a tissue region included in the second region,
wherein the identifying unit comprises
an overlap processing unit that performs processing on the first tissue region and the second tissue region at least partially overlapping the first tissue region, and sets a third tissue region.
[Item 2]
The medical image analysis device according to item 1, wherein the overlap processing unit selects one of the at least partially overlapping first tissue region and second tissue region, and takes the selected tissue region as the third tissue region.
[Item 3]
The medical image analysis device according to item 2, wherein the overlap processing unit selects one of the first tissue region and the second tissue region based on the sizes of the first tissue region and the second tissue region.
[Item 4]
The medical image analysis device according to any one of items 1 to 3, wherein the overlap processing unit merges the at least partially overlapping first tissue region and second tissue region into the third tissue region.
[Item 5]
The medical image analysis device according to any one of items 1 to 4, wherein the overlap processing unit divides the at least partially overlapping first tissue region and second tissue region into a plurality of tissue regions, and takes the divided tissue regions as the third tissue region.
[Item 6]
The medical image analysis device according to any one of items 1 to 5, wherein the overlap processing unit generates a detection result image showing the first tissue region, the second tissue region, and the third tissue region.
[Item 7]
The medical image analysis device according to any one of items 1 to 6, wherein the overlap processing unit
calculates a first circumscribed shape circumscribing the first tissue region and a second circumscribed shape circumscribing the second tissue region,
determines whether the first circumscribed shape and the second circumscribed shape at least partially overlap, and
determines, when the first circumscribed shape and the second circumscribed shape do not at least partially overlap, that the first tissue region included in the first circumscribed shape and the second tissue region included in the second circumscribed shape do not overlap.
[Item 8]
The medical image analysis device according to item 7, wherein, when the first circumscribed shape and the second circumscribed shape at least partially overlap, the overlap processing unit determines whether the first tissue region included in the first circumscribed shape and the second tissue region included in the second circumscribed shape at least partially overlap.
[Item 9]
The medical image analysis device according to item 7 or 8, wherein the first circumscribed shape and the second circumscribed shape are rectangles.
[Item 10]
The medical image analysis device according to any one of items 1 to 9, wherein the image to be processed is divided into a plurality of unit regions, the image to be processed includes a plurality of unit images corresponding to the plurality of unit regions, and the region setting unit sets the first region and the second region in units of the unit regions.
[Item 11]
The medical image analysis device according to item 10, wherein the overlap processing unit generates first data associating information on the first tissue region with the unit image that includes the first tissue region, second data associating information on the second tissue region with the unit image that includes the second tissue region, and third data associating information on the third tissue region with the unit image that includes the third tissue region.
[Item 12]
The medical image analysis device according to item 10 or 11, wherein the identification includes performing processing for identifying the first tissue region according to compression parameters of each unit image included in the first region, and performing processing for identifying the second tissue region according to compression parameters of each unit image included in the second region.
[Item 13]
The medical image analysis device according to any one of items 1 to 12, comprising an imaging unit that sets a first imaging region in the tissue derived from a living body with the same imaging size as the first region or the second region, images the first imaging region, sets a next, second imaging region so as to partially overlap the first imaging region, and images the second imaging region,
wherein the image of the first region is an image captured from the first imaging region, and
the image of the second region is an image captured from the second imaging region.
[Item 14]
The medical image analysis device according to item 6, comprising a display unit that displays the detection result image.
[Item 15]
The medical image analysis device according to any one of items 1 to 14, comprising:
a display unit that displays an image containing the tissue derived from a living body; and
a processing target region setting unit that receives instruction information specifying a region in the image displayed on the display unit, and sets the image included in the region specified by the instruction information as the image to be processed.
[Item 16]
A medical image analysis method comprising:
setting, in an image to be processed obtained by imaging a tissue derived from a living body, a first region and a second region partially overlapping the first region;
identifying a first tissue region, which is a tissue region included in the first region, and a second tissue region, which is a tissue region included in the second region; and
performing processing on the first tissue region and the second tissue region at least partially overlapping the first tissue region, to set a third tissue region.
[Item 17]
A medical image analysis system comprising:
an imaging unit that images a tissue derived from a living body and acquires an image to be processed; and
an information processing unit,
wherein the information processing unit
sets, in the image to be processed, a first region and a second region partially overlapping the first region,
identifies a first tissue region, which is a tissue region included in the first region, and a second tissue region, which is a tissue region included in the second region, and
performs processing on the first tissue region and the second tissue region at least partially overlapping the first tissue region, to set a third tissue region.
[Item 18]
The medical image analysis system according to item 17, wherein the information processing unit executes a program to perform the setting of the first region and the second region, the processing of identifying the first tissue region and the second tissue region, and the processing of setting the third tissue region.
[Item 19]
The medical image analysis system according to item 18, comprising a storage unit that stores the program, wherein the information processing unit reads the program from the storage unit and executes it.
10 Medical image analysis device
20 Image database
30 Operation device
40 Detection result database
100 Microscope system
110 Microscope device
111 Light irradiation unit
112 Optical unit
113 Signal acquisition unit
114 Sample placement unit
120 Control unit (control device)
130 Information processing unit (information processing device)
200 Processing target region setting unit
300 Region setting unit
400 Tissue detection unit (identifying unit)
500 Duplication processing unit (identifying unit)
600 Output unit (display unit)
610 Pathological image display unit
620 Detection result display unit
1001 Image to be processed
1002 Small region
Claims (19)
- A medical image analysis device comprising: a region setting unit that sets, in an image to be processed obtained by imaging a tissue derived from a living body, a first region and a second region partially overlapping the first region; and an identifying unit that identifies a first tissue region, which is a tissue region included in the first region, and a second tissue region, which is a tissue region included in the second region, wherein the identifying unit comprises an overlap processing unit that performs processing on the first tissue region and the second tissue region at least partially overlapping the first tissue region, and sets a third tissue region.
- The medical image analysis device according to claim 1, wherein the overlap processing unit selects one of the at least partially overlapping first tissue region and second tissue region, and takes the selected tissue region as the third tissue region.
- The medical image analysis device according to claim 2, wherein the overlap processing unit selects one of the first tissue region and the second tissue region based on the sizes of the first tissue region and the second tissue region.
- The medical image analysis device according to claim 1, wherein the overlap processing unit merges the at least partially overlapping first tissue region and second tissue region into the third tissue region.
- The medical image analysis device according to claim 1, wherein the overlap processing unit divides the at least partially overlapping first tissue region and second tissue region into a plurality of tissue regions, and takes the divided tissue regions as the third tissue region.
- The medical image analysis device according to claim 1, wherein the overlap processing unit generates a detection result image showing the first tissue region, the second tissue region, and the third tissue region.
- The medical image analysis device according to claim 1, wherein the overlap processing unit calculates a first circumscribed shape circumscribing the first tissue region and a second circumscribed shape circumscribing the second tissue region, determines whether the first circumscribed shape and the second circumscribed shape at least partially overlap, and determines, when the first circumscribed shape and the second circumscribed shape do not at least partially overlap, that the first tissue region included in the first circumscribed shape and the second tissue region included in the second circumscribed shape do not overlap.
- The medical image analysis device according to claim 7, wherein, when the first circumscribed shape and the second circumscribed shape at least partially overlap, the overlap processing unit determines whether the first tissue region included in the first circumscribed shape and the second tissue region included in the second circumscribed shape at least partially overlap.
- The medical image analysis device according to claim 7, wherein the first circumscribed shape and the second circumscribed shape are rectangles.
- The medical image analysis device according to claim 1, wherein the image to be processed is divided into a plurality of unit regions, the image to be processed includes a plurality of unit images corresponding to the plurality of unit regions, and the region setting unit sets the first region and the second region in units of the unit regions.
- The medical image analysis device according to claim 10, wherein the overlap processing unit generates first data associating information on the first tissue region with the unit image that includes the first tissue region, second data associating information on the second tissue region with the unit image that includes the second tissue region, and third data associating information on the third tissue region with the unit image that includes the third tissue region.
- The medical image analysis device according to claim 10, wherein the identification includes performing processing for identifying the first tissue region according to compression parameters of each unit image included in the first region, and performing processing for identifying the second tissue region according to compression parameters of each unit image included in the second region.
- The medical image analysis device according to claim 1, comprising an imaging unit that sets a first imaging region in the tissue derived from a living body with the same imaging size as the first region or the second region, images the first imaging region, sets a next, second imaging region so as to partially overlap the first imaging region, and images the second imaging region, wherein the image of the first region is an image captured from the first imaging region, and the image of the second region is an image captured from the second imaging region.
- The medical image analysis device according to claim 6, comprising a display unit that displays the detection result image.
- The medical image analysis device according to claim 1, comprising: a display unit that displays an image containing the tissue derived from a living body; and a processing target region setting unit that receives instruction information specifying a region in the image displayed on the display unit, and sets the image included in the region specified by the instruction information as the image to be processed.
- A medical image analysis method comprising: setting, in an image to be processed obtained by imaging a tissue derived from a living body, a first region and a second region partially overlapping the first region; identifying a first tissue region, which is a tissue region included in the first region, and a second tissue region, which is a tissue region included in the second region; and performing processing on the first tissue region and the second tissue region at least partially overlapping the first tissue region, to set a third tissue region.
- A medical image analysis system comprising: an imaging unit that images a tissue derived from a living body and acquires an image to be processed; and an information processing unit, wherein the information processing unit sets, in the image to be processed, a first region and a second region partially overlapping the first region, identifies a first tissue region, which is a tissue region included in the first region, and a second tissue region, which is a tissue region included in the second region, and performs processing on the first tissue region and the second tissue region at least partially overlapping the first tissue region, to set a third tissue region.
- The medical image analysis system according to claim 17, wherein the information processing unit executes a program to perform the setting of the first region and the second region, the processing of identifying the first tissue region and the second tissue region, and the processing of setting the third tissue region.
- The medical image analysis system according to claim 18, comprising a storage unit that stores the program, wherein the information processing unit reads the program from the storage unit and executes it.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/551,416 US20240169534A1 (en) | 2021-03-29 | 2022-02-24 | Medical image analysis device, medical image analysis method, and medical image analysis system |
EP22779668.7A EP4316414A1 (en) | 2021-03-29 | 2022-02-24 | Medical image analysis device, medical image analysis method, and medical image analysis system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021055964A JP2022152980A (ja) | 2021-03-29 | 2021-03-29 | 医療用画像解析装置、医療用画像解析方法及び医療用画像解析システム |
JP2021-055964 | 2021-03-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022209443A1 true WO2022209443A1 (ja) | 2022-10-06 |
Family
ID=83456020
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/007460 WO2022209443A1 (ja) | 2021-03-29 | 2022-02-24 | 医療用画像解析装置、医療用画像解析方法及び医療用画像解析システム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240169534A1 (ja) |
EP (1) | EP4316414A1 (ja) |
JP (1) | JP2022152980A (ja) |
WO (1) | WO2022209443A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09281405A (ja) * | 1996-04-17 | 1997-10-31 | Olympus Optical Co Ltd | 顕微鏡システム |
JP2012003214A (ja) * | 2010-05-19 | 2012-01-05 | Sony Corp | 情報処理装置、情報処理方法、プログラム、撮像装置、及び光学顕微鏡を搭載した撮像装置 |
JP2017134434A (ja) * | 2010-08-20 | 2017-08-03 | サクラ ファインテック ユー.エス.エー., インコーポレイテッド | デジタル顕微鏡を備えたシステム及びこれを用いた試料の検査方法 |
JP2018165718A (ja) * | 2012-09-06 | 2018-10-25 | ソニー株式会社 | 情報処理装置、情報処理方法、および顕微鏡システム |
JP2020182219A (ja) * | 2018-07-31 | 2020-11-05 | ソニーセミコンダクタソリューションズ株式会社 | 固体撮像装置、電子機器及び制御方法 |
JP2021006037A (ja) * | 2014-09-03 | 2021-01-21 | ヴェンタナ メディカル システムズ, インク. | 免疫スコアを計算するためのシステム及び方法 |
-
2021
- 2021-03-29 JP JP2021055964A patent/JP2022152980A/ja active Pending
-
2022
- 2022-02-24 EP EP22779668.7A patent/EP4316414A1/en active Pending
- 2022-02-24 US US18/551,416 patent/US20240169534A1/en active Pending
- 2022-02-24 WO PCT/JP2022/007460 patent/WO2022209443A1/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09281405A (ja) * | 1996-04-17 | 1997-10-31 | Olympus Optical Co Ltd | 顕微鏡システム |
JP2012003214A (ja) * | 2010-05-19 | 2012-01-05 | Sony Corp | 情報処理装置、情報処理方法、プログラム、撮像装置、及び光学顕微鏡を搭載した撮像装置 |
JP2017134434A (ja) * | 2010-08-20 | 2017-08-03 | サクラ ファインテック ユー.エス.エー., インコーポレイテッド | デジタル顕微鏡を備えたシステム及びこれを用いた試料の検査方法 |
JP2018165718A (ja) * | 2012-09-06 | 2018-10-25 | ソニー株式会社 | 情報処理装置、情報処理方法、および顕微鏡システム |
JP2021006037A (ja) * | 2014-09-03 | 2021-01-21 | ヴェンタナ メディカル システムズ, インク. | 免疫スコアを計算するためのシステム及び方法 |
JP2020182219A (ja) * | 2018-07-31 | 2020-11-05 | ソニーセミコンダクタソリューションズ株式会社 | 固体撮像装置、電子機器及び制御方法 |
Also Published As
Publication number | Publication date |
---|---|
JP2022152980A (ja) | 2022-10-12 |
EP4316414A1 (en) | 2024-02-07 |
US20240169534A1 (en) | 2024-05-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3776458B1 (en) | Augmented reality microscope for pathology with overlay of quantitative biomarker data | |
CN110476101B (zh) | 用于病理学的增强现实显微镜 | |
US20220076411A1 (en) | Neural netork based identification of areas of interest in digital pathology images | |
US9110305B2 (en) | Microscope cell staining observation system, method, and computer program product | |
JP2022534157A (ja) | 組織学的画像および術後腫瘍辺縁評価における腫瘍のコンピュータ支援レビュー | |
US10083340B2 (en) | Automated cell segmentation quality control | |
JP2008541048A (ja) | 自動画像解析 | |
JP2022050455A (ja) | 試料除去領域選択方法 | |
JP7487418B2 (ja) | 多重化免疫蛍光画像における自己蛍光アーチファクトの識別 | |
CN208766110U (zh) | 病理多靶点智能辅助诊断系统 | |
WO2022176396A1 (ja) | 情報処理装置及び情報処理方法、コンピュータプログラム、並びに医療診断システム | |
Ma et al. | Hyperspectral microscopic imaging for the detection of head and neck squamous cell carcinoma on histologic slides | |
WO2022209443A1 (ja) | 医療用画像解析装置、医療用画像解析方法及び医療用画像解析システム | |
JPWO2018128091A1 (ja) | 画像解析プログラム及び画像解析方法 | |
WO2022201992A1 (ja) | 医療用画像解析装置、医療用画像解析方法及び医療用画像解析システム | |
WO2022259648A1 (ja) | 情報処理プログラム、情報処理装置、情報処理方法、及び顕微鏡システム | |
EP4318402A1 (en) | Information processing device, information processing method, information processing system and conversion model | |
WO2023149296A1 (ja) | 情報処理装置、生体試料観察システム及び画像生成方法 | |
WO2023157755A1 (ja) | 情報処理装置、生体試料解析システム及び生体試料解析方法 | |
WO2023157756A1 (ja) | 情報処理装置、生体試料解析システム及び生体試料解析方法 | |
US20230071901A1 (en) | Information processing apparatus and information processing system | |
WO2022181263A1 (ja) | 医療用画像処理装置、医療用画像処理方法、及びプログラム | |
JP7492650B2 (ja) | 多重免疫蛍光染色組織のデジタル画像における壊死領域の自動識別 | |
WO2022270015A1 (ja) | 生体標本観察装置及び生体標本観察システム | |
WO2022249583A1 (ja) | 情報処理装置、生体試料観察システム及び画像生成方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22779668 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18551416 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022779668 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022779668 Country of ref document: EP Effective date: 20231030 |