WO2021157405A1 - Analysis device, analysis method, analysis program, and diagnosis support system - Google Patents


Info

Publication number
WO2021157405A1
Authority
WO
WIPO (PCT)
Prior art keywords
analysis
image
unit
result
parameter
Prior art date
Application number
PCT/JP2021/002442
Other languages
English (en)
Japanese (ja)
Inventor
山根 健治
真司 渡辺
長谷川 寛
一真 高橋
威 國弘
Original Assignee
Sony Group Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2021157405A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Such instruments combined with photographic or television appliances
    • A61B 1/045: Control thereof
    • A61B 6/00: Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/62: Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N 21/63: Such systems in which the material is optically excited
    • G01N 21/64: Fluorescence; Phosphorescence
    • G01N 33/00: Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N 33/48: Biological material, e.g. blood, urine; Haemocytometers
    • G01N 33/483: Physical analysis of biological material
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis

Definitions

  • This disclosure relates to an analysis device, an analysis method, an analysis program, and a diagnostic support system.
  • The object to be observed is tissue or cells collected from a patient, such as a section of an organ, saliva, or blood.
  • The specified area is analyzed by a predetermined analysis procedure using predetermined parameters, and the final analysis result is output.
  • In some cases, however, the analysis procedure and parameters for a pathological image have not been established, and the user needs to adjust them.
  • This disclosure therefore proposes an analysis device, an analysis method, an analysis program, and a diagnosis support system that allow the user to confirm how parameters influence the analysis result.
  • An analysis device according to one embodiment includes a reception unit that receives parameters used in a first analysis process of a pathology-related image, and an output unit that outputs, in a visually recognizable state, a first result of the first analysis process executed based on the parameters and a second result of a second analysis process executed based on the first result.
  • <First Embodiment>
    1. Configuration of the system according to the first embodiment
    2. Various information
      2-1. Pathological image
      2-2. Browsing history information
      2-3. Diagnostic information
    3. Analysis device according to the first embodiment
    4. Processing procedure
      4-1. Analysis processing procedure according to the first embodiment
      4-2. Parameter display processing procedure according to the first embodiment
    5. Example of parameters
    6. Effect of the analysis device according to the first embodiment
    <Second Embodiment>
    7. Analysis device according to the second embodiment
    8. Effect of the analysis device according to the second embodiment
    9. Analysis combining multiple types of pathological images
      9-1. Analysis example using an HE-stained image and an IHC-stained image
      9-2. Analysis example using an HE-stained image and a fluorescent IHC-stained image
      9-3. Analysis example using an HE-stained image and an image other than a pathological image
    10. Hardware configuration
    11.
  • FIG. 1 is a diagram showing a diagnosis support system 1 according to the first embodiment.
  • the diagnosis support system 1 includes a pathology system 10 and an analysis device 100.
  • The pathology system 10 is a system mainly used by pathologists and is deployed in, for example, laboratories and hospitals. As shown in FIG. 1, the pathology system 10 includes a microscope 11, a server 12, a display control device 13, and a display device 14.
  • the microscope 11 is an imaging device that has the function of an optical microscope, images an observation object stored on a glass slide, and acquires a pathological image (an example of a medical image) that is a digital image.
  • The observation object is, for example, tissue or cells collected from a patient, such as a section of an organ, saliva, or blood.
  • The server 12 is a device that stores the pathological images captured by the microscope 11 in a storage unit (not shown).
  • When the server 12 receives a viewing request from the display control device 13, it searches the storage unit (not shown) for the pathological image and sends the retrieved pathological image to the display control device 13. Similarly, when the server 12 receives a pathological image acquisition request from the analysis device 100, it searches the storage unit for the pathological image and sends the retrieved pathological image to the analysis device 100.
  • the display control device 13 sends a viewing request for the pathological image received from the user to the server 12. Then, the display control device 13 controls the display device 14 so as to display the pathological image received from the server 12.
  • the display device 14 has a screen on which, for example, a liquid crystal, an EL (Electro-Luminescence), a CRT (Cathode Ray Tube), or the like is used.
  • The display device 14 may support 4K or 8K, and may be formed of a plurality of display devices.
  • the display device 14 displays a pathological image controlled to be displayed by the display control device 13. Although details will be described later, the server 12 stores browsing history information regarding the area of the pathological image observed by the pathologist via the display device 14.
  • the analysis device 100 is a device that sends a pathological image acquisition request to the server 12 and analyzes the pathological image received from the server 12.
  • FIGS. 2 and 3 are diagrams for explaining the imaging process according to the first embodiment.
  • the microscope 11 described below has a low-resolution imaging unit for imaging at a low resolution and a high-resolution imaging unit for imaging at a high resolution.
  • FIG. 2 shows a glass slide G10 on which the observation object A10 is placed, within the imaging region R10 that can be imaged by the microscope 11.
  • the glass slide G10 is placed on, for example, a stage (not shown).
  • the microscope 11 captures the imaging region R10 with a low-resolution imaging unit to generate an overall image, which is a pathological image in which the observation object A10 is entirely imaged.
  • the label information L10 shown in FIG. 2 describes identification information (for example, a character string or a QR code (registered trademark)) for identifying the observation object A10. By associating the identification information described in the label information L10 with the patient, it is possible to identify the patient corresponding to the entire image. In the example of FIG. 2, "# 001" is described as the identification information. In the label information L10, for example, a brief description of the observation object A10 may be described.
  • After generating the whole image, the microscope 11 identifies the region where the observation object A10 exists, divides that region into sections of a predetermined size, and sequentially images each section with the high-resolution imaging unit. For example, as shown in FIG. 3, the microscope 11 first images the region R11 and generates a high-resolution image I11 showing a part of the observation object A10. The microscope 11 then moves the stage, images the region R12 with the high-resolution imaging unit, and generates a high-resolution image I12 corresponding to the region R12. Similarly, the microscope 11 generates high-resolution images I13, I14, ... corresponding to the regions R13, R14, .... Although FIG. 3 shows regions only up to R18, the microscope 11 sequentially moves the stage so that all divided regions corresponding to the observation object A10 are imaged by the high-resolution imaging unit, generating a high-resolution image for each divided region.
  • the low-resolution imaging unit and the high-resolution imaging unit described above may have different optical systems or the same optical system.
  • the microscope 11 changes the resolution according to the image pickup target.
  • FIG. 3 shows an example in which the microscope 11 takes an image from the central portion of the observation object A10.
  • the microscope 11 may image the observation object A10 in an order different from the imaging order shown in FIG.
  • the microscope 11 may take an image from the outer peripheral portion of the observation object A10.
  • Alternatively, the microscope 11 may divide the entire imaging region R10 or the entire glass slide G10 shown in FIG. 2 and image each division with the high-resolution imaging unit.
  • FIG. 4 is a diagram for explaining a process of generating a partial image (tile image).
  • FIG. 4 shows a high-resolution image I11 corresponding to the region R11 shown in FIG.
  • the server 12 generates a partial image from the high-resolution image.
  • the partial image may be generated by a device other than the server 12 (for example, an information processing device mounted inside the microscope 11).
  • The server 12 generates 100 tile images T11, T12, ... by dividing one high-resolution image I11. For example, when the resolution of the high-resolution image I11 is 2560 × 2560 [pixels], the server 12 generates from it 100 tile images T11, T12, ... each having a resolution of 256 × 256 [pixels]. Similarly, the server 12 generates tile images by dividing the other high-resolution images into the same size.
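As a concrete illustration of this tile-splitting step, the sketch below divides a 2560 × 2560 image into 100 tiles of 256 × 256. It is not part of the patent; the pure-Python nested-list image representation and all names are assumptions.

```python
# Illustrative sketch (not from the patent): split a high-resolution
# image into 256x256 minimum-unit tiles, as the server 12 might.

TILE = 256

def split_into_tiles(image):
    """Split a 2D image (list of rows) into TILE x TILE sub-images."""
    rows, cols = len(image), len(image[0])
    tiles = []
    for ty in range(0, rows, TILE):
        for tx in range(0, cols, TILE):
            tiles.append([row[tx:tx + TILE] for row in image[ty:ty + TILE]])
    return tiles

image = [[0] * 2560 for _ in range(2560)]   # stand-in for image I11
tiles = split_into_tiles(image)
# 2560 / 256 = 10 tiles per side, i.e. 100 tiles in total
```

A 2560 × 2560 input yields a 10 × 10 grid, matching the 100 tiles in the text.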
  • the regions R111, R112, R113, and R114 are regions that overlap with other adjacent high-resolution images (not shown in FIG. 4).
  • the server 12 performs stitching processing on high-resolution images adjacent to each other by aligning the overlapping areas by a technique such as template matching.
  • the server 12 may generate a tile image by dividing the high-resolution image after the stitching process.
  • Alternatively, the server 12 may generate tile images of the areas other than the regions R111, R112, R113, and R114 before the stitching process, and generate tile images of the regions R111, R112, R113, and R114 after the stitching process.
  • The server 12 generates tile images that are the minimum unit of the captured image of the observation object A10. The server 12 then generates tile images of different hierarchies by sequentially synthesizing the minimum-unit tile images; specifically, it generates one tile image by synthesizing a predetermined number of adjacent tile images. This point will be described with reference to FIGS. 5 and 6, which are diagrams for explaining the pathological image according to the first embodiment.
  • After synthesizing the minimum-unit tile images, the server 12 generates further tile images by synthesizing tile images that are adjacent to each other among the resulting tile images.
  • the server 12 generates one tile image T100 by synthesizing four tile images T110, T120, T210, and T220 adjacent to each other.
  • For example, when the resolution of the tile images T110, T120, T210, and T220 is 256 × 256, the server 12 generates the tile image T100 having a resolution of 256 × 256.
  • Specifically, the server 12 generates a tile image having a resolution of 256 × 256 by applying processing such as 4-pixel averaging, a weighting filter (a process that reflects nearby pixels more strongly than distant pixels), or 1/2 thinning to the 512 × 512 image obtained by compositing the four adjacent tile images.
  • By repeating such a compositing process, the server 12 finally generates one tile image having the same resolution as that of the minimum-unit tile image. For example, when the resolution of the minimum-unit tile image is 256 × 256, the server 12 repeats the above compositing process to finally generate one tile image T1 having a resolution of 256 × 256.
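One level of the compositing just described (joining four 256 × 256 tiles into a 512 × 512 image and reducing it back to 256 × 256 by 4-pixel averaging) can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the function name and list-based grayscale representation are assumptions.

```python
# Illustrative sketch (not from the patent): composite four adjacent
# NxN tiles into one NxN tile using the 4-pixel average from the text.

def composite(t_nw, t_ne, t_sw, t_se):
    """Join four NxN tiles into a 2Nx2N image, then 2x2-average to NxN."""
    n = len(t_nw)
    joined = [t_nw[r] + t_ne[r] for r in range(n)] + \
             [t_sw[r] + t_se[r] for r in range(n)]
    return [[(joined[2 * r][2 * c] + joined[2 * r][2 * c + 1] +
              joined[2 * r + 1][2 * c] + joined[2 * r + 1][2 * c + 1]) // 4
             for c in range(n)] for r in range(n)]

t = [[100] * 256 for _ in range(256)]   # stand-in for tiles T110..T220
parent = composite(t, t, t, t)          # one level of the pyramid
```

Repeating this operation level by level yields a pyramid that terminates in a single top tile, as the text describes for tile image T1.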
  • the area D shown in FIG. 5 shows an example of an area displayed on the display screen of the display device 14 or the like.
  • In this example, it is assumed that the display device can display an area corresponding to three tile images vertically and four tile images horizontally.
  • The degree of detail of the observation object A10 displayed on the display device changes depending on the hierarchy to which the displayed tile image belongs. For example, when a tile image of the lowest layer is used, a narrow area of the observation object A10 is displayed in detail; the higher the layer of the tile image used, the wider and the coarser the displayed area of the observation object A10.
  • The server 12 stores the tile images of each layer as shown in FIG. 6 in a storage unit (not shown). For example, the server 12 stores each tile image together with tile identification information (an example of partial image information) that can uniquely identify each tile image. In this case, when the server 12 receives a tile image acquisition request including the tile identification information from another device (for example, the display control device 13), it transmits the tile image corresponding to the tile identification information to the other device. Further, for example, the server 12 may store each tile image together with hierarchy identification information for identifying each layer and tile identification information that is unique within the same layer.
  • When the server 12 receives a tile image acquisition request including the hierarchy identification information and the tile identification information from another device, it transmits, from among the tile images belonging to the hierarchy corresponding to the hierarchy identification information, the tile image corresponding to the tile identification information to the other device.
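The retrieval behavior described above amounts to a lookup keyed by hierarchy identification information and tile identification information. A minimal sketch, assuming a dictionary-backed store; the key format and all names are hypothetical, not from the patent.

```python
# Illustrative sketch (not from the patent): store tiles keyed by
# (layer ID, tile ID) and serve acquisition requests from the store.

tile_store = {}

def put_tile(layer_id, tile_id, tile_bytes):
    tile_store[(layer_id, tile_id)] = tile_bytes

def get_tile(layer_id, tile_id):
    """Return the tile matching the request, or None if not stored."""
    return tile_store.get((layer_id, tile_id))

put_tile("L0", "T110", b"...pixels...")
found = get_tile("L0", "T110")
missing = get_tile("L1", "T110")   # same tile ID, different layer
```

Keying on the (layer, tile) pair is what makes the same tile identifier reusable across layers, as the text's "unique within the same layer" wording suggests.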
  • the server 12 may store tile images of each layer as shown in FIG. 6 for each imaging condition.
  • An example of the imaging condition is a focal length with respect to a subject (observation object A10 or the like).
  • the microscope 11 may take an image of the same subject while changing the focal length.
  • the server 12 may store tile images of each layer as shown in FIG. 6 for each focal length.
  • The reason for changing the focal length is that some observation objects A10 are translucent: there is a focal length suitable for imaging the surface of the observation object A10, and a focal length suitable for imaging the inside of the observation object A10.
  • the microscope 11 can generate a pathological image of the surface of the observation object A10 and a pathological image of the inside of the observation object A10 by changing the focal length.
  • Another example of an imaging condition is the staining condition of the observation object A10.
  • Specifically, a specific portion of the observation object A10 (for example, the nucleus of a cell) may be stained with a luminescent substance.
  • The luminescent substance is, for example, a substance that emits light when irradiated with light of a specific wavelength.
  • Different luminescent substances may be used to stain the same observation object A10.
  • The server 12 may store tile images of each layer as shown in FIG. 6 for each luminescent substance used for staining.
  • the number and resolution of the tile images mentioned above are examples and can be changed as appropriate depending on the system.
  • the number of tile images synthesized by the server 12 is not limited to four.
  • In the above example, the resolution of the tile image is 256 × 256, but the resolution of the tile image may be other than 256 × 256.
  • The display control device 13 uses viewer software capable of handling the tile image group having the hierarchical structure described above, and, in response to the user's input operation via the display control device 13, extracts the desired tile image from the hierarchical tile image group and outputs it to the display device 14.
  • the display device 14 displays an image of an arbitrary portion selected by the user among images having an arbitrary resolution selected by the user.
  • the display control device 13 functions as a virtual microscope.
  • the virtual observation magnification here actually corresponds to the resolution.
  • FIG. 7 is a diagram showing an example of a viewing mode of a pathological image by a viewer.
  • a viewer such as a pathologist browses the pathological image I10 in the order of regions D1, D2, D3, ..., D7.
  • the display control device 13 first acquires the pathological image corresponding to the area D1 from the server 12 according to the browsing operation by the viewer.
  • The server 12 acquires from the storage unit one or more tile images forming the pathological image corresponding to the area D1, and sends the acquired tile images to the display control device 13. The display control device 13 then displays on the display device 14 the pathological image formed from the one or more tile images acquired from the server 12; when there are a plurality of tile images, it displays them side by side. Similarly, each time the viewer changes the display area, the display control device 13 acquires from the server 12 the pathological image corresponding to the new display target area (areas D2, D3, ..., D7, etc.) and displays it on the display device 14.
  • the pathological image corresponding to the regions D1, D2, and D7 is a 1.25-magnification display image
  • the pathological image corresponding to the regions D3 and D4 is a 20-magnification display image
  • the pathological image corresponding to the regions D5 and D6 is a 40-magnification display image.
  • the display control device 13 acquires and displays the tile images of the hierarchy corresponding to each magnification among the tile images of the hierarchical structure stored in the server 12.
  • the layer of the tile image corresponding to the areas D1 and D2 is higher than the layer of the tile image corresponding to the area D3 (that is, the layer close to the tile image T1 shown in FIG. 6).
  • FIG. 8 is a diagram showing an example of the browsing history storage unit 12a included in the server 12.
  • the browsing history storage unit 12a stores information such as “sampling”, “center coordinates”, “magnification”, and “time”.
  • “Sampling” indicates the order of timing for storing browsing information.
  • the "center coordinates” indicate the position information of the viewed pathological image. In this example, the center coordinates are the coordinates indicated by the center position of the viewed pathological image, and correspond to the coordinates of the coordinate system of the tile image group in the lowest layer.
  • “Magnification” indicates the display magnification of the viewed pathological image.
  • “Time” indicates the elapsed time from the start of browsing. In the example of FIG. 8, sampling “1” indicates the browsing information of the region D1 shown in FIG. 7, sampling “2” that of the region D2, samplings “3” and “4” that of the region D3, sampling “5” that of the region D4, and samplings “6”, “7”, and “8” that of the region D5. That is, in the example of FIG. 8, the region D1 was browsed for about 30 seconds, the region D2 for about 30 seconds, the region D3 for about 60 seconds, the region D4 for about 30 seconds, and the region D5 for about 90 seconds. In this way, the browsing time of each region can be extracted from the browsing history information.
  • Further, by analyzing the browsing history information stored in the browsing history storage unit 12a, it is possible to extract how many times each pixel (in other words, each coordinate) of the pathological image has been displayed.
  • the browsing history storage unit 12a stores the browsing information so as to be associated with the patient, the medical record, and the like.
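Extracting per-region browsing time from records like those in FIG. 8 can be sketched as follows, assuming a fixed 30-second sampling interval. The record layout and the region labels are illustrative assumptions patterned on FIG. 8, not the patent's data format.

```python
# Illustrative sketch (not from the patent): derive per-region
# browsing time from sampled browsing history records.

SAMPLING_INTERVAL_S = 30   # assumed fixed sampling interval

history = [  # (sampling, center_coords, magnification, region)
    (1, (100, 100), 1.25, "D1"),
    (2, (400, 150), 1.25, "D2"),
    (3, (220, 300), 20.0, "D3"),
    (4, (220, 300), 20.0, "D3"),
    (5, (500, 320), 20.0, "D4"),
    (6, (260, 420), 40.0, "D5"),
    (7, (260, 420), 40.0, "D5"),
    (8, (260, 420), 40.0, "D5"),
]

def browsing_time_per_region(records):
    """Accumulate one sampling interval per record into its region."""
    times = {}
    for _, _, _, region in records:
        times[region] = times.get(region, 0) + SAMPLING_INTERVAL_S
    return times

times = browsing_time_per_region(history)
```

With the FIG. 8-style data above, D3 accumulates two samples (about 60 seconds) and D5 three samples (about 90 seconds), matching the text.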
  • FIGS. 9A to 9C are diagrams showing a diagnostic information storage unit included in the medical information system 30.
  • 9A to 9C show an example in which diagnostic information is stored in a different table for each organ to be inspected.
  • FIG. 9A shows an example of a table that stores diagnostic information related to a breast cancer test
  • FIG. 9B shows an example of a table that stores diagnostic information related to a lung cancer test
  • FIG. 9C shows an example of a table that stores diagnostic information related to a colon examination.
  • The diagnostic information storage unit 30A shown in FIG. 9A stores information on the "patient ID", "pathological image", "diagnosis result", "grade", "histological type", "genetic test", "ultrasonography", and "medication".
  • "Patient ID” indicates identification information for identifying a patient.
  • a “pathological image” indicates a pathological image saved by a pathologist at the time of diagnosis. In the “pathological image”, position information (center coordinates, magnification, etc.) indicating an image area to be saved with respect to the entire image may be stored instead of the image itself.
  • the "diagnosis result” is a diagnosis result by a pathologist, and indicates, for example, the presence or absence of a lesion site and the type of the lesion site.
  • “Grade” indicates the degree of progression of the diseased area. “Histological type” indicates the type of the diseased part. “Genetic test” indicates the result of the genetic test. “Ultrasonography” indicates the result of the ultrasonic examination. “Medication” indicates information about medication administered to the patient.
  • the diagnostic information storage unit 30B shown in FIG. 9B stores information related to the "CT examination” performed in the lung cancer examination instead of the "ultrasonic examination” stored in the diagnostic information storage unit 30A shown in FIG. 9A.
  • The diagnostic information storage unit 30C shown in FIG. 9C stores information related to the "endoscopy" performed in the colon examination instead of the "ultrasonography" stored in the diagnostic information storage unit 30A shown in FIG. 9A.
  • the communication unit 110 is realized by, for example, a NIC (Network Interface Card) or the like.
  • the communication unit 110 is connected to a network (not shown) by wire or wirelessly, and transmits / receives information to / from the pathology system 10 or the like via the network.
  • The control unit 130, which will be described later, transmits and receives information to and from these devices via the communication unit 110.
  • the input unit 111 is an input device that inputs various information to the analysis device 100.
  • the input unit 111 corresponds to a keyboard, a mouse, a touch panel, and the like.
  • the display unit 112 is a display device that displays information output from the control unit 130.
  • the display unit 112 corresponds to a liquid crystal display, an organic EL (Electro Luminescence) display, a touch panel, and the like.
  • The storage unit 120 has a pathological image DB (Data Base) 120a, an analysis module table 120b, analysis module configuration information 120c, a parameter table 120d, an intermediate data table 120e, and a summary parameter conversion table 120f.
  • the storage unit 120 is realized by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory (Flash Memory), or a storage device such as a hard disk or an optical disk.
  • the pathological image DB 120a is a database that stores a plurality of pathological images.
  • FIG. 11 is a diagram showing an example of the data structure of the pathological image DB.
  • the pathological image DB 120a has a "patient ID" and a "pathological image".
  • the patient ID is information that uniquely identifies the patient.
  • the pathological image shows a pathological image saved by the pathologist at the time of diagnosis.
  • the pathological image is transmitted from the server 12.
  • In addition to the patient ID and the pathological image, the pathological image DB 120a may retain the "diagnosis result", "grade", "histological type", "genetic test", "ultrasonography", and "medication" information described with reference to FIGS. 9A to 9C.
  • the analysis module table 120b is a table that holds a plurality of analysis modules that execute analysis processing for pathological images.
  • FIG. 12 is a diagram showing an example of the data structure of the analysis module table. As shown in FIG. 12, the analysis module table 120b has a "module ID", a "module type", and an "analysis module".
  • the module ID is information that uniquely identifies the analysis module.
  • the module type is information indicating the analysis content executed by the analysis module.
  • the analysis module is the data (program) of the corresponding analysis module.
  • the analysis module of the module ID "M4" is a module that executes the calculation of the positive rate of the nucleus. In the calculation of the positive rate of nuclei, the positive rate of nuclei is calculated from the number of positive nuclei contained in the designated region and the number of negative nuclei.
  • The analysis module of module ID "M5" is a module that executes positive membrane detection. Positive membrane detection is the detection of cell membranes from regions determined to be positive.
  • The analysis module of module ID "M6" is a module that executes negative membrane detection. Negative membrane detection is the detection of cell membranes from regions determined to be negative.
  • the analysis module of module ID "M7" is a module that executes the calculation of the positive rate of the membrane.
  • the positive rate of the membrane is calculated from the number of positive membranes contained in the designated region and the number of negative membranes.
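The positive-rate computations performed by the modules of IDs "M4" (nuclei) and "M7" (membranes) both reduce to the ratio of positive detections to all detections in the designated region. A minimal sketch; the function name and zero-total handling are assumptions, not from the patent.

```python
# Illustrative sketch (not from the patent): positive rate as the
# fraction of positive detections among all detections.

def positive_rate(n_positive, n_negative):
    """Return positives / (positives + negatives) as a fraction."""
    total = n_positive + n_negative
    if total == 0:          # assumed convention for an empty region
        return 0.0
    return n_positive / total

rate = positive_rate(n_positive=30, n_negative=70)   # 0.3
```

The same function serves both nucleus counts (M4 inputs) and membrane counts (M7 inputs), since only the detected objects differ.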
  • FIG. 14 is a diagram showing a connection relationship of analysis modules based on the analysis module configuration information 120c shown in FIG.
  • the analysis module of the module ID "Mn (n is a natural number)" is referred to as an analysis module Mn.
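The connection relationship of FIG. 14 can be sketched as a small pipeline in which each analysis module consumes the intermediate data of its predecessors. The stub module behaviors and the configuration mapping below are illustrative assumptions, not the patent's implementation.

```python
# Illustrative sketch (not from the patent): run analysis modules in
# the order given by configuration info, feeding each module the
# intermediate data of its configured inputs ("" = the raw image).

def run_pipeline(image, modules, config):
    """modules: ID -> callable in execution order; config: ID -> input IDs."""
    intermediate = {"": image}
    for mid in modules:
        inputs = [intermediate[src] for src in config[mid]]
        intermediate[mid] = modules[mid](*inputs)
    return intermediate

modules = {  # stand-in stubs for M1..M4
    "M1": lambda img: {"stain": img},          # stain separation
    "M2": lambda sep: 30,                      # positive nucleus detection
    "M3": lambda sep: 70,                      # negative nucleus detection
    "M4": lambda pos, neg: pos / (pos + neg),  # positive rate of nuclei
}
config = {"M1": [""], "M2": ["M1"], "M3": ["M1"], "M4": ["M2", "M3"]}
result = run_pipeline("image", modules, config)
```

Keeping every module's output in the `intermediate` dictionary mirrors the intermediate data table 120e, which stores each module's processing result keyed by module ID.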
  • the parameter table 120d is a table that stores parameters used when each analysis module executes analysis processing.
  • FIG. 15 is a diagram showing an example of the data structure of the parameter table 120d. As shown in FIG. 15, the parameter table 120d associates the module ID with the parameter set. The module ID is information that uniquely identifies the analysis module. A plurality of parameters are associated with the parameter set. In the example shown in FIG. 15, the first parameter, the second parameter, and the third parameter are shown, but the parameter set may further include other parameters.
  • the intermediate data table 120e is a table that stores the processing results of each analysis module.
  • FIG. 16 is a diagram showing an example of the data structure of the intermediate data table 120e. As shown in FIG. 16, the intermediate data table 120e associates the module ID with the intermediate data.
  • the module ID is information that uniquely identifies the analysis module.
  • the intermediate data indicates the processing result data of the analysis module identified by the module ID.
  • The intermediate data of the module ID “M1” is the data of the processing result of stain separation.
  • the intermediate data of the module ID “M2” is the data of the processing result of the positive nucleus detection.
  • the intermediate data of the module ID “M3” is the data of the processing result of the negative nucleus detection.
  • the intermediate data of the module ID “M4" is the data of the processing result of the positive rate calculation of the nucleus.
  • The intermediate data of the module ID "M5" is the data of the processing result of the positive membrane detection.
  • The intermediate data of the module ID "M6" is the data of the processing result of the negative membrane detection.
  • the intermediate data of the module ID “M7” is the data of the processing result of the positive rate calculation of the membrane.
  • the intermediate data of the module ID “M8” is the data of the scoring processing result.
  • the summary parameter conversion table 120f is a table that is referred to when converting a parameter into a summary parameter.
  • FIG. 17 is a diagram showing an example of the data structure of the summary parameter conversion table 120f. As shown in FIG. 17, the summary parameter conversion table 120f associates the module ID with the summary parameter conversion formula. The module ID is information that uniquely identifies the analysis module.
  • the summary parameter conversion formula is a conversion formula used when converting a parameter into a summary parameter.
  • for the module ID “M1”, there are three parameters (the first parameter, the second parameter, and the third parameter), and the corresponding conversion formula integrates them into a single summary parameter.
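  • The conversion formula itself is not given here; as one hedged illustration (the linear weights below are assumptions, not values from the table), a summary parameter could be a per-module weighted combination of the first to third parameters:

```python
# Hypothetical summary-parameter conversion for module "M1": the conversion
# table stores a formula that integrates the first to third parameters into
# a single summary value. The weights are illustrative assumptions only.
summary_conversion_table = {
    "M1": lambda p1, p2, p3: 0.5 * p1 + 0.25 * p2 + 0.25 * p3,
}

def to_summary_parameter(module_id, params):
    """Apply the module's conversion formula to its parameter tuple."""
    formula = summary_conversion_table[module_id]
    return formula(*params)

print(to_summary_parameter("M1", (10.0, 20.0, 30.0)))  # 0.5*10 + 0.25*20 + 0.25*30 = 17.5
```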
  • the acquisition unit 130a is a processing unit that sends a pathological image acquisition request to the server 12 and acquires the pathological image from the server 12.
  • the acquisition unit 130a registers the acquired pathological image in the pathological image DB 120a.
  • information such as “diagnosis result”, “grade”, “tissue type”, “genetic test”, “ultrasonography”, and “medication” described in FIGS. 9A to 9C may be attached to the pathological image.
  • the analysis target designation unit 130b is a processing unit that designates an area to be analyzed in the pathological image based on the user's operation.
  • the area to be analyzed is referred to as "annotation”.
  • FIG. 18 is a diagram for explaining the processing of the analysis target designation unit 130b according to the first embodiment.
  • the analysis target designation unit 130b acquires the information of the pathological image I1 from the pathological image DB 120a and displays it on the display unit 112.
  • the observation object A10 is entirely visualized on the pathological image I1.
  • the user operates the input unit 111 while referring to the display unit 112 to select the annotation RI21.
  • the annotation for the pathological image may be specified in advance by the pathologist of the pathological system 10.
  • the reception unit 130c is a processing unit that receives parameters used in the analysis processing of the pathological image.
  • the reception unit 130c cooperates with the output unit 130e, which will be described later, to receive changes in the parameters used by each analysis module.
  • the analysis unit 130d loads the "analysis module M3" from the analysis module table 120b.
  • the analysis unit 130d acquires the parameters for the analysis module M3.
  • the analysis unit 130d acquires the intermediate data of the analysis module M1 from the intermediate data table 120e.
  • the analysis unit 130d inputs the intermediate data of the analysis module M1 to the analysis module M3 and executes a partial analysis process (negative nucleus detection).
  • the analysis unit 130d registers the result (intermediate data) of the partial analysis process (negative nucleus detection) in the intermediate data table 120e.
  • the analysis unit 130d loads the "analysis module M5" from the analysis module table 120b.
  • the analysis unit 130d acquires the parameters for the analysis module M5.
  • the analysis unit 130d acquires the intermediate data of the analysis module M1 from the intermediate data table 120e.
  • the analysis unit 130d inputs the intermediate data of the analysis module M1 into the analysis module M5 and executes a partial analysis process (positive membrane detection).
  • the analysis unit 130d registers the result (intermediate data) of the partial analysis process (positive membrane detection) in the intermediate data table 120e.
  • the analysis unit 130d loads the "analysis module M4" from the analysis module table 120b.
  • the analysis unit 130d acquires the parameters for the analysis module M4.
  • the analysis unit 130d acquires the intermediate data of the analysis modules M2 and M3 from the intermediate data table 120e.
  • the analysis unit 130d inputs the intermediate data of the analysis modules M2 and M3 into the analysis module M4, and executes a partial analysis process (calculation of the positive rate of the nucleus).
  • the analysis unit 130d registers the result (intermediate data) of the partial analysis process (calculation of the positive rate of the nucleus) in the intermediate data table 120e.
  • the analysis unit 130d loads the "analysis module M8" from the analysis module table 120b.
  • the analysis unit 130d acquires the parameters for the analysis module M8.
  • the analysis unit 130d acquires the intermediate data of the analysis modules M4 and M7 from the intermediate data table 120e.
  • the analysis unit 130d inputs the intermediate data of the analysis modules M4 and M7 into the analysis module M8 and executes a partial analysis process (scoring).
  • the analysis unit 130d registers the result (intermediate data) of the partial analysis process (scoring) in the intermediate data table 120e.
  • the analysis unit 130d uses the initial value of the parameter set in advance for the analysis module Mn.
  • the initial value of the parameter is set as the parameter of the parameter set "Set1" of the parameter table 120d.
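  • The load/execute/register cycle described above can be sketched as follows. This is a minimal sketch, assuming placeholder processing functions; only the module IDs and the connection relationships (M1 feeding M2 and M3, which feed M4) are taken from the description.

```python
# Hedged sketch of the partial-analysis pipeline: each analysis module reads
# the intermediate data of its upstream modules, runs its own partial
# analysis with its parameter set, and registers its result. The processing
# functions are placeholder stand-ins, not the disclosed algorithms.
from typing import Callable, Dict, List

modules: Dict[str, Callable] = {
    "M1": lambda inputs, p: "stain-separated image",
    "M2": lambda inputs, p: "positive nuclei",
    "M3": lambda inputs, p: "negative nuclei",
    "M4": lambda inputs, p: "nucleus positive rate",
}
# Connection relationships (the analysis module configuration information).
upstream: Dict[str, List[str]] = {"M1": [], "M2": ["M1"], "M3": ["M1"], "M4": ["M2", "M3"]}
parameters = {mid: {"set": "Set1"} for mid in modules}  # initial values

intermediate: Dict[str, object] = {}
for module_id in ["M1", "M2", "M3", "M4"]:  # execution order from the configuration
    inputs = [intermediate[u] for u in upstream[module_id]]
    result = modules[module_id](inputs, parameters[module_id])
    intermediate[module_id] = result        # register in the intermediate data table

print(intermediate["M4"])
```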
  • FIG. 19 is a diagram showing an example of the analysis result screen according to the first embodiment.
  • icons IC1 to IC8 are arranged on the analysis result screen 40.
  • the icons IC1 to IC8 are icons corresponding to the analysis modules M1 to M8.
  • the output unit 130e specifies the connection relationship between the analysis modules M1 to M8 based on the analysis module configuration information 120c, and arranges and connects the icons IC1 to IC8 based on the connection relationship.
  • the output unit 130e acquires the result (intermediate data) of the partial analysis process (staining separation) from the intermediate data table 120e and visually arranges it in the area A41 of the icon IC1.
  • the output unit 130e visually arranges the updated intermediate data in the area A41.
  • the output unit 130e acquires the result (intermediate data) of the partial analysis process (negative nucleus detection) from the intermediate data table 120e and visually arranges it in the area A43 of the icon IC3.
  • the output unit 130e visually arranges the updated intermediate data in the area A43.
  • the output unit 130e acquires the result (intermediate data) of the partial analysis process (calculation of the positive rate of the nucleus) from the intermediate data table 120e and visually arranges it in the area A44 of the icon IC4.
  • the output unit 130e visually arranges the updated intermediate data in the region A44.
  • the output unit 130e acquires the result (intermediate data) of the partial analysis process (calculation of the positive rate of the membrane) from the intermediate data table 120e and visually arranges it in the area A47 of the icon IC7.
  • the output unit 130e visually arranges the updated intermediate data in the area A47.
  • the output unit 130e may acquire the parameters set in the analysis module Mn from the parameter table 120d or from the reception unit 130c.
  • when changing the parameters used in the analysis module M1, the user operates the input unit 111 and presses the change button bu1. Similarly, when changing the parameters used in an analysis module Mn, the user presses the corresponding change button bun.
  • the analysis unit 130d executes each analysis module Mn using the updated parameter.
  • the analysis unit 130d may execute the partial analysis processes of all the analysis modules Mn, or may execute only the partial analysis processes of the analysis modules located downstream of the analysis module whose parameter was changed.
  • new intermediate data is generated and the intermediate data in the intermediate data table 120e is updated.
  • the output unit 130e updates the analysis result screen 40 every time the intermediate data is updated.
  • the reception unit 130c generates a summary parameter based on the summary parameter conversion table 120f. For example, the reception unit 130c calculates the summary parameters of the first to third parameters of the parameter sets Set1 to Set3, and displays the summary parameters of the parameter sets Set1 to Set3 on the display unit 112 in a visually recognizable state.
  • the user operates the input unit 111 to select one of the parameter sets, and the reception unit 130c receives the summary parameters corresponding to the selected parameter set. After selecting any of the parameter sets, the user may operate the input unit 111 to directly change the value of the summary parameter.
  • when the reception unit 130c receives the selection of the summary parameter by the above process, the reception unit 130c associates the summary parameter with the module ID and outputs them to the analysis unit 130d.
  • the reception unit 130c may convert the summary parameter into a visible image and display it on the icon ICn of the analysis result screen 40, and accept the change of the summary parameter.
  • FIG. 20 is a diagram for explaining a process of changing the summary parameter.
  • the icon IC2 will be used as an example.
  • the icon IC2 includes a summary parameter change area A52.
  • the reception unit 130c converts each summary parameter serving as a selection candidate into an image, and arranges the converted images in the summary parameter change area A52. For example, the reception unit 130c arranges the luminance images corresponding to the summary parameters in the summary parameter change area A52.
  • each image and each summary parameter are associated with each other, and when any image is selected by the user, the reception unit 130c acquires the summary parameter corresponding to the selected image.
  • FIG. 21 is a flowchart showing an analysis processing procedure according to the first embodiment.
  • the analysis unit 130d of the analysis device 100 acquires the analysis module configuration information 120c (step S101).
  • the analysis target designation unit 130b of the analysis device 100 acquires the pathology image from the pathology image DB 120a and displays it on the display unit 112 (step S102).
  • the analysis target designation unit 130b accepts the annotation designation from the user who operates the input unit 111 (step S103).
  • the analysis unit 130d of the analysis device 100 loads the analysis module from the analysis module table 120b (step S104).
  • the analysis unit 130d acquires the parameters for the loaded analysis module (step S105).
  • the analysis unit 130d executes the partial analysis process of the analysis module (step S106).
  • the analysis unit 130d stores the processing result of the partial analysis process in the intermediate data table 120e (step S107).
  • the output unit 130e generates an analysis result screen based on the intermediate data of the intermediate data table 120e and displays it on the display unit 112 (step S108). If the analysis unit 130d does not have a next analysis module (step S109, No), the analysis unit 130d proceeds to step S111. On the other hand, if there is a next analysis module (step S109, Yes), the analysis unit 130d loads the next analysis module (step S110) and proceeds to step S105.
  • the reception unit 130c determines whether or not a parameter change has been accepted (step S111). If the reception unit 130c has not received a parameter change (step S111, No), the reception unit 130c ends the analysis process. On the other hand, when the reception unit 130c accepts a parameter change (step S111, Yes), the analysis unit 130d loads the analysis module related to the changed parameter (step S112) and proceeds to step S105.
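  • Step S112 reloads only the analysis modules affected by a changed parameter. A minimal sketch of that selection, assuming the M1-to-M4 portion of the connection graph (the traversal is a plain reachability search, not code from the disclosure):

```python
# Hedged sketch of step S112: when a parameter of one module changes, only
# that module and the modules downstream of it need to be re-executed.
# The dependency graph here is illustrative.
from collections import deque

upstream = {"M1": [], "M2": ["M1"], "M3": ["M1"], "M4": ["M2", "M3"]}

def downstream_of(changed, upstream):
    """Return the changed module plus every module reachable downstream of it."""
    children = {m: [] for m in upstream}
    for m, ups in upstream.items():
        for u in ups:
            children[u].append(m)
    to_rerun, queue = {changed}, deque([changed])
    while queue:
        for child in children[queue.popleft()]:
            if child not in to_rerun:
                to_rerun.add(child)
                queue.append(child)
    return to_rerun

print(sorted(downstream_of("M2", upstream)))  # ['M2', 'M4']
```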
  • the analysis unit 130d executes the analysis module M1 and uses two vectors (a first vector and a second vector) to separate the annotation into a staining region for detecting the nucleus and a staining region for detecting the membrane. For example, when the color of a region is close to the first vector, the analysis unit 130d sets such a region as a staining region for detecting the nucleus. When the color of a region is close to the second vector, the analysis unit 130d sets such a region as a staining region for detecting the membrane.
  • the circularity, irregularity, roundness, etc. of the nucleus are specified as parameters.
  • the analysis modules M2 and M3 execute a partial analysis process for detecting cell nuclei from the annotation using at least one of the circularity, irregularity, roundness, and the like of the nucleus. Further, the analysis modules M2 and M3 (and other analysis modules Mn) may use a score regarding the CR ratio of cells and the degree of staining as parameters.
  • a certain analysis module Mn may execute a process of detecting, as the cell corresponding to the annotation, the closest cell whose distance between the annotation and the cell nucleus is less than the threshold value.
  • the threshold value used here is used as a parameter.
  • FIG. 23 is a diagram for explaining an example of parameters. In the example shown in FIG. 23, the nuclei n1, n2, and n3 exist in the vicinity of the annotation A21, and the nucleus n1, which is at the shortest distance from the annotation A21 and whose distance is less than the threshold value (parameter), is selected as the nucleus of the cell corresponding to the annotation A21.
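  • The selection in FIG. 23 can be sketched as a nearest-neighbor search with a distance cutoff. The coordinates below are hypothetical; only the rule (shortest distance, and below the threshold parameter) comes from the description.

```python
# Hedged sketch of the parameter use in FIG. 23: among nuclei n1..n3 near an
# annotation, pick the nucleus whose distance to the annotation is shortest
# and below the threshold (the parameter). Coordinates are made up.
import math

def nearest_nucleus(annotation_xy, nuclei, threshold):
    best_id, best_d = None, threshold
    for nucleus_id, (x, y) in nuclei.items():
        d = math.dist(annotation_xy, (x, y))
        if d < best_d:              # strictly less than the threshold
            best_id, best_d = nucleus_id, d
    return best_id                  # None if no nucleus is close enough

nuclei = {"n1": (1.0, 1.0), "n2": (4.0, 0.0), "n3": (0.0, 5.0)}
print(nearest_nucleus((0.0, 0.0), nuclei, threshold=3.0))  # n1 (distance ≈ 1.41)
```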
  • a histogram is generated based on the distances between the first cell group and the second cell group and their frequencies.
  • FIG. 24 is a diagram showing an example of a histogram generated by the analysis module.
  • the vertical axis of the histogram h1 corresponds to the frequency, and the horizontal axis corresponds to the distance.
  • the output unit 130e may visually display the histogram h1 as intermediate data in association with the icon of the analysis module.
  • the reception unit 130c receives the parameters used in the analysis processing of the pathological image, and the output unit 130e outputs information on the progress of the analysis processing executed based on the parameters in a visually recognizable state. As a result, the user can confirm the influence of the parameters on the analysis result.
  • the output unit 130e outputs, in a visually recognizable state, a summary parameter in which a plurality of parameters are combined and information on the progress of the analysis process executed based on the summary parameter, in association with each other. As a result, the user can confirm the influence of the summary parameter on the analysis result. In addition, the user can roughly grasp a plurality of parameters with one summary parameter.
  • the analysis unit 130d reads out each analysis module stored in the analysis module table 120b in order and executes the partial analysis processes, and the output unit 130e outputs the results of the partial analysis processes in a visually recognizable state. As a result, the user can confirm the result of each partial analysis process affected by the parameter change.
  • the partial analysis processes executed by the analysis device 100 include a staining separation process, and the parameters of the staining separation process include vectors in RGB space for performing the separation.
  • the analysis unit 130d can separate the annotation region into a staining region for detecting the nucleus and a staining region for detecting the membrane by using the vector parameters in RGB space.
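  • A minimal sketch of this vector-based separation, assuming cosine similarity as the closeness measure and illustrative RGB vectors (the description specifies neither the measure nor the vector values):

```python
# Hedged sketch of staining separation with two RGB-space vectors: a pixel
# is assigned to the nucleus-staining region or the membrane-staining region
# depending on which reference vector its color is closer to. The reference
# vectors and pixel values below are illustrative assumptions.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def classify_pixel(rgb, first_vector, second_vector):
    """Return 'nucleus' if rgb is closer to the first vector, else 'membrane'."""
    return "nucleus" if cosine(rgb, first_vector) >= cosine(rgb, second_vector) else "membrane"

first_vector = (60, 50, 160)   # assumed bluish (hematoxylin-like) direction
second_vector = (150, 90, 60)  # assumed brownish (DAB-like) direction
print(classify_pixel((70, 60, 150), first_vector, second_vector))   # nucleus
print(classify_pixel((140, 100, 70), first_vector, second_vector))  # membrane
```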
  • the plurality of pathological images may be any combination of images.
  • examples include “pathological images stained with different reagents”, “bright-field images and fluorescent images”, “general-stained images and special-stained images”, and “pathological images and medical images other than pathological images”.
  • FIG. 25 is a diagram showing an example of the analysis device according to the second embodiment.
  • the analysis device 200 includes a communication unit 210, an input unit 211, a display unit 212, a storage unit 220, and a control unit 230.
  • the communication unit 210 is realized by, for example, a NIC or the like.
  • the communication unit 210 is connected to a network (not shown) by wire or wirelessly, and transmits / receives information to / from the pathology system 10 or the like via the network.
  • the control unit 230 which will be described later, transmits / receives information to / from these devices via the communication unit 210.
  • the input unit 211 is an input device that inputs various information to the analysis device 200.
  • the input unit 211 corresponds to a keyboard, mouse, touch panel, and the like.
  • the display unit 212 is a display device that displays information output from the control unit 230.
  • the display unit 212 corresponds to a liquid crystal display, an organic EL display, a touch panel, and the like.
  • the storage unit 220 has a pathological image DB 220a, an analysis module table 220b, an analysis module configuration information 220c, a parameter table 220d, an intermediate data table 220e, and an outline parameter conversion table 220f.
  • the storage unit 220 is realized by, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or an optical disk.
  • the analysis module table 220b is a table that holds a plurality of analysis modules that execute analysis processing for pathological images.
  • the description of the analysis module table 220b is the same as the description of the analysis module table 120b described in the first embodiment.
  • the analysis module configuration information 220c is information indicating the connection relationship of the analysis module.
  • the description of the analysis module configuration information 220c is the same as the description of the analysis module configuration information 120c described in the first embodiment.
  • the intermediate data table 220e is a table that stores the processing results of each analysis module.
  • the description of the intermediate data table 220e is the same as the description of the intermediate data table 120e described in the first embodiment.
  • the control unit 230 has an acquisition unit 230a, an analysis target designation unit 230b, a reception unit 230c, an analysis unit 230d, and an output unit 230e.
  • the control unit 230 is realized by, for example, executing a program (an example of an analysis program) stored in the analysis device 200 by a CPU or an MPU with a RAM or the like as a work area. Further, the control unit 230 may be executed by an integrated circuit such as an ASIC or FPGA.
  • the analysis target designation unit 230b is a processing unit that designates annotations to be analyzed for a plurality of pathological images based on the user's operation.
  • FIG. 26 is a diagram for explaining the processing of the analysis target designation unit 230b according to the second embodiment.
  • the analysis target designation unit 230b acquires the information of the pathological images I1 and I2 from the pathological image DB 220a and displays them on the display unit 212.
  • the analysis target designation unit 230b outputs the pathological image and annotation information to the analysis unit 230d.
  • the reception unit 230c is a processing unit that receives parameters used in the pathological image analysis processing.
  • the reception unit 230c may receive changes in the parameters used by each analysis module in cooperation with the output unit 230e described later.
  • the analysis unit 230d distinguishes among the intermediate data of the analysis process (plural partial analysis processes) executed for the annotation RI21, the intermediate data of the analysis process (plural partial analysis processes) executed for the annotation RI22, and the intermediate data of the analysis process (plural partial analysis processes) executed for the annotation RI23, and registers them in the intermediate data table 220e.
  • the analysis unit 230d executes each analysis module Mn using the updated parameter.
  • the analysis unit 230d may execute the partial analysis processes of all the analysis modules Mn, or may execute only the partial analysis processes of the analysis modules located downstream of the specific analysis module.
  • FIG. 27 is a diagram showing an example of the analysis result screen 50 according to the second embodiment.
  • icons IC1-1 to IC8-1, IC1-2 to IC8-2, and IC1-3 to IC8-3 are arranged on the analysis result screen 50.
  • Icons IC1-1 to IC8-1, IC1-2 to IC8-2, and IC1-3 to IC8-3 are icons corresponding to the analysis modules M1 to M8, respectively.
  • the output unit 230e identifies the connection relationship between the analysis modules M1 to M8 based on the analysis module configuration information 220c, and arranges and connects the icons IC1-1 to IC8-1, IC1-2 to IC8-2, and IC1-3 to IC8-3 based on the connection relationship.
  • by pressing the change button of the icon IC1-1, the user changes the parameters used in the analysis module M1 that executes the staining separation for the annotation RI21.
  • by pressing the change button of the icon IC1-2, the user changes the parameters used in the analysis module M1 that executes the staining separation for the annotation RI22. By pressing the change button of the icon IC1-3, the user changes the parameters used in the analysis module M1 that executes the staining separation for the annotation RI23.
  • the process of pressing the change button to change the parameter is the same as that of the first embodiment.
  • the user can specify independent parameters for the analysis module that executes the same partial analysis process. For example, the parameters specified individually are distinguished from each other by using the identification information of the icon or the like.
  • the user may operate the input unit 211 to collectively specify the parameters.
  • the reception unit 230c may accept the parameters specified for the icons IC1-1 to IC8-1 as they are as the parameters of the icons IC1-2 to IC8-2 and IC1-3 to IC8-3.
  • when the reception unit 230c receives the parameter change by the above process, the reception unit 230c associates the changed parameter with the identification information of the icon whose parameter has been changed and outputs them to the analysis unit 230d.
  • when the analysis unit 230d receives the update of the parameter from the reception unit 230c, the analysis unit 230d executes each analysis module Mn using the updated parameter.
  • the analysis unit 230d loads the analysis module M1 and calculates intermediate data by executing the staining separation using the parameters specified by the icon IC1-1, the staining separation using the parameters specified by the icon IC1-2, and the staining separation using the parameters specified by the icon IC1-3, respectively. The same applies to the other analysis modules Mn.
  • FIG. 28 is a diagram showing an example of an HE-stained image and an IHC-stained image.
  • the analysis target designation unit 230b acquires the information of the HE-stained image I1a, the IHC-stained image I1b, and the IHC-stained image I1c from the pathological image DB 220a and displays them on the display unit 212.
  • the analysis target designation unit 230b outputs the information of the HE-stained image I1a, the IHC-stained image I1b, the IHC-stained image I1c, and the annotations RI21a, RI21b, and RI21c to the analysis unit 230d.
  • the analysis unit 230d receives the annotation RI21a as an input, detects the region of the nucleus being stained, and calculates the shape feature amount (area, perimeter, major axis length, minor axis length, circularity, etc.). The analysis unit 230d calculates a "nuclear shape score" indicating the degree of nuclear atypia from the shape feature amount. For example, the analysis unit 230d executes HE staining separation, nuclear region detection, nuclear shape feature amount calculation, and nuclear shape score calculation as partial analysis processing.
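  • Among the shape feature amounts listed above, circularity has a standard definition, 4πA/P², which equals 1.0 for a perfect circle and decreases for elongated (atypical) nuclei. A sketch with illustrative measurements (the concrete values are assumptions, not from the text):

```python
# Circularity as commonly defined for shape analysis: 4*pi*area / perimeter^2.
# A perfect circle scores 1.0; irregular or elongated nuclei score lower.
import math

def circularity(area, perimeter):
    return 4.0 * math.pi * area / (perimeter ** 2)

# A circle of radius 10: area = pi*100, perimeter = 2*pi*10.
print(round(circularity(math.pi * 100.0, 2.0 * math.pi * 10.0), 3))  # 1.0
# An elongated (atypical) nucleus with the same rough size scores lower.
print(round(circularity(80.0, 60.0), 3))
```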
  • the analysis unit 230d calculates the total score from the nuclear shape score, the nuclear staining score, and the membrane staining score. For example, the analysis unit 230d executes total score calculation as a partial analysis process.
  • each analysis module Mn corresponding to each partial analysis process is registered in the analysis module table 220b. Further, it is assumed that the connection-related information of each analysis module is registered in the analysis module configuration information 220c.
  • the analysis unit 230d distinguishes among the intermediate data of the analysis process (plural partial analysis processes) executed for the annotation RI21a, the intermediate data of the analysis process (plural partial analysis processes) executed for the annotation RI21b, and the intermediate data of the analysis process (plural partial analysis processes) executed for the annotation RI21c, and registers them in the intermediate data table 220e.
  • the analysis unit 230d uses the parameter received by the reception unit 230c as a parameter for the analysis module Mn. When the analysis unit 230d does not accept the parameter, the analysis unit 230d uses the initial value of the parameter set in advance for the analysis module Mn.
  • the analysis unit 230d executes each analysis module Mn using the updated parameter.
  • the analysis unit 230d may execute the partial analysis processes of all the analysis modules Mn, or may execute only the partial analysis processes of the analysis modules located downstream of the specific analysis module.
  • FIG. 30 is a diagram showing an example of an HE-stained image and a fluorescence IHC-stained image.
  • the analysis target designation unit 230b acquires the information of the HE-stained image I1a and the fluorescence IHC-stained image I1d from the pathological image DB 220a and displays them on the display unit 212.
  • the user operates the input unit 211 while referring to the display unit 212 to select the annotation RI21a.
  • when the analysis target designation unit 230b receives the selection of the annotation RI21a, the analysis target designation unit 230b corrects the position/rotation deviation of the observation target A10d with respect to the observation target A10a and sets the annotation RI21d.
  • the user may operate the input unit 211 to specify the annotation RI21d.
  • the analysis target designation unit 230b outputs the information of the HE stained image I1a, the fluorescent IHC stained image I1d, and the annotations RI21a and RI21d to the analysis unit 230d.
  • the analysis unit 230d receives the annotation RI21a as an input, detects the region of the nucleus being stained, and calculates the shape feature amount (area, perimeter, major axis length, minor axis length, circularity, etc.).
  • the analysis unit 230d performs a gating process (for example, a process of selecting only the cell population having a feature amount of interest in a sample and creating a histogram of only those cells) based on the shape feature amount, and extracts cells having a predetermined feature as tumor cells.
  • the analysis unit 230d executes HE stain separation, nuclear region detection, nuclear shape feature calculation, and tumor cell extraction as partial analysis processes.
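  • The gating step can be sketched as filtering cells by a feature-amount range (a "gate"). The feature values and gate bounds below are illustrative assumptions, not values from the disclosure.

```python
# Hedged sketch of gating: keep only the cell population whose feature
# amount falls inside a gate (a range of interest), then treat the gated
# cells as tumor-cell candidates.
cells = [
    {"id": 1, "area": 55.0}, {"id": 2, "area": 120.0},
    {"id": 3, "area": 98.0}, {"id": 4, "area": 30.0},
]

def gate(cells, feature, lo, hi):
    """Keep only cells whose feature value lies within [lo, hi]."""
    return [c for c in cells if lo <= c[feature] <= hi]

tumor_candidates = gate(cells, "area", 90.0, 130.0)
print([c["id"] for c in tumor_candidates])  # [2, 3]
```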
  • the analysis unit 230d receives the annotation RI21d as an input, performs fluorescence staining separation, and detects positive cells (T cells, B cells, etc.) for each staining. For example, the analysis unit 230d executes fluorescence staining separation, positive cell detection (fluorescence 1; T cells), and positive cell detection (fluorescence 2; B cells) as partial analysis processes.
  • the analysis unit 230d calculates the round-robin distances between the nuclei of the tumor cells detected from the HE-stained image and the cells detected from the fluorescence image, and creates an intercellular distance distribution. For example, the analysis unit 230d executes the calculation of the distance distribution between the first cells and the calculation of the distance distribution between the second cells as partial analysis processes.
  • the distance between the first cells indicates the distance between the nucleus of the tumor cell and the T cell.
  • the second cell-to-cell distance indicates the distance between the nucleus of the tumor cell and the B cell.
  • FIG. 31 is a diagram showing an example of the intercellular distance distribution.
  • the vertical axis corresponds to the frequency and the horizontal axis corresponds to the intercellular distance.
  • the curve 31a shows the distribution of the first cell-cell distance calculated by the calculation of the first-cell distance distribution.
  • the curve 31b shows the distribution of the second cell-to-cell distance calculated by the calculation of the second-cell distance distribution.
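  • The round-robin distance computation behind FIG. 31 can be sketched as follows: every tumor-nucleus/T-cell pair contributes one distance, and the distances are binned into a frequency histogram. The coordinates and bin width are illustrative assumptions.

```python
# Hedged sketch of the intercellular distance distribution: compute the
# distance for every (tumor nucleus, T cell) pair, then bin the distances
# into a frequency histogram like the one in FIG. 31.
import math
from collections import Counter

tumor_nuclei = [(0.0, 0.0), (10.0, 0.0)]
t_cells = [(3.0, 4.0), (10.0, 1.0)]

# Round-robin: one distance per pair.
distances = [math.dist(a, b) for a in tumor_nuclei for b in t_cells]

bin_width = 5.0
histogram = Counter(int(d // bin_width) for d in distances)
for bin_index in sorted(histogram):
    lo = bin_index * bin_width
    print(f"[{lo:.0f}, {lo + bin_width:.0f}): {histogram[bin_index]}")
```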
  • each analysis module Mn corresponding to each partial analysis process is registered in the analysis module table 220b. Further, it is assumed that the connection-related information of each analysis module is registered in the analysis module configuration information 220c.
  • the analysis unit 230d distinguishes between the intermediate data of the analysis process (plural partial analysis processes) executed for the annotation RI21a and the intermediate data of the analysis process (plural partial analysis processes) executed for the annotation RI21d, and registers them in the intermediate data table 220e.
  • the analysis unit 230d uses the parameter received by the reception unit 230c as a parameter for the analysis module Mn. When the analysis unit 230d does not accept the parameter, the analysis unit 230d uses the initial value of the parameter set in advance for the analysis module Mn.
  • the analysis unit 230d executes each analysis module Mn using the updated parameter.
  • the analysis unit 230d may execute the partial analysis processes of all the analysis modules Mn, or may execute only the partial analysis processes of the analysis modules located downstream of the specific analysis module.
  • the output unit 230e is a processing unit that outputs information on the progress of the analysis process executed based on the parameters in a visually recognizable state. For example, the output unit 230e generates an analysis result screen 70 as shown in FIG. 32 and displays it on the display unit 212.
  • FIG. 32 is a diagram showing an example of the analysis result screen 70 using the HE-stained image and the fluorescent IHC-stained image. As shown in FIG. 32, icons IC71-1 to IC74-1, IC71-2 to IC72-3, IC73, and IC74 are arranged on the analysis result screen 70.
  • the icon IC73 is set with an area for visually displaying the result (intermediate data) of the calculation of the distance between the first cells.
  • the icon IC74 is set with an area for visually displaying the result (intermediate data) of the calculation of the distance between the second cells.
  • the icons ICn-1, ICn-2, IC73, and IC74 are provided with change buttons for changing parameters, respectively.
  • the user can change the parameters by pressing the change button of the icon IC.
  • FIG. 33 is a diagram showing an example of an HE-stained image and a CT image.
  • the analysis target designation unit 230b acquires the information of the HE-stained image I1a from the pathological image DB 220a, acquires the information of the CT image G33 from a DB (not shown), and displays it on the display unit 212.
  • the HE-stained image I1a corresponds to a general-stained image.
  • the CT image G33 is an image other than the pathological image.
  • the observation target A10a is entirely extracted from the HE-stained image I1a.
  • the user operates the input unit 211 while referring to the display unit 212 to select the annotations RI21a and RI21e.
  • the analysis target designation unit 230b may automatically select the annotation RI21e corresponding to the annotation RI21a.
  • the analysis target designation unit 230b outputs the information of the HE stained image I1a, the CT image G33, and the annotations RI21a and RI21e to the analysis unit 230d.
  • the analysis unit 230d uses the annotation RI21a as an input to detect the stained nuclear region.
  • the analysis unit 230d classifies cell types using a model learned in advance by deep learning or the like.
  • the analysis unit 230d classifies the cell type into one of undifferentiated, poorly differentiated, and well-differentiated.
  • the analysis unit 230d classifies the cell type into one of stage 0, stage 1, stage 2, stage 3, and the like.
  • the analysis unit 230d executes HE staining separation, nuclear region detection, and cell type classification as a partial analysis process.
  • The analysis unit 230d receives the annotation RI21e as an input, detects the organ region, detects the tumor region from the inside of the organ region, and calculates the area of the tumor. For example, the analysis unit 230d executes organ region detection, tumor region detection, and tumor size calculation as partial analysis processes.
  • the analysis unit 230d calculates the malignancy score of the tumor from the cell type and the tumor area. For example, the analysis unit 230d executes a tumor malignancy score calculation as a partial analysis process.
  • each analysis module Mn corresponding to each partial analysis process is registered in the analysis module table 220b. Further, it is assumed that the connection-related information of each analysis module is registered in the analysis module configuration information 220c.
  • The analysis unit 230d distinguishes between the intermediate data of the analysis processes (plural partial analysis processes) executed for the annotation RI21a and the intermediate data of the analysis processes executed for the annotation RI21e, and registers them in the intermediate data table 220e.
  • the analysis unit 230d uses the parameter received by the reception unit 230c as a parameter for the analysis module Mn. When the analysis unit 230d does not accept the parameter, the analysis unit 230d uses the initial value of the parameter set in advance for the analysis module Mn.
  • the analysis unit 230d executes each analysis module Mn using the updated parameter.
  • The analysis unit 230d may execute the partial analysis processes of all the analysis modules Mn, or may execute only the partial analysis processes of the analysis modules located downstream of a specific analysis module.
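The selective re-execution of downstream modules described above can be sketched as follows; the pipeline contents and module names are illustrative assumptions, not taken from this disclosure.

```python
# Sketch: when one module's parameter changes, re-run only that module and
# the modules downstream of it. Module names and their order are
# hypothetical examples.

PIPELINE = ["stain_separation", "nucleus_detection", "cell_classification"]

def modules_to_rerun(changed_module, pipeline=PIPELINE):
    """Return the changed module plus everything downstream of it."""
    idx = pipeline.index(changed_module)
    return pipeline[idx:]
```

Changing a parameter of the first module re-runs the whole chain, while a change in a later module leaves the upstream intermediate data untouched.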
  • The output unit 230e is a processing unit that outputs, in a visually recognizable state, information on the progress of the analysis process executed based on the parameters. For example, the output unit 230e generates an analysis result screen 80 as shown in FIG. 34 and displays it on the display unit 112.
  • FIG. 34 is a diagram showing an example of the analysis result screen 80 using the HE-stained image and the CT image. As shown in FIG. 34, icons IC81-1 to IC83-1, IC81-2 to IC83-2, and IC84 are arranged on the analysis result screen 80.
  • Icons IC81-1 to IC83-1 are icons corresponding to the analysis modules of HE staining separation, nuclear region detection, and cell type classification.
  • Icons IC81-2 to IC83-2 are icons corresponding to the analysis modules of organ region detection, tumor region detection, and tumor size calculation.
  • the icon IC84 is an icon corresponding to the analysis module for calculating the tumor malignancy score.
  • the icon IC84 is associated with an area for visually displaying the result of tumor malignancy score calculation.
  • the icons ICn-1, ICn-2, and IC84 are provided with change buttons for changing parameters, respectively.
  • the user can change the parameters by pressing the change button of the icon IC.
  • FIG. 35 is a hardware configuration diagram showing an example of a computer 1000 that realizes the functions of the analysis device.
  • the computer 1000 includes a CPU 1100, a RAM 1200, a ROM (Read Only Memory) 1300, an HDD (Hard Disk Drive) 1400, a communication interface 1500, and an input / output interface 1600.
  • Each part of the computer 1000 is connected by a bus 1050.
  • the CPU 1100 operates based on the program stored in the ROM 1300 or the HDD 1400, and controls each part. For example, the CPU 1100 expands the program stored in the ROM 1300 or the HDD 1400 into the RAM 1200 and executes processing corresponding to various programs.
  • the ROM 1300 stores a boot program such as a BIOS (Basic Input Output System) executed by the CPU 1100 when the computer 1000 is started, a program that depends on the hardware of the computer 1000, and the like.
  • The HDD 1400 is a computer-readable recording medium that non-transitorily records a program executed by the CPU 1100 and data used by the program.
  • the HDD 1400 is a recording medium for recording an information processing program according to the present disclosure, which is an example of program data 1450.
  • the communication interface 1500 is an interface for the computer 1000 to connect to an external network 1550 (for example, the Internet).
  • the CPU 1100 receives data from another device or transmits data generated by the CPU 1100 to another device via the communication interface 1500.
  • the input / output interface 1600 is an interface for connecting the input / output device 1650 and the computer 1000.
  • the CPU 1100 receives data from an input device such as a keyboard or mouse via the input / output interface 1600. Further, the CPU 1100 transmits data to an output device such as a display, a speaker, or a printer via the input / output interface 1600. Further, the input / output interface 1600 may function as a media interface for reading a program or the like recorded on a predetermined recording medium (media).
  • the media is, for example, an optical recording medium such as a DVD (Digital Versatile Disc) or PD (Phase change rewritable Disk), a magneto-optical recording medium such as an MO (Magneto-Optical disk), a tape medium, a magnetic recording medium, or a semiconductor memory.
  • The CPU 1100 of the computer 1000 executes the analysis processing program loaded on the RAM 1200, thereby realizing the functions of the acquisition unit 130a, the analysis target designation unit 130b, the reception unit 130c, the analysis unit 130d, and the output unit 130e.
  • the HDD 1400 stores an analysis processing program or the like according to the present disclosure.
  • the CPU 1100 reads the program data 1450 from the HDD 1400 and executes the program, but as another example, these programs may be acquired from another device via the external network 1550.
  • the analysis device has a reception unit and an output unit.
  • the reception unit receives the parameters used in the first analysis process of the image related to the pathology.
  • The output unit outputs, in a visually recognizable state, the first result of the first analysis process executed based on the parameter and the second result of the second analysis process executed based on the first result.
  • The first result and the second result are image information. As a result, the user can confirm the influence of the parameter on the analysis result.
  • the reception unit further accepts the update of the parameter
  • Each time the parameter is updated, the output unit outputs, in a visually recognizable state, the first result and the second result executed based on the updated parameter. This allows the user to confirm the progress information of the analysis result caused by the parameter change.
  • the output unit outputs the first result, the second result, and the parameter in a visually recognizable state in association with each other. This allows the user to easily check the parameters used by the analysis module.
  • The output unit outputs, in a visually recognizable state, the first result and the second result executed based on a summary parameter in which a plurality of parameters are combined into one, in association with the summary parameter.
  • the user can confirm the influence of the analysis result by the summary parameter.
  • the user can roughly grasp a plurality of parameters with one summary parameter.
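A minimal sketch of how one summary parameter might fan out into several module parameters; the parameter names and linear mappings below are purely illustrative assumptions, not values from the disclosure.

```python
# Sketch: a single "summary" value in [0.0, 1.0] fanned out into several
# per-module parameters. Parameter names and mappings are hypothetical.

def expand_summary_parameter(summary):
    """Derive multiple fine-grained parameters from one coarse setting."""
    return {
        "circularity_min": 0.5 + 0.4 * summary,  # stricter nucleus shape filter
        "stain_threshold": 0.2 + 0.3 * summary,  # stronger stain cut-off
    }
```

The user adjusts one value, and every dependent module parameter is updated consistently before the analysis is re-run.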
  • The analysis process includes a plurality of continuous analysis processes including the first analysis process and the second analysis process, and the output unit outputs, in a visually recognizable state, the result of each of the plurality of analysis processes performed based on the parameters received by the reception unit. As a result, the user can confirm the progress of the process output for each partial analysis process.
  • The analysis unit executes the plurality of analysis processes in order based on the order definition information that defines the order of the plurality of analysis processes. Based on the order definition information, the analysis unit identifies the next analysis process following the previous analysis process among the plurality of analysis processes, and inputs the result of the previous analysis process to execute the next analysis process. As a result, the partial analysis processes of the analysis modules can be executed in the order according to the order definition information (analysis module configuration information), and the next analysis process can be executed using the result of the previous analysis process.
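The order-driven execution described above, where each module consumes the previous module's result, can be sketched as follows; the module list format is a hypothetical illustration.

```python
# Sketch: run partial analysis processes in the order given by
# order-definition information, feeding the previous result into the next
# module and keeping every intermediate result for inspection.

def run_pipeline(initial_input, ordered_modules, params):
    """Run each (name, func) module in order; each consumes the previous result."""
    result = initial_input
    intermediates = []  # intermediate data so each step can be displayed
    for name, func in ordered_modules:
        result = func(result, params.get(name, {}))
        intermediates.append((name, result))
    return result, intermediates
```

For example, with two toy modules `[("double", lambda x, p: x * 2), ("inc", lambda x, p: x + 1)]`, an input of 3 flows through as 6 and then 7.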
  • the output unit outputs the first result of the first analysis process executed on a plurality of different images and the second result of the second analysis process in a visually recognizable state.
  • the plurality of images are pathological images stained with different reagents.
  • the plurality of images include a bright field image and a fluorescence image.
  • the plurality of images include a general-stained image and a special-stained image.
  • the plurality of images include a pathological image and a medical image different from the pathological image.
  • the medical image is an X-ray image, an endoscopic image or a microscope image.
  • The plurality of analysis processes include a process for detecting a cell nucleus, and parameters of the process for detecting the cell nucleus include circularity, irregularity, and roundness. The analysis unit uses at least one of the circularity, the irregularity, and the roundness to perform an analysis process for detecting cell nuclei from the image. Thereby, the cell nucleus can be detected by executing the analysis process using parameters such as circularity, irregularity, and roundness.
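A minimal sketch of a circularity-based nucleus filter, using the standard measure 4·pi·area / perimeter² (1.0 for a circle); the contour format and the default threshold are illustrative assumptions.

```python
import math

# Sketch: filter candidate contours by circularity. A closed contour is
# given as a list of (x, y) points; the 0.7 threshold is hypothetical.

def circularity(contour):
    """Circularity 4*pi*area / perimeter**2 of a closed polygon."""
    area = 0.0
    perimeter = 0.0
    n = len(contour)
    for i in range(n):
        x1, y1 = contour[i]
        x2, y2 = contour[(i + 1) % n]
        area += x1 * y2 - x2 * y1       # shoelace formula
        perimeter += math.hypot(x2 - x1, y2 - y1)
    area = abs(area) / 2.0
    return 4.0 * math.pi * area / (perimeter ** 2)

def is_nucleus_candidate(contour, min_circularity=0.7):
    """Keep only contours round enough to be cell nuclei."""
    return circularity(contour) >= min_circularity
```

A unit square scores pi/4 (about 0.785) and passes the 0.7 threshold, while an elongated 10x1 rectangle scores roughly 0.26 and is rejected.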
  • The plurality of analysis processes include a process of separating stains, and a parameter of the stain separation process includes a vector of a color space for performing the separation. The analysis unit uses the color space vector to separate the region of the image into a stained region for detecting nuclei and a stained region for detecting membranes. As a result, the analysis process using the color space vector can be executed, and the stain separation can be performed.
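A hedged sketch of stain separation by projecting a pixel's optical density onto per-stain color vectors; the two vectors below are illustrative stand-ins, not calibrated staining coefficients from this disclosure.

```python
import math

# Sketch: assign a pixel to the "nucleus stain" or "membrane stain" class
# by projecting its optical-density RGB onto each stain's color-space
# vector. Both vectors are illustrative assumptions.

NUCLEUS_VEC = (0.65, 0.70, 0.29)   # assumed hematoxylin-like direction
MEMBRANE_VEC = (0.27, 0.57, 0.78)  # assumed membrane-stain direction

def _unit(vec):
    norm = math.sqrt(sum(c * c for c in vec))
    return tuple(c / norm for c in vec)

def classify_pixel(od_rgb):
    """Assign a pixel's optical density to the closer stain direction."""
    scores = {}
    for label, vec in (("nucleus", NUCLEUS_VEC), ("membrane", MEMBRANE_VEC)):
        unit = _unit(vec)
        scores[label] = sum(a * b for a, b in zip(od_rgb, unit))
    return max(scores, key=scores.get)
```

Running this per pixel yields the two stained regions described above; a full implementation would instead solve the deconvolution for both stain contributions at once.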
  • the plurality of analysis processes include a process of detecting a plurality of cells and classifying them into a first cell group and a second cell group, and the parameters of the classifying process include a method of determining cells to be compared.
  • the analysis unit calculates the distance between the first cell group and the second cell group.
  • the analysis unit further executes a process of generating a histogram based on the relationship between the distance and the frequency between the first cell group and the second cell group. This makes it possible to easily confirm the relationship between the distance and frequency between the first cell group and the second cell group.
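The distance-versus-frequency histogram between the two cell groups could be computed along these lines; the cell coordinates and the bin width are illustrative assumptions.

```python
import math
from collections import Counter

# Sketch: histogram of pairwise distances between a first and a second
# cell group, binned at a fixed (hypothetical) width.

def distance_histogram(first_group, second_group, bin_width=10.0):
    """Count cell pairs per distance bin (bin index = distance // bin_width)."""
    hist = Counter()
    for ax, ay in first_group:
        for bx, by in second_group:
            d = math.hypot(bx - ax, by - ay)
            hist[int(d // bin_width)] += 1
    return dict(hist)
```

The resulting bin-to-count mapping is what the relationship between distance and frequency would be plotted from.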
  • The plurality of analysis processes include a process of detecting a plurality of cell nuclei. The analysis unit compares the distance between each cell nucleus and a predetermined region, and selects a cell nucleus whose distance is less than a threshold value as the cell nucleus corresponding to the predetermined region. This makes it possible to select the optimal cell nucleus.
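A minimal sketch of the threshold-based selection, representing the predetermined region by its centre point for simplicity (an assumed simplification of the region comparison).

```python
import math

# Sketch: keep only the nuclei whose distance to the region centre is
# below a threshold. Representing the region by one point is an assumption.

def select_nuclei(nuclei, region_center, threshold):
    """Keep nuclei whose distance to the region centre is below the threshold."""
    cx, cy = region_center
    return [(x, y) for x, y in nuclei
            if math.hypot(x - cx, y - cy) < threshold]
```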
  • In a diagnostic support system including a medical image acquisition device and software used for processing a medical image corresponding to an object imaged by the medical image acquisition device, the software (analysis program) accepts the parameters used in the first analysis process of the image related to pathology, and outputs, in a visually recognizable state, the first result of the first analysis process executed based on the parameters and the second result of the second analysis process executed based on the first result.
  • the present technology can also have the following configurations.
  • a reception section that accepts parameters used in the first analysis process of images related to pathology,
  • An output unit that visually outputs the first result of the first analysis process executed based on the parameters and the second result of the second analysis process executed based on the first result.
  • An analysis device comprising the reception section and the output unit.
  • the analysis device according to (1) above, wherein the first result and the second result are image information.
  • the reception unit further accepts the update of the parameter,
  • the output unit is characterized in that each time the parameter is updated, the first result executed based on the updated parameter and the second result are output in a visually recognizable state.
  • the analyzer according to (1) or (2) above.
  • The analysis device according to (1), (2), or (3) above, wherein the output unit outputs the first result, the second result, and the parameter in a visually recognizable state in association with each other.
  • the output unit is in a visually recognizable state in which the first result and the second result, which are executed based on the summary parameter in which a plurality of parameters are combined into one, are associated with the summary parameter.
  • The analysis apparatus according to any one of (1) to (5) above, wherein the analysis process includes a plurality of continuous analysis processes including the first analysis process and the second analysis process, and the output unit outputs, in a visually recognizable state, the result of each of the plurality of analysis processes performed based on the parameters received by the reception unit.
  • the analysis apparatus further comprising an analysis unit that sequentially executes the plurality of analysis processes based on the order definition information that defines the order of the plurality of analysis processes.
  • The analysis apparatus according to (7) above, wherein, based on the order definition information, the analysis unit identifies the next analysis process following the previous analysis process among the plurality of analysis processes, and inputs the result of the previous analysis process to execute the next analysis process.
  • the output unit is characterized in that the first result of the first analysis process executed on a plurality of different images and the second result of the second analysis process are output in a visually recognizable state.
  • the analyzer according to (9) above, wherein the plurality of images are pathological images stained with different reagents.
  • the analysis apparatus according to (9) above, wherein the plurality of images include a bright-field image and a fluorescence image.
  • the plurality of images include a general-stained image and a special-stained image.
  • the analysis device according to (9) above, wherein the plurality of images include a pathological image and a medical image different from the pathological image.
  • The plurality of analysis processes include a process for detecting a cell nucleus, parameters of the process for detecting the cell nucleus include circularity, irregularity, and roundness, and the analysis unit uses at least one of the circularity, the irregularity, and the roundness to detect cell nuclei from the image.
  • The analyzer according to (7) above, wherein the plurality of analysis processes include a process of separating stains, a parameter of the stain separation process includes a vector of a color space for performing the separation, and the analysis unit uses the color space vector to separate the image region into a stained region for detecting a nucleus and a stained region for detecting a membrane.
  • the plurality of analysis processes include a process of detecting a plurality of cells and classifying them into a first cell group and a second cell group, and the parameters of the classifying process include a method of determining cells to be compared.
  • the analysis apparatus according to (7) above, wherein the analysis unit calculates the distance between the first cell group and the second cell group.
  • The analysis apparatus according to (7) above, wherein the plurality of analysis processes include a process of detecting a plurality of cell nuclei, and the analysis unit compares the distance between each cell nucleus and a predetermined region and selects a cell nucleus whose distance is less than a threshold value as the cell nucleus corresponding to the predetermined region.
  • The analyzer according to (7) above, wherein the analysis unit executes a process of detecting a region of the cell nucleus, a process of detecting a feature amount of the shape of the cell nucleus, and a gating process based on the feature amount of the cell nucleus, and further executes a process of extracting tumor cells.
  • An analysis method in which a computer accepts the parameters used in the first analysis process of an image related to pathology, and executes a process of outputting, in a visually recognizable state, the first result of the first analysis process executed based on the parameters and the second result of the second analysis process executed based on the first result.
  • An analysis program that causes a computer to function as: a reception unit that accepts the parameters used in the first analysis process of an image related to pathology; and an output unit that outputs, in a visually recognizable state, the first result of the first analysis process executed based on the parameters and the second result of the second analysis process executed based on the first result.
  • A diagnostic support system in which the software accepts the parameters used in the first analysis process of an image related to pathology, and causes an analysis device to execute a process of outputting, in a visually recognizable state, the first result of the first analysis process executed based on the parameters and the second result of the second analysis process executed based on the first result.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Chemical & Material Sciences (AREA)
  • Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Immunology (AREA)
  • Biochemistry (AREA)
  • Analytical Chemistry (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Hematology (AREA)
  • Urology & Nephrology (AREA)
  • Food Science & Technology (AREA)
  • Medicinal Chemistry (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Investigating Or Analysing Biological Materials (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Endoscopes (AREA)
  • Image Analysis (AREA)

Abstract

Provided are an analysis device, an analysis method, an analysis program, and a diagnosis support system that make it possible to check the effects of parameters on analysis results. An analysis device 100 comprises a reception unit 130c that receives parameters to be used in a first analysis process of an image relating to pathology, and an output unit 130e that visually outputs first results from the first analysis process executed on the basis of the parameters and second results from a second analysis process executed on the basis of the first results.
PCT/JP2021/002442 2020-02-04 2021-01-25 Dispositif d'analyse, procédé d'analyse, programme d'analyse et système d'aide au diagnostic WO2021157405A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020016863A JP2021124861A (ja) 2020-02-04 2020-02-04 解析装置、解析方法、解析プログラム及び診断支援システム
JP2020-016863 2020-02-04

Publications (1)

Publication Number Publication Date
WO2021157405A1 true WO2021157405A1 (fr) 2021-08-12

Family

ID=77200483

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/002442 WO2021157405A1 (fr) 2020-02-04 2021-01-25 Dispositif d'analyse, procédé d'analyse, programme d'analyse et système d'aide au diagnostic

Country Status (2)

Country Link
JP (1) JP2021124861A (fr)
WO (1) WO2021157405A1 (fr)


Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003325458A (ja) * 2002-05-14 2003-11-18 Fuji Photo Film Co Ltd 疾患候補情報出力システム
JP2004195213A (ja) * 2002-11-27 2004-07-15 Canon Inc 放射線画像のモデルベース解釈の初期化方法
JP2010013123A (ja) * 2008-07-01 2010-01-21 Rengo Co Ltd 包装箱
JP2013506129A (ja) * 2009-09-29 2013-02-21 ゼネラル・エレクトリック・カンパニイ 蛍光画像を用いて明視野画像を生成するためのシステム及び方法
US20130108139A1 (en) * 2011-10-26 2013-05-02 Definiens Ag Biomarker Evaluation Through Image Analysis
WO2013146841A1 (fr) * 2012-03-30 2013-10-03 コニカミノルタ株式会社 Processeur d'image médical et programme
JP2015043928A (ja) * 2013-08-29 2015-03-12 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー 超音波診断装置及びその制御プログラム
WO2015093518A1 (fr) * 2013-12-18 2015-06-25 コニカミノルタ株式会社 Dispositif de traitement d'image, système de prise en charge de diagnostic pathologique, programme de traitement d'image, et procédé de traitement d'image
JP2015526771A (ja) * 2012-04-30 2015-09-10 ゼネラル・エレクトリック・カンパニイ 生物組織に共局在するバイオマーカーを解析するためのシステム及び方法
WO2015145644A1 (fr) * 2014-03-27 2015-10-01 コニカミノルタ株式会社 Dispositif de traitement d'image et programme de traitement d'image
WO2016042963A1 (fr) * 2014-09-19 2016-03-24 コニカミノルタ株式会社 Dispositif de traitement d'image, procédé de traitement d'image et programme
JP2016511845A (ja) * 2012-12-06 2016-04-21 クラリエント ダイアグノスティック サービシーズ, インコーポレイテッド 生体試料の分割画面表示及びその記録を取り込むためのシステム及び方法
JP2016511846A (ja) * 2012-12-06 2016-04-21 クラリエント ダイアグノスティック サービシーズ, インコーポレイテッド バイオマーカー発現の選択及び表示
JP2016517115A (ja) * 2013-04-17 2016-06-09 ゼネラル・エレクトリック・カンパニイ 連続的に染色した組織における、1つの細胞の分割を使用する多重化バイオマーカー定量用のシステム及び方法
WO2016190125A1 (fr) * 2015-05-22 2016-12-01 コニカミノルタ株式会社 Dispositif ainsi que procédé de traitement d'image, et programme pour traitement d'image
JP2017511473A (ja) * 2014-04-03 2017-04-20 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. 画像を処理して解析するための検査装置
WO2017150194A1 (fr) * 2016-03-04 2017-09-08 コニカミノルタ株式会社 Dispositif de traitement d'image, procédé de traitement d'image et programme
WO2017170085A1 (fr) * 2016-03-28 2017-10-05 コニカミノルタ株式会社 Dispositif de diagnostic médical et programme de diagnostic médical
JP2018503906A (ja) * 2014-12-30 2018-02-08 ベンタナ メディカル システムズ, インコーポレイテッド イムノスコア計算における共発現解析のためのシステム及び方法
WO2018128091A1 (fr) * 2017-01-05 2018-07-12 コニカミノルタ株式会社 Programme d'analyse d'image et procédé d'analyse d'image
WO2018143406A1 (fr) * 2017-02-06 2018-08-09 コニカミノルタ株式会社 Dispositif et programme de traitement d'images
JP2019054896A (ja) * 2017-09-20 2019-04-11 株式会社日立製作所 医用画像処理装置及び医用画像処理方法及びそれに用いる処理プログラム
WO2019110567A1 (fr) * 2017-12-05 2019-06-13 Ventana Medical Systems, Inc. Procédé de calcul d'hétérogénéité spatiale et entre marqueurs de tumeur
WO2019110583A1 (fr) * 2017-12-07 2019-06-13 Ventana Medical Systems, Inc. Systèmes d'apprentissage profond et procédés permettant de classifier conjointement des cellules et des régions d'images biologiques
WO2019171909A1 (fr) * 2018-03-08 2019-09-12 コニカミノルタ株式会社 Procédé de traitement d'image, dispositif de traitement d'image et programme
JP2019530847A (ja) * 2016-06-10 2019-10-24 エフ・ホフマン−ラ・ロシュ・アクチェンゲゼルシャフト 明視野像シミュレーションのためのシステム
JP2019533805A (ja) * 2016-10-07 2019-11-21 ベンタナ メディカル システムズ, インコーポレイテッド 視覚化されたスライド全域画像分析を提供するためのデジタル病理学システムおよび関連するワークフロー
JP2019533847A (ja) * 2016-08-12 2019-11-21 ヴェリリー ライフ サイエンシズ エルエルシー 高度な病理診断


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114661207A (zh) * 2022-02-22 2022-06-24 中元汇吉生物技术股份有限公司 一种界面模块控制方法、装置、计算机设备及存储介质
CN114661207B (zh) * 2022-02-22 2023-09-26 中元汇吉生物技术股份有限公司 一种界面模块控制方法、装置、计算机设备及存储介质

Also Published As

Publication number Publication date
JP2021124861A (ja) 2021-08-30

Similar Documents

Publication Publication Date Title
JP6816196B2 (ja) 包括的なマルチアッセイ組織分析のためのシステムおよび方法
AU2014230824B2 (en) Tissue object-based machine learning system for automated scoring of digital whole slides
US20200320336A1 (en) Control method and recording medium
WO2014103664A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
WO2021230000A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et système de traitement d'informations
US11373422B2 (en) Evaluation assistance method, evaluation assistance system, and computer-readable medium
WO2021157405A1 (fr) Dispositif d'analyse, procédé d'analyse, programme d'analyse et système d'aide au diagnostic
JP2011103095A (ja) 医用画像表示システム及びプログラム
WO2022004337A1 (fr) Dispositif d'aide à l'évaluation, dispositif de traitement d'informations et procédé d'apprentissage
WO2020174863A1 (fr) Programme d'aide au diagnostic, système d'aide au diagnostic et procédé d'aide au diagnostic
WO2021261185A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image, programme de traitement d'image et système d'aide au diagnostic
WO2021220873A1 (fr) Dispositif, procédé, programme de génération et système d'aide au diagnostic
US20230215010A1 (en) Information processing apparatus, information processing method, program, and information processing system
WO2021220803A1 (fr) Procédé de commande d'affichage, dispositif de commande d'affichage, programme de commande d'affichage, et système d'aide au diagnostic
JP2019519794A (ja) サンプルの画像の中の、生体要素のうち少なくとも1つの関心ある要素の判別を支援するための方法及び電子デバイス、関連するコンピュータプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21750537

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21750537

Country of ref document: EP

Kind code of ref document: A1