WO2020110776A1 - Classification device, classification method and program, and classification result display device - Google Patents

Classification device, classification method and program, and classification result display device Download PDF

Info

Publication number
WO2020110776A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
class
classes
classification
unit
Prior art date
Application number
PCT/JP2019/044869
Other languages
French (fr)
Japanese (ja)
Inventor
Deepak Keshwani
Original Assignee
FUJIFILM Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FUJIFILM Corporation
Publication of WO2020110776A1 publication Critical patent/WO2020110776A1/en

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03Computed tomography [CT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Definitions

  • The present invention relates to a classification device, a classification method and program, and a classification result display device, and more particularly to a technique for classifying image regions.
  • Patent Literature 1 discloses a technique of classifying a cell of interest by determining the class of the cell of interest specified by a cell region detected from an image.
  • The present invention has been made in view of such circumstances, and an object thereof is to provide a classification device, a classification method and program, and a classification result display device that appropriately classify image regions into a plurality of classes.
  • One aspect of the classification device comprises: an image acquisition unit that acquires an image; a score calculation unit that calculates a score indicating the degree to which at least one region of the image belongs to each of a plurality of predetermined classes; a priority setting unit that sets a priority for each of the plurality of classes; and a classification unit that classifies the region into one of the classes whose calculated score exceeds a threshold, the classification being based on the priority.
  • According to this aspect, when the region is classified into one of the plurality of classes, it is assigned, among the classes whose calculated scores exceed the threshold, to the class selected according to the priority, so the image region can be appropriately classified into a plurality of classes.
  • Preferably, the classification unit classifies the region into the class with the relatively highest priority. This enables classification into an appropriate class.
  • Preferably, the image is a medical image of a patient, and each of the plurality of classes relates to a lesion of the patient. This allows the region of the medical image to be appropriately classified into lesion-related classes.
  • Preferably, a global information acquisition unit acquires global information about the image, and the priority setting unit sets the priority for each of the plurality of classes based on the global information. The priority can thereby be set appropriately.
  • The global information is preferably information on the disease name associated with the patient or on the age of the patient. The priority can thereby be set appropriately.
  • The priority is preferably determined based on the severity of the lesion. The priority can thereby be set appropriately.
  • The priority is preferably determined based on the temporal change of the lesion. The priority can thereby be set appropriately.
  • Another aspect of the classification device comprises: an image acquisition unit that acquires an image; a score calculation unit that calculates a score indicating the degree to which at least one region of the image belongs to each of a plurality of predetermined classes; a global information acquisition unit that acquires global information about the image; and a classification unit that classifies the region into one of the classes whose calculated score exceeds a threshold, the classification being based on the global information.
  • According to this aspect, when the region is classified into one of the plurality of classes, it is assigned, among the classes whose calculated scores exceed the threshold, to the class selected according to the global information, so the image region can be appropriately classified into a plurality of classes.
  • Preferably, the classification unit classifies the region into a class that matches the global information. This enables classification into an appropriate class.
  • Preferably, the image is a medical image of a patient, and each of the plurality of classes relates to a lesion of the patient. This allows the region of the medical image to be appropriately classified into lesion-related classes.
  • Preferably, the global information is information on the disease name associated with the patient, and a key finding storage unit stores the classes that match each disease name. The region can thereby be appropriately classified.
  • The medical image is preferably a CT (Computed Tomography) image of the lung. According to this aspect, regions of the CT image can be appropriately classified into classes related to a plurality of lesions.
  • The classes related to the patient's lesions preferably include at least one of honeycomb lung, ground glass shadow, reticular shadow, and linear shadow. The region of the CT image can thereby be classified into at least one of the honeycomb lung, ground glass shadow, reticular shadow, and linear shadow classes.
  • Preferably, the score calculation unit includes a plurality of feature amount calculation units that each calculate a feature amount corresponding to one of the plurality of classes, and calculates each score based on the feature amount corresponding to the class. The score can thereby be calculated appropriately.
  • Each of the plurality of feature amount calculation units is preferably configured as a convolutional neural network. The feature amount can thereby be calculated appropriately.
  • One aspect of a classification result display device comprises the above classification device, a display device, and a display control unit that displays, on the display device, the class into which the region is classified.
  • One aspect of the classification method for achieving the above object comprises: an image acquisition step of acquiring an image; a score calculation step of calculating a score indicating the degree to which at least one region of the image belongs to each of a plurality of predetermined classes; a priority setting step of setting a priority for each of the plurality of classes; and a classification step of classifying the region into one of the classes whose calculated score exceeds a threshold, based on the priority.
  • According to this aspect, the image region can be appropriately classified into a plurality of classes.
  • Another aspect of the classification method comprises: an image acquisition step of acquiring an image; a score calculation step of calculating a score indicating the degree to which at least one region of the image belongs to each of a plurality of predetermined classes; a global information acquisition step of acquiring global information about the image; and a classification step of classifying the region into one of the classes whose calculated score exceeds a threshold, based on the global information.
  • According to this aspect, the image region can be appropriately classified into a plurality of classes.
  • A program for causing a computer to execute either of the above classification methods is also included in these aspects.
  • According to these aspects, the image region can be appropriately classified into a plurality of classes.
  • FIG. 1 is a schematic configuration diagram of a medical information system.
  • FIG. 2 is a functional block diagram of the image processing apparatus according to the first embodiment.
  • FIG. 3 is a block diagram showing the hardware configuration of the image processing apparatus.
  • FIG. 4 is a flowchart showing the processing of the class classification method.
  • FIG. 5 is a diagram for explaining the process of the class classification method.
  • FIG. 6 is a diagram for explaining the processing of the class classification method.
  • FIG. 7 is a functional block diagram of the image processing apparatus according to the second embodiment.
  • FIG. 8 is a diagram showing an example of key findings stored in the key finding storage unit.
  • FIG. 9 is a flowchart showing the processing of the class classification method.
  • FIG. 1 is a schematic configuration diagram of a medical information system 10.
  • The medical information system 10 includes an image processing device 12, a modality 14, and an image database 16.
  • The image processing device 12, the modality 14, and the image database 16 are communicably connected via a network 18.
  • The image processing device 12 can be a computer provided in a medical institution.
  • A mouse 20 and a keyboard 22 are connected to the image processing apparatus 12 as input devices.
  • A display device 24 is connected to the image processing device 12 as an output device.
  • The modality 14 is an imaging device that images a part of a patient to be examined and generates a medical image.
  • Examples of the modality 14 include an X-ray imaging device, a CT device, an MRI device, a PET device, an ultrasonic device, and a CR device using a flat panel X-ray detector.
  • An endoscopic device may be applied as the modality 14.
  • CT is an abbreviation for Computed Tomography.
  • MRI is an abbreviation for Magnetic Resonance Imaging.
  • PET is an abbreviation for Positron Emission Tomography.
  • A flat panel X-ray detector is sometimes called an FPD (flat panel detector).
  • CR is an abbreviation for Computed Radiography.
  • DICOM is an abbreviation for Digital Imaging and COmmunications in Medicine.
  • The term “image” in this specification may include, in addition to the image itself such as a photograph, image data, which is a signal representing the image.
  • As the image database 16, a computer equipped with a large-capacity storage device can be applied.
  • The computer incorporates software that provides the functions of a database management system.
  • A database management system is sometimes called a DBMS (Data Base Management System).
  • A LAN (Local Area Network) can be applied to the network 18.
  • A WAN (Wide Area Network) may also be applied to the network 18.
  • The DICOM standard can be applied to the communication protocol of the network 18.
  • The network 18 may be configured to be connectable to a public line network or a dedicated line network.
  • The network 18 may be wired or wireless.
  • FIG. 2 is a functional block diagram of the image processing device 12 according to the first embodiment.
  • The image processing device 12 is a classification device that performs class classification (segmentation) for each region of a medical image.
  • An example of class classification is classifying lung tissue into classes relating to lesions such as bronchiectasis, honeycomb lung, ground glass shadow, reticular shadow, and linear shadow. In the present embodiment, each pixel of an input CT image is classified as corresponding to a ground glass shadow, a reticular shadow, or honeycomb lung.
  • Medical images classified into classes are used, for example, for volume calculation for each lesion.
  • The change in volume of each lesion is an indicator of the progression of lung diseases such as interstitial lung disease.
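  • As an illustration of this downstream use, the sketch below computes a lesion volume from a labeled volume. It is a minimal sketch only: the label codes and the assumption of a 3-D NumPy label array with known voxel spacing are ours, not the patent's.

```python
import numpy as np

def lesion_volume_mm3(labels: np.ndarray, code: int,
                      spacing_mm: tuple[float, float, float] = (1.0, 1.0, 1.0)) -> float:
    """Volume of one lesion class, given a (D, H, W) array of per-voxel
    class codes and the voxel spacing in millimetres (both assumptions)."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return float(np.count_nonzero(labels == code)) * voxel_mm3
```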
  • The image processing device 12 includes an image acquisition unit 40, a score calculation unit 42, a priority setting unit 50, a classification unit 52, a storage unit 54, an input control unit 60, and a display control unit 62.
  • The image acquisition unit 40, the score calculation unit 42, the priority setting unit 50, the classification unit 52, the storage unit 54, the input control unit 60, and the display control unit 62 are communicably connected via a bus 64.
  • The image acquisition unit 40 acquires a CT image to be processed.
  • The image processing apparatus 12 stores the acquired CT image in the storage unit 54.
  • In the example shown in FIG. 2, the image acquisition unit 40 acquires the CT image from the image database 16.
  • The image acquisition unit 40 may instead acquire the CT image from the modality 14 (see FIG. 1), which is the CT device, or from a storage device (not shown) via the network 18.
  • The medical image may also be acquired via an information storage medium (not shown).
  • The score calculation unit 42 calculates a score, an evaluation value indicating the degree to which at least one region of the CT image acquired by the image acquisition unit 40 belongs to each of a plurality of predetermined classes.
  • In the present embodiment, the score has a value between 0 and 1; the higher the degree of belonging to the class, the larger the score.
  • The score calculation unit 42 includes a plurality of feature amount calculation units that each calculate the feature amount corresponding to one class, and calculates the score for each class based on the corresponding feature amount.
  • As feature amount calculation units, the score calculation unit 42 has convolutional neural networks (CNNs), each designed and trained to recognize pixels belonging to a specific class in a medical image.
  • A convolutional neural network has a structure in which convolutional layers, which extract local features of the image by convolution with a plurality of filters, alternate repeatedly with pooling layers, which aggregate the extracted features for each rectangular region.
  • Specifically, the score calculation unit 42 has a ground glass shadow detection convolutional neural network 44 that calculates the ground glass shadow class score of each pixel, a reticular shadow detection convolutional neural network 46 that calculates the reticular shadow class score of each pixel, and a honeycomb lung detection convolutional neural network 48 that calculates the honeycomb lung class score of each pixel.
  • A convolutional neural network may also be provided for calculating scores for other classes such as linear shadows.
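  • A minimal sketch of one such per-class detection network is shown below, assuming PyTorch. The patent specifies only repeated convolution and pooling layers with a per-pixel score in [0, 1]; the layer counts, channel widths, and upsampling head here are illustrative assumptions, not the actual trained architecture.

```python
import torch
import torch.nn as nn

class LesionDetectionCNN(nn.Module):
    """One per-class detector; one instance each would play the role of
    networks 44, 46, and 48 (an assumption about the decomposition)."""

    def __init__(self, in_channels: int = 1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),  # local feature extraction
            nn.ReLU(),
            nn.MaxPool2d(2),  # aggregate features per rectangular region
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            # Restore the input resolution so that every pixel gets a score.
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 1, kernel_size=1),
            nn.Sigmoid(),  # per-pixel score in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x)).squeeze(1)  # (N, H, W) score map
```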
  • The priority setting unit 50 sets a priority for each of the plurality of classes to be classified.
  • In the present embodiment, the priority setting unit 50 sets the priorities of the ground glass shadow class, the reticular shadow class, and the honeycomb lung class based on the severity of the lesion corresponding to each class.
  • The classification unit 52 classifies each region of the CT image acquired by the image acquisition unit 40 into one of the plurality of classes. The classification unit 52 also generates a classification map in which regions classified into different classes of the CT image are rendered in different colors so that they can be distinguished. In the present embodiment, the classification unit 52 classifies each pixel of the CT image, among the classes whose score calculated by the score calculation unit 42 exceeds the threshold, into the class selected according to the priority set by the priority setting unit 50. In this embodiment, the threshold has a value between 0 and 1, and may differ for each class.
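  • The sketch below restates this rule for a single pixel. It is a minimal sketch, assuming dictionary-valued scores; the class names, priority values, and per-class thresholds are illustrative assumptions.

```python
# Higher number = higher priority; the ordering follows the severity
# ordering described below (honeycomb lung > reticular shadow > ground glass).
PRIORITY = {"honeycomb_lung": 3, "reticular_shadow": 2, "ground_glass_shadow": 1}
THRESHOLD = {"honeycomb_lung": 0.5, "reticular_shadow": 0.5, "ground_glass_shadow": 0.5}

def classify_pixel(scores: dict[str, float]) -> str | None:
    """Among classes whose score exceeds the threshold, pick the one with
    the highest priority; return None when no class qualifies."""
    candidates = [c for c, s in scores.items() if s > THRESHOLD[c]]
    if not candidates:
        return None  # not classified as any lesion
    return max(candidates, key=PRIORITY.__getitem__)

# A pixel exceeding the threshold for both reticular shadow and honeycomb
# lung is assigned to honeycomb lung, the higher-priority class.
print(classify_pixel({"ground_glass_shadow": 0.1,
                      "reticular_shadow": 0.8,
                      "honeycomb_lung": 0.6}))  # -> honeycomb_lung
```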
  • The storage unit 54 stores various data of the image processing device 12.
  • The storage unit 54 also stores the various programs executed by the image processing apparatus 12.
  • As the storage unit 54, a plurality of storage devices may be applied, or a single storage device partitioned into a plurality of storage areas may be applied.
  • The storage unit 54 may also be one or more storage devices provided outside the image processing apparatus 12.
  • The storage unit 54 includes a severity storage unit 56.
  • The severity storage unit 56 stores the severity of the lesion corresponding to each class. Severity is an index representing the relative degree of risk to the patient's life, and is set in advance based on prior medical knowledge and the like. In the present embodiment, the order of decreasing severity is honeycomb lung, reticular shadow, and then ground glass shadow.
  • The severity storage unit 56 may instead be provided in a medical database (not shown) accessed via the network 18. This facilitates updating the severity.
  • The input control unit 60 converts signals representing input information transmitted from the mouse 20 and the keyboard 22 into the signal format used by the image processing device 12.
  • The signals representing the input information are transmitted to the appropriate units of the device.
  • The display control unit 62 transmits a signal representing display information to the display device 24, which then displays the display information.
  • The display information is, for example, the class into which a region is classified, or the classification map generated by the classification unit 52.
  • The image processing device 12 and the display device 24 thus function as a classification result display device.
  • FIG. 3 is a block diagram showing the hardware configuration of the image processing apparatus 12.
  • The image processing device 12 realizes its various functions by executing programs using the hardware shown in FIG. 3.
  • The image processing device 12 includes a processor 100, a memory 102, a storage device 104, a network controller 106, and a power supply device 108.
  • The image processing device 12 also includes a display controller 110, an input/output interface 112, and an input controller 114.
  • The processor 100, the memory 102, the storage device 104, the network controller 106, the display controller 110, the input/output interface 112, and the input controller 114 are connected via the bus 64 so that data communication is possible.
  • The processor 100 functions as the overall control unit, the various calculation units, and the storage control unit of the image processing apparatus 12.
  • The processor 100 executes programs stored in a ROM (read-only memory) included in the memory 102.
  • The processor 100 may also execute programs downloaded from an external storage device via the network controller 106.
  • In that case, the external storage device is communicatively connected to the image processing device 12 via the network 18.
  • The processor 100 uses a RAM (random access memory) included in the memory 102 as a calculation area and executes various processes in cooperation with the various programs. The various functions of the image processing device 12 are thereby realized.
  • The processor 100 controls the reading of data from the storage device 104 and the writing of data to the storage device 104.
  • The processor 100 may acquire various data from an external storage device via the network controller 106.
  • The processor 100 can execute various kinds of processing, such as calculations, using the acquired data.
  • The processor 100 may include one or more devices. Examples of such devices include an FPGA (Field Programmable Gate Array) and a PLD (Programmable Logic Device). FPGAs and PLDs are devices whose circuit configurations can be changed after manufacture.
  • Another example of a device that the processor 100 may include is an ASIC (Application Specific Integrated Circuit).
  • An ASIC has a circuit configuration designed specifically to execute a particular process.
  • Two or more devices of the same type can be applied as the processor 100.
  • For example, the processor 100 may use two or more FPGAs, or two or more PLDs.
  • Two or more devices of different types may also be applied as the processor 100.
  • For example, the processor 100 may apply one or more FPGAs and one or more ASICs.
  • A plurality of processors 100 may also be configured using one device.
  • As one example, one processor is configured using a combination of one or more CPUs (Central Processing Units) and software, and this processor functions as the plurality of processors 100.
  • Software in this specification has the same meaning as a program.
  • A GPU (Graphics Processing Unit) may also be applied as the processor 100.
  • A computer is a typical example of configuring the plurality of processors 100 using one device.
  • Another example of configuring the plurality of processors 100 using a single device is a device that realizes the functions of an entire system, including the plurality of processors 100, on a single IC chip.
  • Such a device is called a SoC (System on Chip).
  • IC is an abbreviation for Integrated Circuit.
  • The hardware structure of the processor 100 is thus electrical circuitry in which circuit elements such as semiconductor elements are combined.
  • The processor 100 is configured using one or more of the various devices above as its hardware structure.
  • The memory 102 includes a ROM (not shown) and a RAM (not shown).
  • The ROM stores the various programs executed by the image processing device 12.
  • The ROM also stores parameters, files, and the like used for executing the various programs.
  • The RAM functions as a temporary data storage area, a work area of the processor 100, and the like.
  • The storage device 104 stores various data non-temporarily.
  • The storage device 104 may be externally attached to the image processing device 12.
  • A large-capacity semiconductor memory device may be applied instead of, or in combination with, the storage device 104.
  • The network controller 106 controls data communication with external devices. Controlling data communication may include managing data communication traffic.
  • The network 18 connected via the network controller 106 may be a known network such as a LAN (Local Area Network).
  • As the power supply device 108, a large-capacity power supply device such as a UPS (Uninterruptible Power Supply) is applied.
  • The power supply device 108 supplies power to the image processing device 12 when the commercial power supply is cut off due to a power failure or the like.
  • The display controller 110 functions as a display driver that controls the display device 24 based on command signals transmitted from the processor 100.
  • The input/output interface 112 connects the image processing apparatus 12 and external devices so that they can communicate with each other.
  • The input/output interface 112 can use a communication standard such as USB (Universal Serial Bus).
  • The input controller 114 converts the format of signals input using the mouse 20 and the keyboard 22 into a format suitable for the processing of the image processing apparatus 12. Information input from the mouse 20 and the keyboard 22 via the input controller 114 is transmitted to each unit via the processor 100.
  • The hardware configuration of the image processing device 12 shown in FIG. 3 is an example; additions, deletions, and modifications can be made according to the image processing requirements.
  • FIG. 4 is a flowchart showing the processing of the class classification method.
  • The class classification method includes an image acquisition step (step S1), a score calculation step (step S2), a priority setting step (step S3), a classification step (step S4), and a display step (step S5).
  • In step S1, the image acquisition unit 40 acquires a CT image of a patient's lungs as a medical image from the image database 16.
  • In step S2, the score calculation unit 42 calculates, for each pixel of the CT image acquired in step S1, a score for each class indicating the degree to which the pixel belongs to that class.
  • Specifically, the ground glass shadow detection convolutional neural network 44, the reticular shadow detection convolutional neural network 46, and the honeycomb lung detection convolutional neural network 48 calculate the ground glass shadow class score, the reticular shadow class score, and the honeycomb lung class score, respectively.
  • In step S3, the priority setting unit 50 sets priorities for the ground glass shadow class, the reticular shadow class, and the honeycomb lung class.
  • Specifically, the priority setting unit 50 reads the severities of honeycomb lung, reticular shadow, and ground glass shadow from the severity storage unit 56, and sets a higher priority for a class related to a more severe lesion. Since honeycomb lung, reticular shadow, and ground glass shadow are in descending order of severity, the priority setting unit 50 sets the priority of the honeycomb lung class highest, the priority of the reticular shadow class second highest, and the priority of the ground glass shadow class lowest.
  • In step S4, the classification unit 52 classifies each pixel of the CT image acquired in step S1 into one of the ground glass shadow class, the reticular shadow class, and the honeycomb lung class.
  • Specifically, the classification unit 52 classifies pixels in which only the ground glass shadow class score exceeds the threshold into the ground glass shadow class. It classifies pixels in which only the reticular shadow class score exceeds the threshold into the reticular shadow class, and pixels in which only the honeycomb lung class score exceeds the threshold into the honeycomb lung class.
  • For pixels in which two or more of the ground glass shadow class score, the reticular shadow class score, and the honeycomb lung class score exceed the threshold, the classification unit 52 classifies the pixel into the class with the relatively highest priority, as set in step S3, among the classes exceeding the threshold.
  • For example, a pixel in which both the ground glass shadow class score and the reticular shadow class score exceed the threshold is classified into the reticular shadow class, which has the relatively higher priority. A pixel in which both the reticular shadow class score and the honeycomb lung class score exceed the threshold is classified into the honeycomb lung class, which has the relatively higher priority. A pixel in which the scores of all three classes exceed the threshold is classified into the honeycomb lung class, which has the highest priority.
  • A pixel in which none of the ground glass shadow class score, the reticular shadow class score, and the honeycomb lung class score exceeds the threshold is not classified into any lesion class.
  • The classification unit 52 then generates a classification map from the classification results of all the pixels.
  • A visualized classification map is generated by rendering the pixels of the ground glass shadow class, the reticular shadow class, and the honeycomb lung class in different colors.
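  • A minimal sketch of such color coding is shown below, assuming integer per-pixel label codes; the specific colors and codes are arbitrary choices of ours, not taken from the patent.

```python
import numpy as np

COLORS = {
    0: (0, 0, 0),      # no lesion: black
    1: (0, 255, 0),    # ground glass shadow class: green
    2: (255, 255, 0),  # reticular shadow class: yellow
    3: (255, 0, 0),    # honeycomb lung class: red
}

def render_classification_map(labels: np.ndarray) -> np.ndarray:
    """Turn an (H, W) array of class codes into an (H, W, 3) RGB image."""
    rgb = np.zeros(labels.shape + (3,), dtype=np.uint8)
    for code, color in COLORS.items():
        rgb[labels == code] = color
    return rgb
```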
  • In step S5, the display control unit 62 causes the display device 24 to display the classification map generated in step S4.
  • The image processing apparatus 12 then ends the class classification method.
  • FIGS. 5 and 6 are diagrams for explaining the processing of the class classification method.
  • The image G1 acquired by the image acquisition unit 40 is input to the ground glass shadow detection convolutional neural network 44, the reticular shadow detection convolutional neural network 46, and the honeycomb lung detection convolutional neural network 48.
  • The image G2 shown in FIG. 5 visualizes, by hatching, the positions of pixels whose ground glass shadow class score (feature amount) exceeds the threshold in the feature amount extraction of the image G1 by the ground glass shadow detection convolutional neural network 44.
  • The image G3 shown in FIG. 5 visualizes, by hatching, the positions of pixels whose reticular shadow class score exceeds the threshold in the feature amount extraction of the image G1 by the reticular shadow detection convolutional neural network 46.
  • The image G4 shown in FIG. 5 visualizes, by hatching, the positions of pixels whose honeycomb lung class score exceeds the threshold in the feature amount extraction of the image G1 by the honeycomb lung detection convolutional neural network 48.
  • The images G2, G3, and G4 are provided to explain the processing of the class classification method and need not be generated in an actual run of the method.
  • The classification unit 52 generates a classification map, image G7, based on the results of the feature amount extraction by the ground glass shadow detection convolutional neural network 44, the reticular shadow detection convolutional neural network 46, and the honeycomb lung detection convolutional neural network 48.
  • The classification map is equivalent to an image obtained by integrating the images G2, G3, and G4.
  • Images G5, G6, and G8 shown in FIG. 6 are enlarged views of the same region R1 of images G2, G3, and G7, respectively. As shown in images G5 and G6, in this example there are pixels in which both the reticular shadow class score and the honeycomb lung class score exceed the threshold.
  • The classification unit 52 classifies the pixels in which both the reticular shadow class score and the honeycomb lung class score exceed the threshold into the honeycomb lung class.
  • As described above, according to the first embodiment, the pixels of a medical image can be appropriately classified into a plurality of classes and presented without missing signs of serious image features.
  • Although classification is performed pixel by pixel here, a plurality of pixels may instead be classified together as one region.
  • In the above, the priority setting unit 50 determines the priority based on the severity of the lesion, but it may instead determine the priority based on the temporal change of the lesion. For example, when lesion A changes into lesion B as a disease progresses, the priority setting unit 50 sets a higher priority for lesion B, the temporally later lesion.
  • The temporal changes of lesions may be stored in the storage unit 54 in advance, or in a medical database (not shown) accessed via the network 18. The latter facilitates updating the temporal change information.
  • The priority setting unit 50 may also acquire global information about the image and set the priority based on the acquired global information.
  • The global information includes at least one of the disease name (diagnosis) of the patient who is the subject of the CT image to be classified, the age of the patient, and the sex of the patient.
  • The global information may be input by the user, such as a doctor, using the mouse 20 and the keyboard 22, or may be acquired from a medical database (not shown) via the network 18. Even for the same CT image, the displayed classification map may differ depending on the input global information.
  • The priority setting unit 50 may also set a value obtained by normalizing the severity as the priority. In this case, the priority setting unit 50 sets a larger priority value for a higher severity, and the classification unit 52 obtains the product of the priority and the score for each class and classifies the pixel into the class with the largest product.
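  • The sketch below illustrates this variant, assuming the normalized severities serve directly as priorities; the severity values themselves are illustrative assumptions.

```python
SEVERITY = {"honeycomb_lung": 3.0, "reticular_shadow": 2.0, "ground_glass_shadow": 1.0}
_total = sum(SEVERITY.values())
PRIORITY_WEIGHT = {c: s / _total for c, s in SEVERITY.items()}  # normalized severities

def classify_by_product(scores: dict[str, float]) -> str:
    """Classify into the class maximizing priority * score."""
    return max(scores, key=lambda c: PRIORITY_WEIGHT[c] * scores[c])

# Products: 0.9*(1/6)=0.15, 0.4*(2/6)=0.13, 0.5*(3/6)=0.25 -> honeycomb_lung.
print(classify_by_product({"ground_glass_shadow": 0.9,
                           "reticular_shadow": 0.4,
                           "honeycomb_lung": 0.5}))
```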
  • In the above, the score calculation unit 42 calculates the score of each class using a separate convolutional neural network per class, but a single neural network that can identify a plurality of classes may be used instead. Even with a single neural network, high scores may be calculated for several classes, so when a plurality of classes have scores exceeding the threshold, the same problem can be resolved by selecting the class with the relatively highest priority.
  • The image processing apparatus according to the first embodiment classifies regions based on priority when a plurality of scores exceed the threshold, whereas the image processing apparatus according to the second embodiment classifies regions based on global information when a plurality of scores exceed the threshold.
  • FIG. 7 is a functional block diagram of the image processing device 70 according to the second embodiment.
  • In FIG. 7, the same parts as those in the block diagram shown in FIG. 2 are designated by the same reference numerals, and their detailed description is omitted.
  • The image processing device 70 includes a global information acquisition unit 72.
  • The global information acquisition unit 72 acquires global information input by the user with the mouse 20 and the keyboard 22.
  • In the present embodiment, the global information acquisition unit 72 acquires, as global information, the disease name of the patient who is the subject of the CT image to be classified.
  • The global information acquisition unit 72 may instead acquire the global information from a medical database (not shown) via the network 18.
  • The score calculation unit 42 calculates a score indicating the degree to which at least one region of the CT image acquired by the image acquisition unit 40 belongs to each of a plurality of predetermined classes.
  • In the present embodiment, the score calculation unit 42 calculates scores for the tree-in-bud appearance class, the bronchiectasis class, the traction bronchiectasis class, the ground glass shadow class, the honeycomb lung class, and the consolidation class, respectively.
  • The score has a value between 0 and 1; the higher the degree of belonging to the class, the larger the score.
  • The classification unit 52 classifies the region of the CT image acquired by the image acquisition unit 40 into one of the plurality of classes.
  • In the present embodiment, the classification unit 52 classifies each pixel of the CT image, from among the one or more classes whose score exceeds the threshold, based on the global information acquired by the global information acquisition unit 72.
  • The storage unit 54 includes a key finding storage unit 74.
  • The key finding storage unit 74 stores disease names and key findings in association with each other.
  • The key findings correspond to the lesions represented by the classes to be classified.
  • FIG. 8 is a diagram showing an example of the disease names and key findings stored in the key finding storage unit 74.
  • In this example, tree-in-bud appearance, bronchiectasis, traction bronchiectasis, ground glass shadow, and honeycomb lung are stored as the key findings associated with the disease name “collagen disease lung (1) RA”.
  • Bronchiectasis, traction bronchiectasis, and ground glass shadow are stored as the key findings associated with the disease name “collagen disease lung (2) SSc (PSS)”.
  • Similarly, consolidation, air bronchogram, traction bronchiectasis, and ground glass shadow are stored as the key findings associated with another disease name.
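  • A minimal sketch of such a key finding table is shown below, assuming a plain dictionary keyed by disease name; the entries mirror the FIG. 8 example above, and the snake_case finding names are our own labels.

```python
KEY_FINDINGS = {
    "collagen disease lung (1) RA": {
        "tree_in_bud", "bronchiectasis", "traction_bronchiectasis",
        "ground_glass_shadow", "honeycomb_lung",
    },
    "collagen disease lung (2) SSc (PSS)": {
        "bronchiectasis", "traction_bronchiectasis", "ground_glass_shadow",
    },
}
```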
  • The key finding storage unit 74 may instead be provided in a medical database (not shown) accessed via the network 18. This facilitates updating the associations between disease names and key findings.
  • The hardware configuration of the image processing device 70 is the same as that of the image processing device 12 shown in FIG. 3.
  • FIG. 9 is a flowchart showing the processing of the class classification method.
  • In FIG. 9, the same parts as those in the flowchart shown in FIG. 4 are designated by the same reference numerals, and their detailed description is omitted.
  • The class classification method includes an image acquisition step (step S1), a score calculation step (step S2), a global information acquisition step (step S11), a classification step (step S4), and a display step (step S5).
  • In step S1, the image acquisition unit 40 acquires a CT image of a patient's lungs as a medical image from the image database 16.
  • In step S2, the score calculation unit 42 calculates a score for each of the plurality of classes for each pixel of the CT image acquired in step S1.
  • That is, the tree-in-bud appearance class score, the bronchiectasis class score, the traction bronchiectasis class score, the ground glass shadow class score, the honeycomb lung class score, and the consolidation class score are calculated.
  • In step S11, the global information acquisition unit 72 acquires the patient's disease name as global information.
  • Here, the user inputs the patient's disease name with the mouse 20 or the keyboard 22.
  • In step S4, the classification unit 52 classifies each pixel of the CT image acquired in step S1 into one of the tree-in-bud appearance class, the bronchiectasis class, the traction bronchiectasis class, the ground glass shadow class, the honeycomb lung class, and the consolidation class. Each pixel is classified from among the classes whose scores exceed the threshold.
  • Among the classes exceeding the threshold, the classification unit 52 classifies the pixel into the classes that match the global information acquired in step S11.
  • In step S5, the display control unit 62 causes the display device 24 to display the class of each pixel classified in step S4.
  • In the second embodiment, each pixel may be classified into a plurality of classes; the display control unit 62 causes the display device 24 to display all the classes into which each pixel is classified.
  • As an example, suppose the score calculation unit 42 calculates a score for each class for a certain pixel.
  • Suppose further that the threshold is 0.7 and that the classification unit 52 extracts, as the classes whose calculated scores exceed the threshold, the tree-in-bud appearance class, the bronchiectasis class, the ground glass shadow class, the honeycomb lung class, and the consolidation class.
  • The classification unit 52 classifies the pixel, from among the classes whose calculated scores exceed the threshold, into the classes that match the global information.
  • Here, the classes that match the global information are the classes corresponding to the key findings associated with the disease name given as the global information.
  • Suppose the global information acquisition unit 72 acquired the disease name “collagen disease lung (2) SSc (PSS)” in step S11.
  • The classification unit 52 obtains the key findings associated with the disease name “collagen disease lung (2) SSc (PSS)” from the key finding storage unit 74.
  • Among the tree-in-bud appearance class, the bronchiectasis class, the ground glass shadow class, the honeycomb lung class, and the consolidation class, which are the classes exceeding the threshold, the classification unit 52 classifies the pixel into the bronchiectasis class and the ground glass shadow class, which correspond to the key findings.
  • The disease name “collagen disease lung (2) SSc (PSS)” is not associated with the tree-in-bud appearance, honeycomb lung, or consolidation lesions, so even though their scores exceed the threshold, these classes are excluded from the classification. The pixel can thereby be properly classified.
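  • The sketch below walks through this step using the KEY_FINDINGS table sketched earlier; the individual score values are illustrative assumptions chosen to reproduce the example.

```python
def classify_by_global_info(scores: dict[str, float],
                            disease_name: str,
                            threshold: float = 0.7) -> set[str]:
    """Keep the classes whose scores exceed the threshold, then retain only
    those matching the key findings for the acquired disease name."""
    above_threshold = {c for c, s in scores.items() if s > threshold}
    return above_threshold & KEY_FINDINGS[disease_name]

scores = {"tree_in_bud": 0.8, "bronchiectasis": 0.9,
          "traction_bronchiectasis": 0.3, "ground_glass_shadow": 0.75,
          "honeycomb_lung": 0.8, "consolidation": 0.9}
print(classify_by_global_info(scores, "collagen disease lung (2) SSc (PSS)"))
# -> {"bronchiectasis", "ground_glass_shadow"}
```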
  • According to the second embodiment, regions of a medical image can be appropriately classified into a plurality of classes, and a classification that matches the patient's global information can be presented.
  • The score calculation step of step S2 and the global information acquisition step of step S11 may be performed in the reverse order, with the global information acquisition unit 72 acquiring the global information before the score calculation unit 42 calculates the scores.
  • In that case, the score calculation unit 42 can calculate only the scores of the classes corresponding to the key findings related to the global information.
  • In the above embodiments, a CT image of the lung is used as the medical image, but a medical image of an organ other than the lung may be used.
  • A three-dimensional image may also be used as the medical image.
  • In that case, the image processing device 12 classifies each voxel.
  • The above classification methods may be configured as a program that causes a computer to realize each step, and a non-transitory recording medium, such as a CD-ROM (Compact Disc Read-Only Memory), storing the program may also be configured.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A classification device which can suitably classify regions of an image into multiple classes, a classification method and program, and a classification result display device are provided. The classification device is provided with: an image acquisition unit which acquires an image; a score calculation unit which calculates scores indicating the degree to which at least one region of the image belongs to each of multiple predetermined classes; a priority setting unit which sets a priority for each of the multiple classes; and a classification unit which classifies a region into one of the classes for which the calculated score exceeds a threshold value, and which classifies on the basis of the priorities.

Description

Classification device, classification method and program, and classification result display device
 The present invention relates to a classification device, a classification method and program, and a classification result display device, and more particularly to a technique for classifying image regions.
 Segmentation of medical images based on anatomical features using deep learning is known. For example, Patent Literature 1 discloses a technique of classifying a cell of interest by determining the class of the cell of interest specified by a cell region detected from an image.
International Publication No. WO 2004/042392
 In such classification, a score is calculated for each class for a determination target, and the target is classified into the class with the highest calculated score. However, when high scores are calculated for a plurality of classes, simply assigning the class with the highest score may lead to misdiagnosis.
 The present invention has been made in view of such circumstances, and an object thereof is to provide a classification device, a classification method and program, and a classification result display device that appropriately classify image regions into a plurality of classes.
 In order to achieve the above object, one aspect of the classification device comprises: an image acquisition unit that acquires an image; a score calculation unit that calculates a score indicating the degree to which at least one region of the image belongs to each of a plurality of predetermined classes; a priority setting unit that sets a priority for each of the plurality of classes; and a classification unit that classifies the region into one of the classes whose calculated score exceeds a threshold, the classification being based on the priority.
 According to this aspect, when the region is classified into one of the plurality of classes, it is assigned, among the classes whose calculated scores exceed the threshold, to the class selected according to the priority, so the image region can be appropriately classified into a plurality of classes.
 Preferably, the classification unit classifies the region into the class with the relatively highest priority. This enables classification into an appropriate class.
 Preferably, the image is a medical image of a patient, and each of the plurality of classes relates to a lesion of the patient. This allows the region of the medical image to be appropriately classified into lesion-related classes.
 Preferably, a global information acquisition unit acquires global information about the image, and the priority setting unit sets the priority for each of the plurality of classes based on the global information. The priority can thereby be set appropriately.
 The global information is preferably information on the disease name associated with the patient or on the age of the patient. The priority can thereby be set appropriately.
 The priority is preferably determined based on the severity of the lesion. The priority can thereby be set appropriately.
 The priority is preferably determined based on the temporal change of the lesion. The priority can thereby be set appropriately.
 In order to achieve the above object, another aspect of the classification device comprises: an image acquisition unit that acquires an image; a score calculation unit that calculates a score indicating the degree to which at least one region of the image belongs to each of a plurality of predetermined classes; a global information acquisition unit that acquires global information about the image; and a classification unit that classifies the region into one of the classes whose calculated score exceeds a threshold, the classification being based on the global information.
 According to this aspect, when the region is classified into one of the plurality of classes, it is assigned, among the classes whose calculated scores exceed the threshold, to the class selected according to the global information, so the image region can be appropriately classified into a plurality of classes.
 Preferably, the classification unit classifies the region into a class that matches the global information. This enables classification into an appropriate class.
 Preferably, the image is a medical image of a patient, and each of the plurality of classes relates to a lesion of the patient. This allows the region of the medical image to be appropriately classified into lesion-related classes.
 Preferably, the global information is information on the disease name associated with the patient, and a key finding storage unit stores the classes that match each disease name. The region can thereby be appropriately classified.
 The medical image is preferably a CT (Computed Tomography) image of the lung. According to this aspect, regions of the CT image can be appropriately classified into classes related to a plurality of lesions.
 The classes related to the patient's lesions preferably include at least one of honeycomb lung, ground glass shadow, reticular shadow, and linear shadow. The region of the CT image can thereby be classified into at least one of the honeycomb lung, ground glass shadow, reticular shadow, and linear shadow classes.
 Preferably, the score calculation unit includes a plurality of feature amount calculation units that each calculate a feature amount corresponding to one of the plurality of classes, and calculates each score based on the feature amount corresponding to the class. The score can thereby be calculated appropriately.
 Each of the plurality of feature amount calculation units is preferably configured as a convolutional neural network. The feature amount can thereby be calculated appropriately.
 In order to achieve the above object, one aspect of the classification result display device comprises the above classification device, a display device, and a display control unit that displays, on the display device, the class into which the region is classified.
 According to this aspect, image regions can be appropriately classified into a plurality of classes and displayed.
 In order to achieve the above object, one aspect of the classification method comprises: an image acquisition step of acquiring an image; a score calculation step of calculating a score indicating the degree to which at least one region of the image belongs to each of a plurality of predetermined classes; a priority setting step of setting a priority for each of the plurality of classes; and a classification step of classifying the region into one of the classes whose calculated score exceeds a threshold, based on the priority.
 According to this aspect, the image region can be appropriately classified into a plurality of classes.
 In order to achieve the above object, another aspect of the classification method comprises: an image acquisition step of acquiring an image; a score calculation step of calculating a score indicating the degree to which at least one region of the image belongs to each of a plurality of predetermined classes; a global information acquisition step of acquiring global information about the image; and a classification step of classifying the region into one of the classes whose calculated score exceeds a threshold, based on the global information.
 According to this aspect, the image region can be appropriately classified into a plurality of classes. A program for causing a computer to execute either of the above classification methods is also included in these aspects.
 According to the present invention, image regions can be appropriately classified into a plurality of classes.
FIG. 1 is a schematic configuration diagram of a medical information system.
FIG. 2 is a functional block diagram of the image processing apparatus according to the first embodiment.
FIG. 3 is a block diagram showing the hardware configuration of the image processing apparatus.
FIG. 4 is a flowchart showing the processing of the class classification method.
FIG. 5 is a diagram for explaining the processing of the class classification method.
FIG. 6 is a diagram for explaining the processing of the class classification method.
FIG. 7 is a functional block diagram of the image processing apparatus according to the second embodiment.
FIG. 8 is a diagram showing an example of key findings stored in the key finding storage unit.
FIG. 9 is a flowchart showing the processing of the class classification method.
 Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.
[Overall configuration of medical image processing system]
 FIG. 1 is a schematic configuration diagram of a medical information system 10. The medical information system 10 includes an image processing device 12, a modality 14, and an image database 16. The image processing device 12, the modality 14, and the image database 16 are communicably connected via a network 18.
 The image processing device 12 can be a computer provided in a medical institution. A mouse 20 and a keyboard 22 are connected to the image processing apparatus 12 as input devices. A display device 24 is connected to the image processing device 12 as an output device.
 The modality 14 is an imaging device that images a part of a patient to be examined and generates a medical image. Examples of the modality 14 include an X-ray imaging device, a CT device, an MRI device, a PET device, an ultrasonic device, and a CR device using a flat panel X-ray detector. An endoscopic device may be applied as the modality 14.
 Note that CT is an abbreviation for Computed Tomography. MRI is an abbreviation for Magnetic Resonance Imaging. PET is an abbreviation for Positron Emission Tomography. A flat panel X-ray detector is sometimes called an FPD (flat panel detector). CR is an abbreviation for Computed Radiography.
 The DICOM standard can be applied to the medical image format. Additional information defined in the DICOM standard may be attached to the medical image. DICOM is an abbreviation for Digital Imaging and COmmunications in Medicine.
 The term “image” in this specification may include, in addition to the image itself such as a photograph, image data, which is a signal representing the image.
 As the image database 16, a computer equipped with a large-capacity storage device can be applied. The computer incorporates software that provides the functions of a database management system. A database management system is sometimes called a DBMS (Data Base Management System).
 A LAN (Local Area Network) can be applied to the network 18. A WAN (Wide Area Network) may also be applied. The DICOM standard can be applied to the communication protocol of the network 18. The network 18 may be configured to be connectable to a public line network or a dedicated line network, and may be wired or wireless.
<First Embodiment>
[Image Processing Apparatus]
[Functions of the Image Processing Apparatus]
FIG. 2 is a functional block diagram of the image processing apparatus 12 according to the first embodiment. The image processing apparatus 12 is a classification device that performs class classification (segmentation) for each region of a medical image. One example of class classification is classifying lung tissue into classes relating to lesions such as bronchiectasis, honeycomb lung, ground glass shadow, reticular shadow, and linear shadow. In the present embodiment, each pixel of an input CT image is classified as corresponding to one of ground glass shadow, reticular shadow, and honeycomb lung.
The class-classified medical image is used, for example, for calculating the volume of each lesion. The change in volume of each lesion serves as an indicator of the progression of a lung disease such as interstitial lung disease.
The image processing apparatus 12 includes an image acquisition unit 40, a score calculation unit 42, a priority setting unit 50, a classification unit 52, a storage unit 54, an input control unit 60, and a display control unit 62. The image acquisition unit 40, the score calculation unit 42, the priority setting unit 50, the classification unit 52, the storage unit 54, the input control unit 60, and the display control unit 62 are communicably connected via a bus 64.
The image acquisition unit 40 acquires a CT image to be processed, and the image processing apparatus 12 stores the acquired CT image in the storage unit 54. In the example shown in FIG. 2, the image acquisition unit 40 acquires the CT image from the image database 16. The image acquisition unit 40 may instead acquire the CT image from the modality 14 (see FIG. 1), which is a CT apparatus, from a storage device (not shown) via the network 18, or via an information storage medium (not shown).
The score calculation unit 42 calculates, for at least one region of the CT image acquired by the image acquisition unit 40, a score that is an evaluation value indicating the degree to which the region belongs to each of a plurality of predetermined classes. In the present embodiment, each score takes a value between 0 and 1, and the higher the degree of belonging to the class, the larger the value. The score calculation unit 42 includes a plurality of feature amount calculation units, each of which calculates the feature amount corresponding to one of the classes, and calculates the score of each class based on the corresponding feature amount.
As the feature amount calculation units, the score calculation unit 42 has convolutional neural networks (CNNs) designed and trained to recognize pixels belonging to a specific class in a medical image. A convolutional neural network has a structure in which convolutional layers, which extract local features of an image by convolution with a plurality of filters, and pooling layers, which aggregate the extracted features for each rectangular region, are repeated.
In the present embodiment, the score calculation unit 42 has a convolutional neural network 44 for ground glass shadow detection that calculates the score of the ground glass shadow class for each pixel, a convolutional neural network 46 for reticular shadow detection that calculates the score of the reticular shadow class for each pixel, and a convolutional neural network 48 for honeycomb lung detection that calculates the score of the honeycomb lung class for each pixel. Convolutional neural networks for calculating the scores of other classes, such as linear shadow, may also be provided.
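As a non-limiting illustration of this per-class scoring, the following Python sketch runs one detector per class over a CT slice and collects a per-pixel score map for each class. The DummyDetector and ScoreCalculator names, the predict() interface, and the random score values are assumptions of this sketch, not the disclosed networks 44, 46, and 48.

```python
import numpy as np

class DummyDetector:
    """Stand-in for a trained per-class CNN; emits per-pixel scores in [0, 1]."""
    def __init__(self, seed):
        self.rng = np.random.default_rng(seed)

    def predict(self, ct_image):
        # A real detector would run convolution/pooling layers here.
        return self.rng.random(ct_image.shape)

class ScoreCalculator:
    """Holds one detector per class and produces a score map for each class."""
    def __init__(self, detectors):
        self.detectors = detectors  # {class name: detector}

    def scores(self, ct_image):
        return {name: det.predict(ct_image) for name, det in self.detectors.items()}

ct_image = np.zeros((512, 512))  # placeholder CT slice
calc = ScoreCalculator({
    "ground_glass": DummyDetector(0),
    "reticular": DummyDetector(1),
    "honeycomb": DummyDetector(2),
})
score_maps = calc.scores(ct_image)  # each value: (512, 512) array of scores in [0, 1]
```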
The priority setting unit 50 sets a priority for each of the plurality of classes to be classified. In the present embodiment, the priority setting unit 50 sets the priorities of the ground glass shadow class, the reticular shadow class, and the honeycomb lung class based on the severity of the lesion corresponding to each class.
The classification unit 52 classifies each region of the CT image acquired by the image acquisition unit 40 into one of the plurality of classes. The classification unit 52 also generates a classification map in which the regions of the CT image classified into different classes are expressed in different colors so as to be distinguishable. In the present embodiment, for each pixel of the CT image, the classification unit 52 classifies the pixel, among the classes whose score calculated by the score calculation unit 42 exceeds a threshold value, into the class determined based on the priority set by the priority setting unit 50. In the present embodiment, the threshold value is between 0 and 1, and may be a different value for each class.
The storage unit 54 stores various data of the image processing apparatus 12. For example, the storage unit 54 stores various programs executed by the image processing apparatus 12. A plurality of storage devices may be used as the storage unit 54, or a single storage device partitioned into a plurality of storage areas may be used. The storage unit 54 may also be one or more storage devices provided outside the image processing apparatus 12.
The storage unit 54 also includes a severity storage unit 56. The severity storage unit 56 stores the severity of the lesion corresponding to each class. The severity is an index that relatively represents the degree of risk to the patient's life, and is set in advance based on medical prior knowledge and the like. In the present embodiment, the order of relative severity, from highest to lowest, is honeycomb lung, reticular shadow, and ground glass shadow. The severity storage unit 56 may be provided in a medical database (not shown) accessed via the network 18, which makes it easy to update the severities.
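The relation between the stored severities and the priorities derived from them can be sketched as follows; the numeric severity values and the dictionary names SEVERITY and PRIORITY are illustrative assumptions of this sketch.

```python
# Severity is stored per lesion; a larger value means a more life-threatening
# lesion. The numeric values below are illustrative assumptions.
SEVERITY = {"honeycomb": 3, "reticular": 2, "ground_glass": 1}

# Higher severity yields higher priority; here the priority is expressed as
# a rank, with 0 being the highest priority.
PRIORITY = {name: rank for rank, name in
            enumerate(sorted(SEVERITY, key=SEVERITY.get, reverse=True))}
assert PRIORITY == {"honeycomb": 0, "reticular": 1, "ground_glass": 2}
```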
The input control unit 60 converts signals representing input information transmitted from the mouse 20 and the keyboard 22 into signals of a format applicable to the image processing apparatus 12. The signals representing the input information are transmitted to the respective units of the apparatus as appropriate.
The display control unit 62 transmits a signal representing display information to the display device 24, whereby the display information is displayed on the display device 24. The display information is, for example, the class into which a region has been classified, or the classification map generated by the classification unit 52. The image processing apparatus 12 and the display device 24 together function as a classification result display device.
[Hardware Configuration of the Image Processing Apparatus]
<Overall Configuration>
FIG. 3 is a block diagram showing the hardware configuration of the image processing apparatus 12. The image processing apparatus 12 executes programs using the hardware shown in FIG. 3 to realize its various functions.
The image processing apparatus 12 includes a processor 100, a memory 102, a storage device 104, a network controller 106, and a power supply device 108. The image processing apparatus 12 also includes a display controller 110, an input/output interface 112, and an input controller 114.
The processor 100, the memory 102, the storage device 104, the network controller 106, the display controller 110, the input/output interface 112, and the input controller 114 are connected via the bus 64 so that data communication is possible.
<Control Unit>
The processor 100 functions as the overall control unit, the various calculation units, and the storage control unit of the image processing apparatus 12. The processor 100 executes programs stored in a ROM (read only memory) included in the memory 102.
The processor 100 may execute a program downloaded from an external storage device via the network controller 106. The external storage device may be communicably connected to the image processing apparatus 12 via the network 18.
The processor 100 uses a RAM (random access memory) included in the memory 102 as a calculation area and executes various kinds of processing in cooperation with various programs. The various functions of the image processing apparatus 12 are thereby realized.
The processor 100 controls reading of data from the storage device 104 and writing of data to the storage device 104. The processor 100 may acquire various data from an external storage device via the network controller 106, and can execute various kinds of processing, such as calculations, using the acquired data.
The processor 100 may include one or more devices. Examples of the processor 100 include an FPGA (Field Programmable Gate Array) and a PLD (Programmable Logic Device). FPGAs and PLDs are devices whose circuit configuration can be changed after manufacture.
Another example of the processor 100 is an ASIC (Application Specific Integrated Circuit). An ASIC has a circuit configuration designed specifically to execute particular processing.
Two or more devices of the same type can be applied as the processor 100. For example, the processor 100 may use two or more FPGAs, or may use two PLDs. Two or more devices of different types may also be applied; for example, the processor 100 may use one or more FPGAs and one or more ASICs.
When a plurality of processors 100 are provided, the plurality of processors 100 may be configured using a single device. As one example, one processor is configured using a combination of one or more CPUs (Central Processing Units) and software, and this processor functions as the plurality of processors 100. Note that software in this specification is synonymous with a program.
A GPU (Graphics Processing Unit), which is a device specialized for image processing, may be applied instead of or in combination with a CPU. A computer is a representative example in which a plurality of processors 100 are configured using a single device.
Another example of configuring a plurality of processors 100 using a single device is a device that realizes the functions of an entire system including the plurality of processors 100 with a single IC chip. A representative example of such a device is an SoC (System On Chip). Note that IC is an abbreviation for Integrated Circuit.
More specifically, the hardware structure of the processor 100 is electric circuitry in which circuit elements such as semiconductor elements are combined.
As described above, the processor 100 is configured using one or more of various devices as its hardware structure.
<Memory>
The memory 102 includes a ROM (not shown) and a RAM (not shown). The ROM stores various programs executed by the image processing apparatus 12, as well as parameters, files, and the like used for executing the various programs. The RAM functions as a temporary data storage area, a work area for the processor 100, and the like.
<Storage Device>
The storage device 104 stores various data non-temporarily. The storage device 104 may be externally attached to the image processing apparatus 12. A large-capacity semiconductor memory device may be applied instead of or in combination with the storage device 104.
<Network Controller>
The network controller 106 controls data communication with external devices. The control of data communication may include management of data communication traffic. A known network such as a LAN (Local Area Network) can be applied as the network 18 connected via the network controller 106.
<Power Supply Device>
A large-capacity power supply device such as a UPS (Uninterruptible Power Supply) is applied as the power supply device 108. The power supply device 108 supplies power to the image processing apparatus 12 when the commercial power supply is cut off due to a power failure or the like.
<Display Controller>
The display controller 110 functions as a display driver that controls the display device 24 based on command signals transmitted from the processor 100.
<Input/Output Interface>
The input/output interface 112 communicably connects the image processing apparatus 12 and external devices. A communication standard such as USB (Universal Serial Bus) can be applied to the input/output interface 112.
<Input Controller>
The input controller 114 converts the format of signals input using the mouse 20 and the keyboard 22 into a format suitable for the processing of the image processing apparatus 12. Information input from the mouse 20 and the keyboard 22 via the input controller 114 is transmitted to each unit via the processor 100.
Note that the hardware configuration of the image processing apparatus 12 shown in FIG. 3 is an example, and additions, deletions, and changes can be made according to the image processing specifications.
[Classification Method]
Next, the method by which the image processing apparatus 12 classifies each region of a medical image will be described. Here, an example of generating a classification map in which each pixel of a CT image of the lungs is classified into the ground glass shadow class, the reticular shadow class, or the honeycomb lung class will be described.
FIG. 4 is a flowchart showing the processing of the class classification method. The class classification method includes an image acquisition step (step S1), a score calculation step (step S2), a priority setting step (step S3), a classification step (step S4), and a display step (step S5).
In step S1, the image acquisition unit 40 acquires a CT image of the patient's lungs as a medical image from the image database 16.
In step S2, the score calculation unit 42 calculates, for each pixel of the CT image acquired in step S1, a score for each class indicating the degree to which the pixel belongs to that class. Here, the convolutional neural network 44 for ground glass shadow detection, the convolutional neural network 46 for reticular shadow detection, and the convolutional neural network 48 for honeycomb lung detection calculate the score of the ground glass shadow class, the score of the reticular shadow class, and the score of the honeycomb lung class, respectively.
In step S3, the priority setting unit 50 sets priorities for the ground glass shadow class, the reticular shadow class, and the honeycomb lung class. In the present embodiment, the priority setting unit 50 reads the severities of honeycomb lung, reticular shadow, and ground glass shadow from the severity storage unit 56, and sets a higher priority for a class whose corresponding lesion has a higher severity. Since the order of relative severity, from highest to lowest, is honeycomb lung, reticular shadow, and ground glass shadow, the priority setting unit 50 sets the priority of the honeycomb lung class highest, the priority of the reticular shadow class second highest, and the priority of the ground glass shadow class lowest.
In step S4, the classification unit 52 classifies each pixel of the CT image acquired in step S1 into one of the ground glass shadow class, the reticular shadow class, and the honeycomb lung class. The classification unit 52 classifies a pixel for which only the score of the ground glass shadow class exceeds the threshold value into the ground glass shadow class, a pixel for which only the score of the reticular shadow class exceeds the threshold value into the reticular shadow class, and a pixel for which only the score of the honeycomb lung class exceeds the threshold value into the honeycomb lung class.
Furthermore, for a pixel for which two or more of the ground glass shadow class score, the reticular shadow class score, and the honeycomb lung class score exceed the threshold value, the classification unit 52 classifies the pixel into the class, among those exceeding the threshold value, with the relatively higher priority set in step S3.
For example, a pixel for which both the ground glass shadow class score and the reticular shadow class score exceed the threshold value is classified into the reticular shadow class, which has the relatively higher priority. A pixel for which both the reticular shadow class score and the honeycomb lung class score exceed the threshold value is classified into the honeycomb lung class, which has the relatively higher priority. A pixel for which the scores of all three classes exceed the threshold value is classified into the honeycomb lung class, which has the highest priority.
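A minimal sketch of this per-pixel rule of step S4, assuming the score_maps dictionary and the PRIORITY ranking from the earlier sketches: among the classes whose score exceeds the threshold value, the pixel is assigned to the class with the highest priority, and pixels with no score above the threshold value are left unlabeled.

```python
import numpy as np

def classify_pixelwise(score_maps, priority, threshold=0.7):
    """Assign each pixel to the highest-priority class whose score exceeds
    the threshold; pixels with no class above the threshold get label -1."""
    names = sorted(score_maps, key=priority.get)      # highest priority first
    stack = np.stack([score_maps[n] for n in names])  # (n_classes, H, W)
    above = stack > threshold
    label = np.argmax(above, axis=0)                  # first True = best class
    label[~above.any(axis=0)] = -1                    # not classified as a lesion
    return label, names
```

Because the classes are sorted by priority before stacking, the arg-max over the boolean mask picks the highest-priority above-threshold class directly.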
Note that the classification unit 52 does not classify as a lesion any pixel for which the ground glass shadow class score, the reticular shadow class score, and the honeycomb lung class score are all below the threshold value.
Subsequently, the classification unit 52 generates a classification map from the classification results of all the pixels. In the present embodiment, a visualized classification map is generated by expressing the pixels of the ground glass shadow class, the reticular shadow class, and the honeycomb lung class in different colors.
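The colorization of the classification map can be sketched as follows, reusing the label array and class ordering returned by classify_pixelwise above; the particular color assignments are illustrative assumptions.

```python
import numpy as np

def render_map(label, names):
    """Paint each class label in its own RGB color; unlabeled pixels stay black."""
    palette = {"honeycomb": (255, 0, 0),      # illustrative colors only
               "reticular": (0, 255, 0),
               "ground_glass": (0, 0, 255)}
    rgb = np.zeros(label.shape + (3,), dtype=np.uint8)
    for idx, name in enumerate(names):
        rgb[label == idx] = palette[name]
    return rgb
```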
In step S5, the display control unit 62 causes the display device 24 to display the classification map generated in step S4. With the above, the image processing apparatus 12 ends the class classification method.
FIGS. 5 and 6 are diagrams for explaining the processing of the class classification method. As shown in FIG. 5, the image G1 acquired by the image acquisition unit 40 is input to each of the convolutional neural network 44 for ground glass shadow detection, the convolutional neural network 46 for reticular shadow detection, and the convolutional neural network 48 for honeycomb lung detection.
The image G2 shown in FIG. 5 is an image in which the positions of the pixels whose ground glass shadow class score (feature amount) exceeded the threshold value in the feature amount extraction performed on the image G1 by the convolutional neural network 44 for ground glass shadow detection are visualized by hatching. Similarly, the image G3 shown in FIG. 5 visualizes by hatching the positions of the pixels whose reticular shadow class score exceeded the threshold value in the feature amount extraction by the convolutional neural network 46 for reticular shadow detection, and the image G4 shown in FIG. 5 visualizes by hatching the positions of the pixels whose honeycomb lung class score exceeded the threshold value in the feature amount extraction by the convolutional neural network 48 for honeycomb lung detection.
Note that the images G2, G3, and G4 are images for explaining the processing of the class classification method, and need not be generated in the actual class classification method.
As shown in FIG. 5, the classification unit 52 generates the image G7, which is the classification map, from the results of the feature amount extraction by the convolutional neural network 44 for ground glass shadow detection, the convolutional neural network 46 for reticular shadow detection, and the convolutional neural network 48 for honeycomb lung detection. The classification map is equivalent to an image obtained by integrating the images G2, G3, and G4.
The images G5, G6, and G8 shown in FIG. 6 are enlargements of the same region R1 of the images G2, G3, and G7, respectively. As shown in the images G5 and G6, in this example there are pixels for which both the reticular shadow class score and the honeycomb lung class score exceed the threshold value.
Here, honeycomb lung has a higher priority than reticular shadow. Therefore, as shown in the image G8, the classification unit 52 classifies the pixels for which both the reticular shadow class score and the honeycomb lung class score exceed the threshold value into the honeycomb lung class.
As described above, according to the first embodiment, the pixels of a medical image can be appropriately classified into a plurality of classes, and signs of serious image features can be presented without being missed. Although classification is performed pixel by pixel here, a plurality of pixels may be treated as one region and classification may be performed region by region.
In the present embodiment, the priority setting unit 50 determines the priorities based on the severity of the lesions, but the priorities may instead be determined based on the temporal change of the lesions. For example, if a lesion A changes into a lesion B as the disease progresses, the priority setting unit 50 sets a higher priority for the lesion B, which is the temporally later lesion. The temporal changes of lesions may be stored in advance in the storage unit 54, or may be stored in a medical database (not shown) accessed via the network 18, which makes it easy to update them.
The priority setting unit 50 may also acquire global information about the image and set the priorities based on the acquired global information. The global information includes, for example, at least one of information on the disease name (diagnosis) of the patient who is the subject of the CT image to be classified, the age of the patient, and the sex of the patient. The global information may be input by a user such as a physician with the mouse 20 and the keyboard 22, or may be acquired from a medical database (not shown) via the network 18. Even for the same CT image, the displayed classification map may differ depending on the input global information.
The priority setting unit 50 may set a value obtained by normalizing the severity as the priority. In this case, the priority setting unit 50 sets a larger priority value for a higher severity, and the classification unit 52 can calculate the product of the priority and the score for each class and classify the pixel into the class whose product is the largest.
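A sketch of this product-based variant, assuming normalized severity weights in (0, 1] (the WEIGHTS values are illustrative assumptions):

```python
import numpy as np

# Illustrative weights: higher severity -> larger weight.
WEIGHTS = {"honeycomb": 1.0, "reticular": 0.8, "ground_glass": 0.6}

def classify_by_product(score_maps, weights):
    """Assign each pixel to the class maximizing weight * score."""
    names = list(score_maps)
    product = np.stack([weights[n] * score_maps[n] for n in names])
    return np.argmax(product, axis=0), names  # per-pixel index into names
```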
In the present embodiment, the score calculation unit 42 calculates the score of each class using a separate convolutional neural network per class, but a single neural network capable of discriminating a plurality of classes may be used instead. Even when a single neural network is used, high scores may be calculated for two or more classes. Therefore, when there are a plurality of classes whose scores exceed the threshold value, the same problem can be solved by selecting the class with the relatively higher priority.
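A sketch of the single-network variant, in which one model emits a score for every class at each pixel (for example, through a final per-class sigmoid layer) and the same threshold-and-priority selection is applied; the MultiClassNet stand-in and its random outputs are assumptions of this sketch.

```python
import numpy as np

class MultiClassNet:
    """Stand-in for a single network with one sigmoid output per class."""
    def __init__(self, class_names, seed=0):
        self.class_names = class_names
        self.rng = np.random.default_rng(seed)

    def predict(self, ct_image):
        # (n_classes, H, W) scores in [0, 1], as a sigmoid head would give.
        return self.rng.random((len(self.class_names),) + ct_image.shape)

net = MultiClassNet(["honeycomb", "reticular", "ground_glass"])
score_maps = dict(zip(net.class_names, net.predict(np.zeros((512, 512)))))
# classify_pixelwise(score_maps, PRIORITY) from the earlier sketch then
# applies unchanged to the single network's output.
```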
<Second Embodiment>
An image processing apparatus according to the second embodiment will now be described. The image processing apparatus according to the first embodiment classifies a region based on the priorities when a plurality of scores exceed the threshold value, whereas the image processing apparatus according to the second embodiment classifies a region based on global information when a plurality of scores exceed the threshold value.
[Image Processing Apparatus]
FIG. 7 is a functional block diagram of the image processing apparatus 70 according to the second embodiment. Parts common to the block diagram shown in FIG. 2 are denoted by the same reference numerals, and detailed description thereof is omitted.
The image processing apparatus 70 includes a global information acquisition unit 72. The global information acquisition unit 72 acquires global information input by the user with the mouse 20 and the keyboard 22. In the present embodiment, the global information acquisition unit 72 acquires, as global information, the disease name of the patient who is the subject of the CT image to be classified. The global information acquisition unit 72 may also acquire the global information from a medical database (not shown) via the network 18.
The score calculation unit 42 calculates scores indicating the degree to which at least one region of the CT image acquired by the image acquisition unit 40 belongs to each of a plurality of predetermined classes. In the present embodiment, the score calculation unit 42 calculates the scores of belonging to the tree-in-bud appearance class, the bronchiectasis class, the traction bronchiectasis class, the ground glass shadow class, the honeycomb lung class, and the consolidation class. In the present embodiment, each score takes a value between 0 and 1, and the higher the degree of belonging to the class, the larger the value.
The classification unit 52 classifies each region of the CT image acquired by the image acquisition unit 40 into one of the plurality of classes. In the present embodiment, for each pixel of the CT image, the classification unit 52 classifies the pixel, among the one or more classes whose scores exceed the threshold value, into a class determined based on the global information acquired by the global information acquisition unit 72.
The storage unit 54 includes a key finding storage unit 74. The key finding storage unit 74 stores disease names and key findings in association with each other. The key findings are the same as the lesions corresponding to the classes to be classified.
FIG. 8 is a diagram showing an example of the disease names and key findings stored in the key finding storage unit 74. In the present embodiment, as shown in FIG. 8, tree-in-bud appearance, bronchiectasis, traction bronchiectasis, ground glass shadow, and honeycomb lung are stored as the key findings associated with the disease name "collagen disease lung (1) RA". Bronchiectasis, traction bronchiectasis, and ground glass shadow are stored as the key findings associated with the disease name "collagen disease lung (2) SSc (PSS)". Consolidation, air bronchogram, traction bronchiectasis, and ground glass shadow are stored as the key findings associated with the disease name "idiopathic interstitial pneumonia COP/OP".
The key finding storage unit 74 may be provided in a medical database (not shown) accessed via the network 18, which makes it easy to update the associations between disease names and key findings.
The hardware configuration of the image processing apparatus 70 is the same as the hardware configuration of the image processing apparatus 12 shown in FIG. 3.
[Classification Method]
Next, the method by which the image processing apparatus 70 classifies each region of a medical image will be described. FIG. 9 is a flowchart showing the processing of the class classification method. Parts common to the flowchart shown in FIG. 4 are denoted by the same reference numerals, and detailed description thereof is omitted. The class classification method includes an image acquisition step (step S1), a score calculation step (step S2), a global information acquisition step (step S11), a classification step (step S4), and a display step (step S5).
In step S1, the image acquisition unit 40 acquires a CT image of the patient's lungs as a medical image from the image database 16.
In step S2, the score calculation unit 42 calculates, for each pixel of the CT image acquired in step S1, a score for each of the plurality of classes. Here, the score of the tree-in-bud appearance class, the score of the bronchiectasis class, the score of the traction bronchiectasis class, the score of the ground glass shadow class, the score of the honeycomb lung class, and the score of the consolidation class are calculated.
In step S11, the global information acquisition unit 72 acquires the patient's disease name as global information. Here, the user inputs the patient's disease name from the mouse 20 or the keyboard 22.
In step S4, the classification unit 52 classifies each pixel of the CT image acquired in step S1 into one or more of the tree-in-bud appearance class, the bronchiectasis class, the traction bronchiectasis class, the ground glass shadow class, the honeycomb lung class, and the consolidation class. Here, each pixel is classified into the classes whose scores exceed the threshold value.
Furthermore, the classification unit 52 classifies the pixel only into the classes, among those exceeding the threshold value, that match the global information acquired in step S11.
In step S5, the display control unit 62 causes the display device 24 to display the class of each pixel classified in step S4. In the present embodiment, each pixel may be classified into a plurality of classes, and the display control unit 62 causes the display device 24 to display all the classes into which the pixel has been classified. With the above, the image processing apparatus 70 ends the class classification method.
As a specific example, assume that the score calculation unit 42 has calculated the scores of each class for a certain pixel as follows.
Score of the tree-in-bud appearance class = 0.6
Score of the bronchiectasis class = 0.72
Score of the traction bronchiectasis class = 0.45
Score of the ground glass shadow class = 0.9
Score of the honeycomb lung class = 0.8
Score of the consolidation class = 0.8
The classification unit 52 extracts, from the calculated scores, the classes whose scores exceed the threshold value. Here, the threshold value is assumed to be 0.7. Therefore, the classification unit 52 extracts the bronchiectasis class, the ground glass shadow class, the honeycomb lung class, and the consolidation class. The tree-in-bud appearance class, whose score of 0.6 does not exceed the threshold value, is not extracted.
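This extraction amounts to a one-line filter; a sketch with illustrative class labels:

```python
scores = {"tree_in_bud": 0.6, "bronchiectasis": 0.72,
          "traction_bronchiectasis": 0.45, "ground_glass": 0.9,
          "honeycomb": 0.8, "consolidation": 0.8}

# Keep only the classes whose score exceeds the threshold of 0.7.
candidates = {name for name, s in scores.items() if s > 0.7}
assert candidates == {"bronchiectasis", "ground_glass", "honeycomb", "consolidation"}
```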
Furthermore, the classification unit 52 classifies the pixel into the classes, among those whose calculated scores exceed the threshold value, that match the global information. Here, a class that matches the global information is a class corresponding to a key finding associated with the disease name, which is the global information.
Assume here that, in step S11, the global information acquisition unit 72 has acquired the disease name "collagen disease lung (2) SSc (PSS)". The classification unit 52 acquires the key findings associated with the disease name "collagen disease lung (2) SSc (PSS)" from the key finding storage unit 74.
As shown in FIG. 8, the key findings associated with the disease name "collagen disease lung (2) SSc (PSS)" are bronchiectasis, traction bronchiectasis, and ground glass shadow. Therefore, among the classes exceeding the threshold value, namely the bronchiectasis class, the ground glass shadow class, the honeycomb lung class, and the consolidation class, the classification unit 52 classifies the pixel into the bronchiectasis class and the ground glass shadow class, which correspond to the key findings. The disease name "collagen disease lung (2) SSc (PSS)" is not associated with the lesions of tree-in-bud appearance, honeycomb lung, or consolidation; therefore, even when the score of such a class exceeds the threshold value, the class is excluded from the classification. In this way, the pixels can be appropriately classified.
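A sketch of this key-finding filtering, continuing from the candidates set of the previous sketch; the KEY_FINDINGS dictionary mirrors FIG. 8 with illustrative labels.

```python
# Key findings stored per disease name, as in FIG. 8 (illustrative labels).
KEY_FINDINGS = {
    "collagen disease lung (1) RA": {"tree_in_bud", "bronchiectasis",
                                     "traction_bronchiectasis",
                                     "ground_glass", "honeycomb"},
    "collagen disease lung (2) SSc (PSS)": {"bronchiectasis",
                                            "traction_bronchiectasis",
                                            "ground_glass"},
    "idiopathic interstitial pneumonia COP/OP": {"consolidation",
                                                 "air_bronchogram",
                                                 "traction_bronchiectasis",
                                                 "ground_glass"},
}

disease = "collagen disease lung (2) SSc (PSS)"
classes = candidates & KEY_FINDINGS[disease]
assert classes == {"bronchiectasis", "ground_glass"}
# honeycomb and consolidation exceed the threshold but are excluded here.
```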
As described above, according to the second embodiment, the regions of a medical image can be appropriately classified into a plurality of classes, and a classification that matches the patient's global information can be presented.
The score calculation step of step S2 and the global information acquisition step of step S11 may be interchanged, so that the score calculation unit 42 calculates the scores after the global information acquisition unit 72 acquires the global information. In this case, the score calculation unit 42 can calculate only the scores of the classes corresponding to the key findings related to the global information.
In the first and second embodiments, an example of using a CT image of the lungs as the medical image has been described, but a medical image of an organ other than the lungs may also be used. A three-dimensional image may also be used as the medical image. In the case of a three-dimensional image, the image processing apparatus performs the class classification for each voxel.
<Others>
The classification methods described above can be configured as a program for causing a computer to realize each step, and a non-transitory recording medium such as a CD-ROM (Compact Disk-Read Only Memory) storing the program can also be configured.
The technical scope of the present invention is not limited to the scope described in the above embodiments. The configurations and the like in the respective embodiments can be combined as appropriate between the embodiments without departing from the spirit of the present invention.
10 Medical information system
12 Image processing apparatus
14 Modality
16 Image database
18 Network
20 Mouse
22 Keyboard
24 Display device
40 Image acquisition unit
42 Score calculation unit
44 Convolutional neural network for ground glass shadow detection
46 Convolutional neural network for reticular shadow detection
48 Convolutional neural network for honeycomb lung detection
50 Priority setting unit
52 Classification unit
54 Storage unit
56 Severity storage unit
60 Input control unit
62 Display control unit
64 Bus
70 Image processing apparatus
72 Global information acquisition unit
74 Key finding storage unit
100 Processor
102 Memory
104 Storage device
106 Network controller
108 Power supply device
110 Display controller
112 Input/output interface
114 Input controller
G1 to G8 Images
R1 Region
S1 to S5, S11 Steps of the classification method

Claims (20)

1. A classification device comprising:
an image acquisition unit that acquires an image;
a score calculation unit that calculates a score indicating a degree to which at least one region of the image belongs to each class of a plurality of predetermined classes;
a priority setting unit that sets a priority for each of the plurality of classes; and
a classification unit that classifies the region into one of the classes whose calculated score exceeds a threshold value, the classification unit classifying based on the priority.
2. The classification device according to claim 1, wherein the classification unit classifies the region into the class having the relatively higher priority.
3. The classification device according to claim 1 or 2, wherein the image is a medical image obtained by imaging a patient, and each of the plurality of classes is a class relating to a lesion of the patient.
4. The classification device according to claim 3, further comprising a global information acquisition unit that acquires global information about the image, wherein the priority setting unit sets the priority for each of the plurality of classes based on the global information.
5. The classification device according to claim 4, wherein the global information is information on a disease name associated with the patient or on an age of the patient.
6. The classification device according to claim 4 or 5, wherein the priority is determined based on a severity of the lesion.
7. The classification device according to any one of claims 4 to 6, wherein the priority is determined based on a temporal change of the lesion.
8. A classification device comprising:
an image acquisition unit that acquires an image;
a score calculation unit that calculates a score indicating a degree to which at least one region of the image belongs to each class of a plurality of predetermined classes;
a global information acquisition unit that acquires global information about the image; and
a classification unit that classifies the region into one of the classes whose calculated score exceeds a threshold value, the classification unit classifying based on the global information.
9. The classification device according to claim 8, wherein the classification unit classifies the region into a class that matches the global information.
10. The classification device according to claim 8 or 9, wherein the image is a medical image obtained by imaging a patient, and each of the plurality of classes is a class relating to a lesion of the patient.
11. The classification device according to claim 10, wherein the global information is information on a disease name associated with the patient, and the classification device further comprises a key finding storage unit that stores classes matching the disease name.
12. The classification device according to any one of claims 3 to 7, 10, and 11, wherein the medical image is a CT (Computed Tomography) image of a lung.
13. The classification device according to claim 12, wherein the classes relating to the lesion of the patient include at least one of honeycomb lung, ground glass shadow, reticular shadow, and linear shadow.
14. The classification device according to any one of claims 1 to 13, wherein the score calculation unit comprises a plurality of feature amount calculation units that each calculate a feature amount corresponding to one of the plurality of classes, and calculates each score based on the feature amount corresponding to the class.
15. The classification device according to claim 14, wherein each of the plurality of feature amount calculation units is constituted by a convolutional neural network.
16. A classification result display device comprising:
the classification device according to any one of claims 1 to 15;
a display device; and
a display control unit that displays, on the display device, the class into which the region has been classified.
17. A classification method comprising:
an image acquisition step of acquiring an image;
a score calculation step of calculating a score indicating a degree to which at least one region of the image belongs to each class of a plurality of predetermined classes;
a priority setting step of setting a priority for each of the plurality of classes; and
a classification step of classifying the region into one of the classes whose calculated score exceeds a threshold value, the classification being based on the priority.
18. A classification method comprising:
an image acquisition step of acquiring an image;
a score calculation step of calculating a score indicating a degree to which at least one region of the image belongs to each class of a plurality of predetermined classes;
a global information acquisition step of acquiring global information about the image; and
a classification step of classifying the region into one of the classes whose calculated score exceeds a threshold value, the classification being based on the global information.
19. A program for causing a computer to execute the classification method according to claim 17 or 18.
20. A non-transitory computer-readable recording medium that causes a computer to execute the program according to claim 19 when instructions stored on the recording medium are read by the computer.
PCT/JP2019/044869 2018-11-28 2019-11-15 Classification device, classification method and program, and classification result display device WO2020110776A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018222662 2018-11-28
JP2018-222662 2018-11-28

Publications (1)

Publication Number Publication Date
WO2020110776A1 true WO2020110776A1 (en) 2020-06-04

Family

ID=70851987

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/044869 WO2020110776A1 (en) 2018-11-28 2019-11-15 Classification device, classification method and program, and classification result display device

Country Status (1)

Country Link
WO (1) WO2020110776A1 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009232981A (en) * 2008-03-26 2009-10-15 Fujifilm Corp Image display apparatus and program for the same
JP2010079398A (en) * 2008-09-24 2010-04-08 Fuji Xerox Co Ltd Similar image providing device and program
JP2018513507A (en) * 2015-03-20 2018-05-24 フラウンホッファー−ゲゼルシャフト ツァ フェルダールング デァ アンゲヴァンテン フォアシュンク エー.ファオ Relevance score assignment for artificial neural networks
JP2018175226A (en) * 2017-04-10 2018-11-15 富士フイルム株式会社 Medical image classification device, method, and program
JP2018175217A (en) * 2017-04-10 2018-11-15 富士フイルム株式会社 Image processing apparatus, method and program


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19889368

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19889368

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP