WO2010050333A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
WO2010050333A1
WO2010050333A1 (PCT/JP2009/067205)
Authority
WO
WIPO (PCT)
Prior art keywords
estimated value
class
certainty factor
image
information processing
Prior art date
Application number
PCT/JP2009/067205
Other languages
French (fr)
Japanese (ja)
Inventor
Daisuke Kaji
Original Assignee
Konica Minolta Medical & Graphic, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Medical & Graphic, Inc.
Priority to JP2010535737A (JPWO2010050333A1)
Publication of WO2010050333A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/56 Details of data transmission or power supply, e.g. use of slip rings
    • A61B6/563 Details of data transmission or power supply, e.g. use of slip rings involving image data transmission via a network
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4416 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/217 Validation; Performance evaluation; Active pattern learning techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/02 Devices for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03 Computerised tomographs
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/12 Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/13 Tomography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/56 Details of data transmission or power supply
    • A61B8/565 Details of data transmission or power supply involving data transmission via a network
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10116 X-ray image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03 Recognition of patterns in medical or anatomical images
    • G06V2201/031 Recognition of patterns in medical or anatomical images of internal organs
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • the present invention relates to an information processing apparatus.
  • Machine learning techniques are used to support user decision making. For example, when image processing is applied to an X-ray image, the appropriate type and parameters of the processing depend on the imaging region and imaging direction in which the X-ray imaging was performed, so the imaging region and imaging direction of the X-ray image must first be identified. It is therefore known to build a discriminator by a machine learning method, discriminate the imaging region and imaging direction of the X-ray image with that discriminator, and select image processing according to the discriminated imaging region and/or imaging direction (see, for example, Patent Document 1).
  • Machine learning methods include support vector machines and neural networks, as well as boosting methods such as AdaBoost and AdaBoostMlt.
  • Discrimination by a classifier built with machine learning does not always give the correct answer.
  • For example, an X-ray image of the chest and an X-ray image of the thoracic vertebrae have similar image characteristics and may be misclassified.
  • If the imaging region is nevertheless forcibly discriminated and image processing specialized for that region is selected, inappropriate processing will be applied whenever the discrimination is wrong.
  • Moreover, since only the discrimination result of the discriminator is provided to the user, the user cannot judge whether the result can be trusted.
  • An object of the present invention is to calculate a certainty factor for the discrimination result of the discriminator.
  • There is provided an information processing apparatus including a discriminator that outputs, as estimated values, the probabilities that an image belongs to each of a plurality of classes, computed from feature quantities obtained by analyzing the image, and that determines the class to which the image belongs based on those estimated values.
  • In one aspect, the control means obtains the difference between the maximum estimated value and each of the other per-class estimated values output by the discriminator, and outputs a binary certainty factor depending on whether the number of classes whose difference is within a threshold is equal to or greater than a predetermined number.
  • In another aspect, the control means outputs, as the certainty factor, the absolute value of the difference between the maximum estimated value and the second largest estimated value among the per-class estimated values output by the discriminator.
  • In another aspect, the control means obtains the absolute value of the difference between the maximum and second largest estimated values and outputs a binary certainty factor depending on whether that absolute value is smaller than a threshold.
  • In another aspect, the control means outputs a binary certainty factor depending on whether the maximum of the per-class estimated values output by the discriminator is equal to or less than a threshold.
  • The control means may normalize the per-class estimated values output by the discriminator and calculate the certainty factor using the normalized values (claims 1 to 5).
  • The control means may determine whether the calculated certainty factor is high or low and, when it is determined to be low, cause the display means to display warning information warning of the low certainty together with the class determination result (claims 1 to 6).
  • The image may be a medical image of a patient captured by an image generation apparatus. In that case, the control means determines whether the calculated certainty factor is high or low, and image processing means applies image processing specialized for the class having the maximum estimated value to the medical image when the certainty factor is determined to be high, and general-purpose image processing when it is determined to be low (claims 1 to 7).
  • The class of the medical image may be a class for each type of image generation apparatus with which the medical image was captured.
  • Alternatively, the class of the medical image may be a class for each imaging region in which the medical image was captured, a class for each imaging direction, or a class for each combination of imaging region and imaging direction.
  • The discriminator may be configured by the AdaBoostMlt method and discriminate at least three classes, the estimated values it outputs being given together with values corresponding to the certainty factor (claims 1 to 10).
  • According to the invention, the certainty factor for the discrimination result of the discriminator can be calculated, and information on that certainty factor can be provided together with the discrimination result.
  • FIG. 1 is a diagram showing a system configuration of a medical image system 1 including an information processing apparatus 3 according to the present embodiment.
  • FIG. 2 shows an arrangement example of each device when the medical image system 1 is used in a small-scale medical facility.
  • The medical image system 1 is built for a relatively small medical facility, such as a private practice or clinic, and generates and manages medical images by examining and imaging patients.
  • Medical images are managed using patient information.
  • The correspondence between a medical image and patient information is determined by a doctor, and the determined correspondence is input to the medical image system 1 by the doctor's operation.
  • The medical image system 1 stores the medical image and the patient information in association with each other according to that input.
  • In a medical image system built for a large facility such as a general hospital, order information containing patient information and examination information is issued and medical images are managed based on it; this point differs from the medical image system 1 for a small facility.
  • With order information, the imaging region and imaging direction of a medical image can easily be determined.
  • When no order information is available, as in the small-facility medical image system 1, the doctor must identify the imaging region and imaging direction of each medical image. To reduce this burden on the doctor, the medical image system 1 discriminates the imaging region and imaging direction with a discriminator.
  • the medical imaging system 1 includes an ultrasonic diagnostic apparatus 2a, an endoscope apparatus 2b, and a CR (Computed Radiography) apparatus 2c.
  • the ultrasonic diagnostic apparatus 2a, the endoscope apparatus 2b, and the CR apparatus 2c are one type of image generation apparatus that generates a medical image.
  • the medical image system 1 includes an information processing device 3 and a reception device 4 as shown in FIG.
  • the information processing apparatus 3 is preferably a WS (workstation) provided in an examination room where a doctor is resident.
  • the ultrasonic diagnostic apparatus 2a, the endoscope apparatus 2b, the CR apparatus 2c, the information processing apparatus 3, and the receiving apparatus 4 are connected to the network 5 via a switching hub (not shown).
  • the network 5 is, for example, a LAN (Local Area Network).
  • As the communication standard, DICOM (Digital Imaging and Communications in Medicine), the standard generally used for medical data, is employed.
  • the ultrasonic diagnostic apparatus 2a emits ultrasonic waves and generates a medical image based on the reflected waves.
  • A conversion device 21 is connected to the ultrasonic diagnostic apparatus 2a, which is connected to the network 5 via the conversion device 21.
  • the conversion device 21 performs conversion from an analog signal to a digital signal, and converts the format into a format compliant with DICOM when the medical image is in a format not compliant with DICOM.
  • the conversion device 21 adds a UID (unique ID) for individually specifying a medical image in the medical image system 1 to the medical image.
  • the UID is created by combining, for example, a device ID unique to the ultrasound diagnostic apparatus 2a and an imaging date / time.
  • the endoscope apparatus 2b includes a small imaging device provided at the distal end portion of the tube, and performs imaging using the imaging device to generate a medical image.
  • the CR device 2c includes an X-ray source that emits X-rays, an X-ray detector that detects X-rays, and the like.
  • The X-ray detector may be a cassette containing a stimulable phosphor plate that stores X-ray energy, or it may be an FPD (Flat Panel Detector).
  • the FPD includes an X-ray detection element and a photoelectric conversion element arranged in a matrix. The photoelectric conversion element photoelectrically converts the X-ray detected by the X-ray detection element to generate a medical image.
  • a reading unit is provided in the CR device 2c.
  • the reading unit irradiates the stimulable phosphor plate with excitation light, and photoelectrically converts the stimulated light emitted from the stimulable phosphor plate to generate a medical image.
  • the endoscope apparatus 2b and the CR apparatus 2c give a UID to the medical image generated by each.
  • The UID is created by combining the device ID of the endoscope apparatus 2b or the CR apparatus 2c that generated the medical image with the imaging date and time.
  • the information processing device 3 performs, for example, information processing of medical images, patient information, and electronic medical record information, displays information necessary for a doctor's examination, and performs image processing on the medical images.
  • FIG. 3 is a diagram illustrating a functional configuration of the information processing apparatus 3. As illustrated in FIG. 3, the information processing apparatus 3 includes a control unit 31, an operation unit 32, a display unit 33, a communication unit 34, a storage unit 35, and an image processing unit 36.
  • the control unit 31 includes a CPU (Central Processing Unit) and a RAM (Random Access Memory).
  • the control unit 31 reads out various programs stored in the storage unit 35 and expands them in the RAM, performs various calculations according to the expanded programs, and centrally controls each component.
  • The control unit 31 implements a discriminator 3a that determines the class to which input data belongs.
  • In this embodiment, the discriminator 3a takes a medical image as input data, discriminates the class of the imaging region of the medical image, and outputs the discrimination result.
  • The control unit 31 also serves as the control means that calculates a certainty factor for the discrimination result of the discriminator 3a and causes the display unit 33 to display it.
  • the operation unit 32 includes, for example, a keyboard and a mouse, generates an operation signal corresponding to a user operation, and outputs the operation signal to the control unit 31.
  • the display unit 33 includes a display, and is a display unit that displays a medical image and various operation screens on the display according to display control of the control unit 31.
  • the communication unit 34 includes a communication interface and communicates with an external device on the network 5. For example, a medical image is received from the CR device 2c.
  • the storage unit 35 stores programs used in the control unit 31, parameters necessary for executing the programs, and files.
  • a hard disk or a semiconductor nonvolatile memory can be used as the storage unit 35.
  • the storage unit 35 stores a medical image database. This database includes, for example, a medical image, a UID assigned to the medical image, and patient information associated with the medical image.
  • the storage unit 35 stores information on the types and parameters of image processing specialized for each imaging region for each imaging region (for example, chest, abdomen, and head) of the medical image.
  • the image processing parameter specialized for the imaging region refers to a parameter designed to obtain an appropriate processing result according to the imaging region.
  • For example, the storage unit 35 stores, for the chest, information on gradation conversion parameter 1 and frequency enhancement parameter 2 and, for the abdomen, information on gradation conversion parameter 1, and so on.
  • the storage unit 35 stores general-purpose image processing types and parameter information.
  • General-purpose image processing parameters are parameters designed to be applicable to medical images of any imaging region regardless of the imaging region.
  • the image processing unit 36 is an image processing unit that performs various types of image processing on medical images.
  • Image processing may be realized as software processing by storing a program for image processing in the storage unit 35 and cooperation between the program and the CPU.
  • Various image processing may be realized by hardware like an image processing circuit.
  • Examples of the image processing include gradation conversion processing, frequency enhancement processing, dynamic range compression processing, granularity suppression processing, enlargement / reduction processing, display position adjustment processing, and black and white inversion processing.
  • Examples of image processing parameters include the gradation conversion curve used in gradation conversion processing (a curve defining the relationship between input and output pixel values so that the output reaches the target gradation), the slope of the normalization curve used in gradation conversion (the G value, indicating contrast), the y-intercept of the normalization curve (the S value, a density correction value), the degree of enhancement in frequency enhancement processing, and the enlargement (reduction) ratio used in enlargement/reduction processing.
  • the reception device 4 performs, for example, reception registration, accounting calculation, and insurance score calculation for a patient who has visited the hospital.
  • As shown in FIG. 2, a patient reception 11 and a waiting room 12 are located near the entrance 10.
  • the reception device 4 is arranged at the reception 11.
  • the information processing apparatus 3 and the ultrasonic diagnostic apparatus 2a are arranged in the examination room 13, and the CR apparatus 2c is arranged in the X-ray imaging room 15.
  • an endoscope apparatus 2b is arranged in the examination room 16.
  • At the reception 11, the person in charge gives each arriving patient a reception number tag printed with a reception number that identifies the patient.
  • the receptionist inputs patient information such as the patient's name, age, and address, and the reception number assigned to the patient into the reception device 4.
  • patient information is stored in a database, and a patient list in which patient names are arranged in the order of reception numbers is created.
  • the patient list is transmitted to the information processing device 3.
  • When the doctor determines from the examination that imaging is necessary, the doctor operates the information processing apparatus 3 to display the patient list and selects the patient to be imaged from it. The doctor then guides the patient to, for example, the CR apparatus 2c, sets the imaging region and imaging direction, and performs the imaging operation on the CR apparatus 2c. After imaging, the CR apparatus 2c generates a medical image, attaches a UID to it, and transmits it to the information processing apparatus 3. When imaging is finished, the doctor and the patient return to the examination room 13.
  • The discriminator 3a then determines the imaging region of the medical image. The control unit 31 calculates feature quantities of the medical image and inputs them to the discriminator 3a as input data. Examples of feature quantities of a medical image include the average pixel value, the standard deviation, the variance, edge information, higher-order autocorrelation functions, and density gradient information.
  • From these feature quantities, the discriminator 3a discriminates the imaging region, such as the chest, abdomen, or foot. Since image processing parameters corresponding to each imaging region are stored in the storage unit 35, the image processing unit 36 performs image processing using those parameters.
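By way of illustration only, a minimal Python sketch of this feature-extraction step follows, assuming NumPy and a 2-D grayscale image; the patent names only example feature types, so the concrete computations here (gradient-based edge and density-gradient measures) are assumptions, not the patented feature set.

```python
# A minimal sketch of feature extraction (step S1), assuming a 2-D grayscale
# medical image as a NumPy array; the exact feature set of the discriminator
# is not specified beyond the examples listed in the text.
import numpy as np

def extract_features(image: np.ndarray) -> np.ndarray:
    """Compute simple per-image statistics like those named in the text."""
    mean = image.mean()                      # average pixel value
    std = image.std()                        # standard deviation
    var = image.var()                        # variance
    # Edge information: mean gradient magnitude as a crude stand-in.
    gy, gx = np.gradient(image.astype(float))
    edge_strength = np.hypot(gx, gy).mean()
    # Density gradient information: mean signed gradient along each axis.
    density_grad = np.array([gx.mean(), gy.mean()])
    return np.concatenate([[mean, std, var, edge_strength], density_grad])
```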
  • In this way, the discriminator 3a is used to discriminate the imaging region of the medical image, assisting the doctor in identifying it.
  • the discriminator 3a is configured by a machine learning technique such as AdaBoost, AdaBoostMlt, artificial neural network, and support vector machine.
  • AdaBoostMlt is an algorithm for creating a final classifier by linear combination of a plurality of weak classifiers.
  • A weak classifier is a classifier whose individual discrimination accuracy may be low; even so, if the weak classifiers are sufficiently diverse, their combination can approximate the empirical distribution of the learning data.
  • AdaBoost is a technique for building classifiers that discriminate between two classes, whereas AdaBoostMlt builds classifiers for multi-class discrimination over two or more classes.
  • Let the input be $x \in X \subset \mathbb{R}^n$, where $\mathbb{R}^n$ denotes n-dimensional space and $X$, a subset of it, is the set of inputs, and let the set of output classes be $Y = \{1, \ldots, K\}$. The set of mappings from inputs to the power set $2^Y$ is written $W^{mlt} \equiv \{h^{mlt} : X \rightarrow 2^Y\}$.
  • The set $W$ of weak classifiers $h$ is written $W \equiv \{h : X \times Y \rightarrow \mathbb{R}\}$, with $h(x, y) = [y \in h^{mlt}(x)]$ for $h^{mlt} \in W^{mlt}$, where $[\cdot]$ is the indicator function that is 1 when the bracketed condition holds and 0 otherwise.
  • The empirical distribution given by the learning data is $p_0(x, y) = \frac{1}{N} \sum_{i=1}^{N} [x = x_i][y = y_i]$, where $x$ denotes the vector of feature quantities of a medical image adopted as learning data, $y$ denotes the known class of that image, and $N$ is the number of learning samples.
  • An expanded distribution $p_0(x, y, y')$ over a correct class $y$ and another class $y'$ is also defined.
  • For each weak classifier $h$, a discrimination error function $e_t(h)$, the total evaluation score of correct discrimination results $p^+(h, p)$, the total evaluation score of erroneous discrimination results $p^-(h, p)$, and a loss function $L(F)$ are defined.
  • To obtain the function $f_\lambda(x, y)$ of the discriminator 3a, the control unit 31 first takes the learning data $(x_1, y_1), \ldots, (x_N, y_N)$ as input and initializes the weight coefficients $\lambda_t$ and the learning iteration count $t$. It then repeats the following steps:
  • (1) Select the weak classifier $h$ that minimizes the discrimination error function $e(p_t, h_t)$; that is, calculate $e(p_t, h_t)$ while changing the combination of weak classifiers $h$ and select the weak classifier with the smallest discrimination error.
  • (2) Set $\lambda_t$ by the update $\lambda_t \leftarrow \lambda_{t-1} + (0, \ldots, \delta_t, \ldots, 0)$, where $\delta_t$ is a parameter that adjusts the weight coefficient of the selected weak classifier $h_t$.
  • the class of the imaging region of the medical image is determined by the classifier 3a, and the certainty factor for the determination result is calculated and displayed by the control unit 31.
  • the certainty factor refers to the degree of reliability with respect to the discrimination result of the discriminator 3a.
  • the image processing unit 36 performs image analysis of a medical image and calculates a feature amount (step S1).
  • the feature amount obtained by the image analysis is input to the discriminator 3a, and the discriminator 3a outputs the degree of possibility that the medical image belongs to the class of each imaging region (this is referred to as an estimated value).
  • the estimated value is given with a value corresponding to the certainty factor for the discrimination result of the discriminator 3a.
  • That is, for an input feature quantity $x$, the function $f_\lambda(x, y)$ of the discriminator 3a outputs, for each class $k$, an estimated value $y$ indicating the likelihood that the medical image belongs to class $k$.
  • The discriminator 3a determines that the imaging region of the medical image is that of the class with the maximum estimated value $y$ (step S2). For example, when discriminating the three imaging-region classes chest, abdomen, and foot, an estimated value $y$ is obtained for each of the three; if the chest value is the largest, the imaging region of the medical image is determined to be the chest.
  • The certainty factor could be calculated directly from the estimated values $y$ output by the function $f_\lambda(x, y)$ of the discriminator 3a. However, $f_\lambda(x, y)$ is a linear combination, i.e. a sum, of the outputs of multiple weak classifiers, so its value range varies with the number of weak classifiers. The control unit 31 therefore normalizes the estimated values $y$ (step S3). Normalizing so that each estimated value lies in the range $0 \le y \le 1$ gives the normalized function $f'_\lambda(x, y)$, where $Z(y)$ is a normalization constant determined so that the normalized estimated values sum to 1.
  • Alternatively, the exponent of the estimated value $y$ may be normalized, and the exponent-normalized estimated value used to calculate the certainty factor. The exponent-normalized value is the estimated value under the conditional probability distribution of the discriminator 3a, $p(y \mid x) = \frac{1}{Z} e^{f_\lambda(x, y)}$, where $Z$ is a normalization constant.
  • the control unit 31 calculates a certainty factor for the discrimination result by the discriminator 3a using the normalized estimated value y.
  • Specifically, the control unit 31 obtains the difference between the maximum of the normalized per-class estimated values $y$ and each of the other estimated values (step S4).
  • If the number of classes whose difference is within a threshold reaches a predetermined number (step S5), the control unit 31 outputs a certainty factor of 0 (step S6);
  • otherwise, it outputs a certainty factor of 1 (step S7).
  • A value of 1 indicates that the certainty is high, and a value of 0 indicates that it is low.
  • The class with the maximum estimated value $y$ is output as the discrimination result of the discriminator 3a. However, if the differences between the maximum estimated value and the estimated values of other classes are small, the discrimination is considered to have been difficult and the reliability of the result low; a binary certainty factor is therefore output according to how many classes lie close to the maximum.
  • For example, suppose the estimated values $y$ (normalized to the range 0.00 to 1.00) shown in Table 1 are obtained for the three classes chest, abdomen, and foot, with the chest value the largest; let the threshold be 0.20 and the predetermined number of classes be 2. The differences of the abdomen and foot values from the maximum both exceed the threshold, so the number of classes within the threshold is 0. Since this does not reach the predetermined number 2, the certainty factor is set to 1.
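A minimal Python sketch of this first method (steps S4 to S7), assuming normalized estimates and the threshold 0.20 and class count 2 from the example above; the estimate vector in the usage line is hypothetical, since Table 1 itself is not reproduced in this text.

```python
# Binary certainty from the number of rival classes near the maximum.
import numpy as np

def certainty_by_near_ties(estimates, threshold=0.20, min_classes=2):
    """Return 0 (low certainty) if at least `min_classes` other classes lie
    within `threshold` of the maximum estimate, else 1 (high certainty)."""
    est = np.sort(np.asarray(estimates, dtype=float))[::-1]
    diffs = est[0] - est[1:]                 # step S4: gaps to the maximum
    near = np.sum(diffs <= threshold)        # rival classes within threshold
    return 0 if near >= min_classes else 1   # steps S5-S7

# Hypothetical chest / abdomen / foot values:
print(certainty_by_near_ties([0.60, 0.25, 0.15]))  # gaps 0.35, 0.45 -> 1
```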
  • The certainty factor may be any index that can indicate the degree of reliability of the discrimination result, and it may be calculated by other methods. One such method outputs a binary certainty factor according to whether the difference between the maximum estimated value $y$ and the second largest estimated value $y$ (hereinafter, the quasi-estimated value) is within a threshold. This method focuses on the relative position of the maximum estimated value, that is, how far the class of the maximum estimated value is separated from the quasi-estimated value.
  • The control unit 31 obtains the absolute value of the difference between the maximum estimated value and the quasi-estimated value; if that absolute value is at or below the threshold, it outputs a certainty factor of 0, and otherwise a certainty factor of 1, where 1 indicates high certainty and 0 low certainty. This is because, if the difference from the quasi-estimated value is within the threshold, the class was likely difficult to distinguish from the quasi-estimated class, so the reliability of the result is considered low.
  • For example, suppose the estimated values shown in Table 3 are obtained for the three classes chest, abdomen, and foot, with a maximum estimated value of 0.30 for the chest and a quasi-estimated value of 0.20 for the abdomen. The absolute value of their difference is 0.10, so with a threshold of 0.15 the difference is at or below the threshold and a certainty factor of 0 is output.
  • Alternatively, the absolute value of the difference between the maximum estimated value $y$ and the quasi-estimated value $y$ may itself be output as the certainty factor.
  • This absolute difference expresses how much more likely the most probable class (that of the maximum estimated value) is than the second most probable class (that of the quasi-estimated value), and thus indicates the reliability of the discrimination result.
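The two quasi-estimate variants can be sketched as follows; the values in the usage lines mirror the Table 3 example above (maximum 0.30, quasi-estimate 0.20, threshold 0.15).

```python
# Certainty from the gap between the maximum and quasi (second) estimates,
# used either directly as a continuous value or thresholded into a binary one.
import numpy as np

def gap_to_runner_up(estimates) -> float:
    est = np.sort(np.asarray(estimates, dtype=float))[::-1]
    return abs(est[0] - est[1])              # continuous certainty factor

def certainty_by_gap(estimates, threshold=0.15) -> int:
    return 0 if gap_to_runner_up(estimates) <= threshold else 1

print(gap_to_runner_up([0.30, 0.20, 0.10]))  # 0.10
print(certainty_by_gap([0.30, 0.20, 0.10]))  # 0.10 <= 0.15 -> 0
```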
  • Another method sets the certainty factor to 0 if the maximum estimated value $y$ is equal to or below a threshold and to 1 if it exceeds the threshold. Although the maximum estimated value is the largest among the classes, if the value itself is so small that it does not reach a certain level, the reliability of the discrimination result is considered low.
  • For example, suppose the maximum estimated value $y$ is 0.30 for the chest and the threshold is 0.5. Since the maximum estimated value is at or below the threshold, a certainty factor of 0 is output.
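A one-line sketch of this maximum-value check, using the example threshold of 0.5 from above:

```python
# Binary certainty from the size of the maximum estimate itself: 0 when the
# largest estimate fails to reach the threshold, however it compares with
# the other classes.
def certainty_by_peak(estimates, threshold=0.5) -> int:
    return 0 if max(estimates) <= threshold else 1

print(certainty_by_peak([0.30, 0.20, 0.10]))  # 0.30 <= 0.5 -> 0
```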
  • Yet another method sets in advance combinations of classes that should be easy to discriminate, and compares the maximum estimated value $y$ with the estimated value of the class paired with the class of the maximum.
  • For example, suppose the maximum estimated value $y$ is 0.50 for the chest, the chest and the foot are set as an easily discriminated pair, the threshold is 0.20, and the difference between the chest and foot estimated values is 0.03.
  • Although the image characteristics of the chest and foot differ greatly and the discrimination should be easy, the difference between their estimated values is below the threshold, so the reliability is considered low.
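A sketch of this preset-pair check; the pair list, the dictionary representation, and the abdomen value in the usage line are illustrative assumptions consistent with the example above.

```python
# Low certainty when classes registered as easy to tell apart (e.g. chest
# vs. foot) end up with nearly tied estimates.
EASY_PAIRS = [("chest", "foot")]             # hypothetical preset pairs

def certainty_by_easy_pairs(estimates: dict, threshold=0.20) -> int:
    top = max(estimates, key=estimates.get)  # class of the maximum estimate
    for a, b in EASY_PAIRS:
        if top in (a, b):
            other = b if top == a else a
            if abs(estimates[top] - estimates[other]) <= threshold:
                return 0                     # easy pair yet nearly tied: low
    return 1

print(certainty_by_easy_pairs({"chest": 0.50, "abdomen": 0.30, "foot": 0.47}))  # 0
```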
  • the control unit 31 determines whether the certainty factor is high or low based on the calculated certainty factor value (step S8).
  • The certainty factor may be output either as a binary value of 1 or 0, or as a continuous value such as the absolute difference between the maximum and quasi-estimated values.
  • When it is binary, 0 indicates low certainty and 1 high certainty, so the control unit 31 judges high or low simply according to whether the value is 1 or 0.
  • When it is continuous, for example the absolute difference between the maximum and quasi-estimated values, the control unit 31 judges the certainty low if that value is at or below a threshold and high otherwise.
  • When the control unit 31 determines that the certainty is high (step S8; Y), the image processing unit 36 applies image processing specialized for the class output as the discrimination result of the discriminator 3a to the medical image (step S9). For example, when the discriminator 3a determines that the imaging region of the medical image is the chest, the control unit 31 acquires the image processing type and parameter information defined for the chest from the storage unit 35 and outputs it to the image processing unit 36, which executes that type of processing with the defined parameters.
  • When the control unit 31 determines that the certainty is low (step S8; N), the image processing unit 36 applies general-purpose image processing to the medical image (step S10). For example, even if the discriminator 3a determines the imaging-region class to be the chest, the control unit 31 acquires the general-purpose image processing type and parameter information from the storage unit 35 and outputs it to the image processing unit 36.
  • The image processing unit 36 then executes the image processing defined as general-purpose with the predetermined parameters.
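A minimal sketch of this confidence-gated selection (steps S8 to S10), with hypothetical parameter tables standing in for the information held in the storage unit 35:

```python
# Dispatch between region-specific and general-purpose processing parameters.
REGION_PARAMS = {"chest": {"gradation": 1, "frequency": 2},
                 "abdomen": {"gradation": 1}}   # hypothetical tables
GENERAL_PARAMS = {"gradation": 0}                # region-independent defaults

def select_processing(region: str, certainty: int) -> dict:
    # Step S8: high certainty -> region-specific parameters (step S9);
    # low certainty -> general-purpose parameters (step S10).
    if certainty == 1:
        return REGION_PARAMS.get(region, GENERAL_PARAMS)
    return GENERAL_PARAMS

print(select_processing("chest", 1))   # specialized chest parameters
print(select_processing("chest", 0))   # general-purpose parameters
```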
  • After the image processing, a viewer screen D1 as shown in FIG. 5 is displayed on the display unit 33 under the display control of the control unit 31.
  • a patient information field d1 is a display column for patient information of a patient on which a medical image is displayed.
  • the image display field d2 is a display field for medical images obtained by examination imaging.
  • the image adjustment column d3 is a display column for operation buttons used for operating parameters of image processing performed on the displayed medical image.
  • The image display field d2 includes imaging region information d21 for the displayed medical image; this imaging region is the result determined by the discriminator 3a. If the patient information displayed in the patient information field d1 corresponds to the medical image displayed in the image display field d2 and the displayed imaging region is correct, the doctor operates the OK button d22; if the correspondence or the imaging region is incorrect, the doctor operates the NG button d23.
  • When the OK button d22 is operated, the control unit 31 of the information processing apparatus 3 registers the medical image, the patient information, the UID assigned to the medical image, and so on in the database and stores them in the storage unit 35.
  • When the NG button d23 is operated, a correction screen for correcting the imaging region is displayed under the display control of the control unit 31, and the doctor inputs on it the imaging region determined by visual inspection.
  • The image processing unit 36 then performs image processing again with parameters according to the input imaging region, and the viewer screen is displayed with the reprocessed medical image.
  • For a medical image whose certainty factor is determined to be low, the control unit 31 shades or colors the text of the imaging region on the viewer screen D1 to warn of the low certainty, and also displays a message d4 such as "Please check the imaging region."
  • Since the doctor can easily determine the imaging region of the medical image by visual inspection, the message d4 allows immediate confirmation of whether the discrimination result of the discriminator 3a is erroneous.
  • A viewer screen D2 as shown in FIG. 6 may further be displayed.
  • The viewer screen D2 is a screen that displays detailed information for each medical image.
  • On it, information d5 is displayed indicating that the discriminated imaging region is the chest, that its estimated value is 0.4, and that the certainty factor is 0.
  • A message d6 is also displayed notifying the user that general-purpose image processing, not image processing specialized for the chest, has been applied to the medical image.
  • As described above, the discriminator 3a outputs, as estimated values, the probabilities that the medical image belongs to each imaging-region class, computed from feature quantities obtained by analyzing the image, and discriminates the imaging-region class based on those values; the control unit 31 uses the per-class estimated values output by the discriminator 3a to calculate a certainty factor for the discrimination result, and the display unit 33 displays the calculated certainty factor. Information on the certainty of the discrimination result can thus be provided together with the discriminator 3a's discrimination of the imaging region.
  • Further, the control unit 31 determines whether the calculated certainty factor is high or low; when it is high, the image processing unit 36 applies to the medical image the image processing specialized for the imaging-region class with the maximum estimated value, and when it is low, it applies general-purpose image processing.
  • General-purpose processing can therefore be selected when the possibility of misdiscrimination is high, preventing inappropriate image processing from being applied.
  • Since the control unit 31 displays warning information such as a message, coloring, or shading on the display unit 33 to warn that the certainty is low, the user can easily recognize the low certainty.
  • The embodiment described above is a preferred example of the present invention, and the invention is not limited to it.
  • The characteristics of a medical image differ depending on the imaging direction and on the image generation apparatus with which it was captured. The present invention may therefore be applied to class discrimination per imaging direction instead of per imaging region, to discrimination per combination of imaging region and imaging direction, or to discrimination per image generation apparatus.
  • the present invention can be applied not only to medical images but also to general image discrimination by a classifier.
  • As a computer-readable medium holding the program, a nonvolatile memory such as a ROM or flash memory, or a portable recording medium such as a CD-ROM, can be used; a carrier wave can also be applied as a medium for providing the program data via a communication line.
  • It can be used for an information processing apparatus that performs discrimination using a discriminator.

Abstract

It is possible to calculate the certainty factor of an identification result obtained by an identification device. An information processing device includes an identification device and a control means.  The identification device outputs as an estimated value, a probability that an image belongs to each of a plurality of classes in accordance with a characteristic amount obtained by analyzing the image and identifies the class to which the image belongs according to the estimated value.  The control means uses the estimated value of each of the classes outputted from the identification device so as to calculate the certainty factor of the identification result obtained by the identification device.

Description

Information processing device

The present invention relates to an information processing apparatus.

Machine learning techniques are generally used to support user decision making. For example, when image processing is applied to an X-ray image, the appropriate type and parameters of the processing depend on the imaging region and imaging direction in which the X-ray imaging was performed, so the imaging region and imaging direction of the X-ray image must first be identified. It is therefore known to build a discriminator by a machine learning method, discriminate the imaging region and imaging direction of the X-ray image with that discriminator, and select image processing according to the discriminated imaging region and/or imaging direction (see, for example, Patent Document 1).

Machine learning methods include support vector machines and neural networks, as well as boosting methods such as AdaBoost and AdaBoostMlt.

JP 2008-11900 A

Discrimination by a classifier built with machine learning does not always give the correct answer. For example, an X-ray image of the chest and an X-ray image of the thoracic vertebrae have similar image characteristics and may be misclassified. If the imaging region is nevertheless forcibly discriminated and image processing specialized for that region is selected, inappropriate processing will be applied whenever the discrimination is wrong. In scenes where discrimination is difficult, it is preferable not to force a discrimination but to select general-purpose image processing that does not depend on the imaging region. However, since only the discrimination result of the classifier is provided to the user, the user cannot judge whether the result can be trusted.

An object of the present invention is to calculate a certainty factor for the discrimination result of a discriminator.
According to the invention of claim 1, there is provided an information processing apparatus comprising: a discriminator that outputs, as estimated values, the probabilities that an image belongs to each of a plurality of classes, computed from feature quantities obtained by analyzing the image, and that determines the class to which the image belongs based on the estimated values; and control means that uses the per-class estimated values output by the discriminator to calculate a certainty factor for the discrimination result of the discriminator and displays the calculated certainty factor on display means.
According to the invention of claim 2, in the information processing apparatus of claim 1, the control means obtains the difference between the maximum estimated value and each of the other per-class estimated values output by the discriminator, and outputs a binary certainty factor depending on whether the number of classes whose difference is within a threshold is equal to or greater than a predetermined number.
According to the invention of claim 3, in the information processing apparatus of claim 1, the control means outputs, as the certainty factor, the absolute value of the difference between the maximum estimated value and the second largest estimated value among the per-class estimated values output by the discriminator.
According to the invention of claim 4, in the information processing apparatus of claim 1, the control means obtains the absolute value of the difference between the maximum estimated value and the second largest estimated value among the per-class estimated values output by the discriminator, and outputs a binary certainty factor depending on whether that absolute value is smaller than a threshold.
According to the invention of claim 5, in the information processing apparatus of claim 1, the control means outputs a binary certainty factor depending on whether the maximum of the per-class estimated values output by the discriminator is equal to or less than a threshold.
According to the invention of claim 6, in the information processing apparatus of any one of claims 1 to 5, the control means normalizes the per-class estimated values output by the discriminator and calculates the certainty factor using the normalized estimated values.
According to the invention of claim 7, in the information processing apparatus of any one of claims 1 to 6, the control means determines whether the calculated certainty factor is high or low and, when it is determined to be low, causes the display means to display warning information that warns of the low certainty together with the class determination result.
According to the invention of claim 8, in the information processing apparatus of any one of claims 1 to 7, the image is a medical image of a patient captured by an image generation apparatus; the control means determines whether the calculated certainty factor is high or low; and the apparatus further comprises image processing means that, when the certainty factor is determined to be high, applies image processing specialized for the class having the maximum estimated value to the medical image and, when it is determined to be low, applies general-purpose image processing to the medical image.
According to the invention of claim 9, in the information processing apparatus of claim 8, the class of the medical image is a class for each type of image generation apparatus with which the medical image was captured.
According to the invention of claim 10, in the information processing apparatus of claim 8, the class of the medical image is one of a class for each imaging region in which the medical image was captured, a class for each imaging direction, and a class for each combination of imaging region and imaging direction.
According to the invention of claim 11, in the information processing apparatus of any one of claims 1 to 10, the discriminator is configured by the AdaBoostMlt method and discriminates at least three classes, and the estimated values output by the discriminator are given together with values corresponding to the certainty factor.
According to the invention of claim 12, in the information processing apparatus of any one of claims 1 to 11, the control means displays the calculated certainty factor on display means.
According to the present invention, the certainty factor for the discrimination result of the discriminator can be calculated, and information on the certainty factor can be provided together with the discrimination result.
FIG. 1 shows the medical image system including the information processing apparatus according to the present embodiment. FIG. 2 shows an example arrangement of the devices when the medical image system is used in a medical facility. FIG. 3 shows the functional configuration of the information processing apparatus. FIG. 4 is a flowchart of the processing executed by the information processing apparatus when calculating the certainty factor. FIGS. 5 and 6 show examples of the viewer screen.
Embodiments of the present invention will be described below with reference to the drawings. In this embodiment, in a medical image system that processes medical images, an example of an information processing apparatus is described that determines the class to which a medical image belongs (a class for each imaging region) with a discriminator and provides information on the certainty of the discrimination result.
FIG. 1 shows the system configuration of the medical image system 1 including the information processing apparatus 3 according to this embodiment. FIG. 2 shows an example arrangement of the devices when the medical image system 1 is used in a small-scale medical facility.
The medical image system 1 is built for a relatively small medical facility, such as a private practice or clinic, and generates and manages medical images by examining and imaging patients. Medical images are managed using patient information. The correspondence between a medical image and patient information is determined by a doctor, and the determined correspondence is input to the medical image system 1 by the doctor's operation; the medical image system 1 stores the medical image and the patient information in association with each other according to that input. In a medical image system built for a large medical facility such as a general hospital, order information containing patient information and examination information is issued and medical images are managed based on it; this point differs from the medical image system 1 for a small facility. With order information, the imaging region and imaging direction of a medical image can easily be determined, but when no order information is available, as in the small-facility medical image system 1, the doctor must identify the imaging region and imaging direction of each medical image. To reduce this burden on the doctor, the medical image system 1 discriminates the imaging region and imaging direction with a discriminator.
As shown in FIG. 1, the medical image system 1 includes an ultrasonic diagnostic apparatus 2a, an endoscope apparatus 2b, and a CR (Computed Radiography) apparatus 2c. Each of these is a kind of image generation apparatus that produces medical images.
The medical image system 1 also includes an information processing device 3 and a reception device 4, as shown in FIG. 1. The information processing device 3 is preferably a WS (workstation) placed in the examination room where the doctor is stationed.
The ultrasonic diagnostic apparatus 2a, endoscope apparatus 2b, CR apparatus 2c, information processing device 3, and reception device 4 are connected to a network 5 via a switching hub or the like (not shown). The network 5 is, for example, a LAN (Local Area Network). As the communication standard, DICOM (Digital Imaging and Communications in Medicine), the standard generally used for medical data, is employed.
Next, each device constituting the medical image system 1 will be described.
The ultrasonic diagnostic apparatus 2a emits ultrasonic waves and generates a medical image from the reflected waves.
A conversion device 21 is connected to the ultrasonic diagnostic apparatus 2a, which connects to the network 5 through it. The conversion device 21 converts analog signals to digital signals and, when a medical image is in a format not compliant with DICOM, converts it to a DICOM-compliant format. The conversion device 21 also attaches to each medical image a UID (unique ID) that identifies the image individually within the medical image system 1. The UID is created, for example, by combining a device ID unique to the ultrasonic diagnostic apparatus 2a with the imaging date and time.
The endoscope apparatus 2b has a small imaging device at the tip of its tube and generates medical images by photographing with it.
The CR apparatus 2c comprises an X-ray source that emits X-rays, an X-ray detector that detects the X-rays, and so on. The X-ray detector may be a cassette with a built-in photostimulable phosphor plate that stores X-ray energy, or an FPD (Flat Panel Detector). An FPD has X-ray detection elements and photoelectric conversion elements arranged in a matrix; the photoelectric conversion elements photoelectrically convert the X-rays detected by the X-ray detection elements to generate a medical image. When a cassette is used, a reading unit is provided in the CR apparatus 2c; it irradiates the phosphor plate with excitation light and photoelectrically converts the stimulated luminescence emitted from the plate to generate a medical image.
The endoscope apparatus 2b and the CR apparatus 2c each attach a UID to the medical images they generate. Each UID is created by combining the device ID of the endoscope apparatus 2b or CR apparatus 2c that generated the image with the imaging date and time.
The information processing device 3 processes, for example, medical images, patient information, and electronic medical record information; it displays information needed for the doctor's examination and applies image processing to medical images.
FIG. 3 shows the functional configuration of the information processing device 3.
As shown in FIG. 3, the information processing device 3 comprises a control unit 31, an operation unit 32, a display unit 33, a communication unit 34, a storage unit 35, and an image processing unit 36.
The control unit 31 comprises a CPU (Central Processing Unit) and RAM (Random Access Memory). It reads the programs stored in the storage unit 35, loads them into the RAM, performs various computations according to the loaded programs, and centrally controls each component.
The control unit 31 also constitutes a classifier 3a that determines the class to which input data belongs. Taking a medical image as input data, the classifier 3a determines the class of the image's imaging region and outputs the determination result. The control unit 31 serves as the control means that calculates a certainty factor for the classifier 3a's determination result and displays it on the display unit 33.
The operation unit 32 comprises, for example, a keyboard and mouse; it generates operation signals in response to user operations and outputs them to the control unit 31. A touch panel integrating the operation unit 32 with the display of the display unit 33 may also be used.
The display unit 33 comprises a display and serves as the display means that shows medical images and operation screens on the display under the display control of the control unit 31.
The communication unit 34 comprises a communication interface and communicates with external devices on the network 5; for example, it receives medical images from the CR apparatus 2c.
The storage unit 35 stores the programs used by the control unit 31 together with the parameters and files needed to run them. A hard disk or semiconductor non-volatile memory can be used as the storage unit 35.
The storage unit 35 also holds a database of medical images. The database consists of, for example, each medical image, the UID attached to it, and the patient information associated with it.
For each imaging region of a medical image (for example, chest, abdomen, head), the storage unit 35 stores the types and parameters of image processing specialized for that region. A parameter specialized for an imaging region is one designed to produce an appropriate processing result for that region. For example, the storage unit 35 stores parameter 1 of gradation conversion processing and parameter 2 of frequency enhancement processing for the chest, and parameter 1 of gradation conversion processing for the abdomen. The storage unit 35 also stores the types and parameters of general-purpose image processing. A general-purpose parameter is one designed to be applicable to a medical image of any imaging region.
The image processing unit 36 is the image processing means that applies various kinds of image processing to medical images. The image processing may be realized as software processing, by storing an image processing program in the storage unit 35 and having it cooperate with the CPU, or by hardware such as an image processing circuit.
Examples of the image processing include gradation conversion, frequency enhancement, dynamic range compression, granularity suppression, enlargement/reduction, display position adjustment, and black-and-white inversion.
Parameters of such image processing include the gradation conversion curve used in gradation conversion (a curve defining the relationship between input and output pixel values so that the output pixel values reach the target gradation), the slope of the normalization curve used in gradation conversion (the G value, indicating contrast), the y-intercept of the normalization curve (the S value, a density correction value), the degree of enhancement in frequency enhancement, and the enlargement (or reduction) ratio in enlargement/reduction.
The reception device 4 performs, for example, reception registration, billing, and insurance point calculation for patients visiting the facility.
Next, the flow of processing by the medical image system 1 will be described together with the workflow of the doctor and patient.
As shown in FIG. 2, a patient reception 11 and a waiting room 12 are located near the entrance 10, and the reception device 4 is placed at the reception 11. The information processing device 3 and the ultrasonic diagnostic apparatus 2a are placed in the examination room 13, the CR apparatus 2c in the X-ray imaging room 15, and the endoscope apparatus 2b in the examination room 16.
When a patient arrives, the receptionist at the reception 11 gives the patient a numbered ticket printed with a reception number that identifies each patient. The receptionist enters the patient information, such as name, age, and address, together with the patient's reception number, into the reception device 4. The reception device 4 stores the patient information in a database and creates a patient list in which the patients' names are arranged in order of reception number. The patient list is sent to the information processing device 3.
Patients move to the examination room 13 in the order of their reception numbers and are examined by the doctor. When the doctor decides from the examination that imaging is necessary, the doctor operates the information processing device 3 to display the patient list and selects the patient to be imaged from it. The doctor then guides the patient to, for example, the CR apparatus 2c, sets the imaging region and imaging direction, and instructs the CR apparatus 2c to perform imaging. After imaging, the CR apparatus 2c generates a medical image, attaches a UID to it, and sends it to the information processing device 3. When imaging is finished, the doctor and patient return to the examination room 13.
In the information processing device 3, the classifier 3a determines the imaging region of the medical image. For example, the control unit 31 calculates feature values of the medical image, and these are input to the classifier 3a as input data. Feature values of a medical image include the mean, standard deviation, and variance of the pixel values, edge information, higher-order autocorrelation functions, and density gradient information. From these features, the classifier 3a determines the imaging region, such as chest, abdomen, or foot. Since image processing parameters matching the imaging region are stored in the storage unit 35, the image processing unit 36 then applies image processing using those parameters.
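As a hedged sketch of this feature-calculation step, the fragment below computes a few of the listed statistics for a grayscale image held as a NumPy array. The specification does not fix the exact feature set or any library, so the choice of gradient magnitudes as a stand-in for the edge and density-gradient information is an assumption.

```python
import numpy as np

def image_features(img: np.ndarray) -> np.ndarray:
    """Compute simple global features of a grayscale image.

    Mean/std/variance of pixel values plus mean gradient magnitudes,
    the latter standing in for the 'edge information' and 'density
    gradient information' named in the description.
    """
    gy, gx = np.gradient(img.astype(float))  # row and column gradients
    return np.array([
        img.mean(),         # mean pixel value
        img.std(),          # standard deviation
        img.var(),          # variance
        np.abs(gx).mean(),  # mean horizontal gradient (edge info)
        np.abs(gy).mean(),  # mean vertical gradient (edge info)
    ])

# Feature vector x fed to the classifier 3a
x = image_features(np.random.rand(256, 256))
```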
The classifier 3a is thus used to determine the imaging region of a medical image, assisting the doctor in identifying it. The classifier 3a is built by a machine learning method such as AdaBoost, AdaBoostMlt, an artificial neural network, or a support vector machine. This embodiment shows an example of a classifier 3a built from learning data by the AdaBoostMlt method. The learning data are feature values of medical images whose imaging regions are already known. AdaBoostMlt is an algorithm that creates the final classifier as a linear combination of multiple weak classifiers. A weak classifier is one whose discrimination accuracy may be low; even so, provided the weak classifiers are sufficiently diverse, their combination can approximate the empirical distribution of the learning data.
The construction of the classifier 3a by the AdaBoostMlt method is as follows.
In general, AdaBoost is a method for building a two-class classifier, whereas AdaBoostMlt builds a classifier for multi-class classification with two or more classes.
Let x ∈ X ⊂ R^n (where R^n denotes an n-dimensional space and X, the set of inputs x, is a subset of it). The set of mappings from an input value x to the power set 2^Y of the set of output values Y = {1, …, K} is written as

    W_mlt = {h_mlt : X → 2^Y}

The set W of weak classifiers h is then expressed as

    W = {h : X × Y → R | h(x, y) = [y ∈ h_mlt(x)], h_mlt ∈ W_mlt}

where [·] denotes the function that outputs 1 when the condition inside the brackets holds and 0 otherwise.
When the learning data are written (x_1, y_1), …, (x_N, y_N), the empirical distribution p_0(x, y) given by the learning data is defined as

    p_0(x, y) = (1/N) Σ_{i=1…N} [x = x_i][y = y_i]

Here x is the vector data defined by the feature values of a medical image adopted as learning data, y is the class known for that medical image, and N is the number of learning data.
Formally extending the occurrence probability from the empirical distribution p_0(x, y) of the learning data to classes y′ other than the class y of learning data x, the extended distribution p_0(x, y, y′) is defined as

    p_0(x, y, y′) = p_0(x, y) / (K − 1)   (for y′ ≠ y; 0 when y′ = y)
Further, for each weak classifier h, the discrimination error function e(p, h), the total evaluation score p⁺(h, p) of correct determinations, the total evaluation score p⁻(h, p) of misdeterminations, and the loss function L(f) are defined as

    p⁺(h, p) = Σ_{x,y,y′} p(x, y, y′) [h(x, y) > h(x, y′)]
    p⁻(h, p) = Σ_{x,y,y′} p(x, y, y′) [h(x, y) < h(x, y′)]
    e(p, h)  = p⁻(h, p) − p⁺(h, p)
    L(f)     = Σ_{x,y,y′} p_0(x, y, y′) exp( f(x, y′) − f(x, y) )

where [·] again denotes the function that outputs 1 when the condition inside the brackets holds and 0 otherwise.
Under the above definitions, the weak classifiers h_1, …, h_T obtained by T rounds of learning are combined, and the function f_λ(x, y) of the finally obtained classifier 3a is written

    f_λ(x, y) = λ_1 h_1(x, y) + … + λ_T h_T(x, y)

where λ_1 to λ_T are the weighting coefficients of the weak classifiers h_1 to h_T.
To obtain the function f_λ(x, y) of the classifier 3a, the control unit 31 first reads in the learning data (x_1, y_1), …, (x_N, y_N) and initializes the weighting coefficients λ and the learning round counter t. Writing λ_t as the vector λ_t = (λ_1, …, λ_T) of the coefficients in f_λ(x, y), initialization sets λ_0 = (0, …, 0) and t = 0. The control unit 31 also sets the distribution of the extended learning data as p_t; at the initial value t = 0, p_0 is the distribution p_0(x, y, y′) given above.
Next, the control unit 31 repeats the following processes (1) to (4) for t = 1, …, T.
(1) Select the weak classifier h_t that minimizes the discrimination error function e(p_t, h). That is, the discrimination error function e(p_t, h) is computed while varying the candidate weak classifier h, and the weak classifier for which it is smallest is selected.
(2) Set λ_t = λ_{t−1} + (0, …, α_t, …, 0). Here α_t, placed at the position of the selected weak classifier h_t, is the parameter adjusting the weighting coefficient of h_t, given by

    α_t = (1/2) ln( p⁺(h_t, p_t) / p⁻(h_t, p_t) )
(3) Update the extended distribution p_t to p_{t+1} by

    p_{t+1}(x, y, y′) = (1/Z_t) p_t(x, y, y′) exp( α_t ( h_t(x, y′) − h_t(x, y) ) )

where Z_t is a normalization constant.
(4) Replace t with t + 1.
Repeating processes (1) to (4) yields the function f_λ(x, y) of the classifier 3a.
Next, the processing executed by the information processing device 3 when applying image processing to a medical image will be described with reference to FIG. 4. In this processing, the classifier 3a determines the class of the medical image's imaging region, and the control unit 31 calculates and displays a certainty factor for that determination result. The certainty factor is the degree of reliability of the classifier 3a's determination result.
As shown in FIG. 4, the image processing unit 36 first analyzes the medical image and calculates its feature values (step S1). The feature values obtained by the image analysis are input to the classifier 3a, which outputs, for each imaging-region class, the degree of possibility that the medical image belongs to that class (referred to as the estimated value). The estimated value is given as a value corresponding to the certainty of the classifier 3a's determination. Specifically, by the function f_λ(x, y) of the classifier 3a, when the feature values of the input medical image are x, an estimated value y indicating the degree of possibility of belonging to class k is output for each class k. The classifier 3a then determines that the imaging region of the class with the largest estimated value y is the imaging region of the medical image (step S2). For example, when discriminating among the three imaging-region classes chest, abdomen, and foot, an estimated value y is obtained for each of them; if the chest's estimated value y is the largest, the imaging region of the medical image is determined to be the chest.
The certainty factor could be calculated directly from the estimated value y output by the classifier 3a through f_λ(x, y), but this estimated value is the sum of the outputs of the linearly combined weak classifiers, and its range varies with the number of weak classifiers. The control unit 31 therefore normalizes the estimated value y (step S3).
When normalizing so that the range of the estimated value y becomes 0 ≤ y ≤ 1, the probability function f′_λ(x|y) of the classifier 3a that outputs the normalized estimated value is

    f′_λ(x|y) = (1/Z(y)) ( λ_1 h_1(x, y) + … + λ_T h_T(x, y) )

where Z(y) is a normalization constant determined so that the normalized estimated values sum to 1 over the classes y.
Alternatively, the exponential of the estimated value y may be normalized, and this normalized value used to calculate the certainty factor. The value obtained by normalizing the exponential of the estimated value y is the estimated value under a conditional probability distribution. The probability function p(y|x) of the classifier 3a under the conditional probability distribution is

    p(y|x) = (1/Z) e^{f_λ(x, y)}

where Z is a normalization constant.
Next, the control unit 31 uses the normalized estimated values y to calculate the certainty factor for the classifier 3a's determination result. To do so, it computes the difference between the largest of the normalized estimated values y of the classes and each of the other estimated values y (step S4). If the number of classes for which this difference is at or below a threshold is at least a predetermined number (step S5; Y), the control unit 31 sets the certainty factor to 0 and outputs it (step S6); otherwise (step S5; N), it sets the certainty factor to 1 and outputs it (step S7).
Here a value of 1 indicates high certainty and a value of 0 low certainty. The class with the largest estimated value y is output as the classifier 3a's determination result, but if the differences between the other classes' estimated values y and this maximum are small, the discrimination is considered to have been difficult. In such a case the determination result is considered unreliable, so a binary certainty factor is output according to how many classes have a small difference from the maximum estimated value y.
For example, suppose the estimated values y (normalized to the range 0.00 to 1.00) shown in Table 1 below are obtained for the three classes chest, abdomen, and foot. The largest of the three is the chest class's estimated value y = 0.80. The differences between this value and the abdomen and foot estimates are 0.60 and 0.50, as Table 1 also shows. If the threshold is 0.20 and the predetermined number of classes at or below the threshold is 2, then, since the abdomen and foot differences both exceed the threshold, the number of classes at or below the threshold is 0. As this is less than the predetermined number 2, the certainty factor is set to 1.

    Table 1
    Class      Estimated value y   Difference from maximum
    Chest      0.80                -
    Abdomen    0.20                0.60
    Foot       0.30                0.50
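The count-based rule of steps S4 to S7 can be sketched as below. The threshold 0.20 and the class count 2 follow the worked example above; the second call uses hypothetical values consistent with the Table 2 case described next, since the original table image gives only the maximum.

```python
def certainty_by_count(estimates, threshold=0.20, min_classes=2):
    """Binary certainty factor from steps S4-S7.

    estimates: normalized per-class estimated values y.
    Returns 0 (low certainty) when at least `min_classes` classes lie
    within `threshold` of the maximum estimate, else 1 (high certainty).
    """
    y_max = max(estimates)
    near = sum(1 for y in estimates if y != y_max and (y_max - y) <= threshold)
    return 0 if near >= min_classes else 1

print(certainty_by_count([0.80, 0.20, 0.30]))  # Table 1 values -> 1 (high)
print(certainty_by_count([0.40, 0.25, 0.35]))  # Table 2-like values -> 0 (low)
```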
Under the same conditions, if the estimated values y obtained are those shown in Table 2 below, the largest is the chest's estimated value y = 0.40. The differences between the abdomen and foot estimates and this maximum are then each at or below the threshold, so the number of classes at or below the threshold is 2. As this reaches the predetermined number 2, the certainty factor is set to 0.

    (Table 2 - maximum: chest y = 0.40; the abdomen and foot estimates each lie within 0.20 of the maximum. Exact values appear in the original table image.)
The certainty factor may be any index that can indicate the degree of reliability of the classifier 3a's determination result, and it may be calculated by other methods. One such method obtains a binary certainty factor according to whether the difference between the maximum estimated value y and the second-largest estimated value y (hereafter the quasi-estimated value y) is at or below a threshold. This method determines the certainty factor by positioning the maximum estimated value y relative to the quasi-estimated value y, that is, by how far the class with the maximum estimate is separated from the quasi-estimate.
Specifically, the control unit 31 computes the absolute value of the difference between the maximum estimated value y and the quasi-estimated value y; if this absolute value is at or below a threshold, it sets the certainty factor to 0 and outputs it, and otherwise sets it to 1 and outputs it. A value of 1 indicates high certainty and a value of 0 low certainty. If the difference from the quasi-estimate is at or below the threshold, discrimination from the quasi-estimate's class was difficult and the result is considered unreliable; conversely, if the difference exceeds the threshold, the image is very likely of the maximum-estimate class and the result is considered reliable.
For example, when the estimated values y shown in Table 3 below are obtained for the three classes chest, abdomen, and foot, the maximum estimated value y is the chest's 0.30 and the quasi-estimated value y is the abdomen's 0.20. The absolute value of their difference is 0.10, so if the threshold is set at 0.15, the difference is at or below the threshold and a certainty factor of 0 is output.

    (Table 3 - chest y = 0.30 (maximum), abdomen y = 0.20 (quasi-estimate); the foot value appears in the original table image.)
The absolute value of the difference between the maximum estimated value y and the quasi-estimated value y may itself be output as the certainty factor. This absolute value indicates how much more likely the class of the maximum estimated value y, the most probable class, is than the class of the quasi-estimated value y, the second most probable, and thus indicates the reliability of the determination result.
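The two quasi-estimate variants just described, binary thresholding on the gap to the second-largest estimate and using the gap itself as a continuous certainty factor, might be sketched as follows. The threshold 0.15 follows the Table 3 example; the foot value 0.10 in the usage lines is an assumption, since the original table image does not survive.

```python
def certainty_by_gap(estimates, threshold=0.15, binary=True):
    """Certainty from the gap between the largest and second-largest estimates.

    With binary=True, returns 0/1 by comparing the gap with `threshold`;
    with binary=False, returns the absolute gap itself as a continuous
    certainty factor.
    """
    ranked = sorted(estimates, reverse=True)
    gap = abs(ranked[0] - ranked[1])
    if not binary:
        return gap
    return 0 if gap <= threshold else 1

print(certainty_by_gap([0.30, 0.20, 0.10]))                # Table 3 -> 0 (low)
print(certainty_by_gap([0.30, 0.20, 0.10], binary=False))  # continuous -> 0.10
```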
Another method of calculating the certainty factor sets it to 0 if the maximum estimated value y is at or below a threshold, and to 1 otherwise. Although the value is the maximum in comparison with the other classes, if the maximum estimated value y itself is so small that it does not reach a certain level, the determination result is considered unreliable.
For example, when the estimated values y shown in Table 4 below are obtained for the three classes chest, abdomen, and foot, the maximum estimated value y is the chest's 0.30. If the threshold is 0.5, the maximum estimated value y is at or below the threshold, so a certainty factor of 0 is output.

    (Table 4 - chest y = 0.30 (maximum); the abdomen and foot values appear in the original table image.)
A further method of calculating the certainty factor sets in advance combinations of classes that are easy to tell apart, and determines the certainty factor by whether the difference between the maximum estimated value y and the estimated value y of the class paired with the maximum-estimate class is at or below a threshold. If the difference is at or below the threshold, the certainty factor is set to 0, indicating low certainty; otherwise it is set to 1, indicating high certainty. This is because, when the difference in estimated values y between classes that should be easy to tell apart is small, the discrimination has failed despite being easy, and the determination result is considered unreliable.
For example, when the estimated values y shown in Table 5 below are obtained for the three classes chest, abdomen, and foot, the maximum estimated value y is the chest's 0.50. Suppose the chest and foot are set as a combination of classes that are easy to tell apart, and the threshold is 0.20. The difference between the chest and foot estimated values y is then 0.03, which is at or below the threshold. Although chest and foot images have very different characteristics and should be easy to tell apart, the difference in estimated values y is at or below the threshold, so the result is considered unreliable and a certainty factor of 0 is output.

    (Table 5 - chest y = 0.50 (maximum), foot y = 0.47; the abdomen value appears in the original table image.)
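A sketch of this easy-pair check follows. The pair {chest, foot} and threshold 0.20 come from the Table 5 example; the abdomen value 0.20 in the usage line is an assumption, as the original table image does not survive.

```python
def certainty_by_easy_pairs(estimates, labels, easy_pairs, threshold=0.20):
    """Certainty from preset 'easy to tell apart' class pairs.

    estimates  : per-class estimated values y
    labels     : class names aligned with `estimates`
    easy_pairs : set of frozensets of class names, set in advance
    Returns 0 (low certainty) when the winning class is paired with a
    class whose estimate lies within `threshold` of the maximum, else 1.
    """
    i_max = max(range(len(estimates)), key=lambda i: estimates[i])
    for j, yj in enumerate(estimates):
        if j != i_max and frozenset((labels[i_max], labels[j])) in easy_pairs:
            if estimates[i_max] - yj <= threshold:
                return 0
    return 1

# Table 5: chest 0.50 vs foot 0.47 differ by only 0.03 although the pair is 'easy'
pairs = {frozenset(("chest", "foot"))}
print(certainty_by_easy_pairs([0.50, 0.20, 0.47],
                              ["chest", "abdomen", "foot"], pairs))  # -> 0
```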
Once the certainty factor has been calculated in this way, the control unit 31 judges from its value whether the certainty is high or low (step S8). As described above, depending on the calculation method, the certainty factor is output either as the binary values 1 and 0 or as a continuous value such as the absolute difference between the maximum and quasi-estimated values. When it is binary, 0 indicates low and 1 high certainty, so the judgment follows directly from which value was output. When it is continuous, high or low can be judged by comparison with a threshold; for example, the control unit 31 judges the certainty low if the absolute difference between the maximum and quasi-estimated values is at or below a threshold, and high otherwise.
Here the case of a binary certainty factor is described as an example. If the certainty factor is 1, the control unit 31 judges the certainty to be high (step S8; Y), and the image processing unit 36 applies to the medical image the image processing specialized for the class output by the classifier 3a as the determination result (step S9). For example, when the classifier 3a has determined that the imaging-region class of the medical image is the chest, the control unit 31 obtains from the storage unit 35 the types and parameters of image processing defined for the chest and outputs them to the image processing unit 36, which executes the defined types of image processing using the defined parameters.
If, on the other hand, the certainty factor is 0, the control unit 31 judges the certainty to be low (step S8; N), and the image processing unit 36 applies general-purpose image processing to the medical image (step S10). Even when the classifier 3a has determined that the imaging-region class is, say, the chest, the control unit 31 obtains the types and parameters of general-purpose image processing from the storage unit 35 and outputs them to the image processing unit 36, which executes the general-purpose types of image processing using the defined parameters.
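Putting steps S8 to S10 together, the selection between region-specific and general-purpose parameters could look like the sketch below. The parameter tables stand in for the contents of the storage unit 35, and their entries are illustrative assumptions.

```python
# Illustrative stand-ins for the parameter tables held in the storage unit 35.
SPECIALIZED = {
    "chest":   {"gradation": "param1", "frequency": "param2"},
    "abdomen": {"gradation": "param1"},
}
GENERIC = {"gradation": "param_generic"}

def select_processing(predicted_class: str, certainty: int) -> dict:
    """Steps S8-S10: class-specific parameters when the certainty is high (1),
    general-purpose parameters when it is low (0)."""
    if certainty == 1:
        return SPECIALIZED.get(predicted_class, GENERIC)
    return GENERIC

print(select_processing("chest", 1))  # chest-specific processing (step S9)
print(select_processing("chest", 0))  # general-purpose processing (step S10)
```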
When the certainty factor for the imaging-region determination result has been calculated and image processing applied in this way, the display control of the control unit 31 displays a viewer screen D1 as shown in FIG. 5 on the display unit 33.
As shown in FIG. 5, the viewer screen D1 contains a patient information field d1, an image display field d2, and an image adjustment field d3. The patient information field d1 displays the patient information of the patient whose medical image is shown. The image display field d2 displays the medical image obtained by the examination imaging. The image adjustment field d3 displays the operation buttons used to adjust the parameters of the image processing applied to the displayed medical image.
The image display field d2 includes information d21 on the imaging region of the displayed medical image; this imaging region is the result determined by the classifier 3a. If the patient information shown in the patient information field d1 corresponds to the medical image shown in the image display field d2, and the imaging region shown there is correct, the doctor operates the OK button d22. If the correspondence or the imaging region is wrong, the doctor operates the NG button d23.
When the OK button d22 is operated, the control unit 31 of the information processing device 3 stores the medical image, the patient information, the UID attached to the image, and so on in the database in the storage unit 35. When the NG button d23 is operated, the display control of the control unit 31 shows a correction screen on which the imaging region can be corrected, and the doctor enters on it the imaging region judged by visual inspection. The image processing unit 36 then performs image processing again with the parameters matching the entered imaging region, and the viewer screen is displayed with the reprocessed medical image.
When the certainty of the imaging-region determination result is judged low, the control unit 31 warns of this on the viewer screen D1 by shading or coloring the text naming the imaging region of the medical image judged to have low certainty, and also by displaying a message d4 such as "Please check the imaging region."
Since the doctor can easily judge the imaging region of a medical image by visual inspection, the message d4 lets the doctor confirm immediately whether the classifier 3a's determination is wrong. For doctors who want the details, a viewer screen D2 as shown in FIG. 6 may additionally be displayed. The viewer screen D2 shows detailed information for each individual medical image. As shown in FIG. 6, this screen displays information d5 indicating, as the classifier 3a's determination result, that the determined imaging region is the chest, that its estimated value is 0.4, and that the certainty factor is 0. Because the certainty is low, a message d6 is also displayed notifying the user that general-purpose image processing, rather than chest-specific image processing, has been applied to the medical image.
As described above, according to this embodiment, the classifier 3a outputs, from the feature values obtained by analyzing a medical image, the probability that the image belongs to each of a plurality of imaging-region classes as an estimated value and determines the imaging-region class of the image from those estimates; the control unit 31 uses the estimated values of the imaging-region classes output by the classifier 3a to calculate a certainty factor for the classifier 3a's determination result; and the display unit 33 displays the calculated certainty factor. The classifier 3a's determination of the imaging region can thus be provided together with information on the certainty of that determination.
Furthermore, the control unit 31 judges whether the calculated certainty factor is high or low; when it is judged high, the image processing unit 36 applies to the medical image the image processing specialized for the class with the maximum estimated value, and when it is judged low, the unit applies general-purpose image processing. General-purpose image processing can thus be selected when misdetermination is likely, preventing inappropriate image processing from being applied.
In addition, since the control unit 31 displays warning information on the display unit 33, such as a message, coloring, or shading warning that the certainty is low, the user can readily recognize that the certainty is low.
The embodiment described above is a preferred example of the present invention and is not limiting.
For example, since the characteristics of a medical image also differ with the imaging direction and with the image generation apparatus that captured it, the suitable types and parameters of image processing differ as well. The present invention may therefore be applied to determining classes defined per imaging direction rather than per imaging region, or classes defined per combination of imaging region and imaging direction. It is also applicable to determining classes defined per image generation apparatus.
The present invention is also applicable not only to medical images but to discrimination of images in general by a classifier.
As the computer-readable medium for the program used in the information processing device 3 described above, a non-volatile memory such as a ROM or flash memory, or a portable recording medium such as a CD-ROM, can be used. A carrier wave may also be used as the medium for providing the program data over a communication line.
The present invention can be used in information processing apparatuses that perform discrimination with a classifier.
Reference Signs List
1 medical image system
2a ultrasonic diagnostic apparatus
2b endoscope apparatus
2c CR apparatus
3 information processing device
31 control unit
3a classifier
32 operation unit
33 display unit
35 storage unit
36 image processing unit
4 reception device

Claims (12)

1.  An information processing device comprising:
a classifier that outputs, as an estimated value, the probability that an image belongs to each of a plurality of classes from feature values obtained by analyzing the image, and determines the class to which the image belongs on the basis of the estimated values; and
control means for calculating a certainty factor for the determination result of the classifier, using the estimated value of each class output by the classifier.
2.  The information processing device according to claim 1, wherein the control means obtains the difference between the maximum estimated value and each of the other estimated values among the estimated values of the classes output by the classifier, and outputs a binary certainty factor according to whether the number of classes whose difference is within a threshold is at least a predetermined number.
3.  The information processing device according to claim 1, wherein the control means outputs, as the certainty factor, the absolute value of the difference between the maximum estimated value and the second-largest estimated value among the estimated values of the classes output by the classifier.
4.  The information processing device according to claim 1, wherein the control means obtains the absolute value of the difference between the maximum estimated value and the second-largest estimated value among the estimated values of the classes output by the classifier, and outputs a binary certainty factor according to whether the absolute value of the difference is smaller than a threshold.
5.  The information processing device according to claim 1, wherein the control means outputs a binary certainty factor according to whether the maximum estimated value among the estimated values of the classes output by the classifier is at or below a threshold.
6.  The information processing device according to any one of claims 1 to 5, wherein the control means normalizes the estimated value of each class output by the classifier, or normalizes the exponential of the estimated value of each class, and calculates the certainty factor using the normalized estimated values.
7.  The information processing device according to any one of claims 1 to 6, wherein the control means judges whether the certainty factor is high or low on the basis of the calculated certainty factor value and, when the certainty factor is judged low, causes display means to display, together with the class determination result, warning information warning that the certainty factor is low.
8.  The information processing device according to any one of claims 1 to 7, wherein
the image is a medical image of a patient photographed by an image generation apparatus,
the control means judges whether the calculated certainty factor is high or low, and
the device comprises image processing means for applying image processing specialized for the class having the maximum estimated value to the medical image when the control means judges the certainty factor to be high, and applying general-purpose image processing to the medical image when the certainty factor is judged to be low.
9.  The information processing device according to claim 8, wherein the classes of the medical image are classes defined for each type of image generation apparatus by which the medical image was captured.
10.  The information processing device according to claim 8, wherein the classes of the medical image are classes defined for each imaging region in which the medical image was captured, for each imaging direction, or for each combination of imaging region and imaging direction.
11.  The information processing device according to any one of claims 1 to 10, wherein the classifier is constructed by the AdaBoostMlt method and discriminates among at least three classes, and the estimated value output by the classifier is given as a value corresponding to the certainty factor.
12.  The information processing device according to any one of claims 1 to 11, wherein the control means displays the calculated certainty factor on display means.
PCT/JP2009/067205 2008-10-30 2009-10-02 Information processing device WO2010050333A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010535737A JPWO2010050333A1 (en) 2008-10-30 2009-10-02 Information processing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008-279285 2008-10-30
JP2008279285 2008-10-30

Publications (1)

Publication Number Publication Date
WO2010050333A1 true WO2010050333A1 (en) 2010-05-06

Family

ID=42128699

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/067205 WO2010050333A1 (en) 2008-10-30 2009-10-02 Information processing device

Country Status (2)

Country Link
JP (1) JPWO2010050333A1 (en)
WO (1) WO2010050333A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012170781A (en) * 2011-02-24 2012-09-10 Toshiba Corp Medical image diagnostic apparatus and medical image display device
JP2013192624A (en) * 2012-03-16 2013-09-30 Hitachi Ltd Medical image diagnosis supporting apparatus, medical image diagnosis supporting method and computer program
WO2016130879A1 (en) * 2015-02-13 2016-08-18 Prairie Ventures, Llc System and method to objectively measure quality assurance in anatomic pathology
JP6251945B1 (en) * 2016-07-13 2017-12-27 メディアマート株式会社 Diagnosis support system, medical diagnosis support apparatus, and diagnosis support method
WO2018012090A1 (en) * 2016-07-13 2018-01-18 メディアマート株式会社 Diagnosis support system, medical diagnosis support device, and diagnosis support system method
WO2019240257A1 (en) * 2018-06-15 2019-12-19 キヤノン株式会社 Medical image processing device, medical image processing method and program
JP2019216848A (en) * 2018-06-15 2019-12-26 キヤノン株式会社 Medical image processing apparatus, medical image processing method, and program
JP2020075104A (en) * 2018-10-08 2020-05-21 ゼネラル・エレクトリック・カンパニイ Ultrasound cardiac doppler study automation
JP2020168233A (en) * 2019-04-04 2020-10-15 株式会社日立製作所 Ultrasonic imaging device, and image processor
WO2021153355A1 (en) * 2020-01-29 2021-08-05 キヤノン株式会社 Medical information processing system, medical information processing device, control method for medical information processing system, and program
US20210398259A1 (en) 2019-03-11 2021-12-23 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
US11222243B2 (en) 2017-09-15 2022-01-11 Fujifilm Corporation Medical image processing device, medical image processing method, and medical image processing program
WO2024018906A1 (en) * 2022-07-20 2024-01-25 ソニーセミコンダクタソリューションズ株式会社 Information processing device, information processing method, and program
US11922601B2 (en) 2018-10-10 2024-03-05 Canon Kabushiki Kaisha Medical image processing apparatus, medical image processing method and computer-readable medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3800892B2 (en) * 1999-10-29 2006-07-26 コニカミノルタホールディングス株式会社 Radiation image processing device
JP2005044330A (en) * 2003-07-24 2005-02-17 Univ Of California San Diego Weak hypothesis generation device and method, learning device and method, detection device and method, expression learning device and method, expression recognition device and method, and robot device
WO2006062013A1 (en) * 2004-12-10 2006-06-15 Konica Minolta Medical & Graphic, Inc. Image processing device, image processing method, and image processing program
JP2006202276A (en) * 2004-12-22 2006-08-03 Fuji Photo Film Co Ltd Image processing method, system, and program
WO2007029467A1 (en) * 2005-09-05 2007-03-15 Konica Minolta Medical & Graphic, Inc. Image processing method and image processing device
JP2008011900A (en) * 2006-07-03 2008-01-24 Fujifilm Corp Image type discrimination device, method and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YUTAKA UEDA ET AL.: "Shinryosho Muke System 'REGIUS Unitea' no Kaihatsu" [Development of "REGIUS Unitea", a System for Clinics], KONICA MINOLTA TECHNOLOGY REPORT, vol. 5, January 2008 (2008-01-01), pages 11-15 *

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012170781A (en) * 2011-02-24 2012-09-10 Toshiba Corp Medical image diagnostic apparatus and medical image display device
JP2013192624A (en) * 2012-03-16 2013-09-30 Hitachi Ltd Medical image diagnosis supporting apparatus, medical image diagnosis supporting method and computer program
US9317918B2 (en) 2012-03-16 2016-04-19 Hitachi, Ltd. Apparatus, method, and computer program product for medical diagnostic imaging assistance
WO2016130879A1 (en) * 2015-02-13 2016-08-18 Prairie Ventures, Llc System and method to objectively measure quality assurance in anatomic pathology
JP6251945B1 (en) * 2016-07-13 2017-12-27 メディアマート株式会社 Diagnosis support system, medical diagnosis support apparatus, and diagnosis support method
WO2018012090A1 (en) * 2016-07-13 2018-01-18 メディアマート株式会社 Diagnosis support system, medical diagnosis support device, and diagnosis support system method
US11222243B2 (en) 2017-09-15 2022-01-11 Fujifilm Corporation Medical image processing device, medical image processing method, and medical image processing program
US11734820B2 (en) 2017-09-15 2023-08-22 Fujifilm Corporation Medical image processing device, medical image processing method, and medical image processing program
KR102507711B1 (en) * 2018-06-15 2023-03-10 캐논 가부시끼가이샤 Medical image processing apparatus, medical image processing method, and computer readable medium
JP2019216848A (en) * 2018-06-15 2019-12-26 キヤノン株式会社 Medical image processing apparatus, medical image processing method, and program
KR20210018461A (en) * 2018-06-15 2021-02-17 캐논 가부시끼가이샤 Medical image processing apparatus, medical image processing method, and computer-readable medium
GB2589250A (en) * 2018-06-15 2021-05-26 Canon Kk Medical image processing device, medical image processing method and program
WO2019240257A1 (en) * 2018-06-15 2019-12-19 キヤノン株式会社 Medical image processing device, medical image processing method and program
GB2589250B (en) * 2018-06-15 2023-03-08 Canon Kk Medical image processing apparatus, medical image processing method and program
JP7114358B2 (en) 2018-06-15 2022-08-08 キヤノン株式会社 MEDICAL IMAGE PROCESSING APPARATUS, MEDICAL IMAGE PROCESSING METHOD AND PROGRAM
JP2020075104A (en) * 2018-10-08 2020-05-21 ゼネラル・エレクトリック・カンパニイ Ultrasound cardiac doppler study automation
JP7123891B2 (en) 2018-10-08 2022-08-23 ゼネラル・エレクトリック・カンパニイ Automation of echocardiographic Doppler examination
US11922601B2 (en) 2018-10-10 2024-03-05 Canon Kabushiki Kaisha Medical image processing apparatus, medical image processing method and computer-readable medium
US20210398259A1 (en) 2019-03-11 2021-12-23 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
US11887288B2 (en) 2019-03-11 2024-01-30 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
JP2020168233A (en) * 2019-04-04 2020-10-15 株式会社日立製作所 Ultrasonic imaging device, and image processor
JP7269778B2 (en) 2019-04-04 2023-05-09 富士フイルムヘルスケア株式会社 Ultrasonic imaging device and image processing device
WO2021153355A1 (en) * 2020-01-29 2021-08-05 キヤノン株式会社 Medical information processing system, medical information processing device, control method for medical information processing system, and program
WO2024018906A1 (en) * 2022-07-20 2024-01-25 ソニーセミコンダクタソリューションズ株式会社 Information processing device, information processing method, and program

Also Published As

Publication number Publication date
JPWO2010050333A1 (en) 2012-03-29

Similar Documents

Publication Publication Date Title
WO2010050333A1 (en) Information processing device
JP5533662B2 (en) Information processing device
US8577110B2 (en) Device, method and computer readable recording medium containing program for separating image components
EP3975197A1 (en) Device and method for processing medical image using predicted metadata
US20090103796A1 (en) Method of discriminating between right and left breast images and breast radiographing system
JP2005198970A (en) Medical image processor
US8036443B2 (en) Image processing method and image processor
JP2006325638A (en) Method of detecting abnormal shadow candidate and medical image processing system
EP3644273A1 (en) System for determining image quality parameters for medical images
JP2004290329A (en) Medical image processor, medical network system and program for medical image processor
JP2003284713A (en) Image processing device for medical use, image processing parameter correcting method, program, and storage medium
US11779288B2 (en) Methods, systems, and apparatus for determining radiation doses
JP2001076141A (en) Image recognizing method and image processor
JP4727476B2 (en) Medical image information processing apparatus and confirmation work support program
JP2010246705A (en) Image processor, image processing method and program
JP2020027507A (en) Medical information processing device, medical information processing method, and program
JP2010086449A (en) Information processing apparatus
US20240062367A1 (en) Detecting abnormalities in an x-ray image
JP2006263055A (en) X-ray image processing system and x-ray image processing method
JP2006034521A (en) Medical image processor and medical image processing method
JP6326812B2 (en) Image processing apparatus and irradiation field recognition method
JP2004321457A (en) Method and device for determining condition for image processing
JP7370694B2 (en) Medical information processing device, medical information processing method, and program
JP2001224576A (en) Image processing method and image processor
JP2007260062A (en) Processor of medical image information and confirmation work support program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09823446

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2010535737

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09823446

Country of ref document: EP

Kind code of ref document: A1