WO2015029135A1 - Morbidity rate evaluation device, morbidity rate evaluation method, and morbidity rate evaluation program - Google Patents

Morbidity rate evaluation device, morbidity rate evaluation method, and morbidity rate evaluation program

Info

Publication number
WO2015029135A1
Authority
WO
WIPO (PCT)
Prior art keywords
morbidity
difference
image
diagnosed
images
Prior art date
Application number
PCT/JP2013/072856
Other languages
English (en)
Japanese (ja)
Inventor
武 中山
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立製作所 filed Critical 株式会社日立製作所
Priority to JP2015533822A priority Critical patent/JP6208243B2/ja
Priority to PCT/JP2013/072856 priority patent/WO2015029135A1/fr
Priority to TW103125674A priority patent/TWI524198B/zh
Publication of WO2015029135A1 publication Critical patent/WO2015029135A1/fr

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0012 - Biomedical image inspection
    • G06T7/0014 - Biomedical image inspection using an image reference approach
    • G06T7/0016 - Biomedical image inspection using an image reference approach involving temporal comparison
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033 - Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/004 - Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room adapted for image acquisition of a particular organ or body part
    • A61B5/0042 - Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room adapted for image acquisition of a particular organ or body part for the brain
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 - Details of waveform analysis
    • A61B5/7264 - Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H30/20 - ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2576/00 - Medical imaging apparatus involving image processing or analysis
    • A61B2576/02 - Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
    • A61B2576/026 - Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part for the brain
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/05 - Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/055 - Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30004 - Biomedical image processing
    • G06T2207/30016 - Brain
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H15/00 - ICT specially adapted for medical reports, e.g. generation or transmission thereof

Definitions

  • the present invention relates to a morbidity rate evaluation apparatus, a morbidity rate evaluation method, and a morbidity rate evaluation program.
  • the image display device of Patent Literature 1 acquires a feature amount representing an abnormal shadow from an abnormal-shadow candidate region in a diagnostic image, searches medical images stored in advance for a similar feature amount, and displays the diagnostic image and the corresponding medical image so that they can be compared.
  • the diagnosis support apparatus of Patent Literature 2 accumulates already-diagnosed reference images in a database. It extracts the image feature amount of the lesion position in a diagnostic image, compares the extracted feature amount with each feature amount of the reference images, and calculates similarities. It then selects a plurality of reference images with high similarity and displays, for each disease name, a disease name probability that is the average of the similarities.
  • the diagnosis support apparatus of Patent Literature 3 stores examination history information including information on a lesion of a subject. According to the examination history information, it changes the size of the lesion to be processed when processing data obtained from the subject. It also calculates the occurrence probability of a lesion according to the examination history information, and changes the processing method for the lesion according to that probability.
  • the image collection device of Patent Literature 4 collects images and detects the type and degree of progress of an abnormality. When the degree of progress is within a predetermined range, it outputs that an additional examination is necessary and determines the examination conditions for that additional examination.
  • the hospital visitor selection device of Patent Literature 5 acquires virtual patient data of a patient candidate based on disease information including the type of disease, date of occurrence, and place of occurrence for that candidate, and determines whether or not the candidate needs to visit the hospital.
  • the study of Non-Patent Literature 1 compares MRI (Magnetic Resonance Imaging) images of the brains of patients with mild cognitive impairment in time series, and reports that cerebral atrophy was actually observed in all patients studied.
  • Patent Literature 1: JP 2006-34585 A (Claim 1, paragraph 0031); Patent Literature 2: Japanese Patent No. 4021179 (Claim 7, paragraph 0026); Patent Literature 3: Japanese Patent No. 5159242 (Claims 1, 3, etc.); Patent Literature 4: Japanese Patent No. 3495327 (Claim 1, etc.); Patent Literature 5: JP 2010-113477 A (Claim 1, etc.)
  • the devices of Patent Literatures 1, 2, and 3 calculate a “feature amount” for each image, and further calculate a “similarity” from two such “feature amounts” (see, for example, paragraph 0031 of Patent Literature 1 and paragraph 0026 of Patent Literature 2).
  • the “feature amount” is a multidimensional vector having a plurality of physical quantities acquired from the medical image as components.
  • as the plurality of components, many components are assumed, including volume, lightness, circularity, and the like.
  • group health examinations, medical checkups, short-term health examinations repeated at intervals of less than one year, and the like are performed frequently and for large numbers of people. In such settings there are many technical and cost constraints, and it is difficult to acquire a high-quality image carrying a large amount of information.
  • the image collection device of Patent Literature 4 detects the degree of progress of an abnormality, and this detection presumes that a precise diagnosis establishing the abnormality has been made in the past. The device of Patent Literature 4 is therefore also unsuitable when high-quality images cannot be expected.
  • the visitor selection device of Patent Literature 5 targets patients who already have some abnormality and an actual visit history; it makes no diagnosis for a patient who is to be newly diagnosed.
  • the research of Non-Patent Literature 1 uses images of patients who were diagnosed with mild cognitive impairment in the past. None of these approaches is intended for the healthy individuals who make up a large share of the subjects of group health examinations, medical checkups, and short-term health examinations repeated at intervals of less than one year. The present invention therefore aims to determine the presence or absence of an abnormality, and to calculate the likelihood of morbidity for each disease name, based on images that are not necessarily of high quality, as when medical images of a large number of people must be captured in a short period, such as in a group medical examination.
  • the morbidity evaluation apparatus includes a storage unit that stores diagnostic information in which the difference between two images captured at different time points and a disease name are stored in association with a body part, and a control unit. The control unit accepts the user's input of a feature amount category and of the part to be diagnosed; calculates, for the accepted category, the difference between two images of the diagnosis subject's part captured at different time points; searches the diagnostic information using the part to be diagnosed and the calculated difference as search keys; acquires the disease names of the matching records; calculates the morbidity as the ratio of the number of matching records to the number of records having that part; and displays the calculated morbidity for each acquired disease name. Other means are described in the embodiments for carrying out the invention.
  • according to the present invention, the presence or absence of an abnormality can be determined based on an image that is not necessarily of high quality, and the possibility of morbidity can be calculated for each disease name.
  • a current medical image 102 is obtained for a certain part of the patient to be diagnosed (in the example of FIG. 1, the brain).
  • past medical images 101 for the relevant part of other patients are accumulated.
  • the medical image 101 is obtained by imaging the relevant part of many patients, and has a medical image ID 103 that uniquely identifies the medical image.
  • correspondence information 104 in which the disease name diagnosed by the doctor is stored in association with the medical image ID.
  • the similarity S is a scalar quantity calculated by the following Equation 1, with the feature quantity vector Q and any one feature quantity vector P as inputs:
  • S = W^t (E − |Q − P|) / (W^t E) (Equation 1)
  • E is a vector whose n components are all “1”.
  • W is a vector having the weight of each feature quantity as its n components.
  • |Q − P| is the vector whose components are the absolute values of the component-wise differences between Q and P.
  • the denominator W^t E is the sum of the components of the vector W (a scalar quantity).
  • W^t is the “transpose” of the vector W; multiplying W^t from the right by the vector E − |Q − P| yields a scalar quantity.
  • the components of the vectors P and Q are normalized to the range 0 to 1.
  • the apparatus completes the correspondence information 104 by associating such similarity S with past medical images on a one-to-one basis.
  • the apparatus specifies the disease name corresponding to the past medical image having the highest similarity (close to “1”) as the disease name of the patient who is the current diagnosis target.
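To make the reconstructed Equation 1 concrete, here is a minimal Python sketch; it is not code from the patent, and the component-wise absolute difference and the example weights are assumptions drawn from the surrounding definitions.

```python
import numpy as np

def similarity(q, p, w):
    """Equation 1 as reconstructed above: S = W^t (E - |Q - P|) / (W^t E).

    q, p: feature quantity vectors with components normalized to 0..1;
    w: per-feature weights. S is 1 when q and p are identical and falls
    toward 0 as they diverge.
    """
    q, p, w = (np.asarray(v, dtype=float) for v in (q, p, w))
    e = np.ones_like(q)                        # E: the all-ones vector
    return float(w @ (e - np.abs(q - p)) / (w @ e))

# Identical vectors give similarity 1.0 (illustrative values):
# similarity([0.4, 0.7], [0.4, 0.7], [2.0, 1.0])  ->  1.0
```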
  • the following problems occur:
  • In the first place, high-quality medical images from which many kinds of feature values can be acquired generally cannot be expected.
  • When medical images of insufficient quality are compared, the calculation is formally possible, but the result is meaningless. For example, because no difference can be detected, similar degrees of similarity are calculated for all past medical images.
  • The processing takes a long time.
  • The feature amounts themselves are compared, with no notion of change along the time axis. The feature amount of a healthy patient's medical image may therefore coincidentally resemble that of an affected patient's image, and the healthy patient may be misdiagnosed as suffering from some disease.
  • an outline of the information processing of the present embodiment will be described with reference to FIGS. 2 to 5. Details will be described later along separate flowcharts.
  • FIG. 2 will be described.
  • the past medical image 111 is acquired for a certain part of the patient to be diagnosed (also the brain in the example of FIG. 2).
  • the current medical image 112 is also acquired for the part of the patient.
  • the morbidity rate evaluation apparatus 1 (FIG. 6) of the present embodiment can use the past diagnosis examples 114.
  • Various diagnosis examples 114 can be assumed.
  • a plurality of two-dimensional graphs showing the relationship between the rate of change and the number of diagnoses are prepared for each disease.
  • first, the morbidity rate evaluation apparatus 1 determines a certain feature amount (category). Assume that the determined feature amount is, for example, “area”. Secondly, the morbidity evaluation apparatus 1 extracts the difference between the feature amounts.
  • the difference in “area” is simply the difference between the cross-sectional area of the portion expressed as the brain in the medical image 111 and the cross-sectional area of the portion expressed as the brain in the medical image 112.
  • the difference can also be expressed as a numerical value (unit: cm²) by converting the number of pixels into an area. It can further be expressed as a medical image 113: the portion with a specific pixel value that exists in the medical image 111 but not in the medical image 112 (the contracted portion).
  • the morbidity evaluation apparatus 1 calculates the change rate R (percentage) of the difference by the following formula 2.
  • R = (q1 − p1) / p1 × 100 (Formula 2)
  • p1 is the area of the brain in the medical image 111.
  • q1 is the area of the brain in the medical image 112.
  • the morbidity evaluation apparatus 1 applies the difference to all the graphs, and acquires the number of diagnoses for each disease.
  • the morbidity evaluation device 1 displays the disease name and the morbidity in association with each other in descending order of the number of diagnoses.
  • the morbidity is, for example, a numerical value (percentage) obtained by dividing the acquired number of diagnoses by the total number of diagnoses (through all diseases) for the site.
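As a minimal sketch of this FIG. 2 flow, assume the prepared per-disease graphs can be read out as a simple mapping from the computed change rate to diagnosis counts; the dictionary argument below is an illustrative stand-in for that lookup, not the patent's data structure.

```python
def fig2_morbidity(p1, q1, diagnoses_at_rate):
    """Formula 2 plus the morbidity ratio described above.

    p1, q1: brain areas in the past and current images;
    diagnoses_at_rate: disease name -> number of past diagnoses observed
    at the computed change rate (stand-in for reading the graphs).
    """
    r = (q1 - p1) / p1 * 100.0                 # Formula 2: change rate (%)
    total = sum(diagnoses_at_rate.values())    # all diagnoses for the site
    morbidity = {d: round(100 * n / total) for d, n in diagnoses_at_rate.items()}
    return r, morbidity

# Example: area shrank from 500 to 495 (r = -1%), with hypothetical counts:
# fig2_morbidity(500, 495, {"mild cognitive impairment": 4, "dementia": 1,
#                           "(not applicable)": 2})
# -> (-1.0, {"mild cognitive impairment": 57, "dementia": 14,
#            "(not applicable)": 29})
```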
  • in FIG. 3, the only difference from the example of FIG. 2 is that the feature quantity is “lightness”.
  • the difference in “lightness” is simply the difference between the lightness of the portion represented as the brain in the medical image 111 (for example, the average of the 0-255 gray-scale values of its pixels) and the lightness of the portion represented as the brain in the medical image 112.
  • the difference can be expressed as a unitless numerical value. It may also be expressed as a medical image 113, using a color or the like determined in advance according to the magnitude of the difference.
  • there are various brain diseases: some develop as a change in “area”, some as a change in “lightness”, and some as changes in other feature quantities.
  • the user can therefore search for disease names using the change rate of a certain feature amount as a key. It is also possible to search for disease names using a combination of change rates of a plurality of feature amounts as a key (for example, “area” decreases and “lightness” increases), as in the sketch below.
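A sketch of such a combined-key search, assuming the diagnosis examples are held as simple records; the field names ("rates", "disease") are illustrative, not the patent's data layout.

```python
def search_by_keys(records, conditions):
    """Collect disease names whose change rates satisfy every condition.

    records: e.g. {"rates": {"area": -1.0, "lightness": +4.0},
                   "disease": "mild cognitive impairment"}
    conditions: feature name -> predicate over its change rate.
    """
    hits = set()
    for rec in records:
        if all(feature in rec["rates"] and predicate(rec["rates"][feature])
               for feature, predicate in conditions.items()):
            hits.add(rec["disease"])
    return hits

# Example: diseases in which "area" decreases while "lightness" increases
# search_by_keys(records, {"area": lambda r: r < 0,
#                          "lightness": lambda r: r > 0})
```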
  • FIG. 4 is similar to FIG. 2 in that “area” is the feature amount.
  • in FIG. 2, however, the change rate of the area is a single numerical value, whereas FIG. 4 differs greatly in that the change rate is a plurality of numerical values showing the transition of the change along a time series.
  • a past medical image 121 of a part (a brain) of a patient to be diagnosed is acquired.
  • a plurality of current and near past medical images 122 of the relevant part of the patient are also acquired.
  • first, the morbidity evaluation apparatus 1 extracts the difference between each of the medical images 122 (the current image “now”, the previous image “now−1”, and the one before that, “now−2”) and the oldest medical image 121, which serves as the reference (“std”) (reference numeral 123), and calculates the change rate 124 of each difference with respect to the reference; as many change rates are calculated as there are medical images 122. Secondly, the morbidity rate evaluation apparatus 1 creates a time-series graph 125 of the change rates. Thirdly, the morbidity rate evaluation apparatus 1 matches the time series graph 125 of the change rates against a plurality of “types” (reference numeral 126). A “type” is a pattern of change-rate time-series graphs created based on past diagnosis examples (details will be described later).
  • the morbidity rate evaluation apparatus 1 specifies the “type” that most closely approximates the created time series graph 125.
  • the morbidity evaluation apparatus 1 displays the name of each disease having the specified type together with its morbidity, a percentage obtained by dividing the number of diagnoses of the disease by the total number of diagnoses of that type. The morbidity evaluation apparatus 1 is assumed to have created, based on past diagnosis examples, correspondence information 127 in which one or more diseases and their numbers of diagnoses are stored in association with each type.
  • FIG. 5 will be described. Compared with the example of FIG. 4, the only difference is that the feature quantity is “lightness” in FIG. 5.
  • there are various brain diseases: some develop as time-series changes in “area”, some as time-series changes in “lightness”, and some as time-series changes in other feature values.
  • the user can therefore search for disease names using the time-series transition of changes in a certain feature amount as a key. It is also possible to search for disease names using a combination of time-series transitions of changes in a plurality of feature quantities as a key (for example, “area” repeatedly shrinks and expands while “lightness” simply continues to increase).
  • the morbidity evaluation apparatus 1 is a general computer.
  • the morbidity evaluation device 1 includes a central control device 11, an input device 12 such as a keyboard, a mouse, and a touch screen, an output device 13 such as a display, a main storage device 14, an auxiliary storage device 15, and a communication device 16. These are connected to each other by a system bus.
  • the relative position correction unit 21, the difference image calculation unit 22, the change rate calculation unit 23, the abnormality presence/absence determination unit 24, the morbidity rate calculation unit 25, and the input/output control unit 26 in the main storage device 14 are programs. Hereinafter, when an “XX unit” is described as the subject of a sentence, it means that the central control device 11 reads the corresponding program from the auxiliary storage device 15, loads it into the main storage device 14, and thereby realizes the function of that program (detailed later).
  • the auxiliary storage device 15 stores image management information 31, abnormality determination information 32, doctor diagnosis information 33, and a medical image 34. These details will be described later.
  • the input / output control unit 26 corresponds to “input unit” and “output unit”.
  • the morbidity evaluation device 1 is connected to the terminal device 2 through the network 4 so as to be communicable.
  • the terminal device 2 is also a general computer, comprising a central control device, an input device such as a keyboard, mouse, or touch screen, an output device such as a display, a main storage device, an auxiliary storage device, and a communication device, connected to one another (not shown).
  • the imaging device 3 is a device that captures medical images of the body. The medical images captured by the imaging device 3 can take various forms, such as CT (Computed Tomography) images, MRI images, PET (Positron Emission Tomography) images, X-ray images, ultrasonic images, and endoscopic images.
  • the imaging device 3 can output the captured image as a digital image (consisting of pixels and their pixel values) that can be processed by a computer including the morbidity evaluation device 1 and the terminal device 2.
  • the morbidity rate evaluation device 1, the terminal device 2, and the imaging device 3 are often arranged in a clinic or the like.
  • a doctor or the like acquires a medical image of the body of a patient (including a healthy person) by using the imaging device 3 at an opportunity such as a health checkup.
  • the doctor or the like inputs the medical image to the terminal device 2 and displays the medical image on an output device (display), for example.
  • the medical image is transmitted to the morbidity evaluation apparatus 1 via the network 4.
  • the morbidity evaluation apparatus 1 creates a “diagnosis result” using the received medical image and sends it back to the terminal apparatus 2.
  • in FIG. 6, it is assumed that a plurality of terminal devices 2 access one morbidity rate evaluation device 1.
  • the morbidity rate evaluation device 1 and the terminal device 2 may be configured in a single casing.
  • the morbidity evaluation apparatus 1 may also be divided into a plurality of housings, one for each body part (brain, lung, ...) or for each medical department (neurology, cardiology, ...).
  • the user who uses the terminal device 2 or the like may be a person other than a doctor, for example, an employee of a life insurance company, a patient himself including a healthy person, or the like.
  • doctors' diagnosis examples are not necessarily kept unpublished.
  • a person who manages his or her own health condition can also obtain diagnosis examples and operate the morbidity evaluation apparatus 1 of the present embodiment.
  • a health-related business operator can also obtain diagnosis examples from doctors, receive medical images from general customers, and operate the morbidity evaluation apparatus 1 to provide a diagnostic service to those customers.
  • the image management information 31 will be described with reference to FIG. 7. In the image management information 31, in association with the patient ID stored in the patient ID column 201, the patient name column 202 stores the patient name, the image ID column 203 stores the image ID, the imaging device ID column 204 stores the imaging device ID, the imaging time column 205 stores the imaging time point, the site name column 206 stores the site name, and the image file name column 207 stores the image file name.
  • the patient ID in the patient ID column 201 is an identifier that uniquely identifies a patient.
  • “patient” means a general subject who can acquire an image of the body. In other words, “patient” is a concept including “healthy person” who is not affected.
  • the patient name in the patient name column 202 is the name of the patient.
  • the image ID in the image ID column 203 is an identifier that uniquely identifies a medical image.
  • the imaging device ID in the imaging device ID column 204 is an identifier that uniquely identifies the imaging device 3.
  • the imaging time point in the imaging time point column 205 is the date when the medical image is captured.
  • the part name in the part name column 206 is the name of a part (site) of the patient's body.
  • the site name may be, for example, a nerve center such as the brain and medulla, an organ such as the heart and lungs, and a skeleton such as the pelvis and femur.
  • the image file name in the image file name field 207 is a name of a medical image as digital image information.
  • the abnormality determination information 32 will be described with reference to FIG. 8. In the abnormality determination information 32, in association with the part name stored in the part name column 211, the feature amount column 212 stores the feature amount, the priority column 213 stores the priority, the normal change rate range column 214 stores the normal change rate range, the normal change amount range column 215 stores the normal change amount range, the unit column 216 stores the unit, and the change period column 217 stores the change period.
  • the part name in the part name column 211 is the same as the site name in FIG. 7.
  • the feature amount in the feature amount column 212 is a category of the physical amount of the part that can be acquired from the medical image.
  • the feature amount mainly expresses the shape, properties, etc. of the part, and specific examples thereof will be described later.
  • the morbidity evaluation apparatus 1 acquires a difference between feature quantities at two different time points, and estimates the morbidity for each disease name (candidate) from past diagnosis examples based on the difference (details will be described later).
  • the priority in the priority column 213 is a priority order among the feature amounts when there are a plurality of feature amounts corresponding to one part name. The lower the number, the higher the priority.
  • the normal change rate range in the normal change rate range column 214 is a combination of the upper limit and the lower limit of the change rate at which a doctor or the like diagnoses as normal.
  • the upper and lower limits do not always need to be equidistant around “0”.
  • the upper limit may be “+ 3%” and the lower limit may be “ ⁇ 2%”.
  • the change rate is a rate (percentage) at which the feature amount changes.
  • the rate of change is calculated by Equation 3 below.
  • Rate of change = (feature amount after change − feature amount before change) / feature amount before change × 100 (Formula 3)
  • the normal change amount range in the normal change amount range column 215 is a combination of the upper limit and the lower limit of the change amount that a doctor or the like diagnoses as normal.
  • the change amount is an absolute amount at which the feature amount changes.
  • the amount of change is calculated by Equation 4 below.
  • Change amount = feature amount after change − feature amount before change (Formula 4)
  • the unit in the unit column 216 is a feature amount unit. It can be unitless.
  • the feature quantity expressed by an 8-bit binary number (0 to 255 in decimal number) such as “brightness” is “no unit”.
  • the change period in the change period column 217 indicates the period over which the change rate and change amount of the feature amount are normalized. For example, a change rate of “+1%” with a change period of “1 month” equally indicates that the feature amount changes by “+12%” in 12 months, by “+6%” in 6 months, by “+0.5%” in half a month, and so on.
  • similarly, a change amount of “+1” with a change period of “1 month” equally indicates that the feature amount changes by “+12” in 12 months, by “+6” in 6 months, by “+0.5” in half a month, and so on.
  • “...” Is an abbreviated expression of the numerical value stored in each column, and does not mean that there is no numerical value in that column.
  • “Area”: the morbidity evaluation apparatus 1 counts the pixels of a medical image that lie within a contour-enclosed region and have pixel values in a specific range, and converts the count into an area by a predetermined rule, thereby calculating the area of the brain (more precisely, its cross-sectional area).
  • “Lightness”: the morbidity evaluation apparatus 1 calculates the lightness of the brain by, for example, averaging the pixel values (for example, a gray scale of 0 to 255) of the pixels within the contour-enclosed region of the medical image.
  • “Lightness dispersion” can be obtained by subdividing the region and calculating the variance of the average pixel values of the subdivided regions.
  • “Circularity”: the morbidity evaluation apparatus 1 calculates the circularity of the brain (precisely, of its cross section) by, for example, computing the aspect ratio of the contour-enclosed region of the medical image, or by fitting the contour shape to a template.
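As a sketch of these brain feature amounts, assuming the contour-enclosed region is already available as a boolean mask and that the pixel-to-area conversion coefficient is known; the 2x2 subdivision used for the dispersion is an illustrative choice.

```python
import numpy as np

def brain_features(img, mask, cm2_per_pixel):
    """img: grayscale medical image (values 0-255); mask: boolean array
    marking the contour-enclosed brain region; cm2_per_pixel: assumed
    conversion coefficient from pixel count to area.
    """
    area = float(mask.sum()) * cm2_per_pixel    # "area": pixel count -> cm^2
    lightness = float(img[mask].mean())         # "lightness": mean gray value
    # "lightness dispersion": variance of the per-subregion mean pixel
    # values, here over a 2x2 subdivision chosen as an example.
    h, w = img.shape
    sub_means = []
    for r in (0, h // 2):
        for c in (0, w // 2):
            m = mask[r:r + h // 2, c:c + w // 2]
            if m.any():
                sub_means.append(img[r:r + h // 2, c:c + w // 2][m].mean())
    dispersion = float(np.var(sub_means))
    return area, lightness, dispersion
```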
  • an endoscopic image (color image) of “stomach” is assumed.
  • “Color”: the morbidity evaluation apparatus 1 can calculate the color (hue) of the stomach by, for example, averaging the pixel values (for example, 0 to 255 for each of R, G, and B) of the pixels in the imaged region.
  • “Texture”: the morbidity evaluation apparatus 1 can calculate texture statistics of the stomach inner wall (unevenness, graininess, etc.) by applying, for example, spatial-frequency analysis or a Fourier transform to the pixels in the imaged region.
  • an X-ray image of “lung” is assumed.
  • “Number of nodules”: the morbidity evaluation apparatus 1 can calculate the number of nodules (lesions such as tumors) present in the lungs by, for example, counting the shadows that have a predetermined lightness and circularity between the straight contours (the ribs).
  • the morbidity evaluation apparatus 1 can also acquire other feature amounts of a specific part, such as “blood flow velocity”, “temperature”, and “saturation”, using existing technology. As shown in FIG. 8, a plurality of feature amounts correspond to one part. Of course, there are individual circumstances, such as a particular feature amount being easy to acquire for one part but difficult (or impossible in the first place) to acquire for another; in FIG. 8 such circumstances are omitted for simplicity of explanation.
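A sketch of the nodule-counting idea, assuming scipy is available; the lightness band, minimum size, and circularity threshold are illustrative assumptions, and the fill ratio of the enclosing circle stands in for the unspecified circularity measure.

```python
import numpy as np
from scipy import ndimage

def count_nodules(img, lo=80, hi=160, min_pixels=20, min_circularity=0.6):
    """Count roughly circular shadows whose lightness falls in [lo, hi]."""
    shadows = (img >= lo) & (img <= hi)          # candidate shadow pixels
    labels, n = ndimage.label(shadows)           # connected components
    count = 0
    for i in range(1, n + 1):
        component = labels == i
        size = int(component.sum())
        if size < min_pixels:
            continue                             # too small to be a nodule
        rows, cols = np.nonzero(component)
        extent = max(np.ptp(rows), np.ptp(cols)) + 1   # bounding box side
        circularity = size / (np.pi * (extent / 2) ** 2)
        if circularity >= min_circularity:       # close to a filled circle
            count += 1
    return count
```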
  • the doctor diagnosis information 33 will be described with reference to FIG. 9. In the doctor diagnosis information 33, in association with the patient ID stored in the patient ID column 221, the doctor ID column 222 stores the doctor ID, the diagnostic image ID column 223 stores the diagnostic image ID, the region name column 224 stores the region name, the change rate column 225 stores the change rate, the change amount column 226 stores the change amount, the disease name column 227 stores the disease name, and the diagnosis time column 228 stores the diagnosis time point.
  • the patient ID in the patient ID column 221 is the same as the patient ID in FIG. 7.
  • the doctor ID in the doctor ID column 222 is an identifier that uniquely identifies the doctor who diagnosed the patient.
  • the diagnostic image ID in the diagnostic image ID column 223 is the same as the image ID in FIG. 7. Here, however, it is a pair: the image ID of the medical image serving as the comparison reference (the medical image having the feature quantity “p” in FIGS. 2 to 5) and the image ID of the comparison target medical image (the medical image having the feature quantity “q” in FIGS. 2 to 5).
  • the site name in the site name column 224 is the same as the site name in FIG. 7. However, the site name here may be based on the patient's subjective symptoms; whether or not the site actually has a disease in the doctor's view does not matter.
  • the change rate in the change rate column 225 is a vector having the change rate of each feature amount as a component.
  • the feature amounts for the part name “brain” are “area”, “lightness”, “temperature”, “color”, “circularity”, and “texture” in descending order of priority.
  • the disease name in the disease name column 227 is the name of the disease diagnosed by the doctor based on his / her opinion after referring to the medical image, change rate, and change amount specified by the diagnostic image ID. When there is no disease at the site as seen from the doctor's eyes, “(not applicable)” is stored.
  • the diagnosis time point in the diagnosis time point column 228 is the date when the doctor made a diagnosis.
  • the patient “Jiro Koizumi” whose patient ID is “P002” is aware that there is an abnormality in the brain.
  • “Jiro Koizumi” has been treated by the doctor “D002” four times.
  • the four dates are “20120415”, “20120515”, “20120615”, and “20120715”.
  • the doctor “D002” compares the medical images “I101” and “I102” at the first diagnosis time “20120415”.
  • the medical image “I101” is acquired at a time point “20120315” before the first diagnosis time point.
  • Jiro Koizumi has brought the medical image “I101” to the doctor “D002” at the time of the first diagnosis.
  • the medical image “I102” is acquired by the doctor “D002” himself at the first diagnosis time using the imaging device “M004”.
  • thereafter, the doctor “D002” acquires a medical image with the imaging device “M004” at each of the three subsequent visits, and compares the acquired medical images “I103”, “I104”, and “I105” with the medical image “I101”.
  • regarding the “area” of the brain, a change rate of “−1%” and a change amount of “−5” continue. These values have been converted into the change rate (amount) for a “change period of 1 month”. In short, the area of the brain keeps contracting by “−1%” per month in terms of change rate, and by “−5 cm²” per month in terms of change amount.
  • regarding the “lightness” of the brain, the change rate varies as “+4%, −4%, +6%, −2%”, and the change amount as “+8, −8, +12, −4”. In short, the lightness tends to increase while repeating large upward and downward swings. The doctor “D002” has consistently diagnosed “mild cognitive impairment” since the first diagnosis.
  • the records in the third to sixth rows of FIG. 9 are the records for the patient “P002”. These records correspond to the time series graphs 231 and 232 of FIG. 10. Attention is paid to the first (leftmost) component of each vector stored in the change rate column 225 of these records. That component indicates the change rate of the area of the brain and, in order from the top record, that is, in time series, is “−1%”, “−1%”, “−1%”, and “−1%”.
  • a graph showing this time-series transition is the time series graph 231 of FIG. 10.
  • similarly, attention is paid to the second (from the left) component of each vector in the change rate column 225 of the same records. That component indicates the change rate of the lightness of the brain and, in time series, is “+4%”, “−4%”, “+6%”, and “−2%”.
  • a graph showing this time-series transition is the time series graph 232 of FIG. 10.
  • the records in the seventh to tenth rows of FIG. 9 are the records for the patient “P003”. These records correspond to the time series graphs 233 and 234 of FIG. 10. Attention is paid to the first (leftmost) component of each vector stored in the change rate column 225 of these records. This component indicates the change rate of the number of nodules in the lung and, in time series, is “+3%”, “+3%”, “+4%”, and “+4%”. A graph showing this time-series transition is the time series graph 233 of FIG. 10. Similarly, attention is focused on the second (from the left) component of each vector in the change rate column 225 of the same records.
  • this component indicates the change rate of the lightness dispersion of the lung and, in time series, is “+2%”, “+2%”, “+3%”, and “+3%”.
  • a graph showing this time-series transition is the time series graph 234 of FIG. 10.
  • in the type list 35 of FIG. 10, in association with the type stored in the type column 241, the sign transition column 242 stores the sign transition, the increase/decrease trend column 243 stores the increase/decrease trend, and the time series graph example column 244 stores an example of a time series graph.
  • the type in the type column 241 is an identifier that uniquely identifies the type.
  • the type is a combination of “sign transition” and “increase / decrease tendency”.
  • the sign transition in the sign transition column 242 indicates how the sign of the change rate changes.
  • the increase / decrease trend in the increase / decrease trend column 243 indicates how the level of the change rate tends to change.
  • the example of the time series graph in the time series graph example column 244 is a code of the time series graph corresponding to the type. It can be seen that the time series graph 231 belongs to the type “a”, the time series graph 232 belongs to the type “b”, and the time series graphs 233 and 234 both belong to the type “c”.
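A sketch of how a change-rate series might be mapped to such a type. The patent does not specify the classification algorithm, so the sign-transition test and the net-change tendency below are assumptions, and the label strings are illustrative stand-ins for the identifiers "a", "b", "c".

```python
def classify_type(rates):
    """rates: change rates in time order, e.g. [-1, -1, -1, -1]."""
    signs = ["+" if r >= 0 else "-" for r in rates]
    if len(set(signs)) == 1:                       # sign never changes
        transition = f"constant({signs[0]})"
    elif all(a != b for a, b in zip(signs, signs[1:])):
        transition = "alternating"                 # sign flips at every step
    else:
        transition = "mixed"
    net = sum(rates)                               # net change -> tendency
    trend = "increasing" if net > 0 else "decreasing" if net < 0 else "flat"
    return f"{transition}/{trend}"

# Matching the time series graphs of FIG. 10:
# classify_type([-1, -1, -1, -1])  -> "constant(-)/decreasing"  (like type "a")
# classify_type([+4, -4, +6, -2])  -> "alternating/increasing"  (like type "b")
# classify_type([+3, +3, +4, +4])  -> "constant(+)/increasing"  (like type "c")
```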
  • the present embodiment has two processing procedures, a first and a second, and the morbidity rate evaluation apparatus 1 can calculate the morbidity rate by executing either of them.
  • the first processing procedure will be described with reference to FIG. It is assumed that the image management information 31, the abnormality determination information 32, and the doctor diagnosis information 33 are stored in the auxiliary storage device 15 in a completed state at the time of starting the first processing procedure. Furthermore, it is assumed that all the medical images 34 whose image IDs are stored in the image management information 31 are stored in the auxiliary storage device 15. For the sake of simplification of description, it is assumed that the morbidity rate evaluation device 1 and the terminal device 2 are combined into one housing. That is, it is assumed that a user such as a doctor is operating the input device 12 of the morbidity evaluation apparatus 1 or the like.
  • in step S301, the input/output control unit 26 acquires the medical image to be compared. Specifically, the input/output control unit 26 accepts the input of a patient ID, a part name, and a medical image by a user such as a doctor via the input device 12.
  • the medical image input here is the current medical image (reference numeral 112 in FIG. 2), and may be referred to as a “comparison medical image” hereinafter.
  • the patient ID and the part name received here may be hereinafter referred to as “target patient ID” and “target part name”, respectively.
  • in step S302, the input/output control unit 26 acquires the medical image serving as the comparison reference. Specifically, the input/output control unit 26 first searches the image management information 31 (FIG. 7) using the target patient ID and the target part name as search keys, and acquires the record with the latest imaging time among the matching records. Secondly, the input/output control unit 26 acquires from the auxiliary storage device 15 the medical image having the image ID of the record acquired in “first” of step S302. All medical images stored in the auxiliary storage device 15 are assumed to have been assigned image IDs. The medical image acquired here may hereinafter be referred to as the “comparison reference medical image”.
  • in step S303, the relative position correction unit 21 aligns the medical images.
  • both images show the same patient and part, but they may have been acquired by different imaging devices. Even when both were acquired by the same imaging device, its settings and the imaging environment may have differed between the two acquisitions. In such cases the two images cannot be compared as they are. It is therefore necessary to align them as a step before superimposing them to obtain the difference (reference numeral 113).
  • the relative position correction unit 21 first selects two points having a small temporal change such as a skeleton from the comparative reference medical images. How to select such two points differs depending on the part. In the case of a medical image of the brain, such two points are, for example, the center of gravity 111a (FIG. 2) of the skull and the vertex 111b (FIG. 2) of the skull. Hereinafter, the description will be continued by taking the case of the brain as an example. Second, the relative position correction unit 21 selects points (points 112a and 112b in FIG. 2) corresponding to the two points selected in “first” in step S303 from the comparison target medical images.
  • the relative position correction unit 21 places the comparison target medical image and the comparison reference medical image on the same plane. Then, the comparison target medical image is entirely reduced or enlarged so that the point 112a overlaps the point 111a and the point 112b overlaps the point 111b.
  • the comparison target medical image reduced or enlarged in this way may be hereinafter referred to as a “position corrected comparison target medical image”.
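A minimal sketch of this two-point alignment, assuming scipy is available. Deriving a similarity transform from the two landmark pairs is one way to realize the reduction or enlargement described; the text mentions only scaling, so the rotation component here is an extension for generality.

```python
import numpy as np
from scipy import ndimage

def align_two_points(moving, ref_pts, mov_pts):
    """Scale/rotate/translate `moving` so that mov_pts land on ref_pts.

    ref_pts, mov_pts: two (row, col) landmark pairs, e.g. the skull
    centroid (111a/112a) and skull vertex (111b/112b) from the text.
    """
    # Represent 2-D points as complex numbers: row + i*col.
    r1, r2 = (complex(*p) for p in ref_pts)
    m1, m2 = (complex(*p) for p in mov_pts)
    a = (m2 - m1) / (r2 - r1)       # inverse scale+rotation (output -> input)
    b = m1 - a * r1                 # inverse translation
    # Express the complex multiplication as a 2x2 real matrix.
    mat = np.array([[a.real, -a.imag],
                    [a.imag,  a.real]])
    offset = np.array([b.real, b.imag])
    # affine_transform samples the input at (mat @ output_coord + offset).
    return ndimage.affine_transform(moving, mat, offset=offset, order=1)
```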
  • in step S304, the difference image calculation unit 22 determines the feature amounts to be compared. Specifically, the difference image calculation unit 22 first searches the abnormality determination information 32 (FIG. 8) using the target part name as a search key, and displays all matching records on the output device 13. Secondly, the difference image calculation unit 22 accepts, via the input device 12, the user's selection of one or more feature quantities (categories) from among the feature quantities of the records displayed in “first” of step S304. At this time, the difference image calculation unit 22 may instead accept the number of feature quantity categories as input.
  • for example, when “2” is input, the difference image calculation unit 22 selects the feature amounts (“area” and “lightness”) of the records that have the target part name in the abnormality determination information 32 and a priority of “2” or less. Here, the description continues assuming that only “area” is selected.
  • the feature amount selected here may be hereinafter referred to as “selected feature amount”.
  • thirdly, the difference image calculation unit 22 accepts that the user selects either “change rate” or “change amount” via the input device 12. Here, the description continues assuming that “change rate” is selected.
  • in step S305, the difference image calculation unit 22 creates a difference image. Specifically, the difference image calculation unit 22 first superimposes the comparison reference medical image and the position-corrected comparison target medical image, and for each pixel in the union region subtracts the pixel value of the comparison reference medical image from the pixel value of the comparison target medical image.
  • the “union region” is the union of the brain contour region in the comparison reference medical image and the brain contour region in the position-corrected comparison target medical image (the portion belonging to either one).
  • the difference image calculation unit 22 may execute this process for the pixels at all positions in the union region. Alternatively, numbering the pixel positions first, second, third, ... from left to right and then from top to bottom, it may process only the pixels at, for example, every position that is an integer multiple of 10 (thinning-out processing).
  • secondly, the difference image calculation unit 22 identifies the pixels for which the subtraction result of “first” in step S305 exceeds a certain threshold. Usually there is a significant difference between the pixel value of a pixel in which brain tissue is imaged and that of a pixel in which it is not (a void within the cranial space).
  • thirdly, the difference image calculation unit 22 creates a difference image in which the pixels identified in “second” of step S305 are given a predetermined pixel value (for example, one representing “white”) and all other pixels are given another predetermined pixel value (for example, one representing “black”).
  • from the difference image, the user can easily see visually which portion of the part has shrunk or expanded.
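A minimal numpy sketch of this difference image, assuming the two images are already aligned grayscale arrays of equal shape; the threshold value is illustrative.

```python
import numpy as np

def difference_image(ref, target, threshold=30):
    """Mark pixels whose value changed by more than `threshold`."""
    diff = target.astype(np.int16) - ref.astype(np.int16)  # signed difference
    out = np.zeros_like(ref)               # "black" everywhere by default
    out[np.abs(diff) > threshold] = 255    # "white" where the change is large
    return out
```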
  • in step S306, the change rate calculation unit 23 calculates the change rate and related values. Specifically, the change rate calculation unit 23 first counts the pixels of the comparison reference medical image that belong to the region within the brain contour, and multiplies the count by a predetermined conversion coefficient; the result is the area “p”. Secondly, the change rate calculation unit 23 counts the pixels of the comparison target medical image that belong to the region within the brain contour and multiplies the count by the same conversion coefficient; the result is the area “q”. Thirdly, the change rate calculation unit 23 calculates the change rate ((q − p) / p × 100).
  • the change rate calculation unit 23 also calculates the change amount (q − p).
  • fourthly, the change rate calculation unit 23 uses the difference in days between the imaging time of the comparison reference medical image and that of the comparison target medical image to convert the change rate calculated in “third” of step S306 into the change rate per change period. For example, when the day difference is “50 days” and the “change period” corresponding to the selected feature amount is one month, the change rate calculated in “third” of step S306 is multiplied by “30/50”.
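A sketch of the change-rate calculation and change-period conversion of step S306; the 30-day month and the example numbers are assumptions.

```python
def change_rate(p_area, q_area, elapsed_days, period_days=30):
    """p_area, q_area: areas from the comparison reference and comparison
    target images; period_days=30 approximates a one-month change period.
    """
    rate = (q_area - p_area) / p_area * 100.0   # step S306 "third"
    return rate * period_days / elapsed_days    # step S306 "fourth", e.g. x30/50

# Example: images 50 days apart, area 500 -> 490:
# change_rate(500, 490, 50)  ->  -1.2  (% per month)
```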
  • in step S307, the abnormality presence/absence determination unit 24 determines the presence or absence of an abnormality. Specifically, the abnormality presence/absence determination unit 24 first searches the abnormality determination information 32 (FIG. 8) using the target part name and the selected feature amount as search keys, and acquires the normal change rate range of the matching record. If “change amount” was accepted in “third” of step S304, the abnormality presence/absence determination unit 24 acquires the normal change amount range instead. Secondly, the abnormality presence/absence determination unit 24 determines whether the change rate converted in “fourth” of step S306 falls within the normal change rate range acquired in “first” of step S307; if it does, the determination result “normal” is generated, and otherwise the determination result “abnormal” is generated.
  • in step S308, the morbidity rate calculation unit 25 calculates the morbidity. Specifically, the morbidity rate calculation unit 25 first acquires from the doctor diagnosis information 33 (FIG. 9) the records satisfying both of the following conditions. (Condition 1) The site name (column 224) matches the target site name. (Condition 2) The change rate (column 225) corresponding to the selected feature amount matches the change rate converted in “fourth” of step S306, or, even if it does not match exactly, falls within a predetermined error range of it. Now, suppose the target site name is “brain”, the selected feature amount is “area”, and the change rate converted in “fourth” of step S306 is “−1%”. Then the records in the third to sixth rows and the eleventh row of FIG. 9 are acquired.
  • secondly, the morbidity rate calculation unit 25 acquires the disease names of all the records acquired in “first” of step S308, and holds the number of acquired records for each disease name. In the subsequent substeps these counts are turned into morbidities and held as pairs of disease name and morbidity, such as “(mild cognitive impairment, 57%)” (the pairs referred to below as held in “fourth” of step S308).
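A sketch of step S308, assuming the doctor diagnosis information is available as simple records; the tolerance stands in for the "predetermined error range", and the denominator follows the ratio described in the summary (matching records over records having the site).

```python
from collections import Counter

def step_s308_morbidity(records, site, feature, rate, tol=0.5):
    """records: rows of the doctor diagnosis information 33, e.g.
    {"site": "brain", "rates": {"area": -1.0},
     "disease": "mild cognitive impairment"}.
    """
    site_records = [r for r in records if r["site"] == site]
    matching = [r["disease"] for r in site_records
                if feature in r["rates"]
                and abs(r["rates"][feature] - rate) <= tol]
    counts = Counter(matching)
    # morbidity per disease = matching records / records having the site (%)
    return {d: round(100 * n / len(site_records)) for d, n in counts.items()}
```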
  • in step S309, the input/output control unit 26 displays the diagnosis result. Specifically, the input/output control unit 26 displays the diagnosis result display screen 51a (FIG. 13) on the output device 13, and displays the following information in the following columns of the diagnosis result display screen 51a.
  • the determination result table 137 includes a feature amount column 137a, a change rate column 137b, a change amount column 137c, and a determination result column 137d.
  • the feature amount column 137a all feature amounts corresponding to the target part names are displayed.
  • the input / output control unit 26 displays the change rate (or change amount) converted in “fourth” in step S306 in the change rate column 137b (or change amount column 137c) of the row having the selected feature amount.
  • the determination result generated in the “second” in step S307 is displayed in the determination result column 137d. Note that “*” indicates that no processing was performed for the column.
  • the morbidity rate table 138 has a disease name column 138a, a morbidity rate column 138b, and a confirmation column 138c.
  • the input/output control unit 26 displays the pairs of disease name and morbidity held in “fourth” of step S308, such as “(mild cognitive impairment, 57%), (dementia, 14%)”, in the disease name column 138a and the morbidity column 138b.
  • in step S310, the input/output control unit 26 registers the diagnosis result. Specifically, the input/output control unit 26 first accepts that a user such as a doctor enters a check mark in the check box displayed in the confirmation column 138c of any row of the morbidity rate table 138 and presses the “diagnosis result registration” button 139. The user confirms the comparison reference medical image 134a, the position-corrected comparison target image 135a, the difference image 136, and the determination result table 137, and enters the check mark based on his or her own opinion. The check mark need not be entered in the row with the highest morbidity in the morbidity table 138.
  • the input / output control unit 26 creates a new record of the doctor diagnosis information 33 (FIG. 9), and stores the following information in the following columns of the new record.
  • Patient ID column 221: the target patient ID
  • Doctor ID column 222: the user's own doctor ID
  • Diagnostic image ID column 223: the image ID of the comparison reference medical image together with the newly assigned image ID of the comparison target medical image
  • Part name column 224: the target part name
  • Change rate column 225 (or change amount column 226), at the component position corresponding to the selected feature amount: the change rate (or change amount) converted in “fourth” of step S306; at the other component positions: “*”
  • Disease name column 227: the disease name corresponding to the entered check mark, or “(not applicable)”
  • Diagnosis time column 228: the current date and time
  • The first processing procedure then ends.
  • in step S401, the input/output control unit 26 acquires the medical image to be compared.
  • the processing in step S401 is the same as step S301 in the first processing procedure.
  • in step S402, the input/output control unit 26 acquires the medical images serving as comparison references. Specifically, the input/output control unit 26 first searches the image management information 31 (FIG. 7) using the target patient ID and the target part name as search keys, and acquires a predetermined number of matching records in order of newest imaging time. Here, the description continues assuming that the predetermined number is “3”. Secondly, the input/output control unit 26 acquires from the auxiliary storage device 15 the medical images having the image IDs of the records acquired in “first” of step S402. All medical images stored in the auxiliary storage device 15 are assumed to have been assigned image IDs. Of the medical images acquired here, the one with the oldest imaging time may hereinafter be referred to as the “comparison reference medical image”, and the other (two) as “intermediate medical images”.
  • in step S403, the relative position correction unit 21 aligns the medical images.
  • the processing in step S403 is the same as step S303 in the first processing procedure. However, it is assumed that the relative position correction unit 21 aligns the “intermediate medical image” with respect to the “comparison reference medical image” and creates the “intermediate medical image after position correction”.
  • in step S404, the difference image calculation unit 22 determines the feature amounts to be compared.
  • the processing in step S404 is the same as step S304 in the first processing procedure.
  • in step S405, the difference image calculation unit 22 creates a difference image.
  • the processing in step S405 is the same as step S305 in the first processing procedure.
  • however, the difference image calculation unit 22 creates all of the following difference images:
  • the difference image between the “comparison reference medical image” and the “position-corrected intermediate medical image with the earlier imaging time”;
  • the difference image between the “position-corrected intermediate medical image with the earlier imaging time” and the “position-corrected intermediate medical image with the later imaging time”;
  • the difference image between the “position-corrected intermediate medical image with the later imaging time” and the “position-corrected comparison target medical image”.
  • in step S406, the change rate calculation unit 23 calculates the change rate and related values.
  • the processing in step S406 is the same as step S306 in the first processing procedure.
  • however, the change rate calculation unit 23 calculates all of the following change rates (or change amounts):
  • the change rate (or change amount) of the “intermediate medical image with the earlier imaging time” relative to the “comparison reference medical image”;
  • the change rate (or change amount) of the “intermediate medical image with the later imaging time” relative to the “comparison reference medical image”;
  • the change rate (or change amount) of the “comparison target medical image” relative to the “comparison reference medical image”.
  • in this way, the change rate calculation unit 23 acquires, for three medical images with different imaging times, their change rates from the reference medical image. Of course, the change rate from the reference medical image can likewise be acquired for four or more medical images with different imaging times.
  • in step S407, the abnormality presence/absence determination unit 24 determines the presence or absence of an abnormality. Specifically, the abnormality presence/absence determination unit 24 first searches the abnormality determination information 32 (FIG. 8) using the target part name and the selected feature amount as search keys, and acquires the normal change rate range of the matching record. If “change amount” was accepted in “third” of step S404 (S304), the abnormality presence/absence determination unit 24 acquires the normal change amount range instead. Secondly, the abnormality presence/absence determination unit 24 determines whether all the change rates converted in “fourth” of step S406 (S306) fall within the normal change rate range acquired in “first” of step S407. If all are within the range, the determination result “normal” is generated; otherwise, the determination result “abnormal” is generated.
  • In step S408, the morbidity rate calculation unit 25 classifies past diagnosis examples. Specifically, the morbidity rate calculation unit 25 first acquires, from the doctor diagnosis information 33 (FIG. 9), the records that satisfy all of the following conditions 3 to 5:
    - (Condition 3) the part name (column 224) matches the target part name;
    - (Condition 4) a change rate (not "*") corresponding to the selected feature amount is stored in the change rate column 225;
    - (Condition 5) the patient ID is not the target patient ID.
  • Secondly, the morbidity rate calculation unit 25 sorts the records acquired in "first" of step S408 by patient ID, creating a record group for each patient ID. Thirdly, the morbidity rate calculation unit 25 deletes, from the record groups created in "second" of step S408, those in which the number of records belonging to the group is less than a predetermined number.
  • The predetermined number here is, for example, "4". That is, only when there are "4" or more records can the time-series transition of the change rate based on those records be treated as significant (see the sketch below).
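  The grouping and pruning of "second" and "third" of step S408 could look like the following sketch (the "patient_id" field name is an assumption, not taken from the patent tables):

```python
from collections import defaultdict

def significant_record_groups(diagnosis_records, min_records=4):
    """Group doctor-diagnosis records by patient ID and keep only the
    groups with at least `min_records` records, i.e. enough points for
    the time-series transition to be treated as significant."""
    groups = defaultdict(list)
    for record in diagnosis_records:
        groups[record["patient_id"]].append(record)
    return {pid: recs for pid, recs in groups.items() if len(recs) >= min_records}
```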
  • Fourthly, the morbidity rate calculation unit 25 creates, for each patient, a time-series graph of the selected feature amount based on the record groups that remain undeleted after "third" of step S408. That is, the morbidity rate calculation unit 25 creates, for each patient ID, a time-series graph of the change rate for, e.g., the target part "brain" and the selected feature amount "area" (see, for example, reference numeral 231 in FIG. 10).
  • The time-series graph is created in accordance with the method described in step S406.
  • Fifthly, the morbidity rate calculation unit 25 creates a plurality of types and classifies the time-series graphs created in "fourth" of step S408 by those types. It then temporarily stores the patient ID, type, and disease name of each time-series graph in association with one another.
  • Here, a type means one of the patterns obtained when the time-series graphs are classified, without duplication, according to the result of processing them mathematically or physically by an arbitrary method (as described above with reference to FIG. 10).
  • The information temporarily stored at this point is, for example, "(P001, type d, mild cognitive impairment), (P002, type e, dementia), (P003, type f, brain tumor), (P004, type g, meningitis), …"; each such entry is hereinafter referred to as "classified information".
  • One piece of classified information corresponds to one time-series graph (one arbitrary way of classifying is sketched below).
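  Since the patent leaves the classification method open ("an arbitrary method"), the following sketch shows just one possible choice: classifying a time-series graph by the sign pattern of its successive slopes, so graphs with the same up/flat/down pattern fall into the same type. The threshold `flat_eps` is a made-up parameter:

```python
def graph_type(change_rates, flat_eps=0.005):
    """Classify a time-series graph of change rates by the sign pattern
    of its successive slopes ("up" / "flat" / "down"). Only one arbitrary
    choice among the classifications the text alludes to."""
    pattern = []
    for a, b in zip(change_rates, change_rates[1:]):
        delta = b - a
        pattern.append("flat" if abs(delta) < flat_eps
                       else ("up" if delta > 0 else "down"))
    return tuple(pattern)  # hashable, so it can serve as a type key

# One piece of classified information: (patient ID, type, disease name).
classified = [("P001", graph_type([-0.02, -0.05, -0.09]), "mild cognitive impairment")]
```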
  • In step S409, the morbidity rate calculation unit 25 classifies the transition of the image changes leading to the comparison target medical image. Specifically, the morbidity rate calculation unit 25 first creates, based on the plurality of change rates calculated in step S406, a time-series graph of the change rate (for the patient having the target patient ID) that leads from the comparison reference medical image, through the intermediate medical images, to the comparison target medical image. Secondly, the morbidity rate calculation unit 25 determines to which of the types created in "fifth" of step S408 the time-series graph created in "first" of step S409 corresponds. The type determined here may hereinafter be referred to as the "patient type".
  • In step S410, the morbidity rate calculation unit 25 calculates the morbidity rates. Specifically, the morbidity rate calculation unit 25 first counts, for each disease name, the number of pieces of classified information held in "fifth" of step S408 that have the patient type, and temporarily stores each count in association with its disease name. The information temporarily stored at this point is, for example, "(mild cognitive impairment, 8), (dementia, 4), (meningitis, 2), …".
  • Secondly, the morbidity rate calculation unit 25 divides each count per disease name obtained in "first" of step S410 (8, 4, 2, …) by the total number of pieces of classified information having the patient type, and temporarily stores the resulting percentage in association with the disease name. Assume that the total number of pieces of classified information having the patient type is "20". Then, in the above example, the morbidity rate calculation unit 25 holds "(mild cognitive impairment, 40%), (dementia, 20%), (meningitis, 10%), …". "40%" is the result of 8/20 × 100, "20%" the result of 4/20 × 100, and "10%" the result of 2/20 × 100.
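  This counting and percentage calculation reduces to the following sketch, reproducing the 8/20 → 40% style of the example above:

```python
from collections import Counter

def morbidity_rates(classified_info, patient_type):
    """classified_info: iterable of (patient_id, type, disease_name).
    Counts, per disease name, the entries whose type equals the patient
    type and expresses each count as a percentage of all entries of that
    type."""
    diseases = [d for _, t, d in classified_info if t == patient_type]
    total = len(diseases)
    if total == 0:
        return {}
    return {d: 100.0 * n / total for d, n in Counter(diseases).items()}
```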
  • In step S411, the input/output control unit 26 displays the diagnosis result. Specifically, the input/output control unit 26 displays the diagnosis result display screen 51b (FIG. 14) on the output device 13, and displays the following information in the following columns of the diagnosis result display screen 51b.
  • Reference time point column 141a: imaging time point of the comparison reference medical image.
    - Time-before-last point column 141b: imaging time point of the intermediate medical image with the earlier imaging time.
    - Previous time point column 141c: imaging time point of the intermediate medical image with the later imaging time.
    - Current time point column 141d: imaging time point of the comparison target medical image.
  • Column 142a: the difference image between the comparison reference medical image and the position-corrected intermediate medical image with the earlier imaging time.
  • The description of FIG. 13 applies here as it is.
  • The change rate (or change amount) is displayed as a combination of a lower limit and an upper limit (indicating the vertical fluctuation width of the time-series graph).
  • For the morbidity rate table 138, the description of FIG. 13 applies as it is.
  • In step S412, the input/output control unit 26 registers the diagnosis result. Specifically, the input/output control unit 26 first executes the same processing as "first" and "second" of step S310 of the first processing procedure.
  • Secondly, the input/output control unit 26 acquires the records having the target patient ID from the doctor diagnosis information 33 (FIG. 9) and overwrites the disease name of each acquired record with the disease name corresponding to the entered check mark, or with "(not applicable)". If the doctor diagnosis information 33 contains no record having the target patient ID, nothing is done. Thereafter, the second processing procedure ends.
  • (Modification of step S304 of the first processing procedure) When a plurality of "selected feature amounts" (categories) are selected in "second" of step S304, the processing is as follows (a loop sketch follows these modification notes). (1) The difference image calculation unit 22 and the subsequent units repeat the processing of steps S305 to S308 for each selected feature amount. (2) For step S309, the input/output control unit 26 executes the same processing as described above, except for the following: in the "difference" column 136 (FIG. 13), the difference images for the plurality of selected feature amounts are displayed, for example in a slide format, i.e. the "area" difference image (5 seconds), then the "lightness" difference image (5 seconds), and so on.
  • In the determination result table 137, the change rate (or change amount) and the determination result are displayed in every row corresponding to one of the selected feature amounts.
  • For the processing of step S310, the input/output control unit 26 executes the same processing as described above.
  • (Modification of step S404 (S304) of the second processing procedure) When a plurality of "selected feature amounts" (categories) are selected, the processing is as follows. (1) The difference image calculation unit 22 and the subsequent units repeat the processing of steps S405 to S410 for each selected feature amount. (2) For step S411, the input/output control unit 26 executes the same processing as described above, except for the following.
  • The difference images for the plurality of selected feature amounts are displayed, and may be displayed in a slide format. In the column 143, time-series graphs for the plurality of selected feature amounts are displayed.
  • For the processing of step S412, the input/output control unit 26 executes the same processing as described above.
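  Structurally, both modifications repeat the per-feature-amount pipeline once per selected category; a trivial sketch, where `process_one_feature` stands in for steps S305–S308 (first procedure) or S405–S410 (second procedure) and is not a function defined in the patent:

```python
def repeat_per_feature(selected_features, process_one_feature):
    """Run the per-feature pipeline once per selected category and
    collect the results keyed by category."""
    return {feature: process_one_feature(feature) for feature in selected_features}

# e.g. results = repeat_per_feature(["area", "lightness"], process_one_feature)
```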
  • The morbidity rate evaluation apparatus 1 of the present embodiment has the following effects.
  • (1) The morbidity rate evaluation apparatus 1 searches past diagnosis examples using the received difference between feature amounts. If the number of feature amount categories is limited here, neither the medical images of the past diagnosis examples nor the received medical images are required to be of high image quality. Therefore, large amounts of data whose image quality cannot always be guaranteed, such as periodic health checkup images, can be used effectively. Furthermore, since the morbidity rate evaluation apparatus 1 calculates a difference, measurement errors unique to an imaging device (such as image distortion) can be cancelled out as long as medical images captured by the same imaging device 3 are used continuously. (2) The morbidity rate evaluation apparatus 1 uses either or both of the "rate" and the "amount" of the difference, so a multifaceted morbidity evaluation can be performed according to the feature amount category.
  • (3) The morbidity rate evaluation apparatus 1 feeds diagnosis results back as the doctor diagnosis information 33. Therefore, as the number of uses increases, the influence of the random measurement errors that occur each time a medical image is captured becomes relatively small, and the diagnostic accuracy improves. (4) The morbidity rate evaluation apparatus 1 determines "abnormal" or "normal" for each feature amount category, so it is easy to know in which category the feature amount has changed greatly. (5) The morbidity rate evaluation apparatus 1 searches for diagnosis examples using the transition of changes at three or more different time points, so the diagnosis result reflects a longer-term viewpoint and has a positive effect on the patient's attitude toward consultation.
  • (6) The morbidity rate evaluation apparatus 1 uses feature amounts of a plurality of categories, so a feature amount that matches the characteristics of the part or disease can be used flexibly. (7) The morbidity rate evaluation apparatus 1 displays difference images, so changes in the difference are easy to recognize visually. (8) The morbidity rate evaluation apparatus 1 converts the difference into a difference over a period of predetermined length, so the difference is intuitive to grasp and easy to compare across parts, diseases, and feature amount categories. (9) The morbidity rate evaluation apparatus 1 aligns the medical images, so medical images captured under different setting conditions and imaging environments can be compared.
  • The present invention is not limited to the embodiment described above; various modifications are included.
  • The above embodiment has been described in detail for ease of understanding of the present invention, and the invention is not necessarily limited to one provided with all of the described configurations.
  • A part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware by designing some or all of them as, for example, an integrated circuit.
  • Each of the above configurations, functions, and the like may be realized in software by having a processor interpret and execute a program that realizes each function.
  • Information such as programs, tables, and files that realize each function can be stored in a recording device such as a memory, a hard disk, or an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
  • The control lines and information lines shown are those considered necessary for the explanation; not all the control lines and information lines of an actual product are necessarily shown. In practice, almost all components may be considered to be connected to one another.

Abstract

The present invention concerns a morbidity rate evaluation device characterized in that it comprises: a storage unit for storing diagnosis information, which stores body parts in association with differences between two images of the part taken at different time points and with a disease name; and a control unit for receiving input of an image feature amount category and of the part of a user to be diagnosed, calculating the difference in the received category between two images, taken at different time points, of the part of the patient being diagnosed, searching the diagnosis information using the part being diagnosed and the calculated difference as search keys, acquiring the disease names of the matching medical records, calculating morbidity rates based on the proportions of the matching medical records among the total number of medical records for that part, and displaying the calculated morbidity rates for each acquired disease name.
PCT/JP2013/072856 2013-08-27 2013-08-27 Dispositif d'évaluation de taux d'incidence, procédé d'évaluation de taux d'incidence, et programme d'évaluation de taux d'incidence WO2015029135A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2015533822A JP6208243B2 (ja) 2013-08-27 2013-08-27 罹患率評価装置、罹患率評価方法及び罹患率評価プログラム
PCT/JP2013/072856 WO2015029135A1 (fr) 2013-08-27 2013-08-27 Dispositif d'évaluation de taux d'incidence, procédé d'évaluation de taux d'incidence, et programme d'évaluation de taux d'incidence
TW103125674A TWI524198B (zh) 2013-08-27 2014-07-28 A morbidity assessment device, a morbidity assessment method, and a morbidity assessment program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2013/072856 WO2015029135A1 (fr) 2013-08-27 2013-08-27 Dispositif d'évaluation de taux d'incidence, procédé d'évaluation de taux d'incidence, et programme d'évaluation de taux d'incidence

Publications (1)

Publication Number Publication Date
WO2015029135A1 true WO2015029135A1 (fr) 2015-03-05

Family

ID=52585754

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/072856 WO2015029135A1 (fr) 2013-08-27 2013-08-27 Dispositif d'évaluation de taux d'incidence, procédé d'évaluation de taux d'incidence, et programme d'évaluation de taux d'incidence

Country Status (3)

Country Link
JP (1) JP6208243B2 (fr)
TW (1) TWI524198B (fr)
WO (1) WO2015029135A1 (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106294751A (zh) * 2016-08-10 2017-01-04 依据数据(湖南)科技有限公司 基于关键词网络相关性分析的异常检查报告自动识别方法
JP2017046814A (ja) * 2015-08-31 2017-03-09 キヤノン株式会社 情報処理装置、画像処理装置、情報処理システム、情報処理方法、及びプログラム。
WO2019003749A1 (fr) * 2017-06-30 2019-01-03 富士フイルム株式会社 Dispositif de traitement d'images médicales, procédé et programme associés
JP2019037541A (ja) * 2017-08-25 2019-03-14 株式会社Aze 医用画像処理装置、医用画像処理装置の制御方法、及びプログラム
GB2569427A (en) * 2017-10-13 2019-06-19 Optellum Ltd System, method and apparatus for assisting a determination of medical images
WO2021060461A1 (fr) * 2019-09-25 2021-04-01 富士フイルム株式会社 Dispositif de traitement d'image, procédé et programme
US11158054B2 (en) 2017-08-28 2021-10-26 Fujifilm Corporation Medical image processing apparatus, medical image processing method, and medical image processing program
  • US11200674B2 2018-10-29 2021-12-14 Canon Kabushiki Kaisha Image processing apparatus, method, and storage medium for enabling user to recognize change over time represented by subtraction image
JP7451170B2 (ja) 2019-12-20 2024-03-18 エヌ・ティ・ティ・コミュニケーションズ株式会社 情報処理装置、情報処理方法およびプログラム

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109949274B (zh) 2019-02-25 2020-12-25 腾讯科技(深圳)有限公司 一种图像处理方法、装置及系统

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002230518A (ja) * 2000-11-29 2002-08-16 Fujitsu Ltd 診断支援プログラム、診断支援プログラムを記録したコンピュータ読取可能な記録媒体、診断支援装置及び診断支援方法
JP2004288047A (ja) * 2003-03-24 2004-10-14 Fujitsu Ltd 診療支援システム及び診療支援プログラム
JP2007279942A (ja) * 2006-04-05 2007-10-25 Fujifilm Corp 類似症例検索装置、類似症例検索方法およびそのプログラム
JP2009095550A (ja) * 2007-10-18 2009-05-07 Canon Inc 診断支援装置、診断支援装置の制御方法、およびそのプログラム
JP2011092685A (ja) * 2009-09-30 2011-05-12 Fujifilm Corp 診断支援システム、診断支援プログラムおよび診断支援方法
JP2013025723A (ja) * 2011-07-25 2013-02-04 Panasonic Corp 類似症例検索装置および読影知識抽出装置

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10304182B2 (en) 2015-08-31 2019-05-28 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and recording medium
JP2017046814A (ja) * 2015-08-31 2017-03-09 キヤノン株式会社 情報処理装置、画像処理装置、情報処理システム、情報処理方法、及びプログラム。
US10650516B2 (en) 2015-08-31 2020-05-12 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and recording medium
CN106294751A (zh) * 2016-08-10 2017-01-04 依据数据(湖南)科技有限公司 基于关键词网络相关性分析的异常检查报告自动识别方法
US11216945B2 (en) 2017-06-30 2022-01-04 Fujifilm Corporation Image processing for calculation of amount of change of brain
JPWO2019003749A1 (ja) * 2017-06-30 2020-05-21 富士フイルム株式会社 医用画像処理装置、方法およびプログラム
WO2019003749A1 (fr) * 2017-06-30 2019-01-03 富士フイルム株式会社 Dispositif de traitement d'images médicales, procédé et programme associés
JP2019037541A (ja) * 2017-08-25 2019-03-14 株式会社Aze 医用画像処理装置、医用画像処理装置の制御方法、及びプログラム
JP7140477B2 (ja) 2017-08-25 2022-09-21 キヤノンメディカルシステムズ株式会社 医用画像処理装置、医用画像処理装置の制御方法、及びプログラム
US11158054B2 (en) 2017-08-28 2021-10-26 Fujifilm Corporation Medical image processing apparatus, medical image processing method, and medical image processing program
GB2569427A (en) * 2017-10-13 2019-06-19 Optellum Ltd System, method and apparatus for assisting a determination of medical images
GB2569427B (en) * 2017-10-13 2021-12-15 Optellum Ltd System, method and apparatus for assisting a determination of medical images
US11594005B2 (en) 2017-10-13 2023-02-28 Optellum Limited System, method and apparatus for assisting a determination of medical images
  • US11200674B2 2018-10-29 2021-12-14 Canon Kabushiki Kaisha Image processing apparatus, method, and storage medium for enabling user to recognize change over time represented by subtraction image
WO2021060461A1 (fr) * 2019-09-25 2021-04-01 富士フイルム株式会社 Dispositif de traitement d'image, procédé et programme
JP7451170B2 (ja) 2019-12-20 2024-03-18 エヌ・ティ・ティ・コミュニケーションズ株式会社 情報処理装置、情報処理方法およびプログラム

Also Published As

Publication number Publication date
TWI524198B (zh) 2016-03-01
JP6208243B2 (ja) 2017-10-04
JPWO2015029135A1 (ja) 2017-03-02
TW201518967A (zh) 2015-05-16

Similar Documents

Publication Publication Date Title
JP6208243B2 (ja) 罹患率評価装置、罹患率評価方法及び罹患率評価プログラム
US10540763B2 (en) Systems and methods for matching, naming, and displaying medical images
US9271651B2 (en) System and method for integrated quantifiable detection, diagnosis and monitoring of disease using patient related time trend data
Niemeijer et al. Retinopathy online challenge: automatic detection of microaneurysms in digital color fundus photographs
US8934685B2 (en) System and method for analyzing and visualizing local clinical features
US7634121B2 (en) Method and system for rule-based comparison study matching to customize a hanging protocol
US8010381B2 (en) System and method for disease diagnosis from patient structural deviation data
JP6027065B2 (ja) 類似画像検索装置、類似画像検索装置の作動方法、および類似画像検索プログラム
EP2812828B1 (fr) Optimisation interactive de bases de données de balayage pour test statistique
US20110129129A1 (en) System and method for integrated quantifiable detection, diagnosis and monitoring of disease using population related data for determining a disease signature
US20110129131A1 (en) System and method for integrated quantifiable detection, diagnosis and monitoring of disease using population related time trend data and disease profiles
JP6099593B2 (ja) 類似症例検索装置、類似症例検索方法、及び類似症例検索プログラム
US11195610B2 (en) Priority alerts based on medical information
US20080166070A1 (en) Method for providing adaptive hanging protocols for image reading
CN110598722B (zh) 多模态神经影像数据自动信息融合系统
WO2018116727A1 (fr) Dispositif de recherche de cas similaire, procédé de fonctionnement et programme de fonctionnement associé, et système de recherche de cas similaire
US20110129130A1 (en) System and method for integrated quantifiable detection, diagnosis and monitoring of disease using population related time trend data
JP2015203920A (ja) 類似症例検索システム、類似症例検索方法及びプログラム
JP5732015B2 (ja) グラフ作成装置およびグラフ作成方法並びにグラフ作成プログラム
JP2015191287A (ja) 類似症例検索装置、類似症例検索方法、及び類似症例検索プログラム
JP6054295B2 (ja) 臨床状態タイムライン
US20200111558A1 (en) Information processing apparatus, medical image display apparatus, and storage medium
CN111681749A (zh) 一种病理科规范化工作管理和诊断咨询系统及方法
Venson et al. Diagnostic concordance between mobile interfaces and conventional workstations for emergency imaging assessment
Cai et al. Image quality assessment on CT reconstruction images: Task-specific vs. general quality assessment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 13892614; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2015533822; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 13892614; Country of ref document: EP; Kind code of ref document: A1)