WO2009090894A1 - Dynamic image diagnosis support system (Système de support pour imagerie diagnostique dynamique) - Google Patents

Dynamic image diagnosis support system (Système de support pour imagerie diagnostique dynamique)

Info

Publication number
WO2009090894A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
small
area
abnormal
region
Prior art date
Application number
PCT/JP2009/050026
Other languages
English (en)
Japanese (ja)
Inventor
Shintarou Muraoka
Original Assignee
Konica Minolta Medical & Graphic, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Medical & Graphic, Inc. filed Critical Konica Minolta Medical & Graphic, Inc.
Priority to JP2009549997A (granted as JP5136562B2)
Publication of WO2009090894A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/026 Measuring blood flow
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10116 X-ray image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30061 Lung

Definitions

  • the present invention relates to a dynamic image diagnosis support system.
  • a dynamic image of a region to be examined is photographed using a semiconductor image sensor such as an FPD (flat panel detector).
  • Patent Document 1 (JP 2004-31434 A) describes a technique for creating difference images between frames of a dynamic image and displaying the resulting difference moving image.
  • Some lung diseases are diagnosed from the combination of the aeration (ventilation) state of the lung field and the state of blood flow. For example, a location with sufficient aeration but insufficient blood flow is diagnosed as suspected of a chronic lung condition, while a location with sufficient blood flow but insufficient aeration is diagnosed as suspected of chronic bronchitis.
  • An object of the present invention is to provide diagnosis support information that takes into account the state of blood flow accompanying respiration.
  • A dynamic image diagnosis support system includes: imaging means for capturing a dynamic image of the chest of a human body and obtaining a plurality of frame images indicating the dynamics of the chest;
  • region dividing means for setting a reference image from the plurality of frame images, extracting a lung field region from the reference image, dividing the extracted lung field region into a plurality of small regions, and calculating, in each of the other frame images, the small region corresponding to each of the plurality of small regions of the reference image;
  • aeration determination means for performing image analysis on each corresponding small region across the plurality of frame images and determining, for each small region, whether the state of aeration is abnormal;
  • blood flow determination means for performing image analysis on each corresponding small region across the plurality of frame images and determining, for each small region, whether the state of blood flow is abnormal;
  • abnormality determination means for determining whether each small region is abnormal based on the determination result of the aeration determination means and the determination result of the blood flow determination means; and
  • display means for displaying at least the determination result of the abnormality determination means.
  • The invention described in claim 2 is the invention described in claim 1, wherein the aeration determination means calculates, for each small region, an area change rate between the plurality of frame images, and determines, for each small region, whether the state of aeration is abnormal based on the calculated area change rate.
  • The invention described in claim 3 is the invention described in claim 1, wherein the blood flow determination means extracts a blood vessel region from each of the plurality of frame images, calculates a density change amount between the plurality of frame images for each small region including the extracted blood vessel region, and determines, for each small region, whether the state of blood flow is abnormal based on the calculated density change amount.
  • The invention described in claim 4 is the invention described in claim 3, wherein the blood flow determination means performs warping processing on the plurality of frame images to match the shapes of the lung field regions of the plurality of frame images, and then extracts the blood vessel region.
  • The invention according to claim 5 is the invention according to any one of claims 1 to 4, wherein the display means superimposes the determination result of the abnormality determination means on the reference image.
  • The invention according to claim 6 is the invention according to claim 5, wherein the display means superimposes an annotation on each small region of the reference image determined to be abnormal by the abnormality determination means.
  • The invention according to claim 7 is the invention according to any one of claims 1 to 4, wherein the display means displays the plurality of frame images as a moving image and, in each frame image displayed in the moving image, superimposes an annotation on each small region determined to be abnormal by the abnormality determination means.
  • According to the invention, warping processing is performed on the plurality of frame images to match the shape of the lung field region between the frame images before the blood vessel region is extracted, so the patient does not need to hold his or her breath at the time of imaging in order to match the shape of the lung field between frames.
  • FIG. 1 is a diagram showing an example of the overall configuration of the dynamic image diagnosis support system in an embodiment of the present invention. FIG. 2 is a flowchart showing the imaging flow executed in the imaging apparatus and the imaging console.
  • FIG. 7A and FIG. 7B are diagrams for explaining the peripheral region used for abnormality determination in the aeration determination process of FIG. 6, including the case where a small region is located at the lung field boundary.
  • FIG. 8A is a diagram for explaining the left and right lung field regions compared in the aeration determination process of FIG. 6.
  • FIG. 1 shows the overall configuration of a dynamic image diagnosis support system 100 in the present embodiment.
  • As shown in FIG. 1, in the dynamic image diagnosis support system 100, an imaging apparatus 1 and an imaging console 2 are connected by a communication cable or the like, and the imaging console 2, a diagnostic console 3, an arithmetic device 4, and an image server 5 are connected via a communication network NT such as a LAN (Local Area Network).
  • Each device constituting the dynamic image diagnosis support system 100 conforms to the DICOM (Digital Imaging and Communications in Medicine) standard, and communication between the devices is performed according to DICOM.
  • The imaging apparatus 1 performs dynamic imaging of the chest of a human body, capturing, for example, morphological changes of the lungs due to expansion and contraction associated with breathing, the pulsation of the heart, and the like.
  • Dynamic imaging is performed by continuously irradiating a human chest with radiation such as X-rays to acquire a plurality of images (that is, continuous imaging). A series of images obtained by this continuous shooting is called a dynamic image. Each of the plurality of images constituting the dynamic image is called a frame image.
  • the imaging apparatus 1 includes a radiation source 11, a radiation irradiation control device 12, a radiation detection unit 13, a reading control device 14, a cycle detection sensor 15, a cycle detection device 16, and the like.
  • the radiation source 11 irradiates the subject M with radiation (X-rays) under the control of the radiation irradiation control device 12.
  • the radiation irradiation control device 12 is connected to the imaging console 2 and controls the radiation source 11 based on the radiation irradiation conditions input from the imaging console 2 to perform radiation imaging.
  • The radiation irradiation conditions input from the imaging console 2 include, for example, the pulse rate, pulse width, and pulse interval during continuous irradiation, the imaging start and end timing, the X-ray tube current value, the X-ray tube voltage value, the filter type, and the like.
  • the pulse rate is the number of times of radiation irradiation per second, and matches the frame rate described later.
  • the pulse width is a radiation irradiation time per one irradiation.
  • the pulse interval is the time from the start of one radiation irradiation to the start of the next radiation irradiation in continuous imaging, and coincides with a frame interval described later.
  • the radiation detection unit 13 is configured by a semiconductor image sensor such as an FPD.
  • The FPD has, for example, a glass substrate on which a plurality of pixels are arranged in a matrix; each pixel detects, at a predetermined position on the substrate, the radiation emitted from the radiation source 11 and transmitted through at least the subject M according to its intensity, converts the detected radiation into an electrical signal, and stores it.
  • Each pixel is configured by a switching unit such as a TFT (Thin Film Transistor).
  • the reading control device 14 is connected to the imaging console 2.
  • the reading control device 14 controls the switching unit of each pixel of the radiation detection unit 13 based on the image reading condition input from the imaging console 2 to switch the reading of the electrical signal accumulated in each pixel.
  • the image data is acquired by reading the electrical signal accumulated in the radiation detection unit 13.
  • the reading control device 14 outputs the acquired image data to the imaging console 2.
  • the image reading conditions are, for example, a frame rate, a frame interval, a pixel size, an image size (matrix size), and the like.
  • the frame rate is the number of frame images acquired per second and matches the pulse rate.
  • the frame interval is the time from the start of one frame image acquisition operation to the start of the next frame image acquisition operation in continuous shooting, and coincides with the pulse interval.
  • the radiation irradiation control device 12 and the reading control device 14 are connected to each other, and exchange synchronization signals with each other to synchronize the radiation irradiation operation and the image reading operation.
  • the cycle detection sensor 15 detects the respiratory motion state of the subject M and outputs detection information to the cycle detection device 16.
  • As the cycle detection sensor 15, for example, a respiration monitor belt, a CCD (Charge-Coupled Device) camera, an optical camera, a spirometer, or the like can be used.
  • Based on the detection information input from the cycle detection sensor 15, the cycle detection device 16 determines the number of respiratory cycles and which state of one cycle the current respiratory motion is in (for example, inspiration, the conversion point from inspiration to expiration, expiration, or the conversion point from expiration to inspiration), and outputs the detection result (cycle information) to the control unit 21 of the imaging console 2.
  • For example, when the cycle detection device 16 receives detection information from the cycle detection sensor 15 (a respiration monitor belt, CCD camera, optical camera, spirometer, or the like) indicating that the lungs are at the conversion point from inspiration to expiration, it sets that timing as the base point of one cycle and recognizes the period until the next time this state is detected as one cycle.
  • The imaging console 2 outputs radiation irradiation conditions and image reading conditions to the imaging apparatus 1 to control the radiation imaging and image reading operations of the imaging apparatus 1, and also displays the image data acquired by the imaging apparatus 1 so that positioning can be confirmed and it can be checked whether the image is suitable for diagnosis.
  • the photographing console 2 includes a control unit 21, a storage unit 22, an operation unit 23, a display unit 24, and a communication unit 25, and each unit is connected by a bus 26.
  • the control unit 21 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), and the like.
  • The CPU of the control unit 21 reads out the system program and various processing programs stored in the storage unit 22 in accordance with operations on the operation unit 23, loads them into the RAM, and executes various processes according to the loaded programs, thereby centrally controlling the operation of each part of the imaging console 2 and the radiation irradiation and reading operations of the imaging apparatus 1.
  • the storage unit 22 is configured by a nonvolatile semiconductor memory, a hard disk, or the like.
  • the storage unit 22 stores various programs executed by the control unit 21 and data such as parameters necessary for execution of processing by the programs or processing results.
  • The storage unit 22 stores an imaging control processing program for controlling the imaging flow shown in FIG. 2.
  • the storage unit 22 stores radiation irradiation conditions and image reading conditions in association with the examination target region.
  • Various programs are stored in the form of readable program code, and the control unit 21 sequentially executes operations according to the program code.
  • the operation unit 23 includes a keyboard having a cursor key, numeric input keys, various function keys, and the like, and a pointing device such as a mouse.
  • The operation unit 23 outputs an instruction signal input by key operation on the keyboard or by mouse operation to the control unit 21.
  • the operation unit 23 may include a touch panel on the display screen of the display unit 24. In this case, the operation unit 23 outputs an instruction signal input via the touch panel to the control unit 21.
  • The display unit 24 includes a monitor such as an LCD (Liquid Crystal Display) or a CRT (Cathode Ray Tube), and displays input instructions from the operation unit 23, data, and the like in accordance with display signal instructions input from the control unit 21.
  • the communication unit 25 includes a LAN adapter, a modem, a TA (Terminal Adapter), and the like, and controls data transmission / reception with each device connected to the communication network NT.
  • the diagnostic console 3 is a terminal that acquires image data of a dynamic image from the image server 5, displays a dynamic image based on the acquired image data, and makes a diagnostic interpretation by a doctor.
  • the diagnostic console 3 includes a control unit 31, a storage unit 32, an operation unit 33, a display unit 34, and a communication unit 35, and each unit is connected by a bus 36.
  • the control unit 31 includes a CPU, a RAM, and the like.
  • The CPU of the control unit 31 reads out the system program and various processing programs stored in the storage unit 32 in accordance with operations on the operation unit 33, loads them into the RAM, and executes various processes, including the display control process described later, according to the loaded programs, thereby centrally controlling the operation of each part of the diagnostic console 3.
  • the storage unit 32 is configured by a nonvolatile semiconductor memory, a hard disk, or the like.
  • The storage unit 32 stores various programs, including the display control processing program executed by the control unit 31, as well as data such as parameters necessary for executing the processes and processing results. These various programs are stored in the form of readable program code, and the control unit 31 sequentially executes operations according to the program code.
  • the operation unit 33 includes a keyboard having cursor keys, numeric input keys, various function keys, and the like, and a pointing device such as a mouse.
  • The operation unit 33 outputs an instruction signal input by key operation on the keyboard or by mouse operation to the control unit 31.
  • the operation unit 33 may include a touch panel on the display screen of the display unit 34, and in this case, an instruction signal input via the touch panel is output to the control unit 31.
  • the display unit 34 as a display unit is configured by a monitor such as an LCD or a CRT, and displays an input instruction, data, or the like from the operation unit 33 in accordance with an instruction of a display signal input from the control unit 31.
  • the communication unit 35 includes a LAN adapter, a modem, a TA, and the like, and controls data transmission / reception with each device connected to the communication network NT.
  • the arithmetic device 4 performs image analysis processing on the image data of the dynamic image transmitted from the imaging console 2 and transmits it to the image server 5.
  • the arithmetic device 4 includes a control unit 41, a storage unit 42, a communication unit 43, and the like, and each unit is connected by a bus 44.
  • the control unit 41 includes a CPU, a RAM, and the like.
  • the CPU of the control unit 41 reads out the system program and various processing programs stored in the storage unit 42 and expands them in the RAM, and executes various processes including image analysis processing described later according to the expanded programs. Then, the operation of each part of the arithmetic unit 4 is centrally controlled.
  • the storage unit 42 is configured by a nonvolatile semiconductor memory, a hard disk, or the like.
  • The storage unit 42 stores various programs, including the image analysis processing program executed by the control unit 41, as well as data such as parameters necessary for executing the processes and processing results. These various programs are stored in the form of readable program code, and the control unit 41 sequentially executes operations according to the program code.
  • the communication unit 43 includes a LAN adapter, a modem, a TA, and the like, and controls data transmission / reception with each device connected to the communication network NT.
  • the image server 5 is a computer device that has a storage device composed of a hard disk or the like, and stores and manages the image data of the dynamic image transmitted from the arithmetic device 4 in the storage device so as to be searchable. When a dynamic image acquisition request is transmitted from the diagnostic console 3, the image server 5 reads out the requested dynamic image data from the storage device and transmits it to the diagnostic console 3.
  • Next, the operation of the dynamic image diagnosis support system 100 will be described. (Imaging operation) First, the imaging flow in the dynamic image diagnosis support system 100 will be described.
  • FIG. 2 shows a photographing flow executed in the photographing apparatus 1 and the photographing console 2.
  • the operation section 23 of the imaging console 2 is operated by the imaging technician, and patient information (patient name, height, weight, age, sex, etc.) of the imaging target (subject M) is input (step S1).
  • Next, the control unit 21 of the imaging console 2 reads the radiation irradiation conditions from the storage unit 22 and sets them in the radiation irradiation control device 12, and reads the image reading conditions from the storage unit 22 and sets them in the reading control device 14 (step S2).
  • The control unit 21 then outputs an instruction to start cycle detection to the cycle detection device 16, and detection of the respiratory cycle of the subject M by the cycle detection sensor 15 and the cycle detection device 16 is started (step S4).
  • the control unit 21 outputs an imaging start instruction to the radiation irradiation control device 12 and the reading control device 14. Then, dynamic imaging is started (step S5).
  • When the cycle detection device 16 detects a predetermined number of respiratory cycles, the control unit 21 outputs an imaging end instruction to the radiation irradiation control device 12 and the reading control device 14, and the imaging operation is stopped.
  • Image data acquired by imaging is sequentially input to the imaging console 2, stored in the storage unit 22 by the control unit 21 in association with a number indicating the imaging order (step S6), and displayed on the display unit 24 (step S7).
  • The radiographer confirms the positioning and the like based on the displayed dynamic image, determines whether the images acquired by imaging are suitable for diagnosis (imaging OK) or re-imaging is necessary (imaging NG), and operates the operation unit 23 to input the determination result.
  • When a determination result indicating imaging OK is input by a predetermined operation of the operation unit 23 (step S8; YES), the control unit 21 attaches to each of the series of image data acquired by dynamic imaging identification information for identifying the dynamic image, together with information such as patient information, examination site, radiation irradiation conditions, image reading conditions, imaging order number, and cycle information (for example, written in the header area of image data in DICOM format), and transmits the data to the arithmetic device 4 via the communication unit 25 (step S9). Then, this process ends.
  • On the other hand, when a determination result indicating imaging NG is input by a predetermined operation of the operation unit 23 (step S8; NO), the control unit 21 deletes the series of image data stored in the storage unit 22 (step S10), and this process ends. (Operation of the arithmetic device 4) Next, the operation of the arithmetic device 4 will be described.
  • In the arithmetic device 4, the image analysis process shown in FIG. 3 is executed through cooperation between the control unit 41 and the image analysis processing program stored in the storage unit 42.
  • First, a region division process is executed (step S11).
  • Fig. 4 shows the flow of area division processing. This process is realized by the cooperation of the control unit 41 and the area division processing program stored in the storage unit 42.
  • a reference image (referred to as reference image P1) is set from among a plurality of frame images constituting a dynamic image (step S101).
  • The reference image P1 may be any frame image, but here it will be described as the first frame image. That is, the reference image P1 in the present embodiment is an image captured at the conversion point from inspiration to expiration, which has the maximum lung field area within one respiratory cycle.
  • the lung field region is extracted from the reference image P1 (step S102).
  • a lung field region is extracted by the following process.
  • a density histogram is created from the signal value of each pixel, and a threshold value is obtained by discriminant analysis or the like.
  • a region having a signal higher than the obtained threshold is extracted as a lung field region candidate.
  • edge detection is performed near the boundary of the candidate area, and a point where the edge is maximum in a small area near the boundary is extracted along the boundary. Then, the extracted edge point is approximated by a polynomial function to obtain a boundary line of the lung field region.
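  • As a rough illustration of the discriminant-analysis thresholding step above, the following Python sketch (not taken from the patent; the NumPy-based helper names and the explicit use of Otsu's method are assumptions of this example) computes a threshold from the density histogram and returns a high-signal lung field candidate mask:

```python
import numpy as np

def otsu_threshold(image: np.ndarray, bins: int = 256) -> float:
    """Discriminant-analysis (Otsu) threshold on a grayscale image."""
    hist, edges = np.histogram(image.ravel(), bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    total = hist.sum()
    cum_w = np.cumsum(hist)            # number of pixels at or below each bin
    cum_m = np.cumsum(hist * centers)  # weighted intensity sum at or below each bin
    best_t, best_var = centers[0], -1.0
    for i in range(1, bins):
        w0, w1 = cum_w[i - 1], total - cum_w[i - 1]
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_m[i - 1] / w0
        m1 = (cum_m[-1] - cum_m[i - 1]) / w1
        between_var = w0 * w1 * (m0 - m1) ** 2   # between-class variance
        if between_var > best_var:
            best_var, best_t = between_var, centers[i]
    return best_t

def lung_field_candidate(reference_image: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels whose signal is higher than the threshold."""
    return reference_image > otsu_threshold(reference_image)
```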
  • a rectangular area including the extracted lung field area is set, and the rectangular area is divided into small areas A1 (shown by dotted lines in FIG. 5) of 0.4 to 4 cm square (step S103).
  • Next, 1 is set in a counter n (step S104), and local matching processing is performed to determine which position in the (n+1)th frame image each small region A1 in the nth frame image in the imaging order corresponds to (step S105).
  • the local matching process can be performed by a method described in, for example, Japanese Patent Application Laid-Open No. 2001-157667. Specifically, first, the search area of each small area A1 in the frame image with the shooting order n is set in the frame image with the shooting order (n + 1).
  • Each search region set in the (n+1)th frame image has the same center point (x, y) as the corresponding small region A1 in the nth frame image, and its vertical and horizontal widths are set larger than those of the small region A1 (for example, the vertical and horizontal widths are each doubled).
  • Then, for each small region A1, the position with the highest matching degree within the search region set in the (n+1)th frame image is calculated as the corresponding position in the (n+1)th frame image. The least squares method or a cross-correlation coefficient is used as the matching index (see the sketch below).
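  • The local matching step can be sketched as follows. This is an illustrative fragment under assumptions not stated in the patent: frames are grayscale NumPy arrays, the search window is twice the block size as in the example above, and a normalized cross-correlation coefficient is the matching index.

```python
import numpy as np

def match_block(frame_n: np.ndarray, frame_n1: np.ndarray,
                center: tuple, block: int = 16) -> tuple:
    """Find the centre of the best-matching block in frame_n1.

    The template is the block of frame_n centred on `center`; the search
    region in frame_n1 has the same centre and twice the width/height.
    """
    cy, cx = center
    h = block // 2
    tmpl = frame_n[cy - h:cy + h, cx - h:cx + h].astype(float)
    tmpl = tmpl - tmpl.mean()

    best_score, best_pos = -np.inf, center
    for dy in range(-h, h + 1):            # search window = 2x the block size
        for dx in range(-h, h + 1):
            y, x = cy + dy, cx + dx
            cand = frame_n1[y - h:y + h, x - h:x + h].astype(float)
            if cand.shape != tmpl.shape:
                continue                   # skip windows clipped by the border
            cand = cand - cand.mean()
            denom = np.sqrt((tmpl ** 2).sum() * (cand ** 2).sum())
            if denom == 0:
                continue
            score = (tmpl * cand).sum() / denom   # cross-correlation coefficient
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos
```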
  • In step S107, it is determined whether (n+1) exceeds the number of frame images. If not (step S107; NO), the counter n is incremented by 1 (step S108) and the process returns to step S105. If (n+1) exceeds the number of frame images (step S107; YES), each frame image is divided into new small regions A2 whose vertices are the center points of the small regions A1, as indicated by the solid lines in FIG. 5 (step S109), the region division process ends, and the process proceeds to step S12 in FIG. 3.
  • In the above description, the small regions A1 are generated by dividing the region including the lung field region of the reference image P1 at equal vertical and horizontal intervals (0.4 to 4 cm). Alternatively, structurally characteristic points such as blood vessel bifurcation points may be obtained by extracting the blood vessel region in the lung field of the frame image (details will be described later), and rectangular regions centered on the obtained points may be used as the small regions A1 that are the targets of the local matching process.
  • In the above description, the local matching process is performed on the image data of each frame image itself; however, it may also be performed on edge images extracted by a gradient filter or the like.
  • The rib portion in each frame image can be recognized by matching the initial rib shape detected by edge detection (parabolic approximate shape detection) against a rib shape model (teacher data).
  • In step S12 of FIG. 3, an aeration determination process is executed.
  • FIG. 6 shows the flow of the aeration determination process. This process is realized through cooperation between the control unit 41 and the aeration determination processing program stored in the storage unit 42.
  • First, for each small region A2 of each frame image, the area change rate with respect to the reference image P1 (specifically, with respect to the corresponding small region A2 in the reference image P1) and the area change rate with respect to the previous frame image in the imaging order (specifically, with respect to the corresponding small region A2 in the previous frame image) are calculated (step S201).
  • The area change rate of each small region A2 of each frame image with respect to the reference image P1 can be obtained, for example, as the ratio a/b of the area a of the small region A2 in the frame image to the area b of the corresponding small region A2 in the reference image P1. Similarly, the area change rate with respect to the previous frame image in the imaging order can be obtained as the ratio of the area a of the small region A2 in the frame image to the area of the corresponding small region A2 in the previous frame image.
  • The area of each small region A2 can be obtained from the number of pixels counted within the small region A2, as in the sketch below.
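  • A minimal sketch of the area change rate calculation follows, assuming that each frame has a label map in which every pixel carries the index (0 to n_regions-1) of the small region A2 it belongs to; that representation is an assumption made for this example.

```python
import numpy as np

def region_areas(labels: np.ndarray, n_regions: int) -> np.ndarray:
    """Area (pixel count) of each small region A2 in one frame's label map."""
    return np.bincount(labels.ravel(), minlength=n_regions)

def area_change_rates(labels_frame: np.ndarray,
                      labels_reference: np.ndarray,
                      n_regions: int) -> np.ndarray:
    """Per-region ratio a/b of the area in the frame to the area in the reference image P1."""
    a = region_areas(labels_frame, n_regions).astype(float)
    b = region_areas(labels_reference, n_regions).astype(float)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(b > 0, a / b, np.nan)
```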
  • In step S202, it is determined whether the state of aeration is abnormal, mainly by examining the consistency of each small region A2 with the behavior of the surrounding small regions and the consistency of its area change rate over time. For example, for each small region A2 in each frame image, the area change rate with respect to the reference image P1 and the area change rate with respect to the previous frame image are each compared with the surroundings, with the paired left and right lung fields, and between the upper and lower lung fields.
  • Specifically, the area change rates of each small region A2 in each frame image (the area change rate with respect to the reference image P1 and the area change rate with respect to the previous frame image) are compared with the average of the area change rates of a plurality of surrounding small regions A2 at substantially the same vertical position within the same lung field (left or right lung field), and a small region A2 that deviates from a predetermined range (for example, the average value ±20%) is determined to be abnormal.
  • That is, the comparison is made with the average value of the area change rates of the plurality of small regions A2 in the region surrounded by the dotted line in FIG. 7, and abnormality is determined by whether or not the predetermined range is exceeded.
  • The region used for obtaining the average area change rate may span a plurality of rows.
  • For each small region A2 determined to be abnormal, it is then determined whether its area change (the amount of area increase or decrease) is too large or too small compared with the surroundings. For example, when the reference image P1 is the image with the largest lung field area, as in the present embodiment, this determination can be made according to the following criteria 1) and 2).
  • 1) When the region is determined to be abnormal based on the area change rate with respect to the reference image P1: if the area change rate is smaller than the average of the surrounding area change rates minus 20%, the area change is large (the change (reduction rate) from the reference image P1 is large); if the area change rate is larger than the average of the surrounding area change rates plus 20%, the area change is small (the change (reduction rate) from the reference image P1 is small).
  • 2) When the region is determined to be abnormal based on the area change rate with respect to the previous frame image, during exhalation: if the area change rate is smaller than the average of the surrounding area change rates minus 20%, the area change is large (the change (reduction rate) from the previous frame image is large); if the area change rate is larger than the average of the surrounding area change rates plus 20%, the area change is small (the change (reduction rate) from the previous frame image is small).
  • In this way, a small region A2 that behaves differently from its surroundings can be detected as an abnormal part, as sketched below.
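  • The comparison with the surrounding average and the ±20% criterion could look roughly like the sketch below; arranging the rates in a 2D grid whose rows correspond to vertical positions within one lung field is an assumption of this example.

```python
import numpy as np

def flag_against_row_average(rates: np.ndarray, tol: float = 0.20) -> np.ndarray:
    """Flag small regions whose area change rate deviates from the average of
    the other regions in the same row (same vertical position) by more than
    +/- tol, following the +/-20% criterion described in the text.

    `rates` is 2D: rows = vertical positions, columns = regions of one lung
    field at that height. Returns a boolean "abnormal" mask of the same shape.
    """
    abnormal = np.zeros_like(rates, dtype=bool)
    rows, cols = rates.shape
    for r in range(rows):
        for c in range(cols):
            others = np.delete(rates[r], c)     # surrounding regions in the row
            mean = np.nanmean(others)
            if np.isnan(mean) or np.isnan(rates[r, c]):
                continue
            if abs(rates[r, c] - mean) > tol * abs(mean):
                abnormal[r, c] = True
    return abnormal
```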
  • In the comparison between the left and right lung fields, the area change rates of the small regions A2 at substantially the same vertical position are averaged within each lung field, and the average values of the left and right lung field regions at substantially the same vertical position are compared with each other. When one average value deviates from a predetermined range with respect to the other (for example, the average value of the paired lung field ±20%), all the small regions A2 included in the compared left and right lung field regions are determined to be abnormal.
  • For example, the average area change rate of the small regions A2 included in the hatched region B1 shown in FIG. 8A is compared with the average area change rate of the small regions A2 included in the region B2, and when the difference exceeds the predetermined range, all the small regions A2 included in the regions B1 and B2 are determined to be abnormal. The region used for obtaining the average area change rate may span a plurality of rows.
  • Whether the area change is too large or too small can then be determined according to criteria corresponding to 1) and 2) above. 1) When the region is determined to be abnormal based on the area change rate with respect to the reference image P1: if the average area change rate is smaller than the average area change rate of the paired lung field region minus 20%, the area change is large (the change (reduction rate) from the reference image P1 is large); if the average area change rate is larger than the average area change rate of the paired lung field region plus 20%, the area change is small (the change (reduction rate) from the reference image P1 is small).
  • In the comparison between the upper and lower lung fields, the maximum area change rate (maximum area / minimum area) over the frame images of one respiratory cycle is calculated for each small region A2, the calculated maximum area change rates of the small regions A2 in each of the left and right lung field regions are averaged in the horizontal (row) direction, and the average of the maximum area change rates of the plurality of small regions A2 at substantially the same vertical position is obtained. A portion that contradicts the normal behavior that "the area change rate is smaller in the upper lung field" is determined to be abnormal (first example).
  • Alternatively, the average of the maximum area change rates at each vertical position is compared with a normal (standard) value of the maximum area change rate according to the vertical position from the lung bottom, and when it deviates from a predetermined range (for example, the normal value ±20%), all the small regions A2 at that vertical position are determined to be abnormal. This normal value is a function of the vertical position from the lung bottom and is determined in advance according to patient information (age, height, sex) and the imaging state (normal or deep breathing).
  • In addition, a portion whose area change start timing is delayed (the change timing is delayed relative to the lung bottom) is determined to be abnormal (second example). The area change start timing is the timing at which the area change rate with respect to the reference image P1 exceeds a predetermined value, and can be counted as the number of images from the reference image P1 to the frame image whose area change rate with respect to the reference image P1 first exceeds the predetermined value.
  • Specifically, the area change start timing at each vertical position is compared with a normal (standard) value of the area change start timing according to the vertical position from the lung bottom, and when it deviates from a predetermined range with respect to the normal value (for example, the normal value ±20%), all the small regions A2 at that vertical position are determined to be abnormal. This normal value is a function of the vertical position from the lung bottom and is determined according to patient information (age, height, sex) and the imaging state (normal or deep breathing).
  • The measured data (the timing at which the area change rate from the reference image P1 becomes equal to or greater than the predetermined value) may be normalized so that its respiratory cycle coincides with that of the normal value (measured timing / respiratory cycle at the time of frame image capture × respiratory cycle of the normal value) before the comparison, as in the sketch below.
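  • The area change start timing and its normalization to the respiratory cycle of the normal value can be sketched as follows; treating "exceeds a predetermined value" as a deviation of the change rate from 1.0 is an assumption of this example, not a detail given in the patent.

```python
import numpy as np

def change_start_index(rates_over_time: np.ndarray, threshold: float) -> int:
    """Number of frames from the reference image P1 until the area change rate
    first deviates from 1.0 by at least `threshold` (-1 if it never does)."""
    hits = np.flatnonzero(np.abs(1.0 - rates_over_time) >= threshold)
    return int(hits[0]) if hits.size else -1

def normalized_timing(measured_index: int,
                      measured_cycle_frames: float,
                      normal_cycle_frames: float) -> float:
    """Rescale the measured timing to the respiratory cycle of the normal value:
    measured timing / measured cycle x normal-value cycle."""
    return measured_index / measured_cycle_frames * normal_cycle_frames
```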
  • In the first example, information indicating the type of abnormality (for example, a code indicating "aeration: upper and lower lung fields (maximum area change rate)") and the position information of the small regions A2 determined to be abnormal (for example, the coordinates of the four vertices of each small region A2) are written in association with each other in the header area or the like of the reference image P1.
  • In the second example, the type of abnormality (for example, a code indicating "aeration: upper and lower lung fields (area change start timing)") and the position information of the small regions A2 determined to be abnormal (for example, the coordinates of the four vertices of each small region A2) are written in the header area or the like of the reference image P1.
  • That is, in the first example an abnormality is detected by comparing the upper and lower lung fields within the data of a single patient, and in the second example by comparing absolute values with normal (standard) values.
  • When the determination in step S202 is completed, the process proceeds to step S13 in FIG. 3, and a blood flow determination process is executed.
  • FIG. 9 shows a flow of blood flow determination processing. This process is realized by cooperation between the control unit 41 and the blood flow determination processing program stored in the storage unit 42.
  • First, a heart region is extracted from each frame image, the cardiac cycle is recognized from the shape change of the heart region, and each frame image is associated with a phase of the cardiac cycle (step S301).
  • Extraction of the heart region in each frame image is performed by the following process, for example.
  • First, the lung field is recognized, and the search region is limited based on the circumscribed rectangular region of the lung field region.
  • a density histogram is created from the signal value of each pixel in the search region, a threshold value is obtained by a discriminant analysis method or the like, and a region having a signal lower than the threshold value is extracted as a candidate region for the heart.
  • edge detection is performed within the candidate region, and the contour line of the heart region is extracted by tracking the maximum value of the differential value greater than or equal to a predetermined size.
  • the contour edge point search region is limited based on the shape of the approximate heart region so as not to track the background or the edge inside the heart.
  • For recognition of the cardiac cycle, heart contour templates are prepared for each state the ventricle can take in the cardiac cycle, such as a template for the timing when the ventricle changes from diastole to systole, a template for ventricular systole, a template for the timing when the ventricle changes from systole to diastole, and so on; for each ventricular state, the frame image that maximizes the correlation value with the corresponding template is recognized as the frame image showing that state.
  • Alternatively, an electrocardiograph may be attached to the patient at the time of imaging and electrocardiographic waveform data collected simultaneously with the radiation irradiation; based on the obtained electrocardiographic waveform, the cardiac state to which each frame image corresponds may be determined and written in advance in the header area of each frame image.
  • In step S301, after the heart region is extracted from each frame image, a series of frame images from a predetermined timing in the cardiac cycle to the next occurrence of the same timing (here, for example, from the timing at which the ventricle changes from diastole to systole until the next timing at which the ventricle changes from diastole to systole) is extracted as one cardiac cycle unit. A number identifying the cardiac cycle unit (here, a consecutive number starting from 1) is written in the header area of each frame image, and information indicating the corresponding timing is written at least for the frame image corresponding to the timing at which the ventricle changes from diastole to systole and the frame image corresponding to the timing at which the ventricle changes from systole to diastole in the cardiac cycle.
  • Next, 1 is set in the counter n (step S302), and warping processing of the frame image with imaging order (n+1) is executed (step S303).
  • the warping process is performed as follows, for example.
  • First, the shift values (Δx, Δy) between the center point (center pixel) of each small region A1 of the nth frame image and the corresponding center point of each small region A1 of the (n+1)th frame image are obtained.
  • Next, using the obtained shift values (Δx, Δy), a shift value (Δx, Δy) for every pixel of the (n+1)th frame image is calculated by approximation with a two-dimensional 10th-order polynomial. Then, based on the obtained per-pixel shift values (Δx, Δy), nonlinear distortion conversion processing (warping processing) is applied to the (n+1)th frame image, shifting each of its pixels, and a new frame image is created separately from the frame image before the warping process. The correspondence of each pixel before and after the warping process in each frame image is stored in the RAM.
  • The new frame image obtained by the warping process matches the nth frame image very well at corresponding pixels, as illustrated in the sketch below.
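  • A reduced sketch of the warping step is shown below. It fits a 2nd-order polynomial shift field instead of the 10th-order polynomial mentioned above and uses nearest-neighbour resampling; the function and argument names are illustrative only.

```python
import numpy as np

def _design(x: np.ndarray, y: np.ndarray) -> np.ndarray:
    # 2nd-order 2D polynomial basis (the text uses 10th order; reduced here)
    return np.stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2], axis=1)

def warp_frame(frame: np.ndarray,
               centers_xy: np.ndarray,   # (N, 2) control-point coordinates
               shifts_xy: np.ndarray     # (N, 2) matched (dx, dy) shifts
               ) -> np.ndarray:
    """Fit a smooth shift field to the control points and resample the frame."""
    A = _design(centers_xy[:, 0].astype(float), centers_xy[:, 1].astype(float))
    coef_dx, *_ = np.linalg.lstsq(A, shifts_xy[:, 0], rcond=None)
    coef_dy, *_ = np.linalg.lstsq(A, shifts_xy[:, 1], rcond=None)

    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A_full = _design(xs.ravel().astype(float), ys.ravel().astype(float))
    dx = (A_full @ coef_dx).reshape(h, w)
    dy = (A_full @ coef_dy).reshape(h, w)

    # Nearest-neighbour resampling at the shifted positions
    src_x = np.clip(np.rint(xs + dx).astype(int), 0, w - 1)
    src_y = np.clip(np.rint(ys + dy).astype(int), 0, h - 1)
    return frame[src_y, src_x]
```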
  • In step S304, it is determined whether (n+1) exceeds the number of frame images. If not (step S304; NO), the counter n is incremented by 1 (step S305) and the process returns to step S303. If (n+1) exceeds the number of frame images (step S304; YES), the process proceeds to step S306.
  • In step S306, the series of frame images is grouped into cardiac cycle units (by the identification numbers assigned in step S301).
  • Next, 1 is set in the counter n (step S307), and an averaged image is created by adding the corresponding pixel values of the warped frame images in the group with number n and dividing by the number of frame images belonging to that group (step S308).
  • a blood vessel region is extracted from the created averaged image (step S309).
  • For the extraction of the blood vessel region from the averaged image created in step S308, for example, a method of extracting linear structures using the maximum eigenvalue calculated from the Hessian matrix at each pixel can be applied ("Circular/linear pattern detection in medical images", IEICE Transactions D-II, Vol. J87-D-II, No. 1, pp. 175-185).
  • By using a band-division filter bank to generate each element of the Hessian matrix, linear structures can be extracted at each resolution level, so blood vessel regions of different sizes can be extracted.
  • With this method, the ribs are extracted together with the blood vessel shadows; as described above, the ribs are recognized using the rib shape model and removed from the extraction result. Furthermore, for the extraction of peripheral blood vessels, a method that uses a dendritic figure model, which can be deformed by stretching, merging, and dividing in accordance with the shape of the blood vessel shadows, as a model of the blood vessel structure can be used ("Automatic extraction procedure of blood vessel shadow from chest X-ray images using Deformable Model", MEDICAL IMAGING TECHNOLOGY, Vol. 17, No. 5, September 1999).
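  • The Hessian-eigenvalue line filter can be sketched as below. This is not the cited method itself: it uses Gaussian-derivative filters from SciPy, assumes vessel shadows are darker than the surrounding high-signal lung field, and replaces the band-division filter bank with a simple set of Gaussian scales.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def line_filter(image: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Respond to dark, line-like structures (vessel shadows) at scale sigma
    using the eigenvalues of the Gaussian-smoothed Hessian at each pixel."""
    img = image.astype(float)
    hxx = gaussian_filter(img, sigma, order=(0, 2))   # d2/dx2
    hyy = gaussian_filter(img, sigma, order=(2, 0))   # d2/dy2
    hxy = gaussian_filter(img, sigma, order=(1, 1))   # d2/dxdy

    # Larger eigenvalue of [[hxx, hxy], [hxy, hyy]] at every pixel
    tmp = np.sqrt(((hxx - hyy) / 2.0) ** 2 + hxy ** 2)
    lam_max = (hxx + hyy) / 2.0 + tmp
    # A dark line gives a strongly positive second derivative across it,
    # so the maximum eigenvalue is used as the line measure.
    return np.maximum(lam_max, 0.0)

def vessel_mask(image: np.ndarray,
                sigmas=(1.0, 2.0, 4.0),
                quantile: float = 0.95) -> np.ndarray:
    """Multi-scale response (vessels of different sizes), thresholded."""
    response = np.max([line_filter(image, s) for s in sigmas], axis=0)
    return response > np.quantile(response, quantile)
```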
  • Next, the density change amount of each pixel on the extracted blood vessel region is calculated and analyzed, and it is determined whether the state of blood flow in each small region A2 is abnormal (step S310).
  • Specifically, frame images of a predetermined period (from the timing at which the ventricle changes from diastole to systole to the timing at which it changes from systole to diastole) are extracted from the warped frame images of one cardiac cycle, and a reference image P2 is determined from them (here, the frame image at the timing when the ventricle changes from diastole to systole). A difference image is calculated by taking the difference of the signal values (density values) between corresponding pixels of the reference image P2 and each frame image, and the density change amount of each pixel is obtained.
  • For each pixel on the blood vessel region, the integrated value of the density change amount over the predetermined period is compared with a reference value determined in advance according to the imaging conditions (for example, tube voltage) and the blood vessel diameter; if it deviates from a predetermined range of the reference value, the blood flow state of the small region A2 containing that pixel is determined to be abnormal.
  • The blood vessel diameter in each image can be obtained from the number of pixels across the width of the extracted blood vessel region.
  • the reference value may be changed according to patient information (patient height, weight, age, sex).
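  • A simplified sketch of the integrated density change comparison is given below; the reference_value and tolerance arguments stand in for the values chosen from the imaging conditions and blood vessel diameter and are assumptions of this example. A small region A2 containing any flagged pixel would then be marked as abnormal.

```python
import numpy as np

def density_change_abnormal(frames: np.ndarray,      # (T, H, W) warped frames of the period
                            reference: np.ndarray,   # reference image P2
                            vessel_mask: np.ndarray, # boolean (H, W) vessel pixels
                            reference_value: float,  # expected integrated change (hypothetical)
                            tolerance: float = 0.20) -> np.ndarray:
    """Per-pixel abnormality of blood flow based on the integrated density change.

    The integrated absolute density change over the period is compared with a
    reference value (in practice chosen from tube voltage and vessel diameter);
    vessel pixels deviating by more than +/- tolerance are flagged.
    """
    diff = frames.astype(float) - reference.astype(float)   # difference images
    integrated = np.abs(diff).sum(axis=0)                   # integrate over the period
    deviates = np.abs(integrated - reference_value) > tolerance * reference_value
    return deviates & vessel_mask
```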
  • In addition, the density change timing of each pixel on each blood vessel region (the timing within the cardiac cycle at which the density change amount from the reference image P2 becomes equal to or greater than a predetermined value corresponding to the imaging conditions and blood vessel diameter) is compared with a normal (standard) value of the density change timing according to the distance from the heart and the blood vessel diameter; if it deviates from a predetermined range (for example, the normal value ±20%), the blood flow of the small region A2 containing that pixel is determined to be abnormal.
  • This normal value is a function of the distance from the heart and is determined according to patient information (age, height, sex) and imaging information (normal or deep breathing).
  • The measured data (the timing at which the density change amount from the reference image P2 becomes equal to or greater than the predetermined value) may be normalized so that its cardiac cycle coincides with that of the normal value (measured timing / cardiac cycle at the time of frame image capture × cardiac cycle of the normal value) before the comparison.
  • When an abnormality is detected, information indicating the type of abnormality (for example, a code indicating "blood flow: density change" or "blood flow: density change timing") and the position information of the small regions A2 determined to be abnormal (for example, the coordinates of their four vertices) are written in association with each other in the header area or the like of the reference image P2 of each cardiac cycle unit.
  • In the above description, an averaged image is created to extract the blood vessel region, and abnormality of the blood flow state of each small region A2 is determined from the density change of each pixel on the blood vessel region; however, without extracting the blood vessel region, it is also possible to simply calculate the average signal value within each small region A2 for each cardiac cycle unit and use the fluctuation of this average signal value as an index of blood flow. For example, normality or abnormality of the blood flow state is determined from the difference in the signal value change amount from the surrounding small regions and from shifts in the signal value change timing.
  • It is also possible to determine normality or abnormality of the blood flow state by measuring and analyzing not only the density change but also the blood vessel diameter. For example, normality or abnormality of blood flow can be determined by calculating the maximum blood vessel diameter in a predetermined region using a Gaussian-based Hessian matrix and comparing the obtained values in the vertical direction of the lung field ("Maximum blood vessel diameter ratio in upper and lower lung fields from chest X-ray images", MEDICAL IMAGING TECHNOLOGY, Vol. 19, No. 5, September 1999). Analysis based on density change and analysis based on blood vessel diameter can also be combined to determine normality or abnormality of blood flow.
  • When the analysis and determination for the group with number n are completed, it is determined whether n exceeds the number of cardiac cycle units. If not (step S311; NO), the counter n is incremented by 1 (step S312) and the process returns to step S307. If n exceeds the number of cardiac cycle units (step S311; YES), the results of the cardiac cycle units are integrated (step S313). For example, a small region A2 determined to be abnormal in at least one cardiac cycle unit is determined to be abnormal, and the integrated determination result is written, for example, in the reference image P2 (the frame image with imaging order 1). Then, this process ends, and the flow proceeds to step S14 in FIG. 3.
  • In step S14 of FIG. 3, an abnormality determination process is executed. Specifically, for each small region A2 in the series of frame images, the determination result of the aeration determination process and the determination result of the blood flow determination process are collated, and it is finally determined whether each small region A2 is abnormal based on the combination of the two results. For example, a small region A2 determined as "no abnormality" in the aeration determination process but determined as "abnormal because the change in blood flow density is smaller than the threshold value" in the blood flow determination process is determined to be an abnormal region suspected of a chronic lung condition.
  • Conversely, a small region A2 determined in the aeration determination process as "the area change is smaller than the surroundings" or "the area change is small compared with the paired lung field region" but determined as "no abnormality" in the blood flow determination process is determined to be an abnormal region suspected of chronic bronchitis.
  • The final determination result, specifically the position information (for example, the coordinates of the four vertices) of each small region A2 finally determined to be abnormal from the combination of the aeration determination result and the blood flow determination result, together with the suspected disease name and the like, is written in the header information of each frame image, as illustrated below.
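  • The combination logic can be illustrated with the following sketch; the RegionResult fields and the handling of combinations not spelled out in the text are assumptions of this example.

```python
from dataclasses import dataclass

@dataclass
class RegionResult:
    aeration_abnormal: bool            # result of the aeration determination process
    aeration_area_change_small: bool   # "area change small vs. surroundings / paired lung field"
    blood_flow_abnormal: bool          # result of the blood flow determination process
    blood_flow_change_small: bool      # "density change smaller than the threshold"

def final_determination(r: RegionResult) -> str:
    """Combine the two determinations as described in the text (simplified)."""
    if not r.aeration_abnormal and r.blood_flow_abnormal and r.blood_flow_change_small:
        return "abnormal: suspected chronic lung condition"
    if r.aeration_abnormal and r.aeration_area_change_small and not r.blood_flow_abnormal:
        return "abnormal: suspected chronic bronchitis"
    if r.aeration_abnormal or r.blood_flow_abnormal:
        return "abnormal"              # placeholder for combinations not detailed in the text
    return "no abnormality"
```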
  • a series of input image data (frame images) is transmitted to the image server 5 via the communication unit 43 (step S15), and this process ends.
  • When the image server 5 receives the series of image data of the dynamic image from the arithmetic device 4, it stores the received image data in its storage device. (Operation of the diagnostic console 3) Next, the operation of the diagnostic console 3 will be described.
  • In the diagnostic console 3, the display control process shown in FIG. 10 is executed through cooperation between the control unit 31 and the display control processing program stored in the storage unit 32.
  • First, an acquisition request for the series of image data (frame images) of the dynamic image with the input identification ID is transmitted to the image server 5 via the communication unit 35, and the series of image data (frame images) of the dynamic image to be displayed is acquired from the image server 5 (step S21).
  • Next, the aeration determination result, the blood flow determination result, and the final determination result attached to the header area of each frame image are acquired (step S22), and a determination result display screen is displayed on the display unit 34 based on the acquired aeration determination result, blood flow determination result, and final determination result (step S23); this process then ends.
  • FIG. 11A shows an example of the determination result display screen 341 displayed on the display unit 34 in step S23.
  • The determination result display screen 341 includes an aeration determination result display area 341a for displaying the aeration determination result, a blood flow determination result display area 341b for displaying the blood flow determination result, and a final determination result display area 341c for displaying the final determination result, so that the aeration determination result, the blood flow determination result, and the final determination result are displayed on a single screen.
  • In step S23, first, based on the aeration determination result, the blood flow determination result, and the final determination result acquired from each frame image, the small regions A2 determined to be abnormal by the aeration determination process, the small regions A2 determined to be abnormal by the blood flow determination process, and the small regions A2 finally determined to be abnormal by the abnormality determination process are extracted. Here, a small region A2 determined to be abnormal in any one of the series of frame images is extracted.
  • The reference image P1 with the small regions A2 drawn on it is displayed in each of the aeration determination result display area 341a, the blood flow determination result display area 341b, and the final determination result display area 341c. An annotation indicating the small regions A2 determined to be abnormal in the aeration determination process is superimposed on the reference image P1 displayed in the aeration determination result display area 341a, an annotation indicating the small regions A2 determined to be abnormal in the blood flow determination process is superimposed on the reference image P1 displayed in the blood flow determination result display area 341b, and an annotation indicating the small regions A2 determined to be abnormal in the abnormality determination process is superimposed on the reference image P1 displayed in the final determination result display area 341c.
  • As a method of displaying the annotation, for example, the small regions A2 determined to be abnormal are displayed with a thick frame, as shown by W1 and W2 in FIG. 11A. They may also be displayed in a different color. Furthermore, for a small region A2 determined to be abnormal, the color or appearance of the thick frame may be varied according to the number of frame images in which that small region A2 was determined to be abnormal, so that the doctor can confirm the degree of abnormality.
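  • Drawing the thick-frame annotation on an abnormal small region could be done roughly as follows (an illustrative NumPy sketch; the patent does not specify the implementation).

```python
import numpy as np

def draw_thick_frame(rgb: np.ndarray,
                     top_left: tuple,
                     bottom_right: tuple,
                     color=(255, 0, 0),
                     thickness: int = 3) -> np.ndarray:
    """Overlay a thick rectangular frame (annotation) on an RGB copy of the image."""
    out = rgb.copy()
    (y0, x0), (y1, x1) = top_left, bottom_right
    t = thickness
    out[y0:y0 + t, x0:x1] = color          # top edge
    out[y1 - t:y1, x0:x1] = color          # bottom edge
    out[y0:y1, x0:x0 + t] = color          # left edge
    out[y0:y1, x1 - t:x1] = color          # right edge
    return out
```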
  • The determination result display screen 341 is not limited to the screen shown in FIG. 11A. For example, the averaged image and the blood vessel extraction result for the averaged image may also be transmitted, and in the blood flow determination result display area 341b the annotation may be displayed on the small regions A2 determined to be abnormal on the averaged image in which the extracted blood vessels are emphasized.
  • Alternatively, the series of frame images of the dynamic image may be displayed by switching them sequentially in shooting order in each of the ventilation determination result display area 341a, the blood flow determination result display area 341b, and the final determination result display area 341c, so that the dynamic image is played back as a moving image; when each frame image is displayed, the annotation is shown in the small areas A2 determined to be abnormal in that frame image, thereby notifying the doctor of the timing at which the abnormality occurs.
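A frame-by-frame playback of that kind might look like the following sketch, reusing the hypothetical annotate_reference_image helper above; the window title, frame rate, and quit key are arbitrary choices and not behaviour specified by the patent.

```python
import cv2

# Illustrative sketch: cycle through the frames in shooting order and overlay,
# on each frame, only the small areas judged abnormal in that frame, so that
# the viewer can see when the abnormality appears.
def play_annotated_sequence(frames, per_frame_flags, block, fps=15):
    """frames: list of 8-bit grayscale images; per_frame_flags: list of
    0/1 maps of abnormal small areas, one map per frame."""
    delay_ms = int(1000 / fps)
    for frame, flags in zip(frames, per_frame_flags):
        shown = annotate_reference_image(frame, flags, block, n_frames=1)
        cv2.imshow("final determination result 341c", shown)
        if cv2.waitKey(delay_ms) & 0xFF == ord("q"):
            break
    cv2.destroyAllWindows()
```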
  • In the above example, the ventilation determination result and the blood flow determination result are displayed together with the final determination result, but it suffices to display at least the final determination result. For a small area determined to be abnormal, it is preferable to also display the name of the disease that is suspected.
  • Locations where both ventilation and blood flow are normal, locations where only ventilation is abnormal, locations where only blood flow is abnormal, and locations where both ventilation and blood flow are abnormal may also be displayed in different colors.
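Such a color scheme amounts to a lookup from the pair (ventilation abnormal, blood flow abnormal) to a display color; the particular BGR values below are arbitrary examples, not colors given in the patent.

```python
# Illustrative mapping from the combination of determinations to a display
# colour (BGR tuples, e.g. for use with OpenCV drawing functions).
COMBINATION_COLOURS = {
    (False, False): (0, 255, 0),    # ventilation and blood flow both normal
    (True,  False): (0, 255, 255),  # only ventilation abnormal
    (False, True):  (255, 0, 0),    # only blood flow abnormal
    (True,  True):  (0, 0, 255),    # both ventilation and blood flow abnormal
}

def area_colour(ventilation_abnormal: bool, bloodflow_abnormal: bool):
    return COMBINATION_COLOURS[(ventilation_abnormal, bloodflow_abnormal)]
```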
  • As described above, according to the dynamic imaging system 100, the arithmetic device 4 divides each of the plurality of frame images of the chest dynamic image captured by the imaging device 1 into a plurality of small regions, performs image analysis on each corresponding small region across the plurality of frame images, and executes a ventilation determination process for determining whether the ventilation state of each small region is abnormal, a blood flow determination process for determining whether the blood flow state of each small region is abnormal, and an abnormality determination process for determining whether each small region is abnormal based on the determination result of the ventilation determination process and the determination result of the blood flow determination process.
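The overall per-region flow can be summarized with the hedged sketch below. The block-averaging scheme, the peak-to-peak and inter-frame-difference features, the thresholds, and the rule that a region is finally abnormal when either determination is abnormal are illustrative assumptions; the patent describes the determinations only at the level of the preceding paragraph.

```python
import numpy as np

# Illustrative end-to-end sketch: divide each frame into small regions,
# analyse a per-region signal across frames, and produce ventilation,
# blood flow, and final abnormality determinations per region.
def divide_into_blocks(frame: np.ndarray, block: int) -> np.ndarray:
    """Mean pixel value of each block x block small region."""
    rows, cols = frame.shape[0] // block, frame.shape[1] // block
    return frame[:rows * block, :cols * block] \
        .reshape(rows, block, cols, block).mean(axis=(1, 3))

def determine_abnormality(frames, block,
                          ventilation_threshold, bloodflow_threshold):
    # Per-region mean density over time: shape (n_frames, rows, cols).
    signal = np.stack([divide_into_blocks(f, block) for f in frames])
    # Assumed features: peak-to-peak density change for ventilation,
    # mean absolute inter-frame difference for blood flow pulsation.
    ventilation_amp = signal.max(axis=0) - signal.min(axis=0)
    bloodflow_amp = np.abs(np.diff(signal, axis=0)).mean(axis=0)
    ventilation_abnormal = ventilation_amp < ventilation_threshold
    bloodflow_abnormal = bloodflow_amp < bloodflow_threshold
    # Assumed combination rule for the final determination.
    final_abnormal = ventilation_abnormal | bloodflow_abnormal
    return ventilation_abnormal, bloodflow_abnormal, final_abnormal
```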
  • The diagnostic console 3 then displays at least the determination result of the abnormality determination process. It is therefore possible to provide diagnosis support information that takes into account not only the state of ventilation but also the state of blood flow associated with respiration.
  • In addition, since warping processing is performed on the plurality of frame images to match the shape of the lung field region between the frames before the blood vessel region is extracted, the patient does not need to hold his or her breath during imaging in order to keep the lung field position matched between frames.
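One concrete, though not necessarily the patented, way to perform such warping is dense optical-flow registration toward a reference frame, sketched below with OpenCV; it assumes 8-bit grayscale frames and uses arbitrary Farneback parameters.

```python
import cv2
import numpy as np

# Illustrative sketch: estimate dense optical flow from a reference frame to
# another frame and warp that frame so the lung field shapes line up before
# the blood vessel region is extracted.
def warp_to_reference(reference: np.ndarray, frame: np.ndarray) -> np.ndarray:
    """reference, frame: 8-bit single-channel images of the same size."""
    flow = cv2.calcOpticalFlowFarneback(reference, frame, None,
                                        0.5, 3, 31, 3, 5, 1.1, 0)
    h, w = frame.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow[..., 0]).astype(np.float32)
    map_y = (grid_y + flow[..., 1]).astype(np.float32)
    # Sample `frame` at the positions that correspond to each reference pixel.
    return cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)
```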
  • Moreover, since the annotation is displayed in the small regions determined to be abnormal by the abnormality determination process, the doctor can easily recognize the portions determined to be abnormal.
  • Furthermore, by displaying the annotation in the small regions determined to be abnormal by the abnormality determination process in each frame image displayed as a moving image, the doctor can easily recognize the timing at which the abnormality was determined.
  • For the blood flow determination result, the portions determined to be abnormal may be displayed in units of cardiac cycles.
  • In the above description, a hard disk, a semiconductor nonvolatile memory, or the like is used as the computer-readable medium for the program according to the present invention, but the present invention is not limited to this example; a portable recording medium such as a CD-ROM can also be applied. A carrier wave may also be applied as a medium for providing the program data according to the present invention via a communication line.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physiology (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Cardiology (AREA)
  • Hematology (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Measuring Pulse, Heart Rate, Blood Pressure Or Blood Flow (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a dynamic diagnostic imaging support system intended to provide diagnosis support information to which the state of the blood flow associated with respiration is added. An arithmetic unit (4) of a dynamic imaging system (100) divides each of a plurality of frame images of dynamic chest images captured by an imaging device (1) into a plurality of small regions, performs image analysis on each corresponding small region among the frame images, and carries out a ventilation determination process for determining whether the ventilation state of each small region is abnormal, a blood flow determination process for determining whether the blood flow state of each small region is abnormal, and an abnormality determination process for determining whether each small region is abnormal on the basis of the determination results obtained from the ventilation determination process and the blood flow determination process. A diagnostic console (3) then displays at least the determination result obtained from the abnormality determination process.
PCT/JP2009/050026 2008-01-15 2009-01-06 Système de support pour imagerie diagnostique dynamique WO2009090894A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009549997A JP5136562B2 (ja) 2008-01-15 2009-01-06 動態画像診断支援システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2008005678 2008-01-15
JP2008-005678 2008-01-15

Publications (1)

Publication Number Publication Date
WO2009090894A1 true WO2009090894A1 (fr) 2009-07-23

Family

ID=40885285

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/050026 WO2009090894A1 (fr) 2008-01-15 2009-01-06 Système de support pour imagerie diagnostique dynamique

Country Status (2)

Country Link
JP (5) JP5136562B2 (fr)
WO (1) WO2009090894A1 (fr)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011092982A1 (fr) * 2010-02-01 2011-08-04 コニカミノルタエムジー株式会社 Système et programme dynamiques de traitement d'images
US20120130238A1 (en) * 2010-11-22 2012-05-24 Konica Minolta Medical & Graphic, Inc. Dynamic diagnosis support information generation system
CN102793551A (zh) * 2011-05-24 2012-11-28 柯尼卡美能达医疗印刷器材株式会社 胸部诊断辅助信息生成系统
CN104887258A (zh) * 2010-08-27 2015-09-09 柯尼卡美能达医疗印刷器材株式会社 诊断支援系统
EP3123941A1 (fr) 2015-07-17 2017-02-01 Konica Minolta, Inc. Appareil de capture d'images radiographiques et système de capture d'images radiographiques
JP2017225475A (ja) * 2016-06-20 2017-12-28 コニカミノルタ株式会社 放射線画像処理システムおよび放射線画像処理装置
JP2018000926A (ja) * 2016-06-29 2018-01-11 コニカミノルタ株式会社 動態解析システム
US10026188B2 (en) 2015-05-25 2018-07-17 Konica Minolta, Inc. Dynamic analysis system
JP2019010135A (ja) * 2017-06-29 2019-01-24 コニカミノルタ株式会社 動態解析装置及び動態解析システム
CN109414218A (zh) * 2016-03-04 2019-03-01 4Dx有限公司 成像方法及系统
JP2019072342A (ja) * 2017-10-18 2019-05-16 キヤノンメディカルシステムズ株式会社 医用画像処理装置、x線診断装置、及び医用画像処理プログラム
JP2020062394A (ja) * 2019-10-02 2020-04-23 コニカミノルタ株式会社 画像処理装置
JP7092218B1 (ja) 2021-01-18 2022-06-28 コニカミノルタ株式会社 医療情報管理装置及び医療情報管理プログラム
US11935234B2 (en) 2019-05-22 2024-03-19 Panasonic Corporation Method for detecting abnormality, non-transitory computer-readable recording medium storing program for detecting abnormality, abnormality detection apparatus, server apparatus, and method for processing information

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6128691B2 (ja) 2014-07-10 2017-05-17 富士フイルム株式会社 医用画像計測装置および方法並びにプログラム
JP6611660B2 (ja) * 2016-04-13 2019-11-27 富士フイルム株式会社 画像位置合わせ装置および方法並びにプログラム
US9947093B2 (en) * 2016-05-03 2018-04-17 Konica Minolta, Inc. Dynamic analysis apparatus and dynamic analysis system
JP2017213287A (ja) 2016-06-02 2017-12-07 コニカミノルタ株式会社 解析装置及び解析システム
WO2018035465A1 (fr) 2016-08-18 2018-02-22 William Beaumont Hospital Système et procédé pour déterminer un changement de masse sanguine induit par la respiration à partir d'une tomographie informatisée 4d
JP2018068814A (ja) * 2016-11-01 2018-05-10 国立大学法人東北大学 画像処理装置、画像処理方法、及び、画像処理プログラム
JP6740910B2 (ja) * 2017-01-13 2020-08-19 コニカミノルタ株式会社 動態画像処理システム
JP2018157884A (ja) * 2017-03-22 2018-10-11 コニカミノルタ株式会社 X線動画像処理装置
JP7066476B2 (ja) * 2017-03-28 2022-05-13 キヤノンメディカルシステムズ株式会社 医用画像処理装置、医用画像処理方法及びx線診断装置
JP7073661B2 (ja) * 2017-09-27 2022-05-24 コニカミノルタ株式会社 動態解析装置及び動態解析システム
CA3087702A1 (fr) * 2018-01-05 2019-07-11 Radwisp Pte. Ltd. Programme de soutien de diagnostic informatique et methode de soutien de diagnostic
JP7480997B2 (ja) 2020-08-17 2024-05-10 国立大学法人旭川医科大学 画像処理装置、画像処理方法、およびプログラム

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006239195A (ja) * 2005-03-04 2006-09-14 Fuji Photo Film Co Ltd コンピュータによる画像診断支援方法および画像診断支援装置ならびにプログラム
WO2007078012A1 (fr) * 2006-01-05 2007-07-12 National University Corporation Kanazawa University Dispositif d’examen de detection d’image radiologique en continu, programme et support d’enregistrement

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CS224317B1 (cs) * 1981-03-13 1984-01-16 Cyril Doc Dr Csc Simecek Zařízení pro hodnocení distribuce ventilace plic
JPH07194583A (ja) * 1993-12-29 1995-08-01 Toshiba Corp X線診断装置
JP2001157667A (ja) * 1999-12-02 2001-06-12 Fuji Photo Film Co Ltd 画像表示方法および画像表示装置
JP2001157675A (ja) * 1999-12-02 2001-06-12 Fuji Photo Film Co Ltd 画像表示方法および画像表示装置
US6738063B2 (en) * 2002-02-07 2004-05-18 Siemens Corporate Research, Inc. Object-correspondence identification without full volume registration
JP2004321390A (ja) * 2003-04-23 2004-11-18 Toshiba Corp X線画像診断装置及びx線画像診断方法
JP4560643B2 (ja) * 2003-06-17 2010-10-13 株式会社Aze 呼吸気ct画像による換気分布計測方法
JP2005270201A (ja) * 2004-03-23 2005-10-06 Fuji Photo Film Co Ltd X線撮影装置
US7668357B2 (en) * 2005-10-17 2010-02-23 Stanford University Method and system for using computed tomography to test pulmonary function
WO2007111207A1 (fr) * 2006-03-16 2007-10-04 Osaka University Procédé de mesure de spirogramme tri-dimensionnel et programme informatique

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006239195A (ja) * 2005-03-04 2006-09-14 Fuji Photo Film Co Ltd コンピュータによる画像診断支援方法および画像診断支援装置ならびにプログラム
WO2007078012A1 (fr) * 2006-01-05 2007-07-12 National University Corporation Kanazawa University Dispositif d’examen de detection d’image radiologique en continu, programme et support d’enregistrement

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011092982A1 (fr) * 2010-02-01 2011-08-04 コニカミノルタエムジー株式会社 Système et programme dynamiques de traitement d'images
CN104887258A (zh) * 2010-08-27 2015-09-09 柯尼卡美能达医疗印刷器材株式会社 诊断支援系统
US20120130238A1 (en) * 2010-11-22 2012-05-24 Konica Minolta Medical & Graphic, Inc. Dynamic diagnosis support information generation system
CN102551755A (zh) * 2010-11-22 2012-07-11 柯尼卡美能达医疗印刷器材株式会社 动态诊断辅助信息生成系统
US9326745B2 (en) 2010-11-22 2016-05-03 Konica Minolta, Inc. Dynamic diagnosis support information generation system
US10278661B2 (en) 2010-11-22 2019-05-07 Konica Minolta, Inc. Dynamic diagnosis support information generation system
CN102793551A (zh) * 2011-05-24 2012-11-28 柯尼卡美能达医疗印刷器材株式会社 胸部诊断辅助信息生成系统
JP2012239796A (ja) * 2011-05-24 2012-12-10 Konica Minolta Medical & Graphic Inc 胸部診断支援情報生成システム
CN104188676A (zh) * 2011-05-24 2014-12-10 柯尼卡美能达医疗印刷器材株式会社 胸部诊断辅助信息生成系统
CN104188682A (zh) * 2011-05-24 2014-12-10 柯尼卡美能达医疗印刷器材株式会社 胸部诊断辅助信息生成系统
CN102793551B (zh) * 2011-05-24 2015-03-25 柯尼卡美能达医疗印刷器材株式会社 胸部诊断辅助信息生成系统
US9198628B2 (en) 2011-05-24 2015-12-01 Konica Minolta, Inc. Chest diagnostic support information generation system
US10026188B2 (en) 2015-05-25 2018-07-17 Konica Minolta, Inc. Dynamic analysis system
US10679356B2 (en) 2015-05-25 2020-06-09 Konica Minolta, Inc. Dynamic analysis system
US10206647B2 (en) 2015-07-17 2019-02-19 Konica Minolta, Inc. Radiographic image capturing device and radiographic image capturing system
EP3123941A1 (fr) 2015-07-17 2017-02-01 Konica Minolta, Inc. Appareil de capture d'images radiographiques et système de capture d'images radiographiques
EP3422939A4 (fr) * 2016-03-04 2020-04-15 4DX Limited Procédé et système d'imagerie
CN109414218A (zh) * 2016-03-04 2019-03-01 4Dx有限公司 成像方法及系统
EP4181060A1 (fr) * 2016-03-04 2023-05-17 4DMedical Limited Estimation d'un rapport ventilation/perfusion à partir d'au moins une image pulmonaire in vivo
JP2017225475A (ja) * 2016-06-20 2017-12-28 コニカミノルタ株式会社 放射線画像処理システムおよび放射線画像処理装置
US10223790B2 (en) 2016-06-29 2019-03-05 Konica Minolta, Inc. Dynamic analysis system
JP2018000926A (ja) * 2016-06-29 2018-01-11 コニカミノルタ株式会社 動態解析システム
JP2019010135A (ja) * 2017-06-29 2019-01-24 コニカミノルタ株式会社 動態解析装置及び動態解析システム
JP2019072342A (ja) * 2017-10-18 2019-05-16 キヤノンメディカルシステムズ株式会社 医用画像処理装置、x線診断装置、及び医用画像処理プログラム
JP7000110B2 (ja) 2017-10-18 2022-02-10 キヤノンメディカルシステムズ株式会社 医用画像処理装置、x線診断装置、及び医用画像処理プログラム
US11935234B2 (en) 2019-05-22 2024-03-19 Panasonic Corporation Method for detecting abnormality, non-transitory computer-readable recording medium storing program for detecting abnormality, abnormality detection apparatus, server apparatus, and method for processing information
JP2020062394A (ja) * 2019-10-02 2020-04-23 コニカミノルタ株式会社 画像処理装置
JP7092218B1 (ja) 2021-01-18 2022-06-28 コニカミノルタ株式会社 医療情報管理装置及び医療情報管理プログラム
JP2022110425A (ja) * 2021-01-18 2022-07-29 コニカミノルタ株式会社 医療情報管理装置及び医療情報管理プログラム

Also Published As

Publication number Publication date
JP2017124325A (ja) 2017-07-20
JP5445662B2 (ja) 2014-03-19
JPWO2009090894A1 (ja) 2011-05-26
JP2015226852A (ja) 2015-12-17
JP6436182B2 (ja) 2018-12-12
JP5136562B2 (ja) 2013-02-06
JP2013039427A (ja) 2013-02-28
JP6135733B2 (ja) 2017-05-31
JP5858031B2 (ja) 2016-02-10
JP2014050756A (ja) 2014-03-20

Similar Documents

Publication Publication Date Title
JP6436182B2 (ja) 動態画像解析装置
JP6597548B2 (ja) 動態解析システム
JP6772873B2 (ja) 動態解析装置及び動態解析システム
JP6217241B2 (ja) 胸部診断支援システム
WO2012026145A1 (fr) Système et programme d'assistance de diagnostic
JP5919717B2 (ja) 動態医用画像生成システム
JP5521392B2 (ja) 動態画像診断支援システム及びプログラム
JP6743662B2 (ja) 動態画像処理システム
JP6418091B2 (ja) 胸部画像表示システム及び画像処理装置
JP2017176202A (ja) 動態解析システム
JP5109534B2 (ja) 放射線画像撮影システム及び動態用放射線画像撮影支援装置
JP6740910B2 (ja) 動態画像処理システム
JP2017169830A (ja) 動態解析装置
JP5910646B2 (ja) 放射線画像撮影システム
JP2009148336A (ja) 動態画像診断支援システム
JP7047806B2 (ja) 動態解析装置、動態解析システム、動態解析プログラム及び動態解析方法
JP2013013737A (ja) 放射線画像撮影システム
JP6962030B2 (ja) 動態解析装置、動態解析システム、動態解析プログラム及び動態解析方法
JP2015134168A (ja) 放射線画像撮影システム
JP6888721B2 (ja) 動態画像処理装置、動態画像処理プログラム及び動態画像処理方法
JP7255319B2 (ja) 動態解析装置、動態解析システム及びプログラム
JP2009273603A (ja) 動態画像撮影システム
JP2017217047A (ja) 画像表示システム
JP2010114613A (ja) 動態画像処理方法、動態画像処理システム
WO2011093221A1 (fr) Système et programme de traitement d'image dynamique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09702679

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2009549997

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09702679

Country of ref document: EP

Kind code of ref document: A1