WO2011092982A1 - Dynamic image processing system and program - Google Patents


Info

Publication number
WO2011092982A1
Authority
WO
WIPO (PCT)
Prior art keywords
blood vessel
region
vessel region
unit
lung field
Prior art date
Application number
PCT/JP2010/073138
Other languages
French (fr)
Japanese (ja)
Inventor
Shintaro Muraoka (慎太郎 村岡)
Tetsuo Shimada (哲雄 島田)
Sho Noji (翔 野地)
Original Assignee
Konica Minolta Medical & Graphic, Inc. (コニカミノルタエムジー株式会社)
Priority date
Filing date
Publication date
Application filed by Konica Minolta Medical & Graphic, Inc.
Publication of WO2011092982A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/44 Constructional features of apparatus for radiation diagnosis
    • A61B 6/4429 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units
    • A61B 6/4452 Constructional features of apparatus for radiation diagnosis related to the mounting of source units and detector units, the source unit and the detector unit being able to move relative to each other
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/50 Apparatus or devices for radiation diagnosis specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B 6/504 Apparatus or devices for radiation diagnosis specially adapted for diagnosis of blood vessels, e.g. by angiography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/50 Apparatus or devices for radiation diagnosis specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B 6/507 Apparatus or devices for radiation diagnosis specially adapted for determination of haemodynamic parameters, e.g. perfusion CT
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B 6/5217 Devices using data or image processing specially adapted for radiation diagnosis involving extracting a diagnostic or physiological parameter from medical diagnostic data
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/30 ICT specially adapted for calculating health indices; for individual health risk assessment

Definitions

  • the present invention relates to a dynamic image processing system and a program.
  • In recent years, dynamic images of the chest have been captured using a semiconductor image sensor such as an FPD (flat panel detector), and attempts have been made to apply them to diagnosis.
  • Specifically, pulsed radiation is emitted continuously from the radiation source in synchronization with the reading/erasing timing of the semiconductor image sensor, and multiple exposures per second are taken to capture the dynamics of the chest.
  • a doctor can observe a series of movements of the chest accompanying respiratory motion, heart beats, and the like.
  • Patent Document 1 describes a technique for displaying a moving image from which bones and blood vessels are removed by capturing adjacent frame images with X-rays having different energies and subjecting them to subtraction processing.
  • However, because of signal changes on the large pulmonary vessels in which the blood flowing through the peripheral vessels is concentrated (for example, pulmonary arteries such as the right interlobar pulmonary artery, or pulmonary veins such as the right lower pulmonary vein), the blood flow volume of the peripheral blood vessels is calculated to be larger than the actual blood flow.
  • An object of the present invention is to enable accurate calculation of information on blood flow flowing through peripheral blood vessels in a lung field region.
  • In order to solve the above problem, the dynamic image processing system of the invention described in claim 1 comprises: lung field region extraction means for extracting a lung field region from each of a plurality of frame images showing the dynamics of the chest; blood vessel region recognition means for performing blood vessel region recognition processing in the extracted lung field region of each frame image; blood vessel region removal means for removing, from each frame image, the blood vessel region recognized by the blood vessel region recognition means; division means for dividing the lung field region into a plurality of block regions in each frame image from which the blood vessel region has been removed; calculation means for calculating the amount of change in the pixel signal value of each block region between the frame images from which the blood vessel region has been removed; and blood flow information calculation means for calculating blood flow information of the peripheral blood vessels present in each block region based on the calculated amount of change in the pixel signal value of each block region.
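As a rough sketch of the claimed processing flow, the following Python fragment (illustrative only: all function and variable names are assumptions, and the "blood flow information" here is simplified to the maximum inter-frame change of the summed block signal) shows how vessel removal, block division, and per-block signal-change calculation could fit together:

```python
import numpy as np

def blood_flow_per_block(frames, lung_mask, vessel_mask, block=4):
    """Sketch of the claim-1 pipeline: remove the vessel region, divide the
    lung field into block regions, and use inter-frame signal change as a
    surrogate for peripheral blood flow. All names are illustrative."""
    frames = frames.astype(float).copy()
    nonvessel = lung_mask & ~vessel_mask
    # Vessel removal: replace vessel pixels with the mean of non-vessel lung pixels
    for f in frames:
        f[vessel_mask] = f[nonvessel].mean() if vessel_mask.any() else 0.0
    h, w = frames[0].shape
    flow = np.zeros((h // block, w // block))
    for by in range(h // block):
        for bx in range(w // block):
            rows = slice(by * block, (by + 1) * block)
            cols = slice(bx * block, (bx + 1) * block)
            if not lung_mask[rows, cols].any():
                continue
            # Summed pixel signal value of this block region, per frame
            series = frames[:, rows, cols].sum(axis=(1, 2))
            # "Blood flow information": max inter-frame change of the block signal
            flow[by, bx] = np.abs(np.diff(series)).max()
    return flow
```

A usage sketch: given 3 frames of an 8x8 image in which every lung pixel brightens by 1 in the middle frame, each 4x4 block reports a change of 16.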
  • The invention according to claim 2 is the invention according to claim 1, further comprising operation means for an operator to designate a part of the blood vessel region to be recognized by the blood vessel region recognition means,
  • wherein the blood vessel region recognition means performs the blood vessel region recognition processing based on the region designated via the operation means.
  • The invention according to claim 3 is the invention according to claim 1, further comprising operation means for an operator to designate, within the extracted lung field region, a region in which the blood vessel region recognition means performs blood vessel recognition processing,
  • wherein the blood vessel region recognition means performs the blood vessel region recognition processing only in the region designated via the operation means.
  • The invention according to claim 4 is the invention according to any one of claims 1 to 3, further comprising display means for displaying, on a display unit, the blood flow information calculated by the blood flow information calculation means.
  • The invention according to claim 5 is the invention according to claim 4, further comprising determination means for determining, based on the blood flow information of the peripheral blood vessels present in each block region, whether the blood flow of the peripheral blood vessels in each block region is abnormal,
  • wherein the display means displays, on the display unit, the position of any block region determined to be abnormal by the determination means so that it is identifiable on the frame images displayed as a moving image.
  • the invention according to claim 6 is the invention according to any one of claims 1 to 5,
  • wherein the blood vessel region removal means replaces the pixel signal values of the recognized blood vessel region with values determined based on the pixel signal values of nearby non-blood-vessel regions.
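Claim 6's replacement of vessel pixels by values derived from nearby non-vessel pixels could be sketched as follows (a hypothetical windowed-mean variant; the patent does not specify the exact interpolation, so the window shape and size are assumptions):

```python
import numpy as np

def remove_vessel_region(image, vessel_mask, radius=2):
    """Claim-6 style vessel removal (illustrative sketch): each vessel pixel
    is replaced by a value derived from nearby non-vessel pixels, here the
    mean of non-vessel pixels within a (2*radius+1)-square window."""
    out = image.astype(float).copy()
    h, w = image.shape
    ys, xs = np.nonzero(vessel_mask)
    for y, x in zip(ys, xs):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        window = image[y0:y1, x0:x1]
        wmask = ~vessel_mask[y0:y1, x0:x1]
        if wmask.any():
            out[y, x] = window[wmask].mean()  # nearby non-vessel average
    return out
```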
  • The program of the invention described in claim 7 causes a computer to function as: lung field region extraction means for extracting a lung field region from each of a plurality of frame images showing the dynamics of the chest; blood vessel region recognition means for performing blood vessel region recognition processing in the extracted lung field region of each frame image; blood vessel region removal means for removing, from each frame image, the blood vessel region recognized by the blood vessel region recognition means; division means for dividing the lung field region into a plurality of block regions in each frame image from which the blood vessel region has been removed; calculation means for calculating the amount of change in the pixel signal value of each block region between the frame images from which the blood vessel region has been removed; and blood flow information calculation means for calculating blood flow information of the peripheral blood vessels present in each block region based on the calculated amount of change.
  • the invention according to claim 8 is the invention according to claim 7,
  • FIG. 1 is a diagram showing the overall configuration of the dynamic image processing system in an embodiment of the present invention. Further figures show a detailed configuration example of the radiation generation device and radiation detection unit of FIG. 1, and an example of the patient fixing unit.
  • FIG. 5 is a diagram illustrating an example of a GUI for designating a target region for the blood vessel region recognition process in FIG. 4.
  • FIG. 5 is a diagram for explaining an example of recognizing a blood vessel region in response to an operator's operation in the blood vessel region recognition processing of FIG. 4. Further figures illustrate the blood vessel region removal processing, show an example of images before and after that processing, show a display example of the blood flow information in step S19, and show the temporal change of the summed signal value in a certain block region.
  • FIG. 1 shows the overall configuration of a dynamic image processing system 100 in the present embodiment.
  • In the dynamic image processing system 100, an imaging apparatus 1 and an imaging console 2 are connected by a communication cable or the like, and the imaging console 2 and a diagnostic console 3 are connected via a communication network NT such as a LAN (Local Area Network).
  • Each device constituting the dynamic image processing system 100 conforms to the DICOM (Digital-Image-and-Communications-in-Medicine) standard, and communication between the devices is performed according to DICOM.
  • The imaging apparatus 1 images the periodic (cyclic) dynamics of the chest, such as the morphological changes of lung expansion and contraction accompanying respiratory motion, and the pulsation of the heart.
  • Dynamic imaging is performed by continuously irradiating the chest of a human body with radiation such as X-rays in a pulse manner to acquire a plurality of images (that is, continuous imaging).
  • a series of images obtained by this continuous shooting is called a dynamic image.
  • Each of the plurality of images constituting the dynamic image is called a frame image.
  • the imaging apparatus 1 includes a radiation generation device 11, a radiation irradiation control device 12, a radiation detection unit 13, a reading control device 14, and the like.
  • the radiation generation apparatus 11 includes a radiation source 111, a radiation source holding unit 112, a support base shaft 113, and the like.
  • the radiation source 111 is disposed at a position facing the radiation detector 131 across the subject M, and irradiates the subject M with radiation (X-rays) under the control of the radiation irradiation control device 12.
  • The radiation source 111 is held by the radiation source holding unit 112 so as to be movable up and down along the support base shaft 113. At the time of imaging, a drive mechanism (not shown) adjusts, under the control of the radiation irradiation control device 12, the height from the floor to the focal position of the radiation source 111 so that it equals the height from the floor to the center of the radiation detector 131.
  • the distance between the radiation source 111 and the radiation detector 131 is preferably 2 m or more.
  • the radiation irradiation control device 12 is connected to the imaging console 2 and controls the radiation generator 11 based on the radiation irradiation conditions input from the imaging console 2 to perform radiation imaging.
  • The radiation irradiation conditions input from the imaging console 2 include, for example, the pulse rate, pulse width, and pulse interval during continuous irradiation, the imaging start/end timing, the X-ray tube current value, the X-ray tube voltage value, and the filter type.
  • the pulse rate is the number of times of radiation irradiation per second, and matches the frame rate described later.
  • the pulse width is a radiation irradiation time per one irradiation.
  • the pulse interval is the time from the start of one radiation irradiation to the start of the next radiation irradiation in continuous imaging, and coincides with a frame interval described later.
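The stated relations (pulse rate equals the frame rate, pulse interval equals the frame interval) imply that the interval is the reciprocal of the rate, and that each pulse width must fit within one interval. A tiny illustrative check (function name and the consistency criterion are assumptions, not from the patent):

```python
def valid_pulse_timing(pulse_rate_per_s, pulse_width_ms):
    """Return the pulse interval in ms (reciprocal of the pulse rate) and
    whether the given pulse width fits within one interval."""
    interval_ms = 1000.0 / pulse_rate_per_s
    return interval_ms, pulse_width_ms < interval_ms
```

For example, at 15 pulses per second the interval is about 66.7 ms, so a 10 ms pulse width is consistent.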
  • The radiation irradiation control device 12 also controls each part of the radiation generation device 11 so that the height from the floor to the focal position of the radiation source 111 matches the height from the floor to the center of the radiation detector 131 output from the reading control device 14.
  • the radiation detection unit 13 includes a radiation detector 131, a detector holding unit 132, a support base 133, a patient fixing unit 134, and the like.
  • the radiation detector 131 is composed of a semiconductor image sensor such as an FPD.
  • The FPD has, for example, a glass substrate on which a plurality of pixels are arranged in a matrix; the pixels detect, according to its intensity, the radiation that has been emitted from the radiation source 111 and has passed through at least the subject M, and convert and store the detected radiation as electrical signals.
  • Each pixel includes a switching unit such as a TFT (Thin-Film-Transistor).
  • an indirect type FPD having a scintillator that converts radiation into light may be used.
  • the radiation detector 131 is held by a detector holding unit 132 so as to be movable up and down along the support base shaft 133.
  • The position (height from the floor surface) of the detector holding unit 132 holding the radiation detector 131 can be adjusted to the height of the subject M by operating a foot switch (not shown).
  • the detector holding part 132 is provided with a patient fixing part 134.
  • The patient fixing unit 134 has at least an upper left fixing part 134a, an upper right fixing part 134b, a lower left fixing part 134c, and a lower right fixing part 134d.
  • The upper left fixing part 134a is configured to be movable parallel to the radiation irradiation surface of the radiation detector 131, from the left end of the radiation detector 131 toward the right end; the same applies to the lower left fixing part 134c.
  • The upper right fixing part 134b is configured to move from the right end of the radiation detector 131 toward the left end, parallel to the radiation irradiation surface, by the same distance as the moving distance of the upper left fixing part 134a.
  • The position of the subject M can be fixed at the center of the radiation detector 131 by sandwiching the subject M between the upper left fixing part 134a and the upper right fixing part 134b, and between the lower left fixing part 134c and the lower right fixing part 134d.
  • This improves positioning reproducibility when imaging at different times for follow-up observation.
  • The upper left fixing part 134a and the upper right fixing part 134b are preferably disposed a predetermined distance below the upper end of the radiation detector 131, at positions that leave a region of the radiation detector 131 directly irradiated with radiation (a direct X-ray region) for the time-varying density correction described later.
  • the reading control device 14 is connected to the imaging console 2.
  • the reading control device 14 controls the switching unit of each pixel of the radiation detector 131 based on the image reading condition input from the imaging console 2 to switch the reading of the electrical signal accumulated in each pixel.
  • the image data is acquired by reading the electric signal accumulated in the radiation detector 131.
  • This image data is a frame image.
  • the reading control device 14 outputs the acquired frame image to the photographing console 2.
  • the image reading conditions are, for example, a frame rate, a frame interval, a pixel size, an image size (matrix size), and the like.
  • the frame rate is the number of frame images acquired per second and matches the pulse rate.
  • the frame interval is the time from the start of one frame image acquisition operation to the start of the next frame image acquisition operation in continuous shooting, and coincides with the pulse interval.
  • The radiation irradiation control device 12 and the reading control device 14 are connected to each other and exchange synchronization signals so as to synchronize the radiation irradiation operation with the series of image reading operations (reset, accumulation, data reading, reset). If necessary, dark image reading (from reset to reading) may be performed before or after imaging and included in the synchronization. The reading control device 14 also outputs height information (the output value of the distance measuring sensor SE1) indicating the height from the floor to the center of the radiation detector 131 to the radiation irradiation control device 12, so that this height and the height from the floor to the focal position of the radiation source 111 are made to coincide.
  • The imaging console 2 outputs the radiation irradiation conditions and image reading conditions to the imaging apparatus 1 to control its radiation imaging and radiographic image reading operations, and displays the dynamic images acquired by the imaging apparatus 1 so that the positioning can be confirmed and the suitability of the images for diagnosis can be checked.
  • the photographing console 2 includes a control unit 21, a storage unit 22, an operation unit 23, a display unit 24, and a communication unit 25, and each unit is connected by a bus 26.
  • the control unit 21 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), and the like.
  • The CPU of the control unit 21 reads the system program and various processing programs stored in the storage unit 22 in accordance with operations on the operation unit 23, loads them into the RAM, and executes various processes, including the shooting control process described later, according to the loaded programs, thereby centrally controlling the operation of each part of the imaging console 2 and the radiation irradiation and reading operations of the imaging apparatus 1.
  • the storage unit 22 is configured by a nonvolatile semiconductor memory, a hard disk, or the like.
  • the storage unit 22 stores various programs executed by the control unit 21 and data such as parameters necessary for execution of processing by the programs or processing results.
  • the storage unit 22 stores a shooting control processing program for executing the shooting control process shown in FIG.
  • the storage unit 22 stores radiation irradiation conditions and image reading conditions.
  • Various programs are stored in the form of readable program code, and the control unit 21 sequentially executes operations according to the program code.
  • the operation unit 23 includes a keyboard having a cursor key, numeric input keys, various function keys, and the like, and a pointing device such as a mouse.
  • The operation unit 23 outputs, to the control unit 21, instruction signals input by key operations on the keyboard or by mouse operations.
  • the operation unit 23 may include a touch panel on the display screen of the display unit 24. In this case, the operation unit 23 outputs an instruction signal input via the touch panel to the control unit 21.
  • The display unit 24 includes a monitor such as an LCD (Liquid Crystal Display) or a CRT (Cathode Ray Tube), and displays input instructions from the operation unit 23, data, and the like in accordance with display signal instructions input from the control unit 21.
  • the communication unit 25 includes a LAN adapter, a modem, a TA (Terminal Adapter), and the like, and controls data transmission / reception with each device connected to the communication network NT.
  • the diagnostic console 3 is a dynamic image processing device that acquires a dynamic image from the imaging console 2, displays the acquired dynamic image, and makes a diagnostic interpretation by a doctor.
  • the diagnostic console 3 includes a control unit 31, a storage unit 32, an operation unit 33, a display unit 34, and a communication unit 35, and each unit is connected by a bus 36.
  • the control unit 31 includes a CPU, a RAM, and the like.
  • The CPU of the control unit 31 reads the system program and various processing programs stored in the storage unit 32 in accordance with operations on the operation unit 33, loads them into the RAM, and executes various processes, including the image analysis process described later, according to the loaded programs, thereby centrally controlling the operation of each part of the diagnostic console 3.
  • By executing the image analysis process described later, the control unit 31 realizes the lung field region extraction means, blood vessel region recognition means, blood vessel region removal means, division means, calculation means, blood flow information calculation means, display means, and determination means.
  • the storage unit 32 is configured by a nonvolatile semiconductor memory, a hard disk, or the like.
  • the storage unit 32 stores various programs including an image analysis processing program for executing image analysis processing by the control unit 31, parameters necessary for execution of processing by the program, or data such as processing results. These various programs are stored in the form of readable program codes, and the control unit 31 sequentially executes operations according to the program codes.
  • the operation unit 33 includes a keyboard having cursor keys, numeric input keys, various function keys, and the like, and a pointing device such as a mouse.
  • The operation unit 33 outputs, to the control unit 31, instruction signals input by key operations on the keyboard or by mouse operations.
  • the operation unit 33 may include a touch panel on the display screen of the display unit 34, and in this case, an instruction signal input via the touch panel is output to the control unit 31.
  • The display unit 34 includes a monitor such as an LCD or CRT, and displays input instructions from the operation unit 33, data, and the like in accordance with display signal instructions input from the control unit 31.
  • the communication unit 35 includes a LAN adapter, a modem, a TA, and the like, and controls data transmission / reception with each device connected to the communication network NT.
  • FIG. 3 shows a photographing control process executed by the control unit 21 of the photographing console 2.
  • The photographing control process is executed through cooperation between the control unit 21 and the photographing control processing program stored in the storage unit 22.
  • the operation section 23 of the imaging console 2 is operated by the imaging technician, and patient information (patient name, height, weight, age, sex, etc.) of the imaging target (subject M) is input (step S1).
  • the radiation irradiation conditions are read from the storage unit 22 and set in the radiation irradiation control device 12, and the image reading conditions are read from the storage unit 22 and set in the reading control device 14 (step S2).
  • the frame rate (pulse rate) is preferably set to 7.5 frames / second or more in consideration of a human heartbeat cycle.
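The 7.5 frames/second guideline can be motivated by a simple sampling argument: enough frames must fall within each cardiac cycle to resolve the pulsatile signal. The specific numbers below (90 beats per minute, 5 frames per cardiac cycle) are illustrative assumptions, not figures from the patent:

```python
def min_frame_rate(heart_rate_bpm, frames_per_beat):
    """Minimum frame rate (frames/s) needed to capture `frames_per_beat`
    samples of each cardiac cycle at the given heart rate."""
    beats_per_s = heart_rate_bpm / 60.0
    return beats_per_s * frames_per_beat
```

Under these assumptions, 90 bpm with 5 frames per beat yields exactly the 7.5 frames/second mentioned in the text.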
  • Next, the imaging technician prepares for imaging, such as positioning the patient in the imaging apparatus 1. Specifically, the height of the detector holding unit 132 to which the radiation detector 131 is attached is adjusted with a foot switch (not shown) according to the height of the subject M (patient), and the subject M is fixed to the detector holding unit 132 by the patient fixing unit 134. In addition, two or more radiopaque markers for body motion correction (here, two markers, M1 and M2) are attached to the subject M. The patient is then instructed to perform resting ventilation (breathing) in a relaxed state. The patient may instead be instructed to hold his or her breath to obtain a dynamic image with little movement of the diaphragm and ribs.
  • The distance from the floor to the center of the radiation detector 131 is acquired by the distance measuring sensor SE1 and output to the reading control device 14. The output value of the distance measuring sensor SE1 is then output to the radiation irradiation control device 12 as height information.
  • A drive mechanism (not shown) adjusts the height of the radiation source holding unit 112 so that the distance from the floor to the focal position of the radiation source 111, output from the distance measuring sensor SE2, equals the value output from the reading control device 14.
  • Next, the imaging technician inputs a radiation irradiation instruction through the operation unit 23 of the imaging console 2.
  • a photographing start instruction is output to the radiation irradiation control device 12 and the reading control device 14, and dynamic photographing is started (step S4). That is, radiation is emitted from the radiation source 111 at a pulse interval set in the radiation irradiation control device 12, and a frame image is acquired by the radiation detector 131.
  • the control unit 21 outputs an instruction to end imaging to the radiation irradiation control device 12 and the reading control device 14, and the imaging operation is stopped.
  • Frame images acquired by imaging are sequentially input to the imaging console 2, and correction processes such as offset correction, gain correction, defective pixel correction, and lag (afterimage) correction using the above-described dark image are performed as necessary (step S5). These correction processes may also be omitted to give priority to shortening the processing time, in which case the process skips step S5 and proceeds to step S6.
  • the frame image is stored in the storage unit 22 in association with a number indicating the shooting order (step S6) and displayed on the display unit 24 (step S7).
  • The imaging technician confirms the positioning and the like based on the displayed dynamic image, determines whether an image suitable for diagnosis has been acquired (imaging OK) or re-imaging is necessary (imaging NG), and operates the operation unit 23 to input the determination result.
  • When a determination result indicating that the imaging is OK is input by a predetermined operation of the operation unit 23 (step S8; YES), information such as an identification ID for identifying the dynamic image or each of the series of frame images acquired by the dynamic imaging, the patient information, the examination target region, the radiation irradiation conditions, the image reading conditions, and numbers indicating the imaging order is attached (for example, written in the header area of the image data in DICOM format), and the images are transmitted to the diagnostic console 3 via the communication unit 25 (step S9). This process then ends. On the other hand, when a determination result indicating imaging NG is input by a predetermined operation of the operation unit 23 (step S8; NO), the series of frame images stored in the storage unit 22 is deleted (step S10), and this process ends.
  • When the diagnostic console 3 receives a series of frame images of a dynamic image from the imaging console 2 via the communication unit 35, the image analysis process shown in FIG. 4 is executed through cooperation between the control unit 31 and the image analysis processing program stored in the storage unit 32.
  • preprocessing is performed (step S11).
  • FIG. 5 shows a flowchart of the preprocessing executed in step S11.
  • In step S101, logarithmic conversion processing is performed, and the pixel values (density values; hereinafter referred to as signal values) of each frame image of the dynamic image are converted from linear (antilogarithm) values to logarithmic values (step S101).
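Step S101's logarithmic conversion might look like the following sketch (the base-10 logarithm and the epsilon offset guarding against log(0) are assumptions of this sketch, not specified in the patent):

```python
import numpy as np

def log_convert(raw, eps=1.0):
    """Step S101 sketch: convert pixel signal values from linear (antilog)
    to logarithmic scale, so attenuation differences become additive.
    `eps` avoids log(0) for unexposed pixels (an assumed safeguard)."""
    return np.log10(raw.astype(float) + eps)
```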
  • Next, time-varying density correction processing (processing for removing the influence of irradiation dose variation at the time of acquisition of each frame image) is performed: each frame image is corrected so that the signal value of its direct X-ray region becomes the same value.
  • In step S102, first, a reference frame image is selected from the series of frame images input from the imaging console 2, and a correction value for each frame image is calculated by the following (Equation 1):
  • Correction value of each frame image = (average signal value of the direct X-ray region of that frame image) − (average signal value of the direct X-ray region of the reference frame image)   (Equation 1)
  • Then, the correction value calculated by (Equation 1) is subtracted from the signal value of each pixel of the corresponding frame image.
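The per-frame correction of (Equation 1) can be sketched as follows (the boolean mask marking the direct X-ray region is assumed to be given; choosing the first frame as reference is an assumption):

```python
import numpy as np

def density_correct(frames, direct_mask, ref_index=0):
    """Steps S102-S103 sketch (Equation 1): for each frame, the correction
    value is the mean signal of its direct X-ray region minus that of the
    reference frame; it is then subtracted from every pixel of that frame."""
    frames = frames.astype(float)
    ref_mean = frames[ref_index][direct_mask].mean()
    out = np.empty_like(frames)
    for i, f in enumerate(frames):
        correction = f[direct_mask].mean() - ref_mean  # (Equation 1)
        out[i] = f - correction
    return out
```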
  • The grid-line removal process removes the stripe pattern caused by the grid arrangement of the scattered-radiation removal grid provided to remove scattered radiation.
  • the grid eye removal process can be performed using a known technique. For example, each frame image is subjected to a frequency transform process such as discrete Fourier transform, followed by a low-pass filter process to remove a high frequency region including the frequency of the grid image and an inverse Fourier transform process. Yes (see Takayuki Ishida, “Introduction to Medical Image Processing”, 3.4 Removal of Vertical Striped Shadows by Grid of X-ray Images).
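The Fourier-domain removal described above can be sketched as follows; the circular cutoff shape and its value are illustrative assumptions, and in practice the cutoff must lie below the grid frequency.

```python
import numpy as np

def remove_grid_stripes(frame, cutoff=0.25):
    """Suppress grid stripes by frequency-domain low-pass filtering.

    Minimal sketch: DFT -> circular low-pass mask -> inverse DFT.
    `cutoff` is a fraction of the Nyquist frequency (assumed parameter).
    """
    spec = np.fft.fftshift(np.fft.fft2(frame))
    h, w = frame.shape
    yy, xx = np.ogrid[:h, :w]
    # normalized radial frequency, 0 at the (shifted) DC component
    r = np.hypot((yy - h / 2) / (h / 2), (xx - w / 2) / (w / 2))
    spec[r > cutoff] = 0            # remove high-frequency band incl. grid
    return np.real(np.fft.ifft2(np.fft.ifftshift(spec)))
```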
  • Next, each frame image is rotated and translated so that the line segments connecting the two X-ray opaque markers M1 and M2 coincide across all frame images, thereby performing alignment.
  • step S105 lung field extraction / division processing is performed on each frame image.
  • the lung field extraction / division process is performed according to the following procedures (1) to (4).
  • An arbitrary frame image (here, the frame image first in the imaging order) is set as the reference image from the series of frame images.
  • a lung field region is extracted from the reference image, and a substantially rectangular region circumscribing the extracted lung field region is divided into a plurality of small regions A1 as indicated by dotted lines in FIG.
  • the small area A1 to be divided is, for example, 0.4 to 4 cm square.
  • the lung field region extraction method may be any method.
  • a threshold value is obtained from a histogram of signal values of the reference image by discriminant analysis, and a region having a signal higher than the threshold value is primarily extracted as a lung field region candidate.
  • Then, edge detection is performed in the vicinity of the boundary of the primarily extracted lung field region candidate, and the boundary of the lung field region can be extracted by tracing, along the boundary, the points where the edge is maximal within small regions near the boundary.
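The discriminant-analysis thresholding used for the primary extraction corresponds to Otsu's method on the signal-value histogram; a minimal sketch (the histogram bin count is an assumed parameter):

```python
import numpy as np

def otsu_threshold(img, bins=256):
    """Discriminant-analysis (Otsu) threshold on the signal-value histogram.

    Returns the threshold maximizing between-class variance; pixels with
    signal above it form the primary lung-field candidate region.
    """
    hist, edges = np.histogram(img, bins=bins)
    p = hist.astype(np.float64) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    for k in range(1, bins):
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:k] * centers[:k]).sum() / w0   # class means below/above split
        m1 = (p[k:] * centers[k:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2          # between-class variance
        if var > best_var:
            best_var, best_t = var, centers[k - 1]
    return best_t
```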
  • the position corresponding to the center point of each small area A1 of the reference image is extracted from the other frame images by the local matching process.
  • The local matching process can be performed by a method described in, for example, Japanese Patent Application Laid-Open No. 2001-157667. Specifically, first, the initial value of a counter n is set to 1, and a search region for each small region A1 of the nth frame image in the imaging order is set in the (n+1)th frame image.
  • Each search region is set to have the same center point (x, y) as the corresponding small region A1 of the nth frame image, with vertical and horizontal widths larger than those of the small region A1 (for example, doubled in each direction).
  • Then, the position with the highest matching degree within the search region set in the (n+1)th frame image is calculated as the corresponding position on the (n+1)th frame image.
  • As the matching index, a least-squares measure or a cross-correlation coefficient is used.
  • each frame image is divided into new small areas A2 having the central point of each small area A1 as a vertex.
  • the local matching process may be performed on the frame image itself, or may be performed on an image extracted (emphasized) by a gradient filter or the like.
  • local matching processing may be performed with an image in which lung blood vessel shadows are emphasized or extracted.
  • As a technique for emphasizing pulmonary vessel shadows, for example, the method of "Constructing a filter bank for detecting circular and linear patterns in medical images", IEICE Transactions D-II, Vol. J87-D-II, No. 1, pp. 175-185, can be used: an image in which the pulmonary vessel shadows are emphasized can be obtained by extracting the image at the resolution level corresponding to the pulmonary blood vessels from images in which circular/linear patterns are selectively emphasized at each resolution level.
  • Since the clavicle/rib regions move differently from the pulmonary vessel shadows, the clavicle/ribs may be recognized from the reference image using a clavicle/rib shape model, the part of the lung field region excluding the clavicle/rib regions may be divided into small regions, and local matching may be performed on those regions. When local matching is performed in this way, by dividing the reference image into rectangular regions centered on structurally characteristic points, or by dividing only part of the lung field region into small regions, the lung field regions of the associated frame images may afterwards be divided at equal intervals.
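The core of the local matching step, finding in the (n+1)th frame's search region the position that best matches a small region A1, can be sketched with the least-squares (SSD) index named above; an exhaustive scan is an assumed simplification.

```python
import numpy as np

def best_match(template, search):
    """Find the position in `search` that best matches `template`.

    Uses the least-squares (sum of squared differences) matching index;
    returns the (row, col) of the top-left corner of the best match
    within the search region.
    """
    th, tw = template.shape
    sh, sw = search.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            ssd = np.sum((search[r:r + th, c:c + tw] - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos
```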
  • Next, warping processing (non-linear distortion conversion processing) is performed so that the shape of the lung field region of each frame image matches that of the reference image (step S106).
  • the warping process is performed as follows, for example.
  • A shift value (Δx, Δy) for each pixel of the (n+1)th frame image is calculated by an approximation process using a two-dimensional 10th-order polynomial fitted to the shift values obtained by the local matching. Each pixel of the (n+1)th frame image is then shifted based on the obtained shift value (Δx, Δy) of that pixel, and a new frame image is created separately from the frame image before the warping process.
  • the correspondence relationship of each pixel before and after the warping process in each frame image is stored in the RAM. This makes it possible to determine which pixel before the warping process corresponds to each pixel after the warping process.
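The pixel-shifting part of the warping process can be sketched as follows; the 2-D 10th-order polynomial approximation that produces the per-pixel shift field (Δx, Δy) is omitted, and nearest-neighbour inverse mapping is an assumed simplification.

```python
import numpy as np

def warp(frame, dx, dy):
    """Apply per-pixel shift values (dx, dy) to create a new frame image.

    Nearest-neighbour inverse mapping; pixels whose source would fall
    outside the image keep their original value. The polynomial fit that
    yields dx/dy per pixel is not shown here.
    """
    h, w = frame.shape
    out = frame.copy()
    for y in range(h):
        for x in range(w):
            sx = x - int(round(dx[y, x]))   # source column
            sy = y - int(round(dy[y, x]))   # source row
            if 0 <= sx < w and 0 <= sy < h:
                out[y, x] = frame[sy, sx]
    return out
```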
  • Next, the integrated image G is created by adding the signal values of corresponding pixels of the reference image and the frame images created by the warping process (step S107); the preprocessing is then completed, and the processing moves to step S12 of FIG. 4.
  • The integrated image G is an image with reduced noise in which the blood vessel region can be easily recognized. Note that the integrated image G may also be created by adding the signal values of corresponding pixels of each frame image and then dividing the result by the number of frames, i.e., averaging.
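Creating the integrated image G from the aligned frames, in the averaged variant mentioned above, can be sketched as:

```python
import numpy as np

def integrate(frames):
    """Integrated image G: sum corresponding pixels of the aligned frames
    and divide by the number of frames (averaging variant)."""
    stack = np.stack(frames).astype(np.float64)
    return stack.sum(axis=0) / len(frames)   # equivalently stack.mean(axis=0)
```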
  • a blood vessel recognition process is performed on the integrated image G (step S12).
  • a blood vessel as used herein refers to a relatively thick pulmonary blood vessel in which blood flowing in peripheral blood vessels in the lung field is concentrated.
  • It is preferable to display on the display unit 34 a GUI with which the operator designates the region to be recognized, and to perform the blood vessel recognition process only on the region designated through the GUI, because this reduces the time required for the blood vessel recognition process and reduces erroneous recognition.
  • As a GUI for designating a region, for example, as shown in FIG. 7, the integrated image G is displayed on the display unit 34; when two vertices are designated via the operation unit 33, the rectangular region having the two designated vertices as a diagonal is set as the region for blood vessel recognition.
  • a polygonal region surrounded by the designated vertices may be set as a region for blood vessel recognition.
  • When the designated region protrudes beyond the lung field region, the blood vessel recognition process is performed only within the lung field region.
  • As the blood vessel recognition process, for example, a method of extracting linear structures using the maximum eigenvalue calculated from the elements of the Hessian matrix ("Constructing a filter bank for detecting circular and linear patterns in medical images", IEICE Transactions D-II, Vol. J87-D-II, No. 1, pp. 175-185) can be used.
  • By using a band-division filter bank to generate each element of the Hessian matrix, linear structures can be extracted at each resolution level, making it possible to extract blood vessel regions of different sizes (diameters).
  • By performing blood vessel extraction only at resolution levels coarser than a predetermined threshold, only blood vessels having at least a predetermined diameter can be extracted.
  • In this method, the clavicle/ribs are extracted together with the vessel shadows, but they can be recognized using a clavicle/rib shape model and deleted from the extraction result.
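A minimal single-scale sketch of the Hessian-eigenvalue line detection follows; the cited method is multi-resolution, and the filter bank and clavicle/rib removal are omitted here.

```python
import numpy as np

def hessian_line_response(img):
    """Largest Hessian eigenvalue per pixel -- high for dark linear structures
    (vessels on a brighter background).

    Single-scale sketch of the cited multi-resolution filter-bank method;
    a full implementation smooths at each resolution level first.
    """
    gy, gx = np.gradient(img.astype(np.float64))
    hxy, hxx = np.gradient(gx)      # d2/dxdy, d2/dx2
    hyy, _ = np.gradient(gy)        # d2/dy2
    tr = (hxx + hyy) / 2.0
    det = np.sqrt(((hxx - hyy) / 2.0) ** 2 + hxy ** 2)
    return tr + det                 # larger eigenvalue of the 2x2 Hessian
```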
  • Alternatively, blood vessel shadows can be extracted by a method of extracting a dendritic structure using a deformable dendritic figure model as the blood vessel structure model ("Automatic extraction of blood vessel shadows from chest X-ray images using a Deformable Model", MEDICAL IMAGING TECHNOLOGY, Vol. 17, No. 5, September 1999).
  • the region extracted by these methods is recognized as a blood vessel region.
  • a GUI for the operator to designate a part of the blood vessel portion to be recognized may be displayed on the display unit 34, and the blood vessel recognition process may be performed based on the designated position information.
  • the blood vessel recognition process in this case can be performed as follows.
  • the integrated image G is displayed on the display unit 34.
  • When part of a blood vessel to be recognized (indicated by BV in FIGS. 8(a) to 8(f)) is designated on the integrated image G via the operation unit 33 along the traveling direction of the vessel, the blood vessel recognition process generates a signal-value profile in the direction perpendicular to the designated traveling direction (the arrow direction in FIG. 8(b)), as shown in FIGS. 8(b) and 8(c).
  • The width W of the portion that includes the position P designated via the operation unit 33 and whose signal value is lower than the surroundings by at least the predetermined threshold TH1 is obtained by counting the number of pixels, and this low-signal region is recognized as a blood vessel region.
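The width measurement on the signal-value profile can be sketched as follows; using an absolute threshold in place of the relative criterion "lower than the surroundings by TH1" is an assumed simplification.

```python
def low_signal_width(profile, p, threshold):
    """Width W: the contiguous run of profile samples below `threshold`
    that contains index `p` (the operator-designated position P).

    The absolute `threshold` stands in for the text's relative criterion
    of being lower than the surroundings by TH1.
    """
    if profile[p] >= threshold:
        return 0
    left = p
    while left > 0 and profile[left - 1] < threshold:
        left -= 1
    right = p
    while right < len(profile) - 1 and profile[right + 1] < threshold:
        right += 1
    return right - left + 1        # number of pixels in the low-signal run
```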
  • When a blood vessel is recognized, as shown in FIG. 8(d), an ROI centered on the end of the recognized blood vessel region is set, and binarization is performed based on a threshold value calculated from the histogram of signal values in the ROI by a discriminant analysis method or the like.
  • The size of the ROI is determined according to the width W; for example, an ROI with a side length of α × W (α is a predetermined coefficient) is set. Then, as shown in FIG. 8(e), a low-signal region that includes the center of the ROI and whose signal values are equal to or less than the calculated threshold is taken as a blood vessel candidate region, and when its size is within a predetermined range relative to the area of the ROI, it is recognized as a blood vessel region. Further, when the newly recognized blood vessel region touches an edge of the ROI, a new ROI centered on that point is set, as shown in FIG. 8(f), and the recognition of the blood vessel region is repeated in the same manner.
  • Next, the blood vessel region removal processing is executed: the signal value of each pixel located at the same position as the blood vessel region recognized from the integrated image G is replaced with the average signal value of the nearby non-blood-vessel region (step S13).
  • For example, the pixel P22 at the center of the blood vessel region is replaced with the average of the signal values of the neighboring non-blood-vessel pixels P00, P01, P02, P10, P11, P12, P20, P21, P30, P31, P40, and P41.
  • Alternatively, the signal values may first be replaced only for the pixels of the blood vessel region at the boundary with the non-blood-vessel region, and the replacement may then be repeated sequentially for the not-yet-replaced boundary pixels until the signal values of all pixels in the blood vessel region have been replaced.
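The boundary-inward replacement described above can be sketched as follows; the 8-neighbourhood and the termination guard are assumed details.

```python
import numpy as np

def remove_vessels(img, vessel_mask):
    """Replace each blood-vessel pixel with the mean signal value of nearby
    non-vessel pixels, working inward from the region boundary as the text
    describes (boundary pixels first, repeated until all are replaced).

    The 8-neighbourhood is an assumed choice.
    """
    img = img.astype(np.float64).copy()
    mask = vessel_mask.copy()
    h, w = img.shape
    while mask.any():
        updates = []
        for y, x in zip(*np.nonzero(mask)):
            vals = [img[yy, xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))
                    if not mask[yy, xx]]
            if vals:                       # vessel pixel on the boundary
                updates.append((y, x, float(np.mean(vals))))
        if not updates:                    # nothing borders non-vessel area
            break
        for y, x, v in updates:
            img[y, x] = v
            mask[y, x] = False
    return img
```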
  • By the blood vessel region removal processing in step S13, the blood vessel region is removed from the lung field region as shown in FIG. 10.
  • a high-pass filter is applied to each frame image from which the blood vessel region has been removed, and only high-frequency components are extracted (step S14).
  • Here, the cut-off frequency is 1 Hz, and only high-frequency components of 1 Hz or more, corresponding to the human heartbeat cycle, are extracted.
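The 1 Hz high-pass can be sketched on a single temporal signal; the text applies it to each frame image along the time axis, and FFT-domain zeroing and the frame rate are assumed details here.

```python
import numpy as np

def highpass_1hz(signal, fps):
    """Temporal high-pass with a 1 Hz cutoff via FFT-domain zeroing.

    `signal` is a 1-D time series sampled at `fps` frames per second;
    components below 1 Hz are removed, keeping the heartbeat-rate
    (>= 1 Hz) variation.
    """
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    spec[freqs < 1.0] = 0            # drop sub-heartbeat components
    return np.fft.irfft(spec, n=len(signal))
```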
  • Next, the lung field region is divided into a plurality of block regions (step S15); for each block region, the signal values within the region are added, and the time change of the added signal value is calculated (step S16). For example, the lung field is divided into six block regions, namely upper right lung field R1, middle right lung field R2, lower right lung field R3, upper left lung field L1, middle left lung field L2, and lower left lung field L3 (see FIG. 11); the added signal value is calculated for each block region, and its time change is obtained.
  • FIG. 12 shows the time change of the added signal value in one block region; the horizontal axis represents the elapsed time t from the start of dynamic imaging, and the vertical axis represents the added signal value in the block region.
  • Next, a signal change amount corresponding to one heartbeat period is calculated for each block region (step S17). Specifically, as shown in FIG. 12, from the time change of the signal value of each block region, maxima equal to or greater than a predetermined first threshold and minima equal to or less than a predetermined second threshold are extracted, and the signal change amount for one heartbeat period is calculated as the difference between a maximum and the following minimum. Over a plurality of heartbeat cycles, the per-cycle signal change amounts are calculated (D0 to D3 in FIG. 12) and their average is taken. The signal value in each block region changes according to the blood flow volume in that region.
  • Therefore, the calculated signal change amount corresponding to one heartbeat period in each block region is information indicating the blood flow of the peripheral blood vessels in that block during one heartbeat period.
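The per-heartbeat signal-change calculation of step S17 can be sketched as follows; the simple three-point extremum test is an assumed simplification of the maximum/minimum extraction.

```python
import numpy as np

def heartbeat_signal_change(series, th_max, th_min):
    """Average signal change per heartbeat from a block's time series.

    Local maxima >= th_max are paired with the following local minima
    <= th_min; each (max - min) difference is one heartbeat's signal
    change (D0..D3 in FIG. 12), and the differences are averaged.
    """
    diffs, last_max = [], None
    for i in range(1, len(series) - 1):
        v = series[i]
        if v >= series[i - 1] and v >= series[i + 1] and v >= th_max:
            last_max = v                       # candidate maximum
        elif v <= series[i - 1] and v <= series[i + 1] and v <= th_min:
            if last_max is not None:
                diffs.append(last_max - v)     # one heartbeat's change
                last_max = None
    return float(np.mean(diffs)) if diffs else 0.0
```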
  • blood flow information of peripheral blood vessels in each block region is calculated based on the signal change amount corresponding to one heartbeat period calculated in step S17 (step S18).
  • Here, the ratio of the blood flow of the peripheral blood vessels in each block region to the blood flow of the peripheral blood vessels in the entire lung field is calculated: (signal change amount for one heartbeat period in the block region / total of the signal change amounts for one heartbeat period over all block regions) × 100 (%).
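The blood-flow ratio of step S18 can be sketched as (the block names R1, R2, ... follow FIG. 11):

```python
def blood_flow_ratio(block_changes):
    """Ratio (%) of each block's one-heartbeat signal change to the total
    over all block regions -- the peripheral blood-flow distribution."""
    total = sum(block_changes.values())
    return {name: 100.0 * v / total for name, v in block_changes.items()}
```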
  • the calculated ratio is displayed on the display unit 34 as the blood flow information of the peripheral blood vessels in each block region (step S19).
  • FIG. 11 shows an example of blood flow information displayed on the display unit 34 in step S19.
  • As shown in FIG. 11, the display unit 34 displays the blood flow information of the peripheral blood vessels in each block region as numerical values, and the series of frame images is displayed as a moving image. Each block region in the moving image is displayed in a color whose saturation corresponds to the value of the blood flow information of the peripheral blood vessels (or with a luminance value corresponding to that value). Because each block region is colored according to the value of its peripheral blood flow information, the doctor can grasp the distribution of blood flow in the peripheral blood vessels of the block regions at a glance.
  • step S20 It is determined whether the blood flow information of the peripheral blood vessels in each block region falls below a predetermined threshold value; a block region below the threshold is determined to be abnormal (step S20). If there is a block region determined to be abnormal, the corresponding block region on the series of frame images displayed as a moving image is displayed in an identifiable manner (for example, with its outline emphasized) (step S21). FIG. 13 shows an example in which block regions determined to be abnormal are displayed in an identifiable manner.
  • As described above, the control unit 31 extracts a lung field region from each of a plurality of frame images showing the dynamics of the chest, performs blood vessel region recognition processing in the extracted lung field region, and removes the recognized blood vessel region from each frame image. The lung field region of each frame image from which the blood vessel region has been removed is then divided into a plurality of block regions, the amount of change in the pixel signal values in each block region is calculated, and, based on the calculated change amounts, blood flow information of the peripheral blood vessels existing in each block region is calculated and displayed on the display unit 34.
  • In addition, a GUI is provided for the operator to specify part of the blood vessel region to be recognized, and the blood vessel region recognition processing is performed based on the region specified via the GUI, whereby the pulmonary blood vessels can be accurately recognized.
  • Further, a GUI is provided for the operator to specify the region in which the blood vessel region recognition processing is performed, and the processing is performed only on the region specified via the GUI, whereby the blood vessel recognition processing time can be shortened and the recognition accuracy can be improved.
  • Moreover, since the position of a block region determined to be abnormal is displayed identifiably on the frame images shown as a moving image, the doctor can easily grasp the portion where the blood flow in the peripheral blood vessels is abnormal.
  • In the above embodiment, the series of frame images is displayed as a moving image, each block region in the moving image is displayed in a color (or luminance value) corresponding to the value of the blood flow information of the peripheral blood vessels, and a block region determined to be abnormal is displayed in an identifiable manner. Alternatively, one of the frame images (for example, the reference image) or a still image such as the integrated image G may be displayed, and on that image each block region may be shown in a color corresponding to the value of the blood flow information of the peripheral blood vessels, or an abnormal block region may be displayed identifiably.
  • In the above description, a hard disk, a semiconductor nonvolatile memory, or the like is used as the computer-readable medium for the program according to the present invention, but the present invention is not limited to this example. A portable recording medium such as a CD-ROM can also be applied, and a carrier wave may also be applied as a medium for providing the program data according to the present invention via a communication line.
  • DESCRIPTION OF SYMBOLS 100 Dynamic image processing system 1 Imaging device 11 Radiation generation apparatus 111 Radiation source 112 Radiation source holding part 113 Support base axis 12 Radiation irradiation control apparatus 13 Radiation detection part 131 Radiation detector 132 Detector holding part 133 Support base axis 134 Patient fixing part 134a Upper left Section fixing section 134b upper right section fixing section 134c lower left section fixing section 134d lower right section fixing section 14 reading control device 2 photographing console 21 control section 22 storage section 23 operation section 24 display section 25 communication section 26 bus 3 diagnostic console 31 control section 32 storage unit 33 operation unit 34 display unit 35 communication unit 36 bus


Abstract

The objective of the present invention is to accurately calculate information relating to the blood flowing through the peripheral blood vessels in a lung field. A diagnostic console (3) according to the present invention includes a controller (31) which extracts a lung field from each frame image representing the dynamic state of a chest region, identifies a vessel region in the extracted lung field, and removes the identified vessel region from each frame image. The controller then divides the lung field of each frame image, from which the vessel region has been removed, into multiple block regions, calculates the variation amount of the pixel signal values of each block region, and calculates blood flow information of the peripheral blood vessels in each block region on the basis of the calculated variation amounts.

Description

Dynamic image processing system and program
The present invention relates to a dynamic image processing system and a program.
In contrast to still-image radiography and diagnosis of the chest using conventional film/screen systems or photostimulable phosphor plates, attempts have been made to capture dynamic images of the chest using a semiconductor image sensor such as an FPD (flat panel detector) and to apply them to diagnosis. Specifically, using the fast read/erase response of the semiconductor image sensor, pulsed radiation is continuously emitted from the radiation source in synchronization with the read/erase timing of the sensor, and imaging is performed multiple times per second to capture the dynamics of the chest. By sequentially displaying the series of images acquired by imaging, a doctor can observe the movements of the chest accompanying respiratory motion, heartbeats, and the like.
However, it is difficult to visually recognize abnormal portions by observing the displayed dynamic image, and interpretation takes time. Therefore, various techniques for displaying dynamic images in an easy-to-view manner have been proposed. For example, Patent Document 1 describes a technique for displaying a moving image from which bones and blood vessels have been removed, by capturing adjacent frame images with X-rays of different energies and applying subtraction processing to them.
JP 2004-328145 A
When the blood flow through the peripheral blood vessels in the lung field region is to be calculated quantitatively from an X-ray dynamic image, there is a problem that signal changes on the pulmonary blood vessels in which the blood flowing through the peripheral vessels is concentrated (for example, pulmonary arteries such as the right interlobar pulmonary artery, and pulmonary veins such as the right lower pulmonary vein) cause the blood flow of the peripheral vessels to be calculated larger than it actually is.
When a plurality of pairs of adjacent frame images are subjected to subtraction processing using the same coefficient, as in the technique described in Patent Document 1, the signal values on the pulmonary blood vessels change over time due to blood flow in the X-ray dynamic image, so the signal changes caused by blood flow on the pulmonary vessels cannot be removed and remain as noise in the image.
An object of the present invention is to enable accurate calculation of information on the blood flow through the peripheral blood vessels of the lung field region.
In order to solve the above problem, a dynamic image processing system according to claim 1 comprises:
lung field region extraction means for extracting a lung field region from each of a plurality of frame images showing the dynamics of the chest;
blood vessel region recognition means for performing blood vessel region recognition processing in the extracted lung field region of each frame image;
blood vessel region removal means for removing, from each frame image, the blood vessel region recognized by the blood vessel region recognition means;
division means for dividing the lung field region into a plurality of block regions in each frame image from which the blood vessel region has been removed by the blood vessel region removal means;
calculation means for calculating the amount of change in pixel signal values in each block region between the frame images from which the blood vessel region has been removed by the blood vessel region removal means; and
blood flow information calculation means for calculating blood flow information of the peripheral blood vessels existing in each block region based on the amount of change in pixel signal values in each block region calculated by the calculation means.
The invention according to claim 2 is the invention according to claim 1, further comprising operation means for an operator to designate part of the blood vessel region to be recognized by the blood vessel region recognition means, wherein the blood vessel region recognition means performs the blood vessel region recognition processing based on the region designated via the operation means.
The invention according to claim 3 is the invention according to claim 1, further comprising operation means for an operator to designate, within the extracted lung field region, the region in which the blood vessel region recognition means performs the blood vessel recognition processing, wherein the blood vessel region recognition means performs the blood vessel region recognition processing only on the region designated via the operation means.
The invention according to claim 4 is the invention according to any one of claims 1 to 3, further comprising display means for displaying the blood flow information calculated by the blood flow information calculation means on a display unit.
The invention according to claim 5 is the invention according to claim 4, further comprising determination means for determining, based on the blood flow information of the peripheral blood vessels existing in each block region, whether the blood flow of the peripheral blood vessels in each block region is abnormal, wherein the display means causes the position of a block region determined to be abnormal by the determination means to be displayed on the display unit so as to be identifiable on a frame image displayed as a moving image.
The invention according to claim 6 is the invention according to any one of claims 1 to 5, wherein the blood vessel region removal means replaces the pixel signal values of the recognized blood vessel region with values determined based on the pixel signal values of the nearby non-blood-vessel region.
The program of the invention according to claim 7 causes a computer to function as:
lung field region extraction means for extracting a lung field region from each of a plurality of frame images showing the dynamics of the chest;
blood vessel region recognition means for performing blood vessel region recognition processing in the extracted lung field region of each frame image;
blood vessel region removal means for removing, from each frame image, the blood vessel region recognized by the blood vessel region recognition means;
division means for dividing the lung field region into a plurality of block regions in each frame image from which the blood vessel region has been removed by the blood vessel region removal means;
calculation means for calculating the amount of change in pixel signal values in each block region between the frame images from which the blood vessel region has been removed by the blood vessel region removal means; and
blood flow information calculation means for calculating blood flow information of the peripheral blood vessels existing in each block region based on the amount of change in pixel signal values in each block region calculated by the calculation means.
The invention according to claim 8 is the invention according to claim 7, wherein the program further causes the computer to function as display means for displaying the blood flow information calculated by the blood flow information calculation means on a display unit.
According to the present invention, it is possible to accurately calculate information on the blood flow through the peripheral blood vessels of the lung field region.
A diagram showing the overall configuration of the dynamic image processing system in an embodiment of the present invention.
A diagram showing a detailed configuration example of the radiation generation apparatus and the radiation detection unit of FIG. 1.
A diagram showing an example of the patient fixing unit of the radiation detection unit of FIG. 1.
A flowchart showing the imaging control processing executed by the control unit of the imaging console of FIG. 1.
A flowchart showing the image analysis processing executed by the control unit of the diagnostic console of FIG. 1.
A flowchart showing the preprocessing executed in step S11 of FIG. 4.
A diagram for explaining the small regions acquired by the lung field region extraction/division processing of FIG. 5.
A diagram showing an example of the GUI for designating the target region of the blood vessel region recognition processing of FIG. 4.
A diagram for explaining an example of recognizing a blood vessel region according to an operation by the operator in the blood vessel region recognition processing of FIG. 4.
A diagram for explaining the blood vessel region removal processing of FIG. 4.
A diagram showing an example of images before and after the blood vessel region removal processing of FIG. 4.
A diagram showing a display example of the blood flow information in step S19 of FIG. 4.
あるブロック領域における加算信号値の時間変化を示す図である。It is a figure which shows the time change of the addition signal value in a certain block area | region. 図4のステップS21における異常なブロック領域の表示例を示す図である。It is a figure which shows the example of a display of the abnormal block area | region in FIG.4 S21.
 以下、図面を参照して本発明に係る実施の形態を詳細に説明する。ただし、発明の範囲は、図示例に限定されない。 Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. However, the scope of the invention is not limited to the illustrated examples.
 〈第1の実施の形態〉
 〔動態画像処理システム100の構成〕
 まず、構成を説明する。
<First Embodiment>
[Configuration of Dynamic Image Processing System 100]
First, the configuration will be described.
 図1に、本実施の形態における動態画像処理システム100の全体構成を示す。 FIG. 1 shows the overall configuration of a dynamic image processing system 100 in the present embodiment.
　図1に示すように、動態画像処理システム100は、撮影装置1と、撮影用コンソール2とが通信ケーブル等により接続され、撮影用コンソール2と、診断用コンソール3とがLAN(Local Area Network)等の通信ネットワークNTを介して接続されて構成されている。動態画像処理システム100を構成する各装置は、DICOM(Digital Imaging and Communications in Medicine)規格に準じており、各装置間の通信は、DICOMに則って行われる。 As shown in FIG. 1, the dynamic image processing system 100 is configured by connecting an imaging apparatus 1 and an imaging console 2 via a communication cable or the like, and connecting the imaging console 2 and a diagnostic console 3 via a communication network NT such as a LAN (Local Area Network). Each apparatus constituting the dynamic image processing system 100 conforms to the DICOM (Digital Imaging and Communications in Medicine) standard, and communication between the apparatuses is performed in accordance with DICOM.
 〔撮影装置1の構成〕
 撮影装置1は、例えば、呼吸運動に伴う肺の膨張及び収縮の形態変化、心臓の拍動等の、周期性(サイクル)を持つ胸部の動態を撮影する装置である。動態撮影は、人体の胸部に対し、X線等の放射線をパルス的に連続照射して複数の画像を取得(即ち、連続撮影)することにより行う。この連続撮影により得られた一連の画像を動態画像と呼ぶ。また、動態画像を構成する複数の画像のそれぞれをフレーム画像と呼ぶ。
[Configuration of the photographing apparatus 1]
The imaging apparatus 1 is an apparatus that images chest dynamics having periodicity (cycles), such as the morphological changes of the lungs expanding and contracting with respiratory motion and the pulsation of the heart. Dynamic imaging is performed by continuously irradiating the chest of a human body with pulsed radiation such as X-rays to acquire a plurality of images (that is, continuous imaging). A series of images obtained by this continuous imaging is called a dynamic image, and each of the plurality of images constituting the dynamic image is called a frame image.
 撮影装置1は、図1に示すように、放射線発生装置11、放射線照射制御装置12、放射線検出部13、読取制御装置14等を備えて構成されている。 As shown in FIG. 1, the imaging apparatus 1 includes a radiation generation device 11, a radiation irradiation control device 12, a radiation detection unit 13, a reading control device 14, and the like.
 放射線発生装置11は、図2Aに示すように、放射線源111、放射線源保持部112、支持基軸113等を備えて構成されている。 As shown in FIG. 2A, the radiation generation apparatus 11 includes a radiation source 111, a radiation source holding unit 112, a support base shaft 113, and the like.
　放射線源111は、被写体Mを挟んで放射線検出器131と対向する位置に配置され、放射線照射制御装置12の制御に従って、被写体Mに対し放射線(X線)を照射する。放射線源111は、放射線源保持部112により支持基軸113に沿って昇降可能に保持されており、撮影時には、放射線照射制御装置12からの制御に基づいて、床~放射線源111の焦点位置までの高さ(距離)が床~放射線検出器131の中央までの高さと同じになるように、図示しない駆動機構により調整される。放射線源111と放射線検出器131との間の距離は2m以上が好ましい。 The radiation source 111 is disposed at a position facing the radiation detector 131 across the subject M, and irradiates the subject M with radiation (X-rays) under the control of the radiation irradiation control device 12. The radiation source 111 is held by the radiation source holding unit 112 so as to be movable up and down along the support base shaft 113. At the time of imaging, under the control of the radiation irradiation control device 12, the height (distance) from the floor to the focal position of the radiation source 111 is adjusted by a drive mechanism (not shown) so as to be the same as the height from the floor to the center of the radiation detector 131. The distance between the radiation source 111 and the radiation detector 131 is preferably 2 m or more.
　放射線照射制御装置12は、撮影用コンソール2に接続されており、撮影用コンソール2から入力された放射線照射条件に基づいて放射線発生装置11を制御して放射線撮影を行う。撮影用コンソール2から入力される放射線照射条件は、例えば、連続照射時のパルスレート、パルス幅、パルス間隔、撮影開始/終了タイミング、X線管電流の値、X線管電圧の値、フィルタ種等である。パルスレートは、1秒あたりの放射線照射回数であり、後述するフレームレートと一致している。パルス幅は、放射線照射1回当たりの放射線照射時間である。パルス間隔は、連続撮影において、1回の放射線照射開始から次の放射線照射開始までの時間であり、後述するフレーム間隔と一致している。 The radiation irradiation control device 12 is connected to the imaging console 2, and performs radiographic imaging by controlling the radiation generation apparatus 11 based on the radiation irradiation conditions input from the imaging console 2. The radiation irradiation conditions input from the imaging console 2 include, for example, the pulse rate, pulse width, and pulse interval for continuous irradiation, the imaging start/end timing, the X-ray tube current value, the X-ray tube voltage value, and the filter type. The pulse rate is the number of radiation irradiations per second, and matches the frame rate described later. The pulse width is the irradiation time per irradiation. The pulse interval is the time from the start of one irradiation to the start of the next irradiation in continuous imaging, and matches the frame interval described later.
　また、放射線照射制御装置12は、床~放射線源111の焦点位置までの高さが読取制御装置14から出力される床~放射線検出器131の中央までの高さと同じ高さとなるように放射線発生装置11各部の制御を行う。 The radiation irradiation control device 12 also controls each part of the radiation generation apparatus 11 so that the height from the floor to the focal position of the radiation source 111 becomes the same as the height from the floor to the center of the radiation detector 131 output from the reading control device 14.
 放射線検出部13は、図2Aに示すように、放射線検出器131、検出器保持部132、支持基軸133、患者固定部134等を備えて構成されている。 2A, the radiation detection unit 13 includes a radiation detector 131, a detector holding unit 132, a support base 133, a patient fixing unit 134, and the like.
　放射線検出器131は、FPD等の半導体イメージセンサにより構成される。FPDは、例えば、ガラス基板等を有しており、基板上の所定位置に、放射線源111から照射されて少なくとも被写体Mを透過した放射線をその強度に応じて検出し、検出した放射線を電気信号に変換して蓄積する複数の画素がマトリックス状に配列されている。各画素は、例えばTFT(Thin Film Transistor)等のスイッチング部を含んでいる。尚、上記の直接型の他に、放射線を光に変換するシンチレータを有する間接型FPDを用いても良い。 The radiation detector 131 is configured by a semiconductor image sensor such as an FPD. The FPD has, for example, a glass substrate, on which a plurality of pixels are arranged in a matrix at predetermined positions; each pixel detects radiation that has been emitted from the radiation source 111 and has passed through at least the subject M according to its intensity, converts the detected radiation into an electrical signal, and accumulates it. Each pixel includes a switching element such as a TFT (Thin Film Transistor). In addition to the direct type described above, an indirect type FPD having a scintillator that converts radiation into light may be used.
　放射線検出器131は、図2Aに示すように、検出器保持部132により支持基軸133に沿って昇降可能に保持されており、撮影時には、撮影技師による図示しないフットスイッチ等の操作により被写体Mの高さに応じて検出器保持部132の位置(床面からの高さ)を調整可能となっている。 As shown in FIG. 2A, the radiation detector 131 is held by the detector holding unit 132 so as to be movable up and down along the support base shaft 133. At the time of imaging, the position (height from the floor) of the detector holding unit 132 can be adjusted according to the height of the subject M by the radiographer operating a foot switch (not shown) or the like.
　また、検出器保持部132には、患者固定部134が設けられている。患者固定部134は、図2Bに示すように、少なくとも左上部固定部134a、右上部固定部134b、左下部固定部134c、右下部固定部134dを有している。左上部固定部134aは、放射線検出器131の左端側から右端方向に向かって放射線検出器131の放射線照射面と平行に移動可能に構成されている。左下部固定部134cも同様である。右上部固定部134bは、放射線検出器131の右端側から左端方向に向かって放射線検出器131の放射線照射面と平行に左上部固定部134aの移動距離と同じ距離だけ移動するように構成されている。右下部固定部134dも同様である。そして、左上部固定部134aと右上部固定部134b、左下部固定部134cと右下部固定部134dで被写体Mを挟み込むことにより被写体Mの位置を放射線検出器131の中央に固定することができる。これにより、後述する画像解析処理の解析精度、血管の検出精度を向上させることができる。また、経過観察用に異なる時期に撮影する場合のポジショニング再現性を向上させることができる。 The detector holding unit 132 is also provided with a patient fixing unit 134. As shown in FIG. 2B, the patient fixing unit 134 has at least an upper left fixing part 134a, an upper right fixing part 134b, a lower left fixing part 134c, and a lower right fixing part 134d. The upper left fixing part 134a is configured to be movable from the left end side of the radiation detector 131 toward the right end, parallel to the radiation irradiation surface of the radiation detector 131. The same applies to the lower left fixing part 134c. The upper right fixing part 134b is configured to move from the right end side of the radiation detector 131 toward the left end, parallel to the radiation irradiation surface, by the same distance as the upper left fixing part 134a moves. The same applies to the lower right fixing part 134d. By sandwiching the subject M between the upper left and upper right fixing parts 134a, 134b and between the lower left and lower right fixing parts 134c, 134d, the position of the subject M can be fixed at the center of the radiation detector 131. This improves the analysis accuracy of the image analysis process described later and the blood vessel detection accuracy, and also improves positioning reproducibility when imaging at different times for follow-up observation.
　なお、左上部固定部134a及び右上部固定部134bは、後述する時間変化濃度補正のために放射線検出器131上に放射線が直接照射される領域(直接X線領域)を確保できる位置(放射線検出器131の上端より所定距離下)に配置されることが好ましい。 Note that the upper left fixing part 134a and the upper right fixing part 134b are preferably disposed at positions (a predetermined distance below the upper end of the radiation detector 131) that leave a region on the radiation detector 131 where radiation is directly incident (direct X-ray region), which is used for the time-varying density correction described later.
 読取制御装置14は、撮影用コンソール2に接続されている。読取制御装置14は、撮影用コンソール2から入力された画像読取条件に基づいて放射線検出器131の各画素のスイッチング部を制御して、当該各画素に蓄積された電気信号の読み取りをスイッチングしていき、放射線検出器131に蓄積された電気信号を読み取ることにより、画像データを取得する。この画像データがフレーム画像である。そして、読取制御装置14は、取得したフレーム画像を撮影用コンソール2に出力する。画像読取条件は、例えば、フレームレート、フレーム間隔、画素サイズ、画像サイズ(マトリックスサイズ)等である。フレームレートは、1秒あたりに取得するフレーム画像数であり、パルスレートと一致している。フレーム間隔は、連続撮影において、1回のフレーム画像の取得動作開始から次のフレーム画像の取得動作開始までの時間であり、パルス間隔と一致している。 The reading control device 14 is connected to the imaging console 2. The reading control device 14 controls the switching unit of each pixel of the radiation detector 131 based on the image reading condition input from the imaging console 2 to switch the reading of the electrical signal accumulated in each pixel. Then, the image data is acquired by reading the electric signal accumulated in the radiation detector 131. This image data is a frame image. Then, the reading control device 14 outputs the acquired frame image to the photographing console 2. The image reading conditions are, for example, a frame rate, a frame interval, a pixel size, an image size (matrix size), and the like. The frame rate is the number of frame images acquired per second and matches the pulse rate. The frame interval is the time from the start of one frame image acquisition operation to the start of the next frame image acquisition operation in continuous shooting, and coincides with the pulse interval.
　ここで、放射線照射制御装置12と読取制御装置14は互いに接続され、互いに同期信号をやりとりして、放射線照射動作と、リセット~蓄積~データ読取~リセット等の一連の画像の読み取りの動作を同調させるようになっている。必要に応じて、暗画像読取~リセットを撮影の前、または、後に実施し、これらを含めて、同調させるようにしても良い。また、読取制御装置14から放射線照射制御装置12に対し、床~放射線検出器131の中央までの高さ情報(測距センサSE1からの出力値)を出力して、床~放射線検出器131の中央までの高さと床~放射線源111の焦点位置までの高さを一致させるようになっている。 Here, the radiation irradiation control device 12 and the reading control device 14 are connected to each other and exchange synchronization signals so that the radiation irradiation operation is synchronized with the series of image reading operations (reset, accumulation, data reading, reset, and so on). If necessary, dark image reading and reset may be performed before or after imaging and included in the synchronization. The reading control device 14 also outputs height information from the floor to the center of the radiation detector 131 (the output value of the distance measuring sensor SE1) to the radiation irradiation control device 12, so that the height from the floor to the center of the radiation detector 131 and the height from the floor to the focal position of the radiation source 111 are made to coincide.
 〔撮影用コンソール2の構成〕
 撮影用コンソール2は、放射線照射条件や画像読取条件を撮影装置1に出力して撮影装置1による放射線撮影及び放射線画像の読み取り動作を制御するとともに、撮影装置1により取得された動態画像を撮影技師によるポジショニングの確認や診断に適した画像であるか否かの確認用に表示する。
[Configuration of the shooting console 2]
The imaging console 2 outputs the radiation irradiation conditions and the image reading conditions to the imaging apparatus 1 to control the radiographic imaging and radiographic image reading operations of the imaging apparatus 1, and also displays the dynamic images acquired by the imaging apparatus 1 so that the radiographer can confirm the positioning and check whether the images are suitable for diagnosis.
 撮影用コンソール2は、図1に示すように、制御部21、記憶部22、操作部23、表示部24、通信部25を備えて構成され、各部はバス26により接続されている。 As shown in FIG. 1, the photographing console 2 includes a control unit 21, a storage unit 22, an operation unit 23, a display unit 24, and a communication unit 25, and each unit is connected by a bus 26.
　制御部21は、CPU(Central Processing Unit)、RAM(Random Access Memory)等により構成される。制御部21のCPUは、操作部23の操作に応じて、記憶部22に記憶されているシステムプログラムや各種処理プログラムを読み出してRAM内に展開し、展開されたプログラムに従って後述する撮影制御処理を始めとする各種処理を実行し、撮影用コンソール2各部の動作や、撮影装置1の放射線照射動作及び読み取り動作を集中制御する。 The control unit 21 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), and the like. In response to operations on the operation unit 23, the CPU of the control unit 21 reads the system program and various processing programs stored in the storage unit 22, loads them into the RAM, and executes various processes, including the imaging control process described later, in accordance with the loaded programs, thereby centrally controlling the operation of each part of the imaging console 2 as well as the radiation irradiation and reading operations of the imaging apparatus 1.
　記憶部22は、不揮発性の半導体メモリやハードディスク等により構成される。記憶部22は、制御部21で実行される各種プログラムやプログラムによる処理の実行に必要なパラメータ、或いは処理結果等のデータを記憶する。例えば、記憶部22は、図3に示す撮影制御処理を実行するための撮影制御処理プログラムを記憶している。また、記憶部22は、放射線照射条件及び画像読取条件を記憶している。各種プログラムは、読取可能なプログラムコードの形態で格納され、制御部21は、当該プログラムコードに従った動作を逐次実行する。 The storage unit 22 is configured by a nonvolatile semiconductor memory, a hard disk, or the like. The storage unit 22 stores various programs executed by the control unit 21 and data such as parameters necessary for executing the processes of those programs and processing results. For example, the storage unit 22 stores an imaging control processing program for executing the imaging control process shown in FIG. 3. The storage unit 22 also stores the radiation irradiation conditions and the image reading conditions. The various programs are stored in the form of readable program code, and the control unit 21 sequentially executes operations according to the program code.
　操作部23は、カーソルキー、数字入力キー、及び各種機能キー等を備えたキーボードと、マウス等のポインティングデバイスを備えて構成され、キーボードに対するキー操作やマウス操作により入力された指示信号を制御部21に出力する。また、操作部23は、表示部24の表示画面にタッチパネルを備えても良く、この場合、タッチパネルを介して入力された指示信号を制御部21に出力する。 The operation unit 23 includes a keyboard having cursor keys, numeric input keys, various function keys, and the like, and a pointing device such as a mouse, and outputs instruction signals input by key operations on the keyboard or mouse operations to the control unit 21. The operation unit 23 may also include a touch panel on the display screen of the display unit 24; in this case, it outputs instruction signals input via the touch panel to the control unit 21.
　表示部24は、LCD(Liquid Crystal Display)やCRT(Cathode Ray Tube)等のモニタにより構成され、制御部21から入力される表示信号の指示に従って、操作部23からの入力指示やデータ等を表示する。 The display unit 24 is configured by a monitor such as an LCD (Liquid Crystal Display) or a CRT (Cathode Ray Tube), and displays input instructions from the operation unit 23, data, and the like in accordance with display signal instructions input from the control unit 21.
 通信部25は、LANアダプタやモデムやTA(Terminal Adapter)等を備え、通信ネットワークNTに接続された各装置との間のデータ送受信を制御する。 The communication unit 25 includes a LAN adapter, a modem, a TA (Terminal Adapter), and the like, and controls data transmission / reception with each device connected to the communication network NT.
 〔診断用コンソール3の構成〕
 診断用コンソール3は、撮影用コンソール2から動態画像を取得し、取得した動態画像を表示して医師が読影診断するための動態画像処理装置である。
[Configuration of diagnostic console 3]
The diagnostic console 3 is a dynamic image processing apparatus that acquires dynamic images from the imaging console 2 and displays them for a doctor to interpret and diagnose.
 診断用コンソール3は、図1に示すように、制御部31、記憶部32、操作部33、表示部34、通信部35を備えて構成され、各部はバス36により接続されている。 As shown in FIG. 1, the diagnostic console 3 includes a control unit 31, a storage unit 32, an operation unit 33, a display unit 34, and a communication unit 35, and each unit is connected by a bus 36.
　制御部31は、CPU、RAM等により構成される。制御部31のCPUは、操作部33の操作に応じて、記憶部32に記憶されているシステムプログラムや、各種処理プログラムを読み出してRAM内に展開し、展開されたプログラムに従って、後述する画像解析処理を始めとする各種処理を実行し、診断用コンソール3各部の動作を集中制御する。制御部31は、後述する画像解析処理を実行することにより、肺野領域抽出手段、血管領域認識手段、血管領域除去手段、分割手段、算出手段、血流情報算出手段、表示手段、判断手段を実現する。 The control unit 31 includes a CPU, a RAM, and the like. In response to operations on the operation unit 33, the CPU of the control unit 31 reads the system program and various processing programs stored in the storage unit 32, loads them into the RAM, and executes various processes, including the image analysis process described later, in accordance with the loaded programs, thereby centrally controlling the operation of each part of the diagnostic console 3. By executing the image analysis process described later, the control unit 31 realizes lung field region extraction means, blood vessel region recognition means, blood vessel region removal means, division means, calculation means, blood flow information calculation means, display means, and determination means.
 記憶部32は、不揮発性の半導体メモリやハードディスク等により構成される。記憶部32は、制御部31で画像解析処理を実行するための画像解析処理プログラムを始めとする各種プログラムやプログラムによる処理の実行に必要なパラメータ、或いは処理結果等のデータを記憶する。これらの各種プログラムは、読取可能なプログラムコードの形態で格納され、制御部31は、当該プログラムコードに従った動作を逐次実行する。 The storage unit 32 is configured by a nonvolatile semiconductor memory, a hard disk, or the like. The storage unit 32 stores various programs including an image analysis processing program for executing image analysis processing by the control unit 31, parameters necessary for execution of processing by the program, or data such as processing results. These various programs are stored in the form of readable program codes, and the control unit 31 sequentially executes operations according to the program codes.
　操作部33は、カーソルキー、数字入力キー、及び各種機能キー等を備えたキーボードと、マウス等のポインティングデバイスを備えて構成され、キーボードに対するキー操作やマウス操作により入力された指示信号を制御部31に出力する。また、操作部33は、表示部34の表示画面にタッチパネルを備えても良く、この場合、タッチパネルを介して入力された指示信号を制御部31に出力する。 The operation unit 33 includes a keyboard having cursor keys, numeric input keys, various function keys, and the like, and a pointing device such as a mouse, and outputs instruction signals input by key operations on the keyboard or mouse operations to the control unit 31. The operation unit 33 may also include a touch panel on the display screen of the display unit 34; in this case, it outputs instruction signals input via the touch panel to the control unit 31.
 表示部34は、LCDやCRT等のモニタにより構成され、表示手段としての制御部31から入力される表示信号の指示に従って、操作部33からの入力指示やデータ等を表示する。 The display unit 34 is composed of a monitor such as an LCD or CRT, and displays an input instruction, data, or the like from the operation unit 33 in accordance with an instruction of a display signal input from the control unit 31 as a display unit.
 通信部35は、LANアダプタやモデムやTA等を備え、通信ネットワークNTに接続された各装置との間のデータ送受信を制御する。 The communication unit 35 includes a LAN adapter, a modem, a TA, and the like, and controls data transmission / reception with each device connected to the communication network NT.
 〔動態画像処理システム100の動作〕
 次に、上記動態画像処理システム100における動作について説明する。
[Operation of Dynamic Image Processing System 100]
Next, the operation in the dynamic image processing system 100 will be described.
 (撮影装置1、撮影用コンソール2の動作)
 まず、撮影装置1、撮影用コンソール2による撮影動作について説明する。
(Operation of the photographing apparatus 1 and the photographing console 2)
First, the photographing operation by the photographing apparatus 1 and the photographing console 2 will be described.
　図3に、撮影用コンソール2の制御部21において実行される撮影制御処理を示す。撮影制御処理は、制御部21と記憶部22に記憶されている撮影制御処理プログラムとの協働により実行される。 FIG. 3 shows the imaging control process executed by the control unit 21 of the imaging console 2. The imaging control process is executed by the control unit 21 in cooperation with the imaging control processing program stored in the storage unit 22.
 まず、撮影技師により撮影用コンソール2の操作部23が操作され、撮影対象(被写体M)の患者情報(患者の氏名、身長、体重、年齢、性別等)の入力が行われる(ステップS1)。 First, the operation section 23 of the imaging console 2 is operated by the imaging technician, and patient information (patient name, height, weight, age, sex, etc.) of the imaging target (subject M) is input (step S1).
 次いで、放射線照射条件が記憶部22から読み出されて放射線照射制御装置12に設定されるとともに、画像読取条件が記憶部22から読み出されて読取制御装置14に設定される(ステップS2)。ここで、フレームレート(パルスレート)としては、人間の心拍周期を考慮して7.5フレーム/秒以上とすることが好ましい。 Next, the radiation irradiation conditions are read from the storage unit 22 and set in the radiation irradiation control device 12, and the image reading conditions are read from the storage unit 22 and set in the reading control device 14 (step S2). Here, the frame rate (pulse rate) is preferably set to 7.5 frames / second or more in consideration of a human heartbeat cycle.
 次いで、操作部23の操作による放射線照射の指示が待機される。 Next, a radiation irradiation instruction by the operation of the operation unit 23 is on standby.
 撮影技師は、撮影装置1において患者のポジショニング等の撮影準備を行う。具体的には、被写体M(患者)の背丈に応じて、図示しないフットスイッチにより、放射線検出器131が装着された検出器保持部132の高さを調整し、被写体Mを検出器保持部132の患者固定部134により固定する。また、被写体Mに体動補正用のX線非透過マーカを2箇所以上(ここでは、マーカM1、マーカM2の2箇所)貼付する。そして、患者にリラックスした状態で、安静換気(呼吸)を行うよう指示する。また、横隔膜及び肋骨の移動のほとんどない動態画像を得るため、患者に息を止めるよう指示してもよい。 The imaging engineer prepares for imaging such as patient positioning in the imaging apparatus 1. Specifically, according to the height of the subject M (patient), the height of the detector holding unit 132 to which the radiation detector 131 is attached is adjusted by a foot switch (not shown), and the subject M is detected by the detector holding unit 132. The patient is fixed by the patient fixing part 134. Also, two or more X-ray non-transparent markers for body motion correction (here, two places, marker M1 and marker M2) are attached to the subject M. Then, the patient is instructed to perform rest ventilation (breathing) in a relaxed state. The patient may also be instructed to hold his breath to obtain a dynamic image with little movement of the diaphragm and ribs.
　撮影装置1においては、撮影技師の操作により検出器保持部132の高さが調整されると、測距センサSE1により床~放射線検出器131中央までの距離が取得され、読取制御装置14に出力される。読取制御装置14においては、測距センサSE1の出力値が高さ情報として放射線照射制御装置12に出力される。放射線照射制御装置12においては、測距センサSE2から出力される床~放射線源111の焦点位置までの距離の値が読取制御装置14から出力された値と同じになるように図示しない駆動機構が駆動され、放射線源保持部112の高さが調整される。 In the imaging apparatus 1, when the height of the detector holding unit 132 is adjusted by the radiographer's operation, the distance from the floor to the center of the radiation detector 131 is acquired by the distance measuring sensor SE1 and output to the reading control device 14. The reading control device 14 outputs the output value of the distance measuring sensor SE1 to the radiation irradiation control device 12 as height information. In the radiation irradiation control device 12, a drive mechanism (not shown) is driven so that the distance from the floor to the focal position of the radiation source 111 output from the distance measuring sensor SE2 becomes equal to the value output from the reading control device 14, and the height of the radiation source holding unit 112 is adjusted.
 被写体Mのポジショニングが終了すると、撮影技師は、撮影用コンソール2の操作部23により放射線照射指示を入力する。 When the positioning of the subject M is completed, the imaging engineer inputs a radiation irradiation instruction through the operation unit 23 of the imaging console 2.
 操作部23により放射線照射指示が入力されると(ステップS3;YES)、放射線照射制御装置12及び読取制御装置14に撮影開始指示が出力され、動態撮影が開始される(ステップS4)。即ち、放射線照射制御装置12に設定されたパルス間隔で放射線源111により放射線が照射され、放射線検出器131によりフレーム画像が取得される。動態撮影開始から予め定められた時間が経過すると、制御部21により放射線照射制御装置12及び読取制御装置14に撮影終了の指示が出力され、撮影動作が停止される。 When a radiation irradiation instruction is input by the operation unit 23 (step S3; YES), a photographing start instruction is output to the radiation irradiation control device 12 and the reading control device 14, and dynamic photographing is started (step S4). That is, radiation is emitted from the radiation source 111 at a pulse interval set in the radiation irradiation control device 12, and a frame image is acquired by the radiation detector 131. When a predetermined time has elapsed from the start of dynamic imaging, the control unit 21 outputs an instruction to end imaging to the radiation irradiation control device 12 and the reading control device 14, and the imaging operation is stopped.
　撮影により取得されたフレーム画像は順次撮影用コンソール2に入力され、上述の暗画像を用いたオフセット補正処理、ゲイン補正処理、欠陥画素補正処理、ラグ(残像)補正処理等の補正処理が必要に応じて行われる(ステップS5)。ただし、処理時間の短縮化を優先するためにこれらの補正処理を省略しても良く、その場合は、処理はステップS5をスキップして、ステップS6に移行する。次に、フレーム画像は撮影順を示す番号と対応付けて記憶部22に記憶されるとともに(ステップS6)、表示部24に表示される(ステップS7)。撮影技師は、表示された動態画像によりポジショニング等を確認し、撮影により診断に適した画像が取得された(撮影OK)か、再撮影が必要(撮影NG)か、を判断する。そして、操作部23を操作して、判断結果を入力する。 Frame images acquired by imaging are sequentially input to the imaging console 2, and correction processes such as the offset correction using the dark image described above, gain correction, defective pixel correction, and lag (afterimage) correction are performed as necessary (step S5). However, these correction processes may be omitted to give priority to shortening the processing time; in that case, step S5 is skipped and the process proceeds to step S6. Next, each frame image is stored in the storage unit 22 in association with a number indicating the imaging order (step S6) and displayed on the display unit 24 (step S7). The radiographer confirms the positioning and the like on the displayed dynamic image, and determines whether an image suitable for diagnosis has been acquired (imaging OK) or re-imaging is necessary (imaging NG). The radiographer then operates the operation unit 23 to input the determination result.
　操作部23の所定の操作により撮影OKを示す判断結果が入力されると(ステップS8;YES)、動態撮影で取得された一連のフレーム画像のそれぞれに、動態画像を識別するための識別IDや、患者情報、検査対象部位、放射線照射条件、画像読取条件、撮影順を示す番号等の情報が付帯され(例えば、DICOM形式で画像データのヘッダ領域に書き込まれ)、通信部25を介して診断用コンソール3に送信される(ステップS9)。そして、本処理は終了する。一方、操作部23の所定の操作により撮影NGを示す判断結果が入力されると(ステップS8;NO)、記憶部22に記憶された一連のフレーム画像が削除され(ステップS10)、本処理は終了する。 When a determination result indicating imaging OK is input by a predetermined operation of the operation unit 23 (step S8; YES), information such as an identification ID for identifying the dynamic image, patient information, the examination target region, the radiation irradiation conditions, the image reading conditions, and a number indicating the imaging order is attached to each of the series of frame images acquired by the dynamic imaging (for example, written in the header area of the image data in DICOM format), and the images are transmitted to the diagnostic console 3 via the communication unit 25 (step S9). This process then ends. On the other hand, when a determination result indicating imaging NG is input by a predetermined operation of the operation unit 23 (step S8; NO), the series of frame images stored in the storage unit 22 is deleted (step S10), and this process ends.
 (診断用コンソール3の動作)
 次に、診断用コンソール3における動作について説明する。
(Operation of diagnostic console 3)
Next, the operation in the diagnostic console 3 will be described.
　診断用コンソール3においては、通信部35を介して撮影用コンソール2から動態画像の一連のフレーム画像が受信されると、制御部31と記憶部32に記憶されている画像解析処理プログラムとの協働により図4に示す画像解析処理が実行される。 In the diagnostic console 3, when a series of frame images of a dynamic image is received from the imaging console 2 via the communication unit 35, the image analysis process shown in FIG. 4 is executed by the control unit 31 in cooperation with the image analysis processing program stored in the storage unit 32.
 画像解析処理においては、まず、前処理が行われる(ステップS11)。 In the image analysis processing, first, preprocessing is performed (step S11).
 図5に、ステップS11において実行される前処理のフローチャートを示す。 FIG. 5 shows a flowchart of the preprocessing executed in step S11.
　前処理においては、まず、対数変換処理が行われ、動態画像の各フレーム画像の各画素の画素値(濃度値。以下、信号値という)が真数から対数に変換される(ステップS101)。 In the preprocessing, first, logarithmic conversion is performed, and the pixel value (density value; hereinafter referred to as the signal value) of each pixel of each frame image of the dynamic image is converted from a linear (antilogarithmic) value to a logarithmic value (step S101).
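The logarithmic conversion of step S101 can be sketched as follows. This is a minimal NumPy illustration, not the patent's actual implementation; the log base and the `eps` guard against log(0) are assumptions.

```python
import numpy as np

def to_log_scale(frames, eps=1.0):
    """Convert frame signal values from linear (antilog) to logarithmic scale.

    frames: array-like of shape (num_frames, H, W) holding raw signal values.
    eps:    floor applied before the logarithm to avoid log(0).
    """
    frames = np.asarray(frames, dtype=np.float64)
    # Base-10 logarithm; any fixed base would serve for this preprocessing.
    return np.log10(np.maximum(frames, eps))
```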
 次いで、時間変化濃度補正処理(各フレーム画像取得時の照射線量バラツキ影響を除去する為の処理)が行われ、各フレーム画像の直接X線領域の信号値が同じ値となるように補正される(ステップS102)。 Next, time-varying density correction processing (processing for removing the influence of irradiation dose variation at the time of acquisition of each frame image) is performed, and correction is performed so that the signal value of the direct X-ray region of each frame image becomes the same value. (Step S102).
 ステップS102においては、まず、撮影用コンソール2から入力された一連のフレーム画像の中から基準フレーム画像を選択し、以下の(式1)により各フレーム画像の補正値を算出する。
(式1)各フレーム画像の補正値=(各フレーム画像の直接X線領域の平均信号値)-(基準フレーム画像の直接X線領域の平均信号値)
 次いで、各フレーム画像において、画素毎に、信号値から(式1)により算出された補正値が減算される。
In step S102, first, a reference frame image is selected from a series of frame images input from the photographing console 2, and a correction value of each frame image is calculated by the following (Equation 1).
(Expression 1) Correction value of each frame image = (Average signal value of direct X-ray region of each frame image) − (Average signal value of direct X-ray region of the reference frame image)
Next, in each frame image, the correction value calculated by (Equation 1) is subtracted from the signal value for each pixel.
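The time-varying density correction of (Equation 1) can be sketched as follows. This is a minimal NumPy illustration; the `direct_mask` argument marking the direct X-ray region and the choice of the first frame as the reference frame are assumptions made for the sketch.

```python
import numpy as np

def density_correct(frames, direct_mask, ref_index=0):
    """Time-varying density correction following (Equation 1).

    frames:      (N, H, W) array of log-converted signal values.
    direct_mask: boolean (H, W) mask of the direct X-ray region.
    ref_index:   index of the reference frame image.
    Each frame's correction value is its direct-region mean minus the
    reference frame's direct-region mean; it is subtracted from every pixel.
    """
    frames = np.asarray(frames, dtype=np.float64)
    ref_mean = frames[ref_index][direct_mask].mean()
    corrected = np.empty_like(frames)
    for i, frame in enumerate(frames):
        correction = frame[direct_mask].mean() - ref_mean  # (Equation 1)
        corrected[i] = frame - correction
    return corrected
```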
　次いで、各フレーム画像に対し、グリッド目除去処理が行われる(ステップS103)。グリッド目除去処理は、散乱放射線を除去するために設けられた散乱線除去グリッドのグリッド配列に起因する縞模様を除去する処理である。グリッド目除去処理は、公知の技術を用いて行うことができる。例えば、各フレーム画像に離散的フーリエ変換等の周波数変換処理を施した後、ローパスフィルタ処理を行ってグリッド像の周波数を含む高周波数領域を除去し、逆フーリエ変換処理を施すことにより行うことができる(石田隆行著「医用画像処理入門」3.4X線画像のグリッドによる縦縞影の除去参照)。 Next, grid line removal processing is performed on each frame image (step S103). The grid line removal processing removes the stripe pattern caused by the grid arrangement of the anti-scatter grid provided to remove scattered radiation. It can be performed using a known technique: for example, by applying a frequency transform such as a discrete Fourier transform to each frame image, removing the high-frequency region containing the frequency of the grid image with a low-pass filter, and then applying an inverse Fourier transform (see Takayuki Ishida, "Introduction to Medical Image Processing", Section 3.4, removal of vertical stripe shadows caused by the grid in X-ray images).
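The FFT-based grid line removal described above can be sketched as follows. This is a minimal NumPy illustration; the `cutoff` radius is an assumed parameter that would in practice be chosen below the grid frequency.

```python
import numpy as np

def remove_grid_lines(image, cutoff=0.25):
    """Suppress grid stripes with an FFT low-pass filter.

    image:  2D array of signal values.
    cutoff: normalized frequency radius to keep (assumed parameter).
    """
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    yy, xx = np.ogrid[:h, :w]
    # Normalized distance of each frequency bin from the spectrum center.
    dist = np.hypot((yy - h / 2) / h, (xx - w / 2) / w)
    # Zero out the high-frequency region containing the grid image.
    spectrum[dist > cutoff] = 0
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum)))
```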
　次いで、各フレーム画像に対し、体動補正処理が行われる(ステップS104)。体動補正処理では、全フレーム画像の2箇所のX線非透過マーカM1、M2の線分が一致するように、各フレーム画像が回転、平行移動され、位置合わせが行われる。 Next, body motion correction processing is performed on each frame image (step S104). In the body motion correction processing, each frame image is rotated and translated so that the line segment connecting the two radiopaque markers M1 and M2 coincides across all frame images, thereby aligning the images.
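The rotation and translation that align the marker line segments can be estimated as follows. This is a minimal NumPy sketch under the assumption that the marker coordinates have already been detected; only the transform parameters are computed, and the image resampling step is omitted.

```python
import numpy as np

def rigid_transform_from_markers(ref_m1, ref_m2, m1, m2):
    """Rotation matrix and translation mapping a frame's markers (m1, m2)
    onto the reference frame's markers (ref_m1, ref_m2).

    Points are (x, y) pairs. Only rotation + translation are estimated,
    matching the alignment described for the body motion correction.
    """
    ref_m1, ref_m2 = np.asarray(ref_m1, float), np.asarray(ref_m2, float)
    m1, m2 = np.asarray(m1, float), np.asarray(m2, float)
    # Angle between the two marker line segments.
    v_ref, v = ref_m2 - ref_m1, m2 - m1
    theta = np.arctan2(v_ref[1], v_ref[0]) - np.arctan2(v[1], v[0])
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    # Translation sending the rotated m1 onto ref_m1.
    t = ref_m1 - rot @ m1
    return rot, t
```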
 次いで、各フレーム画像に対し、肺野領域抽出分割処理が行われる(ステップS105)。 Next, lung field extraction / division processing is performed on each frame image (step S105).
 肺野領域抽出分割処理は、以下の(1)~(4)の手順で行われる。
(1)一連のフレーム画像の中から任意のフレーム画像(ここでは、撮影順が一番のフレーム画像とする)が基準画像として設定される。
(2)基準画像から肺野領域が抽出され、抽出された肺野領域に外接する略矩形の領域が図6の点線で示すように複数の小領域A1に分割される。分割する小領域A1は、例えば0.4~4cm角である。
The lung field extraction / division process is performed according to the following procedures (1) to (4).
(1) An arbitrary frame image (in this case, the frame image with the first shooting order) is set as a reference image from a series of frame images.
(2) A lung field region is extracted from the reference image, and a substantially rectangular region circumscribing the extracted lung field region is divided into a plurality of small regions A1 as indicated by dotted lines in FIG. The small area A1 to be divided is, for example, 0.4 to 4 cm square.
 Any method may be used to extract the lung field region. For example, a threshold is obtained by discriminant analysis from the histogram of signal values of the reference image, and regions with signals higher than this threshold are primarily extracted as lung field region candidates. Edge detection is then performed near the boundary of the primarily extracted candidates, and the boundary of the lung field region can be extracted by tracing, along the boundary, the points where the edge is strongest within small regions near the boundary.
(3) The position corresponding to the center point of each small region A1 of the reference image is extracted from the other frame images by local matching processing.
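The discriminant-analysis thresholding used for the primary extraction can be sketched as follows (an Otsu-style threshold on the signal-value histogram; the bimodal toy image and the bin count are assumptions for illustration):

```python
import numpy as np

def otsu_threshold(image, bins=256):
    """Discriminant-analysis (Otsu) threshold on the signal-value histogram.
    Pixels above the returned threshold are the primary lung field candidates."""
    hist, edges = np.histogram(image, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    total = hist.sum()
    sum_all = (hist * centers).sum()
    best_t, best_var = centers[0], -1.0
    w0 = 0.0
    sum0 = 0.0
    for i in range(bins - 1):
        w0 += hist[i]
        sum0 += hist[i] * centers[i]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        between_var = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if between_var > best_var:
            best_var, best_t = between_var, centers[i]
    return best_t

# Bimodal toy image: dark mediastinum (~30) and bright lung field (~200)
rng = np.random.default_rng(0)
image = np.where(rng.random((64, 64)) < 0.5, 30.0, 200.0)
t = otsu_threshold(image)
candidates = image > t  # primary lung field candidate mask
```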
 The local matching processing can be performed, for example, by the method described in JP 2001-157667 A. Specifically, a counter n is initialized to 1, and a search region is set in the (n+1)-th frame image (in imaging order) for each small region A1 of the n-th frame image. Each search region has the same center point (x, y) as the corresponding small region A1 of the n-th frame image, with larger vertical and horizontal dimensions (for example, twice as large in each direction). Then, for each small region A1 of the n-th frame image, the position within the search region of the (n+1)-th frame image at which the degree of matching is highest is calculated as the corresponding position on the (n+1)-th frame image. As the matching index, the least squares criterion or the cross-correlation coefficient is used.
 When the local matching processing has determined which position (small region) of the (n+1)-th frame image corresponds to each small region A1 of the n-th frame image, the coordinates (x′, y′) of the center point of each corresponding region in the (n+1)-th frame image are obtained as the positions corresponding to the center-point coordinates (x, y) of the small regions A1 of the n-th frame image and are stored in the RAM. The counter n is incremented, and the processing is repeated until (n+1) exceeds the number of frame images.
(4) As indicated by the solid lines in FIG. 6, each frame image is divided into new small regions A2 whose vertices are the center points of the small regions A1.
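The search in step (3) — finding, for each small region, the best-matching position inside the enlarged search region of the next frame — can be sketched with a least-squares (sum of squared differences) criterion; the array sizes below are illustrative assumptions:

```python
import numpy as np

def local_match(region, search_area):
    """Offset (dy, dx) of `region` inside the larger `search_area` that
    minimises the sum of squared differences (least-squares matching)."""
    rh, rw = region.shape
    sh, sw = search_area.shape
    best, best_off = np.inf, (0, 0)
    for dy in range(sh - rh + 1):
        for dx in range(sw - rw + 1):
            window = search_area[dy:dy + rh, dx:dx + rw]
            ssd = np.sum((window - region) ** 2)
            if ssd < best:
                best, best_off = ssd, (dy, dx)
    return best_off

rng = np.random.default_rng(1)
frame_n1 = rng.random((40, 40))                 # the (n+1)-th frame
region = frame_n1[12:20, 17:25].copy()          # structure sits at (12, 17)
offset = local_match(region, frame_n1[8:28, 13:33])  # search around old position
```

A cross-correlation score could be maximised instead of minimising the SSD, matching the other index mentioned in the text.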
 The local matching processing may be performed on the frame images themselves, or on images whose edges have been extracted (enhanced) with a gradient filter or the like.
 Furthermore, the local matching processing may be performed on images in which the pulmonary vessel shadows have been enhanced or extracted. As a technique for enhancing pulmonary vessel shadows, for example, as described in "Filter bank construction for detecting circular and linear patterns in medical images" (IEICE Transactions D-II, Vol. J87-D-II, No. 1, pp. 175-185), after multi-resolution decomposition, an image in which the pulmonary vessel shadows are enhanced can be obtained by extracting, from images in which circular/linear patterns are selectively enhanced at each resolution level, the image at the resolution level corresponding to the pulmonary vessels. Alternatively, as described in "Automatic extraction procedure of blood vessel shadows from chest X-ray images using a deformable model" (MEDICAL IMAGING TECHNOLOGY, Vol. 17, No. 5, September 1999), pulmonary vessel shadows can be extracted by using, as a model of the vascular structure, a tree-shaped figure model that allows deformations such as extension, merging, and division in consideration of the shape of the vessel shadows, and extracting the tree structure that maximizes an evaluation function.
 In this case, the small regions may be obtained by dividing the reference image at equal intervals, or by dividing it into rectangular regions centered on structurally characteristic points such as branch points of the extracted pulmonary vessels. Further, since the clavicle/rib regions move differently from the pulmonary vessel shadows, the clavicles/ribs may be recognized in the reference image using a clavicle/rib shape model, and local matching may be performed after dividing into small regions only the part of the lung field region excluding the clavicle/rib regions. When local matching is performed with the reference image divided into rectangular regions centered on structurally characteristic points, or with only part of the lung field region divided into small regions, then after the warping processing described below, the regions depicting the same part of the lung field are associated between the frame images, and the associated lung field regions of the frame images are divided at equal intervals.
 Next, warping processing (non-linear distortion transformation) is applied to each frame image other than the reference image so that the shape of the lung field region of each frame image matches that of the reference image (step S106).
 The warping processing is performed, for example, as follows.
 First, with the counter n initialized to 1, the shift values (Δx, Δy) between the center point (center pixel) of each small region A1 of the n-th frame image and the center point of the corresponding small region A1 of the (n+1)-th frame image are calculated. With the center point of a small region A1 of the n-th frame image denoted (x, y) and the center point of the corresponding small region A1 of the (n+1)-th frame image denoted (x′, y′), each shift value is obtained as (Δx = x′ − x, Δy = y′ − y).
 Next, using the obtained shift values (Δx, Δy), a shift value (Δx, Δy) for every pixel of the (n+1)-th frame image is calculated by approximation with a two-dimensional tenth-order polynomial. Each pixel of the (n+1)-th frame image is then shifted based on its shift value, and a new frame image is created separately from the frame image before the warping processing. The correspondence between pixels before and after the warping processing is stored in the RAM for each frame image, which makes it possible to determine which pre-warping pixel each post-warping pixel corresponds to.
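The polynomial approximation of the shift field can be sketched as follows (for brevity a second-order polynomial is fitted instead of the tenth-order one described in the text; the helper names and the synthetic linear shift field are illustrative assumptions):

```python
import numpy as np

def fit_shift_field(centers, shifts, order=2):
    """Fit a 2-D polynomial of the given degree to the per-block shift values
    so that a shift can be interpolated at every pixel position."""
    def design(pts):
        x, y = pts[:, 0], pts[:, 1]
        return np.column_stack([x ** i * y ** j
                                for i in range(order + 1)
                                for j in range(order + 1 - i)])
    coef, *_ = np.linalg.lstsq(design(centers), shifts, rcond=None)
    return lambda pts: design(np.atleast_2d(pts)) @ coef

# Block-center positions and a known linear shift field to recover:
centers = np.array([[x, y] for x in range(0, 50, 10)
                           for y in range(0, 50, 10)], float)
true_shift = np.column_stack([0.1 * centers[:, 0], 0.05 * centers[:, 1]])
shift_at = fit_shift_field(centers, true_shift)
predicted = shift_at(np.array([25.0, 25.0]))[0]  # interpolated per-pixel shift
```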
 The counter n is then incremented by 1, and the above processing is repeated until (n+1) exceeds the number of frame images.
 Then, an integrated image G is created by adding the pixel signal values of corresponding regions of the reference image and of the frame images created by the warping processing (step S107); the preprocessing ends, and the flow proceeds to step S12 of FIG. 4. During breath holding, the displacement of the pulmonary vessel positions due to pulsation is very small, so the local matching and warping processing may be skipped and the integrated image G created by adding the signal values of pixels at the same detector positions, thereby shortening the processing time. The integrated image G has reduced noise, making the blood vessel regions easy to recognize. The integrated image G may also be created by adding the signal values of corresponding pixels of the frame images and then dividing by the number of frames to obtain an average.
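The noise reduction obtained by integrating the aligned frames can be illustrated as follows (synthetic data; averaging K uncorrelated-noise frames reduces the noise level by roughly a factor of √K — the averaging variant mentioned in the text):

```python
import numpy as np

rng = np.random.default_rng(2)
clean = np.tile(np.linspace(50, 150, 32), (32, 1))           # underlying anatomy
frames = [clean + rng.normal(0, 10, clean.shape) for _ in range(25)]

# Integrated image G: per-pixel average of the aligned frames
integrated = np.mean(frames, axis=0)

noise_before = np.std(frames[0] - clean)   # ~10 in a single frame
noise_after = np.std(integrated - clean)   # ~10 / sqrt(25) = ~2
```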
 In step S12 of FIG. 4, blood vessel recognition processing is applied to the integrated image G (step S12). The blood vessels referred to here are the relatively thick pulmonary vessels into which the blood flowing through the peripheral vessels of the lung field is collected: for example, the right interlobar pulmonary artery, left interlobar pulmonary artery, right superior pulmonary vein, middle lobe vein, right inferior pulmonary vein, left superior pulmonary vein, left inferior pulmonary vein, the pulmonary arteries A1 to A10, and the pulmonary veins V1 to V10 (see Masaki Kuwahara and Toshinari Yamaoka, "Fundamentals of Chest Radiograph Interpretation Learned with DVD 3D Images," p. 29).
 Before the blood vessel recognition processing is executed, it is preferable to display on the display unit 34 a GUI for the operator to designate the region in which blood vessels are to be recognized, and to perform the blood vessel recognition processing only in the region designated through this GUI; this reduces the time required for the processing and reduces misrecognition. As an example of such a GUI, as shown in FIG. 7, the integrated image G is displayed on the display unit 34, and when two vertices are designated with the operation unit 33, the rectangular region having the designated vertices on its diagonal is set as the region for blood vessel recognition. Alternatively, when three or more vertices are designated with the operation unit 33, the polygonal region enclosed by the designated vertices may be set as the region for blood vessel recognition. If the designated region extends beyond the lung field region, the blood vessel recognition processing is applied only within the lung field region.
 As the blood vessel recognition processing, for example, as described above, the method of extracting linear structures using the maximum eigenvalue computed from the elements of the Hessian matrix ("Filter bank construction for detecting circular and linear patterns in medical images," IEICE Transactions D-II, Vol. J87-D-II, No. 1, pp. 175-185) can be used. By constructing a band-division filter bank that generates the elements of the Hessian matrix, linear structures can be extracted at each resolution level, so that blood vessel regions of different sizes (diameters) can be extracted; by performing vessel extraction at resolution levels coarser than a predetermined threshold, only vessels of at least a predetermined diameter are extracted. With this method, the clavicles/ribs are extracted together with the vessel shadows, so the ribs are recognized using a clavicle/rib shape model and removed from the extraction result. Alternatively, as described above, vessel shadows can also be extracted by the method that uses a deformable tree-shaped figure model as the model of the vascular structure and extracts the tree structure ("Automatic extraction procedure of blood vessel shadows from chest X-ray images using a deformable model," MEDICAL IMAGING TECHNOLOGY, Vol. 17, No. 5, September 1999). The regions extracted by these techniques are recognized as blood vessel regions.
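A minimal single-scale sketch of Hessian-eigenvalue line detection (the cited method uses a multi-resolution filter bank; the finite-difference Hessian, the synthetic image, and the 0.5·max threshold here are simplifying assumptions):

```python
import numpy as np

def hessian_max_eigenvalue(image):
    """Largest eigenvalue of the 2x2 Hessian at each pixel. For a dark line on
    a bright background it is large and positive across the line direction."""
    gy, gx = np.gradient(image)          # first derivatives (rows, cols)
    gyy, _ = np.gradient(gy)
    gxy, gxx = np.gradient(gx)           # mixed and second column derivative
    # Closed-form larger eigenvalue of [[gxx, gxy], [gxy, gyy]]
    tr = gxx + gyy
    det_root = np.sqrt(((gxx - gyy) / 2) ** 2 + gxy ** 2)
    return tr / 2 + det_root

image = np.full((32, 32), 200.0)
image[:, 15:17] = 100.0                  # a dark vertical "vessel", 2 px wide
response = hessian_max_eigenvalue(image)
vessel_mask = response > 0.5 * response.max()
```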
 Alternatively, a GUI for the operator to designate part of the blood vessel to be recognized may be displayed on the display unit 34, and the blood vessel recognition processing may be performed based on the designated position information. In this case, the blood vessel recognition processing can be performed as follows.
 First, the integrated image G is displayed on the display unit 34. As shown in FIG. 8(a), when part of the blood vessel to be recognized (denoted BV in FIGS. 8(a) to 8(f)) is designated on the integrated image G with the operation unit 33 along the running direction of the vessel, the blood vessel recognition processing generates a signal value profile perpendicular to the designated running direction (the arrow direction in FIG. 8(b)), as shown in FIGS. 8(b) and 8(c). Next, as shown in FIG. 8(c), the width W of the section that contains the designated position P and in which the signal value falls below the surroundings by at least a predetermined threshold TH1 is calculated by counting pixels. If the calculated width W is at least a predetermined value, this low-signal region is recognized as a blood vessel region. When a vessel is recognized, an ROI centered on an end of the recognized vessel region is set, as shown in FIG. 8(d), and binarization is performed using a threshold calculated by discriminant analysis or the like from the histogram of signal values within the ROI. The size of the ROI is determined according to the width W; for example, an ROI with a side length of α × W (α being a predetermined coefficient) is set. Then, as shown in FIG. 8(e), the low-signal region that contains the center of the ROI and lies at or below the calculated threshold is taken as a vessel candidate region, and if the size of this candidate region falls within a predetermined range relative to the area of the ROI, it is recognized as a blood vessel region. Further, when a newly recognized vessel region touches an edge of the ROI, a new ROI centered on that edge is set, as shown in FIG. 8(f), and the recognition of vessel regions is repeated in the same manner.
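The width measurement around the designated position P can be sketched as follows (the background level is estimated here by the profile median, which is an assumption; the description only requires a drop of at least TH1 below the surroundings):

```python
import numpy as np

def vessel_width(profile, p, th1):
    """Width (pixel count) of the contiguous low-signal run containing the
    designated position p, where "low" means more than th1 below the
    surrounding background level (estimated by the profile median)."""
    background = np.median(profile)
    low = profile < background - th1
    if not low[p]:
        return 0
    left = p
    while left > 0 and low[left - 1]:
        left -= 1
    right = p
    while right < len(profile) - 1 and low[right + 1]:
        right += 1
    return right - left + 1

# Perpendicular signal profile with a vessel dip around position 4:
profile = np.array([200, 198, 201, 150, 140, 145, 155, 199, 202, 200], float)
width = vessel_width(profile, p=4, th1=30)  # counts the 4-pixel dip
```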
 Next, blood vessel region removal processing is executed on each frame image after the warping processing: the signal value of each pixel in the region at the same position as the blood vessel region recognized in the integrated image G is replaced with the average signal value of the non-vessel region near that pixel (step S13).
 In the blood vessel region removal processing, for example, for a block of N × N pixels (N being an odd number such as 5 or 11) centered on a vessel-region pixel, the average of the pixels excluding the vessel region and the region outside the lung field is computed, and the signal value of each pixel in the vessel region is replaced with this average. For example, using a 5 × 5 pixel block, if the shaded area in FIG. 9 is the vessel region, the central pixel P22 is replaced with the average of the signal values of P00, P01, P02, P10, P11, P12, P20, P21, P30, P31, P40, and P41. When N is small, the replacement may instead proceed iteratively: first, signal values are replaced only for the vessel-region pixels bordering the non-vessel region; then, among the pixels not yet replaced, only those bordering already-replaced pixels are replaced; this is repeated until the signal values of all pixels in the vessel region have been replaced.
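A sketch of the N × N neighbourhood replacement (for simplicity the mask marks only the vessel region; in the described processing, pixels outside the lung field would also be excluded from the average):

```python
import numpy as np

def remove_vessel(image, vessel_mask, n=5):
    """Replace each vessel pixel by the mean of the non-vessel pixels inside
    its N x N neighbourhood (vessel pixels excluded from the average)."""
    out = image.copy()
    h = n // 2
    ys, xs = np.nonzero(vessel_mask)
    for y, x in zip(ys, xs):
        y0, y1 = max(0, y - h), min(image.shape[0], y + h + 1)
        x0, x1 = max(0, x - h), min(image.shape[1], x + h + 1)
        block = image[y0:y1, x0:x1]
        keep = ~vessel_mask[y0:y1, x0:x1]
        if keep.any():
            out[y, x] = block[keep].mean()
    return out

image = np.full((9, 9), 150.0)
mask = np.zeros((9, 9), bool)
mask[:, 4] = True                 # a one-pixel-wide vertical vessel
image[mask] = 80.0                # vessels are low-signal
cleaned = remove_vessel(image, mask)  # vessel filled with surrounding level
```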
 By the processing of step S13, the blood vessel region is removed from the lung field region, as shown in FIG. 10.
 Next, a high-pass filter is applied to each frame image from which the blood vessel region has been removed, and only the high-frequency components are extracted (step S14). For example, with a cutoff frequency of 1 Hz, only high-frequency components of 1 Hz or more, corresponding to the human cardiac cycle, are extracted.
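A sketch of the 1 Hz temporal high-pass applied to a pixel's (or block's) time series (FFT-based zeroing of the sub-cutoff bins; the 15 fps frame rate and the synthetic respiratory/cardiac components are assumptions):

```python
import numpy as np

def highpass(series, fps, cutoff_hz=1.0):
    """FFT high-pass: zero every frequency component below cutoff_hz
    (the 1 Hz cut-off matching the cardiac band) and transform back."""
    spectrum = np.fft.rfft(series)
    freqs = np.fft.rfftfreq(len(series), d=1.0 / fps)
    spectrum[freqs < cutoff_hz] = 0.0
    return np.fft.irfft(spectrum, n=len(series))

fps = 15.0                                        # assumed frame rate
t = np.arange(0, 4, 1.0 / fps)                    # 4 s of frames
breathing = 20.0 * np.sin(2 * np.pi * 0.25 * t)   # 0.25 Hz respiratory component
heartbeat = 5.0 * np.sin(2 * np.pi * 1.25 * t)    # 1.25 Hz cardiac component
filtered = highpass(breathing + heartbeat, fps)   # respiration suppressed
```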
 Next, in each frame image after the high-pass filtering, the lung field region is divided into a plurality of block regions (step S15); for each block region, the signal values within the region are summed, and the temporal change of the summed signal value (referred to as the added signal value) is calculated (step S16). For example, the lung field is divided into six block regions: upper right lung field R1, middle right lung field R2, lower right lung field R3, upper left lung field L1, middle left lung field L2, and lower left lung field L3 (see FIG. 11); the added signal value is calculated for each block region, and its temporal change is obtained.
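The block division and per-block signal summation of steps S15 and S16 can be sketched as follows (a 3 × 2 grid matching the upper/middle/lower, right/left layout; the equal-sized grid over the image is an assumption):

```python
import numpy as np

def block_sums(frame, n_rows=3, n_cols=2):
    """Split the (lung-field) image into n_rows x n_cols blocks and return
    the sum of the pixel signal values in each block."""
    h, w = frame.shape
    ys = np.linspace(0, h, n_rows + 1, dtype=int)
    xs = np.linspace(0, w, n_cols + 1, dtype=int)
    return np.array([[frame[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].sum()
                      for j in range(n_cols)]
                     for i in range(n_rows)])

frame = np.ones((6, 4))       # toy frame: every pixel contributes 1
sums = block_sums(frame)      # added signal value per block region
```

Evaluating `block_sums` on every frame of the series yields, per block, the time series whose change is analysed in the following steps.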
 FIG. 12 shows the temporal change of the added signal value of one block region. In FIG. 12, the horizontal axis represents the elapsed time t from the start of dynamic imaging, and the vertical axis represents the added signal value within the block region.
 Next, the signal change amount corresponding to one cardiac cycle is calculated for each block region (step S17). Specifically, as shown in FIG. 12, from the temporal change of the signal value of each block region, local maxima at or above a predetermined first threshold and local minima at or below a predetermined second threshold are extracted, and the signal change amount for one cardiac cycle is calculated as the difference between each local maximum and the following local minimum. Signal change amounts for one cardiac cycle are calculated over a plurality of cardiac cycles (D0 to D3 in FIG. 12), and their average is computed. The signal value within each block region varies according to the blood flow volume within that region. Moreover, the vessel regions, in which the blood flowing through the peripheral vessels is collected, have been removed from each frame image, and the frequency components of the blood flow have been extracted by the high-pass filter. Therefore, the calculated per-cardiac-cycle signal change amount of each block region serves as information indicating the peripheral-vessel blood flow volume within that block during one cardiac cycle.
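The per-cardiac-cycle change computation of step S17 can be sketched as follows (local maxima at or above the first threshold, paired with the following minima at or below the second, then averaged; the toy series and the threshold values are assumptions):

```python
import numpy as np

def heartbeat_signal_change(series, max_th, min_th):
    """Average drop from each local maximum (>= max_th) to the following
    local minimum (<= min_th): the per-heartbeat signal changes D0, D1, ..."""
    deltas = []
    last_peak = None
    for i in range(1, len(series) - 1):
        if series[i] >= series[i - 1] and series[i] > series[i + 1] \
                and series[i] >= max_th:
            last_peak = series[i]
        elif series[i] <= series[i - 1] and series[i] < series[i + 1] \
                and series[i] <= min_th:
            if last_peak is not None:
                deltas.append(last_peak - series[i])
                last_peak = None
    return np.mean(deltas) if deltas else 0.0

# Two synthetic heartbeats: peaks 10 and 12, following troughs 2 and 0
series = np.array([5, 10, 2, 6, 12, 0, 5], float)
change = heartbeat_signal_change(series, max_th=8, min_th=3)
```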
 Next, for each block region, blood flow information of the peripheral vessels is calculated based on the per-cardiac-cycle signal change amount calculated in step S17 (step S18). Here, as the peripheral-vessel blood flow information of each block region, the ratio of the peripheral-vessel blood flow volume of that block region to the peripheral-vessel blood flow volume of the entire lung field is calculated, for example as (signal change amount of the block region for one cardiac cycle ÷ sum of the signal change amounts of all block regions for one cardiac cycle) × 100 (%).
 Next, the calculated ratios are displayed on the display unit 34 as the peripheral-vessel blood flow information of each block region (step S19).
 FIG. 11 shows an example of the blood flow information displayed on the display unit 34 in step S19. As shown in FIG. 11, the display unit 34 displays the peripheral-vessel blood flow information of each block region as a numerical value and displays the series of frame images as a moving image, in which each block region is rendered in a color whose saturation corresponds to the value of its peripheral-vessel blood flow information (or in a luminance corresponding to that value). By displaying each block region in a color saturation or the like corresponding to its blood flow information, a physician can grasp the distribution of peripheral-vessel blood flow across the block regions at a glance.
 Next, it is determined whether the peripheral-vessel blood flow information of each block region falls below a predetermined threshold; a block region judged to fall below it is judged to be abnormal (step S20). If there is a block region judged abnormal, the corresponding block region is displayed identifiably (for example, with its outline emphasized) on the series of frame images displayed as a moving image (step S21). FIG. 13 shows an example in which block regions judged abnormal are displayed identifiably.
 As described above, according to the diagnostic console 3 of the present invention, the control unit 31 extracts the lung field region from each of a plurality of frame images showing the dynamics of the chest, performs blood vessel region recognition processing in the extracted lung field region, and removes the recognized blood vessel region from each frame image. It then divides the lung field region of each frame image from which the vessel region has been removed into a plurality of block regions, calculates the change amount of the pixel signal values within each block region, calculates, based on these change amounts, the blood flow information of the peripheral vessels present in each block region, and displays it on the display unit 34.
 This prevents the peripheral-vessel blood flow volume from being overestimated due to signal changes on the pulmonary vessels, so the blood flow information of the peripheral vessels in the lung field region can be calculated accurately. In addition, blood flow information equivalent to that of pulmonary perfusion scintigraphy can be obtained without administering a radioisotope into the body of the subject M.
 In the blood vessel region recognition processing, by providing a GUI for the operator to designate part of the blood vessel region to be recognized and performing the recognition processing based on the region designated through the GUI, the pulmonary vessels within the lung field region can be recognized accurately.
 Also, in the blood vessel region recognition processing, by providing a GUI for the operator to designate the region in which the recognition processing is to be performed and performing the processing only in the designated region, the processing time can be shortened and the recognition accuracy improved.
 Furthermore, based on the peripheral-vessel blood flow information of each block region, it is determined whether the peripheral-vessel blood flow in each block region is abnormal, and the position of a block region judged abnormal is displayed identifiably on the frame images displayed as a moving image, so a physician can easily locate sites where the peripheral blood flow is abnormal.
 The description in the present embodiment is an example of a preferred dynamic image processing system according to the present invention, and the present invention is not limited thereto.
 For example, in the above embodiment, as the method of displaying the peripheral-vessel blood flow information, the series of frame images is displayed as a moving image, each block region of the displayed frames is rendered in a color saturation (or luminance) corresponding to the value of its peripheral-vessel blood flow information, and block regions judged abnormal are displayed identifiably. However, a still image, such as one of the frame images (for example, the reference image) or the integrated image G, may be displayed instead, with each block region rendered in a color corresponding to its blood flow information, or with abnormal block regions displayed identifiably, on that image.
 Also, in the above description, an example is disclosed in which a hard disk, semiconductor non-volatile memory, or the like is used as the computer-readable medium for the program according to the present invention, but the invention is not limited to this example. As other computer-readable media, portable recording media such as CD-ROMs can be used. A carrier wave may also be used as a medium for providing the data of the program according to the present invention over a communication line.
 In addition, the detailed configuration and detailed operation of each apparatus constituting the dynamic image processing system 100 can be modified as appropriate without departing from the spirit of the present invention.
DESCRIPTION OF SYMBOLS
100 dynamic image processing system
1 imaging apparatus
11 radiation generating apparatus
111 radiation source
112 radiation source holder
113 support shaft
12 radiation irradiation control apparatus
13 radiation detection unit
131 radiation detector
132 detector holder
133 support shaft
134 patient fixing unit
134a upper left fixing part
134b upper right fixing part
134c lower left fixing part
134d lower right fixing part
14 reading control apparatus
2 imaging console
21 control unit
22 storage unit
23 operation unit
24 display unit
25 communication unit
26 bus
3 diagnostic console
31 control unit
32 storage unit
33 operation unit
34 display unit
35 communication unit
36 bus

Claims (8)

  1.  A dynamic image processing system comprising:
      lung field region extraction means for extracting a lung field region from each of a plurality of frame images showing dynamics of a chest;
      blood vessel region recognition means for performing blood vessel region recognition processing in the extracted lung field region of each frame image;
      blood vessel region removal means for removing, from each frame image, the blood vessel region recognized by the blood vessel region recognition means;
      division means for dividing the lung field region into a plurality of block regions in each frame image from which the blood vessel region has been removed by the blood vessel region removal means;
      calculation means for calculating an amount of change in pixel signal values in each block region between the frame images from which the blood vessel region has been removed by the blood vessel region removal means; and
      blood flow information calculation means for calculating blood flow information of peripheral blood vessels present in each block region, based on the amount of change in pixel signal values in each block region calculated by the calculation means.
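The six claimed means correspond to a straightforward image processing pipeline. The following is a minimal illustrative sketch and not the patented implementation: the function name, the fixed square block size, and the use of the mean absolute inter-frame difference of the block-mean signal as the "blood flow information" are all assumptions made here; the lung field mask and per-frame vessel masks are taken as given inputs (the extraction and recognition steps themselves are not shown).

```python
import numpy as np

def peripheral_blood_flow(frames, lung_mask, vessel_masks, block=4):
    """Hypothetical sketch of the claimed pipeline: given chest frame
    images, a lung field mask, and per-frame vessel masks, exclude
    vessel pixels, divide the lung field into block regions, and derive
    one blood-flow value per block from inter-frame signal change."""
    frames = np.asarray(frames, dtype=float)
    # Vessel "removal" here simply excludes vessel pixels from averaging.
    valid = lung_mask[None, ...] & ~np.asarray(vessel_masks)
    T, H, W = frames.shape
    bh, bw = H // block, W // block
    flow = np.zeros((bh, bw))
    for i in range(bh):
        for j in range(bw):
            sl = (slice(i * block, (i + 1) * block),
                  slice(j * block, (j + 1) * block))
            # Block-mean signal per frame, over valid (non-vessel) pixels.
            means = []
            for t in range(T):
                m = valid[t][sl]
                if m.any():
                    means.append(frames[t][sl][m].mean())
            # Inter-frame change of the block-mean signal as flow info
            # (assumption: mean absolute difference between frames).
            if len(means) >= 2:
                flow[i, j] = np.abs(np.diff(means)).mean()
    return flow
```

Larger inter-frame signal change in a block would then indicate stronger peripheral blood flow in that block region.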
  2.  The dynamic image processing system according to claim 1, further comprising operation means for an operator to designate a part of the blood vessel region to be recognized by the blood vessel region recognition means,
      wherein the blood vessel region recognition means performs the blood vessel region recognition processing based on the region designated by the operation means.
  3.  The dynamic image processing system according to claim 1, further comprising operation means for an operator to designate, within the extracted lung field region, a region in which the blood vessel recognition processing is to be performed by the blood vessel region recognition means,
      wherein the blood vessel region recognition means performs the blood vessel region recognition processing only on the region designated by the operation means.
  4.  The dynamic image processing system according to any one of claims 1 to 3, further comprising display means for displaying, on a display unit, the blood flow information calculated by the blood flow information calculation means.
  5.  The dynamic image processing system according to claim 4, further comprising determination means for determining, based on the blood flow information of the peripheral blood vessels present in each block region, whether the blood flow of the peripheral blood vessels in each block region is abnormal,
      wherein the display means displays, on the display unit, the position of a block region determined to be abnormal by the determination means in an identifiable manner on a frame image displayed as a moving image.
  6.  The dynamic image processing system according to any one of claims 1 to 5, wherein the blood vessel region removal means replaces the pixel signal values of the recognized blood vessel region with values determined based on the pixel signal values of neighboring non-blood-vessel regions.
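One way to realize this replacement of vessel pixel values can be sketched as follows. This is an assumption for illustration only, not the formula used in the patent: here each recognized vessel pixel is replaced by the mean of the non-vessel pixels in a small surrounding window, and the window size is a parameter chosen for the sketch.

```python
import numpy as np

def remove_vessels(image, vessel_mask, win=3):
    """Sketch of vessel region removal: replace each pixel flagged in
    vessel_mask with the mean signal value of the non-vessel pixels in
    a win x win window around it (assumed neighborhood rule)."""
    out = image.astype(float).copy()
    r = win // 2
    H, W = image.shape
    for y, x in zip(*np.nonzero(vessel_mask)):
        # Clip the window to the image borders.
        y0, y1 = max(0, y - r), min(H, y + r + 1)
        x0, x1 = max(0, x - r), min(W, x + r + 1)
        patch = image[y0:y1, x0:x1]
        keep = ~vessel_mask[y0:y1, x0:x1]
        if keep.any():
            out[y, x] = patch[keep].mean()  # neighborhood non-vessel mean
    return out
```

After this step, the remaining signal variation inside the lung field reflects structures other than the recognized (larger) vessels, which is what lets the later block-wise change analysis target the peripheral vessels.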
  7.  A program for causing a computer to function as:
      lung field region extraction means for extracting a lung field region from each of a plurality of frame images showing dynamics of a chest;
      blood vessel region recognition means for performing blood vessel region recognition processing in the extracted lung field region of each frame image;
      blood vessel region removal means for removing, from each frame image, the blood vessel region recognized by the blood vessel region recognition means;
      division means for dividing the lung field region into a plurality of block regions in each frame image from which the blood vessel region has been removed by the blood vessel region removal means;
      calculation means for calculating an amount of change in pixel signal values in each block region between the frame images from which the blood vessel region has been removed by the blood vessel region removal means; and
      blood flow information calculation means for calculating blood flow information of peripheral blood vessels present in each block region, based on the amount of change in pixel signal values in each block region calculated by the calculation means.
  8.  The program according to claim 7, further causing the computer to function as display means for displaying, on a display unit, the blood flow information calculated by the blood flow information calculation means.
PCT/JP2010/073138 2010-02-01 2010-12-22 Dynamic image processing system and program WO2011092982A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010-019896 2010-02-01
JP2010019896 2010-02-01

Publications (1)

Publication Number Publication Date
WO2011092982A1 true WO2011092982A1 (en) 2011-08-04

Family

ID=44318981

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/073138 WO2011092982A1 (en) 2010-02-01 2010-12-22 Dynamic image processing system and program

Country Status (1)

Country Link
WO (1) WO2011092982A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AT512393A4 (en) * 2012-06-29 2013-08-15 Ludwig Boltzmann Ges Gmbh Method for processing images of the pulmonary circulation and apparatus for carrying out this method
JP2014079317A (en) * 2012-10-15 2014-05-08 Toshiba Corp Image processing apparatus and program
JP2018033578A (en) * 2016-08-30 2018-03-08 キヤノン株式会社 Radiographic apparatus, radiographic system, radiographic method, and program
JP2018157884A (en) * 2017-03-22 2018-10-11 コニカミノルタ株式会社 X-ray moving image processing device
JP2019072342A (en) * 2017-10-18 2019-05-16 キヤノンメディカルシステムズ株式会社 Medical image processing apparatus, X-ray diagnostic apparatus, and medical image processing program
JP2020081278A (en) * 2018-11-22 2020-06-04 コニカミノルタ株式会社 Image processing device and program
CN113807410A (en) * 2021-08-27 2021-12-17 北京百度网讯科技有限公司 Image recognition method and device and electronic equipment
JP2022027757A (en) * 2016-08-18 2022-02-14 ウィリアム・ボーモント・ホスピタル System and method for determining change in respiratory blood volume from 4d computer tomography

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08161520A (en) * 1994-12-06 1996-06-21 Hitachi Medical Corp Method for extracting object part from three-dimensional image
JP2005312937A (en) * 2004-03-31 2005-11-10 Toshiba Corp Medical image processing apparatus, and method for processing medical image
WO2007078012A1 (en) * 2006-01-05 2007-07-12 National University Corporation Kanazawa University Continuous x-ray image screening examination device, program, and recording medium
WO2009090894A1 (en) * 2008-01-15 2009-07-23 Konica Minolta Medical & Graphic, Inc. Support system of diagnostic dynamic-imaging
JP2009273671A (en) * 2008-05-15 2009-11-26 Konica Minolta Medical & Graphic Inc Kymography system


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AT512393B1 (en) * 2012-06-29 2013-08-15 Ludwig Boltzmann Ges Gmbh Method for processing images of the pulmonary circulation and apparatus for carrying out this method
AT512393A4 (en) * 2012-06-29 2013-08-15 Ludwig Boltzmann Ges Gmbh Method for processing images of the pulmonary circulation and apparatus for carrying out this method
JP2014079317A (en) * 2012-10-15 2014-05-08 Toshiba Corp Image processing apparatus and program
JP2022027757A (en) * 2016-08-18 2022-02-14 ウィリアム・ボーモント・ホスピタル System and method for determining change in respiratory blood volume from 4d computer tomography
US11712214B2 (en) 2016-08-18 2023-08-01 William Beaumont Hospital System and method for determining respiratory induced blood mass change from a 4D computed tomography
JP7258983B2 (en) 2016-08-18 2023-04-17 ウィリアム・ボーモント・ホスピタル Systems and methods for determining respiratory blood volume changes from 4D computed tomography
JP2018033578A (en) * 2016-08-30 2018-03-08 キヤノン株式会社 Radiographic apparatus, radiographic system, radiographic method, and program
JP2018157884A (en) * 2017-03-22 2018-10-11 コニカミノルタ株式会社 X-ray moving image processing device
JP7000110B2 (en) 2017-10-18 2022-02-10 キヤノンメディカルシステムズ株式会社 Medical image processing equipment, X-ray diagnostic equipment, and medical image processing programs
JP2019072342A (en) * 2017-10-18 2019-05-16 キヤノンメディカルシステムズ株式会社 Medical image processing apparatus, X-ray diagnostic apparatus, and medical image processing program
JP2020081278A (en) * 2018-11-22 2020-06-04 コニカミノルタ株式会社 Image processing device and program
JP7147507B2 (en) 2018-11-22 2022-10-05 コニカミノルタ株式会社 Image processing device and program
CN113807410A (en) * 2021-08-27 2021-12-17 北京百度网讯科技有限公司 Image recognition method and device and electronic equipment
CN113807410B (en) * 2021-08-27 2023-09-05 北京百度网讯科技有限公司 Image recognition method and device and electronic equipment

Similar Documents

Publication Publication Date Title
JP6436182B2 (en) Dynamic image analyzer
JP6413927B2 (en) Dynamic analysis apparatus and dynamic analysis system
WO2011092982A1 (en) Dynamic image processing system and program
JP5874636B2 (en) Diagnosis support system and program
JP5556413B2 (en) Dynamic image processing apparatus and program
US20200193598A1 (en) Dynamic analysis system
JP6217241B2 (en) Chest diagnosis support system
JP2017200565A (en) Dynamic analysis apparatus and dynamic analysis system
JP6701880B2 (en) Dynamic analysis device, dynamic analysis system, dynamic analysis method and program
US10827999B2 (en) Dynamic analysis apparatus and system for measuring temporal changes in blood vessels
JP2016073466A (en) Image processing apparatus and program
JP2019051322A (en) Kinetics analysis system
JP2017169830A (en) Dynamic analysis apparatus
US11484221B2 (en) Dynamic analysis apparatus, dynamic analysis system, expected rate calculation method, and recording medium
JP6848393B2 (en) Dynamic image processing device
JP2020044445A (en) Dynamic analysis system, program, and dynamic analysis apparatus
JP2018175320A (en) Radiography system
US10709403B2 (en) Processing of interventional radiology images by ECG analysis
JP6790537B2 (en) Dynamic analyzer
JP6962030B2 (en) Dynamic analysis device, dynamic analysis system, dynamic analysis program and dynamic analysis method
JP2019005417A (en) Dynamic image processing device and dynamic image processing system
JP2009273603A (en) Dynamic image capturing system
JP2021132994A (en) Dynamic analysis device and program
JP2023176825A (en) Dynamic image analysis device and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10844736

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10844736

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP