WO2014185197A1 - 画像処理装置及びプログラム - Google Patents
画像処理装置及びプログラム Download PDFInfo
- Publication number
- WO2014185197A1 (PCT/JP2014/060287)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- boundary line
- displacement
- image
- frame
- corrected
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5217—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data extracting a diagnostic or physiological parameter from medical diagnostic data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5288—Devices using data or image processing specially adapted for radiation diagnosis involving retrospective matching to a physiological signal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/08—Feature extraction
- G06F2218/10—Feature extraction by analysing the shape of a waveform, e.g. extracting parameters relating to peaks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/12—Classification; Matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10116—X-ray image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30048—Heart; Cardiac
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30061—Lung
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/03—Recognition of patterns in medical or anatomical images
Definitions
- the present invention relates to an image processing technique for a dynamic image in which a human or animal body is photographed.
- In one known technique, a plurality of X-ray images are acquired continuously in time series, a line is set at a desired position in each X-ray image, and a new image is generated by acquiring the pixel rows arranged along that line and arranging them in time series.
- In another technique, the amount of movement is obtained by measuring the position of the diaphragm in the dynamic image; the frames at maximal inspiration and maximal expiration are identified, and the pixel difference value between them is used.
- Relative ventilation information is obtained for each divided chest area; linear interpolation is performed between CT images to create coronal, sagittal, and latham images, and the position of the diaphragm is measured from the latham images. A technique is disclosed in which a frame of the dynamic image having the same breathing level as that image is aligned with the latham image created from the CT images, and the ventilation information is superimposed on the coronal image and the dynamic image.
- A method is also disclosed in which the positions of the left and right diaphragm and the lung apices are measured, the amount of movement at each position is obtained, and that movement is displayed as a graph.
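The prior-art measurement described above (per-frame diaphragm position, then identification of the maximal-inspiration and maximal-expiration frames) can be sketched roughly as follows. This is a toy Python/NumPy illustration, not the patent's actual algorithm: the "diaphragm position" is approximated here as the strongest vertical intensity edge in a single image column, and all names are hypothetical.

```python
import numpy as np

def diaphragm_positions(frames, lung_column):
    """For each frame, locate the row of the strongest vertical intensity
    edge in one image column -- a crude stand-in for measuring the
    diaphragm position from the dynamic image."""
    positions = []
    for frame in frames:
        column = frame[:, lung_column].astype(float)
        edge = np.abs(np.diff(column))          # vertical gradient magnitude
        positions.append(int(np.argmax(edge)))  # row of the sharpest edge
    return np.asarray(positions)

def inspiration_expiration_frames(positions):
    """Maximal inspiration = diaphragm at its lowest row (largest index);
    maximal expiration = diaphragm at its highest row (smallest index)."""
    return int(np.argmax(positions)), int(np.argmin(positions))
```

The resulting position sequence is exactly the kind of per-position movement curve that such methods display as a graph.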
- However, the diagnostic image produced by the method of Patent Document 1 represents the change along a fixed line in the time direction as a single cross-sectional image; it cannot express the temporal change of the shape of the target region itself, that is, its change in the two-dimensional space of each frame image.
- In Patent Document 2, there is a method of measuring predetermined positions of the left and right diaphragm and the lung apex, obtaining the amount of movement (displacement amount) at each position, and displaying the movement at that position as a graph derived from the displacement amount.
- the present invention has been made in view of such circumstances, and an object of the present invention is to provide an image processing technique capable of capturing the movement of the shape of the target area desired by the user.
- In order to solve the above problems, an image processing apparatus comprises: dynamic image acquisition means for acquiring a dynamic image composed of a plurality of frame images obtained by sequentially photographing, in the time direction, a state in which the physical state of a target region in a human or animal body changes with time; boundary line extraction means for performing a boundary line extraction process that extracts the boundary line of the target region from a plurality of the frame images to obtain a plurality of target region boundary lines; displacement amount calculation means for performing, using pixels corresponding to the plurality of target region boundary lines, a displacement amount calculation process that calculates, for each target region boundary line other than a reference boundary line, a displacement amount with the reference boundary line as the displacement reference; displacement correction means for obtaining, after the displacement amount calculation process, a predetermined number of displacement-corrected boundary lines from which a component requiring removal has been removed, by performing a correction process that corrects a predetermined number of the target region boundary lines other than the reference boundary line using the displacement amounts; and display means for displaying displacement-corrected boundary line information based on the predetermined number of displacement-corrected boundary lines.
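The processing chain claimed above (acquire frames, extract a boundary line per frame, compute displacements against a reference boundary, remove the unwanted component, keep the corrected boundary lines) can be sketched as follows. This is a simplified Python/NumPy illustration under stated assumptions, not the patent's implementation: boundary extraction is a plain threshold, and the "component requiring removal" is reduced to the mean vertical shift; all names are hypothetical.

```python
import numpy as np

def extract_boundary(frame, threshold=128):
    """Toy boundary-line extraction: for every column, the first row whose
    intensity reaches the threshold (one y value per x position)."""
    return np.argmax(frame >= threshold, axis=0)

def displacement_corrected_boundaries(frames, ref_index=0):
    """Extract a boundary line per frame, take one boundary as the
    reference, compute each other boundary's displacement from it, and
    remove the global component (here: the mean vertical shift) to obtain
    displacement-corrected boundary lines."""
    boundaries = [extract_boundary(f) for f in frames]
    reference = boundaries[ref_index]
    corrected = []
    for i, b in enumerate(boundaries):
        if i == ref_index:
            continue
        shift = np.mean(b - reference)   # displacement amount (global part)
        corrected.append(b - shift)      # displacement-corrected boundary
    return reference, corrected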
- The invention according to claim 2 is the image processing apparatus according to claim 1, wherein the component to be removed includes at least one of the deformation components due to vertical movement, translation, and rotation of the target region.
- The invention according to claim 3 is the image processing apparatus described above, further comprising frame selection means for performing a frame selection process that selects, from selection target frame images including at least the plurality of frame images, a base frame image from which the reference boundary line is extracted and reference frame images from which the target region boundary lines other than the reference boundary line are extracted; the displacement amount calculation process calculates the displacement amount between corresponding pixels of the target region boundary line of each reference frame image, using the target region boundary line of the base frame image as the reference boundary line.
- The invention according to claim 4 is the image processing apparatus according to claim 3, wherein the selection target frame images include frame images of the same body taken earlier than the plurality of frame images, and the frame selection process includes a process of selecting such a previously taken frame image as the base frame image.
- The invention according to claim 5 is the image processing apparatus according to claim 3, further comprising period classification means that detects a target region period, which is a periodic change of the target region of the body synchronized with the photographing times at which the plurality of frame images were taken, and classifies the plurality of frame images in units of the target region period, the base frame image and the reference frame images falling within the same target region period. Here, a value indicating the temporally changing physical state of the target region is defined as a physical state value, and the frame selection process includes: a first selection process for selecting, as the base frame image, any one of (b1) a frame image at which the physical state value corresponds to a preset first set value, (b2) a frame image at which the physical state value corresponds to its maximum value, and (b3) a frame image at which the physical state value corresponds to its minimum value; and a second selection process for selecting, as a reference frame image, any one of (c1) a frame image at which the physical state value corresponds to a preset second set value, (c2) a frame image temporally close to the base frame image, (c3) a frame image at which the physical state value corresponds to its minimum value when the base frame image is the frame image of (b2), and (c4) a frame image at which the physical state value corresponds to its maximum value when the base frame image is the frame image of (b3).
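The period classification and frame selection above can be illustrated with a small sketch. This is a hypothetical Python example, assuming the physical state value is a respiratory-like waveform sampled once per frame; period boundaries are taken at local minima, and rules (b2)/(c3) (maximum as base frame, minimum as reference frame) are shown.

```python
import numpy as np

def classify_periods(values):
    """Split a periodic physical-state signal (e.g. a respiratory
    waveform, one value per frame) into period units delimited by its
    local minima."""
    minima = [i for i in range(1, len(values) - 1)
              if values[i] < values[i - 1] and values[i] <= values[i + 1]]
    edges = [0] + minima + [len(values)]
    return [range(edges[k], edges[k + 1]) for k in range(len(edges) - 1)]

def select_frames(values, period):
    """Within one period, take the frame at the signal maximum as the
    base frame (rule (b2)) and the frame at the minimum as the reference
    frame (rule (c3))."""
    idx = list(period)
    base = max(idx, key=lambda i: values[i])
    ref = min(idx, key=lambda i: values[i])
    return base, ref
```

Because both selected frames lie inside one detected period, the base and reference frame images belong to the same target region period, as the claim requires.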
- The invention according to claim 6 is the image processing apparatus according to claim 3, wherein the displacement amount used in the correction process is the displacement amount between corresponding pixels of the reference boundary line and one of the target region boundary lines, and the target region boundary lines other than that one are corrected using this displacement amount.
- The invention according to claim 7 is the image processing apparatus according to claim 3, wherein the displacement amount used in the correction process is the displacement amount from the target region boundary line that is temporally closest to the target region boundary line to be corrected.
- The invention according to claim 8 is the image processing apparatus according to claim 3, wherein the displacement amount used in the correction process is the displacement amount between the reference boundary line and the target region boundary line to be corrected, obtained as the sum of the displacement amounts between temporally adjacent pairs of boundary lines.
- The invention according to claim 9 is the image processing apparatus according to any one of claims 1 to 8, further comprising image generation means for generating a predetermined number of separated images, one for each of the predetermined number of displacement-corrected boundary lines, wherein the display means sequentially displays the predetermined number of separated images as the displacement-corrected boundary line information.
- The invention according to claim 10 is the image processing apparatus according to any one of claims 1 to 9, further comprising image generation means for generating a single still image in which the predetermined number of displacement-corrected boundary lines are superimposed, wherein the display means displays the still image as the displacement-corrected boundary line information.
- The invention according to claim 11 is the image processing apparatus according to any one of claims 1 to 10, wherein the target region includes at least one of a diaphragm region and a heart region.
- The invention according to claim 12 is a program that, when executed by a computer included in the image processing apparatus, causes the computer to function as the image processing apparatus according to any one of claims 1 to 11.
- According to the image processing apparatus of claim 1, a displacement amount calculation process is performed that, using pixels corresponding to the plurality of target region boundary lines, calculates for each target region boundary line other than the reference boundary line a displacement amount with the reference boundary line as the displacement reference; this displacement amount is the component requiring removal. After the displacement amount calculation process, a correction process corrects a predetermined number of the target region boundary lines other than the reference boundary line using the displacement amounts, yielding a predetermined number of displacement-corrected boundary lines from which that component has been removed, and displacement-corrected boundary line information based on these lines is displayed. That is, by displaying displacement-corrected boundary lines from which the deformation corresponding to the displacement amount has been removed, the user can grasp the change in the shape of the target region boundary line itself, in other words, the movement of the shape itself. Further, since the shape of the boundary line itself can be observed, a partial shape abnormality can be found, and a partial abnormality such as an adhesion can be easily diagnosed. Furthermore, since the diagnostic content desired by the user is concentrated in the displacement-corrected boundary line information, diagnostic efficiency improves while the diagnosis time is kept to the necessary minimum. Dynamic diagnosis can therefore be performed appropriately and efficiently.
- According to the invention of claim 2, the component to be removed includes at least one of the deformation components due to vertical movement, translation, and rotation of the target region.
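One standard way to remove the translation and rotation components between two boundary polylines is a rigid Procrustes (Kabsch) fit. The patent does not specify this particular algorithm; the sketch below is an illustrative Python/NumPy implementation under the assumption that the two boundaries are given as point sets in correspondence, with hypothetical names.

```python
import numpy as np

def remove_rigid_component(boundary, reference):
    """Remove the rigid part (translation + rotation) of the displacement
    between two boundary polylines via a Kabsch/Procrustes fit, leaving
    only the boundary's own shape change. Both inputs are (N, 2) arrays
    of (x, y) points in correspondence."""
    b = boundary - boundary.mean(axis=0)      # remove translation
    r = reference - reference.mean(axis=0)
    u, _, vt = np.linalg.svd(b.T @ r)         # best-fit rotation (Kabsch)
    d = np.sign(np.linalg.det(u @ vt))        # guard against reflection
    rot = u @ np.diag([1.0, d]) @ vt
    return b @ rot + reference.mean(axis=0)   # re-anchor at the reference
```

If a boundary differs from the reference only by a rigid motion, the corrected boundary coincides with the reference; any residual difference is the shape change itself.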
- According to the invention of claim 3, the base frame image from which the reference boundary line is extracted and the reference frame images from which the other target region boundary lines are extracted are selected from selection target frame images including at least the plurality of frame images. According to the invention of claim 4, the selection target frame images include frame images of the same body taken earlier than the plurality of frame images, so the frame selection can use a common (identical) previously taken frame image as the base frame image.
- According to the invention of claim 5, the base frame image and the reference frame images are frame images within the same target region period, and the frame selection process includes the first selection process of selecting any one of the frame images (b1) to (b3) as the base frame image and the second selection process of selecting any one of the frame images (c1) to (c4) as a reference frame image. This makes it possible to accurately diagnose the change in the shape of the target region boundary line itself between frame images within the same period desired by the user.
- According to the invention of claim 6, the target region boundary lines other than one of them can be corrected with high accuracy using the displacement amount between corresponding pixels of the reference boundary line and that one target region boundary line.
- According to the invention of claim 7, the displacement amount used in the correction process is the displacement amount from the target region boundary line temporally closest to the boundary line to be corrected. That is, the reference frame image is changed each time a displacement amount is calculated, and the base frame image can be changed accordingly. A displacement-corrected boundary line with high correction accuracy can thus be obtained by always performing the correction process using the displacement amount between the diaphragm boundary lines of the most recent selection target frame images. Since displacement-corrected boundary line information suited to the diagnostic application can be displayed, dynamic diagnosis can be performed more appropriately and efficiently.
- According to the invention of claim 8, the displacement amount used in the correction process is the displacement amount between the reference boundary line and the target region boundary line to be corrected, obtained as the sum of the displacement amounts between temporally adjacent pairs of boundary lines. That is, by subdividing the single displacement amount between the reference boundary line and the boundary line to be corrected, the displacement amount calculation process can achieve higher accuracy than a displacement amount calculated in one step between the base frame image and the reference frame image without subdivision.
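The accumulation described in claim 8 (reference-to-target displacement as a sum of frame-to-frame displacements) can be sketched as follows. This is an illustrative Python/NumPy fragment with hypothetical names; each boundary line is represented as a vector of per-point positions.

```python
import numpy as np

def adjacent_displacements(boundaries):
    """Displacement between each pair of temporally adjacent boundary
    lines (frame t to frame t+1), per boundary point."""
    b = np.asarray(boundaries, dtype=float)
    return b[1:] - b[:-1]

def displacement_via_sum(boundaries, target):
    """Displacement of boundary `target` relative to the reference
    boundary (index 0), obtained as the sum of adjacent displacements
    rather than in one large step."""
    return adjacent_displacements(boundaries)[:target].sum(axis=0)
```

Algebraically the sum telescopes to the direct difference; the practical benefit claimed is that each small inter-frame displacement can be estimated more reliably than one large displacement.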
- According to the invention of claim 9, a predetermined number of separated images, one for each displacement-corrected boundary line, are generated and sequentially displayed as the displacement-corrected boundary line information. This makes it possible to observe the change in the shape of the target region boundary line itself as a moving image.
- According to the invention of claim 10, a single still image is generated in which the predetermined number of displacement-corrected boundary lines are superimposed and displayed as the displacement-corrected boundary line information. The boundary lines can be superimposed in such a way that each remains identifiable, so the change in the shape of the target region boundary line itself can be captured in a single still image.
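A minimal sketch of such a superimposed still image, assuming greyscale output and one y value per image column for each boundary line; the patent itself leaves the rendering method open, and all names here are hypothetical.

```python
import numpy as np

def superimpose_boundaries(boundaries, height, width):
    """Draw a predetermined number of displacement-corrected boundary
    lines into a single still image, giving each line a different grey
    level so the superimposed lines remain identifiable."""
    image = np.zeros((height, width), dtype=np.uint8)
    for k, boundary in enumerate(boundaries):
        level = int(255 * (k + 1) / len(boundaries))  # identifiable shade
        for x, y in enumerate(boundary):
            image[int(round(y)), x] = level
    return image
```

In practice, distinct colors (or a time-indexed colormap) would serve the same identifiability purpose as the grey levels used here.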
- the target region includes at least one of a diaphragm region and a heart region.
- FIG. 1 is a diagram illustrating the overall configuration of a radiation dynamic image capturing system 100 according to the first embodiment. Further figures explain the relationship between respiratory motion and the position of the diaphragm, show a block diagram of the functional configuration of the image processing apparatus 3 according to the first embodiment, and illustrate the captured dynamic image.
- The radiation dynamic image capturing system captures radiation images of a subject, a human or animal body, in a situation in which the physical state of a target region of the subject changes periodically with time.
- FIG. 1 is a diagram showing an overall configuration of a radiation dynamic image capturing system according to the first embodiment.
- the radiation dynamic image capturing system 100 includes an image capturing device 1, an image capturing control device 2 (imaging console), and an image processing device 3 (diagnosis console).
- the imaging device 1 and the imaging control device 2 are connected by a communication cable or the like, and the imaging control device 2 and the image processing device 3 are connected via a communication network NT such as a LAN (Local Area Network).
- Each device constituting the radiation dynamic image capturing system 100 conforms to the DICOM (Digital Imaging and Communications in Medicine) standard, and communication between the devices is performed according to the DICOM standard.
- the imaging apparatus 1 is configured by, for example, an X-ray imaging apparatus or the like, and is an apparatus that captures the chest dynamics of the subject M accompanying breathing. Dynamic imaging is performed by acquiring a plurality of images sequentially in time while repeatedly irradiating the chest of the subject M with radiation such as X-rays. A series of images obtained by this continuous shooting is called a dynamic image. Each of the plurality of images constituting the dynamic image is called a frame image.
- the imaging apparatus 1 includes an irradiation unit (radiation source) 11, a radiation irradiation control device 12, an imaging unit (radiation detection unit) 13, and a reading control device 14. .
- the irradiation unit 11 irradiates the subject M with radiation (X-rays) according to the control of the radiation irradiation control device 12.
- The illustrated example is a system for the human body, and the subject M corresponds to the person to be examined.
- the subject M is also referred to as a “subject”.
- the radiation irradiation control device 12 is connected to the imaging control device 2 and performs radiation imaging by controlling the irradiation unit 11 based on the radiation irradiation conditions input from the imaging control device 2.
- The imaging unit 13 is configured by a semiconductor image sensor such as an FPD (flat panel detector), and converts the radiation irradiated from the irradiation unit 11 and transmitted through the subject M into an electrical signal (image information).
- The reading control device 14 is connected to the imaging control device 2.
- the reading control device 14 controls the switching unit of each pixel of the imaging unit 13 based on the image reading condition input from the imaging control device 2, and switches the reading of the electric signal accumulated in each pixel.
- the image data is acquired by reading the electrical signal accumulated in the imaging unit 13.
- the reading control device 14 outputs the acquired image data (frame image) to the imaging control device 2.
- the image reading conditions are, for example, a frame rate, a frame interval, a pixel size, an image size (matrix size), and the like.
- The frame rate is the number of frame images acquired per second and matches the pulse rate of the radiation. The frame interval is the time from the start of one frame-image acquisition operation to the start of the next in continuous shooting, and coincides with the pulse interval.
- the radiation irradiation control device 12 and the reading control device 14 are connected to each other, and exchange synchronization signals with each other to synchronize the radiation irradiation operation and the image reading operation.
- The imaging control device 2 outputs the radiation irradiation conditions and image reading conditions to the imaging device 1 to control the radiation imaging and radiographic image reading operations of the imaging device 1, and also displays the dynamic images acquired by the imaging device 1 so that the operator can confirm the positioning and check whether the images are suitable for diagnosis.
- The imaging control device 2 includes a control unit 21, a storage unit 22, an operation unit 23, a display unit 24, and a communication unit 25, and these units are connected by a bus 26.
- the control unit 21 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), and the like.
- the CPU of the control unit 21 reads the system program and various processing programs stored in the storage unit 22 in accordance with the operation of the operation unit 23, expands them in the RAM, and executes various processes, including the shooting control process described later, according to the expanded programs, thereby centrally controlling the operation of each part of the imaging control device 2 and the operation of the imaging device 1.
- the storage unit 22 is configured by a nonvolatile semiconductor memory, a hard disk, or the like.
- the storage unit 22 stores various programs executed by the control unit 21 and data such as parameters necessary for execution of processing by the programs or processing results.
- the operation unit 23 includes a keyboard having cursor keys, numeric input keys, various function keys, and the like, and a pointing device such as a mouse.
- the operation unit 23 outputs to the control unit 21 an instruction signal input via a keyboard key operation, a mouse operation, or a touch panel.
- the display unit 24 is configured by a monitor such as a color LCD (Liquid Crystal Display), and displays an input instruction, data, and the like from the operation unit 23 in accordance with an instruction of a display signal input from the control unit 21.
- the communication unit 25 includes a LAN adapter, a modem, a TA (Terminal Adapter), and the like, and controls data transmission / reception with each device connected to the communication network NT.
- the image processing device 3 acquires the dynamic image transmitted from the imaging device 1 via the imaging control device 2 and displays an image for a doctor or the like to perform an interpretation diagnosis.
- the image processing apparatus 3 includes a control unit 31, a storage unit 32, an operation unit 33, a display unit 34, a communication unit 35, and an analysis unit 36. They are connected by a bus 37.
- the control unit 31 includes a CPU, a RAM, and the like.
- the CPU of the control unit 31 reads the system program and various processing programs stored in the storage unit 32 in accordance with the operation of the operation unit 33, expands them in the RAM, and executes various processes according to the expanded programs, thereby centrally controlling the operation of each part of the image processing apparatus 3 (details will be described later).
- the storage unit 32 is configured by a nonvolatile semiconductor memory, a hard disk, or the like.
- the storage unit 32 stores various programs executed by the control unit 31 and data such as parameters necessary for execution of processing by the programs or processing results.
- the storage unit 32 stores an image processing program for executing image processing to be described later.
- These various programs are stored in the form of readable program codes, and the control unit 31 sequentially executes operations according to the program codes.
- the operation unit 33 includes a keyboard having cursor keys, numeric input keys, various function keys, and the like, and a pointing device such as a mouse.
- the operation unit 33 outputs to the control unit 31 an instruction signal input via a keyboard key operation, a mouse operation, or a touch panel.
- the display unit 34 is composed of a monitor such as a color LCD, and displays an input instruction from the operation unit 33, data, and a display image to be described later in accordance with an instruction of a display signal input from the control unit 31.
- the communication unit 35 includes a LAN adapter, a modem, a TA, and the like, and controls data transmission / reception with each device connected to the communication network NT.
- FIG. 2 is a diagram for explaining a general relationship between the respiratory motion and the position of the diaphragm.
- FIG. 2A is a schematic side view of the inside of the human body during breathing (when inhaling),
- FIG. 2B is a schematic side view of the inside of the human body during breathing (when exhaling), and
- FIG. 2C is a schematic front view of the inside of the human body showing both the exhalation and inhalation states.
- breathing is performed by inhaling air when the left and right chest cavities 52, surrounded by the rib cage 53 and the diaphragm 50, expand, and by exhaling air when they contract. That is, as shown in FIG. 2(a), at the time of inhalation the diaphragm 50-1 descends as indicated by arrow AR11 and the ribs rise as indicated by arrow AR12, so that the chest cavity 52 expands and air is inhaled into the lungs through the trachea 51 as indicated by arrow AR13.
- on the other hand, as shown in FIG. 2(b), at the time of exhalation the diaphragm 50-2 rises as indicated by arrow AR21 and the ribs descend as indicated by arrow AR22, so that the chest cavity 52 narrows and air is pushed out of the lungs through the trachea 51 as indicated by arrow AR23. This movement of the diaphragm 50 accounts for about 60% of the respiratory motion.
- the relationship between the respiratory motion and the position of the diaphragm 50 described above applies to a healthy person.
- in contrast, in a non-healthy person suffering from a disease such as emphysema, the lungs expand excessively and keep the diaphragm 50 depressed even during exhalation; that is, the diaphragm 50 hardly moves, and respiratory motion cannot be performed well.
- there are also non-healthy persons suffering from a diaphragm disease such as diaphragmatic relaxation, in which one or both diaphragms 50 do not move, and persons in whom the phrenic nerve that moves the diaphragm 50 has been damaged by a mediastinal or lung tumor, an aortic aneurysm, trauma, mediastinal surgery, or the like, causing diaphragmatic paralysis.
- for such cases, the user needs to perform a dynamic diagnosis by observing the movement in detail or by observing the difference from the respiratory movement of a healthy person, so a great amount of observation time is spent and diagnostic efficiency becomes very poor.
- moreover, the diaphragm has a three-dimensional shape, and it is difficult to grasp the shape and the movement of the diaphragm from an actually captured X-ray dynamic image. It is also necessary to make a diagnosis based on changes in the shape and movement of the left and right lung fields. However, for users with little experience in dynamic diagnosis, it is difficult to grasp and judge abnormal shape changes of the left and right lung fields from a dynamic image in which multiple movements, such as positional fluctuation and shape fluctuation, occur even during normal breathing.
- in the present invention, it is possible to capture the movement of the shape of the target region itself by displaying boundary information obtained by removing unnecessary deformation between frame images from the boundary line of the target region.
- specifically, the image processing apparatus 3 of the radiation dynamic imaging system 100 displays boundary information from which vertical-movement deformation, translation, and rotation between frame images have been removed, so that dynamic diagnosis can be performed appropriately and efficiently.
- FIG. 3 is a diagram illustrating a functional configuration realized by the control unit 31 when the CPU or the like operates according to various programs in the image processing apparatus 3 in the radiation dynamic image capturing system 100 together with other configurations. Note that the image processing apparatus 3 of this embodiment uses a dynamic image in which the chest including the heart and both lungs is mainly captured.
- the control unit 31 mainly includes a dynamic image acquisition unit 110, a frame selection unit 120, a boundary line extraction unit 130, a displacement correction unit 140, and an image generation unit 150.
- the functional configuration of the control unit 31 shown in FIG. 3 will be described as being realized by executing a preinstalled program, but it may instead be realized by a dedicated hardware configuration.
- Dynamic image acquisition unit 110: the dynamic image acquisition unit 110 acquires a dynamic image composed of a plurality of frame images in which the physical state of the target region inside the body of the subject M, photographed by the reading control device 14 of the imaging device 1, is captured sequentially in the time direction.
- the target region in the present embodiment is assumed to be the diaphragm region. As shown in FIG. 3, the imaging control device 2 is interposed between the imaging device 1 and the image processing device 3, and the detection data (a plurality of frame images MI) stored in the storage unit 22 of the imaging control device 2 is output to the communication unit 35 of the image processing apparatus 3 via the communication unit 25.
- FIG. 4 is a diagram exemplifying a dynamic image captured by radiodynamic image capturing with respect to the dynamics of the chest of the subject M accompanying breathing.
- the frame images M1 to M10 (MI) acquired by the dynamic image acquisition unit 110 are obtained by continuously capturing one period of the respiratory cycle at a fixed imaging timing.
- Frame selection unit 120: for a selection target frame image group TI including at least a first number (two or more) of frame images MI, the frame selection unit 120 performs a frame selection process including a process of selecting a base frame image BF for extracting the standard boundary line BL described later and reference frame images RF for extracting the diaphragm boundary lines LI (details will be described later) other than the standard boundary line BL.
- FIG. 5 is a schematic diagram showing the respiratory phase PH and the imaging timing TM on time-series waveform data of the physical state values (respiratory vibration values) described later, and is a diagram for explaining the frame selection process.
- the frame selection process selects the base frame image BF and the reference frame images RF from the frame images T1 to T5 (TI).
- Boundary line extraction unit 130: the boundary line extraction unit 130 performs boundary line extraction processing on the selection target frame images TI to extract the boundary line of the target region, obtaining the first number (plurality) of target region boundary lines. Since the target region in the present embodiment is the diaphragm region, the target region boundary line will be described below as the diaphragm boundary line.
- in the present embodiment, the boundary line extraction unit 130 performs the boundary line extraction processing on the base frame image BF and the reference frame images RF selected by the frame selection unit 120. That is, the diaphragm boundary line LI for the base frame image BF corresponds to the standard boundary line BL, and the diaphragm boundary lines LI for the reference frame images RF correspond to the reference boundary lines RL. The boundary line extraction unit 130 then outputs the standard boundary line BL and the reference boundary lines RL to the displacement correction unit 140 (see FIG. 3).
- FIGS. 6 and 7 are schematic views illustrating the diaphragm boundary lines at the expiration phase PH2 (see FIG. 5).
- FIG. 7 is a schematic diagram showing the diaphragm boundary line in the area R1 of FIG. 6 at the expiration phase PH2 in an orthogonal coordinate system.
- as shown in FIGS. 6 and 7, the diaphragm boundary lines L1 to L5 (LI) are extracted from the selection target frame images T1 to T5 (TI) of FIG. 5 described above, corresponding to the base frame image BF and the reference frame images RF.
- the extraction method is not limited to the method described here, and any method can be used as long as it can be extracted from a dynamic image.
- the first boundary line extraction process is a process for extracting the diaphragm boundary line LI by extracting the contour of the lung field based on the selection target frame image TI.
- FIG. 8 is a schematic view illustrating the contour extraction of the lung field portion including the diaphragm boundary line LI. As shown in FIG. 8, the lung field may be extracted separately on the left and right sides (see FIG. 8A), or may be extracted as a contour including the heart and spine regions (see FIG. 8B).
- as this extraction method, a conventional technique can be used (for example, "Image feature analysis and computer-aided diagnosis: Accurate determination of ribcage boundary in chest radiographs", Xin-Wei Xu and Kunio Doi, Medical Physics, Volume 22(5), May 1995, pp. 617-626).
- the second boundary line extraction process is a process for extracting the diaphragm boundary line LI by model-based extraction.
- specifically, candidate positions of the diaphragm are first roughly extracted by template matching, which is one of the model-based methods, and the extracted candidate regions are then analyzed in detail to extract the boundary accurately.
- at the time of the rough extraction, the template can be weighted according to the amount of movement of the diaphragm, so the accuracy of the rough extraction can be improved, and the extraction accuracy of the diaphragm boundary line LI can be improved accordingly.
- as this extraction method, for example, the method of "Japanese Patent Application No. 2012-138364 (filing date: June 20, 2012)" filed by the present applicant can be adopted.
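The rough-extraction step of the second process can be sketched with generic normalized cross-correlation template matching. This is a minimal illustration, not the weighted-template method of the cited application; the function names and the brute-force search are ours.

```python
import numpy as np

def rough_extract_candidates(image, template, top_k=3):
    """Roughly locate diaphragm candidate regions by template matching.

    `image` and `template` are 2-D grayscale arrays.  Normalized
    cross-correlation is evaluated at every window position, and the
    (row, col) top-left corners of the `top_k` best-matching windows are
    returned as candidate regions for the subsequent detailed analysis.
    """
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    scores = np.full((ih - th + 1, iw - tw + 1), -1.0)
    for r in range(scores.shape[0]):
        for c in range(scores.shape[1]):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            if denom > 0:
                scores[r, c] = (wz * t).sum() / denom
    # Highest-scoring positions first.
    flat = np.argsort(scores, axis=None)[::-1][:top_k]
    return [tuple(np.unravel_index(i, scores.shape)) for i in flat]
```

In a real system the template would be weighted by the expected diaphragm motion, as the text notes; the plain correlation above only shows the matching principle.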
- the third boundary line extraction process is a process of extracting the diaphragm boundary line LI by extraction by profile analysis.
- FIG. 9 is a diagram for explaining profile analysis.
- FIG. 9A shows a profile region R2 of the selection target frame image TI, and FIG. 9B is a graph of the gray value (vertical axis) against the vertical coordinate within the profile region R2 (horizontal axis; see FIG. 9A). As shown in FIG. 9, a profile in the vertical direction can be created for the region R2, and the change point of the gray-value peak in the created profile can be extracted as the boundary line of the diaphragm.
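The profile-analysis idea can be sketched per column: take the vertical gray-value profile and return the row where the value changes most sharply. This is an illustrative simplification (function names are ours; the boundary criterion is reduced to the steepest increase, assuming a dark lung field above a brighter region below the diaphragm).

```python
import numpy as np

def profile_boundary_row(image, col):
    """Estimate the diaphragm row at one column by profile analysis.

    The vertical gray-value profile at `col` is taken, and the row with the
    largest value change between adjacent rows (the steepest increase) is
    returned as the boundary position.
    """
    profile = image[:, col].astype(float)
    grad = np.diff(profile)          # value change between adjacent rows
    return int(np.argmax(grad))      # row index of the steepest increase

def extract_boundary_line(image, cols):
    """Apply the per-column estimate over a range of columns, yielding
    (row, col) boundary pixels."""
    return [(profile_boundary_row(image, c), c) for c in cols]
```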
- the fourth boundary line extraction process is a process of extracting the diaphragm boundary line LI by extraction specified by the user.
- the user designation may be a method in which the user simply draws the extraction target line of the diaphragm boundary line LI, or it may be used as a method of correcting a diaphragm boundary line LI extracted by the first to third boundary line extraction processes described above.
- when the user simply designates the line, it is preferable that only one of the selection target frame images TI is designated by the user and the remaining frame images are tracked by adopting a corresponding point search method in the time direction.
- FIG. 10 is a schematic diagram for explaining a displacement amount between the diaphragm boundary lines LI in the selection target frame image TI extracted by the boundary line extraction unit 130.
- FIGS. 10A and 10B show the diaphragm boundary lines L1 and L2 in the frame images T1 and T2 extracted by the boundary line extraction unit 130, respectively, and FIG. 10C displays the diaphragm boundary lines L1 and L2 superimposed.
- FIG. 11 is a schematic diagram showing two examples of the amount of displacement between the diaphragm boundary lines LI in the selection target frame image TI extracted by the boundary line extraction unit 130.
- FIG. 11A illustrates the deformation due to vertical movement
- FIG. 11B illustrates the deformation due to translation and rotation.
- the displacement correction unit 140 performs the following two processes.
- the first process is a displacement amount calculation process: using pixels corresponding to the first number (plurality) of diaphragm boundary lines LI, a displacement amount D is calculated, with the standard boundary line BL as the displacement reference, for the diaphragm boundary lines LI other than the standard boundary line BL among the first number of diaphragm boundary lines LI.
- the displacement amount D indicates a component that needs to be removed, and the component to be removed includes at least one of the components deformed by vertical movement, translation, and rotation of the diaphragm region.
- the second process is a correction process: using the displacement amount D, a second number (a predetermined number excluding the standard boundary line BL) of diaphragm boundary lines, the second number being equal to or smaller than the first number, are corrected.
- thereby, the second number (predetermined number) of displacement-corrected boundary lines LIc, from which the components to be removed have been removed, are obtained (see FIG. 3).
- the displacement correction unit 140 receives the standard boundary line BL and the reference boundary line RL extracted by the boundary line extraction unit 130, and performs a displacement amount calculation process and a correction process.
- the displacement amount calculation processing calculates a displacement amount between corresponding pixels of the standard boundary line BL and the reference boundary line RL.
- the correction process is performed on the diaphragm boundary line LI to be corrected using the displacement amount, thereby obtaining the displacement corrected boundary line LIc.
- the diaphragm boundary line LI to be corrected corresponds to the reference boundary line RL.
- FIG. 12 is a schematic view illustrating the displacement-corrected boundary line LIc at the expiration phase PH2 (see FIG. 5).
- the displacement-corrected boundary lines L1c to L5c (LIc) shown in FIG. 12 correspond to the diaphragm boundary lines L1 to L5 (LI) shown in FIGS. 6 and 7 (that is, the diaphragm boundary lines to be corrected) from which the displacement correction unit 140 has removed the components to be removed.
- in this example, the frame selection unit 120 selects the frame image T1 as the base frame image BF and the frame images T1 to T5 as the reference frame images RF (the frame images to be corrected), and the displacement correction unit 140 performs the displacement correction process with the diaphragm boundary line L1 as the standard boundary line BL and the diaphragm boundary lines L1 to L5 as the reference boundary lines RL, thereby obtaining the displacement-corrected boundary lines L1c to L5c (LIc).
- the first displacement amount calculation process is effective when the deformation (ii) is caused only by the vertical movement. That is, it is implemented on the assumption that the amount of displacement between the diaphragm boundary lines LI is only in the vertical direction.
- FIG. 13 is a schematic diagram for explaining the first displacement amount calculation processing.
- in FIG. 13, the standard boundary line BL is the diaphragm boundary line L1, the reference boundary line RL is the diaphragm boundary line L2, and the pixels P21, P22, P23, and P24 on the diaphragm boundary line L2 corresponding to the pixels P11, P12, P13, and P14 of the diaphragm boundary line L1 are shown. The corresponding pixels are designated as appropriate.
- in the first displacement amount calculation process, the displacement amount of the diaphragm boundary line L2 with respect to the diaphragm boundary line L1, that is, the displacement amount of each corresponding pixel, is assumed to be only in the Y-axis direction. That is, "the displacement amount d1 between pixels P11 and P21", "the displacement amount d2 between pixels P12 and P22", "the displacement amount d3 between pixels P13 and P23", and "the displacement amount d4 between pixels P14 and P24" calculated by the first displacement amount calculation process are values of only the Y component.
- the values of the displacement amounts d1 to d4 may be used as they are as the displacement amount D12 between the standard boundary line BL and the reference boundary line RL, or the average, minimum, or maximum value of the displacement amounts d1 to d4 may be used as the displacement amount D12. This displacement amount D12 is the vertical-movement deformation component between the standard boundary line BL and the reference boundary line RL.
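The first displacement amount calculation process can be sketched directly: subtract the y coordinates of corresponding pixels and, optionally, reduce them to a single value. A minimal illustration with our own function name; boundary lines are assumed to be arrays of (y, x) pixels whose x coordinates already correspond.

```python
import numpy as np

def vertical_displacement(base_line, ref_line, reduce="mean"):
    """First displacement-calculation process (vertical movement only).

    The per-pixel displacements d_i are the y differences between
    corresponding pixels of the reference and standard boundary lines;
    D12 is their mean (the text also allows the raw values, the minimum,
    or the maximum).
    """
    base = np.asarray(base_line, dtype=float)
    ref = np.asarray(ref_line, dtype=float)
    d = ref[:, 0] - base[:, 0]       # y displacement per corresponding pixel
    if reduce == "mean":
        return d.mean()
    if reduce == "min":
        return d.min()
    if reduce == "max":
        return d.max()
    return d                          # raw per-pixel displacements d1..dn
```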
- the second displacement amount calculation process is effective when the deformation (ii) is caused not only by vertical movement but also by translation and rotation.
- that is, it is implemented on the assumption that the displacement amount between the diaphragm boundary lines LI includes any of vertical movement, translation, and rotation.
- FIG. 14 is a schematic diagram illustrating the second displacement amount calculation process.
- FIG. 14 is common with FIG. 13 except that the corresponding-pixel designation method is different. That is, in the second displacement amount calculation process, the displacement amount D12 is calculated so that the end points match. Specifically, rotation or translation is applied so that the end points of the diaphragm boundary lines L1 and L2, that is, the pixels P10 and P20 and the pixels P14 and P24, are aligned, and the displacement amounts d1 to d4 are calculated for the coordinate points existing between the end points.
- the translation amount and rotation angle that can be calculated from the displacement amounts d1 to d4 shown in FIG. 14 may be used as the displacement amount D12 between the standard boundary line BL and the reference boundary line RL, or an amount calculated from the average (or minimum or maximum) of the translation amounts and rotation angles, or a combination thereof, may be used as the displacement amount D12.
- This displacement amount D12 becomes a deformation component due to vertical movement, translation and rotation between the reference boundary line BL and the reference boundary line RL.
- the displacement amounts d1 to d4 between the pixels shown in FIG. 14 correspond to the closest point-to-point distance.
- affine transformation or the like can be employed when performing rotation or parallel movement.
- the closest point-to-point distance may be obtained by employing a least square method or the like.
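The end-point matching of the second process can be sketched as follows. This is one plausible reading under stated assumptions (boundary lines as arrays of (x, y) points; the rigid transform is estimated from the end-to-end vectors and the first end points only), not the patent's exact procedure.

```python
import numpy as np

def align_endpoints(base_line, ref_line):
    """Second displacement-calculation process: match the end points.

    A rotation + translation mapping the end points of `ref_line` onto
    those of `base_line` is estimated, the whole reference line is
    transformed with it, and the per-point residuals d_i are returned
    together with the rotation angle and translation (the D12 components).
    """
    base = np.asarray(base_line, float)
    ref = np.asarray(ref_line, float)
    vb = base[-1] - base[0]           # end-to-end vector of the base line
    vr = ref[-1] - ref[0]             # end-to-end vector of the reference line
    angle = np.arctan2(vb[1], vb[0]) - np.arctan2(vr[1], vr[0])
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s], [s, c]])
    rotated = ref @ R.T               # rotate the whole reference line
    t = base[0] - rotated[0]          # translation matching the first end points
    aligned = rotated + t
    residuals = np.linalg.norm(aligned - base, axis=1)
    return angle, t, residuals
```

When the two lines differ only by a rigid motion, the residuals vanish; otherwise they are the displacement amounts d1 to d4 remaining after end-point alignment.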
- the third displacement amount calculation process is effective when the deformation (ii) is caused not only by vertical movement but also by translation and rotation.
- that is, it is implemented on the assumption that the displacement amount between the diaphragm boundary lines LI includes any of vertical movement, translation, and rotation.
- the displacement amount D is calculated by shape fitting.
- for the shape fitting, an ICP (Iterative Closest Point) algorithm or the like can be employed. With the ICP algorithm, a convergence calculation can be performed so that the distance between corresponding pixels is minimized by translating or rotating one of the standard boundary line BL and the reference boundary line RL.
- here the ICP algorithm is taken as an example, but other methods may be used as long as fitting is performed by a convergence calculation that minimizes the distance between corresponding pixels. The advantage of fitting by convergence calculation, as in the ICP algorithm, is that the shapes can be matched in more detail, so the displacement amount D can be calculated accurately.
- the displacement amount D between the standard boundary line BL and the reference boundary line RL is obtained so that the distance between the corresponding pixels is minimized by the fitting process.
- This displacement amount D becomes a deformation component due to vertical movement, translation and rotation between the reference boundary line BL and the reference boundary line RL.
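The convergence calculation behind ICP can be sketched in a few lines. This is a textbook minimal 2-D ICP (closest-point pairing plus a closed-form Kabsch/SVD rigid alignment per iteration), shown only to illustrate the fitting idea of the third process; function names and the O(n²) nearest-neighbor search are ours.

```python
import numpy as np

def icp_2d(base, ref, iters=20):
    """Minimal 2-D ICP fitting sketch.

    Each iteration pairs every point of `ref` with its closest point on
    `base`, solves the best rigid transform for those pairs in closed form
    (Kabsch algorithm), and applies it, so the distance between
    corresponding pixels is iteratively minimized.
    """
    base = np.asarray(base, float)
    cur = np.asarray(ref, float).copy()
    for _ in range(iters):
        # Closest-point correspondence: for each point, the nearest base point.
        d = np.linalg.norm(cur[:, None, :] - base[None, :, :], axis=2)
        nearest = base[d.argmin(axis=1)]
        # Closed-form rigid alignment of the paired point sets.
        mc, mn = cur.mean(axis=0), nearest.mean(axis=0)
        H = (cur - mc).T @ (nearest - mn)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # keep a proper rotation (no reflection)
            Vt[-1] *= -1
            R = Vt.T @ U.T
        cur = (cur - mc) @ R.T + mn
    return cur
```

For a reference line that is a small rigid motion of the base line, the loop converges back onto the base line; the accumulated rotation and translation play the role of the displacement amount D.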
- the fourth displacement amount calculation process is effective when the deformation (ii) is caused not only by vertical movement but also by translation and rotation.
- that is, it is implemented on the assumption that the displacement amount between the diaphragm boundary lines LI includes any of vertical movement, translation, and rotation.
- in the fourth displacement amount calculation process, each pixel of the reference boundary line RL corresponding to the standard boundary line BL is tracked, and the "tracking result" is calculated as the displacement amount D.
- as the tracking method, for example, POC (phase-only correlation) can be employed.
- in POC, the correlation similarity between the input image (reference boundary line RL) to be collated and the registered original image (standard boundary line BL) is calculated. When an image is Fourier-transformed, the amplitude component carries the grayscale data and the phase component carries the contour (shape) information, so the correlation can be image-processed almost instantaneously using only the phase information, without using the amplitude information. As a further effect, since high-frequency components can be cut, erroneous detection of corresponding points due to the influence of the blood flow itself can be prevented.
- RIPOC (rotation-invariant phase-only correlation) can also be employed. In RIPOC, polar coordinate conversion is performed on the amplitude component to create a polar coordinate image in which the X direction represents the angle θ and the Y direction represents the radius r. Since a displacement in the X direction then corresponds to an angular displacement, the rotation amount can be estimated from the matching result; the original image is corrected using the estimated rotation amount, and the position estimation is then performed. As this method, a method as described in Japanese Patent No. 3574301 can be employed.
- the tracking result obtained by the corresponding point search for each pixel of the standard boundary line BL can be used as the displacement amount D between the standard boundary line BL and the reference boundary line RL.
- This displacement amount D becomes a deformation component due to vertical movement, translation and rotation between the reference boundary line BL and the reference boundary line RL.
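The phase-only correlation step can be sketched as follows: the cross-power spectrum of the two images is normalized to unit amplitude so that only phase information is used, and the inverse FFT peaks at the relative translation. This is a textbook POC sketch for pure integer translation (RIPOC's rotation handling and sub-pixel peak fitting are omitted); the function name is ours.

```python
import numpy as np

def phase_only_correlation(f, g):
    """Estimate the (dy, dx) translation of image `g` relative to `f` by POC.

    The cross-power spectrum is normalized to unit amplitude, discarding the
    grayscale (amplitude) information and keeping only the phase, as the
    text describes; its inverse FFT is a sharp peak at the shift.
    """
    F = np.fft.fft2(f)
    G = np.fft.fft2(g)
    cross = np.conj(F) * G
    cross /= np.maximum(np.abs(cross), 1e-12)   # keep phase, drop amplitude
    corr = np.real(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak coordinates to signed shifts.
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))
```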
- the first to fourth displacement amount calculation processes described above include not only the case of obtaining the displacement amount D between the standard boundary line BL and a reference boundary line RL, but also the case of obtaining the displacement amount D between reference boundary lines RL after closely matching them to the standard boundary line BL.
- the displacement amount D may also be obtained by subtracting between corresponding pixels. In any case, each of the first to fourth displacement amount calculation processes described above is a process for obtaining the displacement amount D using the standard boundary line BL as the displacement reference.
- First correction process: the first correction process corrects the diaphragm boundary lines LI using the displacement amount D12 calculated with the diaphragm boundary line L1 as the standard boundary line BL and the diaphragm boundary line L2 as the reference boundary line RL.
- FIG. 15 is a schematic diagram for explaining the first correction processing.
- in FIG. 15, the displacement amount D consists of vertical movement only, and the pixels P1 and P2 are shown as representatives of the corresponding pixels of the displacement amount D12.
- first, the displacement amount D12 between the diaphragm boundary lines L1 and L2 is calculated by the first displacement amount calculation process described above.
- next, using the displacement amount D12, the diaphragm boundary lines L2 and L3 are corrected.
- the diaphragm boundary line L2 is corrected to obtain the displacement-corrected boundary line L2c; that is, the pixel P2 on the diaphragm boundary line L2 becomes the pixel P2c on the displacement-corrected boundary line L2c after the first correction process. Similarly, the pixel P3 on the diaphragm boundary line L3 becomes the pixel P3c on the displacement-corrected boundary line L3c.
- here, the first correction process using the displacement amount D12 between the diaphragm boundary lines L1 and L2 obtained by the first displacement amount calculation process has been shown, but the first correction process can equally be performed using the displacement amount D12 between the diaphragm boundary lines L1 and L2 obtained by the second to fourth displacement amount calculation processes.
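The correction itself is a per-pixel shift of the boundary line by the calculated displacement. A minimal sketch for the vertical-movement case (function names ours; lines are arrays of (y, x) pixels): with one repeated displacement D12 it behaves like the first correction process, with per-line amounts D1I like the second correction process.

```python
import numpy as np

def apply_vertical_correction(line, displacement):
    """Shift one diaphragm boundary line by -displacement in y.

    The corrected line LIc keeps its shape but loses the whole-line
    vertical motion relative to the standard boundary line.
    """
    pts = np.asarray(line, dtype=float).copy()
    pts[:, 0] -= displacement
    return pts

def correct_lines(lines, displacements):
    """Correct several boundary lines, one displacement amount per line."""
    return [apply_vertical_correction(l, d)
            for l, d in zip(lines, displacements)]
```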
- Second correction process: the second correction process corrects each diaphragm boundary line LI using the displacement amount D1I calculated with the diaphragm boundary line L1 as the standard boundary line BL and the diaphragm boundary line LI (where the index I is an integer of 2 or more) as the reference boundary line RL.
- FIG. 16 is a schematic diagram illustrating the second correction process.
- in FIG. 16, the pixels P1 and P2 are shown as representatives of the corresponding pixels of the displacement amount D12, and the pixels P1 and P3 are shown as representatives of the corresponding pixels of the displacement amount D13.
- first, the displacement amount D12 between the diaphragm boundary lines L1 and L2 is calculated by the first displacement amount calculation process.
- the diaphragm boundary line L2 is corrected using the displacement amount D12. Specifically, as described above, the pixel P2 on the diaphragm boundary line L2 becomes the pixel P2c on the displacement-corrected boundary line L2c after the second correction processing.
- similarly, when the diaphragm boundary line L3 is corrected using the displacement amount D13 between the standard boundary line BL and the diaphragm boundary line L3 (that is, when the correction target diaphragm boundary line LI is the diaphragm boundary line L3), the pixel P3 on the diaphragm boundary line L3 becomes the pixel P3c on the displacement-corrected boundary line L3c after the second correction process.
- FIG. 17 is a schematic diagram showing before and after the second correction process.
- FIG. 17A shows the diaphragm boundary lines before the process (the diaphragm boundary lines LI to be corrected, L2 and L3), and FIG. 17B shows the standard boundary line BL (diaphragm boundary line L1) and the displacement-corrected boundary lines L2c and L3c after the correction process.
- the correction process including the first and second correction processes described above can be summarized as follows: using pixels corresponding to the target region boundary lines, a displacement amount calculation process calculates, for one or more of the plurality of target region boundary lines (diaphragm boundary lines L1 to L3) other than the standard boundary line BL (L1), a displacement amount ΔD (D12, or D12 and D13) with the standard boundary line BL as the displacement reference; the displacement amount ΔD is the component that needs to be removed, and after the displacement amount calculation process, a correction process corrects the two (predetermined number of) target region boundary lines L2 and L3 other than the standard boundary line BL using the displacement amount ΔD.
- when the displacement amount ΔD used in the correction process is the displacement amount D12 between corresponding pixels of the standard boundary line BL and one of the target region boundary lines (diaphragm curve L2), the process of correcting the target region boundary lines other than the standard boundary line using this displacement amount D12 is the first correction process.
- when the displacement amounts ΔD used in the correction process are the displacement amounts D12 and D13 between corresponding pixels of the standard boundary line BL (L1) and two or more of the target region boundary lines (diaphragm curves L2 and L3), the process of correcting the two (predetermined number of) target region boundary lines (diaphragm curves L2 and L3) other than the standard boundary line BL using the displacement amounts D12 and D13 is the second correction process.
- Image generation unit 150: the image generation unit 150 generates displacement-corrected boundary line information LG for display based on the second number (equal to or smaller than the first number) of displacement-corrected boundary lines LIc (see FIG. 3).
- a second number of separated images separated for each second number of displacement corrected boundary lines LIc may be generated, or the second number of displacement corrected boundary lines may be generated.
- One still image may be generated so that the LIc is superimposed and displayed.
- the display unit 34 displays the displacement corrected boundary line information LG generated by the image generation unit 150 (see FIG. 3). That is, when the separated image is generated, the display unit 34 sequentially displays the second number of separated images as the displacement corrected boundary line information LG. When a still image is generated, the display unit 34 displays one still image as the displacement corrected boundary line information LG.
- FIGS. 18 and 19 are schematic diagrams illustrating the displacement-corrected boundary line information LG. FIG. 18 shows the displacement-corrected boundary line information LG for a healthy subject, and FIG. 19 shows the displacement-corrected boundary line information LG for a non-healthy subject.
- FIGS. 18A and 19A show the case where, based on the displacement-corrected boundary lines L2c to L5c (LIc) whose second number is "4", one still image (displacement-corrected boundary line information LG for display) is generated with a different color for each boundary line so that the boundary lines LIc are displayed superimposed on the display unit 34. Information indicating the reference boundary line BL may also be included in the displacement-corrected boundary line information LG so that the diaphragm boundary line L1, which is the reference boundary line BL, is superimposed and displayed at the same time.
- FIGS. 18B and 19B show the case where the image generation unit 150 calculates a displacement-corrected boundary line LAc for display based on the displacement-corrected boundary lines L2c to L5c, generates the displacement-corrected boundary line information LG, and displays it on the display unit 34.
- As shown in FIG. 18, for a healthy subject the shapes of the left side L and the right side R of the diaphragm (that is, the shapes of the left and right lung fields) are bilaterally symmetrical. In contrast, as shown in FIG. 19, for a non-healthy subject the displacement-corrected boundary lines L2c to L5c on the right side R remain at substantially the same position, and the shapes of the left side L and the right side R of the diaphragm are asymmetric. Accordingly, for the normal subject M the displacement-corrected boundary lines LIc are displayed in a symmetrical radial pattern as shown in FIG. 18A, whereas for a non-healthy subject a deformed shape suggesting an abnormality in part of the diaphragm is displayed. For this reason, an objective diagnosis can be made without relying on the user's subjectivity.
- The image generation unit 150 may also calculate, based on the displacement-corrected boundary lines LIc, the degree of variation between the displacement-corrected boundary lines LIc (for example, the variance, or the sum, average, or maximum of the distance differences) and display it on the display unit 34 as the displacement-corrected boundary line information LG. Alternatively, only the two displacement-corrected boundary lines LIc with the largest variation may be displayed as the displacement-corrected boundary line information LG, or only those displacement-corrected boundary lines LIc whose variation is equal to or greater than a threshold value may be displayed.
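A minimal sketch of such a variation measure, assuming each corrected boundary line is sampled at the same x positions; the function name and the particular choice of variance plus maximum vertical spread are illustrative, not taken from the patent.

```python
import numpy as np

def variation_degree(curves):
    """Degree of variation between displacement-corrected boundary lines LIc.

    curves: array-like of shape (n_lines, n_points), one corrected line per row.
    Returns the per-point variance averaged over x, and the maximum vertical
    spread between any two lines: two of the measures mentioned above."""
    curves = np.asarray(curves, dtype=float)
    mean_var = curves.var(axis=0).mean()
    max_dist = float(np.max(curves.max(axis=0) - curves.min(axis=0)))
    return mean_var, max_dist

mv, md = variation_degree([[0, 0, 0], [1, 1, 1]])  # two flat lines, 1 pixel apart
```

A threshold on either value could drive the selective display described above.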
- FIG. 20 is a flowchart explaining the basic operation realized in the image processing apparatus 3 according to this embodiment. Since the individual functions of each unit have already been described (see FIG. 3), only the overall flow is described here.
- In step S1, the dynamic image acquisition unit 110 of the control unit 31 acquires, via the imaging control device 2, the dynamic image (a plurality of frame images MI) captured by the reading control device 14 of the imaging apparatus 1.
- In step S2, the frame selection unit 120 performs a frame selection process including a process of selecting, from the selection target frame images TI, the base frame image BF for extracting the reference boundary line BL and the reference frame images RF for extracting a predetermined number of diaphragm boundary lines LI other than the reference boundary line BL (see FIG. 3). Here, the frame image T1 is selected as the base frame image BF, and the frame images T2 to T5 are selected as the reference frame images RF.
- In step S3, the boundary line extraction unit 130 performs a boundary line extraction process that extracts the diaphragm boundary lines from the base frame image BF and the reference frame images RF selected in step S2, thereby obtaining the reference boundary line BL and the reference boundary lines RL (the diaphragm boundary lines LI to be corrected) (see FIGS. 6 and 7). Here, the diaphragm boundary line L1 is extracted as the reference boundary line BL, and the diaphragm boundary lines L2 to L5 are extracted as the reference boundary lines RL, that is, the diaphragm boundary lines LI to be corrected.
- In step S4, the displacement correction unit 140 performs a displacement amount calculation process that, using pixels corresponding to the reference boundary line BL and the reference boundary lines RL extracted in step S3, calculates the displacement amount D of each reference boundary line RL with the reference boundary line BL as the displacement reference, and then performs a correction process that corrects the diaphragm boundary lines LI to be corrected using the displacement amount D. As a result, displacement-corrected boundary lines LIc are obtained from which the components to be removed (deformation components such as vertical movement, translation, and rotation) have been removed.
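Removing the translation and rotation components named above amounts to a rigid alignment of the boundary-line point sets. The sketch below uses the standard least-squares (Kabsch) alignment as a stand-in; the patent does not specify this algorithm, and all names are hypothetical.

```python
import numpy as np

def remove_rigid_motion(ref_pts, tgt_pts):
    """Align tgt_pts to ref_pts (2D point sets of shape (n, 2)) by removing
    the best-fit rotation and translation, leaving only shape deformation.
    Standard Kabsch rigid alignment."""
    rc, tc = ref_pts.mean(axis=0), tgt_pts.mean(axis=0)
    H = (tgt_pts - tc).T @ (ref_pts - rc)      # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return (tgt_pts - tc) @ R.T + rc           # rotated and translated onto ref

bl_pts = np.array([[0.0, 0.0], [1.0, 0.2], [2.0, 0.1], [3.0, 0.4]])
theta = np.deg2rad(10.0)
rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
moved = bl_pts @ rot.T + np.array([5.0, -2.0])  # same shape, rotated and shifted
restored = remove_rigid_motion(bl_pts, moved)   # rigid components removed
```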
- In step S5, if further processing is to be performed in the displacement correction unit 140, that is, if the displacement correction unit 140 changes the reference boundary line RL (the diaphragm boundary line LI to be corrected), the frame selection unit 120 is instructed to change the reference frame image RF (the frame image to be corrected), and the processes of steps S2 to S4 are repeated. On the other hand, if the processing in the displacement correction unit 140 is to be ended, the process proceeds to step S6.
- For example, if in step S2 the frame images T2 and T3 are selected at once as the reference frame images RF (frame images to be corrected), as shown in the figure (that is, if the diaphragm boundary lines L2 and L3 are processed at once as the reference boundary lines RL), the process can proceed to step S6. However, if the frame images T2 and T3 are not selected at once but the reference frame image RF (frame image to be corrected) is selected individually (that is, if the diaphragm boundary lines L2 and L3 are not processed at once as the reference boundary lines RL), the process returns to step S2 each time the reference frame image RF is changed, and steps S2 to S4 are performed again.
- In step S6, the image generation unit 150 generates the displacement-corrected boundary line information LG based on the displacement-corrected boundary lines LIc obtained in step S4 (see FIGS. 18 and 19).
- Here, the displacement-corrected boundary line information LG includes at least information indicating the displacement-corrected boundary lines LIc together with information indicating the reference boundary line BL.
- In step S7, the image generation unit 150 outputs the displacement-corrected boundary line information LG generated in step S6 to the display unit 34 or the storage unit 32 (see FIG. 3), and this operation flow ends.
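The flow of steps S1 to S7 can be summarized as a pipeline. The callables below are hypothetical stand-ins for the units 110 to 150; this sketch only fixes the order of operations.

```python
def process_dynamic_image(frames, select, extract, correct, render):
    """S1: frames already acquired; S2: frame selection; S3: boundary line
    extraction; S4: displacement correction; S6/S7: generate and output the
    displacement-corrected boundary line information LG."""
    base, refs = select(frames)                # base frame image BF, reference frames RF
    bl = extract(base)                         # reference boundary line BL
    rls = [extract(f) for f in refs]           # boundary lines to be corrected
    lic = [correct(bl, rl) for rl in rls]      # displacement-corrected lines LIc
    return render(bl, lic)                     # boundary line information LG

# toy run with numeric stand-ins for images and boundary lines
result = process_dynamic_image(
    [1, 2, 3],
    select=lambda fs: (fs[0], fs[1:]),
    extract=lambda f: f * 10,
    correct=lambda bl, rl: rl - bl,
    render=lambda bl, lic: (bl, lic),
)
```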
- As described above, in the image processing apparatus 3 of this embodiment, a displacement amount calculation process is performed that, using pixels corresponding to the first number (plurality) of diaphragm boundary lines LI (target region boundary lines), calculates the displacement amounts D of the first number of diaphragm boundary lines with the reference boundary line BL as the displacement reference.
- The displacement amount D is treated as a component that needs to be removed.
- By performing a correction process that corrects a predetermined number (the second number) of diaphragm boundary lines LI other than the reference boundary line BL using the displacement amount D, the second number (predetermined number) of displacement-corrected boundary lines LIc from which the components to be removed have been eliminated are obtained, and displacement-corrected boundary line information LG for display based on the second number of displacement-corrected boundary lines LIc is displayed on the display unit 34.
- As a result, by grasping the change in the shape of the diaphragm boundary line LI, the user can grasp the movement of the diaphragm shape itself. Furthermore, since the shape of the diaphragm boundary line LI itself can be grasped, a partial shape abnormality can be found, and a partial abnormality such as an adhesion can be easily diagnosed. In addition, since the diagnostic content desired by the user is gathered in the displacement-corrected boundary line information LG, only the minimum necessary diagnosis time is required and diagnostic efficiency improves. For these reasons, dynamic diagnosis can be performed appropriately and efficiently.
- Since the component to be removed includes at least one of the deformation components due to vertical movement, translation, and rotation of the target region, displacement-corrected boundary lines LIc from which deformation due to vertical movement, translation, and rotation has been removed from the diaphragm boundary lines LI can be displayed.
- Compared with the case where displacement-corrected boundary lines LIc are obtained for all the frame images MI included in the dynamic image, the computation time required for the boundary line extraction process, the displacement amount calculation process, and the correction process can be minimized.
- As the displacement-corrected boundary line information LG, the second number (predetermined number) of separated images, one for each of the second number of displacement-corrected boundary lines LIc, may be generated and displayed sequentially, or one still image may be generated in which the second number (predetermined number) of displacement-corrected boundary lines LIc are superimposed for display.
- In either case, since the shapes of the second number (predetermined number) of displacement-corrected boundary lines LIc to be diagnosed constitute the displacement-corrected boundary line information LG, they can be displayed in an identifiable manner. In particular, with a single still image, the change in the shape of the diaphragm boundary line LI itself can be captured at a glance.
- Since the target region is the diaphragm region, diseases such as emphysema and diaphragmatic dysfunction can be found by dynamic diagnosis. Even when these symptoms are mild and an abnormality might otherwise go unnoticed, diagnosis is performed using the displacement-corrected boundary line information LG and therefore does not depend on the user's subjectivity, so misdiagnosis can be prevented.
- the displacement amount D used in the third correction process is a displacement amount from the diaphragm boundary line LI that is temporally closest to the diaphragm boundary line LI to be corrected.
- FIG. 21 is a schematic diagram for explaining the third correction process.
- In FIG. 21, the pixels P1 and P2 are shown as representative of the pixels corresponding to the displacement amount D12, the pixels P2 and P3 as representative of the pixels corresponding to the displacement amount D23, and the pixels P3 and P4 as representative of the pixels corresponding to the displacement amount D34.
- As shown in FIG. 21, when the reference boundary line BL is the diaphragm boundary line L1 and the diaphragm boundary line LI to be corrected (that is, the reference boundary line RL) is the diaphragm boundary line L2, the diaphragm boundary line L2 is corrected using the displacement amount D12 between the diaphragm boundary lines L1 and L2. At this time, the pixel P2 on the diaphragm boundary line L2 becomes the pixel P2c on the displacement-corrected boundary line L2c after the third correction process.
- Next, when the diaphragm boundary line L3 is corrected, the comparison target for obtaining the displacement amount is changed to the diaphragm boundary line L2, and the diaphragm boundary line L3 is corrected using the displacement amount D23 between the diaphragm boundary lines L2 and L3. At this time, the pixel P3 on the diaphragm boundary line L3 becomes the pixel P3c on the displacement-corrected boundary line L3c after the third correction process.
- Similarly, when the diaphragm boundary line L4 is corrected, the comparison target for obtaining the displacement amount is changed to the diaphragm boundary line L3, and the diaphragm boundary line L4 is corrected using the displacement amount D34 between the diaphragm boundary lines L3 and L4. At this time, the pixel P4 on the diaphragm boundary line L4 becomes the pixel P4c on the displacement-corrected boundary line L4c after the third correction process.
- As described above, in the third correction process the displacement amount D used for the correction is the displacement amount D from the diaphragm boundary line LI that is temporally closest to the diaphragm boundary line LI to be corrected. Since temporally adjacent boundary lines are displaced only slightly from each other, displacement-corrected boundary lines L2c to L4c with high correction accuracy can be obtained. As a result, displacement-corrected boundary line information LG suited to the diagnostic purpose can be displayed, and dynamic diagnosis can be performed more appropriately and efficiently.
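One literal reading of the third correction process corrects each diaphragm boundary line by the displacement from its temporally nearest predecessor (D12, D23, D34). The sketch below follows that reading under the simplification that each displacement amount is a mean vertical offset; all names are hypothetical.

```python
import numpy as np

def third_correction(lines):
    """lines[0] is the reference boundary line BL; every later line is
    shifted by the displacement measured against the line immediately
    before it in time, i.e. L2 by D12, L3 by D23, L4 by D34."""
    out = [np.asarray(lines[0], dtype=float)]
    for prev, cur in zip(lines, lines[1:]):
        d = np.mean(np.asarray(cur, float) - np.asarray(prev, float))
        out.append(np.asarray(cur, float) - d)   # e.g. P3 -> P3c via D23
    return out

bl = np.zeros(4)
l2, l3 = bl + 5.0, bl + 9.0                      # D12 = 5, D23 = 4
l1c, l2c, l3c = third_correction([bl, l2, l3])
```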
- The image processing apparatus according to the third embodiment of the present invention differs from the image processing apparatus 3 according to the first embodiment in the displacement correction unit described below. The remaining configuration is the same as that of the image processing apparatus 3.
- When at least one diaphragm boundary line LI exists between the reference boundary line BL and the diaphragm boundary line LI to be corrected, the displacement amount D used in the fourth correction process, that is, the displacement amount D between the reference boundary line BL and the diaphragm boundary line LI to be corrected, is obtained as the sum of the displacement amounts D between temporally adjacent pairs of boundary lines LI.
- FIG. 22 is a schematic diagram illustrating the fourth correction process.
- In FIG. 22, among the diaphragm boundary lines L1 to L3, the pixels P1 and P2 are shown as representative of the pixels corresponding to the displacement amount D12, the pixels P2 and P3 as representative of the pixels corresponding to the displacement amount D23, and the pixels P1 and P3 as representative of the pixels corresponding to the displacement amount D13.
- As shown in FIG. 22, the displacement amount calculation process subdivides the displacement amount D13 into the displacement amount D12 and the displacement amount D23, and the sum of the displacement amounts (D12 + D23) is used as the displacement amount between the diaphragm boundary lines L1 and L3. To this end, the displacement amount calculation process first calculates the displacement amount D12 between the diaphragm boundary lines L1 and L2, and then calculates the displacement amount D23 between the diaphragm boundary lines L2 and L3; in this way, the displacement amount calculation process obtains (D12 + D23).
- Compared with the displacement amount D13 obtained directly between the diaphragm boundary lines L1 and L3, the displacement amount (D12 + D23) includes the displacement via the diaphragm boundary line L2, and therefore reflects the path along which the boundary line is displaced from L1 to L3 and has higher accuracy.
- Since the fourth correction process corrects the diaphragm boundary line L3 using the displacement amount (D12 + D23) calculated with high accuracy, the displacement-corrected boundary line L3c is also calculated more accurately. That is, the pixel P3 on the diaphragm boundary line L3 becomes the pixel P3c on the displacement-corrected boundary line L3c after the fourth correction process.
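The fourth correction process accumulates the adjacent displacement amounts instead of measuring one displacement directly. A sketch under the same mean-vertical-offset simplification as above; names are hypothetical.

```python
import numpy as np

def summed_displacement(lines):
    """Displacement of the last boundary line relative to the first (BL),
    accumulated over temporally adjacent pairs: D12 + D23 + ... ."""
    total = 0.0
    for prev, cur in zip(lines, lines[1:]):
        total += float(np.mean(np.asarray(cur, float) - np.asarray(prev, float)))
    return total

l1 = np.zeros(4)
l2, l3 = l1 + 2.0, l1 + 5.0               # D12 = 2, D23 = 3
d13 = summed_displacement([l1, l2, l3])   # via L2, rather than measured directly
l3c = l3 - d13                            # fourth correction of L3
```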
- The basic operation of the image processing apparatus according to the third embodiment differs from FIG. 20 in steps S2 to S4. That is, whereas in step S2 of the first embodiment the frame selection unit 120 performs the frame selection process on the base frame image BF and the reference frame image RF, in step S2 of the third embodiment a frame image existing between the shooting timing of the base frame image BF and the shooting timing of the reference frame image RF (hereinafter referred to as the "intermediate frame image") is additionally selected. That is, in the example of calculating the displacement amount D13 in FIG. 22, the frame image T1 is selected as the base frame image BF, the frame image T3 is selected as the reference frame image RF, and the frame image T2 existing between these shooting timings is also selected as the intermediate frame image.
- In step S3 of the third embodiment, the boundary line extraction process is performed not only on the base frame image BF and the reference frame image RF but also on the intermediate frame image.
- In step S4 of the third embodiment, after a displacement amount calculation process that calculates the displacement amount D of the reference boundary line RL with respect to the reference boundary line BL via the diaphragm boundary line in the intermediate frame image, a correction process corrects the diaphragm boundary line LI to be corrected using the displacement amount D (see FIG. 22).
- As described above, in the image processing apparatus according to the third embodiment, the displacement amount D between the reference boundary line BL and the diaphragm boundary line LI to be corrected is obtained as the sum of the displacement amounts D between temporally adjacent pairs of boundary lines LI. Since the displacement amount calculation process subdivides the single displacement amount D between the reference boundary line BL and the diaphragm boundary line LI before calculating it, the displacement amount can be calculated with higher accuracy than when it is calculated directly between the base frame image BF and the reference frame image RF.
- FIG. 23 is a diagram illustrating the functional configuration of the control unit 31A used in the image processing apparatus 3A configured as the fourth embodiment of the present invention.
- The control unit 31A is used in place of the control unit 31 (see FIG. 3) of the image processing apparatus 3 of the first embodiment.
- The differences from the first embodiment are that the control unit 31A further includes a period classification unit 115 and that the frame selection unit 120 is accordingly replaced with the frame selection unit 120A.
- The remaining configuration is the same as that of the image processing apparatus 3.
- Period classification unit 115: the period classification unit 115 detects the so-called respiratory cycle (target region cycle), which is the periodic change of the diaphragm of the subject M (body) synchronized with the imaging times at which the plurality of frame images MI acquired by the dynamic image acquisition unit 110 were captured, and classifies the plurality of frame images MI into respiratory cycle units (target region cycle units). The period classification unit 115 then outputs the plurality of frame images MI′ classified in respiratory cycle units to the frame selection unit 120A (see FIG. 23).
- When detecting the respiratory cycle of the subject M, the period classification unit 115 performs a respiratory information acquisition process for acquiring respiratory information, and detects the respiratory cycle PC, the inspiratory phase PH1, and the expiratory phase PH2 based on the respiratory information. The respiratory information acquisition process and the detection of the respiratory cycle are described separately below.
- The respiratory information acquisition process is a process of calculating a respiratory vibration value based on the plurality of frame images MI constituting the dynamic image acquired by the dynamic image acquisition unit 110, and using the respiratory vibration value as the respiratory information (see FIG. 23).
- Here, the respiratory vibration value is an index for measuring the change in the size of the lung field region due to respiration; examples include the distance between feature points of the lung field region (such as the distance from the lung apex to the diaphragm), the area value of the lung field region (the size of the lung field region), the absolute position of the diaphragm, and the pixel density values of the lung field. Below, the cases where the respiratory vibration value is the area value of the lung field region and the distance between feature points of the lung field region are described as examples.
- Case where the respiratory vibration value is the area value of the lung field region: FIGS. 24A and 24B are schematic views illustrating contour extraction of the lung field region. The lung field region is extracted using the extraction method described with reference to FIG. 8, for each of the left and right sides (see FIGS. 24A and 24B). That is, the respiratory information acquisition process extracts the lung field contour OL using the acquired plurality of frame images MI, and acquires the respiratory vibration value by calculating the number of pixels in the extracted region as the lung field area value (see FIGS. 24A and 24B).
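Once the lung field contour OL has been extracted as a binary mask per frame, the area-value form of the respiratory vibration value is just a pixel count. A minimal sketch; the masks here are synthetic and the function name is hypothetical.

```python
import numpy as np

def lung_area_values(masks):
    """Respiratory vibration value as the lung-field area value: the number
    of pixels inside the extracted lung contour OL, for each frame image MI."""
    return [int(np.count_nonzero(m)) for m in masks]

m1 = np.zeros((4, 4), dtype=bool); m1[1:3, 1:3] = True   # 4 lung pixels
m2 = np.zeros((4, 4), dtype=bool); m2[0:3, 1:4] = True   # 9 lung pixels
areas = lung_area_values([m1, m2])   # grows on inspiration, shrinks on expiration
```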
- Case where the respiratory vibration value is the distance between feature points of the lung field region: the distance between feature points is calculated using the plurality of frame images MI. That is, the lung field region is extracted in the same manner as described above, two feature points are obtained from the extracted region, and the distance between the two points is obtained as the respiratory vibration value. The change in the distance between the feature points (the respiratory vibration value) is taken as the respiratory phase PH.
- FIGS. 24C and 24D are diagrams illustrating the positions of feature points in the lung field region when the lung field contour OL of FIG. 24A is employed. FIG. 24C shows an example in which the lung apex is extracted as the upper end LT of the lung region, and the intersection of a straight line drawn from the lung apex in the body axis direction and the diaphragm is extracted as the lower end LB of the lung region. FIG. 24D shows an example in which the lung apex is extracted as the upper end LT of the lung region and the costophrenic angle is extracted as the lower end LB of the lung region.
- That is, the contour OL of the lung field region is extracted using the acquired plurality of frame images MI, and the respiratory vibration value is acquired by detecting the distance between the feature points in the extracted region (see FIGS. 24C and 24D).
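The feature-point form of the vibration value can be sketched in the manner of FIG. 24C: take the lung apex as the upper end LT and the diaphragm point below it along the body axis as the lower end LB. The mask and the single-column convention are assumptions for illustration.

```python
import numpy as np

def feature_point_distance(mask):
    """Distance between two feature points of a binary lung-field mask:
    the top-most lung pixel (lung apex, upper end LT) and the lowest lung
    pixel in the same image column (diaphragm intersection, lower end LB)."""
    ys, xs = np.nonzero(mask)
    apex = np.argmin(ys)                     # lung apex LT
    column = np.nonzero(mask[:, xs[apex]])[0]
    return float(column.max() - ys[apex])    # LT-to-LB distance in pixels

mask = np.zeros((10, 6), dtype=bool)
mask[2:8, 2:5] = True                        # synthetic lung region, rows 2..7
d = feature_point_distance(mask)
```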
- The respiratory phase PH, that is, the waveform data of the respiratory vibration values acquired in the respiratory information acquisition process arranged in time series (the area value of the lung field region, the distance between feature points, and so on), is obtained as the result of monitoring the respiratory vibration value in the time direction at each imaging timing TM.
- In this embodiment, the respiratory information is obtained using the captured images, but a measurement result from an external device may also be used. In that case, information relating to the respiratory cycle is input to the period classification unit 115 from the external device. As the external device, for example, an apparatus as described in Japanese Patent No. 3793102 can be used. It is also possible to apply a method implemented by monitoring with a sensor composed of a laser beam and a CCD camera (for example, "Study on sleep monitoring of sleepers using FG visual sensor", Hiroki Aoki, Masato Nakajima, Proceedings of the 2001 IEICE Society Conference, Information and Systems Society Conference, 320-321, 2001-08-29). Respiratory information can also be obtained by detecting the movement of the chest of the subject M using laser irradiation, a respiratory monitor belt, or the like, or by detecting the respiratory airflow using an anemometer.
- Detection of the respiratory cycle: the change in the respiratory vibration value detected in the respiratory information acquisition process is taken as the respiratory phase PH, and the respiratory cycle PC, the inspiratory phase PH1, and the expiratory phase PH2 are detected. Specifically, the inspiratory phase PH1 and the expiratory phase PH2 are detected by calculating the maximum value B1 and the minimum value B2 of the respiratory vibration value within the respiratory cycle PC (see FIG. 5).
- One respiratory cycle PC is composed of one expiration and one inspiration. During inspiration, the area of the lung field region in the thorax increases as the diaphragm lowers and air is breathed in; the point at which air has been inhaled to the maximum (the conversion point from inspiration to expiration) is the maximum inspiratory phase IM. During expiration, the area of the lung field region decreases as the diaphragm rises and air is breathed out; the point at which air has been exhaled to the maximum (the conversion point from expiration to inspiration) is the maximum expiratory phase EM.
- Here, the respiratory cycle PC is taken between the maximum values B1, but it may instead be taken between the minimum values B2.
- In the first method, the respiratory cycle PC is determined by sequentially calculating the times at which the respiratory vibration value takes a maximum value and a minimum value over the entire time of the dynamic image, and the maximum value B1 and the minimum value B2 of the respiratory vibration value within the respiratory cycle PC are thereby determined. Specifically, the maximum value (maximum inspiratory phase IM) and the minimum value (maximum expiratory phase EM) of the respiratory vibration value are calculated in a state where the respiratory vibration value over the entire time has been smoothed to reduce high-frequency noise components. As a result, noise components contained in the respiratory vibration value are prevented from being erroneously detected as maximum or minimum values.
- The second method first detects the respiratory cycle PC and then detects, for each respiratory cycle PC, the times at which the respiratory vibration value takes the maximum value and the minimum value. The difference from the first method is that the maximum value (that is, the maximum inspiratory phase IM) and the minimum value (maximum expiratory phase EM) of the respiratory vibration value are calculated not over the entire time but within each respiratory cycle PC. In the second method as well, the maximum and minimum values may be extracted in a state where the respiratory vibration value has been smoothed to reduce high-frequency noise components.
- In this way, the period classification unit 115 takes the change in the respiratory vibration value as the respiratory phase PH, detects the maximum value B1 and the minimum value B2 of the respiratory vibration value within the respiratory cycle PC, and thereby detects the inspiratory phase PH1 and the expiratory phase PH2 (see FIG. 5).
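The smoothing-then-extrema detection described for both methods can be sketched as follows; the moving-average window is an arbitrary illustrative choice, and the names are hypothetical.

```python
import numpy as np

def detect_extrema(vibration, window=5):
    """Smooth the respiratory vibration waveform with a moving average to
    reduce high-frequency noise, then locate local maxima (maximum
    inspiratory phases IM) and local minima (maximum expiratory phases EM)."""
    kernel = np.ones(window) / window
    s = np.convolve(vibration, kernel, mode="same")
    maxima = [i for i in range(1, len(s) - 1) if s[i - 1] < s[i] >= s[i + 1]]
    minima = [i for i in range(1, len(s) - 1) if s[i - 1] > s[i] <= s[i + 1]]
    return maxima, minima

t = np.arange(60)
v = np.sin(2 * np.pi * t / 20)   # synthetic vibration value, period of 20 frames
maxima, minima = detect_extrema(v)
```

Consecutive maxima then delimit respiratory cycles PC, matching the choice of taking the cycle between maximum values B1.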
- Frame selection unit 120A: the frame selection process of (a1) in the frame selection unit 120A comprises the following first selection process and second selection process.
- The first selection process is a process of selecting, as the base frame image BF, one of: (b1) the frame image at the time when the respiratory vibration value corresponds to a preset first set value; (b2) the frame image at the time when the respiratory vibration value corresponds to the maximum value B1; and (b3) the frame image at the time when the respiratory vibration value corresponds to the minimum value B2.
- The second selection process is a process of selecting, as the reference frame image RF, one of: (c1) the frame image at the time when the respiratory vibration value corresponds to a preset second set value; (c2) a frame image determined temporally relative to the base frame image BF; (c3) when the base frame image BF is the frame image of (b2), the frame image at the time when the respiratory vibration value corresponds to the minimum value B2; and (c4) when the base frame image BF is the frame image of (b3), the frame image at the time when the respiratory vibration value corresponds to the maximum value B1.
- The first set value of (b1) and the second set value of (c1) are respiratory vibration values arbitrarily designated by the user. That is, in the first selection process the frame image corresponding to the designated first set value is selected as the base frame image BF, and in the second selection process the frame image corresponding to the designated second set value is selected as the reference frame image RF.
- Thereby, the user can arbitrarily designate the base frame image BF and the reference frame image RF from among the frame images within the respiratory cycle PC.
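Cases (b1) and (c1) of the selection processes can be sketched as a nearest-value lookup over the vibration values within one respiratory cycle PC; the function name and the exact matching rule are assumptions for illustration.

```python
import numpy as np

def select_frames(vibration, first_set, second_set):
    """Select, within one respiratory cycle PC, the base frame image BF as
    the frame whose respiratory vibration value is closest to the
    user-designated first set value, and the reference frame image RF as
    the frame closest to the second set value."""
    v = np.asarray(vibration, dtype=float)
    bf = int(np.argmin(np.abs(v - first_set)))    # base frame image BF
    rf = int(np.argmin(np.abs(v - second_set)))   # reference frame image RF
    return bf, rf

bf, rf = select_frames([0.0, 2.0, 4.0, 6.0, 8.0], first_set=4.4, second_set=7.9)
```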
- FIG. 25 is a diagram illustrating an operation flow of the image processing apparatus 3A according to the fourth embodiment.
- Steps SA1 and SA4 to SA8 are the same as steps S1 and S3 to S7 in FIG. 20, respectively. In the fourth embodiment, the period classification unit 115, which was not present in the first embodiment, is added and the frame selection unit 120 is replaced with the frame selection unit 120A, so only the following steps are changed.
- In step SA2, the period classification unit 115 detects the respiratory cycle PC of the subject M (body) synchronized with the imaging times at which the plurality of frame images MI acquired in step SA1 were captured, and classifies the plurality of frame images MI into units of the respiratory cycle PC, thereby obtaining a plurality of frame images MI′. At this time, the period classification unit 115 simultaneously detects the maximum value B1 and the minimum value B2 of the respiratory vibration value, the inspiratory phase PH1, and the expiratory phase PH2.
- In step SA3, the frame selection unit 120A selects the base frame image BF and the reference frame image RF by performing, as the frame selection process of (a1), the first selection process and the second selection process in consideration of the respiratory cycle PC detected in step SA2. The remaining steps are the same as in the first embodiment.
- As described above, in the image processing apparatus 3A according to the fourth embodiment, the base frame image BF and the reference frame image RF are frame images within the same respiratory cycle PC, and the frame selection process of (a1) includes the first selection process of selecting one of the frame images (b1) to (b3) as the base frame image BF and the second selection process of selecting one of the frame images (c1) to (c4) as the reference frame image RF. This makes it possible to accurately diagnose the change in the shape of the diaphragm boundary line LI between frame images within the same cycle, as desired by the user.
- The fifth embodiment aims to obtain displacement-corrected boundary line information using past and present dynamic images of the subject M. In the following description, the terms "current" and "past" are prefixed to element names; "current" here is used in the sense of being temporally newer than what is labeled "past".
- FIG. 26 is a diagram showing a functional configuration of the control unit 31B used in the image processing apparatus 3B configured as the fifth embodiment of the present invention.
- the control unit 31B is used as an alternative to the control unit 31 (see FIG. 3) in the image processing apparatus 3 of the first embodiment.
- The differences from the first embodiment are that, so that dynamic images acquired in the past can also be handled, a period classification unit 115B is further provided, and the dynamic image acquisition unit 110B, the frame selection unit 120B, the boundary line extraction unit 130B, the displacement correction unit 140B, and the image generation unit 150B are used in place of the corresponding units.
- The information storage device 5 is composed of, for example, a database server using a personal computer or a workstation, and includes a past image storage unit (database) 51, which transmits and receives data to and from the control unit 31B via the bus 36 (see FIG. 1).
- The past image storage unit 51 stores in advance dynamic images of the subject M taken in the past and used for diagnosis.
- the remaining configuration is the same as that of the image processing apparatus 3.
- Dynamic image acquisition unit 110B: The dynamic image acquisition unit 110B includes, in addition to a current dynamic image acquisition unit 310 that corresponds to the dynamic image acquisition unit 110 described above and acquires the current dynamic image (a plurality of current frame images NMI), a past dynamic image acquisition unit 210 that acquires a past dynamic image (a plurality of past frame images PMI) of the same subject M as the current frame images NMI (see FIG. 26).
- the past dynamic image acquisition unit 210 acquires a past dynamic image from the past image storage unit 51, for example, as shown in FIG.
- The imaging ranges of the current frame image NMI and the past frame image PMI are the same; in any case, at least the diaphragm region is necessarily included in both.
- That is, the selection target frame images TI are configured to include two types of frame images captured at different imaging times for the same subject M (body): the current frame images NMI and the past frame images PMI.
- Period classification unit 115B includes a past period classification unit 215 and a current period classification unit 315 (see FIG. 26).
- the past cycle classification unit 215 and the current cycle classification unit 315 included in the cycle classification unit 115B have the same functions as the cycle classification unit 115 described above.
- The current cycle classification unit 315 detects the current respiratory cycle in the subject M synchronized with the imaging times at which the plurality of current frame images NMI acquired by the current dynamic image acquisition unit 310 were captured, and classifies the plurality of current frame images NMI into current respiratory cycle units. The current cycle classification unit 315 then outputs the plurality of current frame images NMI' after classification into current respiratory cycle units to the frame selection unit 120B (see FIG. 26).
- The past cycle classification unit 215 detects the past respiratory cycle in the subject M synchronized with the imaging times at which the plurality of past frame images PMI acquired by the past dynamic image acquisition unit 210 were captured, and classifies the plurality of past frame images PMI into past respiratory cycle units. The past cycle classification unit 215 then outputs the plurality of past frame images PMI' after classification into past respiratory cycle units to the frame selection unit 120B (see FIG. 26).
- FIG. 27 is a schematic diagram showing the waveform data of respiratory vibration values detected by the period classification unit 115B in time series.
- FIG. 27A shows the past breathing phase PPH and the past imaging timing PTM detected by the past cycle classification unit 215, and FIG. 27B shows the current breathing phase NPH and the current imaging timing detected by the current cycle classification unit 315.
- As shown in FIG. 27A, the past cycle classification unit 215 detects the past respiratory cycle PPC, the past maximum value PB1 and past minimum value PB2 of the past respiratory vibration value, the past inspiration phase PPH1, and the past expiration phase PPH2. Further, as shown in FIG. 27B, the current cycle classification unit 315 detects the current respiratory cycle NPC, the current maximum value NB1 and current minimum value NB2 of the current respiratory vibration value, the current inspiration phase NPH1, and the current expiration phase NPH2.
- Here, the past respiratory cycle PPC (current respiratory cycle NPC) is taken as the interval between past maximum values PB1 (current maximum values NB1), but it may instead be taken as the interval between past minimum values PB2 (current minimum values NB2).
- Frame selection unit 120B: Since the cycle classification unit 115B detects the past respiratory cycle PPC and the like and the current respiratory cycle NPC and the like, the frame selection unit 120B can perform the following processing.
- The frame selection process in the frame selection unit 120B selects, from the selection target frame images TI (two types of frame images, i.e., the current frame images NMI and the past frame images PMI), a past frame image PMI (a frame image captured temporally in the past) as the base frame image BF.
- For example, as shown in FIG. 27, when the selection target frame images TI are ten frames consisting of the past frame images PT1 to PT5 and the current frame images NT1 to NT5, the frame selection process selects the base frame image BF from the five past frame images PT1 to PT5, and selects the current frame images NT1 to NT5 as the reference frame images RF.
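The selection rule of this example can be sketched as follows; a minimal illustration in which the function name and string labels are hypothetical stand-ins for the frame images PT1 to PT5 and NT1 to NT5, not the patent's specified implementation:

```python
# Hypothetical sketch of the frame selection in unit 120B: from the ten
# selection-target frames (five past, five current), the base frame BF is
# taken from the past series and every current frame becomes a reference
# frame RF.

def select_frames(past_frames, current_frames):
    """Pick the base frame from the past series; all current frames are references."""
    base_frame = past_frames[0]              # e.g. PT1 in the FIG. 27 example
    reference_frames = list(current_frames)  # NT1..NT5
    return base_frame, reference_frames

past = ["PT1", "PT2", "PT3", "PT4", "PT5"]
current = ["NT1", "NT2", "NT3", "NT4", "NT5"]
bf, rf = select_frames(past, current)
print(bf)   # PT1
print(rf)   # ['NT1', 'NT2', 'NT3', 'NT4', 'NT5']
```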
- Boundary line extraction unit 130B: The boundary line extraction unit 130B performs the same processing as the boundary line extraction unit 130 on the past frame image PTI and the current frame images NTI corresponding to the base frame image BF and the reference frame images RF selected by the frame selection unit 120B.
- For example, in FIG. 27, the process is performed on the frame image selected as the base frame image BF from among the past frame images PT1 to PT5 and on all of the current frame images NT1 to NT5. That is, the past diaphragm boundary line is extracted from the past frame image selected as the base frame image BF, and the current diaphragm boundary lines NL1 to NL5 (NLI) are extracted from the current frame images NT1 to NT5 (see FIG. 26).
- Displacement correction unit 140B: The displacement correction unit 140B performs the same processing as the displacement correction unit 140 on the past diaphragm boundary line PLI and the current diaphragm boundary lines NLI corresponding to the base boundary line BL and the reference boundary lines RL extracted by the boundary line extraction unit 130B.
- In the example of FIG. 27, the base boundary line BL is always the past diaphragm boundary line PL1, while the reference boundary line RL is any one of the current diaphragm boundary lines NL1 to NL5, and the processing is performed accordingly using the lines NL1 to NL5. That is, the current displacement-corrected boundary lines NL1c to NL5c (NLIc) are obtained from the current diaphragm boundary lines NL1 to NL5 using the past diaphragm boundary line PL1 as the displacement reference (see FIG. 26).
- Image generation unit 150B: The image generation unit 150B performs the same processing as the image generation unit 150 on each of the past displacement-corrected boundary lines PLIc and the current displacement-corrected boundary lines NLIc obtained by the displacement correction unit 140B.
- In the example of FIG. 27, the processing is performed for each of the current displacement-corrected boundary lines NL1c to NL5c. That is, the current displacement-corrected boundary line information NLG is obtained from the current displacement-corrected boundary lines NL1c to NL5c (see the block diagram of FIG. 26).
- FIG. 28 is a diagram illustrating an operation flow of the image processing apparatus 3B according to the fifth embodiment.
- The fifth embodiment includes the past dynamic image acquisition unit 210, the current dynamic image acquisition unit 310, the current cycle classification unit 315, and the past cycle classification unit 215, which did not exist in the first embodiment; accordingly, the process is changed as follows.
- step SB1A the past dynamic image acquisition unit 210 acquires a past dynamic image (a plurality of past frame images PMI) from the past image storage unit 51 of the information storage device 5.
- In step SB2A, the past cycle classification unit 215 classifies the plurality of past frame images PMI in units of the past respiratory cycle PPC, thereby obtaining a plurality of past frame images PMI'.
- At this time, in addition to the past respiratory cycle PPC, the past cycle classification unit 215 also detects the past maximum value PB1 and past minimum value PB2 of the past respiratory vibration value, the past inspiration phase PPH1, and the past expiration phase PPH2.
- Steps SB1B and SB2B are performed in parallel with steps SB1A and SB2A. That is, in step SB1B, the current dynamic image acquisition unit 310 acquires the current dynamic image (a plurality of current frame images NMI) captured by the reading control device 14 of the imaging device 1, via the imaging control device 2.
- step SB2B the current cycle classification unit 315 classifies the plurality of current frame images NMI into current respiratory cycle NPC units, thereby obtaining a plurality of current frame images NMI ′.
- At this time, in addition to the current respiratory cycle NPC, the current cycle classification unit 315 also detects the current maximum value NB1 and current minimum value NB2 of the current respiratory vibration value, the current inspiration phase NPH1, and the current expiration phase NPH2.
- In step SB3, the frame selection process in the frame selection unit 120B selects, from the selection target frame images TI (the two types of frame images, i.e., the current frame images NMI and the past frame images PMI), one past frame image PMI as the base frame image BF.
- The remaining frame selection process is the same as in step S2. That is, in the example of FIG. 27, the past frame image PT1 is selected as the base frame image BF, and the current frame images NT1 to NT5 are selected as the reference frame images RF; the past frame images PT2 to PT5 are excluded from the processing target here.
- In step SB4, the boundary line extraction unit 130B performs the same processing as step S3 described above on the past frame image PTI and the current frame image NTI corresponding to the base frame image BF and the reference frame image RF selected in step SB3, thereby extracting the base boundary line BL and the reference boundary line RL.
- In step SB5, the displacement correction unit 140B appropriately performs the same processing as step S4 described above on the current diaphragm boundary line NLI corresponding to the base boundary line BL and the reference boundary line RL extracted in step SB4, thereby obtaining the current displacement-corrected boundary line NLIc.
- In step SB6, when further processing is to be performed in the displacement correction unit 140B, that is, when the displacement correction unit 140B changes the diaphragm boundary line LI (reference boundary line RL) to be corrected, the frame selection unit 120B changes the reference frame image RF, and the processes of steps SB3 to SB5 are repeated. On the other hand, when the processing in the displacement correction unit 140B is to be ended, the process proceeds to step SB7.
- In step SB7, the image generation unit 150B performs the same processing as step S6 described above on the current displacement-corrected boundary lines NLIc obtained in step SB5, thereby obtaining the current displacement-corrected boundary line information NLG.
- The current displacement-corrected boundary line information NLG includes at least information indicating the current displacement-corrected boundary lines NLIc and information indicating the base boundary line BL.
- In step SB8, the image generation unit 150B outputs the current displacement-corrected boundary line information NLG generated in step SB7 to the display unit 34 or the storage unit 32 (see FIG. 26), and this operation flow ends.
- As described above, in the image processing apparatus 3B according to the fifth embodiment, the selection target frame images TI include two types of frame images captured at different imaging times for the same subject M (body): the current frame images NMI and the past frame images PMI (frame images captured temporally earlier than the plurality of current frame images), and the frame selection process includes a process of selecting, as the base frame image BF, a frame image captured in the past for the same body. That is, when obtaining the current displacement-corrected boundary line information NLG indicating the current displacement-corrected boundary lines NLIc, a common (identical) frame image (the past frame image PT1 in the example of FIG. 27) can be used as the displacement reference. This makes it possible to accurately compare, in dynamic diagnosis, the past and present shapes of the diaphragm boundary line LI of one body and the changes therein. Follow-up observation can therefore be performed accurately.
- In the fifth embodiment, the current displacement-corrected boundary line information NLG is obtained in order to display the difference in shape change between the past and the present of the diaphragm boundary line LI of the subject M. Alternatively, for the purpose of displaying the difference in shape change between the right side and the left side of the diaphragm boundary line LI within the same frame image, right-side displacement-corrected boundary line information and left-side displacement-corrected boundary line information may be obtained (not shown).
- In this case, the base boundary line BL is implemented using a common (identical) right diaphragm boundary line (or left diaphragm boundary line).
- However, since the shapes of the right and left diaphragm boundary lines LI are approximately axisymmetric with the spine as the axis of symmetry, it is necessary to horizontally invert the shape of either the right or the left diaphragm boundary line LI and then perform the displacement amount calculation process, thereby obtaining the right-side displacement-corrected boundary line and the left-side displacement-corrected boundary line.
- In a second modification, the base boundary line BL is set to a common (identical) inspiration diaphragm boundary line (or expiration diaphragm boundary line), thereby obtaining an inspiration displacement-corrected boundary line and an expiration displacement-corrected boundary line.
- In the first to fifth embodiments, the target region is the diaphragm region; the sixth embodiment handles the case where the target region is the heart region.
- The difference from the first embodiment is that the boundary line extraction unit is changed to a boundary line extraction unit 130C that extracts the boundary line of the heart region and, when a cycle classification unit is provided, it is changed to a cycle classification unit 115C that classifies the plurality of frame images MI based on the periodic change of the heart.
- the remaining configuration is the same as that of the image processing apparatus according to the first to fifth embodiments.
- Since the motion of the heart includes motion accompanying respiration in addition to the heartbeat itself, an appropriate displacement-corrected boundary line LIc cannot be obtained even if the correction process is executed using the first displacement amount calculation process, which considers only vertical movement. Therefore, it is preferable to calculate the displacement amount D by the second to fourth displacement amount calculation processes. Alternatively, the displacement amount D of the heart region may be calculated by the first displacement amount calculation process after the displacement amount D of the diaphragm region has been removed.
- In the following, the boundary line extraction unit 130C will be described first, and then the cycle classification unit 115C.
- Boundary line extraction unit 130C: The boundary line extraction unit 130C performs a boundary line extraction process on the first number of frame images TI, extracting the boundary lines of the heart region to obtain the first number of cardiac boundary lines (target region boundary lines).
- Various known techniques can be employed to detect the cardiac boundary line (heart contour) from each frame image. For example, a method can be used that detects the contour of the heart by matching feature points in the X-ray image against feature points of a model (heart model) indicating the shape of the heart (see, e.g., "Image feature analysis and computer-aided diagnosis in digital radiography: Automated analysis of sizes of heart and lung in chest images", Nobuyuki Nakamori et al., Medical Physics, Volume 17, Issue 3, May 1990, pp. 342-350).
- the method for extracting the cardiac boundary line HLI is not limited to the above method, and any method may be used as long as it can be extracted from a dynamic image.
- FIG. 29 is a schematic diagram illustrating the cardiac boundary line extracted from each frame image. As shown in FIGS. 29A to 29C, the cardiac boundary lines HL1 to HL3 (HLI) are extracted from the respective frame images.
- Cycle classification unit 115C: The cycle classification unit 115C detects the so-called heartbeat cycle (target region cycle), which is the periodic change of the heart region in the subject M (body) synchronized with the imaging times at which the plurality of frame images MI acquired by the dynamic image acquisition unit 110 were captured, and classifies the plurality of frame images MI into heartbeat cycle units (target region cycle units). The cycle classification unit 115C then outputs the plurality of frame images MI' after classification into heartbeat cycle units to the frame selection unit.
- Heartbeat cycle acquisition process: In the heartbeat cycle acquisition process, the heartbeat cycle is acquired by calculating the amount of motion of the heart wall (that is, the cardiac boundary line HLI) using the captured images acquired by the dynamic image acquisition unit 110. Specifically, the phase of the heartbeat at the timing at which each frame image was captured is detected by detecting the fluctuation of the heart wall from the dynamic image, and the heartbeat cycle is then determined based on the phase of the heartbeat.
- In FIG. 29, a variation in the lateral width of the heart is adopted as an example of the fluctuation of the heart wall (cardiac boundary line HLI) captured in the dynamic image. That is, FIGS. 29(a) to 29(c) illustrate the lateral width of the heart increasing from w1 to w3 as the heart expands.
- Therefore, the heartbeat cycle can be detected by detecting the contour of the heart (cardiac boundary line HLI) from each frame image using, for example, the method described above, and then detecting the lateral width of the heart.
- FIG. 30 is a schematic diagram illustrating, for the plurality of frame images constituting the dynamic image, the relationship between the time at which each image was captured and the lateral width of the heart (the amount of motion of the heart wall). In FIG. 30, the horizontal axis represents time, the vertical axis represents the lateral width of the heart, and each detected width value is plotted as a circle.
- For example, let Hwt be the lateral width of the heart captured at time t and Hwt+1 be the lateral width of the heart captured at time t+1. If (Hwt+1 - Hwt) > 0 holds, the frame image captured at time t is classified as captured while the heart is expanding, and if (Hwt+1 - Hwt) < 0 holds, the frame image captured at time t is classified as captured while the heart is contracting. In this way, the frame images can be classified based on the lateral width of the heart (that is, the fluctuation of the heart wall).
- As described above, the cycle classification unit 115C can classify the plurality of frame images MI into heartbeat cycle units by detecting the heartbeat cycle based on the motion of the heart wall captured in the dynamic image.
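The sign test on the heart-width differences described above can be sketched as follows; a minimal illustration in which the function name is hypothetical and the widths are assumed to have already been measured from the cardiac boundary lines HLI:

```python
# Minimal sketch of the expansion/contraction classification described above:
# a frame captured at time t is labeled "expansion" when Hw(t+1) - Hw(t) > 0
# and "contraction" otherwise.

def classify_heart_phases(widths):
    """Label each frame (except the last) by the sign of the width change."""
    labels = []
    for t in range(len(widths) - 1):
        diff = widths[t + 1] - widths[t]
        labels.append("expansion" if diff > 0 else "contraction")
    return labels

# Widths w1 < w2 < w3 as in FIG. 29, then the heart contracts again.
print(classify_heart_phases([30.0, 33.5, 36.0, 34.0]))
# ['expansion', 'expansion', 'contraction']
```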
- the heartbeat cycle acquisition process is not limited to the above method, and may be a method of acquiring a heartbeat cycle using a result acquired from an electrocardiograph.
- In this case, the detection operation by the phase detection unit of the electrocardiograph is performed in synchronization with the imaging operation by the imaging device 1.
- As described above, the same effects as in the above embodiments can be obtained even when the target region is the heart region. In particular, when the symptom is mild, an abnormality may go unnoticed by visual observation alone; however, when diagnosis is performed using the displacement-corrected boundary line information LG, it does not depend on the subjectivity of the user, and misdiagnosis can therefore be prevented.
- The image processing apparatuses 3, 3A, 3B, and so on have been described separately for each embodiment, but their individual functions may be combined with one another as long as they are not mutually contradictory.
- The displacement-corrected boundary line information LG may be superimposed on the dynamic image so that the user can see which line on the dynamic image it corresponds to.
- Further, the displacement-corrected boundary line information LG displayed on the dynamic image may include not only results determined to be abnormal but also the maximum expiration, the maximum inspiration, and the periods before and after a suspected disease position.
- For example, the displacement-corrected boundary line L5c corresponding to the maximum expiration and the displacement-corrected boundary line L1c corresponding to the maximum inspiration may always be displayed during moving image reproduction, while the others are hidden.
- In the first embodiment, the image processing apparatus 3 has been described as being configured to include the frame selection unit 120, but it may be configured without it. That is, in a configuration not including the frame selection unit 120, the boundary line extraction unit 130 performs the boundary line extraction process on all of the plurality of frame images MI constituting the dynamic image acquired by the dynamic image acquisition unit 110, and the displacement correction unit 140 sequentially sets the base boundary line BL and the reference boundary line RL under conditions (predetermined rules) specified in advance by the user, thereby performing the displacement amount calculation process and the correction process.
- In the above embodiments, the frame selection process in the frame selection unit 120 is performed as a pre-process of the boundary line extraction process in the boundary line extraction unit 130, but the present invention is not limited to this; the reverse configuration may be used. That is, after the boundary line extraction process has been performed on all of the plurality of frame images MI constituting the dynamic image acquired by the dynamic image acquisition unit 110, the frame selection process selects the base frame image BF (base boundary line BL) and the reference frame image RF (reference boundary line RL). In this case, the diaphragm boundary line LI of the base frame image BF is set as the base boundary line BL, and the diaphragm boundary line LI of the reference frame image RF is set as the reference boundary line RL.
- In the above embodiments, the target region is the diaphragm region or the heart region, but the present invention is not limited to this; the target region may be both the diaphragm region and the heart region. That is, the processing up to obtaining the displacement-corrected boundary line information LG is performed in parallel for the diaphragm region and the heart region, and the displacement-corrected boundary line information LG is output to the display unit 34 and the storage unit 32 separately for each target region.
- In the first modification of the fifth embodiment, the right-side displacement-corrected boundary line and the left-side displacement-corrected boundary line are obtained, but the present invention is not limited to this. For example, by implementing the base boundary line BL using a common (identical) healthy person's diaphragm boundary line against a non-healthy person's diaphragm boundary line LI, a healthy-person displacement-corrected boundary line and a non-healthy-person displacement-corrected boundary line can also be obtained.
- the subject may be an animal body as well as a human body.
Description
A radiation dynamic image capturing system according to a first embodiment of the present invention will be described below.
The radiation dynamic image capturing system according to the first embodiment captures radiographic images of a subject, which is a human or animal body, in a situation where the physical state of a target region of the subject changes periodically over time.
The imaging device 1 is constituted by, for example, an X-ray imaging device or the like, and captures the dynamics of the chest of the subject M accompanying respiration. Dynamic imaging is performed by acquiring a plurality of images sequentially in time while repeatedly irradiating the chest of the subject M with radiation such as X-rays. The series of images obtained by this continuous imaging is called a dynamic image, and each of the plurality of images constituting the dynamic image is called a frame image.
The imaging control device 2 outputs radiation irradiation conditions and image reading conditions to the imaging device 1 to control the radiographic imaging and radiographic image reading operations of the imaging device 1, and also displays the dynamic image acquired by the imaging device 1 so that the radiographer can confirm the positioning and whether the image is suitable for diagnosis.
The image processing apparatus 3 acquires the dynamic image transmitted from the imaging device 1 via the imaging control device 2, and displays an image for interpretive diagnosis by a physician or the like.
As a premise for describing the details of the image processing apparatus 3 in this embodiment, the relationship between respiratory motion and the position of the diaphragm, and the associated problems in dynamic diagnosis, will first be explained.
The image processing apparatus 3 of the radiation dynamic image capturing system 100 according to the first embodiment of the present invention displays boundary line information from which deformation due to vertical movement, translation, and rotation between frame images has been removed, thereby enabling dynamic diagnosis to be performed appropriately and efficiently.
FIG. 3 is a diagram showing, together with other components, the functional configuration realized by the control unit 31 of the image processing apparatus 3 in the radiation dynamic image capturing system 100 when a CPU or the like operates according to various programs. The image processing apparatus 3 of this embodiment mainly uses dynamic images in which the chest, including the heart and both lungs, is captured.
The dynamic image acquisition unit 110 acquires a dynamic image composed of a plurality of frame images captured sequentially in the time direction by the reading control device 14 of the imaging device 1, showing the time-varying physical state of a target region inside the body of the subject M. In this embodiment, the target region is assumed to be the diaphragm region. That is, as shown in FIG. 3, the imaging control device 2 is interposed between the imaging device 1 and the image processing apparatus 3, and the detection data (a plurality of frame images MI) stored in the storage unit 22 of the imaging control device 2 are output to the communication unit 35 of the image processing apparatus 3 via the communication unit 25.
The frame selection unit 120 performs, on the selection target frame images TI including at least a first number (two or more) of frame images MI, a frame selection process that includes selecting a base frame image BF for extracting a base boundary line BL (described later) and reference frame images RF for extracting the diaphragm boundary lines LI (details described later) other than the base boundary line.
The boundary line extraction unit 130 performs a boundary line extraction process on the selection target frame images TI, extracting the boundary line of the target region to obtain the first number of target region boundary lines. Since the target region in this embodiment is the diaphragm region, the target region boundary lines are described below as diaphragm boundary lines.
The first boundary line extraction process extracts the diaphragm boundary line LI by extracting the contour of the lung field based on the selection target frame image TI. FIG. 8 is a schematic diagram illustrating contour extraction of the lung field including the diaphragm boundary line LI. As shown in FIG. 8, the lung field may be extracted separately for the left and right sides (see FIG. 8(a)), or as a contour including the heart and spine regions (see FIG. 8(b)). As this extraction method, conventional techniques can be employed (see, e.g., "Image feature analysis and computer-aided diagnosis: Accurate determination of ribcage boundary in chest radiographs", Xin-Wei Xu and Kunio Doi, Medical Physics, Volume 22(5), May 1995, pp. 617-626).
The second boundary line extraction process extracts the diaphragm boundary line LI by model-based extraction. That is, candidate positions of the diaphragm are roughly extracted (coarse extraction) by template matching, one of the model-based methods, and the extracted candidate regions are then analyzed in detail (fine extraction) for accurate extraction. By using medical knowledge about the motion of the diaphragm in the coarse extraction, the templates can be weighted according to the amount of diaphragm motion, which improves the accuracy of the coarse extraction and hence the extraction accuracy of the diaphragm boundary line LI. As this extraction method, for example, Japanese Patent Application No. 2012-138364 (filed June 20, 2012) by the present applicant can be employed.
The third boundary line extraction process extracts the diaphragm boundary line LI by profile analysis. FIG. 9 is a diagram explaining profile analysis. FIG. 9(a) shows the profile region R2 of the selection target frame image TI, and FIG. 9(b) is a graph whose horizontal axis represents the vertical coordinate of the profile region R2 of the selection target frame image TI (see FIG. 9(a)) and whose vertical axis represents the gray value. As shown in FIG. 9, a profile R2 is created in the vertical direction, and the change point of the gray-value peak in the created profile R2 can be extracted as the diaphragm boundary line.
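The peak-change-point idea of the profile analysis can be illustrated with a short sketch; this is a simplified assumption in which a single 1-D gray-value profile stands in for one column of the profile region R2, and the boundary is taken at the row with the largest gray-value step (the function name and sample values are hypothetical):

```python
# Hedged sketch of the profile analysis: along a vertical profile of gray
# values, the diaphragm boundary is taken at the row where the gray value
# changes most steeply (the peak change point).

def boundary_row_from_profile(profile):
    """Return the index of the largest absolute gray-value step in the profile."""
    best_row, best_step = 0, -1.0
    for y in range(len(profile) - 1):
        step = abs(profile[y + 1] - profile[y])
        if step > best_step:
            best_row, best_step = y, step
    return best_row

# Bright lung field (high values) above, dense abdomen (low values) below:
profile = [200, 198, 197, 195, 90, 85, 84]
print(boundary_row_from_profile(profile))  # 3  (the lung/diaphragm transition)
```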
The fourth boundary line extraction process extracts the diaphragm boundary line LI by user specification. Specifically, the user may simply draw a line designating the diaphragm boundary line LI to be extracted, or user specification may be used as a method for correcting the diaphragm boundary line LI extracted by the first to third boundary line extraction processes above. In the former case, where the user simply designates a line, it is preferable that the user designate the line on only one of the target selection target frame images TI, and that the remaining frame images be tracked in the time direction by employing a corresponding-point search method or the like.
FIG. 10 is a schematic diagram explaining the amount of displacement between the diaphragm boundary lines LI in the selection target frame images TI extracted by the boundary line extraction unit 130. FIGS. 10(a) and 10(b) show the diaphragm boundary lines L1 and L2 in the frame images T1 and T2, respectively, extracted by the boundary line extraction unit 130, and FIG. 10(c) shows the diaphragm boundary lines L1 and L2 superimposed.
(i) motion in which the shape of the diaphragm boundary line LI itself changes, that is, "shape deformation", and
(ii) motion accompanying the respiratory movement of the diaphragm as shown in FIGS. 11(a) and 11(b), that is, "deformation due to vertical movement, translation, and rotation".
These two types of change are included. Of these, in order to obtain only the shape deformation of (i), it is necessary to remove from the diaphragm boundary line LI the deformation of (ii) due to vertical movement, translation, and rotation (see FIG. 11).
The first displacement amount calculation process is effective when the deformation of (ii) above is caused only by vertical movement. That is, it is performed on the premise that the displacement between diaphragm boundary lines LI is in the vertical direction only.
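Under the vertical-only premise of the first displacement amount calculation process, a minimal sketch looks like the following; the point-list representation of a boundary line and the function names are illustrative assumptions, not the patent's specified implementation:

```python
# Sketch of the first displacement amount calculation, which assumes the
# deformation between two diaphragm boundary lines is vertical only: the
# displacement D is estimated as the mean y-difference between corresponding
# boundary points, and the correction subtracts it from the reference line.

def vertical_displacement(base_line, ref_line):
    """Mean vertical offset between corresponding (x, y) boundary points."""
    diffs = [yr - yb for (_, yb), (_, yr) in zip(base_line, ref_line)]
    return sum(diffs) / len(diffs)

def correct_vertical(ref_line, d):
    """Remove the vertical component so only shape deformation remains."""
    return [(x, y - d) for x, y in ref_line]

base = [(0, 100.0), (1, 101.0), (2, 100.0)]
ref = [(0, 110.0), (1, 111.0), (2, 110.0)]
d = vertical_displacement(base, ref)
print(d)                         # 10.0
print(correct_vertical(ref, d))  # [(0, 100.0), (1, 101.0), (2, 100.0)]
```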
The second displacement amount calculation process is effective when the deformation of (ii) above is caused not only by vertical movement but also by translation and rotation. That is, it is performed on the premise that the displacement between diaphragm boundary lines LI includes vertical movement, translation, and rotation.
The third displacement amount calculation process is likewise effective when the deformation of (ii) above is caused not only by vertical movement but also by translation and rotation. That is, it is performed on the premise that the displacement between diaphragm boundary lines LI includes vertical movement, translation, and rotation.
The fourth displacement amount calculation process is likewise effective when the deformation of (ii) above is caused not only by vertical movement but also by translation and rotation. That is, it is performed on the premise that the displacement between diaphragm boundary lines LI includes vertical movement, translation, and rotation.
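For the case where translation and rotation are also present, the text does not prescribe one specific algorithm; as an illustrative stand-in, the following sketch estimates a rigid (rotation plus translation) transform between two corresponding 2-D boundary point sets by a classical least-squares fit (a Kabsch/Procrustes-style approach; all names here are hypothetical):

```python
# Illustrative sketch: estimate the rigid transform (rotation theta and
# translation (tx, ty)) that maps the base boundary points onto the reference
# boundary points, by centering both point sets and fitting the rotation from
# their 2x2 cross-covariance.
import math

def estimate_rigid(base_pts, ref_pts):
    """Return (theta, tx, ty) mapping base points onto reference points."""
    n = len(base_pts)
    bx = sum(p[0] for p in base_pts) / n
    by = sum(p[1] for p in base_pts) / n
    rx = sum(p[0] for p in ref_pts) / n
    ry = sum(p[1] for p in ref_pts) / n
    # Accumulate the 2x2 cross-covariance of the centered point sets.
    sxx = sxy = syx = syy = 0.0
    for (x1, y1), (x2, y2) in zip(base_pts, ref_pts):
        sxx += (x1 - bx) * (x2 - rx)
        sxy += (x1 - bx) * (y2 - ry)
        syx += (y1 - by) * (x2 - rx)
        syy += (y1 - by) * (y2 - ry)
    theta = math.atan2(sxy - syx, sxx + syy)
    tx = rx - (bx * math.cos(theta) - by * math.sin(theta))
    ty = ry - (bx * math.sin(theta) + by * math.cos(theta))
    return theta, tx, ty

# Reference line = base line translated by (5, -3), no rotation.
base = [(0.0, 0.0), (10.0, 1.0), (20.0, 0.0)]
ref = [(x + 5.0, y - 3.0) for x, y in base]
theta, tx, ty = estimate_rigid(base, ref)
print(round(theta, 6), round(tx, 6), round(ty, 6))  # 0.0 5.0 -3.0
```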
Next, the correction process performed after the displacement amount calculation process will be described.
The second correction process corrects the diaphragm boundary line LI using the displacement amount D1I calculated with the base boundary line BL as the diaphragm boundary line L1 and the reference boundary line RL as the diaphragm boundary line LI (where the index I is an integer of 2 or more).
The image generation unit 150 generates displacement-corrected boundary line information LG for display based on a second number of displacement-corrected boundary lines LIc, the second number being less than or equal to the first number (see FIG. 3). As the displacement-corrected boundary line information LG, a second number of separate images, one for each of the second number of displacement-corrected boundary lines LIc, may be generated, or a single still image in which the second number of displacement-corrected boundary lines LIc are superimposed may be generated.
FIG. 20 is a flowchart explaining the basic operation realized in the image processing apparatus 3 according to this embodiment. Since the individual functions of each unit have already been described (see FIG. 3), only the overall flow is described here.
The image processing apparatus according to the second embodiment of the present invention differs from the image processing apparatus 3 of the first embodiment in the displacement correction unit, as described below. The remaining configuration is the same as that of the image processing apparatus 3.
The correction process of the displacement correction unit in the second embodiment is hereinafter referred to as the third correction process. The displacement amount D used in the third correction process is the displacement from the diaphragm boundary line LI temporally closest to the diaphragm boundary line LI to be corrected.
The image processing apparatus according to the third embodiment of the present invention differs from the image processing apparatus 3 of the first embodiment in the displacement correction unit, as described below. The remaining configuration is the same as that of the image processing apparatus 3.
The correction process of the displacement correction unit in the third embodiment is hereinafter referred to as the fourth correction process. When at least one diaphragm boundary line LI exists between the base boundary line BL and the diaphragm boundary line LI to be corrected, the displacement amount D used in the fourth correction process is the displacement amount D between the base boundary line BL and the diaphragm boundary line LI to be corrected, obtained as the sum of the displacement amounts D between temporally adjacent boundary lines LI.
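The chaining of displacements in the fourth correction process reduces to a simple accumulation; a minimal sketch with hypothetical per-pair displacement values:

```python
# Sketch of the fourth correction process described above: when intermediate
# boundary lines lie between the base boundary line BL and the line to be
# corrected, the total displacement D is obtained as the sum of the
# displacements between temporally adjacent boundary lines.

def total_displacement(adjacent_displacements):
    """Sum D(1->2), D(2->3), ... to get D(base -> target)."""
    return sum(adjacent_displacements)

# D12 = 4.0, D23 = 3.5: total displacement from L1 (base) to L3 (target).
print(total_displacement([4.0, 3.5]))  # 7.5
```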
The basic operation of the image processing apparatus according to the third embodiment differs in steps S2 to S4 of FIG. 20. That is, in step S2 of the first embodiment, the frame selection unit 120 performed the frame selection process for the base frame image BF and the reference frame image RF; step S2 of the third embodiment differs in that, in addition, frame images existing between the imaging timing of the base frame image BF and that of the reference frame image RF (hereinafter referred to as "intermediate frame images") are also selected. That is, in the example of calculating the displacement amount D13 in FIG. 22, the frame image T1 is selected as the base frame image BF, the frame image T3 is selected as the reference frame image RF, and the frame image T2, which exists between their imaging timings, is also selected as an intermediate frame image.
FIG. 23 is a diagram showing the functional configuration of the control unit 31A used in the image processing apparatus 3A configured as the fourth embodiment of the present invention. The control unit 31A is used as an alternative to the control unit 31 (see FIG. 3) in the image processing apparatus 3 of the first embodiment. The difference from the first embodiment is that the control unit 31A further includes a cycle classification unit 115, and the frame selection unit is accordingly changed to a frame selection unit 120A. The remaining configuration is the same as that of the image processing apparatus 3.
The cycle classification unit 115 detects the so-called respiratory cycle (target region cycle), which is the periodic change of the diaphragm in the subject M (body) synchronized with the imaging times at which the plurality of frame images MI acquired by the dynamic image acquisition unit 110 were captured, and classifies the plurality of frame images MI into respiratory cycle units (target region cycle units). The cycle classification unit 115 then outputs the plurality of frame images MI' after classification into respiratory cycle units to the frame selection unit 120 (see FIG. 23).
When the physical state value, defined as a value indicating the time-varying physical state of the diaphragm region, is called the respiratory vibration value, the respiratory information acquisition process is a process of calculating the respiratory vibration value based on the plurality of frame images MI constituting the dynamic image acquired by the dynamic image acquisition unit 110 and using the respiratory vibration value as respiratory information (see FIG. 23).
Next, the change in the respiratory vibration values detected in the respiratory information acquisition process is taken as the respiratory phase PH, and the respiratory cycle PC, the inspiration phase PH1, and the expiration phase PH2 are detected. Specifically, the inspiration phase PH1 and the expiration phase PH2 are detected by calculating the maximum value B1 and the minimum value B2 of the respiratory vibration value within the respiratory cycle PC (see FIG. 5).
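The phase labeling step can be sketched as follows; a simplified illustration that assumes the respiratory vibration value rises during inspiration and falls during expiration (whether a rising value corresponds to inspiration depends on how the vibration value is defined), with no smoothing of the signal:

```python
# Sketch of the phase detection in the cycle classification: each inter-frame
# transition of the respiratory vibration value is labeled inspiration while
# the value rises toward the maximum B1 and expiration while it falls toward
# the minimum B2. Function name and values are illustrative assumptions.

def classify_respiratory_phases(vibration_values):
    """Label each frame transition as 'inspiration' (rising) or 'expiration'."""
    phases = []
    for t in range(len(vibration_values) - 1):
        rising = vibration_values[t + 1] > vibration_values[t]
        phases.append("inspiration" if rising else "expiration")
    return phases

# One breath: values rise to the maximum B1, then fall to the minimum B2.
values = [0.2, 0.6, 1.0, 0.7, 0.3]
print(classify_respiratory_phases(values))
# ['inspiration', 'inspiration', 'expiration', 'expiration']
```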
Since the cycle classification unit 115 detects the respiratory cycle PC, the maximum value B1 and minimum value B2 of the respiratory vibration value, the inspiration phase PH1, and the expiration phase PH2, the frame selection unit 120A can perform the following processing.
Next, FIG. 25 is a diagram illustrating the operation flow of the image processing apparatus 3A according to the fourth embodiment. In FIG. 25, steps SA1 and SA4 to SA8 are the same as steps S1 and S3 to S7 of FIG. 20, and their description is therefore omitted.
When follow-up observation is performed using dynamic images of the subject M captured in the past, a plurality of X-ray dynamic images must be arranged side by side and compared for diagnosis, which makes comparison inefficient.
The dynamic image acquisition unit 110B includes, in addition to a current dynamic image acquisition unit 310 that corresponds to the dynamic image acquisition unit 110 described above and acquires the current dynamic image (a plurality of current frame images NMI), a past dynamic image acquisition unit 210 that acquires a past dynamic image (a plurality of past frame images PMI) of the same subject M as the current frame images NMI (see FIG. 26).
The cycle classification unit 115B includes a past cycle classification unit 215 and a current cycle classification unit 315 (see FIG. 26). The past cycle classification unit 215 and the current cycle classification unit 315 included in the cycle classification unit 115B have the same functions as the cycle classification unit 115 described above.
Since the cycle classification unit 115B detects the past respiratory cycle PPC and the like and the current respiratory cycle NPC and the like, the frame selection unit 120B can perform the following processing.
The boundary line extraction unit 130B performs the same processing as the boundary line extraction unit 130 on the past frame image PTI and the current frame images NTI corresponding to the base frame image BF and the reference frame images RF selected by the frame selection unit 120B.
The displacement correction unit 140B performs the same processing as the displacement correction unit 140 on the past diaphragm boundary line PLI and the current diaphragm boundary lines NLI corresponding to the base boundary line BL and the reference boundary lines RL extracted by the boundary line extraction unit 130B.
The image generation unit 150B performs the same processing as the image generation unit 150 on each of the past displacement-corrected boundary lines PLIc and the current displacement-corrected boundary lines NLIc obtained by the displacement correction unit 140B.
Next, FIG. 28 is a diagram illustrating the operation flow of the image processing apparatus 3B according to the fifth embodiment. The fifth embodiment includes the past dynamic image acquisition unit 210, the current dynamic image acquisition unit 310, the current cycle classification unit 315, and the past cycle classification unit 215, which did not exist in the first embodiment, and the frame selection unit 120 is replaced by the frame selection unit 120B; the steps are accordingly changed as follows.
In the fifth embodiment, the current displacement-corrected boundary line information NLG was obtained in order to display the difference in shape change between the past and the present of the diaphragm boundary line LI of the subject M; however, for example, for the purpose of displaying the difference in shape change between the right side and the left side of the diaphragm boundary line LI within the same frame image, right-side displacement-corrected boundary line information and left-side displacement-corrected boundary line information may be obtained (not shown).
In a second modification, for the purpose of displaying the difference in shape change between the inspiration phase PH1 and the expiration phase PH2 of the diaphragm boundary line LI, inspiration displacement-corrected boundary line information and expiration displacement-corrected boundary line information may be obtained (not shown).
Just as it is difficult to grasp the motion of the diaphragm region directly from the diaphragm boundary line LI, as described above, it is likewise difficult to grasp the motion of the heart directly from the cardiac boundary line. That is, motion other than the actual motion of the heart is involved, making it difficult to diagnose the motion of the heart accurately.
The boundary line extraction unit 130C performs, on the first number of frame images TI, boundary line extraction processing that extracts the boundary line of the heart region to obtain a first number of heart boundary lines (target-region boundary lines).
The cycle classification unit 115C detects the so-called cardiac cycle (target-region cycle), i.e., the periodic change of the heart region of the subject M (body) synchronized with the imaging times at which the plurality of frame images MI acquired by the dynamic image acquisition unit 110 were captured, and classifies the plurality of frame images MI into units of that cardiac cycle (target-region cycle units). The cycle classification unit 115C then outputs the plurality of frame images MI' classified by cardiac cycle to the frame selection unit.
The cardiac cycle acquisition process acquires the cardiac cycle by computing the amount of motion of the heart wall (corresponding to the heart boundary line HLI) from the captured images acquired by the dynamic image acquisition unit 110. In detail, the fluctuation of the heart wall is detected from the dynamic image, whereby the phase of the heartbeat at the time each frame image was captured is detected; the cardiac cycle is then determined from that heartbeat phase.
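Given a per-frame heart-wall motion signal, the cycle length can be estimated, for example, from the signal's dominant frequency. The sketch below is one such assumption-based estimator (the function name and FFT approach are illustrative; the patent does not prescribe this method):

```python
import numpy as np

def estimate_cardiac_cycle(wall_motion, fps):
    """Estimate the cardiac cycle from a 1-D heart-wall motion signal
    sampled once per frame, using its dominant frequency.

    Returns (cycle length in frames, heart rate in beats per minute).
    """
    m = np.asarray(wall_motion, dtype=float)
    m = m - m.mean()                              # drop the DC component
    spectrum = np.abs(np.fft.rfft(m))
    freqs = np.fft.rfftfreq(m.size, d=1.0 / fps)  # bin frequencies in Hz
    k = 1 + int(np.argmax(spectrum[1:]))          # skip the zero bin
    f = freqs[k]                                  # beat frequency in Hz
    return fps / f, 60.0 * f
```

For a 15 fps acquisition of a 75 bpm heartbeat this yields a cycle of 12 frames, which the cycle classification unit could then use to group the frame images MI.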
Although embodiments of the present invention have been described above, the present invention is not limited to these embodiments, and various modifications are possible.
2 imaging control device
3, 3A, 3B image processing apparatus
31, 31A, 31B control unit
32 storage unit
34 display unit
100 radiographic dynamic image capturing system
110, 110B dynamic image acquisition unit
115, 115B cycle classification unit
120, 120A, 120B frame selection unit
130, 130B boundary line extraction unit
140, 140B displacement correction unit
150, 150B image generation unit
210 past dynamic image acquisition unit
215 past cycle classification unit
310 current dynamic image acquisition unit
315 current cycle classification unit
M subject (examinee)
MI frame image
TI first number of frame images
BF base frame image
RF reference frame image
BL base boundary line
RL reference boundary line
LI diaphragm boundary line
HLI heart boundary line
LIc displacement-corrected boundary line
LG displacement-corrected boundary line information
PC respiratory cycle
PH respiratory phase
PH1 inspiratory phase
PH2 expiratory phase
EM maximum expiratory phase
IM maximum inspiratory phase
Claims (12)
- An image processing apparatus comprising:
a dynamic image acquisition means for acquiring a dynamic image composed of a plurality of frame images obtained by sequentially imaging, in a time direction, a state in which a physical state of a target region inside a human or animal body changes over time;
a boundary line extraction means for performing, on a plurality of frame images among the plurality of frame images, boundary line extraction processing of extracting a boundary line of the target region to obtain a plurality of target-region boundary lines;
a displacement correction means for performing displacement amount calculation processing of calculating, using pixels corresponding to the plurality of target-region boundary lines, for one or more target-region boundary lines other than a base boundary line among the plurality of target-region boundary lines, a displacement amount with the base boundary line as a displacement reference, the displacement amount being a component to be removed, and for performing, after the displacement amount calculation processing, correction processing of correcting a predetermined number of the target-region boundary lines other than the base boundary line using the displacement amount, thereby obtaining a predetermined number of displacement-corrected boundary lines from which the component to be removed has been removed; and
a display means for displaying, based on the predetermined number of displacement-corrected boundary lines, displacement-corrected boundary line information for display. - The image processing apparatus according to claim 1, wherein
the component to be removed includes at least one of deformation components due to vertical motion, translation, and rotation of the target region. - The image processing apparatus according to claim 1 or claim 2, further comprising:
a frame selection means for performing, on selection-target frame images including at least the plurality of frame images,
frame selection processing including processing of selecting a base frame image for extracting the base boundary line and a reference frame image for extracting the target-region boundary lines other than the base boundary line,
wherein the displacement amount calculation processing includes
processing of calculating, with the target-region boundary line of the base frame image as the base boundary line, displacement amounts between corresponding pixels of that base boundary line and the target-region boundary line of the reference frame image. - The image processing apparatus according to claim 3, wherein
the selection-target frame images include a frame image captured temporally earlier than the plurality of frame images, and
the frame selection processing includes
processing of selecting, as the base frame image, a frame image of the same body captured temporally earlier than the plurality of frame images. - The image processing apparatus according to claim 3, further comprising:
a cycle classification means for detecting a target-region cycle, which is a periodic change of the target region in the body synchronized with the imaging times at which the plurality of frame images were captured, and classifying the plurality of frame images into units of the target-region cycle,
wherein the base frame image and the reference frame image are frame images lying within the same target-region cycle,
a value indicating how the physical state of the target region changes over time is defined as a physical state value, and
the frame selection processing includes:
first selection processing of selecting, as the base frame image, any one of
(b1) a frame image at which the physical state value corresponds to a preset first set value,
(b2) a frame image at which the physical state value corresponds to its maximum value, and
(b3) a frame image at which the physical state value corresponds to its minimum value; and
second selection processing of selecting, as the reference frame image, any one of
(c1) a frame image at which the physical state value corresponds to a preset second set value,
(c2) a frame image temporally close to the base frame image,
(c3) when the base frame image is the frame image of (b2), a frame image at which the physical state value corresponds to its minimum value, and
(c4) when the base frame image is the frame image of (b3), a frame image at which the physical state value corresponds to its maximum value. - The image processing apparatus according to claim 3, wherein
the displacement amount used in the correction processing is a displacement amount between corresponding pixels of the base boundary line and one of the target-region boundary lines, and the target-region boundary lines other than that one are corrected using the displacement amount. - The image processing apparatus according to claim 3, wherein
the displacement amount used in the correction processing is a displacement amount from the target-region boundary line temporally closest to the target-region boundary line to be corrected. - The image processing apparatus according to claim 3, wherein
the displacement amount used in the correction processing is the displacement amount between the base boundary line and the target-region boundary line to be corrected, obtained as the sum of displacement amounts between temporally adjacent boundary lines. - The image processing apparatus according to any one of claims 1 to 8, further comprising:
an image generation means for generating a predetermined number of separated images, one per displacement-corrected boundary line,
wherein the display means
sequentially displays, as the displacement-corrected boundary line information, the predetermined number of separated images. - The image processing apparatus according to any one of claims 1 to 9, further comprising:
an image generation means for generating a single still image in which the plurality of displacement-corrected boundary lines are displayed superimposed,
wherein the display means
displays the still image as the displacement-corrected boundary line information. - The image processing apparatus according to any one of claims 1 to 10, wherein
the target region includes at least one of a diaphragm region and a heart region. - A program which, when executed by a computer included in an image processing apparatus, causes the computer to function as the image processing apparatus according to any one of claims 1 to 11.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015516996A JP6350522B2 (ja) | 2013-05-16 | 2014-04-09 | 画像処理装置及びプログラム |
US14/891,110 US9665935B2 (en) | 2013-05-16 | 2014-04-09 | Image processing device and program |
CN201480026373.4A CN105188541A (zh) | 2013-05-16 | 2014-04-09 | 图像处理装置和程序 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-104336 | 2013-05-16 | ||
JP2013104336 | 2013-05-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014185197A1 true WO2014185197A1 (ja) | 2014-11-20 |
Family
ID=51898179
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/060287 WO2014185197A1 (ja) | 2013-05-16 | 2014-04-09 | 画像処理装置及びプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US9665935B2 (ja) |
JP (1) | JP6350522B2 (ja) |
CN (1) | CN105188541A (ja) |
WO (1) | WO2014185197A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017169830A (ja) * | 2016-03-24 | 2017-09-28 | コニカミノルタ株式会社 | 動態解析装置 |
JP2018007949A (ja) * | 2016-07-15 | 2018-01-18 | コニカミノルタ株式会社 | 動態解析装置 |
JP2018532515A (ja) * | 2015-11-09 | 2018-11-08 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | X線画像吸気品質モニタリング |
JP2019063328A (ja) * | 2017-10-03 | 2019-04-25 | コニカミノルタ株式会社 | 動態画像処理装置 |
JP2020044445A (ja) * | 2019-12-27 | 2020-03-26 | コニカミノルタ株式会社 | 動態解析システム、プログラム及び動態解析装置 |
JP2020089612A (ja) * | 2018-12-07 | 2020-06-11 | コニカミノルタ株式会社 | 画像表示装置、画像表示方法及び画像表示プログラム |
US11151715B2 (en) | 2017-05-25 | 2021-10-19 | Konica Minolta, Inc. | Dynamic analysis system |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6643827B2 (ja) * | 2015-07-31 | 2020-02-12 | キヤノン株式会社 | 画像処理装置、画像処理方法、及びプログラム |
KR102439769B1 (ko) * | 2016-01-18 | 2022-09-05 | 삼성메디슨 주식회사 | 의료 영상 장치 및 그 동작방법 |
JP6701880B2 (ja) * | 2016-03-30 | 2020-05-27 | コニカミノルタ株式会社 | 動態解析装置、動態解析システム、動態解析方法及びプログラム |
JP6929689B2 (ja) * | 2016-04-26 | 2021-09-01 | キヤノンメディカルシステムズ株式会社 | 医用画像処理装置及び医用画像診断装置 |
AU2017299238C1 (en) * | 2016-07-19 | 2021-03-18 | Radwisp Pte. Ltd. | Diagnostic assistance program |
JP6815818B2 (ja) * | 2016-10-17 | 2021-01-20 | キヤノン株式会社 | 放射線撮影システム及び放射線撮影方法 |
JP6805918B2 (ja) * | 2017-03-23 | 2020-12-23 | コニカミノルタ株式会社 | 放射線画像処理装置及び放射線画像撮影システム |
JP6950483B2 (ja) * | 2017-11-20 | 2021-10-13 | コニカミノルタ株式会社 | 動態撮影システム |
CN109948396B (zh) * | 2017-12-20 | 2021-07-23 | 深圳市理邦精密仪器股份有限公司 | 一种心拍分类方法、心拍分类装置及电子设备 |
US11199506B2 (en) * | 2018-02-21 | 2021-12-14 | Applied Materials Israel Ltd. | Generating a training set usable for examination of a semiconductor specimen |
JP7183563B2 (ja) | 2018-04-11 | 2022-12-06 | コニカミノルタ株式会社 | 放射線画像表示装置及び放射線撮影システム |
JP7317639B2 (ja) * | 2019-09-05 | 2023-07-31 | 富士フイルムヘルスケア株式会社 | 放射線画像処理システム及び画像処理方法 |
JP2022109778A (ja) * | 2021-01-15 | 2022-07-28 | キヤノン株式会社 | 情報処理装置、情報処理方法、及びプログラム |
JP2022143288A (ja) * | 2021-03-17 | 2022-10-03 | コニカミノルタ株式会社 | 動態画像解析装置及びプログラム |
CN113561897B (zh) * | 2021-07-15 | 2022-08-12 | 河北三国新能源科技有限公司 | 一种基于全景环视的驾考车坡道停车位置判断方法和系统 |
CN116600194B (zh) * | 2023-05-05 | 2024-07-23 | 长沙妙趣新媒体技术有限公司 | 一种用于多镜头的切换控制方法及系统 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009172190A (ja) * | 2008-01-25 | 2009-08-06 | Konica Minolta Holdings Inc | 画像生成装置、プログラム、画像生成方法 |
JP2010069099A (ja) * | 2008-09-19 | 2010-04-02 | Toshiba Corp | 画像処理装置及びx線コンピュータ断層撮影装置 |
JP2010245707A (ja) * | 2009-04-02 | 2010-10-28 | Canon Inc | 画像解析装置、画像処理装置及び画像解析方法 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3574301B2 (ja) | 1996-08-26 | 2004-10-06 | 株式会社山武 | パターン照合装置 |
JP4404291B2 (ja) | 2003-04-08 | 2010-01-27 | キヤノン株式会社 | 画像処理装置及び方法及びシステム |
EP1661519B1 (en) * | 2003-09-01 | 2012-05-09 | Panasonic Corporation | Biological signal monitor device |
US7421101B2 (en) * | 2003-10-02 | 2008-09-02 | Siemens Medical Solutions Usa, Inc. | System and method for local deformable motion analysis |
US8111947B2 (en) * | 2004-06-08 | 2012-02-07 | Canon Kabushiki Kaisha | Image processing apparatus and method which match two images based on a shift vector |
JP4797173B2 (ja) | 2005-06-21 | 2011-10-19 | 国立大学法人金沢大学 | X線診断支援装置、プログラム及び記録媒体 |
JP5414157B2 (ja) * | 2007-06-06 | 2014-02-12 | 株式会社東芝 | 超音波診断装置、超音波画像処理装置、及び超音波画像処理プログラム |
JP2010051729A (ja) * | 2008-08-29 | 2010-03-11 | Toshiba Corp | 超音波診断装置、超音波画像処理装置及び超音波画像処理プログラム |
US20110194748A1 (en) * | 2008-10-14 | 2011-08-11 | Akiko Tonomura | Ultrasonic diagnostic apparatus and ultrasonic image display method |
US20130333756A1 (en) | 2012-06-19 | 2013-12-19 | Richard A. DeLucca | Backsheet for a photovoltaic cell module and photovoltaic cell module including same |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014185197A1 (ja) | 2017-02-23 |
US20160098836A1 (en) | 2016-04-07 |
CN105188541A (zh) | 2015-12-23 |
US9665935B2 (en) | 2017-05-30 |
JP6350522B2 (ja) | 2018-07-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6350522B2 (ja) | 画像処理装置及びプログラム | |
JP6512338B2 (ja) | 画像処理装置及びプログラム | |
JP5408400B1 (ja) | 画像生成装置及びプログラム | |
US9117287B2 (en) | Image analysis apparatus, method, and program | |
US20210233243A1 (en) | Diagnostic support program | |
JP7462898B2 (ja) | 診断支援プログラム | |
JP6805918B2 (ja) | 放射線画像処理装置及び放射線画像撮影システム | |
JP6656910B2 (ja) | 医用画像処理装置、医用画像診断装置及び医用画像処理プログラム | |
JP6418091B2 (ja) | 胸部画像表示システム及び画像処理装置 | |
JP6221770B2 (ja) | 画像処理装置、およびプログラム | |
JP6253085B2 (ja) | X線動画像解析装置、x線動画像解析プログラム及びx線動画像撮像装置 | |
JP2014079312A (ja) | 画像処理装置及びプログラム | |
JP5510619B1 (ja) | 画像処理装置 | |
JP2016073466A (ja) | 画像処理装置及びプログラム | |
EP2106603A2 (en) | Temporal registration of medical data | |
JP6273940B2 (ja) | 画像解析装置、画像撮影システム及び画像解析プログラム | |
US11836923B2 (en) | Image processing apparatus, image processing method, and storage medium | |
JP2019180899A (ja) | 医用画像処理装置 | |
JP2019111006A (ja) | 動態画像処理方法及び動態画像処理装置 | |
JP5051025B2 (ja) | 画像生成装置、プログラム、および画像生成方法 | |
Panayiotou et al. | Image-based view-angle independent cardiorespiratory motion gating and coronary sinus catheter tracking for x-ray-guided cardiac electrophysiology procedures | |
JP2020168173A (ja) | 動態解析装置、動態解析システム及びプログラム | |
JP2012095791A (ja) | 医用画像生成装置、医用画像生成方法及び医用画像生成プログラム | |
JP2015112150A (ja) | 画像処理装置、およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 201480026373.4; Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14797490; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2015516996; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 14891110; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 14797490; Country of ref document: EP; Kind code of ref document: A1 |
Ref document number: 14797490 Country of ref document: EP Kind code of ref document: A1 |