WO2011043458A1 - Medical image processing device, x-ray image capturing device, medical image processing program, and medical image processing method - Google Patents
- Publication number
- WO2011043458A1 (PCT/JP2010/067729)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- original
- comparison
- images
- original images
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
- H04N1/3876—Recombination of partial images to recreate the original image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5235—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
- A61B6/5241—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT combining overlapping images of the same imaging modality, e.g. by stitching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/14—Transformations for image registration, e.g. adjusting or mapping for alignment of images
Definitions
- The present invention relates to a medical image processing apparatus, an X-ray imaging apparatus, a medical image processing program, and a medical image processing method, and in particular to a medical image processing apparatus, an X-ray imaging apparatus, a medical image processing program, and a medical image processing method that create a long (stitched) image from two or more images.
- Long imaging refers to an imaging technique in which a continuous region of the subject that exceeds the detection area of a single X-ray detector is imaged multiple times, with multiple X-ray detectors or with one X-ray detector, and the resulting images are joined together.
- Conventional radiographic imaging apparatuses either require the user to align the images manually, or, as disclosed in Patent Document 1, automatically derive the joining position from the positional relationship of the X-ray detector and create a joined long image.
- In Patent Document 1, however, the mechanical accuracy of the mechanism that moves the X-ray detector introduces errors into the joining position, and the joining position cannot be derived at all unless the positional relationship of the X-ray detector is known.
- The present invention has been made in view of the above problems, and its object is to provide a medical image processing apparatus, a program, and an X-ray imaging apparatus that can align images without depending on the mechanical accuracy with which the X-ray detector is moved, even when the positional relationship of the X-ray detector is unknown.
- To achieve this, a medical image processing apparatus includes: image acquisition means that acquires a plurality of original images captured so that the same part of a subject is imaged with overlap; alignment means that aligns the plurality of original images by overlapping the regions in which the same part is imaged; and image creation means that creates a long image using the plurality of aligned original images. The alignment means includes image cutout means that cuts out a reference image from the region of one original image in which the same part is imaged and cuts out a plurality of comparison images from the other original image, and relative position derivation means that obtains the difference between the reference image and each comparison image and derives the relative position, that is, the position at which the comparison image with the smallest difference was cut out of the other original image. The alignment means aligns the plurality of original images based on this relative position.
- an X-ray imaging apparatus includes the above-described medical image processing apparatus.
- The medical image processing program causes a computer to execute: a step of acquiring a plurality of original images in which the same part of a subject is imaged with overlap; a step of cutting out a reference image from the region of one original image in which the same part is imaged and cutting out a plurality of comparison images from the other original image; a step of obtaining the difference between the reference image and each comparison image and deriving the relative position, that is, the position at which the comparison image with the smallest difference was cut out of the other original image; a step of aligning the plurality of original images based on the relative position; and a step of creating a long image using the plurality of aligned original images.
- The medical image processing method includes: a step of acquiring a plurality of original images in which the same part of a subject is imaged with overlap; a step of cutting out a reference image from the region of one original image in which the same part is imaged and cutting out a plurality of comparison images from the other original image; a step of obtaining the difference between the reference image and each comparison image and deriving the relative position, that is, the position at which the comparison image with the smallest difference was cut out of the other original image; a step of aligning the plurality of original images based on the relative position; and a step of creating a long image using the plurality of aligned original images.
- With this configuration, image alignment is performed without depending on the mechanical accuracy of the X-ray detector, and a long image can be created.
- FIG. 1 is a schematic diagram showing a configuration example of the medical image processing apparatus 1 according to the first embodiment.
- FIG. 2 is a block diagram showing the program stored in the medical image processing apparatus 1.
- FIG. 3 is an explanatory diagram explaining the outline of the alignment.
- FIG. 4 is a flowchart showing the flow of processing of the medical image processing apparatus according to the first embodiment.
- FIG. 5 is an explanatory diagram explaining the thinning number and the number of pixel-to-pixel comparisons.
- FIG. 6 is a flowchart showing the flow of the relative position derivation processing.
- FIG. 7 is a flowchart showing the flow of the normalization processing.
- FIG. 8 is an explanatory diagram showing the contents of the normalization processing.
- FIG. 9 is a schematic diagram showing a configuration example of the X-ray imaging apparatus according to the second embodiment.
- Embodiments of the present invention include a configuration in which the medical image processing apparatus 1 is built into the medical image capturing apparatus 2 and a configuration in which the medical image capturing apparatus 2 and the medical image processing apparatus 1 are independent; the first embodiment describes the independent configuration. FIG. 1 is a schematic diagram showing a configuration example of the medical image processing apparatus 1 according to the first embodiment.
- FIG. 2 is a block diagram showing a program stored in the medical image processing apparatus 1.
- The medical image processing apparatus 1 in FIG. 1 performs image processing on stored images, and is connected by a network such as the LAN 3 to the medical image capturing apparatus 2 and to the image database 4 that manages the stored images.
- The medical image processing apparatus 1 includes: a central processing unit (CPU) 11 as a control device that controls the operation of each component; a main memory 12 that stores the control program of the apparatus and serves as a work area when the program is executed; a magnetic disk 13 that stores applications such as the operating system and the medical image processing program as well as various data; a display memory 14 that temporarily stores display data; a display device (monitor) 15 that displays images based on the data from the display memory 14; an input device 16 such as a mouse or keyboard serving as a position input device; a LAN port 17 for connecting to the network; and a bus 18 that connects these devices.
- the medical image capturing apparatus 2 may be any apparatus that can capture a medical image of a subject, and is configured by, for example, an X-ray imaging apparatus or an MRI apparatus.
- the medical image processing apparatus 1 stores a medical image processing program shown in FIG.
- This medical image processing program includes: an image acquisition unit 20 that acquires a plurality of medical images to be aligned (hereinafter referred to as "original images"); an alignment unit 30 that aligns the original images; and an image joining unit 40 that joins the aligned original images to create a long image.
- The alignment unit 30 includes: an image cutout unit 31 that cuts out a reference image and a plurality of comparison images (described later) from the original images; an image reduction unit 32 that reduces the reference image and the comparison images; a relative position derivation unit 33 that obtains the difference between the reference image and each comparison image and derives the position (hereinafter also referred to as the "relative position") at which the comparison image with the smallest difference was cut out of the original image; and a position range derivation unit 34 that derives the alignment range of the original images from the relative position and the image reduction magnification.
- The relative position derivation unit 33 includes: a normalization unit 331 that normalizes the reference image and the comparison images; a difference derivation unit 332 that derives the difference between the pixel values of the reference image and the pixel values of each comparison image; and a minimum difference derivation unit 333 that determines the comparison image with the smallest difference.
- This medical image processing program is stored on the magnetic disk 13, and its functions are realized by loading it into the main memory 12 and executing it on the central processing unit (CPU) 11.
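- For orientation only, the unit decomposition of FIG. 2 can be mirrored as a small skeleton; this is an illustrative sketch in Python, not part of the patent, and every name in it is invented:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class AlignmentResult:
    offset_y: int   # relative position of original image B with respect to A (rows)
    offset_x: int   # relative position of original image B with respect to A (columns)

class LongImagePipeline:
    """Skeleton mirroring the unit decomposition of FIG. 2: acquire, align, join."""
    def acquire(self, source) -> list[np.ndarray]:
        raise NotImplementedError   # image acquisition unit 20
    def align(self, image_a: np.ndarray, image_b: np.ndarray) -> AlignmentResult:
        raise NotImplementedError   # alignment unit 30 (cutout, reduction, relative position, range)
    def join(self, image_a: np.ndarray, image_b: np.ndarray, result: AlignmentResult) -> np.ndarray:
        raise NotImplementedError   # image joining unit 40
```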
- FIG. 3 is an explanatory diagram for explaining an outline of alignment of the medical image processing apparatus 1 according to the first embodiment.
- FIG. 4 is a flowchart showing a process flow of the medical image processing apparatus according to the first embodiment.
- FIG. 5 is an explanatory diagram for explaining the thinning-out number and the number of comparisons between pixels.
- FIG. 6 is a flowchart showing the flow of the relative position derivation process.
- FIG. 7 is a flowchart showing the flow of normalization processing.
- FIG. 8 is an explanatory diagram showing the contents of the normalization process.
- In the following, an X-ray imaging apparatus is used as the medical image capturing apparatus 2, and its X-ray detector is shifted to different positions so that successive exposures of the subject overlap.
- The process of creating a long image by joining the resulting original images, that is, the original image A and the original image B in FIG. 3, is described as an example.
- the original image A and the original image B are images taken with overlapping portions along the body axis direction of the subject. Therefore, the lower area (foot side) of the original image A and the upper area (head side) of the original image B each include an area where the same part of the subject is imaged.
- a region where the same part of the subject is imaged is a region corresponding to the above-described overlapping portion (hereinafter referred to as “overlapping region”). Then, when the original image A and the original image B are aligned and overlapped so that the overlapping areas overlap, a long image in which the original image A and the original image B are continuous is created.
- To align the original image A with the original image B, a reference image 50 is taken as a partial area within the overlapping region of the original image A, and a search range 51, which is an area larger than the reference image 50, is defined within the overlapping region of the original image B. The position within the search range 51 at which the reference image 50 is located is then searched for, and the original image A and the original image B are aligned based on the search result.
- The reference image 50 includes only a portion of the region of the original image A that is the overlapping region. Therefore, when the overlapping region is an area extending several centimeters from the lower end of the original image A, the y-direction extent of the reference image 50 is set within a range that does not exceed those several centimeters from the lower end of the original image A.
- the overlapping area is often provided in a range of about 30 to 40 mm.
- In the present embodiment, the reference image 50 is configured as a rectangular region whose y-direction width extends 70 pixels from the lower end of the original image A and whose x-direction width lies between positions several centimeters inward from the left and right ends.
- The shape of the reference image 50 is not limited to a rectangle and may be an arbitrary shape.
- The y-direction width of the search range 51 starts from the upper end of the original image B, does not exceed the above-mentioned several centimeters, and is wider than the y-direction width of the reference image 50.
- the x-direction width of the search range 51 preferably includes the x-direction width of the reference image 50. Therefore, in the present embodiment, the width of the search range 51 in the x direction is the entire area from the left end to the right end of the original image B, that is, the entire width area of the original image B in the x direction.
- A plurality of comparison images 52a, 52b, 52c, and 52d are created by cutting out regions having the same shape as the reference image 50 from the search range 51 while shifting the region by a predetermined number of pixels in the x and y directions.
- Each of the comparison images 52a, 52b, 52c, and 52d is recorded in association with the coordinates, referenced to the origin (0, 0) of the original image B, of the region that was cut out to create it. For example, for the comparison image 52a, the coordinates (x521, y521), (x522, y522), (x523, y523), (x524, y524) of the four vertices of the rectangular region cut out to create the comparison image 52a are recorded.
- In the present embodiment, the y direction is fixed and the plurality of comparison images are created while shifting in the x direction; alternatively, the x direction may be fixed and the plurality of comparison images created while shifting in the y direction.
- the comparison image may be cut out while shifting the region having the same shape as the reference image 50 along one arbitrary direction, not limited to the x direction and the y direction.
- In the present embodiment, the reference image 50 is cut out from the original image A, the search range 51 is set in the original image B, and the comparison images 52a, 52b, 52c, and 52d are cut out from that search range. Conversely, the reference image 50 may be cut out from the original image B, and the comparison images 52a, 52b, 52c, and 52d may be cut out by setting the search range 51 in the original image A.
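- The cutting-out described above amounts to taking a fixed strip of original image A as the reference and enumerating same-shaped windows in a strip of original image B. A minimal sketch, assuming 2-D NumPy arrays, a 70-pixel overlap strip, and illustrative function names (none of these names come from the patent text itself):

```python
import numpy as np

def cut_reference(original_a: np.ndarray, height: int = 70) -> np.ndarray:
    """Cut the reference patch from the bottom (foot-side) strip of original image A."""
    return original_a[-height:, :]

def candidate_windows(original_b: np.ndarray, ref_shape: tuple[int, int],
                      search_height: int, step: int = 1):
    """Yield (y, x, window) for every region of the reference shape inside the
    top (head-side) search strip of original image B, shifted by `step` pixels."""
    rh, rw = ref_shape
    search = original_b[:search_height, :]
    for y in range(0, search.shape[0] - rh + 1, step):
        for x in range(0, search.shape[1] - rw + 1, step):
            yield y, x, search[y:y + rh, x:x + rw]
```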
- Step S1 The image acquisition unit 20 performs the image acquisition process, that is, acquires two original images A and B that partially overlap (S1).
- the image acquisition unit 20 may obtain two original images A and B from the medical image photographing apparatus 2, or may obtain two original images A and B from the image database 4.
- Step S2 the alignment unit 30 performs alignment processing (S2).
- This alignment process is a process name that collectively refers to processes from steps S3 to S7 described later.
- Step S3 The alignment unit 30 starts loop i, which repeats the processing from step S3 to step S7 (S3).
- When loop i is started for the first time, in order to search for a rough position, the image reduction processing of step S4 described below is performed using the smallest magnification used within loop i, creating small images, and the processing from step S5 onward is then performed.
- In the second and subsequent passes of loop i, the image reduction processing is performed using a larger magnification than in the previous pass, and the processing from step S4 onward is performed. When the image reduction magnification reaches unity, that is, when the processing from step S4 onward is performed using the reference image and the comparison images without image reduction, loop i ends.
- Step S4 the image reduction unit 32 performs image reduction processing to increase the processing speed (S4).
- the image may be reduced by a downsampling method or an average value method.
- In the present embodiment, in order to reduce the number of pixel-to-pixel comparisons between the reference image and the comparison images and thereby speed up the processing, an optimal thinning number is calculated by the following method, and the image reduction is performed using that thinning number.
- The image reduction unit 32 may reduce the original image A and the original image B and then cut out the reference image 50 and the comparison images 52a, 52b, 52c, and 52d from the reduced images, or it may cut out the reference image 50 and the comparison images 52a, 52b, 52c, and 52d from the original images A and B before reduction and then reduce them. That is, either method may be used as long as the reference image 50 and the comparison images 52a, 52b, 52c, and 52d used in the following processing have been reduced.
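- As a rough illustration of the two reduction options mentioned above (downsampling by thinning versus the average-value method), assuming 2-D NumPy arrays and an integer thinning factor n:

```python
import numpy as np

def reduce_by_thinning(image: np.ndarray, n: int) -> np.ndarray:
    """Downsampling method: keep every n-th pixel row and column."""
    return image[::n, ::n]

def reduce_by_averaging(image: np.ndarray, n: int) -> np.ndarray:
    """Average-value method: average n x n blocks (edges that do not fill a block are trimmed)."""
    h, w = (image.shape[0] // n) * n, (image.shape[1] // n) * n
    blocks = image[:h, :w].reshape(h // n, n, w // n, n)
    return blocks.mean(axis=(1, 3))
```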
- The number of pixel-to-pixel comparisons between the reference image 50 and the search range 51 within loop i is described with reference to FIG. 5.
- Let X be the number of pixels of the search range 51 in the horizontal direction (x direction) and Y the number of pixels in the vertical direction (y direction); let X' be the number of pixels of the reference image 50 in the horizontal direction and Y' the number in the vertical direction; and let A_X be the number of pixels of the one-sided (left) search area in the horizontal direction and A_Y the number of pixels of the one-sided (upper) search area in the vertical direction.
- When no thinning is performed, the number N of pixel-to-pixel comparisons is expressed by the following equation (1), where:
- N: number of pixel-to-pixel comparisons
- X': number of pixels arranged in the horizontal direction of the cut-out reference image
- Y': number of pixels arranged in the vertical direction of the cut-out reference image
- A_X: one-sided (left) search area in the horizontal direction
- A_Y: one-sided (upper) search area in the vertical direction
- When thinning is performed multiple times, the number of pixel-to-pixel comparisons is expressed by equation (2), where:
- n_0: greatest common divisor of X', Y', (2A_X + 1), and (2A_Y + 1)
- n_a: greatest common divisor of X', Y', and n_(a-1)
- Loop i is repeated until n_a = 1.
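- Equations (1) and (2) themselves appear only as images in the published application and are not reproduced in the text above. The thinning-number recursion is stated explicitly, and a plausible form of equation (1) can be guessed from the variable definitions; the reconstruction below is therefore an assumption, not the published formula:

```latex
% n_0, n_a and the loop-i stopping rule are stated in the text; the closed form
% of N in (1) is an assumed reconstruction, not the published formula.
\begin{aligned}
  N &\overset{?}{=} X'\,Y'\,(2A_X + 1)(2A_Y + 1) && \text{(1), no thinning} \\
  n_0 &= \gcd\!\left(X',\, Y',\, 2A_X + 1,\, 2A_Y + 1\right), \qquad
  n_a = \gcd\!\left(X',\, Y',\, n_{a-1}\right), \\
  &\text{and loop } i \text{ is repeated until } n_a = 1
  \text{ (equation (2) gives } N \text{ under this thinning).}
\end{aligned}
```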
- step S5 the relative position deriving unit 33 performs a relative position deriving process for deriving the relative position (S5).
- the relative position deriving process will be described in the order of steps in FIG.
- Step S51 The image cutout unit 31 cuts out the reference image 50 from the overlapping region of the original image A (S51). Normally, in long imaging, the overlapping region of the original images A and B is set to 30 mm to 40 mm. The position at which the reference image 50 is cut out may therefore be derived by obtaining from the X-ray imaging apparatus the geometric positional relationship of the X-ray detector when the original image A and the original image B were captured by shifting the detector (for example, the movement amount corresponding to the above-mentioned 30 mm to 40 mm). Alternatively, the user may display the original image A on the monitor 15, and the image cutout unit 31 may cut out, as the reference image 50, a designated region input on the original image A with the input device 16.
- Step S52 The image cutout unit 31 sets a search range 51 that lies within the overlapping region of the original image B and consists of an area that encompasses the reference image 50 (S52).
- the search range 51 is provided to find the comparison image 52a that most closely matches the reference image 50 at a higher speed. Therefore, when the comparison image 52 is cut out from the original image B without setting the search range 51, this step may be omitted.
- Step S53 The relative position deriving unit 33 starts the loop j (S53).
- Step S54 The image cutout unit 31 cuts out the comparison images 52a and 52b, each having the same size as the reference image 50, from the search range 51 (S54).
- Specifically, at a y-direction position to be processed within loop i, the image cutout unit 31 cuts out regions having the same shape as the reference image 50 while shifting them by a predetermined number of pixels in the x direction, thereby cutting out the plurality of comparison images 52a and 52b.
- Step S55 The normalization unit 331 performs normalization processing to normalize the difference between the imaging conditions of the comparison images 52a and 52b and the imaging conditions of the reference image 50 (S55). This normalization processing is described with reference to FIGS. 7 and 8.
- Step S551 The normalizing unit 331 performs logarithmic conversion processing on the cut-out reference image 50 and comparison images 52a and 52b (S551).
- Step S552 The normalizing unit 331 performs a histogram derivation process to derive a histogram from the logarithmically transformed image (S552).
- Next, the normalization unit 331 performs extraction processing on the reference image 50 and the comparison images 52a and 52b to extract the region in which the subject is imaged (hereinafter referred to as the "subject region").
- A known method may be used for the extraction of the subject region. In the present embodiment, the normalization unit 331 deletes from the reference image 50 and the comparison images 52a and 52b the region in which the X-ray diaphragm of the X-ray imaging apparatus is imaged, and also deletes the region in which X-rays are directly incident on the X-ray detector.
- The latter deletion processing can be performed by deriving the signal level produced when X-rays are directly incident on the X-ray detector from the sensitivity of the X-ray detector and the X-ray energy used at the time of imaging, and then applying threshold processing to the X-ray image using a threshold that can distinguish this region.
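- A minimal sketch of the direct-exposure deletion just described, assuming log-domain pixel values and a threshold level supplied from the detector sensitivity and tube settings (derived outside this sketch); the function names are illustrative:

```python
import numpy as np

def subject_region_mask(patch: np.ndarray, direct_exposure_level: float) -> np.ndarray:
    """True where the pixel is below the signal level expected for X-rays that hit
    the detector directly, i.e. an approximation of the subject region."""
    return patch < direct_exposure_level

def subject_pixels(patch: np.ndarray, direct_exposure_level: float) -> np.ndarray:
    """Pixel values inside the estimated subject region, e.g. for histogram derivation."""
    return patch[subject_region_mask(patch, direct_exposure_level)]
```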
- the normalization unit 331 derives a histogram indicating the distribution of pixel values in each subject region in the reference image 50 and all the extracted comparison images 52a and 52b.
- FIGS. 8(a) and 8(b) are the histograms of the original images A and B obtained in this step. Note that FIG. 8(b) shows only the histogram of the comparison image 52a for convenience of explanation; in practice, the same number of histograms as the number of comparison images cut out in step S54 is derived.
- the comparison image 52a will be described as an example, but the processing from S553 to S556 is performed on the other comparison images cut out in step S54 as well as the comparison image 52a.
- Steps S553, S554 The normalization unit 331 calculates an average value of pixel values of the reference image 50 and the comparison image 52a and a distribution width of the pixel values as parameters for use in the normalization process (S553, 554).
- Specifically, in step S553 the normalization unit 331 performs histogram average value derivation processing to obtain the average value from each derived histogram (S553), and in step S554 it performs histogram width derivation processing to derive the width of each histogram from its start position to its end position (S554).
- In FIG. 8(a), A_ave and A_wide indicate the average value and the width of the histogram of the reference image 50; in FIG. 8(b), B_ave and B_wide indicate the average value and the width of the histogram of the comparison image 52a. Note that the average value and distribution width of the pixel values of the reference image 50 need only be calculated once, at the beginning of loop j.
- Step S555 As shown in FIG. 8(c), in order to align the density reference of the images, the normalization unit 331 performs histogram average value difference processing, adding to every pixel of the comparison image 52a the difference between the average value of the reference image 50 and the average value of the comparison image 52a according to the following equation (3) (S555):
- P_B' = P_B + (P_Aave - P_Bave)   (3)
- where P_B is a pixel value of the comparison image, P_B' is the pixel value after the histogram average value difference processing, P_Aave is the average pixel value of the reference image, and P_Bave is the average pixel value of the comparison image.
- Step S556 As shown in FIG. 8(d), in order to align the spread of the pixel values, the normalization unit 331 performs histogram width division processing according to the following equation (4) (S556):
- P_B'' = P_B' × (P_Awide / P_Bwide)   (4)
- where P_B'' is the pixel value of the comparison image after the histogram width division processing, that is, the pixel value of the normalized comparison image, P_Awide is the histogram width of the reference image, and P_Bwide is the histogram width of the comparison image.
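- A minimal sketch of steps S551 to S556, assuming equations (3) and (4) as written above, taking the histogram width simply as max minus min of the log-transformed values, and omitting the subject-region masking for brevity; this is not the patented implementation itself:

```python
import numpy as np

def normalize_comparison(reference: np.ndarray, comparison: np.ndarray) -> np.ndarray:
    """Match a comparison patch to the reference patch in the log domain:
    shift by the difference of histogram means (eq. 3), then scale by the
    ratio of histogram widths (eq. 4), with "width" taken as max - min."""
    ref = np.log1p(reference.astype(np.float64))     # step S551: logarithmic conversion
    cmp_ = np.log1p(comparison.astype(np.float64))
    p_aave, p_bave = ref.mean(), cmp_.mean()         # steps S553/S554: means and widths
    p_awide = ref.max() - ref.min()
    p_bwide = cmp_.max() - cmp_.min()
    shifted = cmp_ + (p_aave - p_bave)               # eq. (3), step S555
    return shifted * (p_awide / p_bwide)             # eq. (4), step S556
```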
- the relative position deriving unit 33 starts the loop k (S56). In loop k, the same number of processes as the number of comparison images cut out in step S54 are repeated.
- Step S57 The difference derivation unit 332 performs inter-image difference derivation processing in order to calculate how close each normalized comparison image 52a, 52b is to the reference image (S57). Specifically, this processing can be realized by subtracting each of the comparison images 52a and 52b from the reference image 50 and deriving the total of the differences as a parameter; the smaller this total, the closer the reference image and the normalized comparison image are to each other.
- Step S58 The minimum difference derivation unit 333 performs minimum difference derivation processing to determine whether the difference parameter obtained by the difference derivation processing in step S57 is the smallest value so far in the iterations of loop k (S58).
- Through this step, the comparison image having the smallest difference from the reference image 50 among the plurality of comparison images 52a and 52b cut out in step S54 is determined.
- Step S59 When the relative position deriving unit 33 repeats the processes from Steps S56 to 59 for all of the plurality of comparison images 52a and 52b cut out in Step S54, the loop k ends (S59).
- Step S510 When the relative position deriving unit 33 repeats the processing from step S53 to step S510 for all the positions in the y direction of the search range 51, the loop j ends (S510).
- the process in the y direction is repeated with loop j and the process in the x direction is repeated with loop k, but the process in the x direction with loop j and the process in the y direction with loop k may be repeated.
- Step S511 Based on the results of step S58, the minimum difference derivation unit 333 determines the comparison image with the smallest difference from the reference image 50 within the search range 51, and obtains the coordinates, in the original image B, of the region from which that minimum-difference comparison image was cut out (S511).
- Specifically, at the last pass of step S58 in loop j, the main memory 12 holds, for each y-direction position, the comparison image showing the minimum difference among the comparison images cut out along the x direction at that y position. The minimum difference derivation unit 333 therefore selects from these the comparison image with the smallest difference; as a result, the comparison image with the smallest difference within the search range 51, for example 52a, is obtained. The coordinates (x521, y521), (x522, y522), (x523, y523), (x524, y524) of the four vertices of the rectangular region of the comparison image 52a, referenced to the origin (0, 0) of the original image B, are then derived as the coordinates of the relative position.
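- Steps S53 to S511 collapse, in effect, to an exhaustive template search over the search range. The sketch below uses the sum of absolute differences as the difference parameter (the text only says that the comparison images are subtracted from the reference and the total of the differences is used) and assumes the patches have already been normalized:

```python
import numpy as np

def find_best_match(reference: np.ndarray, original_b: np.ndarray,
                    search_height: int, step: int = 1) -> tuple[int, int]:
    """Slide a window the size of the reference over the head-side search strip
    of original image B and return the (y, x) offset with the smallest sum of
    absolute differences (the "minimum difference" position)."""
    rh, rw = reference.shape
    ref = reference.astype(np.float64)
    search = original_b[:search_height, :].astype(np.float64)
    best_sad, best_pos = np.inf, (0, 0)
    for y in range(0, search.shape[0] - rh + 1, step):
        for x in range(0, search.shape[1] - rw + 1, step):
            sad = np.abs(ref - search[y:y + rh, x:x + rw]).sum()
            if sad < best_sad:
                best_sad, best_pos = sad, (y, x)
    return best_pos
```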
- Step S6 The position range derivation unit 34 performs position range derivation processing to derive, based on the relative position from step S511 and the thinning number, the alignment range of the original image B with the original image A before reduction as the reference (S6). Specifically, the position range is obtained by taking the relative position from step S511 as the center and adding the number of pixels corresponding to the thinning number used in step S4.
- For example, when the thinning number is n, the position range derivation unit 34 adds or subtracts the thinning number to or from the coordinates of the four vertices and derives, as the alignment range, the rectangular region defined by the four vertices (x521 - n, y521 + n), (x522 + n, y522 + n), (x523 + n, y523 - n), and (x524 - n, y524 - n).
- Step S7 The alignment unit 30 repeats the next pass of loop i using the alignment range derived in step S6 as the new search range 51. When the image reduction magnification reaches unity, loop i is terminated (S7).
- At that point, the coordinates of the position range from step S6 coincide with the coordinates of the region of the original image B from which the comparison image 52a was cut out. Therefore, after loop i ends, the alignment unit 30 performs the alignment by overlapping the region of the original image A from which the reference image 50 was cut out with the alignment range of the original image B.
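- One way to organize the loop-i refinement is a standard coarse-to-fine search: match on decimated patches first, then re-search a neighbourhood of plus or minus the previous thinning number around the current estimate at each finer factor. The factor sequence, the SAD metric, and the helper names below are assumptions; the patent derives its thinning numbers from greatest common divisors as described above:

```python
import numpy as np

def refine_offset(reference: np.ndarray, image_b: np.ndarray,
                  y0: int, x0: int, margin: int, step: int) -> tuple[int, int]:
    """Search a (2*margin+1)^2 neighbourhood around (y0, x0), sampling every
    `step` pixels of the patches, and return the offset with the smallest SAD."""
    rh, rw = reference.shape
    ref_s = reference[::step, ::step].astype(np.float64)
    best = (np.inf, y0, x0)
    for y in range(max(0, y0 - margin), min(image_b.shape[0] - rh, y0 + margin) + 1):
        for x in range(max(0, x0 - margin), min(image_b.shape[1] - rw, x0 + margin) + 1):
            window = image_b[y:y + rh:step, x:x + rw:step].astype(np.float64)
            d = float(np.abs(ref_s - window).sum())
            if d < best[0]:
                best = (d, y, x)
    return best[1], best[2]

def coarse_to_fine(reference, image_b, coarse_y, coarse_x, factors=(8, 4, 2, 1)):
    """Loop-i style refinement: starting from a coarse estimate, re-search a
    neighbourhood the size of the previous thinning factor at each finer factor."""
    y, x = coarse_y, coarse_x
    for prev, cur in zip(factors[:-1], factors[1:]):
        y, x = refine_offset(reference, image_b, y, x, margin=prev, step=cur)
    return y, x
```

- The initial coarse estimate would come from a full search on images decimated by the first factor, for example with a routine like find_best_match above applied to reference[::8, ::8] and image_b[::8, ::8].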
- Step S8 The image joining unit 40 performs long image creation processing of the two original images A and B (S8). That is, the image joining unit 40 joins both images that are aligned in the alignment range derived by the alignment unit 30 in step S7, and creates a long image.
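- A minimal sketch of the joining step, assuming image A is the head-side image, both images have the same width, the alignment yields the number of overlapping rows and a lateral shift, and the overlap is blended by simple averaging; the patent only states that the aligned images are joined, so the blending rule and the crude np.roll shift are assumptions:

```python
import numpy as np

def join_long_image(image_a: np.ndarray, image_b: np.ndarray,
                    overlap_y: int, offset_x: int = 0) -> np.ndarray:
    """Join head-side image A above foot-side image B given the derived alignment:
    overlap_y is how many rows of A's bottom coincide with B's top, and offset_x
    is the lateral shift of B relative to A. Overlapping rows are averaged."""
    b = np.roll(image_b, offset_x, axis=1).astype(np.float64)  # crude lateral shift
    a = image_a.astype(np.float64)
    top = a[:-overlap_y, :]
    blend = (a[-overlap_y:, :] + b[:overlap_y, :]) / 2.0
    bottom = b[overlap_y:, :]
    return np.vstack([top, blend, bottom])
```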
- FIG. 9 is a schematic diagram illustrating a configuration example of an X-ray imaging apparatus according to the second embodiment.
- The X-ray imaging apparatus 100 of the present embodiment shown in FIG. 9 is an apparatus used for diagnosis in, for example, a hospital, and includes an X-ray tube 101 for X-ray irradiation and an X-ray detector 102, such as a flat panel detector, serving as a two-dimensional X-ray detector for detecting transmission X-ray images.
- The X-ray tube 101 and the X-ray detector 102 are synchronized with each other and are movable along the body axis direction Z of the subject M placed on the bed 105. With the subject M stationary, imaging is repeated while the X-ray tube 101 and the X-ray detector 102 are moved in the body axis direction Z of the subject M, thereby acquiring a plurality of X-ray images that are continuous along the body axis of the subject M and partially overlap.
- At each exposure, the X-ray detector 102 detects the transmission X-ray image formed by X-rays emitted from the X-ray tube 101 and passing through the subject M, converts the detection result into an electrical signal (X-ray detection signal), and outputs it to the image processing unit 103.
- the image processing unit 103 (image processing apparatus) includes hardware such as a CPU, ROM, RAM, and hard disk.
- The image processing unit 103 performs image processing for display (gradation conversion and joining) on the plurality of X-ray images acquired one after another from the X-ray detection signals output by the X-ray detector 102, and creates one long X-ray image corresponding to the long imaging region of the subject M (for example, the region from the abdomen to the lower limbs).
- the display unit 104 receives the output from the image processing unit 103 and displays a long X-ray image.
- The apparatus can also be configured so that the long X-ray image created by the image processing unit 103 is stored as digital image data in a recording device such as a magneto-optical disk device and transferred to an external device via a network.
- The image processing unit 103 may store the program of FIG. 2 and perform image processing similar to that of the first embodiment. This program realizes each function in cooperation with the hardware constituting the image processing unit 103. As a result, the X-ray images captured by the X-ray imaging apparatus 100 can be aligned by the same processing as in the first embodiment, joined based on the alignment result, and displayed on the display unit 104.
- a long image is created using the aligned image, but this alignment process can also be applied to alignment for creating a difference image.
- In the embodiments above, the normalization unit 331 normalizes the comparison images 52a, 52b, 52c, and 52d; alternatively, the original image B may be normalized and the comparison images 52a, 52b, 52c, and 52d may be cut out from the normalized original image B. The original image A and the normalized original image B may then be aligned and joined to create the long image. As a result, the joint between the original image A and the original image B becomes smooth.
- 1: medical image processing apparatus, 11: CPU, 12: main memory, 13: magnetic disk, 14: display memory, 15: monitor, 16: input device, 17: LAN port, 18: bus, 2: medical image capturing apparatus, 3: LAN, 4: image database, 100: X-ray imaging apparatus, 101: X-ray tube, 102: X-ray detector, 103: image processing unit, 104: display unit
Claims (15)
- 被検体の同一部位を重複させて撮影した複数の原画像を取得する画像取得手段と、
前記原画像において前記同一部位が撮影された領域を重複させて、前記複数の原画像の位置合わせを行う位置合わせ手段と、
前記位置合わせされた複数の原画像を用いた長尺画像を作成する画像作成手段と、を備え、
前記位置合わせ手段は、一方の前記原画像における前記同一部位が撮影された領域から基準画像を切り出すと共に、他方の前記原画像から、複数の比較画像を切り出す画像切出手段と、前記基準画像と前記各比較画像との差異を求め、この差異が最小になる前記比較画像が、前記他方の原画像から切り出された位置である相対位置を導出する相対位置導出手段と、を備え、前記位置合わせ手段は、前記相対位置に基づいて、前記複数の原画像の前記位置合わせを行う、
ことを特徴とする医用画像処理装置。 Image acquisition means for acquiring a plurality of original images taken by overlapping the same part of the subject;
An alignment unit that aligns the plurality of original images by overlapping regions in which the same part is imaged in the original image;
An image creating means for creating a long image using the plurality of aligned original images,
The positioning means cuts out a reference image from an area in which the same part in one of the original images is imaged, and cuts out a plurality of comparison images from the other original image, and the reference image A relative position deriving unit that obtains a difference from each of the comparison images and derives a relative position at which the comparison image that minimizes the difference is a position cut out from the other original image; The means performs the alignment of the plurality of original images based on the relative position.
A medical image processing apparatus. - 前記画像切出手段は、前記基準画像と同一形状の領域を、前記他方の原画像上において少なくとも一方向に沿ってずらしながら前記同一形状の領域を切り出すことにより、前記複数の比較画像を切り出す、
ことを特徴とする請求項1に記載に医用画像処理装置。 The image cutout means cuts out the plurality of comparative images by cutting out the region of the same shape while shifting the region of the same shape as the reference image along at least one direction on the other original image.
2. The medical image processing apparatus according to claim 1, wherein: - 前記位置合わせ手段は、前記基準画像及び前記比較画像の画像縮小を行う画像縮小手段と、前記相対位置と前記画像縮小の倍率とに基づいて、前記一方の原画像に対する他方の原画像の位置合わせ範囲を導出する位置範囲導出手段と、を更に備え、前記画像縮小の倍率を等倍に近づけながら、前記画像縮小処理、前記相対位置の導出、及び前記位置合わせ範囲の導出を繰り返し、前記画像縮小の比率が等倍になったときの位置合わせ範囲において、前記複数の原画像の位置合わせを行う、
ことを特徴とする請求項1に記載の医用画像処理装置。 The alignment unit is configured to perform image reduction of the reference image and the comparison image, and alignment of the other original image with respect to the one original image based on the relative position and the image reduction magnification. Position range deriving means for deriving a range, and repeatedly reducing the image reduction process, deriving the relative position, and deriving the alignment range while making the image reduction magnification close to the same magnification. In the alignment range when the ratio of 1 becomes equal, the plurality of original images are aligned.
2. The medical image processing apparatus according to claim 1, wherein: - 前記画像縮小手段は、前記基準画像及び前記比較画像から所定の間引き数の画素列を間引く処理を行い、
前記位置範囲導出手段は、前記相対位置を中心とし、前記所定の間引き数と同数の画素列を加算した範囲を前記位置合わせ範囲として導出する、
ことを特徴とする請求項3に記載の医用画像処理装置。 The image reduction means performs a process of thinning out a predetermined number of pixel rows from the reference image and the comparison image,
The position range deriving unit derives, as the alignment range, a range obtained by adding the same number of pixel rows as the predetermined thinning number with the relative position as a center;
4. The medical image processing apparatus according to claim 3, wherein - 前記相対位置導出手段は、前記基準画像と前記各比較画像の差異を導出する差異導出手段と、前記差異が最小となる前記比較画像を求め、当該比較画像が前記他方の原画像から切り出された領域の座標を前記相対位置の座標として導出する最小差異導出手段を更に備え、
前記位置範囲導出手段は、前記最小差異導出手段により導出された前記相対位置の座標に基づいて前記位置合わせ範囲を導出する、
ことを特徴とする請求項3に記載の医用画像処理装置。 The relative position deriving means obtains a difference deriving means for deriving a difference between the reference image and each comparison image, the comparison image having the minimum difference, and the comparison image is cut out from the other original image. Further comprising minimum difference deriving means for deriving the coordinates of the region as the coordinates of the relative position;
The position range deriving unit derives the alignment range based on the coordinates of the relative position derived by the minimum difference deriving unit;
4. The medical image processing apparatus according to claim 3, wherein - 前記相対位置導出手段は、前記基準画像の撮影条件と、前記比較画像の撮影条件と、を一致させるための正規化を行う正規化手段を更に備え、
前記相対位置導出手段は、前記基準画像の画素値と正規化された前記比較画像の画素値とに基づいて前記相対位置を導出する、又は前記比較画像の画素値と正規化された前記基準画像の画素値とに基づいて前記相対位置を導出する、
ことを特徴とする請求項1に記載の医用画像処理装置。 The relative position deriving unit further includes a normalizing unit that performs normalization to match the shooting condition of the reference image and the shooting condition of the comparative image,
The relative position deriving means derives the relative position based on the pixel value of the reference image and the normalized pixel value of the comparison image, or the normalized reference image and the pixel value of the comparison image Deriving the relative position based on the pixel value of
2. The medical image processing apparatus according to claim 1, wherein: - 前記正規化手段は、前記基準画像の画素値の平均値及び当該画素値の分布幅と、前記比較画像の画素値の平均値及び当該画素値の分布幅と、を求め、前記比較画像の画素値の平均値及び当該画素値の分布幅を、前記基準画像の画素値の平均値及び当該画素値の分布幅に一致させることにより、正規化を行う、又は前記基準画像の画素値の平均値及び当該画素値の分布幅を、前記比較画像の画素値の平均値及び当該画素値の分布幅に一致させることにより、正規化を行う、
ことを特徴とする請求項6に記載の医用画像処理装置。 The normalization means obtains an average value of pixel values of the reference image and a distribution width of the pixel values, and an average value of pixel values of the comparison image and a distribution width of the pixel values, and the pixels of the comparison image Normalization is performed by matching the average value of the values and the distribution width of the pixel values with the average value of the pixel values of the reference image and the distribution width of the pixel values, or the average value of the pixel values of the reference image And normalizing by matching the distribution width of the pixel values with the average value of the pixel values of the comparison image and the distribution width of the pixel values.
7. The medical image processing apparatus according to claim 6, wherein - 前記画像切出手段は、前記他方の原画像のおける前記同一部位が撮影された領域内に、前記基準画像よりも大きい領域からなる探索範囲を設定し、その探索範囲から前記比較画像を切り出す、
ことを特徴とする請求項1に記載の医用画像処理装置。 The image cutout means sets a search range consisting of a region larger than the reference image in a region where the same part of the other original image is imaged, and cuts out the comparison image from the search range.
2. The medical image processing apparatus according to claim 1, wherein: - 前記複数の原画像は、前記被検体の同一部位を含む、前記被検体の体軸方向に沿った異なる複数の位置で撮影された複数の原画像であり、
前記画像作成手段は、前記原画像において同一部位が撮影された領域を重ね合わせて前記複数の原画像を接合した長尺画像を作成する手段であり、
前記画像切出手段は、前記複数の原画像のうちの前記被検体の頭部側で撮影された画像の下部領域から前記基準画像を切り出すとともに、前記複数の原画像のうちの前記被検体の足側で撮影された画像の上部領域から前記比較画像を切り出す、又は、前記複数の原画像のうちの前記被検体の頭部側で撮影された画像の下部領域から前記比較画像を切り出すとともに、前記複数の原画像のうちの前記被検体の足側で撮影された画像の上部領域から前記基準画像を切り出す、
ことを特徴とする請求項1に記載の医用画像処理装置。 The plurality of original images are a plurality of original images taken at a plurality of different positions along the body axis direction of the subject, including the same part of the subject,
The image creating means is a means for creating a long image in which the plurality of original images are joined by superimposing regions where the same part is photographed in the original image,
The image cutout means cuts out the reference image from a lower region of an image photographed on the head side of the subject among the plurality of original images, and also extracts the reference image of the subject among the plurality of original images. Cut out the comparison image from the upper region of the image taken on the foot side, or cut out the comparison image from the lower region of the image taken on the head side of the subject of the plurality of original images, Cutting out the reference image from the upper region of the image taken on the foot side of the subject of the plurality of original images;
2. The medical image processing apparatus according to claim 1, wherein: - 前記複数の原画像は、X線撮影装置が備えるX線検出器と前記被検体とを、前記被検体の体軸方向に沿って相対移動させて撮影された原画像であり、
前記画像切出手段は、前記X線撮影装置から前記相対移動の量を示すデータを取得し、当該データに基づいて、前記一方の原画像における前記基準画像を切り出す位置を決定する、
ことを特徴とする請求項1に記載の医用画像処理装置。 The plurality of original images are original images taken by relatively moving the X-ray detector provided in the X-ray imaging apparatus and the subject along the body axis direction of the subject,
The image cutout unit acquires data indicating the amount of relative movement from the X-ray imaging apparatus, and determines a position to cut out the reference image in the one original image based on the data.
2. The medical image processing apparatus according to claim 1, wherein: - 前記医用画像処理装置は、前記一方の原画像を表示する表示手段と、
表示された前記一方の原画像上において指定領域を入力するための入力手段と、を更に備え、
前記画像切出手段は、前記表示手段に表示された前記一方の原画像上において入力された前記指定領域を、前記基準画像として切り出す、
ことを特徴とする請求項1に記載の医用画像処理装置。 The medical image processing apparatus includes display means for displaying the one original image;
An input means for inputting a designated area on the displayed one original image;
The image cutout means cuts out the designated area input on the one original image displayed on the display means as the reference image.
2. The medical image processing apparatus according to claim 1, wherein: - 前記複数の原画像は、同一の前記被検体に含まれる同一部位を異なる時間に撮影して得た複数の原画像であって、
前記画像作成手段は、前記長尺画像の作成に代えて、前記複数の原画像を差分して差分画像を作成する手段である、
ことを特徴とする請求項1に記載の医用画像処理装置。 The plurality of original images are a plurality of original images obtained by photographing the same part included in the same subject at different times,
The image creating means is means for creating a difference image by subtracting the plurality of original images instead of creating the long image.
2. The medical image processing apparatus according to claim 1, wherein: - 請求項1に記載の医用画像処理装置を備えたことを特徴とするX線撮影装置。 An X-ray imaging apparatus comprising the medical image processing apparatus according to claim 1.
- A medical image processing program for causing a computer to execute: a step of acquiring a plurality of original images obtained by imaging the same part of a subject in an overlapping manner; a step of cutting out a reference image from a region of one of the original images in which the same part is imaged, and cutting out a plurality of comparison images from the other original image; a step of obtaining the difference between the reference image and each of the comparison images, and deriving a relative position, which is the position at which the comparison image whose difference is the smallest was cut out from the other original image; a step of aligning the plurality of original images based on the relative position; and a step of creating a long image using the aligned plurality of original images.
- A medical image processing method comprising: a step of acquiring a plurality of original images obtained by imaging the same part of a subject in an overlapping manner; a step of cutting out a reference image from a region of one of the original images in which the same part is imaged, and cutting out a plurality of comparison images from the other original image; a step of obtaining the difference between the reference image and each of the comparison images, and deriving a relative position, which is the position at which the comparison image whose difference is the smallest was cut out from the other original image; a step of aligning the plurality of original images based on the relative position; and a step of creating a long image using the aligned plurality of original images.
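The program and method claims above describe the same pipeline: cut a reference image from one exposure, compare it against candidate positions in the other, take the smallest difference as the relative position, align, and join into a long image. The sketch below is an illustrative end-to-end reading of those steps for two vertically overlapping exposures; the band sizes, the SAD criterion, and the choice to keep the upper image's pixels in the overlap are assumptions, not requirements of the claims.

```python
# Illustrative end-to-end sketch (not part of the claims): derive the relative
# position of two overlapping exposures by template matching, then join them.
import numpy as np

def derive_relative_shift(upper: np.ndarray, lower: np.ndarray,
                          ref_rows: int = 20, search_rows: int = 60) -> int:
    """Return how many rows of `lower` overlap `upper` (assumes the true overlap
    lies between ref_rows and search_rows)."""
    reference = upper[-ref_rows:, :]                      # bottom band of the upper image
    best_row, best_sad = 0, np.inf
    for row in range(search_rows - ref_rows + 1):         # slide over the top band of `lower`
        candidate = lower[row:row + ref_rows, :]
        sad = np.abs(candidate.astype(np.int64) - reference.astype(np.int64)).sum()
        if sad < best_sad:
            best_sad, best_row = sad, row
    return best_row + ref_rows                            # rows of `lower` already present in `upper`

def stitch_long_image(upper: np.ndarray, lower: np.ndarray, overlap: int) -> np.ndarray:
    """Join the two exposures, keeping the upper image's pixels in the overlap."""
    return np.vstack([upper, lower[overlap:, :]])

# Synthetic demonstration: a 300-row "body" imaged as two overlapping 180- and 150-row exposures.
rng = np.random.default_rng(1)
body = rng.integers(0, 4096, size=(300, 128), dtype=np.uint16)
upper, lower = body[:180, :], body[150:, :]               # 30 rows of true overlap
overlap = derive_relative_shift(upper, lower)
long_image = stitch_long_image(upper, lower, overlap)
print(overlap, long_image.shape)                          # -> 30 (300, 128)
```

With real radiographs one would typically normalize the two exposures before matching and blend the overlapping rows rather than simply keep one side; the sketch only fixes the geometry.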
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011535476A JPWO2011043458A1 (en) | 2009-10-09 | 2010-10-08 | Medical image processing apparatus, X-ray imaging apparatus, medical image processing program, and medical image processing method |
CN201080045450.2A CN102596035B (en) | 2009-10-09 | 2010-10-08 | Medical image processing device, X-ray image capturing device, medical image processing program, and medical image processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009235272 | 2009-10-09 | ||
JP2009-235272 | 2009-10-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011043458A1 (en) | 2011-04-14 |
Family
ID=43856907
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/067729 WO2011043458A1 (en) | 2009-10-09 | 2010-10-08 | Medical image processing device, x-ray image capturing device, medical image processing program, and medical image processing method |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2011043458A1 (en) |
CN (1) | CN102596035B (en) |
WO (1) | WO2011043458A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9727964B2 (en) * | 2013-08-08 | 2017-08-08 | Shimadzu Corporation | Image processing device |
CN109106392A (en) | 2014-06-11 | 2019-01-01 | 佳能株式会社 | Image display device, display control unit and display control method |
JP6431292B2 (en) | 2014-06-11 | 2018-11-28 | キヤノン株式会社 | MEDICAL IMAGE DISPLAY DEVICE, ITS CONTROL METHOD, CONTROL DEVICE, PROGRAM |
JP6877109B2 (en) | 2016-04-13 | 2021-05-26 | キヤノン株式会社 | Image processing equipment, image processing methods, and programs |
EP3236418B1 (en) * | 2016-04-13 | 2020-10-28 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and storage medium |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001291088A (en) * | 2000-04-07 | 2001-10-19 | Ge Yokogawa Medical Systems Ltd | Medical image display device |
US20020159564A1 (en) * | 2001-04-30 | 2002-10-31 | Eastman Kodak Company | Mothod for acquiring a radiation image of a long body part using direct digital x-ray detectors |
JP3743404B2 (en) * | 2002-07-18 | 2006-02-08 | ノーリツ鋼機株式会社 | Image processing method, image processing program, and recording medium recording image processing program |
JP4230731B2 (en) * | 2002-07-29 | 2009-02-25 | 株式会社東芝 | Digital image processing apparatus and X-ray diagnostic apparatus |
JP4533587B2 (en) * | 2003-02-07 | 2010-09-01 | 株式会社東芝 | Medical image laminating device |
CN100338631C (en) * | 2003-07-03 | 2007-09-19 | 马堃 | On-site panoramic imagery method of digital imaging device |
US7756324B2 (en) * | 2004-11-24 | 2010-07-13 | Kabushiki Kaisha Toshiba | 3-dimensional image processing apparatus |
JP4807824B2 (en) * | 2005-07-07 | 2011-11-02 | 株式会社日立メディコ | Medical diagnostic imaging system |
2010
- 2010-10-08 CN CN201080045450.2A patent/CN102596035B/en not_active Expired - Fee Related
- 2010-10-08 WO PCT/JP2010/067729 patent/WO2011043458A1/en active Application Filing
- 2010-10-08 JP JP2011535476A patent/JPWO2011043458A1/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006175213A (en) * | 2004-11-24 | 2006-07-06 | Toshiba Corp | Three-dimensional image processing device |
JP2008067916A (en) * | 2006-09-14 | 2008-03-27 | Hitachi Medical Corp | Medical image processor |
JP2009136421A (en) * | 2007-12-05 | 2009-06-25 | Toshiba Corp | Diagnostic x-ray apparatus and x-ray image processing method, and storage medium |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014054417A1 (en) * | 2012-10-02 | 2014-04-10 | 株式会社島津製作所 | X-ray photographing device |
JPWO2014054417A1 (en) * | 2012-10-02 | 2016-08-25 | 株式会社島津製作所 | X-ray equipment |
JP2016059611A (en) * | 2014-09-18 | 2016-04-25 | 富士フイルム株式会社 | Radiation image photographing system, radiation image photographing apparatus, control apparatus, and composite radiation image generation method |
JP2016140510A (en) * | 2015-01-30 | 2016-08-08 | キヤノン株式会社 | Radiographic device, control device, long imaging system, control method and program |
US10380718B2 (en) | 2015-05-27 | 2019-08-13 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying medical image |
JP2018019874A (en) * | 2016-08-03 | 2018-02-08 | キヤノン株式会社 | Radiographic apparatus, radiographic system, radiographic method, and program |
Also Published As
Publication number | Publication date |
---|---|
CN102596035A (en) | 2012-07-18 |
CN102596035B (en) | 2014-12-10 |
JPWO2011043458A1 (en) | 2013-03-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011043458A1 (en) | Medical image processing device, x-ray image capturing device, medical image processing program, and medical image processing method | |
US7933376B2 (en) | X-ray CT system and a method for creating a scanning plan | |
US7426319B2 (en) | Image processing apparatus, image processing method, image processing system, program, and storage medium | |
US20140378828A1 (en) | Hip arthroplasty method and workflow | |
WO2015165222A1 (en) | Method and device for acquiring panoramic image | |
JP5567963B2 (en) | Image processing apparatus, radiation image system, image processing method, and program | |
JP2021524631A (en) | Medical image conversion | |
JP3910239B2 (en) | Medical image synthesizer | |
US20240148351A1 (en) | Image-based planning of tomographic scan | |
KR102453897B1 (en) | Apparatus and method for recovering 3-dimensional oral scan data using computed tomography image | |
US9582892B2 (en) | Radiation imaging apparatus, radiation imaging method, and program | |
JP2015208539A (en) | Image display apparatus, image display method, and program | |
CN111160326A (en) | CT scanning panoramic real-time monitoring method and system | |
US10650561B2 (en) | Composite radiographic image that corrects effects of parallax distortion | |
US20190051042A1 (en) | Ceiling map building method, ceiling map building device, and ceiling map building program | |
JP2004038860A (en) | Image processing method and device | |
JP5305687B2 (en) | X-ray video imaging system | |
JP2005136594A (en) | Image processing apparatus and control method thereof | |
JP2005305048A (en) | Apparatus and method for positioning patient | |
JP7211172B2 (en) | Dynamic image analysis system and dynamic image processing device | |
CN113140031B (en) | Three-dimensional image modeling system and method and oral cavity scanning equipment applying same | |
JP2001291087A (en) | Method and device for positioning image | |
JP4781346B2 (en) | Image processing method and image processing apparatus | |
US20050111615A1 (en) | Radiography apparatus and radiation image processing method | |
JP4078149B2 (en) | Image processing method and image processing apparatus |
Legal Events
Code | Title | Description |
---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 201080045450.2; Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10822128; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2011535476; Country of ref document: JP |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 10822128; Country of ref document: EP; Kind code of ref document: A1 |