WO2011043458A1 - Dispositif de traitement d'images médicales, dispositif de capture de radiographies, programme de traitement d'images médicales, et procédé de traitement d'images médicales - Google Patents

Dispositif de traitement d'images médicales, dispositif de capture de radiographies, programme de traitement d'images médicales, et procédé de traitement d'images médicales Download PDF

Info

Publication number
WO2011043458A1
WO2011043458A1 PCT/JP2010/067729
Authority
WO
WIPO (PCT)
Prior art keywords
image
original
comparison
images
original images
Prior art date
Application number
PCT/JP2010/067729
Other languages
English (en)
Japanese (ja)
Inventor
真弥 勝間田
克己 鈴木
Original Assignee
株式会社 日立メディコ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社 日立メディコ filed Critical 株式会社 日立メディコ
Priority to CN201080045450.2A priority Critical patent/CN102596035B/zh
Priority to JP2011535476A priority patent/JPWO2011043458A1/ja
Publication of WO2011043458A1 publication Critical patent/WO2011043458A1/fr

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/387Composing, repositioning or otherwise geometrically modifying originals
    • H04N1/3876Recombination of partial images to recreate the original image
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5229Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5235Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
    • A61B6/5241Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT combining overlapping images of the same imaging modality, e.g. by stitching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/14Transformations for image registration, e.g. adjusting or mapping for alignment of images

Definitions

  • The present invention relates to a medical image processing apparatus, an X-ray imaging apparatus, a medical image processing program, and a medical image processing method, and in particular to a medical image processing apparatus, an X-ray imaging apparatus, a medical image processing program, and a medical image processing method that create a long image using two or more images.
  • Long imaging refers to an imaging technique in which a subject that extends beyond the detection area of one X-ray detector is imaged multiple times, with multiple X-ray detectors or with a single X-ray detector, and the resulting images are joined together.
  • Conventionally, there are radiographic imaging apparatuses in which the images are aligned manually, and, as disclosed in Patent Document 1, radiographic imaging apparatuses that automatically derive the connection position from the positional relationship of the X-ray detectors and create a joined long image.
  • However, the technique of Patent Document 1 has the problems that an error occurs in the connection position due to the mechanical accuracy when moving the X-ray detector, and that the joint position of the images cannot be derived unless the positional relationship of the X-ray detectors is known.
  • The present invention has been made in view of the above problems, and it is an object of the present invention to provide a medical image processing apparatus, a program, and an X-ray imaging apparatus capable of performing image alignment without depending on the mechanical accuracy when moving the X-ray detector, even when the positional relationship of the X-ray detectors is not known.
  • To achieve this object, a medical image processing apparatus according to the present invention includes: an image acquisition unit that acquires a plurality of original images captured so that the same part of a subject overlaps; an alignment unit that aligns the plurality of original images by overlapping the regions in which the same part is imaged; and an image creation unit that creates a long image using the plurality of aligned original images. The alignment unit includes a relative position deriving unit that cuts out a reference image from the region in which the same part is imaged in one original image, cuts out a plurality of comparison images from the other original image, obtains the difference between the reference image and each comparison image, and derives as the relative position the position at which the comparison image with the smallest difference was cut out from the other original image; the alignment unit aligns the plurality of original images based on this relative position.
  • An X-ray imaging apparatus according to the present invention includes the above-described medical image processing apparatus.
  • The medical image processing program according to the present invention causes a computer to execute: a step of acquiring a plurality of original images obtained by imaging the same part of a subject in an overlapping manner; a step of cutting out a reference image from the region in which the same part is imaged in one original image, cutting out a plurality of comparison images from the other original image, obtaining the difference between the reference image and each comparison image, and deriving as the relative position the position at which the comparison image with the smallest difference was cut out from the other original image; a step of aligning the plurality of original images based on the relative position; and a step of creating a long image using the plurality of aligned original images.
  • The medical image processing method according to the present invention includes: a step of acquiring a plurality of original images obtained by imaging the same part of a subject in an overlapping manner; a step of cutting out a reference image from the region in which the same part is imaged in one original image, cutting out a plurality of comparison images from the other original image, obtaining the difference between the reference image and each comparison image, and deriving as the relative position the position at which the comparison image with the smallest difference was cut out from the other original image; a step of aligning the plurality of original images based on the relative position; and a step of creating a long image using the plurality of aligned original images.
  • According to the present invention, image alignment can be performed without depending on the mechanical accuracy of the X-ray detector, and a long image can be created.
  • FIG. 1 is a schematic diagram showing a configuration example of a medical image processing apparatus 1 according to a first embodiment.
  • FIG. 2 is a block diagram showing a program stored in the medical image processing apparatus 1.
  • FIG. 3 is an explanatory diagram explaining the outline of the alignment.
  • FIG. 4 is a flowchart showing the flow of processing of the medical image processing apparatus according to the first embodiment.
  • FIG. 5 is an explanatory diagram for explaining the thinning number and the number of comparisons between pixels.
  • FIG. 6 is a flowchart showing the flow of the relative position derivation processing.
  • FIG. 7 is a flowchart showing the flow of the normalization processing.
  • FIG. 8 is an explanatory diagram showing the contents of the normalization processing.
  • FIG. 9 is a schematic diagram showing a configuration example of an X-ray imaging apparatus according to a second embodiment.
  • FIG. 1 is a schematic diagram showing a configuration example of the medical image processing apparatus 1 according to the first embodiment.
  • FIG. 2 is a block diagram showing a program stored in the medical image processing apparatus 1.
  • The medical image processing apparatus 1 in FIG. 1 performs image processing of stored images, and is connected via a network such as a LAN 3 to the medical image capturing apparatus 2 and to the image database 4 that manages the stored images.
  • The medical image processing apparatus 1 includes: a central processing unit (CPU) 11 as a control device that mainly controls the operation of each component; a main memory 12 that stores the control program of the apparatus and serves as a work area when the program is executed; a magnetic disk 13 that stores applications such as the operating system and the medical image processing program, as well as various data; a display memory 14 that temporarily stores display data; a display device 15 that displays images based on the data from the display memory 14; an input device 16 such as a mouse or keyboard serving as a position input device; a LAN port 17 for connecting to the network; and a bus 18 that connects these devices.
  • The medical image capturing apparatus 2 may be any apparatus that can capture a medical image of a subject, and is configured by, for example, an X-ray imaging apparatus or an MRI apparatus.
  • The medical image processing apparatus 1 stores the medical image processing program shown in FIG. 2.
  • This medical image processing program includes an image acquisition unit 20 that acquires a plurality of medical images to be aligned (hereinafter referred to as “original images”), an alignment unit 30 that aligns the original images, and an image joining unit 40 that joins the aligned original images to create a long image.
  • The alignment unit 30 includes: an image cutout unit 31 that cuts out a reference image and a plurality of comparison images, described later, from the original images; an image reduction unit 32 that performs image reduction processing on the reference image and the comparison images; a relative position deriving unit 33 that obtains the difference between the reference image and each comparison image and derives the position (hereinafter also referred to as the “relative position”) at which the comparison image with the smallest difference was cut out from the original image; and a position range deriving unit 34 that derives the alignment range of the original image from the relative position and the magnification of the image reduction.
  • The relative position deriving unit 33 includes a normalization unit 331 that normalizes the reference image and the comparison images, a difference deriving unit 332 that derives the difference between the pixel values of the reference image and the pixel values of each comparison image, and a minimum difference deriving unit 333 that derives the comparison image having the smallest difference.
  • These medical image processing programs are stored in the magnetic disk 13, and their functions are realized by being loaded into the main memory 12 and executed by the central processing unit (CPU) 11.
  • FIG. 3 is an explanatory diagram for explaining an outline of alignment of the medical image processing apparatus 1 according to the first embodiment.
  • FIG. 4 is a flowchart showing a process flow of the medical image processing apparatus according to the first embodiment.
  • FIG. 5 is an explanatory diagram for explaining the thinning-out number and the number of comparisons between pixels.
  • FIG. 6 is a flowchart showing the flow of the relative position derivation process.
  • FIG. 7 is a flowchart showing the flow of normalization processing.
  • FIG. 8 is an explanatory diagram showing the contents of the normalization process.
  • In the present embodiment, an X-ray imaging apparatus is used as the medical image capturing apparatus 2, and images are captured while shifting the X-ray detector of this X-ray imaging apparatus to different positions so that the images have an overlapping portion with respect to the subject.
  • A process for creating a long image by joining two original images captured in this way, namely the original image A and the original image B in FIG. 3, will be described as an example.
  • The original image A and the original image B are images taken with overlapping portions along the body axis direction of the subject. Therefore, the lower area (foot side) of the original image A and the upper area (head side) of the original image B each include a region in which the same part of the subject is imaged.
  • The region in which the same part of the subject is imaged corresponds to the above-described overlapping portion (hereinafter referred to as the “overlapping area”). When the original image A and the original image B are aligned and superimposed so that the overlapping areas coincide, a long image in which the original image A and the original image B are continuous is created.
  • For the alignment between the original image A and the original image B, a search range 51 having an area larger than the reference image 50, which is a partial area within the overlapping area of the original image A, is defined within the overlapping area of the original image B.
  • Then, the position within the search range 51 at which the reference image 50 is located is searched for, and the original image A and the original image B are aligned based on the search result.
  • The reference image 50 includes only a portion of the overlapping area of the original image A. Therefore, when the overlapping area is an area extending several centimeters from the lower end of the original image A, the width of the reference image 50 in the y direction is set within a range not exceeding those several centimeters from the lower end of the original image A.
  • In general, the overlapping area is often set in a range of about 30 to 40 mm.
  • In the present embodiment, the reference image 50 is configured as a rectangular region whose width in the y direction extends 70 pixels from the lower end of the original image A, and whose width in the x direction extends between points several centimeters inward from the left and right ends.
  • The shape of the reference image 50 may be any shape, and is not limited to a rectangular shape.
  • The width of the search range 51 in the y direction starts from the upper end of the original image B, does not exceed the above-mentioned several centimeters, and is wider than the width of the reference image 50 in the y direction.
  • The x-direction width of the search range 51 preferably includes the x-direction width of the reference image 50. Therefore, in the present embodiment, the width of the search range 51 in the x direction is the entire area from the left end to the right end of the original image B, that is, the full width of the original image B in the x direction.
  • A plurality of comparison images 52a, 52b, 52c, and 52d are created by cutting out regions having the same shape as the reference image 50 from the search range 51 while shifting the cut-out position by a predetermined number of pixels in the x and y directions.
  • Each of the comparison images 52a, 52b, 52c, and 52d is recorded in association with the coordinates, based on the origin (0, 0) of the original image B, of the region that was cut out to create it. For example, for the comparison image 52a, the coordinates (x52_1, y52_1), (x52_2, y52_2), (x52_3, y52_3), and (x52_4, y52_4) of the four vertices of the rectangular region cut out to create the comparison image 52a are recorded.
  • In the present embodiment, the y direction is fixed and a plurality of comparison images are created while shifting in the x direction; however, the x direction may instead be fixed and a plurality of comparison images created while shifting in the y direction.
  • The comparison images may also be cut out while shifting the region having the same shape as the reference image 50 along any single arbitrary direction, not limited to the x and y directions.
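As a concrete sketch, the sliding-window cut-out of comparison images described above might look as follows; the function name, the `step` parameter, and the toy array are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def cut_comparison_images(search_range, ref_h, ref_w, step=1):
    """Cut out comparison images of the same shape as the reference image
    by sliding a window over the search range in the x and y directions.
    Returns (y, x, patch) tuples, where (y, x) is the cut-out position."""
    patches = []
    H, W = search_range.shape
    for y in range(0, H - ref_h + 1, step):
        for x in range(0, W - ref_w + 1, step):
            patches.append((y, x, search_range[y:y + ref_h, x:x + ref_w]))
    return patches

search = np.arange(20).reshape(4, 5)      # toy 4x5 "search range 51"
patches = cut_comparison_images(search, 2, 3)
# every patch has the same shape (2x3) as the hypothetical reference image
```

Recording the cut-out position alongside each patch corresponds to the patent's recording of the vertex coordinates of each comparison image.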
  • In the present embodiment, the reference image 50 is cut out from the original image A, the search range 51 is set in the original image B, and the comparison images 52a, 52b, 52c, and 52d are cut out from it; however, the reference image 50 may instead be cut out from the original image B, and the comparison images 52a, 52b, 52c, and 52d cut out by setting the search range 51 in the original image A.
  • Step S1 The image acquisition unit 20 performs image acquisition processing, that is, acquires two partially overlapping original images A and B (S1).
  • the image acquisition unit 20 may obtain two original images A and B from the medical image photographing apparatus 2, or may obtain two original images A and B from the image database 4.
  • Step S2 The alignment unit 30 performs alignment processing (S2).
  • This alignment process is a process name that collectively refers to processes from steps S3 to S7 described later.
  • Step S3 The alignment unit 30 starts a loop i for repeating the processing from step S3 to step S7 (S3).
  • In the first iteration of loop i, the image reduction processing is performed using the smallest magnification to create small images, and then the processing from step S5 onward is performed.
  • In the second and subsequent iterations, the image reduction processing is performed using a larger magnification than before, and the processing from step S4 onward is performed. When the image reduction magnification reaches equality (1x), that is, when the processing from S4 onward is performed using the reference image and the comparison images without image reduction, loop i ends.
  • Step S4 the image reduction unit 32 performs image reduction processing to increase the processing speed (S4).
  • the image may be reduced by a downsampling method or an average value method.
  • In the present embodiment, an optimal thinning number is calculated by the following method, and image reduction is performed using that thinning number.
  • The image reduction unit 32 may perform image reduction on the original image A and the original image B and cut out the reference image 50 and the comparison images 52a, 52b, 52c, and 52d from these reduced images, or it may cut out the reference image 50 and the comparison images 52a, 52b, 52c, and 52d from the original image A and original image B before reduction and perform image reduction processing on these. That is, as long as the reference image 50 and the comparison images 52a, 52b, 52c, and 52d subjected to the following processing have been reduced, either method may be used.
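The two reduction methods mentioned earlier (the down-sampling method and the average-value method) can be sketched as follows. Interpreting the thinning number n as "skip n pixels between kept pixels" is an assumption for illustration:

```python
import numpy as np

def reduce_by_thinning(img, n):
    """Down-sampling method: keep every (n+1)-th pixel in each
    direction, skipping n pixels (the 'thinning number')."""
    return img[::n + 1, ::n + 1]

def reduce_by_average(img, n):
    """Average-value method: replace each (n+1)x(n+1) block of pixels
    by its mean value (edge remainders are cropped for simplicity)."""
    k = n + 1
    H, W = (img.shape[0] // k) * k, (img.shape[1] // k) * k
    return img[:H, :W].reshape(H // k, k, W // k, k).mean(axis=(1, 3))

img = np.ones((8, 8))
small = reduce_by_thinning(img, 1)   # keep every other pixel -> 4x4
avg = reduce_by_average(img, 1)      # 2x2 block means -> 4x4
```

Either method shrinks the reference and comparison images before the difference search, which is what makes the coarse iterations of loop i fast.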
  • Here, the number of comparisons between the pixels of the reference image 50 and the search range 51 in loop i will be described based on FIG. 5.
  • Let X be the number of pixels in the horizontal direction (x direction) of the search range 51, Y the number of pixels in its vertical direction (y direction), X′ the number of pixels in the horizontal direction of the reference image 50, Y′ the number of pixels in its vertical direction, a_X the number of pixels of the one-sided search area in the horizontal direction (left side), and a_Y the number of pixels of the one-sided search area in the vertical direction (upper side).
  • When thinning is not performed, the number N of comparisons between pixels is expressed by the following equation (1):
  • N = X′ × Y′ × (2 a_X + 1) × (2 a_Y + 1) … (1)
  • N: number of comparisons between pixels; X′: number of pixels arranged in the horizontal direction of the cut-out reference image; Y′: number of pixels arranged in the vertical direction of the cut-out reference image; a_X: one-sided (left) search area in the horizontal direction; a_Y: one-sided (upper) search area in the vertical direction
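One plausible reading of equation (1) — a symmetric search of ±a_X pixels horizontally and ±a_Y pixels vertically around the initial position, with one pixel comparison per reference-image pixel at each candidate position — can be checked numerically (the concrete sizes below are hypothetical):

```python
def comparison_count(ref_w, ref_h, a_x, a_y):
    """Pixel comparisons without thinning: each of the
    (2*a_x + 1) * (2*a_y + 1) candidate positions costs one
    comparison per reference-image pixel (ref_w * ref_h)."""
    return ref_w * ref_h * (2 * a_x + 1) * (2 * a_y + 1)

# hypothetical 100x70-pixel reference image, +/-10 px horizontal
# and +/-5 px vertical search
n = comparison_count(100, 70, 10, 5)
```

Thinning by a factor k reduces both the number of candidate positions and the pixels per comparison, which is why the coarse-to-fine loop i pays off.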
  • Step S5 The relative position deriving unit 33 performs relative position derivation processing to derive the relative position (S5).
  • The relative position derivation processing will be described below in the order of the steps in FIG. 6.
  • Step S51 The image cutout unit 31 cuts out the reference image 50 from the overlapping area of the original image A (S51). Normally, in the case of long imaging, the overlapping area of the original images A and B is set to 30 mm to 40 mm. The position at which the reference image 50 is cut out may therefore be derived by the image cutout unit 31 obtaining from the X-ray imaging apparatus the geometric positional relationship of the X-ray detector when the original image A and the original image B were captured by shifting the X-ray detector (for example, the movement amount corresponding to the above-mentioned 30 mm to 40 mm). Alternatively, the user may display the original image A on the display device 15, and the image cutout unit 31 may cut out, as the reference image 50, the designated area input on the original image A with the input device 16.
  • Step S52 The image cutout unit 31 sets, within the overlapping area of the original image B, a search range 51 that includes the area in which the reference image 50 is located (S52).
  • The search range 51 is provided in order to find the comparison image that most closely matches the reference image 50 at higher speed. Therefore, this step may be omitted and the comparison images cut out from the original image B without setting the search range 51.
  • Step S53 The relative position deriving unit 33 starts the loop j (S53).
  • Step S54 The image cutout unit 31 cuts out the comparison images 52a and 52b having the same size as the reference image 50 from the search range 51 (S54).
  • Specifically, the image cutout unit 31 cuts out the comparison images 52a and 52b by cutting out regions having the same shape as the reference image 50 at the y-direction position to be processed in loop j, while shifting the cut-out position by a predetermined number of pixels in the x direction.
  • Step S55 The normalizing unit 331 performs normalization processing to normalize the difference between the shooting conditions of the comparison images 52a and 52b and the shooting condition of the reference image 50 (S55). This normalization process will be described with reference to FIGS.
  • Step S551 The normalizing unit 331 performs logarithmic conversion processing on the cut-out reference image 50 and comparison images 52a and 52b (S551).
  • Step S552 The normalizing unit 331 performs a histogram derivation process to derive a histogram from the logarithmically transformed image (S552).
  • Next, the normalization unit 331 performs, on the reference image 50 and the comparison images 52a and 52b, a process of extracting the region in which the subject is photographed (hereinafter referred to as the “subject region”).
  • A known method may be used for the extraction of the subject region; in the present embodiment, the normalization unit 331 performs a first deletion process that deletes, from the reference image 50 and the comparison images 52a and 52b, the region in which the X-ray diaphragm of the X-ray imaging apparatus is imaged, and a second deletion process that deletes the region on which X-rays are directly incident.
  • The second deletion process derives the signal level obtained when X-rays are directly incident on the X-ray detector from the sensitivity of the X-ray detector and the X-ray energy at the time of X-ray imaging, and can be performed by applying threshold processing to the X-ray image using a threshold that can distinguish the directly irradiated region.
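The threshold processing of the second deletion process might be sketched as follows. The threshold value is just a parameter here; as the text notes, in practice it would be derived from the detector sensitivity and the X-ray energy, and the toy pixel values are purely illustrative:

```python
import numpy as np

def remove_direct_irradiation(img, threshold):
    """Second deletion process (sketch): keep only pixels whose signal
    level is below a threshold separating the subject region from the
    region where X-rays hit the detector directly (high signal)."""
    mask = img < threshold      # True where the subject attenuates X-rays
    return img[mask]            # pixel values belonging to the subject region

img = np.array([[10.0, 200.0],
                [30.0, 250.0]])            # toy detector signal levels
subject = remove_direct_irradiation(img, 100.0)
```

The surviving pixel values are what the histograms of the subsequent steps are computed over.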
  • Next, the normalization unit 331 derives a histogram indicating the distribution of pixel values in each of the subject regions extracted from the reference image 50 and all of the comparison images 52a and 52b.
  • FIGS. 8(a) and 8(b) show the histograms of the original images A and B obtained in this step. Note that FIG. 8(b) shows only the histogram of the comparison image 52a for convenience of explanation, but in practice the same number of histograms as the number of comparison images cut out in step S54 are derived.
  • In the following, the comparison image 52a will be described as an example, but the processing from S553 to S556 is performed not only on the comparison image 52a but also on the other comparison images cut out in step S54.
  • Steps S553, S554 The normalization unit 331 calculates an average value of pixel values of the reference image 50 and the comparison image 52a and a distribution width of the pixel values as parameters for use in the normalization process (S553, 554).
  • In step S553, the normalization unit 331 performs histogram average value derivation processing to obtain an average value from each derived histogram (S553), and in step S554, it performs histogram width derivation processing to derive the width of each histogram from its start position to its end position (S554).
  • A_ave and A_wide in FIG. 8(a) indicate the average value and width of the histogram of the reference image 50, and B_ave and B_wide in FIG. 8(b) indicate the average value and width of the histogram of the comparison image 52a. Note that the process of calculating the average value and distribution width of the pixels of the reference image 50 may be performed only once at the beginning of loop j.
  • Step S555 As shown in FIG. 8(c), in order to align the density reference of the images, the normalization unit 331 performs histogram average value difference processing in which the difference between the average value of the reference image 50 and the average value of the comparison image 52a is added to all the pixels of the comparison image 52a according to the following equation (3) (S555).
  • P_B′ = P_B + (P_Aave − P_Bave) … (3)
  • P_B: pixel value of the comparison image; P_B′: pixel value after the histogram average value difference processing; P_Aave: average pixel value of the reference image; P_Bave: average pixel value of the comparison image
  • Step S556 Next, in order to align the spread of the pixel values, the normalization unit 331 performs histogram width division processing according to the following equation (4) (S556).
  • P_B″ = P_B′ × (P_Awide / P_Bwide) … (4)
  • P_B″: pixel value of the comparison image after the histogram width division processing, i.e., the pixel value of the comparison image after normalization; P_Awide: histogram width of the reference image; P_Bwide: histogram width of the comparison image
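Equations (3) and (4) together amount to a mean shift followed by a width rescaling. A minimal numpy sketch, with the histogram width approximated as max − min of the pixel values (an assumption; the patent derives it from the histogram's start and end positions):

```python
import numpy as np

def normalize_comparison(comp, ref):
    """Normalize a comparison image to the reference image:
    equation (3) shifts by the difference of the averages,
    equation (4) scales by the ratio of the histogram widths."""
    shifted = comp + (ref.mean() - comp.mean())            # eq. (3)
    return shifted * ((ref.max() - ref.min()) /
                      (shifted.max() - shifted.min()))     # eq. (4)

ref = np.array([0.0, 10.0])     # toy "reference image" pixel values
comp = np.array([2.0, 4.0])     # toy "comparison image" pixel values
norm = normalize_comparison(comp, ref)
```

After this step, differences between the reference image and a comparison image reflect anatomical mismatch rather than exposure differences, which is what makes the subsequent difference minimization meaningful.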
  • Step S56 The relative position deriving unit 33 starts loop k (S56). In loop k, the processing is repeated the same number of times as the number of comparison images cut out in step S54.
  • Step S57 The difference deriving unit 332 performs difference derivation processing between the images in order to calculate how close the normalized comparison images 52a and 52b are to the reference image (S57). Specifically, the difference derivation processing can be realized by subtracting each comparison image 52a, 52b from the reference image 50 and deriving the total value of the differences as a parameter; the smaller this total value, the closer the reference image is to the normalized comparison image.
  • Step S58 The minimum difference deriving unit 333 performs minimum difference derivation processing to determine whether the difference parameter obtained by the difference derivation processing in step S57 is the smallest value so far in the iterations of loop k (S58). In this way, the image having the smallest difference from the reference image 50 is determined.
  • Step S59 When the relative position deriving unit 33 repeats the processes from Steps S56 to 59 for all of the plurality of comparison images 52a and 52b cut out in Step S54, the loop k ends (S59).
  • Step S510 When the relative position deriving unit 33 repeats the processing from step S53 to step S510 for all the positions in the y direction of the search range 51, the loop j ends (S510).
  • the process in the y direction is repeated with loop j and the process in the x direction is repeated with loop k, but the process in the x direction with loop j and the process in the y direction with loop k may be repeated.
  • Step S511 Based on the results of step S58, the minimum difference deriving unit 333 determines the comparison image having the smallest difference from the reference image 50 within the search range 51, and obtains the coordinates, in the original image B, of the region from which that comparison image indicating the minimum difference position was cut out (S511).
  • More specifically, the main memory 12 records, for each position in the y direction, the comparison image indicating the minimum difference position among the plurality of comparison images cut out along the x direction at that y-direction position. The minimum difference deriving unit 333 therefore obtains the comparison image having the smallest difference from among these.
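The search of loops j and k can be condensed into a single exhaustive sketch. The patent speaks only of "the total value of the differences"; taking absolute differences is one plausible interpretation, and the toy images are illustrative:

```python
import numpy as np

def find_min_difference(original_b, ref):
    """Compare the reference image with every same-shaped candidate
    region and keep the position whose total (absolute) difference is
    smallest -- the relative position of the minimum difference."""
    H, W = original_b.shape
    h, w = ref.shape
    best_pos, best_diff = None, None
    for y in range(H - h + 1):            # corresponds to loop j
        for x in range(W - w + 1):        # corresponds to loop k
            diff = np.abs(original_b[y:y + h, x:x + w] - ref).sum()
            if best_diff is None or diff < best_diff:
                best_pos, best_diff = (y, x), diff
    return best_pos, best_diff

b = np.zeros((4, 4))
b[1:3, 1:3] = [[1, 2], [3, 4]]            # bury the pattern at (1, 1)
ref = np.array([[1.0, 2.0], [3.0, 4.0]])
pos, diff = find_min_difference(b, ref)
```

Restricting the outer loops to the search range 51, and running this on reduced images first, recovers the coarse-to-fine behavior of loop i.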
  • Step S6 The position range deriving unit 34 performs position range derivation processing to derive the alignment range of the original image B relative to the original image A before reduction, based on the relative position obtained in step S511 and the thinning number (S6). Specifically, the range obtained by adding the number of pixels corresponding to the thinning number in step S4 around the relative position obtained in step S511 is taken as the position range.
  • That is, the position range deriving unit 34 derives as the alignment range the rectangular region obtained by expanding the region defined by the four recorded vertex coordinates (x52_1, y52_1), (x52_2, y52_2), (x52_3, y52_3), and (x52_4, y52_4) of the minimum-difference comparison image outward by the thinning number n in both the x and y directions.
  • Step S7 If the image reduction magnification has not yet reached 1, the alignment unit 30 repeats the next iteration of loop i using the alignment range derived in step S6 as the search range 51; otherwise, loop i is terminated (S7).
  • In the final iteration of loop i no thinning is performed, so the coordinates of the position range in step S6 coincide with the coordinates of the region from which the comparison image 52a in the original image B was cut out. Therefore, after loop i ends, the alignment unit 30 performs the alignment by overlapping the region from which the reference image 50 of the original image A was cut out with the alignment range of the original image B.
  • Step S8 The image joining unit 40 performs long image creation processing for the two original images A and B (S8). That is, the image joining unit 40 joins the two images aligned within the alignment range derived by the alignment unit 30 in step S7, thereby creating a long image.
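Once the alignment has established how many rows the two images share, the joining step itself can be sketched minimally; seam blending and x-direction offsets, which a real implementation would handle, are omitted here as assumptions:

```python
import numpy as np

def join_long_image(img_a, img_b, overlap):
    """Join two aligned original images into one long image: the last
    `overlap` rows of image A and the first `overlap` rows of image B
    show the same part, so image B contributes only its remaining rows."""
    return np.vstack([img_a, img_b[overlap:]])

a = np.ones((5, 3))               # toy "original image A"
b = np.ones((5, 3))               # toy "original image B"
long_img = join_long_image(a, b, 2)   # 5 + (5 - 2) = 8 rows
```

The `overlap` value corresponds to the y component of the relative position derived in step S511.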
  • FIG. 9 is a schematic diagram illustrating a configuration example of an X-ray imaging apparatus according to the second embodiment.
  • The X-ray imaging apparatus 100 of the present embodiment shown in FIG. 9 is an apparatus used for diagnosis in, for example, a hospital, and includes an X-ray tube 101 for X-ray irradiation and an X-ray detector 102, such as a two-dimensional flat panel detector, for detecting transmitted X-ray images.
  • The X-ray tube 101 and the X-ray detector 102 are configured to move in synchronization with each other along the body axis direction Z of the subject M placed on the bed 105. While the subject M remains stationary, imaging is repeated while moving the X-ray tube 101 and the X-ray detector 102 in the body axis direction Z of the subject M, whereby a plurality of continuous, partially overlapping X-ray images can be acquired.
  • the X-ray detector 102 detects, at each imaging, a transmitted X-ray image irradiated from the X-ray tube 101 and passed through the subject M, and converts the detection result into an electrical signal (X-ray detection signal), which is output to the image processing unit 103.
  • the image processing unit 103 (image processing apparatus) includes hardware such as a CPU, ROM, RAM, and hard disk.
  • the image processing unit 103 (image processing apparatus) performs image processing for display (gradation conversion and joining) on the plurality of X-ray images acquired one after another from the X-ray detection signals output by the X-ray detector 102, and creates a single long X-ray image corresponding to the long imaging region of the subject M (for example, the region from the abdomen to the lower limbs).
  • the display unit 104 receives the output from the image processing unit 103 and displays a long X-ray image.
  • the long X-ray image created by the image processing unit 103 may also be stored as digital image data in a recording device such as a magneto-optical disk device, and may be configured to be transferred to an external device via a network.
  • the image processing unit 103 may store the program of FIG. 2 and perform image processing similar to that of the first embodiment. This program realizes each function in cooperation with the hardware constituting the image processing unit 103. As a result, the X-ray images captured by the X-ray imaging apparatus 100 can be aligned by the same processing as in the first embodiment, joined based on the alignment result, and displayed on the display unit 104.
  • in the embodiments above, a long image is created from the aligned images, but this alignment process can also be applied to the alignment required for creating a difference image.
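For example, once the relative position (dy, dx) of original image B with respect to original image A is known from the alignment, a difference image over the common region could be formed as below; the function and the offset convention are illustrative, not taken from the embodiment:

```python
import numpy as np

def difference_image(img_a, img_b, dy, dx):
    """Form a difference image over the region where img_b, shifted by the
    alignment offset (dy, dx), overlaps img_a. Positive dy/dx mean img_b's
    origin lies dy rows down and dx columns right of img_a's origin."""
    ha, wa = img_a.shape
    hb, wb = img_b.shape
    # Overlap rectangle expressed in img_a coordinates.
    y0, y1 = max(dy, 0), min(ha, dy + hb)
    x0, x1 = max(dx, 0), min(wa, dx + wb)
    a = img_a[y0:y1, x0:x1].astype(np.float64)
    b = img_b[y0 - dy:y1 - dy, x0 - dx:x1 - dx].astype(np.float64)
    return a - b
```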
  • the normalization unit 331 normalizes the comparison images 52a, 52b, 52c, and 52d; alternatively, the original image B may be normalized first and the comparison images 52a, 52b, 52c, and 52d cut out from the normalized original image B. The original image A and the normalized original image B may then be aligned, and the two joined to create the long image. As a result, the seam between the original image A and the original image B becomes smooth.
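A sketch of that alternative, assuming "normalization" means matching the mean and standard deviation of image B's overlap region to image A's; the embodiment does not state the normalization rule, so this criterion and the names are assumptions:

```python
import numpy as np

def normalize_to_reference(img_b, overlap_a, overlap_b):
    """Normalize original image B so that its overlap region matches the
    mean and standard deviation of image A's overlap region, before the
    comparison images are cut out of B."""
    b = img_b.astype(np.float64)
    mu_a, sd_a = overlap_a.mean(), overlap_a.std()
    mu_b, sd_b = overlap_b.mean(), overlap_b.std()
    if sd_b == 0:
        return b - mu_b + mu_a  # flat overlap region: match means only
    return (b - mu_b) * (sd_a / sd_b) + mu_a
```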
  • 1 medical image processing device, 11 CPU, 12 main memory, 13 magnetic disk, 14 display memory, 15 monitor, 16 input device, 17 LAN port, 18 bus, 2 medical imaging device, 3 LAN, 4 image database, 100 X-ray imaging apparatus, 101 X-ray tube, 102 X-ray detector, 103 image processing unit, 104 display unit

Abstract

The present invention relates to a medical image processing device that aligns medical images independently of, for example, the mechanical precision with which a detector is moved. To create a long image from a plurality of original images captured so as to overlap an identical region of a subject, a reference image (50) is cut out of one of the original images, and regions having the same shape as the reference image (50) are cut out of another of the original images to obtain a plurality of comparison images (52a, 52b, 52c, 52d). The differences between the reference image (50) and each of the comparison images (52a, 52b, 52c, 52d) are then computed. Based on the comparison image with the smallest difference, a relative position is obtained, namely the position on the other original image from which that comparison image was cut out, and the original images are aligned at that position to create the long image.
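The search summarized in the abstract can be sketched as follows; the sum-of-absolute-differences criterion and all names are illustrative assumptions, not details taken from the claims:

```python
import numpy as np

def find_relative_position(ref, other, search_origin, search_h, search_w):
    """Slide over a search range of the other original image, cut out
    comparison images of the same shape as the reference image, and return
    the cut-out position (row, col) with the smallest absolute difference."""
    rh, rw = ref.shape
    oy, ox = search_origin
    best, best_pos = None, None
    for dy in range(search_h):
        for dx in range(search_w):
            comp = other[oy + dy:oy + dy + rh, ox + dx:ox + dx + rw]
            diff = np.abs(ref.astype(np.float64) - comp).sum()
            if best is None or diff < best:
                best, best_pos = diff, (oy + dy, ox + dx)
    return best_pos
```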
PCT/JP2010/067729 2009-10-09 2010-10-08 Medical image processing device, radiographic image capture device, medical image processing program, and medical image processing method WO2011043458A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201080045450.2A CN102596035B (zh) 2009-10-09 2010-10-08 Medical image processing device, X-ray imaging device, medical image processing program, and medical image processing method
JP2011535476A JPWO2011043458A1 (ja) 2009-10-09 2010-10-08 Medical image processing device, X-ray imaging device, medical image processing program, and medical image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009235272 2009-10-09
JP2009-235272 2009-10-09

Publications (1)

Publication Number Publication Date
WO2011043458A1 true WO2011043458A1 (fr) 2011-04-14

Family

ID=43856907

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/067729 WO2011043458A1 (fr) Medical image processing device, radiographic image capture device, medical image processing program, and medical image processing method

Country Status (3)

Country Link
JP (1) JPWO2011043458A1 (fr)
CN (1) CN102596035B (fr)
WO (1) WO2011043458A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014054417A1 (fr) * 2012-10-02 2014-04-10 株式会社島津製作所 Dispositif de radiographie
JP2016059611A (ja) * 2014-09-18 2016-04-25 富士フイルム株式会社 放射線画像撮影システム、放射線画像撮影装置、制御装置、及び合成放射線画像生成方法
JP2016140510A (ja) * 2015-01-30 2016-08-08 キヤノン株式会社 放射線撮影装置、制御装置、長尺撮影システム、制御方法、及びプログラム
JP2018019874A (ja) * 2016-08-03 2018-02-08 キヤノン株式会社 放射線撮影装置、放射線撮影システム、放射線撮影方法、及びプログラム
US10380718B2 (en) 2015-05-27 2019-08-13 Samsung Electronics Co., Ltd. Method and apparatus for displaying medical image

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9727964B2 (en) * 2013-08-08 2017-08-08 Shimadzu Corporation Image processing device
CN109106392A 2014-06-11 2019-01-01 佳能株式会社 Image display device, display control device, and display control method
JP6431292B2 * 2014-06-11 2018-11-28 キヤノン株式会社 Medical image display device and control method therefor, control device, and program
JP6877109B2 2016-04-13 2021-05-26 キヤノン株式会社 Image processing apparatus, image processing method, and program
EP3236418B1 (fr) * 2016-04-13 2020-10-28 Canon Kabushiki Kaisha Appareil et procédé de traitement d'image et support d'informations

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006175213A (ja) * 2004-11-24 2006-07-06 Toshiba Corp 3次元画像処理装置
JP2008067916A (ja) * 2006-09-14 2008-03-27 Hitachi Medical Corp 医用画像処理装置
JP2009136421A (ja) * 2007-12-05 2009-06-25 Toshiba Corp X線診断装置及びx線画像処理方法及び記憶媒体

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001291088A * 2000-04-07 2001-10-19 Ge Yokogawa Medical Systems Ltd Medical image display device
US20020159564A1 * 2001-04-30 2002-10-31 Eastman Kodak Company Method for acquiring a radiation image of a long body part using direct digital x-ray detectors
JP3743404B2 * 2002-07-18 2006-02-08 ノーリツ鋼機株式会社 Image processing method, image processing program, and recording medium on which the image processing program is recorded
JP4230731B2 * 2002-07-29 2009-02-25 株式会社東芝 Digital image processing apparatus and X-ray diagnostic apparatus
JP4533587B2 * 2003-02-07 2010-09-01 株式会社東芝 Medical image joining apparatus
CN100338631C * 2003-07-03 2007-09-19 马堃 Method for on-site panoramic imaging with a digital imaging device
US7756324B2 * 2004-11-24 2010-07-13 Kabushiki Kaisha Toshiba 3-dimensional image processing apparatus
JP4807824B2 * 2005-07-07 2011-11-02 株式会社日立メディコ Medical image diagnostic system

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014054417A1 (fr) * 2012-10-02 2014-04-10 株式会社島津製作所 Dispositif de radiographie
JPWO2014054417A1 (ja) * 2012-10-02 2016-08-25 株式会社島津製作所 X線撮影装置
JP2016059611A (ja) * 2014-09-18 2016-04-25 富士フイルム株式会社 放射線画像撮影システム、放射線画像撮影装置、制御装置、及び合成放射線画像生成方法
JP2016140510A (ja) * 2015-01-30 2016-08-08 キヤノン株式会社 放射線撮影装置、制御装置、長尺撮影システム、制御方法、及びプログラム
US10380718B2 (en) 2015-05-27 2019-08-13 Samsung Electronics Co., Ltd. Method and apparatus for displaying medical image
JP2018019874A (ja) * 2016-08-03 2018-02-08 キヤノン株式会社 放射線撮影装置、放射線撮影システム、放射線撮影方法、及びプログラム

Also Published As

Publication number Publication date
CN102596035B (zh) 2014-12-10
JPWO2011043458A1 (ja) 2013-03-04
CN102596035A (zh) 2012-07-18

Similar Documents

Publication Publication Date Title
WO2011043458A1 (fr) Medical image processing device, radiographic image capture device, medical image processing program, and medical image processing method
US7933376B2 X-ray CT system and a method for creating a scanning plan
US7426319B2 Image processing apparatus, image processing method, image processing system, program, and storage medium
US20140378828A1 Hip arthroplasty method and workflow
JP4104054B2 (ja) Image registration apparatus and image processing apparatus
WO2015165222A1 (fr) Panoramic image acquisition method and device
JP2018064627A (ja) Radiographic apparatus, radiographic system, radiographic method, and program
JP2021524631A (ja) Medical image conversion
JP2015208539A (ja) Image display device, image display method, and program
US10650561B2 Composite radiographic image that corrects effects of parallax distortion
US9582892B2 Radiation imaging apparatus, radiation imaging method, and program
CN111160326B (zh) CT scan panoramic real-time monitoring method and system
US10413364B1 Internal organ localization of a subject for providing assistance during surgery
US20190051042A1 Ceiling map building method, ceiling map building device, and ceiling map building program
KR102453897B1 (ko) Apparatus and method for reconstructing three-dimensional oral scan data using computed tomography images
JP2022094744A (ja) Subject motion measurement apparatus, subject motion measurement method, program, and imaging system
US9727965B2 Medical image processing apparatus and medical image processing method
JP5305687B2 (ja) X-ray moving image capturing system
CN113140031B (zh) Three-dimensional image modeling system and method, and oral scanning device using the same
JP2005305048A (ja) Patient positioning apparatus and patient positioning method
JP2001291087A (ja) Image registration method and registration apparatus
JP2012130518A (ja) Image processing apparatus, image processing program, and X-ray diagnostic imaging apparatus
JP6991751B2 (ja) Radiographic system, radiographic method, and program
CN115880469B (zh) Method for registering surface point cloud data with a three-dimensional image
US20240148351A1 (en) Image-based planning of tomographic scan

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201080045450.2

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10822128

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011535476

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10822128

Country of ref document: EP

Kind code of ref document: A1