WO2015146166A1 - Radiographic image capturing apparatus, method, and program - Google Patents

Radiographic image capturing apparatus, method, and program

Info

Publication number
WO2015146166A1
Authority
WO
WIPO (PCT)
Prior art keywords
radiation source
image
radiation
subject
unit
Prior art date
Application number
PCT/JP2015/001694
Other languages
English (en)
Japanese (ja)
Inventor
順也 森田
Original Assignee
富士フイルム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社
Publication of WO2015146166A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 - Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/50 - Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B 6/502 - Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of breast, i.e. mammography
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 - Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/02 - Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B 6/025 - Tomosynthesis
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 - Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/46 - Arrangements for interfacing with the operator or the patient
    • A61B 6/461 - Displaying means of special interest
    • A61B 6/463 - Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 - Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/52 - Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/5205 - Devices using data or image processing specially adapted for radiation diagnosis involving processing of raw data to produce diagnostic data
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 - Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/52 - Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/5258 - Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise
    • A61B 6/5264 - Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise due to motion

Definitions

  • The present invention relates to a radiographic image capturing apparatus, method, and program for acquiring a plurality of projection images by imaging a subject at each of a plurality of radiation source positions in order to generate a tomographic image.
  • In order to observe an affected area in more detail, tomosynthesis imaging has been proposed, in which a radiation source is moved so as to irradiate the subject with radiation from different angles and the plurality of projection images acquired in this way are added together to obtain a tomographic image in which a desired tomographic plane is emphasized.
  • In tomosynthesis imaging, depending on the characteristics of the imaging apparatus and the required tomographic image, the radiation source is moved parallel to the X-ray detector or along a circular or elliptical arc, a plurality of projection images of the subject are acquired at different irradiation angles, and these projection images are reconstructed to generate a tomographic image.
  • The position of the radiation source in each exposure (hereinafter referred to as the radiation source position) is obtained by equally dividing the moving range of the radiation source according to the number of exposures, or is calculated by calibration in which an object having known three-dimensional coordinates is imaged, and a method of reconstructing the plurality of projection images using information on the radiation source positions obtained in this way has been proposed.
  • In Patent Document 1, a plurality of projection images including a marker image are acquired by attaching a marker to the subject, or to the imaging table on which the subject is placed, and imaging the marker together with the subject.
  • In Patent Document 1, an accurate radiation source position and marker position are calculated for each projection image using the marker position information, and the projection images are reconstructed using the calculated radiation source positions and marker positions. As a result, the influence of displacement of the radiation source position can be eliminated.
  • A technique has also been proposed in which an accurate radiation source position is calculated using a characteristic structure included in the projection images, without using a marker, and the projection images are aligned (see Patent Document 2).
  • The technique described in Patent Document 2 uses a three-dimensional model of predetermined anatomical features (for example, bones, calcifications, organ reference points, and blood vessel branch points) to identify the positions of the patient's anatomical features in the projection images, estimates the positions of the anatomical features in the projection images based on information on the geometric configuration of the imaging apparatus at the time of imaging, and calculates a projection error based on the difference between the estimated positions and the identified positions of the anatomical features.
  • An accurate radiation source position is calculated based on this projection error, and the projection images are reconstructed.
  • The present invention has been made in view of the above circumstances, and its object is to enable accurate reconstruction of projection images without using a marker when a plurality of projection images are acquired at a plurality of radiation source positions, as in tomosynthesis imaging.
  • A radiographic imaging apparatus according to the present invention includes: a radiation source for irradiating a subject with radiation; detection means for detecting radiation transmitted through the subject; image acquisition means for acquiring a plurality of projection images respectively corresponding to a plurality of radiation source positions by moving the radiation source relative to the detection means and irradiating the subject with radiation at the plurality of radiation source positions reached by this movement; reconstruction means for reconstructing the plurality of projection images to generate a provisional tomographic image; feature point detection means for detecting a plurality of anatomical feature points from the provisional tomographic image; corresponding point determination means for determining, in each of the plurality of projection images, a plurality of corresponding points corresponding to the plurality of feature points; and estimation means for estimating a geometric correspondence between the radiation source positions and the projection images based on the plurality of corresponding points.
  • Here, "moving the radiation source relative to the detection means" includes both the case in which the detection means is fixed and only the radiation source is moved and the case in which the detection means and the radiation source are moved in synchronization.
  • In the radiographic imaging apparatus according to the present invention, the reconstruction means may generate a tomographic image of the subject based on the estimated geometric correspondence.
  • The reconstruction means may generate the provisional tomographic image using information on provisional radiation source positions, and the corresponding point determination means may determine the corresponding points using the information on the provisional radiation source positions.
  • The corresponding point determination means may calculate provisional corresponding points in the plurality of projection images using the information on the provisional radiation source positions, and may analyze each projection image to correct the provisional corresponding points and thereby determine the corresponding points.
  • The corresponding point determination means may determine an analysis range in each projection image based on the position of the provisional corresponding point in that projection image, and may analyze each projection image within the analysis range.
  • The reconstruction means may generate the provisional tomographic image on a predetermined tomographic plane of the subject.
  • The reconstruction means may acquire information on the compression thickness of the subject and determine the predetermined tomographic plane based on the compression thickness information.
  • The estimation means may estimate, as the geometric correspondence, the three-dimensional coordinates of the radiation source positions at the time of imaging.
  • The estimation means may estimate, as the geometric correspondence, the three-dimensional coordinates of the positions of objects in the subject corresponding to the plurality of feature points.
  • A radiographic imaging method according to the present invention is a method performed in a radiographic imaging apparatus comprising a radiation source for irradiating a subject with radiation and detection means for detecting radiation transmitted through the subject, the method comprising: moving the radiation source relative to the detection means and irradiating the subject with radiation at a plurality of radiation source positions reached by this movement to acquire a plurality of projection images respectively corresponding to the plurality of radiation source positions; reconstructing the plurality of projection images to generate a provisional tomographic image; detecting a plurality of anatomical feature points from the provisional tomographic image; determining, in each of the plurality of projection images, a plurality of corresponding points corresponding to the plurality of feature points; and estimating a geometric correspondence between the radiation source positions and the projection images based on the plurality of corresponding points.
  • According to the present invention, anatomical feature points are detected from a provisional tomographic image generated by reconstructing the projection images.
  • Since the provisional tomographic image is obtained by reconstructing a plurality of projection images, the anatomical feature points included in the provisional tomographic image are also included in the projection images.
  • Moreover, because the projection images are added together during reconstruction, the provisional tomographic image has less noise than the individual projection images. For this reason, the feature points can be detected accurately from the provisional tomographic image, and as a result the corresponding points can be detected accurately in the projection images. Therefore, the geometric correspondence between the radiation source positions and the projection images can be estimated accurately based on the corresponding points.
  • FIG. 1 is a schematic configuration diagram of a radiographic imaging apparatus according to an embodiment of the present invention.
  • The remaining drawings include: a view of the radiographic imaging apparatus seen from the direction of arrow A in FIG. 1; a schematic block diagram showing the configuration of the computer; an illustration for explaining tomosynthesis imaging; a diagram for explaining the determination of provisional corresponding points; a diagram for explaining the detection of feature points in the provisional tomographic image; a diagram for explaining the determination of corresponding points in a projection image; and a diagram for explaining the deviation of the radiation source position and of the object position corresponding to a feature point.
  • FIG. 1 is a schematic configuration diagram of a radiographic image capturing apparatus according to an embodiment of the present invention.
  • FIG. 2 is a view of the radiographic image capturing apparatus seen from the direction of arrow A in FIG. 1.
  • The radiographic image capturing apparatus 1 is a mammography imaging apparatus that images a breast M (hereinafter also referred to as the subject M) from different imaging directions and acquires a plurality of radiographic images.
  • The radiographic image capturing apparatus 1 includes an imaging unit 10, a computer 2 connected to the imaging unit 10, and a monitor 3 and an input unit 4 connected to the computer 2.
  • The imaging unit 10 includes an arm unit 12 connected to a base (not shown) by a rotary shaft 11.
  • An imaging table 13 is attached to one end of the arm unit 12, and a radiation irradiation unit 14 is attached to the other end so as to face the imaging table 13.
  • The arm unit 12 is configured such that only the end to which the radiation irradiation unit 14 is attached can be rotated, so that the radiation irradiation unit 14 alone can be rotated while the imaging table 13 remains fixed. The rotation of the arm unit 12 is controlled by the computer 2.
  • A radiation detector 15 such as a flat panel detector is provided in the imaging table 13.
  • The imaging table 13 also houses a circuit board provided with a charge amplifier that converts the charge signal read from the radiation detector 15 into a voltage signal, a correlated double sampling circuit that samples the voltage signal output from the charge amplifier, and an AD conversion unit that converts the voltage signal into a digital signal.
  • The radiation detector 15 can repeatedly record and read out radiographic images. It may be a so-called direct-type radiation detector that receives radiation directly and generates charge, or a so-called indirect-type radiation detector that first converts radiation into visible light and then converts the visible light into a charge signal.
  • As the readout method for the radiographic image signal, a TFT readout method in which the signal is read out by switching TFT (thin film transistor) switches on and off, or an optical readout method in which the signal is read out by irradiation with reading light, may be used.
  • The X-ray source 16 (radiation source) is housed inside the radiation irradiation unit 14.
  • The computer 2 controls the timing of X-ray irradiation from the X-ray source 16 and the X-ray generation conditions (tube current, exposure time, tube current-time product, and the like) of the X-ray source 16.
  • The arm unit 12 is further provided with a compression plate 17 disposed above the imaging table 13 to press and compress the breast M, a support portion 18 that supports the compression plate 17, and a moving mechanism 19 that moves the support portion 18 in the vertical direction in FIGS. 1 and 2.
  • The display unit 3 is a display device such as a CRT or a liquid crystal monitor. As described later, it displays the projection images acquired by the image acquisition unit 22 and the tomographic images reconstructed by the reconstruction unit 23, as well as messages necessary for operation.
  • The display unit 3 may include a speaker that outputs sound.
  • The input unit 4 includes a keyboard, a mouse, or a touch-panel input device, and accepts operation of the radiographic image capturing apparatus 1 by an operator. It also accepts input of various information, such as the imaging conditions necessary for performing tomosynthesis imaging, and instructions to correct that information. In the present embodiment, each unit of the radiographic image capturing apparatus 1 operates in accordance with the information input by the operator from the input unit 4.
  • The computer 2 includes a central processing unit (CPU) and a storage device such as a semiconductor memory, a hard disk, or an SSD.
  • In the computer 2, the control unit 21, the image acquisition unit 22, the reconstruction unit 23, the feature point detection unit 24, the corresponding point determination unit 25, the estimation unit 26, and the storage unit 27 shown in FIG. 3 are configured.
  • The control unit 21 controls the entire apparatus, for example by driving each unit of the radiographic image capturing apparatus 1.
  • The image acquisition unit 22 moves the X-ray source 16 by rotating the arm unit 12 about the rotary shaft 11, irradiates the breast M, which is the subject, with X-rays at a plurality of radiation source positions reached by this movement, and thereby acquires a plurality of projection images respectively corresponding to the radiation source positions.
  • The reconstruction unit 23 reconstructs the plurality of projection images acquired by the image acquisition unit 22 to generate a tomographic image showing a desired tomographic plane of the subject. A method for generating the tomographic image will be described below.
  • Assume that projection images G1, G2, ..., Gn are obtained by imaging the subject M at different irradiation angles from radiation source positions S1, S2, ..., Sn. For example, when objects T1 and T2 located at different depths are projected from the radiation source position S1, they appear in the projection image G1 at positions PT11 and PT12, and when they are projected from the radiation source position S2, they appear in the projection image G2 at positions PT21 and PT22. When projection is repeated from the different radiation source positions S1, S2, ..., Sn in this way, the object T1 is projected to the positions PT11, PT21, ..., PTn1, and the object T2 is projected to the positions PT12, PT22, ..., PTn2.
  • The pixel value of the tomographic image corresponding to the object T1 can therefore be calculated by adding the pixel value at PT11 in the projection image G1, the pixel value at PT21 in the projection image G2, ..., and the pixel value at PTn1 in the projection image Gn. Similarly, the pixel value of the tomographic image corresponding to the object T2 can be calculated by adding the pixel value at PT12 in the projection image G1, the pixel value at PT22 in the projection image G2, ..., and the pixel value at PTn2 in the projection image Gn. In this way, a tomographic image in which the tomographic plane at a desired position is emphasized can be obtained by adding, for each pixel of the tomographic image, the pixel values at the corresponding positions in the projection images G1, G2, ..., Gn, as illustrated by the sketch below.
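  • The shift-and-add reconstruction described above can be sketched in a few lines of Python. This is not the patent's implementation; it is a minimal illustrative example that assumes the detector lies in the plane z = 0, that the radiation source positions and the target plane height z_plane are given in the same physical units as the pixel pitch, and that nearest-neighbour sampling is sufficient. The function name and parameters are hypothetical.

      import numpy as np

      def shift_and_add(projections, sources, z_plane, pixel_pitch):
          # projections: list of 2-D arrays G1..Gn acquired at the given source positions
          # sources:     list of (sx, sy, sz) source coordinates; detector assumed at z = 0
          # z_plane:     height of the tomographic plane to be emphasized
          # pixel_pitch: detector pixel size, same unit as the coordinates
          recon = np.zeros_like(projections[0], dtype=float)
          rows, cols = recon.shape
          # Physical (x, y) coordinates of each reconstruction pixel on the plane z = z_plane.
          ys, xs = np.mgrid[0:rows, 0:cols]
          xs = xs * pixel_pitch
          ys = ys * pixel_pitch
          for G, (sx, sy, sz) in zip(projections, sources):
              # Project each plane point along the ray from the source onto the detector (z = 0).
              t = sz / (sz - z_plane)
              px = (sx + t * (xs - sx)) / pixel_pitch
              py = (sy + t * (ys - sy)) / pixel_pitch
              # Nearest-neighbour sampling of the projection image at the projected positions.
              px = np.clip(np.round(px).astype(int), 0, cols - 1)
              py = np.clip(np.round(py).astype(int), 0, rows - 1)
              recon += G[py, px]
          return recon / len(projections)

    Averaging rather than simply summing keeps the pixel scale comparable to that of a single projection; the patent itself only requires that the values be added.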
  • In the present embodiment, three-dimensional position coordinates are calculated for each radiation source position by imaging an object having known three-dimensional coordinates and performing calibration, and these coordinates are stored in the storage unit 27. Using the three-dimensional position coordinates of the radiation source positions stored in the storage unit 27, the reconstruction unit 23 calculates, for each pixel of the tomographic image, the projection position corresponding to each radiation source position in each projection image, and acquires the tomographic image by adding the pixel values at the corresponding projection positions.
  • The feature point detection unit 24 detects a plurality of anatomical feature points from the tomographic image generated by the reconstruction unit 23.
  • In the present embodiment, the tomographic image used for detecting the feature points is a tomographic image of a predetermined tomographic plane in the subject M. Since this tomographic image is not the final tomographic image used for diagnosis, it is referred to as the provisional tomographic image DG0.
  • The predetermined tomographic plane may be a tomographic plane designated by the operator from the input unit 4, or a tomographic plane preset in the apparatus 1, such as the central tomographic plane of the reconstruction range of the subject M. In particular, when the subject is the breast M, the breast M is compressed by the compression plate 17 at the time of imaging, as shown in FIGS. 1 and 2, and information on the compression thickness, that is, the thickness of the compressed breast M, is acquired. A tomographic image of the tomographic plane located at one half of the compression thickness may therefore be acquired as the provisional tomographic image DG0.
  • As the method for detecting the feature points, any method can be used. For example, as in the Harris method, a corner likelihood may be calculated based on the eigenvalues of a 2 × 2 matrix whose elements are obtained by Gaussian smoothing of the first derivatives of the provisional tomographic image DG0, and corner points may be detected as feature points based on the corner likelihood; alternatively, as in the KLT method, points whose eigenvalues satisfy a predetermined condition may be detected. A Harris-style sketch is shown below.
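  • As a concrete illustration of the Harris-style detection mentioned above, the following sketch computes a corner response from Gaussian-smoothed products of the first derivatives of the provisional tomographic image DG0. The smoothing width, the constant k, and the number of returned points are arbitrary assumptions for illustration, not values taken from the patent, and no non-maximum suppression is applied.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def harris_feature_points(dg0, sigma=2.0, k=0.04, n_points=50):
          # First derivatives of the provisional tomographic image.
          iy, ix = np.gradient(dg0.astype(float))
          # Gaussian-smoothed elements of the 2 x 2 structure matrix.
          ixx = gaussian_filter(ix * ix, sigma)
          iyy = gaussian_filter(iy * iy, sigma)
          ixy = gaussian_filter(ix * iy, sigma)
          # Harris corner response: det(M) - k * trace(M)^2.
          response = (ixx * iyy - ixy ** 2) - k * (ixx + iyy) ** 2
          # Keep the n_points strongest responses as feature points, returned as (row, col).
          flat = np.argsort(response, axis=None)[::-1][:n_points]
          return np.column_stack(np.unravel_index(flat, response.shape))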
  • The corresponding point determination unit 25 determines, in each of the plurality of projection images Gi, a plurality of corresponding points corresponding to the plurality of feature points detected by the feature point detection unit 24. For this purpose, the corresponding point determination unit 25 first detects provisional corresponding points corresponding to the feature points in the plurality of projection images Gi.
  • FIG. 5 is a diagram for explaining the determination of the provisional corresponding points. As in the generation of the tomographic image, assume that a plurality of projection images G1, G2, ..., Gn are obtained by imaging the subject M with the X-ray source 16 at different irradiation angles from the radiation source positions S1, S2, ..., Sn.
  • When the feature points C1 and C2 located on the tomographic plane D0 are projected from the radiation source position S1, they appear in the projection image G1 at positions P11 and P12, and when they are projected from the radiation source position S2, they appear in the projection image G2 at positions P21 and P22.
  • In this way, the feature point C1 is projected to the positions P11, P21, ..., Pn1, and the feature point C2 is projected to the positions P12, P22, ..., Pn2.
  • The corresponding point determination unit 25 sets the projection positions P11, P21, ..., Pn1 and P12, P22, ..., Pn2 of the feature points C1 and C2 in each projection image Gi as the provisional corresponding points of the feature points C1 and C2.
  • The position of the provisional corresponding point detected in each projection image Gi should coincide with the same position on the structure that is included in both the provisional tomographic image DG0 and the projection image Gi. For example, as shown in FIG. 6, when the feature points C1 and C2 are detected at corners of the structure 30 in the provisional tomographic image DG0, the corresponding points should be detected at the corners of the same structure in the projection image Gi.
  • However, the radiation source positions used when projecting the provisional corresponding points are those calculated by calibration and stored in the storage unit 27, and they deviate from the radiation source positions at which the projection images were actually acquired. For this reason, as shown in FIG. 7, the provisional corresponding points Pi1 and Pi2 detected in the projection image Gi may not lie at the corresponding positions on the structure portion 31, which is the same structure as the structure 30 included in the provisional tomographic image DG0.
  • Therefore, the corresponding point determination unit 25 sets analysis ranges 32 and 33 of a predetermined size centered on the provisional corresponding points in the projection image Gi, and searches within the analysis ranges 32 and 33 for the corresponding points that correspond to the feature points C1 and C2. Specifically, regions of a predetermined size centered on the feature points C1 and C2 are cut out from the provisional tomographic image DG0 as templates, and template matching is performed between the cut-out templates and the analysis ranges 32 and 33 to search for the points corresponding to the feature points C1 and C2. The corresponding point determination unit 25 then determines the points found in this way as the final corresponding points. A sketch of this refinement step is shown below.
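  • The refinement by template matching can be sketched as follows. The normalized cross-correlation criterion, the window size, and the handling of image borders are illustrative assumptions; the patent only specifies that a template cut out around each feature point is matched against the analysis range.

      import numpy as np

      def refine_corresponding_point(projection, template, provisional_pt, search_radius=20):
          # projection:     projection image Gi
          # template:       odd-sized patch cut out of DG0 around one feature point
          # provisional_pt: (row, col) of the provisional corresponding point in Gi
          # search_radius:  half-size of the analysis range centred on the provisional point
          th, tw = template.shape
          r0, c0 = provisional_pt
          t = (template - template.mean()) / (template.std() + 1e-12)
          best_score, best_pt = -np.inf, provisional_pt
          for dr in range(-search_radius, search_radius + 1):
              for dc in range(-search_radius, search_radius + 1):
                  r, c = r0 + dr, c0 + dc
                  patch = projection[r - th // 2: r + th // 2 + 1,
                                     c - tw // 2: c + tw // 2 + 1]
                  if patch.shape != template.shape:
                      continue  # the analysis window extends beyond the image border
                  p = (patch - patch.mean()) / (patch.std() + 1e-12)
                  score = np.mean(t * p)  # normalized cross-correlation
                  if score > best_score:
                      best_score, best_pt = score, (r, c)
          return best_pt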
  • In FIG. 7, the provisional corresponding points Pi1 and Pi2 detected in the projection image Gi do not lie at the corresponding positions on the structure portion 31, which is the same structure as the structure 30 included in the provisional tomographic image DG0. However, the provisional corresponding points lie very close to the actual corresponding points. For this reason, by searching only within the analysis ranges 32 and 33, the final corresponding points can be determined with a small computation time.
  • The X-ray source 16 does not actually move exactly along the calculated movement path, but moves with a mechanical error.
  • As a result, the actual radiation source position at the time of imaging (broken line) is shifted from the radiation source position calculated by calibration (solid line).
  • In reconstruction, the alignment of the projection images is performed on the assumption that imaging was carried out at the radiation source positions calculated by calibration. For this reason, when the X-ray source 16 deviates from the calibrated radiation source positions, the projection positions of an object cannot be aligned accurately, and as a result a tomographic image cannot be generated with high accuracy.
  • In the present embodiment, therefore, a plurality of feature points are detected from the provisional tomographic image DG0, and the projection images are aligned using the positions of these feature points together with the corresponding points determined in the projection images Gi; without this, the projection positions of the object could not be aligned accurately and a tomographic image could not be generated accurately.
  • The estimation unit 26 estimates the geometric correspondence between the radiation source positions and the projection images based on the corresponding points determined in the projection images Gi. Specifically, it estimates the three-dimensional position coordinates of the objects in the subject M corresponding to the feature points detected in the provisional tomographic image DG0 and the three-dimensional position coordinates of the radiation source positions.
  • The position coordinates (px, py) of a corresponding point in the projection image are expressed by the following formula (1). Here, the z axis is set perpendicular to the detection surface of the radiation detector 15, the y axis is set on the detection surface of the radiation detector 15 parallel to the direction in which the X-ray source 16 moves, and the x axis is set in the direction perpendicular to the y axis.
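  • Formula (1) itself is not reproduced in this text. Under the coordinate system just described, with the detector surface taken as the plane z = 0, a radiation source at (sx, sy, sz) and an object at (mx, my, mz), a plausible form of the perspective projection it denotes is sketched below; this is an assumption made for illustration, not a quotation of the patent's formula.

      def project(source, obj):
          # source: (sx, sy, sz) radiation source position
          # obj:    (mx, my, mz) position of an object inside the subject
          # Returns the detector coordinates (px, py) where the ray from the source
          # through the object intersects the detector plane z = 0.
          sx, sy, sz = source
          mx, my, mz = obj
          t = sz / (sz - mz)          # ray parameter at which z reaches 0
          px = sx + t * (mx - sx)     # equivalently (sz * mx - mz * sx) / (sz - mz)
          py = sy + t * (my - sy)     # equivalently (sz * my - mz * sy) / (sz - mz)
          return px, py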
  • The estimation unit 26 defines a projection error E between the positions of the corresponding points calculated by formula (1) and the position coordinates (px', py') of the actual corresponding points determined in the projection images, and optimizes the position coordinates (sxi, syi, szi) (i = 1 to n) of the plurality of radiation source positions and the position coordinates (mxj, myj, mzj) (j = 1 to J, where J is the number of objects, that is, of feature points) of the plurality of objects so that the projection error is minimized.
  • The projection error E of the objects in the subject M over the plurality of radiation source positions is expressed by the following equation (2).
  • The estimation unit 26 first fixes the radiation source positions at the initial values calculated by calibration, optimizes equation (2) using a known optimization method such as the steepest descent method or the conjugate gradient method so that the projection error E is minimized, and thereby estimates the position coordinates (mxj, myj, mzj) of the objects. Then, using the estimated position coordinates (mxj, myj, mzj) of the objects and the radiation source positions calculated by calibration as initial values, equation (2) is optimized again so that the projection error E is minimized, and the position coordinates (sxi, syi, szi) of the radiation source positions and the position coordinates (mxj, myj, mzj) of the objects are estimated.
  • Alternatively, the position coordinates (sxi, syi, szi) of the radiation source positions may be estimated first so as to minimize the projection error, the three-dimensional position coordinates (mxj, myj, mzj) of the objects may then be estimated so as to minimize the projection error, and these estimations may be repeated so that the radiation source positions and the object positions are estimated alternately. Conversely, the three-dimensional position coordinates (mxj, myj, mzj) of the objects may be estimated first and the position coordinates (sxi, syi, szi) of the radiation source positions next, again alternating between the two.
  • The process may be terminated when a set number of iterations is reached, and the radiation source positions and the three-dimensional coordinates of the objects at that point may be output as the estimation results. Alternatively, the process may be terminated when the projection error converges, that is, when it no longer decreases even if the optimization is repeated, or when the projection error falls below a predetermined threshold value, and the radiation source positions and the object position coordinates at that point may be output as the estimation results. An illustrative sketch of this alternating estimation follows.
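  • The alternating estimation can be sketched as follows, assuming that equation (2) is the sum of squared differences between the corresponding-point positions predicted by formula (1) and the measured positions (px', py'). The use of scipy.optimize.least_squares, the fixed number of outer iterations, and the project helper (the hypothetical projection sketched earlier) are illustrative choices, not the patent's specified solver.

      import numpy as np
      from scipy.optimize import least_squares

      def estimate_geometry(observed, sources0, objects0, n_iter=5):
          # observed: array (n, J, 2) of measured corresponding-point coordinates (px', py')
          # sources0: array (n, 3) of calibrated source positions, used as initial values
          # objects0: array (J, 3) of initial 3-D object (feature point) positions
          sources, objects = sources0.copy(), objects0.copy()

          def residuals(flat, fixed, optimize_sources):
              s = flat.reshape(-1, 3) if optimize_sources else fixed
              m = fixed if optimize_sources else flat.reshape(-1, 3)
              res = []
              for i, src in enumerate(s):
                  for j, obj in enumerate(m):
                      px, py = project(src, obj)  # assumed form of formula (1)
                      res.extend([px - observed[i, j, 0], py - observed[i, j, 1]])
              return np.asarray(res)

          for _ in range(n_iter):
              # Fix the source positions and estimate the object coordinates.
              fit = least_squares(residuals, objects.ravel(), args=(sources, False))
              objects = fit.x.reshape(-1, 3)
              # Fix the object coordinates and estimate the source positions.
              fit = least_squares(residuals, sources.ravel(), args=(objects, True))
              sources = fit.x.reshape(-1, 3)
          return sources, objects

    In practice the loop would also be stopped early when the residual norm converges or falls below a threshold, mirroring the termination criteria described above.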
  • It is preferable that the calculated position coordinates of the radiation source positions be smoothed as necessary, for example by a spline interpolation calculation.
  • The processing of the control unit 21, the image acquisition unit 22, the reconstruction unit 23, the feature point detection unit 24, the corresponding point determination unit 25, and the estimation unit 26 is performed by the central processing unit using a computer program stored in the storage unit 27.
  • Alternatively, the computer 2 may be provided with a plurality of processing devices that perform the processing of these units.
  • FIG. 9 is a flowchart showing processing performed in the present embodiment.
  • The control unit 21 starts the processing, tomosynthesis imaging is performed while the X-ray source 16 is moved (step ST1), and the image acquisition unit 22 acquires a plurality of projection images (step ST2).
  • The reconstruction unit 23 then generates the provisional tomographic image DG0 (step ST3), and the feature point detection unit 24 detects a plurality of feature points from the provisional tomographic image DG0 (step ST4).
  • The corresponding point determination unit 25 determines the corresponding points corresponding to the feature points in the projection images (step ST5), and the estimation unit 26 estimates the three-dimensional coordinates of the radiation source positions and the three-dimensional coordinates of the objects corresponding to the feature points detected in the provisional tomographic image DG0 (geometric correspondence estimation, step ST6).
  • Finally, the reconstruction unit 23 reconstructs the plurality of projection images using the estimated three-dimensional coordinates of the radiation source positions and of the objects to generate a tomographic image (step ST7), and the processing ends.
  • The generated tomographic image is stored in the storage unit 27 or transmitted to an external server via a network. The overall flow is summarized in the sketch below.
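  • For orientation, the sketches above can be chained into an end-to-end flow corresponding roughly to steps ST3 to ST7. The wiring below (initializing each object at the height of the provisional tomographic plane, and reusing shift-and-add with the re-estimated source positions for the final reconstruction) is a simplification assumed for illustration and is not the apparatus's actual processing.

      import numpy as np

      def tomosynthesis_pipeline(projections, calibrated_sources, z_plane, pixel_pitch, patch=15):
          # ST3: provisional tomographic image DG0 with the calibrated geometry.
          dg0 = shift_and_add(projections, calibrated_sources, z_plane, pixel_pitch)
          # ST4: anatomical feature points in DG0, as (row, col) indices.
          feats = harris_feature_points(dg0)
          # ST5: provisional corresponding points by projection, refined by template matching.
          observed = np.zeros((len(projections), len(feats), 2))
          objects0 = np.array([[c * pixel_pitch, r * pixel_pitch, z_plane] for r, c in feats])
          for i, (G, src) in enumerate(zip(projections, calibrated_sources)):
              for j, ((r, c), obj) in enumerate(zip(feats, objects0)):
                  px, py = project(src, obj)
                  prov = (int(round(py / pixel_pitch)), int(round(px / pixel_pitch)))
                  tmpl = dg0[r - patch // 2:r + patch // 2 + 1,
                             c - patch // 2:c + patch // 2 + 1]
                  rr, cc = refine_corresponding_point(G, tmpl, prov)
                  observed[i, j] = (cc * pixel_pitch, rr * pixel_pitch)  # (px', py')
          # ST6: estimate the source and object coordinates from the corresponding points.
          sources, objects = estimate_geometry(observed, np.asarray(calibrated_sources), objects0)
          # ST7: final reconstruction using the estimated source positions.
          return shift_and_add(projections, sources, z_plane, pixel_pitch)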
  • As described above, in the present embodiment, anatomical feature points are detected from the provisional tomographic image DG0.
  • Since the provisional tomographic image DG0 is obtained by the reconstruction unit 23 reconstructing the plurality of projection images Gi, the anatomical feature points included in the provisional tomographic image DG0 are also included in the projection images Gi.
  • Moreover, because the provisional tomographic image is acquired by adding the projection images Gi during reconstruction, it has less noise than the individual projection images Gi. Therefore, the feature points can be detected from the provisional tomographic image with high accuracy, and as a result the corresponding points corresponding to the feature points can be detected from the projection images Gi with high accuracy, so that the geometric correspondence between the radiation source positions and the projection images can be estimated accurately based on the corresponding points.
  • In the above embodiment, the projection images Gi are obtained by imaging without using a marker.
  • However, the projection images Gi may also be obtained by imaging with a marker.
  • In that case, the provisional tomographic image DG0 may be generated by aligning the projection images Gi using the marker images included in the projection images Gi.
  • Since the geometric correspondence between the radiation source positions and the projection images is ultimately estimated using the feature points detected from the provisional tomographic image DG0, body movement that occurs during imaging can also be corrected between the projection images. That is, when body movement occurs during imaging at the individual radiation source positions, the position of the subject image shifts in each projection image Gi; however, the three-dimensional coordinates of the radiation source positions and of the objects estimated in the present embodiment absorb this displacement of the subject between the projection images. Therefore, according to the present embodiment, the body movement of the subject at the time of imaging can be corrected, and a more accurate tomographic image can be acquired.
  • In the above embodiment, only one provisional tomographic image is generated and used for detecting the feature points, but a plurality of provisional tomographic images may be generated.
  • In that case, anatomical feature points may be detected from each of the provisional tomographic images, and corresponding points in the projection images Gi may be determined for the feature points detected from all of the tomographic images.
  • In the above embodiment, tomosynthesis imaging is performed with the breast M as the subject, but the present invention can also be applied to tomosynthesis imaging of subjects other than the breast.
  • The X-ray source 16 and the radiation detector 15 may also be moved in synchronization.
  • In that case, the position information of the radiation detector 15 corresponding to each radiation source position may be reflected in the positions of the objects in the subject M corresponding to the detected feature points, and the geometric correspondence between the radiation source positions and the projection images may then be estimated.
  • In the above embodiment, the radiation source positions are calculated in an imaging apparatus that performs tomosynthesis imaging, but the present invention can be applied to any imaging apparatus that acquires a plurality of projection images by imaging a subject at a plurality of radiation source positions.
  • For example, the present invention can be applied to a transmission imaging apparatus that performs transmission imaging using a contrast agent (for example, gastric imaging using barium) or to an imaging apparatus that performs long-length imaging of the spine while moving the detector and the X-ray source.
  • In the above embodiment, the trajectory of the X-ray source 16 is an arc, but it may instead be a straight line, and the present invention can of course also be applied to a precession trajectory.

Abstract

The object of the present invention is to enable accurate reconstruction of projection images, without the aid of markers, when a plurality of projection images are acquired at a plurality of radiation source positions in tomosynthesis imaging. To this end, an image acquisition unit (22) acquires a plurality of projection images by imaging a subject (M) at each of a plurality of radiation source positions. A reconstruction unit (23) generates a provisional tomographic image, and a feature point detection unit (24) detects a plurality of anatomical feature points from the provisional tomographic image. A corresponding point determination unit (25) determines corresponding points that correspond to the feature points in the projection images, and an estimation unit (26) estimates the geometric correspondence between the radiation source positions and the projection images. The reconstruction unit (23) reconstructs the plurality of projection images based on the estimated geometric correspondence.
PCT/JP2015/001694 2014-03-28 2015-03-25 Dispositif, procédé et programme de capture d'images radiographiques WO2015146166A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014068209A JP2015188604A (ja) 2014-03-28 2014-03-28 放射線画像撮影装置および方法並びにプログラム
JP2014-068209 2014-03-28

Publications (1)

Publication Number Publication Date
WO2015146166A1 true WO2015146166A1 (fr) 2015-10-01

Family

ID=54194710

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/001694 WO2015146166A1 (fr) 2014-03-28 2015-03-25 Dispositif, procédé et programme de capture d'images radiographiques

Country Status (2)

Country Link
JP (1) JP2015188604A (fr)
WO (1) WO2015146166A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3590431A1 (fr) * 2018-07-03 2020-01-08 Fujifilm Corporation Dispositif, procédé et programme d'affichage d'images

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7122886B2 (ja) * 2018-06-25 2022-08-22 富士フイルム株式会社 撮影制御装置、方法およびプログラム
WO2020066109A1 (fr) * 2018-09-27 2020-04-02 富士フイルム株式会社 Dispositif, procédé et programme de génération d'images tomographiques

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003024321A (ja) * 2001-03-13 2003-01-28 Ge Medical Systems Global Technology Co Llc 断層撮影法によって得た像から3次元モデルを再構成するための較正方法
JP2012010892A (ja) * 2010-06-30 2012-01-19 Fujifilm Corp 放射線撮影装置および方法並びにプログラム
JP2012020023A (ja) * 2010-07-16 2012-02-02 Fujifilm Corp 放射線撮影装置および方法並びにプログラム

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3590431A1 (fr) * 2018-07-03 2020-01-08 Fujifilm Corporation Dispositif, procédé et programme d'affichage d'images
US10898145B2 (en) 2018-07-03 2021-01-26 Fujifilm Corporation Image display device, image display method, and image display program

Also Published As

Publication number Publication date
JP2015188604A (ja) 2015-11-02

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15768613

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase
122 Ep: pct application non-entry in european phase

Ref document number: 15768613

Country of ref document: EP

Kind code of ref document: A1