US11961165B2 - Tomographic image generating apparatus, tomographic image generating method, and tomographic image generating program


Info

Publication number
US11961165B2
Authority
US
United States
Prior art keywords
tomographic
positional shift
shift amount
tomographic image
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US17/169,564
Other languages
English (en)
Other versions
US20210166443A1 (en)
Inventor
Junya Morita
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Corp filed Critical Fujifilm Corp
Assigned to FUJIFILM CORPORATION reassignment FUJIFILM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MORITA, JUNYA
Publication of US20210166443A1 publication Critical patent/US20210166443A1/en
Application granted granted Critical
Publication of US11961165B2 publication Critical patent/US11961165B2/en
Legal status: Active (expiration adjusted)

Classifications

    • A HUMAN NECESSITIES
      • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 6/00 Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
            • A61B 6/02 Arrangements for diagnosis sequentially in different planes; stereoscopic radiation diagnosis
              • A61B 6/025 Tomosynthesis
              • A61B 6/03 Computed tomography [CT]
                • A61B 6/032 Transmission computed tomography [CT]
            • A61B 6/04 Positioning of patients; tiltable beds or the like
              • A61B 6/0407 Supports, e.g. tables or beds, for the body or parts of the body
                • A61B 6/0414 Supports, e.g. tables or beds, for the body or parts of the body with compression means
            • A61B 6/44 Constructional features of apparatus for radiation diagnosis
              • A61B 6/4429 Constructional features related to the mounting of source units and detector units
                • A61B 6/4452 The source unit and the detector unit being able to move relative to each other
            • A61B 6/50 Apparatus or devices specially adapted for specific body parts; specially adapted for specific clinical applications
              • A61B 6/502 For diagnosis of breast, i.e. mammography
            • A61B 6/52 Devices using data or image processing specially adapted for radiation diagnosis
              • A61B 6/5205 Involving processing of raw data to produce diagnostic data
              • A61B 6/5258 Involving detection or reduction of artifacts or noise
                • A61B 6/5264 Due to motion
          • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
            • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
              • A61B 8/5215 Involving processing of medical diagnostic data
                • A61B 8/5223 For extracting a diagnostic or physiological parameter from medical diagnostic data
    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 11/00 2D [two-dimensional] image generation
            • G06T 11/003 Reconstruction from projections, e.g. tomography
              • G06T 11/005 Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
          • G06T 7/00 Image analysis
            • G06T 7/0002 Inspection of images, e.g. flaw detection
              • G06T 7/0012 Biomedical image inspection
          • G06T 2207/00 Indexing scheme for image analysis or image enhancement
            • G06T 2207/10 Image acquisition modality
              • G06T 2207/10072 Tomographic images
              • G06T 2207/10112 Digital tomosynthesis [DTS]
            • G06T 2207/30 Subject of image; context of image processing
              • G06T 2207/30004 Biomedical image processing
                • G06T 2207/30068 Mammography; breast
              • G06T 2207/30168 Image quality inspection
          • G06T 2211/00 Image generation
            • G06T 2211/40 Computed tomography

Definitions

  • The present disclosure relates to a tomographic image generating apparatus, a tomographic image generating method, and a tomographic image generating program that acquire a plurality of projection images by imaging a subject at a plurality of radiation source positions and generate tomographic images from the plurality of projection images.
  • In tomosynthesis imaging, imaging is performed by moving a radiation source so as to emit radiation to a subject from a plurality of radiation source positions, and the plurality of projection images acquired by the imaging are added up to generate a tomographic image in which a desired tomographic plane is emphasized.
  • Specifically, a plurality of projection images are acquired by imaging the subject at a plurality of radiation source positions, moving the radiation source either in parallel to a radiation detector or along a circular or elliptical arc, according to the characteristics of the imaging apparatus and the required tomographic images. The projection images are then reconstructed using, for example, a back projection method, such as a simple back projection method or a filtered back projection method, to generate a tomographic image.
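The shift-and-add form of simple back projection mentioned above can be sketched as follows. This is an editorial illustration, not the embodiment of the disclosure; the geometry (a linearly moving source, with the plane height expressed as a fraction of the source-to-detector distance) and all names are assumptions.

```python
import numpy as np

def shift_and_add(projections, source_offsets, plane_height):
    """Reconstruct one tomographic plane by simple back projection
    (shift-and-add). `source_offsets` are lateral source positions
    expressed in detector pixels; `plane_height` is the height of the
    desired tomographic plane as a fraction of the source-to-detector
    distance (0 = detector surface)."""
    recon = np.zeros_like(projections[0], dtype=float)
    for proj, offset in zip(projections, source_offsets):
        # A structure on the chosen plane appears laterally shifted in
        # each projection in proportion to the source offset; undoing
        # that shift makes the plane of interest add up coherently
        # while structures on other planes blur out.
        shift = int(round(offset * plane_height / (1.0 - plane_height)))
        recon += np.roll(proj, -shift, axis=1)
    return recon / len(projections)
```

Summing the aligned copies emphasizes the chosen tomographic plane; a filtered back projection would additionally apply a ramp-like filter to each projection before the summation.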
  • Simple imaging is an imaging method for acquiring one two-dimensional image, which is a transmission image of the subject, by emitting radiation to the subject once.
  • Tomosynthesis imaging has a problem in that the reconstructed tomographic image is blurred by body movement of the subject caused by the time difference between imaging at the plurality of radiation source positions.
  • Such blurring makes it difficult to observe a lesion such as minute calcification, which is useful for early detection of breast cancer, particularly in a case where the breast is the subject.
  • JP2016-064119A discloses a method in which a plurality of tomographic plane projection images are acquired by projecting the pixel values of a plurality of projection images acquired by tomosynthesis imaging onto coordinate positions on a desired tomographic plane of the subject, based on the positional relationship between the radiation source position and the radiation detector at the time of imaging, while maintaining the pixel values of the plurality of projection images. Feature points, such as edges, intersections of edges, and corners of edges, are then detected in the plurality of tomographic plane projection images, the positional shift between the plurality of tomographic plane projection images is corrected such that the detected feature points match, and a tomographic image is generated from the plurality of tomographic plane projection images subjected to the positional shift correction.
  • A projection image acquired by tomosynthesis imaging is formed by the radiation transmitted through the subject, and is thus an image in which a plurality of structures in the subject overlap each other. Therefore, in a case where the position of the radiation source changes, the transmission direction of the radiation in the subject changes, so that the appearance of feature points, such as edges, intersections of edges, and corners of edges, differs between projection images. For example, a structure that appears as an intersection of edges in one projection image may appear as a plurality of edges that do not intersect in another projection image.
  • The present disclosure has been made in view of the aforementioned circumstances, and an object thereof is to make it possible to acquire a high-quality tomographic image in which body movement is accurately corrected.
  • A tomographic image generating apparatus comprises an image acquisition unit that acquires a plurality of projection images corresponding to a plurality of radiation source positions, the plurality of projection images being generated by causing an imaging apparatus to perform tomosynthesis imaging in which a radiation source is moved relative to a detection surface of a detection unit in order to emit radiation to a subject at the plurality of radiation source positions according to movement of the radiation source, a reconstruction unit that reconstructs all or a part of the plurality of projection images to generate a tomographic image on each of a plurality of tomographic planes of the subject, a feature point detecting unit that detects at least one feature point from a plurality of the tomographic images, and a positional shift amount derivation unit that derives a positional shift amount between the plurality of projection images based on body movement of the subject with the feature point as a reference on a corresponding tomographic plane corresponding to the tomographic image in which the feature point is detected, in which the reconstruction unit reconstructs the plurality of projection images by correcting the positional shift amount to generate a corrected tomographic image on at least one tomographic plane of the subject.
  • the “radiation source is moved relative to the detection unit” includes a case of moving only the radiation source, a case of moving only the detection unit, and a case of moving both the radiation source and the detection unit.
  • “Reconstruct all or a part of the plurality of projection images” means that reconstruction may be performed with all of the plurality of projection images, or reconstruction may be performed with two or more projection images among the plurality of projection images, not all of the plurality of projection images.
  • the tomographic image generating apparatus may further comprise a projection unit that projects the plurality of projection images on the corresponding tomographic plane based on a positional relationship between the radiation source position and the detection unit in a case of imaging the plurality of projection images to acquire a tomographic plane projection image corresponding to each of the plurality of projection images, in which the positional shift amount derivation unit derives, as the positional shift amount between the plurality of projection images, a positional shift amount between a plurality of the tomographic plane projection images based on the body movement of the subject with the feature point as a reference on the corresponding tomographic plane.
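The projection unit's mapping can be illustrated with a one-dimensional ray-geometry sketch: each detector pixel is carried back along the ray from the radiation source to its intersection with the corresponding tomographic plane, while the pixel value itself is maintained. The coordinate convention (source at height `src_z` above a detector at height 0) and the nearest-neighbour resampling are assumptions made for illustration only.

```python
import numpy as np

def plane_projection(proj, src_x, src_z, plane_z, px=1.0):
    """Project a 1-D projection image onto a tomographic plane at height
    `plane_z`, keeping the original pixel values. `px` is the pixel
    pitch; the detector lies at height 0 and the source at (src_x, src_z)."""
    n = proj.shape[-1]
    out = np.zeros_like(proj)
    for xd in range(n):
        # Intersection of the ray (source -> detector pixel xd) with
        # the tomographic plane, by similar triangles.
        xp = src_x + (xd * px - src_x) * (src_z - plane_z) / src_z
        i = int(round(xp / px))
        if 0 <= i < n:
            out[..., i] = proj[..., xd]
    return out
```

Applying this mapping to each projection image, with its own source position, yields one tomographic plane projection image per projection, between which positional shift amounts can then be derived.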
  • the positional shift amount derivation unit may set a local region corresponding to the feature point in the plurality of tomographic plane projection images, and derive the positional shift amount based on the local region.
  • the positional shift amount derivation unit may set a plurality of first local regions including the feature point in the plurality of tomographic plane projection images, set a second local region including the feature point in the tomographic image in which the feature point is detected, derive a positional shift amount of each of the plurality of first local regions with respect to the second local region as a temporary positional shift amount, and derive the positional shift amount based on a plurality of the temporary positional shift amounts.
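Deriving a temporary positional shift amount for one local region amounts to registering a first local region (from a tomographic plane projection image) against the second local region (from the tomographic image). One common way to do this, shown here purely as an illustration of the idea rather than as the disclosed method, is exhaustive normalized cross-correlation over a small search range:

```python
import numpy as np

def ncc_shift(ref_patch, moving_patch, search=4):
    """Estimate the (dy, dx) translation that best aligns `moving_patch`
    with `ref_patch` by exhaustive normalized cross-correlation over a
    +/- `search` pixel range (circular shifts, for simplicity)."""
    a = ref_patch - ref_patch.mean()
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(moving_patch, dy, axis=0), dx, axis=1)
            b = shifted - shifted.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            if denom == 0:
                continue
            score = (a * b).sum() / denom
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift
```

Running this for each first local region against the second local region yields one temporary positional shift amount per projection, from which the final positional shift amount can be derived (for example, by combining or smoothing the temporary amounts).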
  • the positional shift amount derivation unit may derive the temporary positional shift amount based on a peripheral region of the feature point in the second local region.
  • the “local region” is a region including the feature point in the tomographic image or the tomographic plane projection image, and can be a region having any size smaller than the tomographic image or the tomographic plane projection image.
  • the local region needs to be larger than the range of movement as the body movement.
  • Body movement may be as large as about 2 mm. Therefore, in a tomographic image or a tomographic plane projection image in which the size of one pixel is 100 μm square, the local region need only be, for example, a region of 50 × 50 pixels or 100 × 100 pixels around the feature point.
  • the “peripheral region of the feature point in the local region” means a region including the feature point in the local region and being smaller than the local region.
  • the reconstruction unit may reconstruct the plurality of projection images excluding a target projection image which corresponds to a target tomographic plane projection image of which the positional shift amount is to be derived, and generate the plurality of tomographic images as target tomographic images, and the positional shift amount derivation unit may derive the positional shift amount of the target tomographic plane projection image by using the target tomographic images.
  • the feature point detecting unit may detect a plurality of the feature points from the plurality of tomographic images, the tomographic image generating apparatus may further comprise a focal plane discrimination unit that discriminates whether the corresponding tomographic plane corresponding to the tomographic image in which each of the plurality of feature points is detected is a focal plane, and the positional shift amount derivation unit may derive the positional shift amount on the corresponding tomographic plane which is discriminated to be the focal plane.
  • the tomographic image generating apparatus may further comprise a combining unit that combines two or more tomographic images among the plurality of tomographic images to generate a composite two-dimensional image, in which the feature point detecting unit detects a two-dimensional feature point in the composite two-dimensional image, and detects the feature point corresponding to the two-dimensional feature point from the plurality of tomographic images.
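The combining unit and the subsequent feature-point detection can be sketched as below. The combining rule (per-pixel maximum intensity) and the feature detector (brightest pixel) are illustrative assumptions; the disclosure does not fix either choice.

```python
import numpy as np

def composite_2d(tomo_stack):
    """Combine two or more tomographic images into a composite
    two-dimensional image by per-pixel maximum intensity projection --
    one common combining rule."""
    return np.max(np.stack(tomo_stack), axis=0)

def feature_point_2d(composite):
    """Detect a single two-dimensional feature point as the brightest
    pixel of the composite image (any corner or blob detector could be
    substituted here)."""
    return np.unravel_index(np.argmax(composite), composite.shape)
```

A two-dimensional feature point found in the composite image can then be traced back to the tomographic image (and plane) that contributed that pixel, giving the corresponding three-dimensional feature point.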
  • the reconstruction unit may reconstruct all or a part of the plurality of projection images while correcting the positional shift amount to generate a plurality of corrected tomographic images on the plurality of tomographic planes of the subject as a plurality of new tomographic images, the feature point detecting unit may detect the feature point from the plurality of new tomographic images, the positional shift amount derivation unit may derive a new positional shift amount between the plurality of projection images, and the reconstruction unit may reconstruct the plurality of projection images while correcting the new positional shift amount to generate a new corrected tomographic image on at least one tomographic plane of the subject.
  • the reconstruction unit, the feature point detecting unit, and the positional shift amount derivation unit may repeat generating of the new tomographic image, detecting of the feature point from the new tomographic image, and deriving of the new positional shift amount until the new positional shift amount converges.
  • “Repeat until convergence” means to repeat until the positional shift amount between the plurality of new tomographic plane projection images is equal to or smaller than a predetermined threshold.
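The repeat-until-convergence control flow described above can be sketched as follows. This is a sketch of the loop structure only: `reconstruct` and `estimate_shifts` are caller-supplied callables standing in for the reconstruction unit and the positional shift amount derivation unit, and all names and defaults are assumptions.

```python
import numpy as np

def motion_corrected_reconstruction(projections, reconstruct, estimate_shifts,
                                    tol=0.5, max_iter=10):
    """Alternate reconstruction and shift estimation until the newly
    derived positional shift amounts fall at or below `tol` (pixels).
    `reconstruct(projections, shifts)` returns tomographic images with
    the given shifts corrected; `estimate_shifts(tomograms, projections)`
    returns the residual per-projection shift still present."""
    shifts = np.zeros(len(projections))
    for _ in range(max_iter):
        tomograms = reconstruct(projections, shifts)
        residual = estimate_shifts(tomograms, projections)
        shifts = shifts + residual
        # "Repeat until convergence": stop once the newly derived shift
        # amount is equal to or smaller than the threshold.
        if np.max(np.abs(residual)) <= tol:
            break
    return reconstruct(projections, shifts), shifts
```

Each pass sharpens the tomographic images, which in turn allows more accurate shift estimation on the next pass, until the residual shift is below the threshold.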
  • the tomographic image generating apparatus may further comprise a positional shift amount determination unit that performs image quality evaluation for a region of interest including the feature point in the corrected tomographic image, and determines whether the derived positional shift amount is appropriate or inappropriate based on a result of the image quality evaluation.
  • the positional shift amount determination unit may perform the image quality evaluation for the region of interest including the feature point in the tomographic image, compare the result of the image quality evaluation for the corrected tomographic image with a result of the image quality evaluation for the tomographic image, and decide the tomographic image with a better result of the image quality evaluation as a final tomographic image.
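The comparison of image-quality results can be illustrated with a simple sharpness score over the region of interest. The metric used here (mean gradient magnitude) is an assumption for illustration; the disclosure does not specify the evaluation function.

```python
import numpy as np

def sharpness(roi):
    """Mean gradient magnitude over a region of interest; higher means
    sharper. An illustrative image-quality score only."""
    gy, gx = np.gradient(roi.astype(float))
    return float(np.mean(np.hypot(gy, gx)))

def pick_final(tomo_roi, corrected_roi):
    """Decide the image with the better image-quality result as the
    final tomographic image: keep the corrected ROI only if it scores
    at least as well as the uncorrected one."""
    if sharpness(corrected_roi) >= sharpness(tomo_roi):
        return corrected_roi
    return tomo_roi
```

Because an inappropriate positional shift amount tends to blur the region of interest rather than sharpen it, comparing scores before and after correction guards against applying a correction that makes the image worse.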
  • the tomographic image generating apparatus may further comprise an evaluation function derivation unit that derives an evaluation function for performing image quality evaluation for a region of interest including the feature point in the corrected tomographic image, in which the positional shift amount derivation unit derives the positional shift amount for optimizing the evaluation function.
  • the subject may be a breast.
  • the positional shift amount derivation unit may change a search range in a case of deriving the positional shift amount depending on at least one of a density of a mammary gland, a size of the breast, an imaging time of the tomosynthesis imaging, a compression pressure of the breast in a case of the tomosynthesis imaging, or an imaging direction of the breast.
  • a tomographic image generating method comprises acquiring a plurality of projection images corresponding to a plurality of radiation source positions, the plurality of projection images being generated by causing an imaging apparatus to perform tomosynthesis imaging in which a radiation source is moved relative to a detection surface of a detection unit in order to emit radiation to a subject at the plurality of radiation source positions according to movement of the radiation source, reconstructing all or a part of the plurality of projection images to generate a tomographic image on each of a plurality of tomographic planes of the subject, detecting at least one feature point from a plurality of the tomographic images, deriving a positional shift amount between the plurality of projection images based on body movement of the subject with the feature point as a reference on a corresponding tomographic plane corresponding to the tomographic image in which the feature point is detected, and reconstructing the plurality of projection images by correcting the positional shift amount to generate a corrected tomographic image on at least one tomographic plane of the subject.
  • a program causing a computer to execute the tomographic image generating method according to the aspect of the present disclosure may be provided.
  • a tomographic image generating apparatus comprises a memory that stores a command to be executed by a computer, and a processor configured to execute the stored command, in which the processor executes processing of acquiring a plurality of projection images corresponding to a plurality of radiation source positions, the plurality of projection images being generated by causing an imaging apparatus to perform tomosynthesis imaging in which a radiation source is moved relative to a detection surface of a detection unit in order to emit radiation to a subject at the plurality of radiation source positions according to movement of the radiation source, reconstructing all or a part of the plurality of projection images to generate a tomographic image on each of a plurality of tomographic planes of the subject, detecting at least one feature point from a plurality of the tomographic images, deriving a positional shift amount between the plurality of projection images based on body movement of the subject with the feature point as a reference on a corresponding tomographic plane corresponding to the tomographic image in which the feature point is detected, and reconstructing the plurality of projection images by correcting the positional shift amount to generate a corrected tomographic image on at least one tomographic plane of the subject.
  • FIG. 1 is a schematic configuration diagram of a radiation image capturing apparatus to which a tomographic image generating apparatus according to a first embodiment of the present disclosure is applied.
  • FIG. 2 is a diagram of the radiation image capturing apparatus as viewed from the direction of arrow A in FIG. 1 .
  • FIG. 3 is a diagram showing a schematic configuration of the tomographic image generating apparatus realized by installing a tomographic image generating program in a computer in the first embodiment.
  • FIG. 4 is a diagram illustrating the acquisition of a projection image.
  • FIG. 5 is a diagram illustrating the generation of a tomographic image.
  • FIG. 6 is a diagram illustrating the detection of feature points from the tomographic image.
  • FIG. 7 is a diagram illustrating the generation of a tomographic plane projection image.
  • FIG. 8 is a diagram illustrating the interpolation of pixel values of the tomographic image.
  • FIG. 9 is a diagram illustrating the setting of a region of interest.
  • FIG. 10 is a diagram showing the region of interest set in the tomographic plane projection image.
  • FIG. 11 is a diagram showing an image in the region of interest in a case where no body movement occurs in the first embodiment.
  • FIG. 12 is a diagram showing an image in the region of interest in a case where body movement occurs in the first embodiment.
  • FIG. 13 is a diagram illustrating a search range of the region of interest.
  • FIG. 14 is a diagram showing the feature points in a three-dimensional space.
  • FIG. 15 is a diagram showing a display screen for a corrected tomographic image.
  • FIG. 16 is a flowchart showing a process performed in the first embodiment.
  • FIG. 17 is a diagram showing an image in a region of interest in a case where no body movement occurs in a second embodiment.
  • FIG. 18 is a diagram showing an image in the region of interest in a case where body movement occurs in the second embodiment.
  • FIG. 19 is a diagram illustrating a peripheral region of the feature point.
  • FIG. 20 is a diagram schematically showing a process performed in a third embodiment.
  • FIG. 21 is a diagram showing a schematic configuration of the tomographic image generating apparatus realized by installing a tomographic image generating program in a computer in a fourth embodiment.
  • FIG. 22 is a diagram illustrating the generation of a feature point map.
  • FIG. 23 is a flowchart showing a process performed in a fifth embodiment.
  • FIG. 24 is a diagram showing a warning display.
  • FIG. 25 is a diagram showing a schematic configuration of the tomographic image generating apparatus realized by installing a tomographic image generating program in a computer in a sixth embodiment.
  • FIG. 26 is a diagram illustrating a ripple artifact.
  • FIG. 27 is a diagram illustrating the derivation of correspondence points.
  • FIG. 28 is a diagram showing a result of plotting pixel values of the feature points and the correspondence points.
  • FIG. 29 is a flowchart showing a process performed in the sixth embodiment.
  • FIG. 30 is a diagram showing a schematic configuration of the tomographic image generating apparatus realized by installing a tomographic image generating program in a computer in a seventh embodiment.
  • FIG. 31 is a diagram illustrating the setting of a region of interest in the seventh embodiment.
  • FIG. 32 is a flowchart showing a process performed in the seventh embodiment.
  • FIG. 33 is a diagram showing a schematic configuration of the tomographic image generating apparatus realized by installing a tomographic image generating program in a computer in an eighth embodiment.
  • FIG. 1 is a schematic configuration diagram of a radiation image capturing apparatus to which a tomographic image generating apparatus according to a first embodiment of the present disclosure is applied
  • FIG. 2 is a diagram of the radiation image capturing apparatus as viewed from the direction of arrow A in FIG. 1 .
  • The radiation image capturing apparatus 1 is a mammography imaging apparatus that acquires a plurality of radiation images, that is, a plurality of projection images, by imaging a breast M, which is a subject, from a plurality of radiation source positions, in order to generate tomographic images by performing tomosynthesis imaging of the breast.
  • the radiation image capturing apparatus 1 comprises an imaging unit 10 , a computer 2 connected to the imaging unit 10 , and a display unit 3 and an input unit 4 which are connected to the computer 2 .
  • the imaging unit 10 comprises an arm unit 12 connected to a base (not shown) by a rotary shaft 11 .
  • An imaging table 13 is attached to one end portion of the arm unit 12 , and a radiation emission unit 14 is attached to the other end portion so as to face the imaging table 13 .
  • the arm unit 12 is configured so that only the end portion to which the radiation emission unit 14 is attached can rotate. Therefore, it is possible to rotate only the radiation emission unit 14 with the imaging table 13 fixed.
  • the rotation of the arm unit 12 is controlled by the computer 2 .
  • the imaging table 13 comprises a radiation detector 15 such as a flat panel detector therein.
  • The radiation detector 15 has a detection surface 15 A that detects radiation such as X-rays.
  • Inside the imaging table 13 , a circuit board is provided on which are mounted a charge amplifier for converting a charge signal read from the radiation detector 15 into a voltage signal, a correlated double sampling circuit for sampling the voltage signal output from the charge amplifier, an analog-to-digital (AD) conversion unit for converting the voltage signal into a digital signal, and the like.
  • the radiation detector 15 corresponds to a detection unit. Although the radiation detector 15 is used as the detection unit in the present embodiment, the detection unit is not limited to the radiation detector 15 as long as radiation can be detected and converted into an image.
  • the radiation detector 15 can perform recording and reading of a radiation image repeatedly.
  • a so-called direct-type radiation detector that directly converts radiation, such as X-rays, into electric charges may be used, or a so-called indirect-type radiation detector that converts radiation into visible light and then converts the visible light into a charge signal may be used.
  • As a method of reading a radiation image signal, it is desirable to use a so-called thin film transistor (TFT) reading method in which the radiation image signal is read by switching a TFT switch ON and OFF, or a so-called optical reading method in which the radiation image signal is read by emission of reading light.
  • other methods may also be used without being limited to the above methods.
  • An X-ray source 16 that is a radiation source is housed inside the radiation emission unit 14 .
  • The timing of emission of X-rays, which are the radiation, from the X-ray source 16 , and the X-ray generation conditions in the X-ray source 16 , that is, the selection of target and filter materials, the tube voltage, the emission time, and the like, are controlled by the computer 2 .
  • The arm unit 12 includes a compression plate 17 disposed above the imaging table 13 to compress the breast M, a support unit 18 that supports the compression plate 17 , and a moving mechanism 19 that moves the support unit 18 in the vertical direction in FIGS. 1 and 2 .
  • Information of the distance between the compression plate 17 and the imaging table 13 , that is, a compression thickness, is input to the computer 2 .
  • The display unit 3 is a display device such as a cathode ray tube (CRT) or a liquid crystal monitor, and displays the projection images, two-dimensional images, and generated tomographic images acquired as described later, as well as messages required for operation and the like.
  • the display unit 3 may include a speaker for outputting sound.
  • the input unit 4 includes an input device such as a keyboard, a mouse, or a touch panel system, and receives an operation of the radiation image capturing apparatus 1 by the operator.
  • the input unit 4 receives an input of various kinds of information such as imaging conditions and the instruction of correction of the information, which are required to perform the tomosynthesis imaging.
  • each part of the radiation image capturing apparatus 1 operates in accordance with the information input from the input unit 4 by the operator.
  • a tomographic image generating program is installed in the computer 2 .
  • the computer may be a workstation or a personal computer that is directly operated by the operator, or may be a server computer connected to these through a network.
  • the tomographic image generating program is distributed in a state of being recorded on a recording medium such as a digital versatile disc (DVD) or a compact disc read only memory (CD-ROM), and is installed in the computer from the recording medium.
  • the tomographic image generating program is stored in a storage device of a server computer connected to the network, or in a network storage so as to be accessible from the outside, and is downloaded and installed in the computer as necessary.
  • FIG. 3 is a diagram showing a schematic configuration of the tomographic image generating apparatus realized by installing a tomographic image generating program in the computer 2 in the first embodiment.
  • the tomographic image generating apparatus comprises a central processing unit (CPU) 21 , a memory 22 , and a storage 23 as the configuration of a standard computer.
  • the storage 23 includes a storage device such as a hard disk drive or a solid state drive (SSD), and stores various kinds of information including a program for driving each unit of the radiation image capturing apparatus 1 and the tomographic image generating program.
  • the storage 23 also stores the projection image acquired by tomosynthesis imaging, and the tomographic image and the tomographic plane projection image generated as described later.
  • the memory 22 temporarily stores programs and the like stored in the storage 23 so that the CPU 21 executes various kinds of processing.
  • the tomographic image generating program defines, as the processing to be executed by the CPU 21 : image acquisition processing of acquiring a plurality of projection images of the breast M corresponding to a plurality of radiation source positions by causing the radiation image capturing apparatus 1 to perform tomosynthesis imaging; reconstruction processing of reconstructing all or a part of the plurality of projection images to generate a tomographic image on each of a plurality of tomographic planes of the breast M, which is the subject; feature point detecting processing of detecting at least one feature point from the plurality of tomographic images; projection processing of projecting the plurality of projection images on the corresponding tomographic plane, which is the tomographic plane corresponding to the tomographic image in which the feature point is detected, based on a positional relationship between the position of the X-ray source 16 and the radiation detector 15 in a case of imaging the plurality of projection images, and acquiring a tomographic plane projection image corresponding to each of the plurality of projection images; positional shift amount derivation processing of deriving, with the feature point as a reference, the positional shift amount between the plurality of tomographic plane projection images; and display control processing of displaying the generated tomographic image on the display unit 3 .
  • the CPU 21 executes these kinds of processing according to the tomographic image generating program, so that the computer 2 functions as an image acquisition unit 31 , a reconstruction unit 32 , a feature point detecting unit 33 , a projection unit 34 , a positional shift amount derivation unit 35 , and a display controller 36 .
  • FIG. 4 is a diagram illustrating the acquisition of the projection image Gi.
  • the X-ray source 16 is moved to each radiation source position of S 1 , S 2 , . . . , Sn, the X-ray source 16 is driven at each radiation source position to irradiate the breast M with X-ray, and the X-rays transmitted through the breast M are detected by the radiation detector 15 .
  • the projection images G 1 , G 2 , . . . , Gn are acquired corresponding to the radiation source positions S 1 to Sn.
  • X-rays of the same dose are emitted to the breast M.
  • the plurality of acquired projection images Gi are stored in the storage 23 .
  • the plurality of projection images Gi may be acquired by a program separate from the tomographic image generating program and stored in the storage 23 or the external storage device.
  • the image acquisition unit 31 reads the plurality of projection images Gi stored in the storage 23 or the external storage device from the storage 23 or the external storage device for reconstruction processing and the like.
  • the radiation source position Sc is a radiation source position where an optical axis X 0 of the X-rays emitted from the X-ray source 16 is perpendicular to the detection surface 15 A of the radiation detector 15 .
  • the radiation source position Sc is referred to as a reference radiation source position Sc.
  • the projection image Gc acquired by irradiating the breast M with X-rays at the reference radiation source position Sc is referred to as a reference projection image Gc.
  • “the optical axis X 0 of the X-ray is perpendicular to the detection surface 15 A of the radiation detector 15 ” means that the optical axis X 0 of the X-ray crosses the detection surface 15 A of the radiation detector 15 at an angle of 90°.
  • a case where the optical axis X 0 of the X-rays crosses the detection surface 15 A of the radiation detector 15 with a certain degree of error with respect to 90° may be included.
  • the optical axis X 0 of the X-ray crossing the detection surface 15 A of the radiation detector 15 with an error of about ±3° with respect to 90° is included in “the optical axis X 0 of the X-ray is perpendicular to the detection surface 15 A of the radiation detector 15 ” in the present embodiment.
  • a three-dimensional coordinate position in a three-dimensional space including the breast M is set, pixel values of corresponding pixel positions of the plurality of projection images Gi are reconstructed for the set three-dimensional coordinate position, and the pixel value of the coordinate position is calculated.
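The passage above can be sketched in code. This is a minimal illustration, not the patent's reconstruction method: the correspondence between a three-dimensional coordinate position and a pixel of each projection image is abstracted into a callable (in the real system it comes from the imaging geometry), and the combination rule is assumed to be a plain average.

```python
import numpy as np

# Minimal sketch of the reconstruction idea: for each set coordinate
# position, gather the corresponding pixel from every projection image
# and combine (here: average) the pixel values.
# `correspondence(v, i)` maps voxel index v to a pixel index in
# projection i -- an assumed stand-in for the projection geometry.

def backproject(projections, n_voxels, correspondence):
    recon = np.zeros(n_voxels)
    for v in range(n_voxels):
        samples = [proj[correspondence(v, i)]
                   for i, proj in enumerate(projections)]
        recon[v] = np.mean(samples)
    return recon
```

With an identity correspondence the result is simply the per-position mean of the projections, which is the degenerate (zero-angle) case of shift-and-add reconstruction.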
  • the reconstruction unit 32 corrects the positional shift amount and reconstructs the plurality of projection images Gi to generate a corrected tomographic image in which body movement is corrected.
  • the feature point detecting unit 33 detects at least one feature point from a plurality of the tomographic images Dj.
  • FIG. 6 is a diagram illustrating the detection of the feature points.
  • the detection of the feature points from one tomographic image Dk among the plurality of tomographic images Dj will be described.
  • the tomographic image Dk includes point-like structures E 1 to E 3 such as calcification, and intersections E 4 and E 5 of edges such as intersections of blood vessels on the tomographic plane of the breast M in which the tomographic image Dk is acquired.
  • the feature point detecting unit 33 detects the point-like structure, such as calcification, as a feature point from the tomographic image Dk by using an algorithm of known computer aided diagnosis (hereinafter, referred to as CAD).
  • edges, intersections of edges, corners of edges, and the like included in the tomographic image Dk are detected as feature points by using an algorithm such as a Harris's corner detection method, a scale-invariant feature transform (SIFT), features from accelerated segment test (FAST), or speeded up robust features (SURF).
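As a rough illustration of the detection interface only (tomographic image in, feature-point coordinates out), the toy detector below marks strict local maxima that stand out from their 3×3 neighborhood; an actual system would use a CAD algorithm, Harris corner detection, SIFT, FAST, or SURF as the text describes. The threshold and the neighborhood size are invented for this sketch.

```python
import numpy as np

# Toy stand-in for the feature point detection step: report as feature
# points the pixels that are strict local maxima of their 3x3 patch and
# exceed the patch mean by `thresh` (roughly what a point-like
# calcification looks like).  Hypothetical parameters throughout.

def detect_point_features(img, thresh=0.2):
    h, w = img.shape
    points = []
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            patch = img[r - 1:r + 2, c - 1:c + 2]
            center = img[r, c]
            if (center == patch.max()
                    and np.count_nonzero(patch == center) == 1
                    and center - patch.mean() > thresh):
                points.append((r, c))
    return points
```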
  • only one feature point F 1 is detected from one tomographic image Dk, but it is preferable to detect a plurality of feature points.
  • all of point-like structures E 1 to E 3 and intersections E 4 and E 5 included in the tomographic image Dk shown in FIG. 6 may be detected as the feature points.
  • the feature point may be only one pixel in the tomographic image Dk, or may be a plurality of pixels indicating the positions of feature structures.
  • the feature point is detected only from one tomographic image Dk, but it is assumed that a plurality of feature points are actually detected from each of the plurality of tomographic images.
  • the projection unit 34 projects the plurality of projection images Gi on the corresponding tomographic plane which is the tomographic plane corresponding to the tomographic image in which the feature point F 1 is detected, based on the positional relationship between the radiation source position and the radiation detector 15 in a case of imaging the plurality of projection images Gi. As a result, the projection unit 34 acquires the tomographic plane projection image GTi corresponding to each of the plurality of projection images Gi.
  • the acquisition of the tomographic plane projection image GTi will be described.
  • the plurality of projection images Gi are projected on the plurality of tomographic planes Tj corresponding to the plurality of tomographic images Dj to generate the tomographic plane projection images GTi.
  • FIG. 7 is a diagram illustrating the projection of the projection image.
  • in FIG. 7 , a case will be described in which one projection image Gi acquired at the radiation source position Si is projected on one tomographic plane Tj of the breast M.
  • the pixel value of the projection image Gi positioned on the straight line is projected.
  • the projection image Gi and the tomographic image generated on the tomographic plane Tj are each composed of a plurality of pixels discretely arranged two-dimensionally at a predetermined sampling interval, and the pixels are arranged at grid points having a predetermined sampling interval.
  • a short line segment orthogonal to the projection image Gi and the tomographic plane Tj indicates the pixel division position. Therefore, in FIG. 7 , the center position of the pixel division position is the pixel position which is the grid point.
  • the relationship of the coordinates (sxi, syi, szi) of the radiation source position at the radiation source position Si, the coordinates (pxi, pyi) of the pixel position Pi in the projection image Gi, and the coordinates (tx, ty, tz) of the projection position on the tomographic plane Tj is expressed by Equation (1) below.
  • a z-axis is set to a direction orthogonal to the detection surface 15 A of the radiation detector 15
  • a y-axis is set to a direction parallel to a direction in which the X-ray source 16 moves in the detection surface of the radiation detector 15
  • an x-axis is set to a direction orthogonal to the y-axis.
  • pxi=(tx×szi−sxi×tz)/(szi−tz)
  • pyi=(ty×szi−syi×tz)/(szi−tz)  (1)
  • by Equation (1), the projection position on the tomographic plane Tj on which the pixel value of the projection image Gi is projected can be calculated. Therefore, by projecting the pixel value of the projection image Gi on the calculated projection position on the tomographic plane Tj, the tomographic plane projection image GTi is generated.
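Equation (1) can be transcribed directly; the function name and the check values below are illustrative, with the axis conventions of the text (z orthogonal to the detection surface, detector at z = 0). Note that a point with tz = 0 maps to itself, while a plane above the detector is magnified.

```python
# Direct transcription of Equation (1): detector coordinates (pxi, pyi)
# of the ray from the radiation source (sxi, syi, szi) through the
# projection position (tx, ty, tz) on the tomographic plane Tj.

def equation1(sxi, syi, szi, tx, ty, tz):
    pxi = (tx * szi - sxi * tz) / (szi - tz)
    pyi = (ty * szi - syi * tz) / (szi - tz)
    return pxi, pyi
```

For example, with the source directly above the origin at height szi = 100 and a plane at tz = 20, the point tx = 10 projects to pxi = 1000/80 = 12.5, i.e. a magnification of szi/(szi − tz) = 1.25.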
  • the intersection point of the straight line connecting the radiation source position Si and the pixel position on the projection image Gi, and the tomographic plane Tj may not be positioned on the pixel position on the tomographic plane Tj.
  • the projection position (tx, ty, tz) on the tomographic plane Tj may be positioned between the pixel positions O 1 to O 4 of the tomographic image Dj on the tomographic plane Tj.
  • the pixel value of each pixel position need only be calculated by performing an interpolation calculation using the pixel value of the projection image at the plurality of projection positions around the pixel positions O 1 to O 4 .
  • as the interpolation calculation, a linear interpolation calculation that weights the pixel value of the projection image at the projection position according to the distance between the pixel position and the plurality of projection positions around the pixel position can be used.
  • any method such as a non-linear bicubic interpolation calculation using more pixel values of projection positions around the pixel position and a B-spline interpolation calculation can be used.
  • the pixel value at the projection position closest to the pixel position may be used as the pixel value at the pixel position.
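The interpolation step described above can be sketched under assumed simplifications: inverse-distance weighting over whatever projected samples surround the grid point (one reading of "weights the pixel value ... according to the distance"), with an exact hit reducing to the nearest-neighbour case mentioned last.

```python
import math

# Sketch of scattered-to-grid interpolation: the projected positions
# generally fall between the grid points of the tomographic plane, so
# the value at a grid point is interpolated from nearby projected
# samples.  Inverse-distance weighting is an assumed choice here.

def grid_value(grid_pt, samples):
    """samples: list of ((x, y), value) projected positions near grid_pt."""
    num = den = 0.0
    for (x, y), val in samples:
        d = math.hypot(x - grid_pt[0], y - grid_pt[1])
        if d == 0.0:
            return val          # sample lands exactly on the grid point
        w = 1.0 / d
        num += w * val
        den += w
    return num / den
```

Bicubic or B-spline interpolation, as the text notes, would simply replace this weighting rule with a higher-order one over more surrounding samples.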
  • the tomographic plane projection image GTi having pixel values obtained at all of the pixel positions of the tomographic plane Tj is generated. Therefore, in one tomographic plane, the number of tomographic plane projection images GTi matches the number of projection images Gi.
  • the positional shift amount derivation unit 35 derives the positional shift amount between the plurality of tomographic plane projection images GTi based on the body movement of the breast M during the tomosynthesis imaging. First, the positional shift amount derivation unit 35 sets the local region corresponding to the feature point F 1 as a region of interest for the plurality of tomographic plane projection images GTi. Specifically, the local region having a predetermined size centered on the coordinate position of the feature point F 1 is set as the region of interest.
  • FIG. 9 is a diagram illustrating the setting of a region of interest. In FIG. 9 , for the sake of explanation, it is assumed that three projection images G 1 to G 3 are projected on the tomographic plane Tj to generate the tomographic plane projection images GT 1 to GT 3 .
  • the positional shift amount derivation unit 35 sets the region of interest Rf 0 centered on the coordinate position of the feature point F 1 in the tomographic image Dj on the tomographic plane Tj.
  • the regions of interest R 1 to R 3 corresponding to the region of interest Rf 0 are set in the tomographic plane projection images GT 1 to GT 3 .
  • the broken line in FIG. 9 indicates the boundary between the regions of interest R 1 to R 3 and the other regions. Therefore, the positions of the region of interest Rf 0 and the regions of interest R 1 to R 3 coincide with each other on the tomographic plane Tj.
  • FIG. 10 is a diagram showing the regions of interest R 1 to R 3 set in the tomographic plane projection images GT 1 to GT 3 .
  • the body movement may be about 2 mm at the largest. Therefore, in a case of the tomographic image or the tomographic plane projection image in which the size of one pixel is 100 μm square, the regions of interest R 1 to R 3 need only be, for example, a region of 50×50 pixels or 100×100 pixels around the feature point F 1 .
  • the positional shift amount derivation unit 35 performs registration of the regions of interest R 1 to R 3 .
  • the registration is performed with reference to the region of interest set in the reference tomographic plane projection image.
  • the registration of the other regions of interest is performed with reference to the region of interest set in the tomographic plane projection image (reference tomographic plane projection image) for the reference projection image (referred to as Gs) acquired at the radiation source position Sc in which the optical axis X 0 of the X-rays from the X-ray source 16 is orthogonal to the radiation detector 15 .
  • the positional shift amount derivation unit 35 performs the registration of the regions of interest R 1 and R 3 with respect to the region of interest R 2 , and derives the shift vector representing the movement direction and the movement amount of the regions of interest R 1 and R 3 with respect to the region of interest R 2 as the positional shift amount.
  • the registration means that the movement direction and the movement amount of the regions of interest R 1 and R 3 with respect to the region of interest R 2 are obtained within a predetermined search range such that the correlation between each of the regions of interest R 1 and R 3 and the region of interest R 2 is maximized.
  • the normalized cross correlation may be used as the correlation.
  • the number of shift vectors is one less than the number of tomographic plane projection images.
  • for example, in a case where there are 15 tomographic plane projection images, the number of shift vectors is 14.
  • in the example of the three tomographic plane projection images GT 1 to GT 3 , the number of shift vectors is 2.
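The registration step can be sketched as an exhaustive integer-pixel search maximizing normalized cross correlation within the search range. The exhaustive search and the wrap-around shift via `np.roll` are assumptions of this sketch, not the patent's prescribed method; sub-pixel registration would refine this.

```python
import numpy as np

# Sketch of region-of-interest registration: search the integer shift
# (dr, dc) of `moving` with respect to `ref` that maximizes normalized
# cross correlation (NCC) within +/- `search` pixels.  The returned
# shift plays the role of the shift vector / positional shift amount.

def ncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return 0.0 if denom == 0 else float((a * b).sum() / denom)

def find_shift(ref, moving, search=3):
    best, best_shift = -2.0, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            shifted = np.roll(np.roll(moving, dr, axis=0), dc, axis=1)
            score = ncc(ref, shifted)
            if score > best:
                best, best_shift = score, (dr, dc)
    return best_shift
```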
  • FIG. 11 is a diagram showing the image of three regions of interest R 1 to R 3 in a case where no body movement occurs during acquisition of the projection images G 1 to G 3 .
  • the center positions of the regions of interest R 1 to R 3 , that is, the positions P 1 to P 3 corresponding to the feature point F 1 in the tomographic plane projection images GT 1 to GT 3 are shown, and the images F 2 of the feature point F 1 included in the regions of interest R 1 to R 3 are indicated by a large circle. As shown in FIG. 11 , since no body movement occurs, the positions P 1 to P 3 and the images F 2 of the feature point F 1 match each other in all of the regions of interest R 1 to R 3 .
  • FIG. 12 is a diagram showing the image of three regions of interest R 1 to R 3 in a case where body movement occurs during acquisition of the projection images G 2 and G 3 among the projection images G 1 to G 3 .
  • the positions P 1 and P 2 corresponding to the feature point F 1 in the regions of interest R 1 and R 2 and the position of the image F 2 of the feature point F 1 included in the regions of interest R 1 and R 2 match each other. For this reason, the positional shift amount of the region of interest R 1 with respect to the region of interest R 2 is 0.
  • the position P 3 corresponding to the feature point F 1 in the region of interest R 3 and the position of the image F 2 of the feature point F 1 included in the region of interest R 3 do not match each other. Therefore, the shift vector V 10 , having a magnitude and a direction corresponding to the movement amount and the movement direction of the region of interest R 3 with respect to the region of interest R 2 , is derived as the positional shift amount.
  • a search range in a case of deriving the positional shift amount may be changed depending on at least one of a density of a mammary gland for the breast M, a size of the breast M, an imaging time of the tomosynthesis imaging, a compression pressure of the breast M in a case of the tomosynthesis imaging, or an imaging direction of the breast.
  • FIG. 13 is a diagram illustrating the change of the search range. As shown in FIG. 13 , two types of search ranges, a small search range H 1 and a large search range H 2 , are set as the search ranges of the regions of interest R 1 and R 3 with respect to the region of interest R 2 which is a reference.
  • in a case where the density of the mammary gland of the breast M is high, the body movement tends to be large in a case of imaging.
  • in a case where the breast M is large, the body movement tends to be large in a case of imaging.
  • in a case where the tomosynthesis imaging time is longer, the body movement during imaging tends to be larger.
  • in a case where the imaging direction of the breast M is the medio-lateral oblique (MLO) direction, the body movement in a case of imaging tends to be larger than in the cranio-caudal (CC) direction.
  • the positional shift amount derivation unit 35 changes a search range in a case of deriving the positional shift amount by receiving, from the input unit 4 , the input of at least one information of a density of a mammary gland for the breast M, a size of the breast M, an imaging time of the tomosynthesis imaging, a compression pressure of the breast M in a case of the tomosynthesis imaging, or an imaging direction of the breast M.
  • the large search range H 2 shown in FIG. 13 need only be set.
  • the small search range H 1 shown in FIG. 13 need only be set.
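The tendencies listed above could be mapped to the two search ranges H1 and H2 as below. The rule (any one factor triggers the large range) and the numeric range sizes are invented for illustration; the patent only states the tendencies, not a concrete decision rule.

```python
# Illustrative mapping from imaging conditions to search range.
# SMALL_RANGE corresponds to the small search range H1, LARGE_RANGE to
# the large search range H2; the pixel values are hypothetical.

SMALL_RANGE, LARGE_RANGE = 10, 25

def choose_search_range(dense_gland=False, large_breast=False,
                        long_imaging_time=False, mlo_direction=False):
    """Return the search range (in pixels) for positional shift derivation."""
    if dense_gland or large_breast or long_imaging_time or mlo_direction:
        return LARGE_RANGE      # larger body movement expected
    return SMALL_RANGE
```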
  • the positional shift amount between the plurality of tomographic plane projection images GTi is derived for one feature point F 1 detected on one tomographic plane Tj.
  • the positional shift amount derivation unit 35 derives a positional shift amount for a plurality of different feature points F (here, ten feature points shown by black circles) in a three-dimensional space in the breast M expressed by the plurality of tomographic images Dj.
  • positional shift amounts for a plurality of different feature points F are derived.
  • the positional shift amount derivation unit 35 interpolates the positional shift amounts for the plurality of different feature points F with respect to the coordinate positions of the three-dimensional space for generating the tomographic image Dj. As a result, for the tomographic plane projection image acquired in a state in which body movement occurs, the positional shift amount derivation unit 35 derives the positional shift amounts in a case of performing reconstruction for all of the coordinate positions of the three-dimensional space for generating a tomographic image.
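One way to interpolate the per-feature-point positional shift amounts to an arbitrary three-dimensional coordinate position is inverse-distance weighting. The choice of interpolant is an assumption of this sketch; the patent only requires that the measured shifts be interpolated over the coordinate positions of the three-dimensional space.

```python
import math

# Sketch of spreading the positional shift amounts measured at feature
# points over the whole reconstruction volume via inverse-distance
# weighting (assumed interpolant).

def interp_shift(pos, feature_points):
    """pos: (x, y, z); feature_points: list of ((x, y, z), (dx, dy))
    pairs of feature point position and measured 2-D shift vector."""
    num = [0.0, 0.0]
    den = 0.0
    for (x, y, z), (dx, dy) in feature_points:
        d = math.dist(pos, (x, y, z))
        if d == 0.0:
            return (dx, dy)     # exactly at a measured feature point
        w = 1.0 / d
        num[0] += w * dx
        num[1] += w * dy
        den += w
    return (num[0] / den, num[1] / den)
```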
  • the reconstruction unit 32 reconstructs the projection image Gi while correcting the derived positional shift amount to generate the corrected tomographic image Dhj in which the body movement is corrected. Specifically, in a case where the back projection method is used for reconstruction, the pixel of the projection image Gi in which the positional shift occurs is reconstructed by correcting the positional shift such that the pixel corresponding to the other projection image is projected on the position to be back projected, based on the derived positional shift amount.
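Reconstruction while correcting the derived positional shift amount can be sketched in one dimension: before a projection's value is back projected, its sampling index is displaced by that projection's derived shift. Nearest-neighbour sampling, integer shifts, and the precomputed index maps are simplifications of this sketch, not the patent's exact procedure.

```python
import numpy as np

# 1-D toy of body-movement-corrected back projection.
# index_maps[i][v]: detector index that voxel v maps to in projection i
# (from the geometry); shifts[i]: derived integer positional shift of
# projection i.  The correction samples at index + shift.

def backproject_corrected(projections, index_maps, shifts):
    n_voxels = len(index_maps[0])
    recon = np.zeros(n_voxels)
    for proj, idx, s in zip(projections, index_maps, shifts):
        sample_at = np.clip(np.asarray(idx) + s, 0, len(proj) - 1)
        recon += proj[sample_at]
    return recon / len(projections)
```

In the toy test below, one projection was "moved" by 2 pixels; with the shift corrected the point reconstructs at full amplitude, while without correction it is halved (blurred).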
  • one positional shift amount may be derived from the plurality of different feature points F.
  • the region of interest is set for each of the plurality of different feature points F, and the positional shift amount is derived on the assumption that the entire region of interest moves in the same direction by the same amount.
  • the positional shift amount need only be derived such that the representative values (for example, mean value, median value, or maximum value) of the correlation for all of the regions of interest between the tomographic plane projection images that are target of positional shift amount derivation are maximized.
  • the three-dimensional space in the breast M represented by the plurality of tomographic images Dj may be divided into a plurality of three-dimensional regions, and one positional shift amount may be derived from the plurality of feature points F in the same manner as described above for each region.
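Deriving a single positional shift amount from several regions of interest, as described above, amounts to searching one common shift that optimizes a representative value of the per-region similarity. In this sketch the mean of a sum-of-squared-differences score (to be minimized) stands in for the representative value of the correlation; the exhaustive search and wrap-around shifts are further assumptions.

```python
import numpy as np

# Sketch of deriving ONE shift for a set of regions of interest: find
# the common integer shift minimizing the mean SSD over all region
# pairs (the text's "representative value" of the similarity).

def common_shift(ref_rois, mov_rois, search=2):
    best, best_shift = None, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            total = 0.0
            for ref, mov in zip(ref_rois, mov_rois):
                shifted = np.roll(np.roll(mov, dr, axis=0), dc, axis=1)
                total += float(((ref - shifted) ** 2).sum())
            mean_score = total / len(ref_rois)
            if best is None or mean_score < best:
                best, best_shift = mean_score, (dr, dc)
    return best_shift
```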
  • FIG. 15 is a diagram showing the display screen of the corrected tomographic image.
  • the tomographic image Dj before body movement correction and the corrected tomographic image Dhj subjected to body movement correction are displayed on a display screen 40 .
  • a label 41 of “before correction” is given to the tomographic image Dj so that it can be seen that the body movement is not corrected.
  • a label 42 of “after correction” is given to the corrected tomographic image Dhj such that it can be seen that body movement is corrected.
  • the label 41 may be given only to the tomographic image Dj, or the label 42 may be given only to the corrected tomographic image Dhj.
  • a broken line indicates that the structures included in the tomographic image Dj before correction are blurred, and a solid line indicates that the structures included in the corrected tomographic image Dhj are not blurred.
  • the tomographic image Dj and the corrected tomographic image Dhj display the same cross section.
  • the projection image Gi may be displayed.
  • the operator can confirm the success or failure of the body movement correction by looking at the display screen 40 . Further, in a case where the body movement is too large, even in a case where the tomographic image is generated by performing reconstruction while correcting the positional shift amount as in the present embodiment, the body movement cannot be corrected accurately, and the body movement correction may fail. In such a case, the tomographic image Dj may have a higher image quality than the corrected tomographic image Dhj due to the failure of the body movement correction. Therefore, the input unit 4 may receive an instruction to store any of the tomographic image Dj or the corrected tomographic image Dhj, and the instructed image may be stored in the storage 23 or the external storage device.
  • FIG. 16 is a flowchart showing a process performed in the first embodiment.
  • the image acquisition unit 31 causes the radiation image capturing apparatus 1 to perform the tomosynthesis imaging to acquire a plurality of projection images Gi (step ST 1 ).
  • the reconstruction unit 32 reconstructs all or a part of the plurality of projection images Gi to generate a plurality of tomographic images Dj (step ST 2 ).
  • the feature point detecting unit 33 detects at least one feature point from a plurality of the tomographic images Dj (step ST 3 ).
  • the projection unit 34 projects the plurality of projection images Gi on the corresponding tomographic plane corresponding to the tomographic image in which the feature point F 1 is detected, based on the positional relationship between the radiation source position and the radiation detector 15 in a case of imaging the plurality of projection images Gi, and acquires the tomographic plane projection image GTi corresponding to each of the plurality of projection images Gi (step ST 4 ).
  • the positional shift amount derivation unit 35 derives the positional shift amount between the plurality of tomographic plane projection image GTi (step ST 5 ). Further, the reconstruction unit 32 reconstructs the plurality of projection images Gi while correcting the positional shift amount, and thereby generates a corrected tomographic image Dhj (step ST 6 ). Then, the display controller 36 displays the corrected tomographic image Dhj on the display unit 3 (step ST 7 ), and the processing is terminated. The generated corrected tomographic image Dhj is transmitted to the external storage device (not shown) and stored.
  • the plurality of projection images Gi are acquired by tomosynthesis imaging, all or a part of the plurality of projection images Gi are reconstructed, and the tomographic image Dj on each of the plurality of tomographic planes Tj of the breast M is generated.
  • At least one feature point is detected from the plurality of tomographic images Dj, the plurality of projection images Gi are projected on the corresponding tomographic plane corresponding to the tomographic image in which the feature point is detected, based on the positional relationship between the radiation source position and the radiation detector 15 in a case of imaging the plurality of projection images Gi, and the tomographic plane projection image GTi corresponding to each of the plurality of projection images Gi are acquired.
  • the positional shift amount between the plurality of tomographic plane projection images is derived with the feature point as a reference, and the plurality of projection images Gi are reconstructed by correcting the positional shift amount to generate the corrected tomographic image Dhj.
  • the feature points are detected from the plurality of tomographic images Dj, not from the projection image Gi or the tomographic plane projection image GTi.
  • the tomographic image Dj includes only the structures on the corresponding tomographic plane Tj, so the structures on other tomographic planes included in the projection image Gi are not included in the tomographic image Dj. According to the first embodiment, the feature points can therefore be detected accurately without being affected by the structures of other tomographic planes, and the positional shift amount between the plurality of projection images Gi can be appropriately derived. As a result, a high-quality corrected tomographic image Dhj with reduced effects of body movement can be acquired.
  • the positional shift amount is derived between the tomographic plane projection images GTi.
  • the region of interest Rf 0 centered on the coordinate position of the feature point F 1 is set in the tomographic image Dj, and the positional shift amount of the region of interest Ri set in the tomographic plane projection image GTi with respect to the set region of interest Rf 0 is derived as a temporary positional shift amount.
  • the difference from the first embodiment is that the positional shift amount between the plurality of tomographic plane projection images GTi is derived based on the derived temporary positional shift amount.
  • the region of interest Ri set in the plurality of tomographic plane projection images GTi corresponds to the first local region
  • the region of interest Rf 0 set in the tomographic image Dj corresponds to the second local region.
  • FIG. 17 is a diagram illustrating the derivation of the positional shift amount in the second embodiment.
  • the region of interest Rf 0 and the regions of interest R 1 to R 3 in FIG. 17 are the same as the region of interest Rf 0 and the regions of interest R 1 to R 3 shown in FIG. 9 .
  • the positional shift amount derivation unit 35 first derives the positional shift amounts of the regions of interest R 1 to R 3 set in the tomographic plane projection images GTi (GT 1 to GT 3 in FIG. 17 ) with respect to the region of interest Rf 0 as temporary positional shift amounts.
  • FIG. 18 is a diagram showing the image of three regions of interest R 1 to R 3 in a case where body movement occurs during acquisition of the projection images G 2 and G 3 among the projection images G 1 to G 3 .
  • the positions P 1 and P 2 corresponding to the feature point F 1 in the regions of interest R 1 and R 2 and the position of the image F 2 of the feature point F 1 included in the regions of interest R 1 and R 2 match each other. For this reason, the positional shift amount of the regions of interest R 1 and R 2 with respect to the region of interest Rf 0 is 0.
  • the position P 3 corresponding to the feature point F 1 in the region of interest R 3 and the position of the image F 2 of the feature point F 1 included in the region of interest R 3 do not match each other, because the region of interest R 3 has moved with respect to the region of interest Rf 0 . Therefore, the shift vectors Vf 1 and Vf 2 of the regions of interest R 1 and R 2 with respect to the region of interest Rf 0 , that is, their temporary positional shift amounts, are 0, but the shift vector Vf 3 of the region of interest R 3 with respect to the region of interest Rf 0 , that is, its temporary positional shift amount, has a value.
  • the positional shift amount derivation unit 35 derives the positional shift amount between the tomographic plane projection images GTi based on the temporary positional shift amount.
  • the positional shift amount is derived with reference to the projection image acquired at the reference radiation source position Sc in which the optical axis X 0 of the X-rays from the X-ray source 16 is orthogonal to the radiation detector 15 .
  • the positional shift amount derivation unit 35 derives the positional shift amounts of the tomographic plane projection image GT 1 and the tomographic plane projection image GT 2 by the difference value Vf 1 ⁇ Vf 2 of the shift vectors Vf 1 and Vf 2 of the regions of interest R 1 and R 2 with respect to the region of interest Rf 0 .
  • the positional shift amount derivation unit 35 derives the positional shift amounts of the tomographic plane projection image GT 3 and the tomographic plane projection image GT 2 by the difference value Vf 3 ⁇ Vf 2 of the shift vectors Vf 3 and Vf 2 of the regions of interest R 3 and R 2 with respect to the region of interest Rf 0 .
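The bookkeeping of the second embodiment reduces to vector differences: given the temporary shift vectors Vf_i of each tomographic plane projection image measured against the region of interest Rf 0, the positional shift between image i and the reference image is Vf_i − Vf_ref. A small sketch (the tuple representation of the vectors is an assumption):

```python
# Second-embodiment bookkeeping: turn temporary shift vectors (measured
# against Rf0 on the tomographic image) into positional shifts between
# each tomographic plane projection image and the reference image.

def pairwise_shifts(temp_shifts, ref):
    """temp_shifts: list of (dx, dy) temporary shift vectors Vf_i;
    ref: index of the reference tomographic plane projection image.
    Returns Vf_i - Vf_ref for every i (zero for i == ref)."""
    vr = temp_shifts[ref]
    return [(v[0] - vr[0], v[1] - vr[1]) for v in temp_shifts]
```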
  • the temporary positional shift amounts of the regions of interest R 1 to R 3 set on the tomographic plane projection image GTi with respect to the region of interest Rf 0 set on the tomographic image Dj are derived, and the positional shift amount between the tomographic plane projection images GTi is derived based on the temporary positional shift amounts.
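The derivation above can be sketched in a few lines; the shift-vector values are hypothetical, and `positional_shift` is an illustrative helper, not a function named in the embodiment:

```python
import numpy as np

# Illustrative temporary positional shift amounts (shift vectors) of the
# regions of interest R1 to R3, each relative to the region of interest
# Rf0. The numeric values are hypothetical, not from the embodiment.
Vf = {1: np.array([0.0, 0.0]),    # Vf1: R1 matches Rf0
      2: np.array([0.0, 0.0]),    # Vf2: GT2 serves as the reference image
      3: np.array([1.5, -0.5])}   # Vf3: R3 is displaced by body movement

def positional_shift(i, ref=2):
    """Positional shift amount of GTi relative to the reference
    tomographic plane projection image, derived as the difference of
    the temporary shift vectors (e.g. Vf3 - Vf2 for GT3 vs. GT2)."""
    return Vf[i] - Vf[ref]
```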
  • the positional shift amount is derived while reducing the influence of the structures included on tomographic planes other than the tomographic plane in which the feature points are set.
  • the influence of the structures on other tomographic planes can be reduced, the positional shift amount between the plurality of projection images Gi can be accurately derived, and as a result, according to the second embodiment, a high-quality corrected tomographic image Dhj with reduced effects of body movement can be acquired.
  • a search range in a case of deriving the positional shift amount may be changed depending on at least one of a density of a mammary gland for the breast M, a size of the breast M, an imaging time of the tomosynthesis imaging, a compression pressure of the breast M in a case of the tomosynthesis imaging, or an imaging direction of the breast M.
  • the shift vectors Vf 1 to Vf 3 of the regions of interest R 1 to R 3 with respect to the region of interest Rf 0 are derived as a temporary positional shift amount, but in this case, the peripheral region Ra 0 that is smaller than the region of interest Rf 0 may be set around the feature point F 1 of the region of interest Rf 0 as shown in FIG. 19 , and the shift vector may be derived based on the peripheral region Ra 0 . In this case, the shift vector may be derived using only the peripheral region Ra 0 . Further, in a case of deriving the correlation between the regions of interest R 1 to R 3 , the peripheral region Ra 0 may be weighted larger than the regions other than the peripheral region Ra 0 in the regions of interest R 1 to R 3 .
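The weighting of the peripheral region Ra 0 can be sketched as a weighted matching score. The squared-difference score and the 2.0/1.0 weight mask are assumptions made for illustration; the embodiment itself speaks of a weighted correlation:

```python
import numpy as np

def weighted_match_score(ref_roi, cand_roi, weights):
    """Weighted sum of squared differences between the region of
    interest around the feature point and a candidate region; a lower
    score means a better match. Pixels inside the peripheral region
    Ra0 receive a larger weight than the rest of the region of
    interest."""
    return float(np.sum(weights * (ref_roi - cand_roi) ** 2))

# Hypothetical 5x5 weight mask: weight 2.0 inside the peripheral
# region Ra0 (here the central 3x3 block), 1.0 elsewhere.
weights = np.ones((5, 5))
weights[1:4, 1:4] = 2.0
```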
  • the region of interest Rf 0 is set in the tomographic image Dj, but the tomographic image to be generated may be different for each tomographic plane projection image GTi from which the temporary positional shift amount is derived. Specifically, it is preferable to generate the tomographic image excluding the target projection image corresponding to the target tomographic plane projection image for which the temporary positional shift amount is to be derived.
  • this case will be described as the third embodiment.
  • FIG. 20 is a diagram schematically showing a process performed in a third embodiment.
  • the temporary positional shift amount for the projection image G 1 is derived with the projection image G 1 as the target projection image among fifteen projection images G 1 to G 15 , and the tomographic plane projection image GT 1 as the target tomographic plane projection image.
  • the reconstruction unit 32 reconstructs the projection images G 2 to G 15 excluding the projection image G 1 on the tomographic plane Tj to generate a tomographic image (referred to as Dj_ 1 ).
  • the feature point detecting unit 33 detects the feature point from the tomographic image Dj_ 1 , the projection unit 34 generates the tomographic plane projection images GT 1 to GT 15 from the projection images G 1 to G 15 , and the positional shift amount derivation unit 35 sets the region of interest Rf 0 _ 1 on the tomographic image Dj_ 1 , and derives the shift vector Vf 1 of the region of interest R 1 set on the tomographic plane projection image GT 1 with respect to the region of interest Rf 0 _ 1 as the temporary positional shift amount.
  • the reconstruction unit 32 reconstructs the projection images G 1 , G 3 to G 15 excluding the projection image G 2 to generate the tomographic image (referred to as Dj_ 2 ).
  • the feature point detecting unit 33 detects the feature point from the tomographic image Dj_ 2
  • the projection unit 34 generates the tomographic plane projection images GT 1 to GT 15 from the projection images G 1 to G 15
  • the positional shift amount derivation unit 35 sets the region of interest Rf 0 _ 2 on the tomographic image Dj_ 2 , and derives the shift vector Vf 2 of the region of interest R 2 set on the tomographic plane projection image GT 2 with respect to the region of interest Rf 0 _ 2 as the temporary positional shift amount.
  • the target tomographic plane projection image is sequentially changed to derive the temporary positional shift amount for all of the tomographic plane projection images GTi, and as in the second embodiment, the positional shift amount between the tomographic plane projection images GTi is derived based on the temporary positional shift amount.
  • the temporary positional shift amount is derived using the tomographic image that is not affected by the target projection image. Accordingly, the temporary positional shift amount can be more accurately derived, and as a result, the positional shift amount can be accurately derived.
  • the tomographic image excluding the target projection image may be calculated, as shown in Equation (2) below, by subtracting the corresponding pixel value Gp of the target projection image from the pixel value Dp of each pixel of the tomographic image Dj generated by reconstructing all of the projection images Gi, and multiplying the subtracted pixel value by n/(n−1).
  • because Equation (2) is a simple method, the amount of calculation for generating the tomographic image excluding the target projection image can be reduced, and the processing for deriving the temporary positional shift amount can be performed at high speed.
  • Tomographic image excluding target projection image=(Dp−Gp)×n/(n−1)  (2)
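Equation (2) translates directly into code; `tomo_excluding_target` is an illustrative name:

```python
import numpy as np

def tomo_excluding_target(Dp, Gp, n):
    """Equation (2): pixel values of the tomographic image excluding
    the target projection image, obtained from the pixel value Dp of
    the tomographic image reconstructed from all n projection images
    and the corresponding pixel value Gp of the target projection
    image. Works elementwise on scalars or NumPy arrays."""
    return (Dp - Gp) * n / (n - 1)
```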
  • FIG. 21 is a diagram showing a schematic configuration of the tomographic image generating apparatus realized by installing a tomographic image generating program in the computer 2 in the fourth embodiment.
  • the same reference numbers as those in FIG. 3 are assigned to the same configurations as those in FIG. 3 , and detailed description thereof will be omitted here.
  • the fourth embodiment is different from the first embodiment in that the tomographic image generating apparatus further comprises a combining unit 37 that combines two or more tomographic images among the plurality of tomographic images, or at least one of the plurality of tomographic images and at least one of the plurality of projection images Gi to generate the composite two-dimensional image.
  • the combining unit 37 generates a composite two-dimensional image by using, for example, the method disclosed in JP2014-128716A.
  • the method disclosed in JP2014-128716A is a method in which two or more tomographic images among the plurality of tomographic images, or at least one of the plurality of tomographic images and at least one of the plurality of projection images Gi are projected in the depth direction in which the tomographic planes of the subject are arranged to generate the composite two-dimensional image.
  • the method of generating the composite two-dimensional image is not limited thereto.
  • the composite two-dimensional image may be generated by performing the minimum value projection method on two or more tomographic images among the plurality of tomographic images, or on at least one of the plurality of tomographic images and at least one of the plurality of projection images Gi, projected in the depth direction in which the tomographic planes of the subject are arranged.
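As a minimal sketch, the minimum value projection along the depth direction is a single reduction over the stack of tomographic images (random data stands in for a reconstructed volume):

```python
import numpy as np

# Stack of tomographic images Dj arranged along the depth direction:
# shape (n_planes, height, width).
tomo_stack = np.random.rand(5, 64, 64)

# Minimum value projection along the depth direction in which the
# tomographic planes of the subject are arranged.
composite_2d = tomo_stack.min(axis=0)
```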
  • the feature point detecting unit 33 first detects a two-dimensional feature point from the composite two-dimensional image. The detection of the two-dimensional feature point may be performed in the same manner as in the above embodiments. Then, the feature point detecting unit 33 detects the feature point corresponding to the two-dimensional feature point from the plurality of tomographic images Dj with reference to the depth map created in advance.
  • the depth map is a map in which each position on the composite two-dimensional image is associated with the depth information indicating the position of the tomographic plane corresponding to each position.
  • the depth map is created by using the method disclosed in WO2014/203531A in advance.
  • the composite two-dimensional image is divided into a plurality of local regions, and the correlation between each region obtained by the division and the plurality of tomographic images Dj is derived.
  • the composite two-dimensional image C 0 is divided into 6 ⁇ 8 local regions, and the correlation between the plurality of tomographic images Dj and each of the divided regions is obtained.
  • the depth map is created by associating the depth of the tomographic image Dj including the region with the largest correlation from the reference position of the tomographic plane with the position of each region.
  • the reference position need only be, for example, the contact surface of the breast M with the compression plate 17 .
  • the position of the tomographic plane Tj in a case of generating the tomographic image Dj is known. Therefore, by referring to the depth map, the position of the tomographic plane corresponding to each local region in the composite two-dimensional image C 0 can be specified.
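The depth map creation can be sketched as follows, assuming Pearson correlation as the correlation measure and a uniform grid of local regions; the actual measure of WO2014/203531A may differ:

```python
import numpy as np

def build_depth_map(composite, tomo_stack, grid=(6, 8)):
    """Divide the composite two-dimensional image C0 into grid local
    regions and, for each region, record the index of the tomographic
    image Dj with the largest correlation. Pearson correlation is an
    assumed stand-in for the correlation measure."""
    H, W = composite.shape
    gy, gx = grid
    hy, hx = H // gy, W // gx
    depth_map = np.zeros(grid, dtype=int)
    for r in range(gy):
        for c in range(gx):
            region = composite[r*hy:(r+1)*hy, c*hx:(c+1)*hx].ravel()
            corrs = [np.corrcoef(region,
                                 D[r*hy:(r+1)*hy, c*hx:(c+1)*hx].ravel())[0, 1]
                     for D in tomo_stack]
            depth_map[r, c] = int(np.argmax(corrs))
    return depth_map
```

Referring to the resulting map then amounts to a simple array lookup for the local region containing a two-dimensional feature point.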
  • the feature point detecting unit 33 identifies the tomographic plane of the detected two-dimensional feature point with reference to the depth map. Then, the feature point corresponding to the two-dimensional feature point is detected on the specified tomographic plane.
  • the plurality of tomographic images Dj have a large amount of information, so the amount of calculation for detecting the feature points is large.
  • the two-dimensional feature point is detected from the composite two-dimensional image C 0 , and the feature point corresponding to the two-dimensional feature point is detected from the plurality of tomographic images Dj with reference to the depth map. Therefore, in a case where the depth map is created in advance, the amount of calculation can be reduced and the feature point can be detected quickly.
  • the display controller 36 may display the composite two-dimensional image on the display unit 3 together with the corrected tomographic image.
  • the configuration of the tomographic image generating apparatus according to the fifth embodiment is the same as the configuration of the tomographic image generating apparatus according to the first embodiment shown in FIG. 3 , only the processing to be performed is different, and thus the detailed description of the apparatus is omitted.
  • the fifth embodiment is different from the first embodiment in that the corrected tomographic image Dhj is used as a new tomographic image, and feature point detection, tomographic plane projection image acquisition, positional shift amount derivation, and new corrected tomographic image generation are repeated.
  • FIG. 23 is a flowchart showing a process performed in the fifth embodiment.
  • the processing from step ST 11 to step ST 15 are the same as the processing from step ST 1 to step ST 5 shown in FIG. 16 , so detailed description thereof will be omitted here.
  • the positional shift amount derivation unit 35 determines whether the positional shift amount converges (step ST 16 ). The determination of whether the positional shift amount converges may be performed by determining whether the positional shift amount derived for each tomographic plane projection image GTi is equal to or smaller than a predetermined threshold Th 1 .
  • the threshold Th 1 may be set to a value at which it can be said that there is no influence of body movement on the tomographic image without correcting the positional shift amount any more.
  • the determination of whether the positional shift amount converges may be performed by determining whether the average value of the positional shift amounts derived for the plurality of tomographic plane projection image GTi is equal to or smaller than a predetermined threshold Th 1 . In a case of positive in step ST 16 , it is unnecessary to correct the positional shift amount, and thus the display controller 36 displays the tomographic image (step ST 17 ), and the processing is terminated.
  • in a case of negative in step ST 16 , the reconstruction unit 32 reconstructs the plurality of projection images Gi while correcting the positional shift amount, and thereby generates a corrected tomographic image Dhj as a new tomographic image (step ST 18 ).
  • the feature point detecting unit 33 detects the feature point from the plurality of new tomographic images
  • the projection unit 34 acquires a plurality of new tomographic plane projection images
  • the positional shift amount derivation unit 35 derives a new positional shift amount between the plurality of new tomographic plane projection images in step ST 15 , and determines whether the positional shift amount is equal to or smaller than the predetermined threshold Th 1 in step ST 16 .
  • the processing of step ST 18 and step ST 13 to step ST 15 is repeated until the determination in step ST 16 is positive. Also, in a case where the corrected tomographic image is generated as a new tomographic image, the tomographic image to be displayed in step ST 17 is the new tomographic image.
  • the derivation of the new positional shift amount based on the new tomographic image is repeated until the positional shift amount converges. Therefore, the positional shift due to the body movement can be removed more appropriately and efficiently, and it is possible to acquire a high-quality tomographic image in which the body movement is accurately corrected.
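The iteration of the fifth embodiment can be sketched as follows; `derive_shifts` and `reconstruct` are hypothetical stand-ins for the feature point detection, tomographic plane projection, positional shift derivation, and corrected reconstruction steps:

```python
import numpy as np

def iterate_correction(projections, derive_shifts, reconstruct,
                       th1=0.5, max_iter=10):
    """Repeat shift derivation and corrected reconstruction until the
    average positional shift amount is at or below the threshold Th1
    (the convergence check of step ST16)."""
    shifts = np.zeros(len(projections))
    tomo = reconstruct(projections, shifts)
    for _ in range(max_iter):
        new_shifts = derive_shifts(tomo, projections)
        if np.mean(np.abs(new_shifts)) <= th1:   # converged: stop correcting
            break
        shifts = shifts + new_shifts
        # Corrected tomographic image becomes the new tomographic image.
        tomo = reconstruct(projections, shifts)
    return tomo, shifts
```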
  • the derivation of the new positional shift based on the new tomographic image may be repeated until the positional shift amount converges as in the fifth embodiment.
  • the positional shift amount derived by the positional shift amount derivation unit 35 is compared with a predetermined threshold, and only in a case where the positional shift amount exceeds the threshold value, the tomographic image may be reconstructed while correcting the positional shift amount.
  • the threshold may be set to a value at which it can be said that there is no influence of body movement on the tomographic image without correcting the positional shift amount.
  • a warning display 45 for notifying that the body movement exceeds the threshold may be displayed on the display unit 3 . The operator can instruct whether to perform body movement correction by selecting YES or NO on the warning display 45 .
  • the regions of interest are set in the tomographic image Dj and the tomographic plane projection image GTi, and the movement direction and the movement amount of the region of interest is derived as the shift vector, that is, the positional shift amount and the temporary positional shift amount, but the present invention is not limited thereto.
  • the positional shift amount may be derived without setting the region of interest.
  • FIG. 25 is a diagram showing a schematic configuration of the tomographic image generating apparatus realized by installing a tomographic image generating program in the computer 2 in the sixth embodiment.
  • the same reference numbers as those in FIG. 3 are assigned to the same configurations as those in FIG. 3 , and detailed description thereof will be omitted here.
  • the sixth embodiment is different from the first embodiment in that the tomographic image generating apparatus according to the sixth embodiment further comprises a focal plane discrimination unit 38 that discriminates whether the corresponding tomographic plane corresponding to the tomographic image in which each of the plurality of feature points F is detected is a focal plane, and the positional shift amount derivation unit 35 derives the positional shift amount on the corresponding tomographic plane which is discriminated to be the focal plane.
  • the processing according to the sixth embodiment can be applied to the second to fifth embodiments, but only the case where the processing is applied to the first embodiment will be described here.
  • FIG. 26 is a diagram illustrating a ripple artifact.
  • the tomographic images corresponding to the tomographic planes above and below the tomographic image D 3 include the ripple artifact of the structure 48 .
  • the ripple artifact becomes wider and blurry as the distance from the tomographic plane including the structure 48 increases.
  • the range in which the ripple artifact spreads corresponds to the range in which the X-ray source 16 moves.
  • in a case where the feature point F detected by the feature point detecting unit 33 from the tomographic image Dj of the corresponding tomographic plane is the ripple artifact, the feature point F is blurred and spreads over a wide area. Therefore, in a case where such a feature point F is used, the positional shift amount cannot be derived accurately.
  • the focal plane discrimination unit 38 discriminates whether the corresponding tomographic plane in which the feature point F is detected is a focal plane
  • the projection unit 34 generates the tomographic plane projection image GTi on the corresponding tomographic plane which is discriminated to be the focal plane
  • the positional shift amount derivation unit 35 derives the positional shift amount.
  • the positional shift amount is derived using the feature point detected on the corresponding tomographic plane discriminated to be the focal plane.
  • the focal plane discrimination unit 38 derives the correspondence point corresponding to the feature point in the plurality of tomographic images for the feature point detected by the feature point detecting unit 33 .
  • FIG. 27 is a diagram illustrating the derivation of correspondence points. As shown in FIG. 27 , assuming that the feature point F 3 is detected in a certain tomographic image Dk, the focal plane discrimination unit 38 derives the correspondence points R 1 , R 2 , R 3 , R 4 , . . . corresponding to the feature point F 3 in the plurality of tomographic images positioned in the thickness direction of the tomographic image Dk.
  • the reference code of the correspondence point is R.
  • the correspondence point R need only be derived by registration of the region of interest including the feature point F 3 with the tomographic image other than the tomographic image Dk. Then, the focal plane discrimination unit 38 plots the pixel values of the feature point F 3 and the correspondence point R in the order in which the tomographic planes are arranged.
  • FIG. 28 is a diagram showing a result of plotting pixel values of the feature points and the correspondence points. As shown in FIG. 28 , the pixel values of the feature point and the correspondence point change so as to have a minimum value at the feature point due to the influence of the ripple artifact.
  • on the focal plane, the feature point F 3 is not blurred, and the brightness is high, that is, the pixel value is small.
  • on the tomographic planes other than the focal plane, the feature point F 3 is the ripple artifact, so that the feature point is blurred and the pixel value is larger than the minimum value.
  • the focal plane discrimination unit 38 discriminates that the corresponding tomographic plane in which the feature point F 3 is detected is the focal plane in a case where the position of the tomographic plane in which the feature point F 3 is detected is the position P 0 shown in FIG. 28 where the pixel value is the minimum in the result of plotting the pixel values of the feature point F 3 and the correspondence point.
  • the focal plane discrimination unit 38 discriminates that the corresponding tomographic plane in which the feature point F 3 is detected is not the focal plane in a case where the position of the tomographic plane where the feature point F 3 is detected is the position P 1 or the like shown in FIG. 28 where the pixel value is not the minimum.
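The discrimination rule above can be sketched directly from the plotted values; `is_focal_plane` is an illustrative helper:

```python
import numpy as np

def is_focal_plane(plotted_values, detected_plane):
    """The pixel values of the feature point F3 and its correspondence
    points, plotted in the order in which the tomographic planes are
    arranged, take their minimum on the focal plane (position P0).
    The plane in which the feature point was detected is discriminated
    to be the focal plane only if it is that minimum position."""
    return int(np.argmin(plotted_values)) == detected_plane
```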
  • the projection unit 34 generates the tomographic plane projection image GTi only on the corresponding tomographic plane discriminated to be the focal plane, as in the above embodiments.
  • the positional shift amount derivation unit 35 derives the positional shift amount of the tomographic plane projection image GTi on the corresponding tomographic plane discriminated to be the focal plane. That is, the positional shift amount derivation unit 35 derives the positional shift amount of the tomographic plane projection image GTi by using the feature point detected on the corresponding tomographic plane discriminated to be the focal plane.
  • FIG. 29 is a flowchart showing a process performed in the sixth embodiment.
  • the processing from step ST 21 to step ST 23 are the same as the processing from step ST 1 to step ST 3 shown in FIG. 16 , so detailed description thereof will be omitted here.
  • the focal plane discrimination unit 38 discriminates whether the corresponding tomographic plane corresponding to the tomographic image in which each of the plurality of feature points detected by the feature point detecting unit 33 is detected is the focal plane (focal plane discrimination; step ST 24 ).
  • the projection unit 34 generates the tomographic plane projection image GTi on the corresponding tomographic plane discriminated to be the focal plane (step ST 25 ), and the positional shift amount derivation unit 35 derives the positional shift amount by using the feature point detected in the corresponding tomographic plane discriminated to be the focal plane (step ST 26 ).
  • the reconstruction unit 32 reconstructs the plurality of projection images Gi while correcting the positional shift amount, and thereby generates a corrected tomographic image Dhj (step ST 27 ). Then, the display controller 36 displays the corrected tomographic image Dhj on the display unit 3 (step ST 28 ), and the processing is terminated.
  • the generated corrected tomographic image Dhj is transmitted to the external storage device (not shown) and stored.
  • the positional shift amount is derived on the corresponding tomographic plane discriminated to be the focal plane. Therefore, the positional shift amount can be derived accurately without being affected by the ripple artifact, and as a result, the corrected tomographic image Dhj in which the positional shift is accurately corrected can be generated.
  • discrimination is made as to whether the corresponding tomographic plane is the focal plane by using the plot results of the pixel values of the feature point and the correspondence point, but the discrimination of the focal plane is not limited thereto.
  • on the focal plane, the difference in contrast between the feature point and the peripheral pixels is larger. Therefore, the contrasts of the feature point and the correspondence point with the peripheral pixels are derived, and in a case where the contrast of the feature point is the maximum, the corresponding tomographic plane in which the feature point is detected may be discriminated to be the focal plane.
  • the pixel value of the position corresponding to the feature point in the projection image has a small variation between the projection images in a case where the feature point is on the focal plane, but in a case where the feature point is not on the focal plane, the projection image may represent the structure other than the structure corresponding to the feature point, so the variation between the projection images is large. Therefore, the variance value of the pixel value corresponding to the feature point between the projection images Gi is derived, and in a case where the variance value is equal to or smaller than a predetermined threshold, the corresponding tomographic plane in which the feature point is detected may be discriminated to be the focal plane.
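The variance-based variant can be sketched as follows; the threshold value is a hypothetical tuning parameter:

```python
import numpy as np

def on_focal_plane_by_variance(values_across_projections, threshold):
    """The pixel values at the position corresponding to the feature
    point vary little between the projection images Gi when the
    feature point lies on the focal plane, so a variance at or below
    the threshold indicates the focal plane."""
    return float(np.var(values_across_projections)) <= threshold
```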
  • the focal plane discrimination unit 38 may include a discriminator that is machine learned such that in a case where the pixel values of the feature point and surrounding of the feature point are input, the discrimination result is output as to whether the corresponding tomographic plane in which the feature point is detected is the focal plane.
  • the discriminator may discriminate whether the corresponding tomographic plane in which the feature point is detected is the focal plane.
  • FIG. 30 is a diagram showing a schematic configuration of the tomographic image generating apparatus realized by installing a tomographic image generating program in the computer 2 in the seventh embodiment.
  • the same reference numbers as those in FIG. 3 are assigned to the same configurations as those in FIG. 3 , and detailed description thereof will be omitted here.
  • the seventh embodiment is different from the first embodiment in that the tomographic image generating apparatus further comprises a positional shift amount determination unit 39 that performs image quality evaluation for a region of interest including the feature point in the corrected tomographic image Dhj, and determines whether the derived positional shift amount is appropriate or inappropriate based on a result of the image quality evaluation.
  • the processing according to the seventh embodiment can be applied to the second to sixth embodiments, but only the case where the processing is applied to the first embodiment will be described here.
  • the positional shift amount determination unit 39 sets, for the image quality evaluation, the regions of interest Rh 1 and Rh 2 centered on the coordinate positions of the plurality (here, two) of the feature points F 4 and F 5 included in the corrected tomographic image Dhj shown in FIG. 31 . Then, a high-frequency image is generated by extracting high-frequency components in each of the regions of interest Rh 1 and Rh 2 . The extraction of the high-frequency components need only be performed by the filtering processing using the Laplacian filter to generate a secondary differential image, but the present invention is not limited thereto.
  • the positional shift amount determination unit 39 derives the magnitudes of the high-frequency components of the regions of interest Rh 1 and Rh 2 .
  • the magnitude of the high-frequency components need only be derived by the sum of squares of the pixel value of the high-frequency image, but the present invention is not limited thereto.
  • the positional shift amount determination unit 39 derives the sum of the magnitudes of the high-frequency components of all of the regions of interest Rh 1 and Rh 2 .
  • the positional shift amount determination unit 39 performs the image quality evaluation based on the magnitude of the high-frequency components.
  • the positional shift amount determination unit 39 determines whether the sum of the magnitudes of the high-frequency components of all of the regions of interest Rh 1 and Rh 2 , which are derived as above, is equal to or larger than the predetermined threshold Th 2 . In a case where the sum is equal to or larger than the threshold Th 2 , the positional shift amount determination unit 39 determines that the positional shift amount is appropriate, and in a case where the sum is smaller than the threshold Th 2 , the positional shift amount determination unit 39 determines that the positional shift amount is inappropriate. In a case where the positional shift amount determination unit 39 determines that the positional shift amount is inappropriate, the display controller 36 displays the tomographic image Dj before correction on the display unit 3 instead of the corrected tomographic image Dhj. In this case, instead of the corrected tomographic image Dhj, the tomographic image Dj before correction is transmitted to the external storage device.
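The image quality evaluation of the seventh embodiment can be sketched as follows, assuming the standard 3×3 Laplacian kernel for the secondary differential image; the function names are illustrative:

```python
import numpy as np

LAPLACIAN = np.array([[0.,  1., 0.],
                      [1., -4., 1.],
                      [0.,  1., 0.]])

def high_freq_magnitude(roi):
    """Laplacian filtering of a region of interest (secondary
    differential image), followed by the sum of squares of the pixel
    values of the resulting high-frequency image."""
    H, W = roi.shape
    out = np.zeros((H - 2, W - 2))
    for dy in range(3):          # valid-mode 3x3 correlation
        for dx in range(3):
            out += LAPLACIAN[dy, dx] * roi[dy:dy + H - 2, dx:dx + W - 2]
    return float(np.sum(out ** 2))

def shift_amount_is_appropriate(rois, th2):
    """The derived positional shift amount is determined appropriate
    when the sum of the high-frequency magnitudes of all regions of
    interest is equal to or larger than the threshold Th2."""
    return sum(high_freq_magnitude(r) for r in rois) >= th2
```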
  • FIG. 32 is a flowchart showing a process performed in the seventh embodiment.
  • the processing from step ST 31 to step ST 36 are the same as the processing from step ST 1 to step ST 6 shown in FIG. 16 , so detailed description thereof will be omitted here.
  • the positional shift amount determination unit 39 performs image quality evaluation for a region of interest including the feature point in the corrected tomographic image Dhj, and determines whether the derived positional shift amount is appropriate or inappropriate based on a result of the image quality evaluation (step ST 37 ).
  • the display controller 36 displays the corrected tomographic image Dhj on the display unit 3 (step ST 38 ), and the processing is terminated.
  • the generated corrected tomographic image Dhj is transmitted to the external storage device (not shown) and stored.
  • the display controller 36 displays the tomographic image Dj on the display unit 3 (step ST 39 ), and the processing is terminated. In this case, the tomographic image Dj is transmitted to the external storage device (not shown) and stored.
  • the positional shift amount is derived by the positional shift amount derivation unit 35
  • an appropriate positional shift amount may not be derived due to the influence of the structure other than the feature point.
  • the image quality evaluation is performed on the corrected tomographic image Dhj, and the determination is made as to whether the positional shift amount is appropriate or inappropriate based on the result of the image quality evaluation. Therefore, it is possible to appropriately determine whether the derived positional shift amount is appropriate or inappropriate.
  • the tomographic image Dj before correction is displayed or stored in a case where the determination is made that the positional shift amount is inappropriate, it is possible to reduce the possibility of making an erroneous diagnosis due to the corrected tomographic image Dhj generated based on the inappropriate positional shift amount.
  • the image quality evaluation is performed based on the magnitude of the high-frequency components of the region of interest set in the corrected tomographic image Dhj, but the present invention is not limited thereto.
  • the positional shift amount determination unit 39 may perform the image quality evaluation for the region of interest including the feature point in the tomographic image Dj, compare the result of the image quality evaluation for the corrected tomographic image Dhj with a result of the image quality evaluation for the tomographic image Dj, and decide the tomographic image with a better result of the image quality evaluation as a final tomographic image.
  • the final tomographic image is the tomographic image that is displayed on the display unit 3 , or transmitted and stored in the external device.
  • the derivation of the new positional shift based on the new tomographic image may be repeated until the positional shift amount converges as in the fifth embodiment.
  • the positional shift amount derived by the positional shift amount derivation unit 35 is compared with a predetermined threshold, and only in a case where the positional shift amount exceeds the threshold value, the tomographic image may be reconstructed while correcting the positional shift amount.
  • FIG. 33 is a diagram showing a schematic configuration of the tomographic image generating apparatus realized by installing a tomographic image generating program in the computer 2 in the eighth embodiment.
  • the same reference numbers as those in FIG. 3 are assigned to the same configurations as those in FIG. 3 , and detailed description thereof will be omitted here.
  • the eighth embodiment is different from the first embodiment in that the tomographic image generating apparatus further comprises an evaluation function derivation unit 50 that derives an evaluation function for performing image quality evaluation for a region of interest including the feature point in the corrected tomographic image Dhj, and the positional shift amount derivation unit 35 derives the positional shift amount for optimizing the evaluation function.
  • the processing according to the eighth embodiment can be applied to the second to sixth embodiments, but only the case where the processing is applied to the first embodiment will be described here.
  • the evaluation function derivation unit 50 generates the high-frequency image for the region of interest corresponding to the feature point F, which is set with respect to the tomographic plane projection image GTi by the positional shift amount derivation unit 35 .
  • the high-frequency image need only be generated, as in the positional shift amount determination unit 39 according to the seventh embodiment, by filtering processing using a Laplacian filter to obtain a second-order differential image.
  • the pixel value of the derived high-frequency image in the region of interest is denoted by qkl, where k indexes the k-th projection image and l indexes the pixels in the region of interest.
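As a sketch of the high-frequency image generation just described: the text does not specify the Laplacian kernel, so the common 4-neighbour kernel is assumed here, and the function name is illustrative. The resulting pixel values play the role of qkl for one projection image:

```python
import numpy as np

# 4-neighbour Laplacian kernel (an assumed choice; the patent only
# says "Laplacian filter").
LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]], dtype=float)

def high_frequency_roi(image, top, left, size):
    """Second-order differential (high-frequency) image of a square
    region of interest, with edge replication at the ROI border."""
    roi = image[top:top + size, left:left + size]
    padded = np.pad(roi, 1, mode="edge")
    out = np.zeros_like(roi, dtype=float)
    for i in range(roi.shape[0]):
        for j in range(roi.shape[1]):
            out[i, j] = (padded[i:i + 3, j:j + 3] * LAPLACIAN).sum()
    return out
```

On a flat region the response is zero; only edges and fine structure survive, which is what makes the summed magnitude usable as a sharpness measure.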
  • the transformation matrix for correcting the positional shift amount is denoted by Wk, the transformation parameter in the transformation matrix is denoted by ⁇ k, and this transformation parameter ⁇ k corresponds to the positional shift amount.
  • the image quality evaluation value of the region of interest corresponding to the feature point F in the corrected tomographic image Dhj can be regarded as the sum of the magnitudes of the high-frequency image of the region of interest after positional shift correction in each of the projection images Gi.
  • the evaluation function derivation unit 50 derives the evaluation function shown in Equation (3) below.
  • the evaluation function Ec shown in Equation (3) attaches a minus sign to the value in parentheses on the right side, so that the transformation parameter ⁇ k that minimizes Ec maximizes the above addition result.
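Equation (3) itself did not survive extraction. From the surrounding description (a minus sign applied to the summed high-frequency magnitudes qkl after transformation by Wk), one plausible reconstruction is the following, offered as an assumption; \(\theta_k\) stands in for the transformation parameter whose original symbol is unrecoverable here:

```latex
E_c = -\left( \sum_{k} \sum_{l} \bigl| q_{kl}(\theta_k) \bigr| \right)
```

Minimizing \(E_c\) over the \(\theta_k\) then maximizes the total high-frequency magnitude, i.e., the sharpness of the region of interest.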
  • the evaluation function shown in Equation (3) has a plurality of local solutions. Therefore, a constraint condition is applied to the range and the average value of the transformation parameter ⁇ k; for example, a constraint condition is applied such that the average of the transformation parameters ⁇ k over all of the projection images is 0. More specifically, in a case where the transformation parameter ⁇ k is a movement vector representing parallel movement, a constraint condition is applied in which the average value of the movement vectors over all of the projection images Gi is set to 0. Then, in the eighth embodiment, the positional shift amount derivation unit 35 derives the transformation parameter ⁇ k that minimizes the evaluation function Ec shown in Equation (3), that is, the positional shift amount.
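Under the toy assumption that the ROI high-frequency magnitudes have been precomputed for a small set of candidate integer shifts per projection, the constrained minimization of Ec can be illustrated by exhaustive search. All names here are illustrative, not from the patent:

```python
from itertools import product

def evaluate_Ec(shifts, hf_stacks):
    # Ec: minus the summed high-frequency magnitudes after shifting,
    # following the sign convention described for Equation (3).
    return -sum(stack[s] for stack, s in zip(hf_stacks, shifts))

def search_shifts(hf_stacks, center):
    """Exhaustive search for integer shift indices minimizing Ec under
    the zero-mean constraint (average shift over all projections is 0).

    hf_stacks[k][s] is a precomputed ROI high-frequency magnitude for
    projection k at candidate shift index s; `center` is the index that
    corresponds to zero shift."""
    n = len(hf_stacks)
    best, best_ec = None, float("inf")
    for cand in product(range(len(hf_stacks[0])), repeat=n):
        if sum(c - center for c in cand) != 0:  # enforce zero average shift
            continue
        ec = evaluate_Ec(cand, hf_stacks)
        if ec < best_ec:
            best, best_ec = cand, ec
    return best, best_ec
```

A real implementation would use a continuous optimizer with the same equality constraint rather than exhaustive search; this only shows how the constraint excludes the degenerate solutions.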
  • the tomographic image generating apparatus further comprises an evaluation function derivation unit 50 that derives an evaluation function for performing image quality evaluation for a region of interest including the feature point in the corrected tomographic image Dhj, and the positional shift amount derivation unit 35 derives the positional shift amount for optimizing the evaluation function. Therefore, it is possible to reduce the possibility that an erroneous diagnosis is made by the corrected tomographic image Dhj generated based on the inappropriate positional shift amount.
  • the regions of interest are set in the tomographic image Dj and the tomographic plane projection image GTi, and the movement direction and the movement amount of the region of interest are derived as the shift vector, that is, the positional shift amount and the temporary positional shift amount; however, the present invention is not limited thereto.
  • the positional shift amount may be derived without setting the region of interest.
  • the tomographic plane projection image GTi is acquired by the projection unit 34, and the positional shift amount between the tomographic plane projection images GTi is derived by the positional shift amount derivation unit 35, but the present invention is not limited thereto.
  • the positional shift amount between the projection images Gi may be derived without acquiring the tomographic plane projection image GTi.
  • in this case, the projection unit 34 is unnecessary in each of the above embodiments.
  • the positional shift amount derivation unit 35 need only derive the positional shift amount based on the positional relationship of the projection images Gi on the corresponding tomographic plane corresponding to the tomographic image in which the feature point F is detected.
  • the subject is the breast M, but the present invention is not limited thereto; needless to say, any part of the human body, such as the chest or the abdomen, may be the subject.
  • various processors shown below can be used as the hardware structures of processing units that execute various kinds of processing, such as the image acquisition unit 31 , the reconstruction unit 32 , the feature point detecting unit 33 , the projection unit 34 , the positional shift amount derivation unit 35 , the display controller 36 , the combining unit 37 , the focal plane discrimination unit 38 , the positional shift amount determination unit 39 , and the evaluation function derivation unit 50 .
  • the various processors include not only the above-described CPU, which is a general-purpose processor that executes software (program) and functions as various processing units, but also a programmable logic device (PLD) that is a processor whose circuit configuration can be changed after manufacture, such as a field programmable gate array (FPGA), and a dedicated electric circuit that is a processor having a circuit configuration that is designed for exclusive use in order to execute specific processing, such as an application specific integrated circuit (ASIC).
  • One processing unit may be configured by one of the various processors, or may be a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs or a combination of a CPU and an FPGA). Alternatively, a plurality of processing units may be configured by one processor.
  • As an example of configuring a plurality of processing units by one processor, first, as represented by a computer, such as a client and a server, there is a form in which one processor is configured by a combination of one or more CPUs and software and this processor functions as a plurality of processing units. Second, as represented by a system on chip (SoC) or the like, there is a form of using a processor for realizing the function of the entire system including a plurality of processing units with one integrated circuit (IC) chip.
  • various processing units are configured by one or more of the above-described various processors as a hardware structure.
  • more specifically, the hardware structure of these various processors is circuitry in which circuit elements such as semiconductor elements are combined.

US17/169,564 2018-09-27 2021-02-08 Tomographic image generating apparatus, tomographic image generating method, and tomographic image generating program Active 2041-01-17 US11961165B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018182724 2018-09-27
JP2018-182724 2018-09-27
PCT/JP2019/038261 WO2020067475A1 (ja) 2018-09-27 2019-09-27 Tomographic image generating apparatus, method, and program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/038261 Continuation WO2020067475A1 (ja) 2018-09-27 2019-09-27 Tomographic image generating apparatus, method, and program

Publications (2)

Publication Number Publication Date
US20210166443A1 (en) 2021-06-03
US11961165B2 (en) 2024-04-16

Family

ID=69949828

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/169,564 Active 2041-01-17 US11961165B2 (en) 2018-09-27 2021-02-08 Tomographic image generating apparatus, tomographic image generating method, and tomographic image generating program

Country Status (4)

Country Link
US (1) US11961165B2 (ja)
EP (1) EP3858244A4 (ja)
JP (2) JP7105314B2 (ja)
WO (2) WO2020066109A1 (ja)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022148638A (ja) 2021-03-24 2022-10-06 FUJIFILM Corporation Image processing apparatus, radiographic imaging system, image processing method, and image processing program
JP2022148637A (ja) 2021-03-24 2022-10-06 FUJIFILM Corporation Image processing apparatus, radiographic imaging system, image processing method, and image processing program
WO2023171073A1 (ja) 2022-03-08 2023-09-14 FUJIFILM Corporation Image processing apparatus, method, and program

Citations (11)

Publication number Priority date Publication date Assignee Title
US20090123052A1 (en) 2002-11-27 2009-05-14 Chris Ruth System and Method for Generating a 2D Image from a Tomosynthesis Data Set
US20160081645A1 (en) 2014-09-19 2016-03-24 Fujifilm Corporation Tomographic image generation device and method, and recording medium
US20160081644A1 (en) 2014-09-19 2016-03-24 Fujifilm Corporation Tomographic image generation device and method, and recording medium
US20160095563A1 (en) 2013-06-21 2016-04-07 Fujifilm Corporation Image display device, image display method and image display program
JP2016064119A (ja) 2014-09-19 2016-04-28 FUJIFILM Corporation Tomographic image generating apparatus, method, and program
JP2016064118A (ja) 2014-09-19 2016-04-28 FUJIFILM Corporation Tomographic image generating apparatus, method, and program
JP2018126969A (ja) 2017-02-10 2018-08-16 Toyo Seikan Co., Ltd. Printing plate unit and printing apparatus using the same
JP2018182724A (ja) 2017-04-07 2018-11-15 Panasonic Intellectual Property Corporation of America Unauthorized communication detection method, unauthorized communication detection system, and program
EP3586750A1 (en) 2018-06-25 2020-01-01 Fujifilm Corporation Imaging control device, imaging control method, and imaging control program
EP3590431A1 (en) 2018-07-03 2020-01-08 Fujifilm Corporation Image display device, image display method, and image display program
EP3629295A1 (en) 2018-09-27 2020-04-01 FUJIFILM Corporation Tomographic image generation apparatus, method, and program

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP2012075862A (ja) * 2010-09-08 2012-04-19 Fujifilm Corp Body motion detection apparatus, method, and program
JP2015188604A (ja) * 2014-03-28 2015-11-02 FUJIFILM Corporation Radiographic imaging apparatus, method, and program
JP6556005B2 (ja) * 2015-09-29 2019-08-07 FUJIFILM Corporation Tomographic image generating apparatus, method, and program

Patent Citations (14)

Publication number Priority date Publication date Assignee Title
US20090123052A1 (en) 2002-11-27 2009-05-14 Chris Ruth System and Method for Generating a 2D Image from a Tomosynthesis Data Set
JP2014128716A (ja) 2008-11-21 2014-07-10 Hologic Inc System and method for generating a 2D image from a tomosynthesis data set
US20160095563A1 (en) 2013-06-21 2016-04-07 Fujifilm Corporation Image display device, image display method and image display program
JP2016064118A (ja) 2014-09-19 2016-04-28 FUJIFILM Corporation Tomographic image generating apparatus, method, and program
US20160081644A1 (en) 2014-09-19 2016-03-24 Fujifilm Corporation Tomographic image generation device and method, and recording medium
JP2016064119A (ja) 2014-09-19 2016-04-28 FUJIFILM Corporation Tomographic image generating apparatus, method, and program
US20160081645A1 (en) 2014-09-19 2016-03-24 Fujifilm Corporation Tomographic image generation device and method, and recording medium
JP2018126969A (ja) 2017-02-10 2018-08-16 Toyo Seikan Co., Ltd. Printing plate unit and printing apparatus using the same
JP2018182724A (ja) 2017-04-07 2018-11-15 Panasonic Intellectual Property Corporation of America Unauthorized communication detection method, unauthorized communication detection system, and program
US20190149561A1 (en) 2017-04-07 2019-05-16 Panasonic Intellectual Property Corporation Of America Unauthorized communication detection method, unauthorized communication detection system, and non-transitory computer-readable recording medium storing a program
EP3586750A1 (en) 2018-06-25 2020-01-01 Fujifilm Corporation Imaging control device, imaging control method, and imaging control program
JP2020000313A (ja) 2018-06-25 2020-01-09 FUJIFILM Corporation Imaging control device, method, and program
EP3590431A1 (en) 2018-07-03 2020-01-08 Fujifilm Corporation Image display device, image display method, and image display program
EP3629295A1 (en) 2018-09-27 2020-04-01 FUJIFILM Corporation Tomographic image generation apparatus, method, and program

Non-Patent Citations (6)

Title
English language translation of the Office Action dated Mar. 29, 2022, issued by the JPO in Japanese Patent Application No. 2020-549461, which corresponds to the instant patent application.
Extended European Search Report dated Oct. 19, 2021, issued in corresponding EP Patent Application No. 19867935.9.
International Search Report issued in International Application No. PCT/JP2019/038261 dated Dec. 17, 2019.
Office Action dated Feb. 16, 2024, issued by the EPO in corresponding EP Patent Application No. 19867935.9.
Office Action dated May 3, 2023, issued by the EPO in corresponding EP Patent Application No. 19867935.9.
Written Opinion of the ISA issued in International Application No. PCT/JP2019/038261 dated Dec. 17, 2019.

Also Published As

Publication number Publication date
EP3858244A4 (en) 2021-11-17
US20210166443A1 (en) 2021-06-03
JP7275363B2 (ja) 2023-05-17
EP3858244A1 (en) 2021-08-04
JP7105314B2 (ja) 2022-07-22
WO2020066109A1 (ja) 2020-04-02
JP2022125356A (ja) 2022-08-26
JPWO2020067475A1 (ja) 2021-08-30
WO2020067475A1 (ja) 2020-04-02

Similar Documents

Publication Publication Date Title
US11961165B2 (en) Tomographic image generating apparatus, tomographic image generating method, and tomographic image generating program
US10278660B2 (en) Medical imaging apparatus and method for displaying a selected region of interest
US7433507B2 (en) Imaging chain for digital tomosynthesis on a flat panel detector
US11154257B2 (en) Imaging control device, imaging control method, and imaging control program
CN109419526B (zh) 用于数字乳房断层合成中的运动评估和校正的方法和系统
US12036051B2 (en) Tomosynthesis imaging support apparatus, method, and program
EP3629295B1 (en) Tomographic image generation apparatus, method, and program
US10111626B2 (en) X-ray CT apparatus
US10898145B2 (en) Image display device, image display method, and image display program
US11170541B2 (en) Depth map creation apparatus that creates a plurality of depth maps on the basis of a plurality of spatial frequency components and plurality of tomographic images
US20210393226A1 (en) Image processing apparatus, image processing method, and image processing program
US11484275B2 (en) Image processing apparatus, method, and program
US11417035B2 (en) X-ray tomosynthesis apparatus, image processing apparatus, and program
US11259766B2 (en) Image processing apparatus, image processing method, and image processing program
US11224397B2 (en) Imaging control device, imaging control method, and imaging control program

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: FUJIFILM CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MORITA, JUNYA;REEL/FRAME:055326/0817

Effective date: 20210105

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE