WO2024048374A1 - Image processing device, photographing system, image processing method, and program - Google Patents

Image processing device, photographing system, image processing method, and program

Info

Publication number
WO2024048374A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
substance
images
substance discrimination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2023/030194
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
律也 富田
治 嵯峨野
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Publication of WO2024048374A1
Priority: US19/059,419 (published as US20250191253A1)
Anticipated expiration
Current legal status: Ceased


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00: Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/42: Arrangements for detecting radiation specially adapted for radiation diagnosis
    • A61B 6/4208: Arrangements characterised by using a particular type of detector
    • A61B 6/4241: Arrangements using energy resolving detectors, e.g. photon counting
    • A61B 6/02: Arrangements for diagnosis sequentially in different planes; stereoscopic radiation diagnosis
    • A61B 6/03: Computed tomography [CT]
    • A61B 6/032: Transmission computed tomography [CT]
    • A61B 6/46: Arrangements for interfacing with the operator or the patient
    • A61B 6/461: Displaying means of special interest
    • A61B 6/463: Displaying means characterised by displaying multiple images or images and diagnostic data on one display
    • A61B 6/48: Diagnostic techniques
    • A61B 6/482: Diagnostic techniques involving multiple energy imaging
    • A61B 6/52: Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/5211: Devices involving processing of medical diagnostic data
    • A61B 6/5217: Devices extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B 6/5229: Devices combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B 6/5235: Devices combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01T: MEASUREMENT OF NUCLEAR OR X-RADIATION
    • G01T 1/00: Measuring X-radiation, gamma radiation, corpuscular radiation, or cosmic radiation
    • G01T 1/16: Measuring radiation intensity
    • G01T 1/24: Measuring radiation intensity with semiconductor detectors
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [two-dimensional] image generation
    • G06T 11/003: Reconstruction from projections, e.g. tomography
    • G06T 11/005: Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
    • G06T 11/008: Specific post-processing after tomographic reconstruction, e.g. voxelisation, metal artifact correction
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0012: Biomedical image inspection
    • G06T 2200/24: Indexing scheme involving graphical user interfaces [GUIs]
    • G06T 2207/10081: Image acquisition modality: computed X-ray tomography [CT]
    • G06T 2207/30004: Subject of image: biomedical image processing
    • G06T 2210/41: Indexing scheme for image generation or computer graphics: medical
    • G06T 2210/62: Indexing scheme for image generation or computer graphics: semi-transparency

Definitions

  • the present disclosure relates to an image processing device, a photographing system, an image processing method, and a program.
  • a photon counting type radiation detector is known as a radiation detector used in a radiation imaging system such as an X-ray CT (Computed Tomography) device.
  • a photon counting type X-ray detector captures each incident X-ray as a photon and measures the intensity of the X-ray by counting the number of photons.
  • In a photon counting X-ray detector, when X-ray photons are converted into electric charges, an amount of charge corresponding to the energy of each X-ray photon is generated, so the energy of each X-ray photon can be measured. Therefore, a photon counting X-ray detector can also measure the energy spectrum of the X-rays.
  • Patent Document 1 discloses that an image displaying the results of substance discrimination obtained by photon counting CT is displayed on a display unit.
  • one embodiment of the present disclosure provides an image processing device that can display a radiation intensity image and a substance discrimination image obtained using photon counting technology in a manner that allows easy comparison.
  • An image processing device according to one embodiment of the present disclosure includes an acquisition unit that acquires a radiation intensity image obtained by photographing a subject using radiation and a substance discrimination image that shows a discriminated substance and is obtained by photographing the subject while counting photons of radiation, and a display control unit that causes a display unit to display the radiation intensity image and the substance discrimination image side by side, in a switched manner, or in a superimposed manner.
  • FIG. 1 schematically shows an example of a configuration of a CT system according to a first embodiment.
  • An example of data of a plurality of energy bands according to Example 1 is shown.
  • A flowchart of a series of processes according to the first embodiment is shown.
  • An example of a radiation intensity image according to Example 1 is shown.
  • An example of a radiation substance discrimination image according to Example 1 is shown.
  • An example of a radiation substance discrimination image according to Example 1 is shown.
  • An example of a display screen according to Example 1 is shown.
  • Another example of the display screen according to the first embodiment is shown.
  • Another example of the display screen according to the first embodiment is shown.
  • Another example of the display screen according to the first embodiment is shown.
  • Another example of the display screen according to the first embodiment is shown.
  • Another example of the display screen according to the first embodiment is shown.
  • Another example of the display screen according to the first embodiment is shown.
  • Another example of the display screen according to the first embodiment is shown.
  • An example of the configuration of a CT system according to a second embodiment is schematically shown.
  • An example of a machine learning model according to Example 2 is shown.
  • A flowchart of a series of processes according to the second embodiment is shown.
  • An example of a display screen according to Example 2 is shown.
  • An example of a machine learning model according to modification 5 is shown.
  • An example of a machine learning model according to modification 5 is shown.
  • Another example of the machine learning model according to Modification 5 is shown.
  • Another example of the machine learning model according to Modification 5 is shown.
  • the term radiation can include, for example, electromagnetic radiation such as X-rays and gamma rays, and particle radiation such as alpha rays, beta rays, particle beams, proton beams, heavy ion beams, and meson beams.
  • photons such as X-rays and γ-rays, and particles such as α-rays and β-rays may be collectively referred to as radiation photons.
  • In the present disclosure, an image in which substances are discriminated using data obtained by radiography is referred to as a substance discrimination image.
  • In addition, a fluoroscopic image such as a CT image or a radiographic image obtained by a CT system, a radiation imaging device, or the like, in which substance discrimination is not performed, is referred to as a radiation intensity image.
  • a still image will be described as an image to which the present disclosure is applied, but the image to which the present disclosure is applied may be a moving image.
  • an imaging system using CT will be described as an example of a radiation imaging system, but the imaging system according to the present disclosure is not limited to this.
  • For example, the imaging system may be a DR (Digital Radiography) system, a radiography system using an FPD (Flat Panel Detector), a PET (Positron Emission Tomography) system, or a SPECT (Single Photon Emission Computed Tomography) system.
  • the radiographic system described above may be used as a radiological diagnostic apparatus.
  • a photographing system for photographing a human body as a subject in the medical field or the like will be described.
  • the present disclosure can also be applied to, for example, a photographing system that photographs a product or the like as a subject for non-destructive inspection in the industrial field or the like.
  • the CT system 1 is a system that can perform photon counting CT.
  • In photon counting CT, a CT image with a high signal-to-noise ratio can be reconstructed by counting the radiation that has passed through a subject using a photon-counting radiation detector that can count photons of radiation.
  • FIG. 1 schematically shows an example of the configuration of a CT system 1 using X-rays according to the first embodiment.
  • the CT system 1 according to this embodiment is provided with a gantry device 10, a bed device 20, and an image processing device 30.
  • the rotation axis of the rotation frame 13 in a non-tilted state or the longitudinal direction of the top plate 23 of the bed device 20 is defined as the Z-axis direction.
  • the axial direction that is orthogonal to the Z-axis direction and horizontal to the floor surface is defined as the X-axis direction.
  • the axial direction that is perpendicular to the Z-axis direction and perpendicular to the floor surface is defined as the Y-axis direction.
  • FIG. 1 depicts the gantry apparatus 10 from multiple directions for explanation, and shows a case where the CT system 1 has one gantry apparatus 10.
  • The gantry device 10 includes an X-ray tube 11, an X-ray detector 12, a rotating frame 13, an X-ray high voltage device 14, a control device 15, a wedge 16, a collimator 17, and a DAS (Data Acquisition System) 18.
  • the X-ray tube 11 is a vacuum tube that has a cathode (filament) that generates thermoelectrons and an anode (target) that generates X-rays upon collision with the thermoelectrons.
  • The X-ray tube 11 generates X-rays for irradiating the subject S when a high voltage is applied from the X-ray high voltage device 14 and thermoelectrons are emitted from the cathode toward the anode.
  • Examples of the X-ray tube 11 include a rotating anode type X-ray tube that generates X-rays by irradiating a rotating anode with thermoelectrons.
  • the rotating frame 13 is an annular frame that supports the X-ray tube 11 and the X-ray detector 12 so as to face each other, and allows the X-ray tube 11 and the X-ray detector 12 to be rotated by the control device 15.
  • the rotating frame 13 may be cast from aluminum.
  • the rotating frame 13 can also support an X-ray high voltage device 14, a wedge 16, a collimator 17, a DAS 18, and the like.
  • the rotating frame 13 can also support various configurations not shown.
  • The wedge 16 is a filter for adjusting the amount of X-rays irradiated from the X-ray tube 11. Specifically, the wedge 16 is a filter that transmits and attenuates the X-rays emitted from the X-ray tube 11 so that the X-rays irradiated onto the subject S have a predetermined distribution.
  • the wedge 16 is a wedge filter or a bow-tie filter, and may be a filter made of aluminum or the like so as to have a predetermined target angle and a predetermined thickness.
  • the collimator 17 is a lead plate or the like for narrowing down the irradiation range of the X-rays that have passed through the wedge 16, and forms a slit by combining a plurality of lead plates or the like. Note that the collimator 17 is sometimes called an X-ray diaphragm. Further, although FIG. 1 shows a case where the wedge 16 is disposed between the X-ray tube 11 and the collimator 17, the collimator 17 may be disposed between the X-ray tube 11 and the wedge 16. In this case, the wedge 16 transmits and attenuates the X-rays irradiated from the X-ray tube 11 and whose irradiation range is limited by the collimator 17.
  • The X-ray high-voltage device 14 includes an electric circuit such as a transformer and a rectifier, a high-voltage generator that generates a high voltage to be applied to the X-ray tube 11, and an X-ray control device that controls the output voltage according to the X-rays to be irradiated by the X-ray tube 11.
  • the high voltage generator may be of a transformer type or an inverter type. Note that the X-ray high voltage device 14 may be provided on the rotating frame 13 or may be provided on a fixed frame (not shown).
  • the control device 15 includes a processing circuit including a CPU (Central Processing Unit), and a drive mechanism such as a motor and an actuator.
  • The control device 15 controls the operation of the gantry device 10 and the bed device 20 in response to input signals from the input section 308.
  • the control device 15 controls the rotation of the rotating frame 13, the tilt of the gantry device 10, the operation of the bed device 20 and the top plate 23, and the like.
  • the control device 15 rotates the rotation frame 13 about an axis parallel to the X-axis direction based on input inclination angle (tilt angle) information.
  • the control device 15 may be provided in the gantry device 10 or in the image processing device 30.
  • the X-ray detector 12 outputs a signal that can measure the energy value of the X-ray photon every time the X-ray photon is incident.
  • the X-ray photon is, for example, an X-ray photon irradiated from the X-ray tube 11 and transmitted through the subject S.
  • the X-ray detector 12 has a plurality of detection elements that output one pulse of an electric signal (analog signal) every time an X-ray photon is incident. Therefore, by counting the number of electrical signals (pulses) output from each detection element, it is possible to count the number of X-ray photons that have entered each detection element. Furthermore, by performing arithmetic processing on this signal, it is possible to measure the energy value of the X-ray photon that caused the output of the signal.
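  • As an illustrative sketch of this pulse-counting behavior (not part of the disclosed embodiment; the signal shape, threshold, and sample values below are assumptions), the number of X-ray photons can be counted from a digitized detector signal by counting threshold crossings:

```python
import numpy as np

def count_photons(signal, threshold):
    """Count X-ray photons from a digitized detector signal.

    A photon is counted at each rising edge where the signal crosses
    `threshold`, mimicking the one-pulse-per-photon output described
    for the detection elements. `signal` is a 1-D array of samples.
    """
    above = signal > threshold
    # Rising edges: sample is above threshold, previous sample was not.
    rising = above[1:] & ~above[:-1]
    return int(np.count_nonzero(rising))

# Example: three idealized pulses riding on baseline noise.
rng = np.random.default_rng(0)
sig = rng.normal(0.0, 0.05, 300)
for start in (50, 150, 250):
    sig[start:start + 5] += 1.0   # idealized charge pulses
print(count_photons(sig, threshold=0.5))  # -> 3
```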
  • the above-mentioned detection element is, for example, a semiconductor detection element such as CdTe (Cadmium Telluride) or CdZnTe (Cadmium Zinc Telluride), in which electrodes are arranged.
  • the X-ray detector 12 is a direct conversion type detector that directly converts incident X-ray photons into electrical signals.
  • the X-ray detector 12 is not limited to a direct conversion type detector, and may be a detector that first converts X-ray photons into visible light using a scintillator or the like, and then converts the visible light into an electrical signal using an optical sensor or the like.
  • An indirect conversion type detector may also be used.
  • the X-ray detector 12 is provided with the above-described detection element and a plurality of ASICs (Application Specific Integrated Circuits) that are connected to the detection element and count the X-ray photons detected by the detection element.
  • the ASIC counts the number of X-ray photons incident on the detection element by discriminating the individual charges output by the detection element. Further, the ASIC measures the energy of the counted X-ray photons by performing arithmetic processing based on the size of each charge. Further, the ASIC outputs the X-ray photon counting results to the DAS 18 as digital data.
  • the DAS 18 generates detection data based on the results of the counting process input from the X-ray detector 12.
  • the detected data is, for example, a sinogram.
  • The sinogram is data in which the results of the counting process for the X-rays incident on each detection element at each position of the X-ray tube 11 are arranged.
  • a sinogram is data in which the results of counting processing are arranged in a two-dimensional orthogonal coordinate system with the view direction and channel direction as axes.
  • the DAS 18 generates a sinogram in units of columns in the slice direction of the X-ray detector 12, for example.
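  • As a minimal sketch of the sinogram layout described above (the view and channel counts are assumed purely for illustration), the counting results can be arranged in a two-dimensional array with the view direction and channel direction as axes:

```python
import numpy as np

n_views, n_channels = 360, 512  # hypothetical scan geometry

# sinogram[v, c] holds the counting result for the X-rays incident on
# detection channel c while the X-ray tube was at view position v.
sinogram = np.zeros((n_views, n_channels), dtype=np.int64)

def record_view(view_index, per_channel_counts):
    """Store one view's per-channel counting results as a sinogram row."""
    sinogram[view_index, :] = per_channel_counts
```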
  • the DAS 18 transfers the generated detection data to the image processing device 30.
  • the DAS 18 can be realized by, for example, a processor such as a CPU.
  • the result of the counting process is data in which the number of X-ray photons is assigned to each energy bin (energy bands E1 to E4) as shown in FIG.
  • The DAS 18 counts photons (X-ray photons) originating from the X-rays irradiated from the X-ray tube 11 and transmitted through the subject S, discriminates the energy of the counted X-ray photons, and generates the result of the counting process.
  • Although FIG. 2 shows an example of a plurality of energy bands, the number and width of the energy bands to be discriminated are not limited to this and may be set according to a desired configuration.
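  • As a small sketch of this energy discrimination (the bin edges in keV are assumptions chosen only for illustration), assigning measured photon energies to energy bins E1 to E4 as in FIG. 2 can be expressed as a histogram:

```python
import numpy as np

# Hypothetical bin edges in keV: E1=[20,40), E2=[40,60), E3=[60,80), E4=[80,120)
bin_edges_kev = np.array([20.0, 40.0, 60.0, 80.0, 120.0])

def counting_result(photon_energies_kev):
    """Assign each measured photon energy to an energy bin (E1..E4)
    and return the per-bin photon counts, as in FIG. 2."""
    counts, _ = np.histogram(photon_energies_kev, bins=bin_edges_kev)
    return counts  # shape (4,): counts for E1, E2, E3, E4

print(counting_result([35.0, 45.0, 47.0, 75.0, 90.0]))  # -> [1 2 1 1]
```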
  • The data generated by the DAS 18 is transmitted via optical communication from a transmitter having a light emitting diode (LED) provided on the rotating frame 13 to a receiver having a photodiode provided on a non-rotating portion of the gantry device 10, and is transferred to the image processing device 30.
  • the non-rotating portion may be, for example, a fixed frame (not shown) that rotatably supports the rotating frame 13.
  • The method of transmitting data from the rotating frame 13 to the non-rotating portion of the gantry device 10 is not limited to optical communication, and any non-contact or contact data transmission method may be adopted.
  • The bed device 20 is a device on which the subject S to be photographed is placed and moved, and is provided with a base 21, a bed driving device 22, a top plate 23, and a support frame 24.
  • the base 21 is a casing that supports the support frame 24 movably in the vertical direction.
  • the bed driving device 22 is a drive mechanism that moves the top plate 23 on which the subject S is placed in the longitudinal direction of the top plate 23, and includes a motor, an actuator, and the like.
  • the top plate 23 provided on the upper surface of the support frame 24 is a plate on which the subject S is placed. In addition to the top plate 23, the bed driving device 22 may move the support frame 24 in the longitudinal direction of the top plate 23.
  • the image processing device 30 is provided with an acquisition section 301, a generation section 302, an analysis section 303, a display control section 304, a photographing control section 305, and a storage section 306. Further, a display section 307, an input section 308, a gantry device 10, and a bed device 20 are communicably connected to the image processing device 30.
  • the image processing device 30 and the gantry device 10 are described as separate bodies, but the gantry device 10 may include the image processing device 30 or some of the components of the image processing device 30.
  • the image processing device 30 can be configured by a computer equipped with a processor and a memory.
  • Each component other than the storage unit 306 of the image processing device 30 is functionally configured using, for example, a processor such as one or more CPUs and a program read from the storage unit 306.
  • The processor may be, for example, an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), an FPGA (Field-Programmable Gate Array), or the like.
  • each component other than the storage unit 306 of the image processing device 30 may be configured with an integrated circuit such as an ASIC that performs a specific function.
  • the internal configuration of the image processing device 30 may include a graphic control unit such as a GPU, a communication unit such as a network card, and an input/output control unit such as a keyboard, display, or touch panel.
  • the acquisition unit 301 can acquire data generated by the DAS 18, various operations input by the operator via the input unit 308, patient information, etc.
  • The acquisition unit 301 may also acquire, from an external device via an arbitrary network, data obtained by photographing the subject S, a CT image of the subject S, a substance discrimination image of the subject S, patient information, and the like.
  • the arbitrary network may include, for example, a LAN (Local Area Network), an intranet, the Internet, and the like.
  • The generation unit 302 generates projection data by subjecting the data output from the DAS 18 to pre-processing such as logarithmic transformation processing, offset correction processing, inter-channel sensitivity correction processing, and beam hardening correction. Further, the generation unit 302 generates a CT image by performing reconstruction processing on the generated projection data using a filtered back projection method, a successive approximation (iterative) reconstruction method, or the like. The generation unit 302 stores the reconstructed CT image in the storage unit 306.
  • the projection data generated from the counting results obtained by photon counting CT includes information on the energy of the X-rays attenuated by passing through the subject S. Therefore, the generation unit 302 can reconstruct a CT image of a specific energy band.
  • the generation unit 302 can reconstruct CT images of each of a plurality of energy bands. Note that a CT image (of all energy bands) reconstructed without dividing by energy band corresponds to a radiation intensity image.
  • the generation unit 302 can generate a plurality of color-coded CT images, for example, by assigning a color tone according to the energy band to the CT image of each energy band. Furthermore, the generation unit 302 can also generate an image in which a plurality of CT images color-coded according to energy bands are superimposed.
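  • The following sketch illustrates one way such color-coded energy-band images could be combined (the tints and the normalization to [0, 1] are assumptions, not the disclosed method): each band image is assigned an RGB color tone and the tinted images are superimposed by summation:

```python
import numpy as np

def colorize_and_superimpose(band_images, tints):
    """Assign an RGB tint to each energy-band CT image and superimpose.

    band_images: list of 2-D arrays normalized to [0, 1], one per band.
    tints: list of matching (r, g, b) tuples.
    Returns an RGB image that is the clipped sum of the tinted bands.
    """
    h, w = band_images[0].shape
    rgb = np.zeros((h, w, 3))
    for img, tint in zip(band_images, tints):
        rgb += img[..., None] * np.asarray(tint)
    return np.clip(rgb, 0.0, 1.0)

# Hypothetical tints for four bands, from low to high energy.
tints = [(0.0, 0.0, 1.0), (0.0, 1.0, 1.0), (1.0, 1.0, 0.0), (1.0, 0.0, 0.0)]
```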
  • the generation unit 302 can generate a substance discrimination image that enables identification of the substance, for example, by using the K absorption edge unique to the substance.
  • the method for generating the material discrimination image is not limited to the method using the K absorption edge, and any known method may be used.
  • Similarly to the CT images color-coded according to energy band, the generation unit 302 can generate a substance discrimination image that is color-coded according to the substance, or an image in which a plurality of color-coded substance discrimination images are superimposed.
  • the generation unit 302 can also generate, for example, a monochromatic X-ray image, a density image, an effective atomic number image, and the like.
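  • The disclosure leaves the discrimination method open (the K absorption edge or any known method). As one hedged illustration, a common basis-material decomposition solves a small linear system per pixel from measurements in two energy bins; the attenuation coefficients below are placeholder values, not data from the disclosure:

```python
import numpy as np

# Hypothetical mass-attenuation coefficients (cm^2/g) of two basis
# materials (e.g. iodine and soft tissue) at two energy bins.
#             material1  material2
M = np.array([[4.9,       0.25],    # low-energy bin
              [1.4,       0.21]])   # high-energy bin

def decompose(mu_low, mu_high):
    """Solve the per-pixel 2x2 system M @ densities = measured attenuation
    to estimate the two basis-material density maps."""
    mu = np.stack([mu_low.ravel(), mu_high.ravel()])   # (2, n_pixels)
    densities = np.linalg.solve(M, mu)                 # (2, n_pixels)
    return densities.reshape((2,) + mu_low.shape)
```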
  • In the full-scan method, projection data for 360° around the subject S is required for reconstruction, while in the half-scan method, projection data for 180° plus the fan angle is required.
  • This embodiment can be applied to any reconstruction method.
  • In this embodiment, it is assumed that a full-scan reconstruction method is used, in which reconstruction is performed using projection data for 360° around the subject S.
  • Based on input from the operator via the input unit 308, the generation unit 302 can convert the generated CT image into a tomographic image of an arbitrary cross section, a three-dimensional image by rendering processing, or the like, using a known method.
  • the generation unit 302 stores the generated CT images, material discrimination images, etc., and the converted tomographic images, three-dimensional images, etc. in the storage unit 306.
  • The analysis unit 303 performs desired analysis processing using the various images generated by the generation unit 302. For example, the analysis unit 303 performs image processing on the CT image, substance discrimination image, etc. generated by the generation unit 302, and obtains analysis results such as the size of an abnormal region of the subject S and the density of substances contained in the tissue. Note that the analysis unit 303 may perform analysis processing using projection data before being converted into an image. The analysis unit 303 stores the generated analysis results in the storage unit 306.
  • the display control unit 304 causes the display unit 307 to display patient information, various images, analysis results, information regarding the various images, etc. stored in the storage unit 306.
  • the display control unit 304 causes the display unit 307 to display a CT image, which is a radiation intensity image, and a substance discrimination image generated using photon counting technology in a manner that makes it easy to compare.
  • the display control unit 304 displays the CT image and the material discrimination image side by side, switching them, or superimposing them.
  • The imaging control unit 305 controls the CT scan performed by the gantry device 10. For example, the imaging control unit 305 controls the operation of the X-ray high voltage device 14, the X-ray detector 12, the control device 15, the DAS 18, and the bed driving device 22 to control the collection of counting results in the gantry device 10. For example, the imaging control unit 305 controls projection data collection processing both in imaging for collecting positioning images (scano images) and in main imaging (scanning) for collecting images to be used for observation.
  • the storage unit 306 is realized by, for example, a RAM (Random Access Memory), a semiconductor memory element such as a flash memory, a hard disk, an optical disk, or the like.
  • the storage unit 306 stores, for example, patient information, projection data, various images such as CT images and substance discrimination images, analysis results, and information regarding various images. Further, for example, the storage unit 306 can store a program for realizing the functions of each component described above.
  • the storage unit 306 may be realized by a server group (cloud) connected to the CT system 1 via a network.
  • the display section 307 displays various information.
  • the display unit 307 displays various images generated by the generation unit 302, and displays a GUI (Graphical User Interface) for accepting various operations from an operator.
  • the display unit 307 may be any display such as a liquid crystal display, an organic EL display, a CRT (cathode ray tube) display, or the like.
  • the display unit 307 may be of a desktop type, or may be a tablet terminal or the like that can wirelessly communicate with the main body of the image processing apparatus 30.
  • the input unit 308 receives various input operations from the operator, converts the received input operations into electrical signals, and outputs the electrical signals to the image processing device 30. Further, for example, the input unit 308 receives input operations from the operator, such as reconstruction conditions when reconstructing a CT image, and image processing conditions when generating a post-processed image from a CT image.
  • The input unit 308 is realized by, for example, a mouse, a keyboard, a trackball, a switch, a button, a joystick, a touch pad that accepts input operations by touching the operation surface, or a touch screen in which a display screen and a touch pad are integrated.
  • the input unit 308 may be realized by a non-contact input circuit using an optical sensor, a voice input circuit, or the like.
  • the input unit 308 may be provided in the gantry device 10.
  • the input unit 308 may be configured with a tablet terminal or the like that can communicate wirelessly with the main body of the image processing device 30.
  • the input unit 308 is not limited to one that includes physical operation parts such as a mouse and a keyboard.
  • For example, an electric signal processing circuit that receives an electric signal corresponding to an input operation from an external input device provided separately from the image processing device 30 and outputs this electric signal to the image processing device 30 is also included in examples of the input unit 308.
  • FIG. 3 is a flowchart of a series of processes according to this embodiment.
  • When the process according to this embodiment is started in response to an instruction from the operator, the process moves to step S301.
  • In step S301, the acquisition unit 301 acquires data obtained by photographing the subject S using the gantry device 10, based on the photographing conditions and the like input by the operator.
  • the acquired data includes counting results obtained by photon counting CT.
  • the acquisition unit 301 may acquire data obtained by photographing the subject S from an image processing device or a storage device (not shown) via an arbitrary network.
  • In step S302, the generation unit 302 generates a CT image based on the acquired data. Furthermore, the generation unit 302 generates a substance discrimination image based on the counting results included in the acquired data.
  • the method for generating the material discrimination image may be a method using the K absorption edge, or may be any other known method.
  • FIGS. 4A to 4C show examples of a schematic CT image and a substance discrimination image regarding one cross section of the subject S.
  • FIG. 4A shows a CT image 401.
  • the CT image 401 is a radiation intensity image corresponding to a conventional CT image, and is an image whose pixel values are CT values determined using data of all energy bands included in the projection data.
  • FIG. 4B shows a material discrimination image 402 for iodine determined using projection data.
  • FIG. 4C shows a substance discrimination image 403 regarding gadolinium obtained using projection data.
  • the substance discrimination images are not limited to those related to iodine and gadolinium, and the generation unit 302 may generate a substance discrimination image in which other substances such as calcium, bone, and soft tissue are discriminated.
  • Note that in FIGS. 4A to 4C, predetermined hatching patterns are used to make the differences between the substance discrimination images easier to understand, but in reality, pixel values assigned a color tone corresponding to the substance may be used.
  • the generation unit 302 can generate a cross-sectional image according to an operator's instruction or a prior setting with respect to a three-dimensional CT image or a substance discrimination image generated using projection data.
  • the CT image and the material discrimination image will be explained as cross-sectional images.
  • the generation unit 302 may generate images for each energy band, monochromatic X-ray images, and the like.
  • the prior settings may include at least one of a setting predetermined for each imaging condition including a body part to be imaged, and a setting predetermined for each imaging mode depending on a disease or the like.
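  • As a brief sketch of the relationship noted above between the per-bin counting data and a radiation intensity image such as the CT image 401 (the array layout is assumed for illustration), the all-energy-band data can be obtained by summing the per-bin sinograms before reconstruction:

```python
import numpy as np

def all_band_sinogram(per_bin_sinograms):
    """Sum per-energy-bin sinograms (bins, views, channels) into the
    all-band sinogram (views, channels) from which a radiation intensity
    image such as the CT image 401 can be reconstructed."""
    return per_bin_sinograms.sum(axis=0)

# e.g. 4 energy bins, 360 views, 512 channels of simulated counts
data = np.random.poisson(100.0, size=(4, 360, 512))
print(all_band_sinogram(data).shape)  # (360, 512)
```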
  • In step S303, the analysis unit 303 performs analysis processing using the various images generated in step S302.
  • the analysis process may include detection of an abnormal site, calculation of the density of a predetermined substance, and the like.
  • the analysis processing is not limited to these, and any analysis processing required in the medical field or industrial field may be performed depending on the desired configuration.
  • the analysis unit 303 may perform analysis processing using projection data before being converted into an image.
  • the analysis process may be omitted depending on the operator's instructions or prior settings.
  • In step S304, the display control unit 304 causes the display unit 307 to display the CT image and the substance discrimination image in a display mode that allows easy comparison.
  • In the substance discrimination images 402 and 403 shown in FIGS. 4B and 4C, it is possible to grasp the tissue containing the substance to be discriminated, but it is difficult to grasp the surrounding tissue that does not contain that substance. Therefore, it may be difficult to understand the relationship between the tissues.
  • the display control unit 304 displays the CT image and the substance discrimination image side by side, switching them, or superimposing them so that they can be easily compared with each other.
  • the display screen displayed on the display unit 307 by the display control unit 304 according to this embodiment will be described using FIGS. 5 to 8B.
  • FIG. 5 shows a display screen 501 that displays a CT image 540 and a substance discrimination image 550 side by side as an example of a display screen according to this embodiment.
  • the display screen 501 shows an example of a substance discrimination image 550 in which iodine is discriminated.
  • the display screen 501 shows a patient ID 510, the patient's name 520, a comment 530 about the patient and the image, a CT image 540, and a substance discrimination image 550.
  • For the CT image 540 and the substance discrimination image 550, image types 541 and 551, image shooting dates 542 and 552, and analysis results 543 and 553 are shown respectively.
  • In addition, the type 554 of the substance to be discriminated and a display example 555 of the color tone corresponding to the substance are shown.
  • the display control unit 304 can read and display information stored in the storage unit 306.
  • the patient ID 510, name 520, and comment 530 may be stored in the storage unit 306 in association with the CT image 540 and the substance discrimination image 550.
  • In the comment 530, for example, the name of the disease, the examination site, the presence or absence of defects in the image, the reason for the defects, and the like may be displayed.
  • The patient ID 510, name 520, and comment 530 may be additionally input by the operator via the input unit 308, and the display control unit 304 may store the input information in the storage unit 306 in association with the CT image 540 and the substance discrimination image 550.
  • Note that separate selection buttons, input boxes, or the like may be provided to accept input from the operator regarding the presence or absence of image defects and their reasons.
  • Image types 541 and 551 indicate the types of the displayed CT image 540 and substance discrimination image 550, respectively.
  • In FIG. 5, the image type 541 indicates a CT image, and the image type 551 indicates a PCCT image (a substance discrimination image based on photon counting).
  • Image shooting dates 542 and 552 indicate the dates when the displayed CT image 540 and substance discrimination image 550 were shot.
  • Note that the CT image 540 and the substance discrimination image 550 are displayed for the purpose of comparison, for example, to make it easier to understand the relationship between tissues, and they need not be images acquired in the same imaging. Therefore, by displaying the photographing dates 542 and 552 of the images, the operator can grasp when each image was photographed, and can recognize differences in tissue structure between the images as being due to changes over time.
  • Analysis results 543 and 553 indicate the results of analysis processing performed using each image. Note that the analysis results do not need to be displayed for each image and may be displayed in a single display frame. Note also that the analysis result does not need to be a value; for example, an abnormal region or the like may be detected and displayed as a result of the analysis. In this case, in response to an operator's instruction such as turning a button (not shown) on or off, the area corresponding to the detected abnormality may be highlighted in the corresponding image so that the area can be easily identified.
  • the substance type 554 indicates a substance that is the target of discrimination in the substance discrimination image 550, and the display screen 501 shows iodine as an example.
  • The display control unit 304 can display the substance types 554 so that the operator can select the substance to be discriminated, for example, by displaying options that the operator can select. Further, the display control unit 304 can display, as the substance discrimination image 550, a substance discrimination image corresponding to the substance selected according to the operator's instruction.
  • The type of substance may include, for example, iodine, gadolinium, calcium, bone, soft tissue, and any metal in the medical field, and may include, for example, solder, silicon, and the like in the industrial field. The type of substance may also include other substances depending on the desired configuration. Note that a plurality of substances may be selected as the type of substance; for example, iodine and gadolinium may both be selected. In this case, an image in which the iodine substance discrimination image and the gadolinium substance discrimination image are superimposed on each other can be displayed as the substance discrimination image 550.
  • The display example 555 exemplifies the color tone or the like with which the discriminated substance is displayed in the substance discrimination image 550.
  • the display example 555 may include color tone, display pattern, and the like.
  • the display example 555 of the color tone corresponding to the substance can be useful for the operator to identify the discriminated substance in the substance discrimination image 550 or the image on which the substance discrimination image 550 is superimposed.
  • On the display screen 501, the CT image 540 and the substance discrimination image 550 are displayed side by side, so the operator can easily compare these images. Therefore, compared to observing the substance discrimination image 550 alone, the operator can more easily and efficiently grasp the relationship between the tissue that contains the substance to be discriminated and the tissue that does not.
  • FIG. 6 shows, as an example of the display screen according to this embodiment, a display screen 601 that displays a CT image 540 and two substance discrimination images 550 and 660 side by side. In the example shown in FIG. 6, a substance discrimination image 660 is shown in addition to the configuration of the example shown in FIG. 5.
  • the substance discrimination image 660 is an example of a substance discrimination image in which gadolinium is discriminated.
  • The photographing date 662 of the image, the analysis result 663, the type of substance 664, and the display example 665 of the color tone corresponding to the substance each correspond to the substance discrimination image 660.
  • These may be similar to the image shooting date 552, the analysis result 553, the substance type 554, and the display example 555 of the color tone corresponding to the substance, respectively.
  • each of the plurality of substance discrimination images 550, 660 may be arranged so as to be adjacent to the top and bottom and/or left and right sides of the CT image 540.
  • For example, the substance discrimination image 550 may be displayed on the left side of the CT image 540, and the substance discrimination image 660 may be displayed on the right side of the CT image 540.
  • Thereby, the operator can more easily compare the respective substance discrimination images 550 and 660 with the CT image 540, and can perform observation efficiently.
  • In the above examples, a substance discrimination image in which a single substance was discriminated was displayed as each substance discrimination image.
  • a substance discrimination image obtained by superimposing a plurality of substance discrimination images in which different types of substances are respectively discriminated may be displayed side by side with the CT image.
  • FIGS. 7A and 7B show display screens 701 and 702 that switch between and display a CT image 540 and a substance discrimination image 550, as examples of display screens according to this embodiment. Note that configurations similar to those in FIG. 5 are given the same reference numerals and detailed explanations are omitted. Further, in FIGS. 7A and 7B, the photographing date of the image is omitted to simplify the explanation, but the photographing date of the image may be shown on each display screen.
  • On the display screen 701 shown in FIG. 7A, only the CT image 540 is shown as the display image, together with a switching button 780.
  • When the switching button 780 is operated in accordance with an operator's instruction, the display screen 701 is switched to the display screen 702 shown in FIG. 7B.
  • On the display screen 702, only the substance discrimination image 550 is shown as the display image.
  • When the switching button 780 is operated again in accordance with the operator's instruction, the display screen 702 is switched back to the display screen 701 shown in FIG. 7A.
  • In this way, the CT image 540 and the substance discrimination image 550 are displayed in a switched manner, so that the operator can easily compare these images. Therefore, compared to observing the substance discrimination image 550 alone, the operator can more easily grasp the relationship between tissues that contain the target substance and tissues that do not, and can perform observation more efficiently.
  • Thereby, the operator can observe these images in a manner that makes them easier to compare.
  • Note that the substance discrimination image displayed while switching from the CT image is not limited to one; a plurality of substance discrimination images in which different types of substances are discriminated may be displayed.
  • the plurality of substance discrimination images can be displayed so as to be switched in a preset order.
  • a substance discrimination image obtained by superimposing a plurality of substance discrimination images in which different types of substances are respectively discriminated may be displayed.
  • In the examples shown in FIGS. 7A and 7B, the display screen 701 that displays the CT image 540 and the display screen 702 that displays the substance discrimination image 550 are switched according to the operation of the switching button 780.
  • Alternatively, the display control unit 304 may, for example, provide a slider (not shown) and cause the display unit 307 to display an image in which the CT image 540 and the substance discrimination image 550 are blended (superimposed) with each other at a blend ratio according to the operation of the slider.
  • Here, the blend ratio may be a ratio of transparency such that when the transparency of one of the superimposed images is increased, that of the other decreases.
  • In this case, the display control unit 304 sets the blend ratio between the CT image 540 and the substance discrimination image 550 in accordance with the operator's instructions, blends the two images at the set blend ratio, and causes the display unit 307 to display them superimposed on each other (see the sketch below). Further, the display control unit 304 may set the transparency of one of the images according to the operation of the slider and display it on the display unit 307 superimposed on the other image. In these cases as well, the operator can easily compare the CT image 540 and the substance discrimination image 550.
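  • A minimal sketch of such slider-driven blending (the function name and value ranges are assumptions): with a slider value alpha in [0, 1], increasing one image's contribution decreases the other's, matching the blend ratio described above:

```python
import numpy as np

def blend(ct_rgb, discrimination_rgb, alpha):
    """Blend the CT image and the substance discrimination image.

    alpha = 0 shows only the CT image, alpha = 1 only the substance
    discrimination image; intermediate values trade one image's opacity
    against the other, like the blend ratio set from the slider.
    Both inputs are float RGB arrays of the same shape.
    """
    alpha = float(np.clip(alpha, 0.0, 1.0))
    return (1.0 - alpha) * ct_rgb + alpha * discrimination_rgb
```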
  • FIG. 7C shows a display screen 703 that displays a substance discrimination image 550 superimposed on a CT image 540. Note that configurations similar to those in FIG. 5 are given the same reference numerals and detailed explanations are omitted. In addition, in FIG. 7C, the photographing date of the image is omitted to simplify the explanation, but the photographing date of the image may be shown on the display screen.
  • The display screen 703 shown in FIG. 7C shows a superimposed image 770 in which a substance discrimination image 550 is superimposed on a CT image 540. Further, the image type 771, the analysis result 773, the substance type 774, and the display example 775 of the color tone corresponding to the substance each correspond to the superimposed image 770. The analysis result 773, the substance type 774, and the display example 775 may be similar to the analysis result 553, the substance type 554, and the display example 555, respectively. Note that the image type 771 is CT+PCCT to indicate that the substance discrimination image is superimposed on the CT image. Further, the analysis result 773 may correspond to the CT image 540 and/or the substance discrimination image 550.
  • the superimposed image 770 may be a superimposed image in which the CT image 540 is superimposed on the substance discrimination image 550.
  • the substance discrimination image is not limited to one type of substance discrimination image, but may be a substance discrimination image obtained by superimposing a plurality of substance discrimination images in which different types of substances are respectively discriminated.
  • the display control unit 304 may switch the display screen 701 or the display screen 702 to the display screen 703 when the switching button 780 shown in FIG. 7A or 7B is operated.
  • Since the CT image 540 and the substance discrimination image 550 are displayed in a superimposed manner, the operator can easily compare these images. Further, when the CT image 540 and the substance discrimination image 550 that are not superimposed on each other and the superimposed image 770 are displayed in a switched manner, the operator can also easily compare these images.
  • Note that a substance discrimination image 550 whose transparency is set according to an operator's instruction or a predetermined setting may be superimposed on the CT image 540, or a CT image 540 whose transparency is similarly set may be superimposed on the substance discrimination image 550.
  • A switching button 780 may also be provided on the display screens that display the images side by side, and the respective images and the superimposed image may be switched and displayed according to the operation of the switching button 780.
  • the switching button 780 may be provided for each CT image or substance discrimination image. For example, when the switching button 780 corresponding to the CT image is operated, the display of the CT image can be switched to the display of a superimposed image in which the substance discrimination image is superimposed on the CT image.
  • Similarly, when the switching button 780 corresponding to the substance discrimination image is operated, the display of the substance discrimination image can be switched to the display of a superimposed image in which the CT image is superimposed on the substance discrimination image. In this case as well, the operator can easily compare these images.
  • Note that the substance discrimination image to be superimposed on the CT image can be selected according to an operator's instruction.
  • the plurality of substance discrimination images can be switched to a superimposed image at once in accordance with the operation of the switching button 780.
  • a switch button for switching between the substance discrimination image and the superimposed image may be provided for each of the plurality of substance discrimination images.
  • FIGS. 8A and 8B show display screens 801 and 802 for follow-up observation that switch and display a plurality of CT images and a plurality of substance discrimination images, as examples of display screens according to this embodiment. Note that configurations similar to those in FIG. 5 are given the same reference numerals and detailed explanations will be omitted. Furthermore, although the analysis results are omitted in FIGS. 8A and 8B to simplify the explanation, the analysis results may be shown on each display screen.
  • a display screen 801 shown in FIG. 8A shows CT images 8401, 8402, 8403, 8404, and 8405 acquired at different times.
  • the display screen 801 also shows an image type 841, image shooting dates 8421, 8422, 8423, 8424, and 8425 corresponding to each image, and a switching button 880.
  • When the switching button 880 is operated in accordance with an operator's instruction, the display screen 801 is switched to the display screen 802 shown in FIG. 8B.
  • The display screen 802 shows substance discrimination images 8501, 8502, 8503, 8504, and 8505 corresponding to the CT images 8401 to 8405 acquired at different times. Further, the display screen 802 shows the image type 851, the photographing dates 8521, 8522, 8523, 8524, and 8525 corresponding to each image, the substance type 854, a display example 855 of the color tone corresponding to the substance, and a switching button 880. Note that the image types 841 and 851, the substance type 854, and the display example 855 may be similar to the image types 541 and 551, the substance type 554, and the display example 555 in FIG. 5. Further, when the switching button 880 is operated again in accordance with the operator's instruction, the display screen 802 is switched back to the display screen 801 shown in FIG. 8A.
  • On the display screens 801 and 802, the plurality of CT images 8401 to 8405 and the plurality of substance discrimination images 8501 to 8505 are switched and displayed all at once, so the operator can easily compare these images.
  • the switching button 880 may be provided for each image.
  • CT images 8401, 8402, 8403, 8404, 8405 and a superimposed image in which the corresponding substance discrimination images 8501, 8502, 8503, 8504, 8505 are superimposed may be displayed while being switched.
  • the substance discrimination images 8501, 8502, 8503, 8504, 8505 and a superimposed image in which the corresponding CT images 8401, 8402, 8403, 8404, 8405 are superimposed may be switched and displayed. Even in these cases, the operator can easily compare these images.
  • a plurality of CT images acquired at different times and a plurality of corresponding substance discrimination images may be displayed side by side as images for follow-up observation.
  • a plurality of CT images or a plurality of substance discrimination images acquired at different times and a plurality of superimposed images in which one of these images is superimposed on the other may be displayed side by side. Even in these cases, the operator can easily compare these images.
  • Note that the plurality of types of images may be displayed alternately so that corresponding CT images and substance discrimination images, or corresponding CT images or substance discrimination images and their superimposed images, are adjacent to each other. In this case, the operator can observe the CT images and substance discrimination images in a manner that makes them easier to compare.
  • When tomographic images at a plurality of positions are generated, the display control unit 304 may likewise display the CT image and substance discrimination image side by side, switched, or superimposed. Note that in this case, the display control unit 304 may display on the display screen a button or a slider for continuously switching the displayed tomographic images. For example, when displaying a CT image and a substance discrimination image side by side, the display control unit 304 may switch both images at once to the images at the position corresponding to the operation of the button or slider, as in the sketch below.
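  • A sketch of such synchronized switching (the viewer object and its methods are hypothetical): one slider callback updates the displayed CT image and substance discrimination image to the same tomographic position at once:

```python
def on_slider_moved(position, ct_volume, discrimination_volume, viewer):
    """Switch both displayed tomographic images at once to the slice
    selected by the slider, keeping the side-by-side CT image and
    substance discrimination image at the same position.

    ct_volume and discrimination_volume are (slices, height, width)
    arrays; viewer is a hypothetical display object.
    """
    z = int(position)
    viewer.set_left(ct_volume[z])               # CT image at this position
    viewer.set_right(discrimination_volume[z])  # matching discrimination image
```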
  • Further, a difference image between CT images, a difference image between a reference image and a substance discrimination image, or such a difference image together with the reference image may be displayed side by side,
  • switched and displayed, or displayed as a superimposed image.
  • The corresponding substance discrimination images can likewise be displayed side by side, switched and displayed, or displayed as a superimposed image.
  • When switching and displaying the CT image and the substance discrimination image, it is also possible to switch the display to subsequent consecutive images according to the timing of operation of the switching button.
  • In step S304, when the display control process by the display control unit 304 ends, the series of processes according to the present embodiment ends.
  • As described above, the image processing device 30 according to the present embodiment includes the acquisition unit 301 and the display control unit 304.
  • The acquisition unit 301 acquires a radiation intensity image obtained by photographing a subject using radiation, and a substance discrimination image indicating a discriminated substance, obtained by photographing the subject while counting radiation photons.
  • The display control unit 304 causes the display unit 307 to display the radiation intensity image and the substance discrimination image side by side, switched, or superimposed.
  • With this configuration, the operator can easily compare the CT image and the substance discrimination image. Therefore, compared to observing the substance discrimination image alone, the operator can more easily grasp the relationship between tissues that contain the target substance and tissues that do not, and can perform observation more efficiently.
  • the display control unit 304 can cause the display unit 307 to display options for selecting a substance to be discriminated around the substance discrimination image.
  • In this case, the operator can easily compare the radiation intensity image and the substance discrimination image while appropriately switching the substance discrimination image according to the purpose of observation, and can perform observation more efficiently.
  • the radiation intensity image and the material discrimination image can be images generated using common data obtained by photographing a subject using radiation.
  • projection data is acquired from data obtained by imaging using the gantry device 10, and a CT image and a material discrimination image are generated based on the projection data.
  • In this case, since the radiation intensity image and the substance discrimination image show tissues having a common shape and the like, the two images can be more easily compared.
  • Note that the radiation intensity image and the substance discrimination image that the display control unit 304 causes the display unit 307 to display are displayed for the purpose of comparison, for example to make the relationship between tissues easier to understand,
  • and therefore do not have to be generated using common data. Accordingly, the radiation intensity image and the substance discrimination image that the display control unit 304 causes the display unit 307 to display may be generated using different data.
  • For example, the radiation intensity image may be generated using data obtained with a CT system that does not use photon counting technology,
  • while the substance discrimination image may be generated using data obtained with a CT system that uses photon counting technology.
  • the substance discrimination image displayed on the display unit 307 may include an image in which a plurality of substance discrimination images in which different types of substances are discriminated are superimposed on each other. In this case, the operator can easily understand the relationship between tissues containing different types of substances and tissues that do not contain those substances, and can perform observation more efficiently.
  • Further, the substance discrimination image displayed on the display unit 307 may include a plurality of substance discrimination images in which different types of substances are discriminated. In this case, multiple substance discrimination images relating to multiple types of substances are displayed, allowing the operator to easily compare tissues containing the different substances with each other and to perform observation more efficiently.
  • the display control unit 304 can display each of the plurality of substance discrimination images side by side on the display unit 307 so as to be adjacent to the radiation intensity image. In this case, the operator can more easily compare each substance discrimination image and the radiation intensity image, and can perform observation efficiently.
  • the display control unit 304 can switch the plurality of radiation intensity images and the plurality of substance discrimination images at once and display them on the display unit 307.
  • Further, the display control unit 304 may switch the plurality of radiation intensity images or the plurality of substance discrimination images and the plurality of superimposed images of the radiation intensity images and substance discrimination images all at once and display them on the display unit 307. In these cases, the operator can observe the multiple radiation intensity images and the multiple substance discrimination images in an easily comparable manner with a simpler operation, and can perform observation more efficiently.
  • Further, the display control unit 304 can set the transparency of one of the radiation intensity image and the substance discrimination image in accordance with an instruction from the operator, and cause the display unit 307 to display it superimposed on the other of the two images.
  • With this configuration, the operator can check a superimposed image in which the radiation intensity image and the substance discrimination image are superimposed with the desired transparency, can observe the superimposed image in a manner that is easy to observe, and can perform observation more efficiently.
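  • As a concrete illustration of such transparency-controlled superimposition, the following is a minimal alpha-blending sketch, assuming NumPy arrays scaled to [0, 1]; the function name and parameters are hypothetical, not part of the disclosed apparatus.

```python
import numpy as np

def blend(intensity_img: np.ndarray, discrimination_img: np.ndarray,
          alpha: float) -> np.ndarray:
    """Superimpose the substance discrimination image on the radiation
    intensity image with operator-chosen opacity alpha: 0.0 shows only
    the intensity image, 1.0 only the discrimination image."""
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    return (1.0 - alpha) * intensity_img + alpha * discrimination_img
```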
  • Further, the display control unit 304 can switch the radiation intensity image and the substance discrimination image together to those at the cross-sectional position corresponding to an instruction from the operator, and display them on the display unit 307. In this case, the operator can observe the radiation intensity image and the substance discrimination image for the cross section at the desired position through simple operations, and can perform observation more efficiently.
  • the image processing device 30 can further include an analysis unit 303 that analyzes at least one of the radiation intensity image and the substance discrimination image.
  • In this case, the display control unit 304 can display the analysis result obtained by the analysis unit 303 on the display unit 307 around the analyzed image, or superimposed on one of the radiation intensity image and the substance discrimination image. The operator can then easily compare the analysis results obtained by analyzing the images, in addition to the radiation intensity image and the substance discrimination image, allowing for more efficient observation.
  • the display control unit 304 can cause the display unit 307 to display information regarding the failure of at least one of the radiation intensity image and the substance discrimination image in accordance with an instruction from the operator. In this case, when observing the image, the operator can easily understand whether or not there is a failure in the image, the reason thereof, etc., and can observe the image more efficiently.
  • Modification 1: In the first embodiment, display screens were described on which a CT image, which is a radiation intensity image, and a substance discrimination image are displayed side by side, switched, or superimposed.
  • the display control unit 304 may select any one of these display screens and display it on the display unit 307 according to the operator's instructions or prior settings.
  • Specifically, the display control unit 304 selects the display screen to be displayed according to the operator's instructions or prior settings, and causes the display unit 307 to display the selected display screen. For example, when the operator instructs selection of a display screen that displays the CT image and substance discrimination image side by side, the display control unit 304 causes the display unit 307 to display the display screen 501 shown in FIG. 5 in response to the instruction.
  • Further, the display control unit 304 may switch the display screen to be displayed and cause the display unit 307 to display it according to an instruction from the operator. For example, if the operator then instructs selection of a display screen that switches and displays the CT image and substance discrimination image, the display control unit 304 may, in response to the instruction, switch from the display screen 501 shown in FIG. 5 to the display screen 701 shown in FIG. 7A and cause the display unit 307 to display it.
  • Modification 2: The display screens shown in the first embodiment correspond to an analysis screen capable of displaying detailed analysis results and to display screens for performing image observation.
  • In contrast, the image processing according to the first embodiment can be similarly applied to displaying a preview image, such as a CT image, on a confirmation screen for checking the captured image immediately after the subject S is photographed.
  • In this case, the CT image, which requires a short processing time, may be displayed before the substance discrimination image, and the display may later be switched to the substance discrimination image. The operator can then quickly check the CT image and quickly determine the success or failure of imaging.
  • Further, the image displayed on the confirmation screen or the analysis result of that image may be simpler than the image or analysis result displayed on the analysis screen or display screens described above. That is, as the image and analysis result on the confirmation screen displayed immediately after photographing the subject, the display control unit 304 may cause the display unit 307 to display a simpler image and analysis result than those on the analysis screen used for analyzing details of the subject. For example, an image based on data whose amount has been reduced, such as by thinning out predetermined data, and its analysis result can be displayed on the confirmation screen. In this case, the processing time until the confirmation screen is displayed can be shortened, and the operator can quickly determine the success or failure of photographing.
  • the image processing device 30 can calculate the evaluation value of the captured image when displaying the confirmation screen. If the calculated evaluation value is less than or equal to the threshold value, the image processing device 30 may determine that re-imaging is necessary, and may display a message recommending re-imaging on the confirmation screen.
  • a Q index value can be used as an evaluation index of an image, but the present invention is not limited to this, and any known evaluation index such as an SN ratio or a contrast value may be used. Note that any known method may be used to calculate the Q index value.
  • the display control unit 304 may display the calculated evaluation value of the image on the confirmation screen. In these cases, the operator can more efficiently determine whether re-imaging is necessary.
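  • As a simple illustration of this kind of automatic check, the sketch below estimates a signal-to-noise ratio for a captured image and flags re-imaging when the value falls at or below a threshold; the SNR definition, region of interest, function names, and threshold are illustrative assumptions, not the Q index computation.

```python
import numpy as np

def snr(image: np.ndarray, roi: tuple[slice, slice]) -> float:
    """Crude evaluation value: mean of a region of interest divided by
    the standard deviation over the same region."""
    region = image[roi]
    std = float(region.std())
    return float(region.mean()) / std if std > 0 else float("inf")

def needs_reimaging(image: np.ndarray, threshold: float = 5.0) -> bool:
    """True when the evaluation value is at or below the threshold,
    i.e. when a message recommending re-imaging should be shown."""
    roi = (slice(100, 200), slice(100, 200))  # hypothetical ROI
    return snr(image, roi) <= threshold
```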
  • Modification 3: The display screens shown in the first embodiment are merely examples. Display items may therefore be added or omitted depending on the desired configuration, and the layout and arrangement of the display screen may also be changed depending on the desired configuration.
  • For example, the radiation intensity image and the substance discrimination image may be displayed side by side vertically.
  • Further, the type of image, the imaging date, the analysis result, the type of substance, and the like may be displayed superimposed on the corresponding image, or displayed near the top, bottom, left, or right of the corresponding image.
  • the display control unit 304 can display information indicating the type of image around the radiation intensity image and the substance discrimination image, or in a superimposed manner on the radiation intensity image and the substance discrimination image.
  • Further, imaging information including the radiation dose used for imaging, the amount and type of contrast medium, and the region to be imaged may be displayed around the top, bottom, left, or right of the image, or superimposed on the image.
  • In addition, the images displayed for comparison may be images obtained using different imaging devices; for example, the CT image may be one acquired using a CT device that does not use photon counting technology. For this reason, the type and model of the device used to acquire each image may be displayed around the top, bottom, left, or right of the image, or superimposed on the image.
  • the analysis results may be displayed superimposed on various images in response to the operation of a button (not shown). Further, regarding the analysis results of the CT image and the material discrimination image, the analysis results of one image may be displayed superimposed on the other image. For example, depending on the operator's operation, the analysis result of the substance discrimination image may be displayed superimposed on the CT image, or the analysis result of the CT image may be displayed superimposed on the substance discrimination image.
  • Modification 4: In the first embodiment, there are no particular limitations on the order in which photographed data is transferred, the order in which images are generated, or the order in which images are displayed. On the other hand, these orders may be determined by an operator's instruction or by a prior setting. Note that, as described above, the prior settings may include at least one of settings predetermined for each imaging condition, including the region to be imaged and the like, and settings predetermined for each imaging mode depending on the disease and the like.
  • the order of data transfer may be determined for each radiation energy band.
  • the order of data transmission may be determined for each energy band with respect to the data transferred from the DAS 18 to the image processing device 30, depending on the prior settings and the purpose of observation.
  • In this way, the acquisition unit 301 can acquire data regarding a specific energy band, obtained by photographing the subject, earlier than data regarding other energy bands, depending on the prior settings or the purpose of observation. The operator can therefore check the desired image more efficiently.
  • the order of image generation may be determined so that an image of a cross section that is desired to be displayed and confirmed first is generated preferentially.
  • In this case, the generation unit 302 may determine the order in which images are generated according to the prior settings or the purpose of observation. Specifically, depending on the prior settings or the purpose of observation, the generation unit 302 can generate the radiation intensity image and the substance discrimination image for a specific position of the subject earlier than the radiation intensity images and substance discrimination images for other positions of the subject.
  • The cross section for which images should be generated preferentially may be determined in advance according to the settings or the purpose of observation, or may be determined automatically using the results of substance discrimination performed prior to image generation.
  • Regarding the display order, the display control unit 304 may cause the display unit 307 to display the substance discrimination image after displaying a radiation intensity image such as a CT image, according to the operator's instructions or prior settings.
  • In this case, the display control unit 304 can cause the display unit 307 to display the radiation intensity image, whose generation takes a relatively short processing time, as soon as it is generated,
  • and then display the substance discrimination image, whose generation takes a relatively long processing time, once it is generated.
  • the radiation intensity image and the substance discrimination image may be images generated using common data obtained by photographing a subject using radiation.
  • Depending on the purpose of observation, however, it may be desirable to display the radiation intensity image and the substance discrimination image at the same time. It may therefore be made selectable, according to the operator's instructions, whether to display the substance discrimination image after the radiation intensity image or to display the radiation intensity image and the substance discrimination image at the same time.
  • the purpose of observation may be determined based on the operator's instructions regarding the imaging conditions such as the region to be imaged or the operator's instructions regarding the observation purpose.
  • the above-described modifications 1 to 4 are also applicable to the following embodiments.
  • Example 2: In Example 2 of the present disclosure, an example will be described in which image quality improvement processing using a trained machine learning model is performed to improve the image quality of the radiation intensity image and the substance discrimination image.
  • the CT system 9 according to this embodiment will be described below with reference to FIGS. 9 to 12, focusing on the differences from the CT system 1 according to the first embodiment. Note that, among the configurations of the CT system 9 according to the present example, the same configurations as those of the CT system according to the first example are given the same reference numerals, and detailed description thereof will be omitted.
  • a machine learning model refers to a learning model using a machine learning algorithm.
  • Specific algorithms for machine learning include the nearest neighbor method, the Naive Bayes method, decision trees, and support vector machines.
  • There is also deep learning, which uses neural networks to generate the feature quantities and connection weighting coefficients for learning by itself.
  • As algorithms using decision trees, there are also methods using gradient boosting, such as LightGBM and XGBoost.
  • Any of the available algorithms can be used as appropriate and applied to the following embodiments and modifications.
  • the teaching data refers to learning data, and is composed of a pair of input data and output data.
  • A trained model is a machine learning model that follows an arbitrary machine learning algorithm, such as deep learning, and has been trained in advance using appropriate teaching data (learning data).
  • Although a trained model is obtained in advance using appropriate learning data, this does not mean that no further learning is performed; additional learning can also be performed after the device is installed at the site of use.
  • FIG. 9 schematically shows an example of the configuration of the CT system 9 according to this embodiment.
  • The image processing device 930 of the CT system 9 according to the present embodiment includes an image quality improvement unit 907 in addition to the components of the image processing device 30 according to the first embodiment.
  • the image quality improvement unit 907 uses the CT image and substance discrimination image generated by the generation unit 302 as input to the trained model to obtain a CT image and substance discrimination image with improved image quality.
  • the analysis unit 303 can perform analysis processing on high-quality CT images, substance discrimination images, etc. acquired by the image quality improvement unit 907. Further, the display control unit 304 according to the present embodiment can cause the display unit 307 to display a high-quality CT image, a substance discrimination image, etc. acquired by the image quality improvement unit 907.
  • The high image quality model may be stored in the storage unit 306 and used by the image quality improvement unit 907 for image quality improvement processing, or it may be provided in an external device.
  • The high image quality model according to this embodiment is a trained model obtained by training according to a machine learning algorithm,
  • using as input data low-quality images under the specific imaging conditions assumed for processing,
  • and as output data the high-quality images corresponding to the input data.
  • Here, the specific imaging conditions specifically include a predetermined imaging site, imaging method, X-ray tube voltage, image size, and the like.
  • the high image quality model according to this embodiment is configured as a module that outputs a high quality image based on an input low quality image.
  • Image quality improvement in this specification refers to generating, from an input image, an image with quality more suitable for image inspection,
  • and a high-quality image refers to an image whose quality is more suitable for image inspection.
  • Low-quality images are, for example, two-dimensional or three-dimensional images obtained by CT or the like, or three-dimensional moving images obtained by continuous CT imaging, taken without settings intended especially for high image quality. Specifically, low-quality images include, for example, images taken at low doses by CT or the like.
  • When high-quality images with low noise or high contrast are used for various analysis processes and image analyses, such as region segmentation of CT images, the analysis can often be performed more accurately than when low-quality images are used. Therefore, high-quality images output by the image quality improvement engine may be useful not only for image inspection but also for image analysis.
  • What image quality is suitable for image inspection depends on what one wishes to inspect in each kind of image inspection. It is therefore difficult to state generally, but image quality suitable for image inspection includes, for example, little noise, high contrast, colors and gradations that make the photographed subject easy to observe, large image size, and high resolution. It can also include image quality from which objects and gradations that do not actually exist but were rendered during the image generation process have been removed.
  • In the image quality improvement processing, processing using various machine learning algorithms such as deep learning is performed.
  • Note that the image quality improvement processing may also use any existing method, such as various image filter processes, matching processing using a database of high-quality images corresponding to similar images, and knowledge-based image processing.
  • FIG. 10 shows an example of the configuration of a high image quality model.
  • the configuration shown in FIG. 10 is composed of a plurality of layer groups that are responsible for processing and outputting a group of input values.
  • the types of layers included in this configuration include a convolution layer, a downsampling layer, an upsampling layer, and a merging layer.
  • the convolution layer is a layer that performs convolution processing on a group of input values according to parameters such as the set filter kernel size, number of filters, stride value, and dilation value. Note that the number of dimensions of the kernel size of the filter may also be changed depending on the number of dimensions of the input image.
  • The downsampling layer is a layer that makes the number of output values smaller than the number of input values by thinning out or combining the input values.
  • Such processing includes, for example, max pooling.
  • The upsampling layer is a layer that makes the number of output values larger than the number of input values by duplicating the input values or adding values interpolated from them.
  • Such processing includes, for example, linear interpolation.
  • The merging layer is a layer that takes value groups from multiple sources, such as the output values of a certain layer or the pixel values composing an image, and merges them by concatenation or addition.
  • In the configuration shown in FIG. 10, the group of pixel values composing the input image 1010 is passed through the convolution processing blocks and then merged, in the merging layer, with the group of pixel values composing the input image 1010.
  • The merged pixel values are then shaped into a high-quality image 1020 in the final convolution layer.
  • better characteristics of the CNN may be obtained not only by changing the parameters as described above but also by changing the configuration of the CNN. Better characteristics include, for example, outputting radiation images with higher accuracy and reduced noise, shorter processing time, and shorter time required for training a machine learning model.
  • The CNN configuration used in this example is a type of U-net machine learning model, which has an encoder function consisting of multiple layers including multiple downsampling layers, and a decoder function consisting of multiple layers including multiple upsampling layers. That is, the CNN configuration includes a U-shaped structure having an encoder function and a decoder function. In a U-net type machine learning model, positional information (spatial information) that becomes ambiguous in the multiple layers configured as the encoder can be used in the layers of the same dimension (the mutually corresponding layers) of the multiple layers configured as the decoder, for example via skip connections.
  • As a modification of the CNN configuration, for example, a batch normalization layer or an activation layer using a rectified linear unit (ReLU) may be incorporated after the convolution layers.
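  • To make this structure concrete, the following is a minimal sketch of a U-net style CNN with a single downsampling/upsampling level and one skip connection, written in PyTorch; the channel counts, layer sizes, and names are illustrative assumptions and do not reproduce the exact configuration of FIG. 10.

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Minimal U-net style model: one encoder level, one decoder level,
    and a skip connection that merges encoder features into the decoder."""

    def __init__(self, channels: int = 16):
        super().__init__()
        # Encoder: convolution layer + batch normalization + ReLU activation
        self.enc = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.down = nn.MaxPool2d(2)  # downsampling layer (max pooling)
        self.mid = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Upsampling layer (interpolation)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear",
                              align_corners=False)
        # Merging by channel concatenation, then a final convolution layer
        self.dec = nn.Conv2d(channels * 2, 1, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        skip = self.enc(x)               # spatial detail kept for the skip path
        y = self.mid(self.down(skip))    # encoder side: downsample and process
        y = self.up(y)                   # decoder side: upsample back
        y = torch.cat([y, skip], dim=1)  # skip connection (merging layer)
        return self.dec(y)               # shape the high-quality output

# Example: enhance a single-channel 64x64 low-quality image
model = TinyUNet()
high_quality = model(torch.randn(1, 1, 64, 64))
print(high_quality.shape)  # torch.Size([1, 1, 64, 64])
```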
  • the GPU can perform efficient calculations by processing more data in parallel. For this reason, when learning is performed multiple times using a machine learning algorithm such as deep learning, it is effective to perform processing using a GPU. Therefore, in this embodiment, a GPU is used in addition to the CPU for processing by the image quality improvement unit 907, which functions as an example of a learning unit. Specifically, when a learning program including a learning model is executed, learning is performed by the CPU and GPU working together to perform calculations. Note that in the processing of the learning section, calculations may be performed only by the CPU or GPU. Furthermore, the image quality improvement processing according to this embodiment may also be implemented using a GPU, similar to the learning section. Note that if the trained model is provided in an external device, the image quality improvement unit 907 does not need to function as a learning unit.
  • the learning section may include an error detection section and an updating section (not shown).
  • the error detection unit obtains an error between output data output from the output layer of the neural network and correct data according to input data input to the input layer.
  • the error detection unit may use a loss function to calculate the error between the output data from the neural network and the correct data.
  • the updating section updates the connection weighting coefficients between the nodes of the neural network, etc., based on the error obtained by the error detection section, so that the error becomes smaller.
  • This updating unit updates the connection weighting coefficients and the like using, for example, an error backpropagation method.
  • the error backpropagation method is a method of adjusting connection weighting coefficients between nodes of each neural network so that the above-mentioned error is reduced.
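  • As an illustration of the error detection unit and updating unit, the following sketch (continuing the PyTorch example above) performs one training step: a loss function measures the error between the network output and the correct data, and error backpropagation updates the connection weighting coefficients. The loss, optimizer, and learning rate are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.optim as optim

criterion = nn.MSELoss()                             # loss function (error detection)
optimizer = optim.Adam(model.parameters(), lr=1e-4)  # model: the TinyUNet above

def train_step(low_q: torch.Tensor, high_q: torch.Tensor) -> float:
    """One update: measure the error between the output data and the
    correct data, then adjust connection weights by backpropagation."""
    optimizer.zero_grad()
    output = model(low_q)             # output data from the output layer
    loss = criterion(output, high_q)  # error against the correct data
    loss.backward()                   # error backpropagation
    optimizer.step()                  # updating unit: make the error smaller
    return loss.item()

loss_value = train_step(torch.randn(1, 1, 64, 64), torch.randn(1, 1, 64, 64))
```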
  • Note that the image size may be adjusted as appropriate so that the CNN can handle it. Specifically, the image size is adjusted by applying padding to the input images, such as the images used as learning data for training the machine learning model and the images input to the high image quality model, or by merging surrounding imaging areas around the input images. In addition, in order to perform image quality improvement effectively, the padded area can be filled with a fixed pixel value, filled with neighboring pixel values, or mirror padded, depending on the characteristics of the image quality improvement method.
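  • The three padding strategies mentioned here correspond directly to standard array-padding modes; a small sketch, assuming NumPy and an arbitrary pad width chosen to fit a model's input size:

```python
import numpy as np

img = np.arange(16, dtype=np.float32).reshape(4, 4)
pad = 2  # hypothetical pad width

fixed    = np.pad(img, pad, mode="constant", constant_values=0)  # fixed pixel value
neighbor = np.pad(img, pad, mode="edge")     # fill with neighboring pixel values
mirror   = np.pad(img, pad, mode="reflect")  # mirror padding
```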
  • the image quality improvement processing in the image quality improvement unit 907 may be performed using only one image processing method, or may be performed using a combination of two or more image processing methods.
  • a plurality of image quality improvement method groups may be implemented in parallel to generate a plurality of high quality image groups, and then the highest quality image may be finally selected as the high image quality image.
  • The selection of the highest quality image may be performed automatically using an image quality evaluation index, or the group of multiple high-quality images may be displayed on a UI (User Interface) provided on the display unit 307 or the like, with the selection performed in response to instructions from the examiner (operator).
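  • A minimal sketch of the automatic selection path, assuming a list of candidate enhancement functions and an image quality evaluation index such as the SNR estimate sketched earlier (all names hypothetical):

```python
def best_enhanced(image, methods, quality_index):
    """Apply each image quality improvement method and keep the result
    with the highest image quality evaluation index."""
    candidates = [method(image) for method in methods]
    return max(candidates, key=quality_index)
```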
  • parameters may be input to the high-quality model together with the low-quality image.
  • a parameter specifying the degree of image quality enhancement or a parameter specifying the image filter size used in the image processing method may be input to the image quality enhancement model together with the input image.
  • The input data of the learning data according to this embodiment is a low-quality image acquired with the same model of apparatus as the gantry device 10, using the same settings as the gantry device 10.
  • the output data of the learning data of the high-quality model is a high-quality image obtained using image processing such as overlay processing, for example.
  • the output data can be, for example, a high-quality image obtained by performing superimposition processing such as averaging on a group of images obtained by photographing a plurality of times.
  • The output data of the learning data may also be, for example, a high-quality image calculated from an image obtained by imaging at a higher dose than the dose related to the input data.
  • By using such learning data, the image quality improvement unit 907 can output a high-quality image in which noise has been reduced as if by superimposition processing or the like. The image quality improvement unit 907 can therefore generate a high-quality image suitable for image inspection based on the low-quality image given as the input image.
  • the output data of the learning data may be a high-quality image corresponding to the input data, and may be, for example, an image subjected to contrast correction to be suitable for inspection, an image with high resolution, or the like.
  • an image obtained by performing image processing using statistical processing such as maximum a posteriori probability estimation (MAP estimation) processing on a low-quality image that is input data can also be used as the output data of the learning data. Note that any known method may be used to generate the high-quality image.
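  • As an illustration of constructing such a training pair, the sketch below averages a group of repeated acquisitions to form the high-quality target while a single acquisition serves as the low-quality input; the array shapes, noise model, and number of repeats are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for repeated scans of the same section:
# a common structure plus independent noise per acquisition.
clean = rng.random((64, 64)).astype(np.float32)
scans = [clean + 0.1 * rng.standard_normal((64, 64)).astype(np.float32)
         for _ in range(8)]

input_image  = scans[0]                # one noisy image: learning-data input
target_image = np.mean(scans, axis=0)  # averaged (superimposed) image: output
```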
  • a plurality of image quality improvement models may be prepared, each of which independently performs various image quality improvement processes such as noise reduction, contrast adjustment, and further resolution enhancement.
  • one image quality improvement model that performs at least two image quality improvement processes may be prepared.
  • a high-quality image corresponding to the desired processing may be used as the output data of the learning data.
  • For a high image quality model that performs an individual process such as noise reduction, a high-quality image that has been subjected to that individual process may be used as the output data of the learning data.
  • For a high image quality model that performs a plurality of image quality improvement processes, for example, a high-quality image that has been subjected to both noise reduction processing and contrast correction processing may be used as the output data of the learning data.
  • In this way, a trained model that generates high-quality images with improved image quality can be obtained.
  • With such a trained model, for example, a high-quality image can be obtained using a low-quality image taken at a low dose as input, so the radiation dose used for imaging can be reduced and the occurrence of pile-up in the photon counting technology can be expected to be suppressed.
  • Note that a trained model may be prepared for each type of image. For example, by performing training using learning data consisting of pairs of low-quality CT images and high-quality CT images, it is possible to obtain a trained model that outputs a high-quality CT image from a low-quality CT image given as input. Similarly, by performing training using learning data consisting of pairs of low-quality substance discrimination images and high-quality substance discrimination images, it is possible to obtain a trained model that outputs a high-quality substance discrimination image from a low-quality substance discrimination image given as input.
  • FIG. 11 shows a flowchart of a series of processes according to this embodiment.
  • the same reference numerals as those in the first embodiment will be used for the same processes as the series of processes according to the first embodiment shown in FIG. 3, and detailed explanations will be omitted.
  • In this embodiment, the acquisition unit 301 can acquire data obtained by photographing the subject S at a low dose, for example.
  • The image quality improvement unit 907 obtains a high-quality CT image and a high-quality substance discrimination image by using the CT image and the substance discrimination image generated by the generation unit 302 as input to the high image quality model.
  • At this time, the high image quality model may be prepared for each type of image or each type of substance, and the image quality improvement unit 907 can select and use a model according to the type of image or substance to be processed.
  • a plurality of high-image quality models may be provided depending on imaging conditions such as dose and imaging site.
  • the image quality improvement unit 907 can perform image quality improvement processing using an image quality improvement model according to the shooting conditions of the image used as input.
  • Note that trained models corresponding to image types, substance types, and imaging conditions can be obtained by training using learning data for each type of image, learning data for each type of substance, and learning data for each imaging condition.
  • In step S303, the analysis unit 303 performs analysis processing on the high-quality CT image and substance discrimination image acquired by the image quality improvement unit 907.
  • the analysis process may be the same as the analysis process performed in step S303 according to the first embodiment. Note that, similarly to the first embodiment, the analysis unit 303 can also perform analysis processing on a CT image or a substance discrimination image before the image quality is improved.
  • In step S304, the display control unit 304 causes the display unit 307 to display the high-quality CT image and the high-quality substance discrimination image acquired by the image quality improvement unit 907 side by side, switched, or superimposed. Furthermore, the display control unit 304 causes the display unit 307 to display the analysis results obtained by the analysis unit 303 using the high-quality images. Note that the display screen displayed on the display unit 307 by the display control unit 304 may be similar to the display screens described in the first embodiment, with the high-quality CT image and substance discrimination image displayed as the CT image and substance discrimination image.
  • FIG. 12 shows an example of a display screen 1201 according to this embodiment.
  • the display screen 1201 is similar to the display screen 501 shown in FIG. 5, but the display screen 1201 further includes a high image quality button 1280.
  • When the image quality improvement button 1280 is operated by the operator via the input unit 308, the display control unit 304 switches the CT image 540 and the substance discrimination image 550 all at once to the CT image and substance discrimination image after image quality improvement.
  • the image quality improvement button 1280 may be provided for each image whose image quality is desired to be improved.
  • the processing can be similarly applied to the other display screens 601, 701, 702, 703, 801, 802, etc. described in the first embodiment. Further, the present invention can be similarly applied to screens such as the thumbnail list and temporal differences described in the first embodiment.
  • When switching the displayed images, the display control unit 304 can also switch the currently displayed analysis results to the analysis results corresponding to the images to be displayed. For example, when the image quality improvement button 1280 is operated and the CT image 540 and substance discrimination image 550 are switched to high-quality images, the display control unit 304 can switch the analysis results 543 and 553 all at once to the analysis results for the high-quality images. When an image quality improvement button is provided for each image, the display control unit 304 can switch, for the image being switched, between the analysis results of the image before image quality improvement and those of the image after image quality improvement. In step S304, when the display control process by the display control unit 304 ends, the series of processes according to this embodiment ends.
  • As described above, the display control unit 304 according to this embodiment switches at least one of the radiation intensity image and the substance discrimination image, in accordance with an instruction from the operator,
  • to a higher-quality image obtained by using that image as input to the trained model, and causes the display unit 307 to display it.
  • the operator can appropriately observe the high-quality radiation intensity image and the substance discrimination image in a state where it is easy to compare them, and can perform observation more efficiently.
  • the trained model may include at least one of a plurality of trained models according to the type of image and a plurality of trained models according to the type of substance to be discriminated.
  • In this case, an image that has undergone more appropriate image quality improvement processing can be obtained according to the type of image and the type of substance to be discriminated, and by observing such high-quality images the operator can perform observation more efficiently.
  • the processing using the learned model is not limited to image quality improvement processing, but may also be segmentation processing or the like.
  • the learning data of the trained model may be a CT image or a substance discrimination image as input data, and a label image obtained by labeling each region of the input data by a doctor or the like may be used as output data.
  • the output data may be a label image labeled by a known rule-based segmentation process.
  • Here, rule-based processing refers to processing based on, for example, the regularity of tissues.
  • the configuration of the machine learning model may be similar to the image quality improvement model described above.
  • In this case, the image processing device 930 can obtain an image in which each region of at least one of the radiation intensity image and the substance discrimination image generated by the generation unit 302 is labeled, by using that image as input to the trained model.
  • the learned model may also be prepared for each type of image or type of substance.
  • the image processing device 930 can select and use a trained model for segmentation processing, depending on the type of image to be segmented and the type of substance targeted for discrimination. Further, a plurality of trained models may be provided depending on imaging conditions such as dose and imaging site. In this case, the image processing device 930 can perform segmentation processing using a learned model that corresponds to the shooting conditions of the image used as input.
  • Note that trained models corresponding to image types, substance types, and imaging conditions can be obtained by training using learning data for each type of image, learning data for each type of substance, and learning data for each imaging condition.
  • In this case, the display control unit 304 can display the acquired label image on the display screen as an analysis result. A corresponding switching button may be provided for displaying the analysis results obtained using the trained model, and the display control unit 304 may control ON/OFF of the analysis result display according to the operation of the switching button by the operator. The display control unit 304 can also switch the acquired label image with at least one of the corresponding radiation intensity image and substance discrimination image and display it on the display unit 307.
  • The display control unit 304 may change, depending on the imaging conditions of the image, whether or not to display the image quality improvement button and the switching button for receiving instructions from the operator regarding processing using the trained model for segmentation. For example, when displaying an image whose imaging conditions do not match the imaging conditions of the learning data of the prepared trained models, the display control unit 304 can refrain from displaying the image quality improvement button and the segmentation switching button on the display unit 307. On the other hand, when displaying an image whose imaging conditions match the imaging conditions of the learning data of the prepared trained models, the display control unit 304 can display the image quality improvement button and the segmentation switching button on the display unit 307.
  • the shooting conditions may include the type of image.
  • For example, if a trained model for CT images is not prepared, the display control unit 304 can refrain from displaying, on a display screen that displays CT images, a switching button for processing that uses a trained model for CT images.
  • Further, the image quality improvement unit 907 and the image processing device 930 may determine, based on the imaging conditions, whether the processing can be performed. In this case, the display control unit 304 may change whether or not to display the image quality improvement button and the segmentation-related switching button according to the determination results of the image quality improvement unit 907 and the image processing device 930. For example, if the image quality improvement unit 907 determines from the imaging conditions that image quality improvement processing cannot be performed, the display control unit 304 can hide the image quality improvement button from the display screen based on that determination.
  • Alternatively, the display control unit 304 may display a message on the display screen to the effect that the processing cannot be performed.
  • Note that the image quality improvement processing and the segmentation processing using the trained model may be executed in response to the operation of the corresponding image quality improvement button or switching button.
  • In that case too, if the processing cannot be performed, the display control unit 304 may display a message on the display screen to the effect that the processing cannot be performed.
  • When displaying a high-quality image or a label image, the display control unit 304 may also display a message indicating that these images were obtained using a machine learning model.
  • In this case, the operator can observe the image while understanding that it was obtained using a machine learning model, which helps prevent erroneous judgments caused by the processing using the machine learning model.
  • the learning data for the above-mentioned high image quality model and trained model for segmentation is not limited to data obtained using the photographing device itself that actually performs photographing.
  • Depending on the desired configuration, the learning data may be data obtained using an imaging device of the same model, data obtained using an imaging device of the same type, or the like.
  • Note that instructions from the examiner regarding display, analysis, image quality improvement processing, segmentation processing, and the like may be given by voice or the like,
  • in addition to manual instructions (for example, instructions using a user interface).
  • In this case, for example, a machine learning model including a speech recognition model (speech recognition engine, trained model for speech recognition) obtained by machine learning may be used.
  • The manual instruction may also be an instruction by character input using a keyboard, a touch panel, or the like.
  • In this case, a machine learning model including a character recognition model (character recognition engine, trained model for character recognition) obtained by machine learning may be used.
  • Further, the instruction from the examiner may be an instruction using a gesture or the like.
  • In this case, a machine learning model including a gesture recognition model (gesture recognition engine, trained model for gesture recognition) obtained by machine learning may be used.
  • the instruction from the examiner may be the result of detecting the examiner's line of sight on the monitor.
  • the line of sight detection result may be, for example, a pupil detection result using a moving image of the examiner taken from around the monitor. At this time, the pupil detection from the moving image may be performed using the object recognition engine as described above.
  • the instructions from the examiner may be instructions based on brain waves, weak electrical signals flowing through the body, or the like.
  • In these cases, the learning data may use, as input data, character data or audio data (waveform data) indicating instructions for displaying radiation intensity images, substance discrimination images, superimposed images, and the like,
  • and, as output data, execution commands for actually displaying the various images on the display unit 307.
  • For example, the learning data may use, as input data, character data or audio data indicating an instruction to display a high-quality image obtained by the high image quality model,
  • and, as correct data, an execution command for displaying the high-quality image and an execution command for changing the image quality improvement button to the active state.
  • Similarly, the learning data may use, as input data, character data or audio data indicating an instruction to display a label image obtained by the trained model for segmentation,
  • and, as correct data, an execution command for displaying the label image and an execution command for changing the corresponding switching button to the active state.
  • the learning data may be anything as long as the instruction content and the execution command content indicated by character data or audio data correspond to each other, for example.
  • audio data may be converted into character data using an acoustic model, a language model, or the like.
  • processing may be performed to reduce noise data superimposed on audio data using waveform data obtained by a plurality of microphones.
  • instructions using text or voice, etc., and instructions using a mouse, touch panel, etc. may be selectable in accordance with instructions from the examiner. Furthermore, it may be configured such that turning on/off of instructions by text or voice can be selected in accordance with instructions from the examiner.
  • Machine learning includes deep learning as described above, and, for example, a recurrent neural network (RNN) can be used for at least part of the multi-layer neural network.
  • Here, as an example of the machine learning model, an RNN, which is a neural network that handles time-series information, and an LSTM (Long Short-Term Memory), which is a derivative of the RNN, will be described.
  • FIG. 13A shows the structure of RNN, which is a machine learning model.
  • The RNN 1320 has a loop structure in its network; it takes data x_t 1310 as input at time t and outputs data h_t 1330. Since the RNN 1320 has a loop function in the network, the state at the current time can be carried over to the next time, so time-series information can be handled.
  • FIG. 13B shows an example of the input and output of the parameter vectors at time t.
  • The data x_t 1310 contains N pieces of data (Params1 to ParamsN).
  • The data h_t 1330 output by the RNN 1320 contains N pieces of data (Params1 to ParamsN) corresponding to the input data.
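  • As a concrete illustration of this loop structure, a minimal sketch of an RNN step in plain NumPy follows; the hidden-state size, weight initialization, and tanh nonlinearity are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, H = 4, 8  # N parameters (Params1..ParamsN) per step, hidden size H

W_xh = rng.standard_normal((H, N)) * 0.1
W_hh = rng.standard_normal((H, H)) * 0.1
W_hy = rng.standard_normal((N, H)) * 0.1

def rnn_step(x_t, h_prev):
    """One time step: the hidden state carries the current state over to
    the next time, which is what lets the network handle time series."""
    h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev)
    y_t = W_hy @ h_t  # N outputs corresponding to the N inputs
    return y_t, h_t

h = np.zeros(H)
for x_t in rng.standard_normal((5, N)):  # a length-5 time series
    y_t, h = rnn_step(x_t, h)
```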
  • FIG. 14A shows the structure of LSTM.
  • In the LSTM, the information that the network carries over to the next time t is the internal state c_{t-1} of the network, called the cell, and the output data h_{t-1}.
  • Details of the LSTM 1440 are shown in FIG. 14B.
  • In FIG. 14B, FG denotes a forget gate network,
  • IG an input gate network,
  • and OG an output gate network, each of which is a sigmoid layer and therefore outputs a vector in which each element takes a value between 0 and 1.
  • The forget gate network FG determines how much past information to retain,
  • and the input gate network IG determines which values to update.
  • CU is a cell update candidate network, a tanh activation layer, which creates a vector of new candidate values to be added to the cell.
  • The output gate network OG selects elements of the cell candidates and determines how much information to convey at the next time.
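  • Putting the gates together, the following is a minimal sketch of one LSTM cell update in NumPy, matching the FG/IG/CU/OG roles above; the weight shapes and initialization are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step. W maps the concatenated [h_prev, x_t] to the four
    gate pre-activations; b is the corresponding bias vector."""
    z = W @ np.concatenate([h_prev, x_t]) + b
    f, i, g, o = np.split(z, 4)
    f = sigmoid(f)            # FG: how much of the past cell state to keep
    i = sigmoid(i)            # IG: which values to update
    g = np.tanh(g)            # CU: new candidate values for the cell
    o = sigmoid(o)            # OG: how much cell information to output
    c_t = f * c_prev + i * g  # updated internal state (cell) c_t
    h_t = o * np.tanh(c_t)    # output data h_t carried to the next time
    return h_t, c_t

# Example with hidden size 8 and input size 4
rng = np.random.default_rng(0)
H, N = 8, 4
W = rng.standard_normal((4 * H, H + N)) * 0.1
b = np.zeros(4 * H)
h, c = lstm_step(rng.standard_normal(N), np.zeros(H), np.zeros(H), W, b)
```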
  • Note that the LSTM model described above is a basic form, and the network is not limited to the one shown here. The coupling between networks may be changed. Instead of the LSTM, a QRNN (Quasi-Recurrent Neural Network) may be used. Furthermore, the machine learning model is not limited to neural networks, and boosting, support vector machines, and the like may also be used. In addition, if the examiner's instructions are input by text or voice, techniques related to natural language processing (for example, sequence-to-sequence models) may be applied. Further, a dialogue engine (a dialogue model, a trained model for dialogue) that responds to the examiner with text or voice output may be applied.
  • In the above-described trained models for image quality improvement and segmentation, it is thought that the magnitude of the brightness values of the input image, together with the order, slope, position, distribution, and continuity of its bright and dark areas, is extracted as part of the feature values and used for the inference processing.
  • Meanwhile, since the trained models for speech recognition, character recognition, gesture recognition, and the like are trained using time-series data, it is thought that they extract the slope between consecutive input time-series values as part of the feature values and use it for the estimation processing. Such trained models are therefore expected to perform accurate estimation by using the influence of temporal changes in specific numerical values in the estimation processing.
  • the above-described high image quality model and trained model for segmentation can be provided in the image processing device 930.
  • the trained model may be configured with a software module executed by a processor such as a CPU, MPU, GPU, or FPGA, or may be configured with a circuit that performs a specific function such as an ASIC.
  • the learned model may be provided in another server device connected to the image processing device 930.
  • the image processing device 930 can use the trained model by connecting to a server or the like that includes the trained model via any network such as the Internet.
  • the server provided with the trained model may be, for example, a cloud server, a fog server, an edge server, or the like.
  • the reliability of the network may be improved by configuring it to use radio waves in a dedicated wavelength band.
  • the network may be constructed using wireless communication that is capable of high speed, large capacity, low delay, and multiple simultaneous connections.
  • In the above examples, the generation unit 302 generates the radiation intensity images, such as CT images, and the substance discrimination images.
  • Alternatively, the acquisition unit 301 may acquire radiation intensity images and substance discrimination images from an external image processing device or storage device (not shown) via an arbitrary network, and the acquired images may be used for the display processing, other image processing, and the like.
  • The present invention can also be realized by supplying a program that implements one or more functions of the above-described embodiments to a system or device via a network or storage medium, and having one or more processors in a computer of the system or device read and execute the program. It can also be realized by a circuit (for example, an ASIC) that implements one or more of the functions.
  • a computer has one or more processors or circuits and may include separate computers or a network of separate processors or circuits for reading and executing computer-executable instructions.
  • The processor or circuit may include a central processing unit (CPU), a micro processing unit (MPU), a graphics processing unit (GPU), an application specific integrated circuit (ASIC), or a field programmable gate array (FPGA).
  • the processor or circuit may also include a digital signal processor (DSP), a data flow processor (DFP), or a neural processing unit (NPU).

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Molecular Biology (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Quality & Reliability (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Pulmonology (AREA)
  • Physiology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
PCT/JP2023/030194 2022-08-29 2023-08-22 画像処理装置、撮影システム、画像処理方法、及びプログラム Ceased WO2024048374A1 (ja)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US19/059,419 US20250191253A1 (en) 2022-08-29 2025-02-21 Image processing apparatus, imaging system, image processing method, and computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-136204 2022-08-29
JP2022136204A JP2024032518A (ja) 2022-08-29 2022-08-29 画像処理装置、撮影システム、画像処理方法、及びプログラム

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US19/059,419 Continuation US20250191253A1 (en) 2022-08-29 2025-02-21 Image processing apparatus, imaging system, image processing method, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2024048374A1 (ja)

Family

ID=90099462

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/030194 Ceased WO2024048374A1 (ja) 2022-08-29 2023-08-22 画像処理装置、撮影システム、画像処理方法、及びプログラム

Country Status (3)

Country Link
US (1) US20250191253A1 (en)
JP (1) JP2024032518A (ja)
WO (1) WO2024048374A1 (ja)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2025154760 2024-03-29 2025-10-10 Fujifilm Corporation Information processing apparatus, information processing method, and program

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014117568A (ja) * 2012-12-19 2014-06-30 Toshiba Corp X-ray CT apparatus, image processing apparatus, and image processing program
JP2015043975A (ja) * 2013-07-31 2015-03-12 Toshiba Corp Medical image diagnostic apparatus and ultrasonic diagnostic apparatus
JP2016032635A (ja) * 2014-07-30 2016-03-10 Toshiba Corp Photon-counting X-ray CT apparatus
WO2017115532A1 (ja) * 2015-12-28 2017-07-06 Canon Inc Radiation imaging apparatus, radiation imaging method, and program
US20180122108A1 (en) * 2016-10-31 2018-05-03 Samsung Electronics Co., Ltd. Medical imaging apparatus and method of processing medical image
JP2018527966A (ja) * 2015-06-30 2018-09-27 Koninklijke Philips N.V. X-ray device with reduced pile-up
JP2019202035A (ja) * 2018-05-25 2019-11-28 Fujifilm Corporation Bone mineral information acquisition apparatus, method, and program
WO2019225204A1 (ja) * 2018-05-25 2019-11-28 Canon Inc Radiation imaging apparatus, radiation imaging system, radiation imaging method, and program
JP2020039872A (ja) * 2018-09-07 2020-03-19 Canon Medical Systems Corporation X-ray CT apparatus, medical image processing apparatus, and X-ray CT system
WO2022065318A1 (ja) * 2020-09-28 2022-03-31 Fujifilm Corporation Image processing apparatus, image processing method, and image processing program

Also Published As

Publication number Publication date
US20250191253A1 (en) 2025-06-12
JP2024032518A (ja) 2024-03-12

Similar Documents

Publication Publication Date Title
JP6985482B2 X-ray computed tomography apparatus, scan plan setting support apparatus, medical image diagnostic system, control method, and control program
US11207041B2 (en) X-ray CT system and medical processing apparatus
US11162909B2 (en) System and method for colorizing a radiograph from cabinet X-ray systems
JP7553672B2 Medical processing apparatus, X-ray CT system, and processing program
JP7583530B2 Medical information processing apparatus, medical image diagnostic apparatus, and medical information processing method
EP3760126B1 (en) Systems and methods for high-resolution spectral computed tomography imaging
JP7466401B2 Medical image diagnostic apparatus
US20240070862A1 (en) Medical information processing method and medical information processing apparatus
US20250191253A1 (en) Image processing apparatus, imaging system, image processing method, and computer-readable storage medium
JP7467253B2 X-ray CT system and medical processing apparatus
US12042319B2 (en) Medical image processing apparatus, x-ray CT apparatus, method of medical image processing, and computer program product
JP7619869B2 Medical image processing method, medical image processing apparatus, and X-ray CT apparatus
JP6466079B2 X-ray computed tomography apparatus and scan plan setting support apparatus
JP7426310B2 X-ray computed tomography imaging apparatus
CN116137028 Medical image processing apparatus, medical image processing method, and storage medium
JP2021013489 X-ray CT system and medical processing apparatus
US20240108302A1 (en) Method for identifying interventional object, imaging system, and non-transitory computer-readable medium
JP2020096693 X-ray CT system and processing program
US20250195014A1 (en) Medical image processing apparatus, control method therefor, and recording medium for medical image processing program
JP7433809B2 Method for generating a trained model, and medical processing apparatus
US20240095977A1 (en) Reconstruction device, x-ray ct apparatus, and image processing device
JP2024043514 Reconstruction apparatus, X-ray CT apparatus, and image processing apparatus
JP2024046604 Medical image processing apparatus, control method therefor, and medical image processing program
JP2024053264 X-ray CT apparatus, information processing system, information processing method, and program
JP2024053263 X-ray CT apparatus, information processing system, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23860137

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 23860137

Country of ref document: EP

Kind code of ref document: A1