WO2019130836A1 - Radiation imaging apparatus, image processing apparatus, and image determination method - Google Patents

Radiation imaging apparatus, image processing apparatus, and image determination method

Info

Publication number
WO2019130836A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
radiation
subject
images
estimation
Prior art date
Legal status
Ceased
Application number
PCT/JP2018/041320
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
友彦 松浦
Current Assignee
Canon Inc
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to CN201880084102.2A priority Critical patent/CN111526795B/zh
Priority to EP18895945.6A priority patent/EP3714791B1/en
Publication of WO2019130836A1 publication Critical patent/WO2019130836A1/ja
Priority to US16/902,686 priority patent/US11210809B2/en


Classifications

    • G06T7/74: Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • A61B6/4233: Arrangements for detecting radiation specially adapted for radiation diagnosis, characterised by using matrix detectors
    • A61B6/4266: Arrangements for detecting radiation specially adapted for radiation diagnosis, characterised by using a plurality of detector units
    • A61B6/4283: Arrangements for detecting radiation specially adapted for radiation diagnosis, characterised by a detector unit being housed in a cassette
    • A61B6/505: Specially adapted for diagnosis of bone
    • A61B6/5217: Extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B6/5241: Combining overlapping images of the same imaging modality, e.g. by stitching
    • G06T3/4038: Image mosaicing, e.g. composing plane images from plane sub-images
    • G06V40/10: Human or animal bodies; body parts, e.g. hands
    • G06T2207/10116: X-ray image
    • G06T2207/20021: Dividing image into blocks, subimages or windows
    • G06T2207/20081: Training; Learning
    • G06T2207/20221: Image fusion; Image merging
    • G06T2207/30196: Human being; Person

Definitions

  • The present invention relates to an image processing apparatus that processes radiation images captured using radiation, a control method therefor, and a radiation imaging apparatus provided with the image processing apparatus.
  • Patent Document 1 discloses a radiation imaging system that performs long-length imaging by arranging a plurality of radiation detection devices side by side.
  • In this system, the plurality of radiation detection devices capture a plurality of radiation images (partial images), and the image processing device combines them to generate a single radiation image (long image) depicting the entire wide observation range.
  • Each of the plurality of radiation detection devices used in the radiation imaging system is portable. Therefore, a plurality of radiation detection devices can be installed on a dedicated long imaging frame to perform long-length imaging, and one of them can then be taken out of the frame to perform general imaging, allowing such uses to be repeated.
  • Patent Document 2 discloses a method of capturing images so that the end portions of adjacent radiation detection devices overlap, and of determining the arrangement order for combination based on the similarity of the end portions of the obtained images.
  • It is desirable that a captured image be displayed on the screen in a vertical orientation suitable for diagnosis, and a captured image obtained by long-length imaging is no exception.
  • With the method of Patent Document 2, a plurality of images can be correctly connected to generate a long image, but the direction of the subject in the long image cannot be determined. It is conceivable to provide a gravity sensor capable of detecting the direction of gravity and to determine the vertical direction of the image based on the detected direction of gravity.
  • However, when the subject is imaged lying down, for example when a trunk region such as the chest or abdomen is imaged, the direction of gravity does not correspond to the vertical direction of the image, so the vertical direction of the long image cannot be determined in this way.
  • An image processing apparatus according to the present invention has the following configuration: an acquisition unit configured to acquire a plurality of radiation images generated by detecting, with a plurality of radiation detection devices, radiation irradiated to a subject; a combining unit configured to combine the plurality of radiation images to generate a long image; an estimation unit configured to estimate the direction of the subject in each of the plurality of radiation images; and a determination unit configured to determine the direction of the subject in the long image based on the estimation results of the estimation unit for the plurality of radiation images.
  • According to the present invention, it is possible to accurately determine the direction of a subject in a long image obtained by long-length imaging using a plurality of radiation detection devices.
  • FIG. 1 is a block diagram showing a configuration example of a radiation imaging apparatus according to the embodiment.
  • FIG. 2 is a flowchart showing the processing procedure of the radiation imaging apparatus in the embodiment.
  • FIG. 3 is a diagram for explaining generation of radiation images and long images obtained from a plurality of radiation detection devices.
  • FIG. 4 is a diagram for explaining the vertical direction of the long image.
  • FIG. 5 is a flowchart showing the processing procedure of the vertical direction determination unit of the first embodiment.
  • FIG. 6 is a flowchart showing the processing procedure of the vertical direction determination unit of the second embodiment.
  • FIG. 7 is a flowchart showing the processing procedure of the vertical direction determination unit of the third embodiment.
  • FIG. 8 is a block diagram showing details of the vertical direction determination unit according to the first embodiment.
  • FIG. 9 is a block diagram showing details of the vertical direction determination unit according to the third embodiment.
  • In the present embodiment, an example will be described using an indirect-type FPD, which performs radiation imaging by converting the irradiated radiation into visible light with a phosphor (scintillator) and detecting the obtained visible light with photodiodes.
  • FPD stands for flat panel detector.
  • FIG. 1 is a diagram showing a functional configuration of a radiation imaging apparatus 100 according to the present embodiment.
  • The radiation imaging apparatus 100 includes a radiation generation unit 101, a radiation generating apparatus 104, and an imaging control apparatus 130.
  • The radiation generation unit 101 irradiates the subject 103 with radiation.
  • The radiation generating apparatus 104 applies a high-voltage pulse to the radiation generation unit 101 in response to depression of an exposure switch (not shown) to generate radiation.
  • The FPDs 102a and 102b, serving as the plurality of radiation detection devices, convert the radiation that has passed through the subject 103 into visible light with a phosphor, and detect the converted visible light with photodiodes to obtain analog electrical signals.
  • The analog electrical signals detected by the FPDs 102a and 102b are converted into digital data (A/D converted) and transmitted as image data to the FPD control units 105a and 105b.
  • In the present embodiment, an example is shown in which long-length imaging is performed using two sets of an FPD 102 and an FPD control unit 105.
  • The FPD control units 105a and 105b are connected to the CPU bus 106.
  • The FPD control units 105a and 105b are an example of an acquisition unit that acquires a plurality of radiation images generated by detecting, with a plurality of radiation detection devices, the radiation irradiated to the subject.
  • However, the acquisition unit is not limited to this example.
  • For example, the radiation images may be acquired from the HIS or RIS via a network I/F 118 described later.
  • HIS stands for hospital information system, and RIS for radiology information system.
  • An image processing unit 110, an image storage unit 111, an operation unit 112, and a display unit 113 are further connected to the CPU bus 106. In addition, a CPU 114, a ROM 115, a RAM 116, a display control unit 117, a network I/F 118, and the like, as provided in a general computer, are also connected to the CPU bus 106.
  • The CPU 114 is a central processing unit, and controls the entire imaging control apparatus 130 by executing programs stored in the ROM 115 or the RAM 116.
  • The ROM 115 is a read-only non-volatile memory, and stores various programs executed by the CPU 114 as well as various data.
  • The RAM 116 is a volatile random-access memory that can be read and written at any time, and provides a work area used by the CPU 114.
  • The display control unit 117 performs various displays on the display unit 113 under the control of the CPU 114. For example, the display control unit 117 displays the long image on the display unit 113 according to the direction of the subject determined by the vertical direction determination unit 123.
  • The network I/F 118 connects the imaging control apparatus 130 to a network.
  • The radiation imaging apparatus 100 is connected to an information system such as the HIS or RIS via the network I/F 118, for example.
  • The image processing unit 110 includes an image correction unit 120, an image combining unit 121, a diagnostic image processing unit 122, and a vertical direction determination unit 123.
  • The image correction unit 120 performs various correction processes, so-called pre-processing, such as offset correction, sensitivity correction, and defective pixel correction, which correct characteristic variations of the solid-state imaging elements of the FPDs 102a and 102b.
  • The image combining unit 121 combines a plurality of radiation images to generate a long image. In the present embodiment, the image combining unit 121 combines a plurality of radiation images captured by the plurality of FPDs (FPDs 102a and 102b) to generate one long image.
  • The diagnostic image processing unit 122 performs diagnostic image processing such as gradation processing, dynamic range processing, and spatial frequency processing.
  • The vertical direction determination unit 123 determines, by image analysis, the vertical direction of the subject in the long image, which is the captured image.
  • The vertical direction determination unit 123 will be described in detail with reference to FIG. 8.
  • The image processing unit 110 may be realized by the CPU 114 executing a predetermined program, or part or all of it may be realized by dedicated hardware.
  • The image storage unit 111 is a large-capacity storage device for storing the radiation images output from the FPD control units 105a and 105b and the radiation images processed by the image processing unit 110.
  • The operation unit 112 inputs user instructions to the image processing unit 110 and the FPD control units 105a and 105b.
  • The display unit 113 displays, for example, a long image generated by the image processing unit 110 under the control of the display control unit 117. At this time, the display control unit 117 controls the display direction of the long image, for example so that the head of the subject points upward, according to the vertical direction determined by the vertical direction determination unit 123.
  • In step S201, the radiation generating apparatus 104 drives the radiation generation unit 101 to irradiate the subject 103 with radiation.
  • The radiation generating apparatus 104 applies a high-voltage pulse to the radiation generation unit 101 upon depression of an exposure switch (not shown) to generate radiation.
  • In step S202, the FPD control unit 105a and the FPD control unit 105b drive the FPDs 102a and 102b, respectively, to acquire radiation images.
  • The FPDs 102a and 102b, having been irradiated with radiation, accumulate charge proportional to the radiation dose in their detection elements and output image data based on the accumulated charge.
  • The FPD control units 105a and 105b receive the image data from the FPDs 102a and 102b, respectively, and store it as radiation images in the image storage unit 111 or the RAM 116.
  • For example, as shown in FIG. 3, the FPD control unit 105a acquires a first radiation image 303 from the FPD 102a disposed in the first imaging range 301, and the FPD control unit 105b acquires a second radiation image 304 from the FPD 102b disposed in the second imaging range 302.
  • In step S203, the image correction unit 120 performs image correction on the first radiation image 303 and the second radiation image 304 stored in the image storage unit 111 or the RAM 116.
  • Specifically, various correction processes, so-called pre-processing, such as offset correction, sensitivity correction, and defective pixel correction, are performed to correct characteristic variations of the solid-state imaging devices included in each of the FPD 102a and the FPD 102b.
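The pre-processing above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the use of a dark-frame offset, a per-pixel gain map, a boolean defect mask, and the neighbour-averaging interpolation for defective pixels are all assumptions.

```python
import numpy as np

def preprocess(raw, offset, gain, defect_mask):
    """Apply offset, sensitivity (gain), and defective-pixel correction.

    raw, offset, gain: arrays (or scalars) broadcastable to the image shape;
    defect_mask: boolean array, True where a pixel is defective.
    All names and the interpolation scheme are illustrative.
    """
    corrected = (raw - offset) / gain  # offset + sensitivity correction
    out = corrected.copy()
    # Replace each defective pixel with the mean of its valid 8-neighbours.
    ys, xs = np.nonzero(defect_mask)
    for y, x in zip(ys, xs):
        y0, y1 = max(y - 1, 0), min(y + 2, corrected.shape[0])
        x0, x1 = max(x - 1, 0), min(x + 2, corrected.shape[1])
        patch = corrected[y0:y1, x0:x1]
        valid = ~defect_mask[y0:y1, x0:x1]
        if valid.any():
            out[y, x] = patch[valid].mean()
    return out
```

Each FPD would have its own offset frame, gain map, and defect mask, reflecting the per-device characteristic variations the text mentions.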
  • In step S204, the image combining unit 121 generates the long image 305 from the first radiation image 303 and the second radiation image 304 that underwent image correction in step S203.
  • That is, the image combining unit 121 determines the bonding position of the first radiation image 303 and the second radiation image 304, and generates the long image 305 by joining the two images.
  • The method of determining the bonding position is not particularly limited; for example, information on the relative positions of the FPD 102a and the FPD 102b may be acquired separately and the position determined based on this information.
  • Alternatively, the bonding position can be determined by applying an image analysis technique such as template matching to the overlapping region of the first radiation image 303 and the second radiation image 304.
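A sketch of this template-matching approach follows, assuming a simple model in which the two partial images overlap by an unknown number of rows and the best overlap maximises the normalised cross-correlation of the overlapping strips. The function names and the row-wise overlap model are illustrative, not taken from the patent.

```python
import numpy as np

def find_join_row(upper, lower, max_overlap):
    """Estimate the overlap (in rows) between two partial images by trying
    every candidate overlap height and scoring the agreement of the
    overlapping strips with normalised cross-correlation."""
    best_overlap, best_score = 1, -np.inf
    for ov in range(1, max_overlap + 1):
        a = upper[-ov:, :].ravel().astype(float)
        b = lower[:ov, :].ravel().astype(float)
        a, b = a - a.mean(), b - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        score = (a @ b) / denom if denom > 0 else -np.inf
        if score > best_score:
            best_overlap, best_score = ov, score
    return best_overlap

def stitch(upper, lower, overlap):
    """Join the two images, averaging the overlapping rows."""
    blend = (upper[-overlap:, :] + lower[:overlap, :]) / 2.0
    return np.vstack([upper[:-overlap, :], blend, lower[overlap:, :]])
```

In practice the search could be restricted using the separately acquired relative-position information mentioned above, and a 2-D search would also account for horizontal misalignment.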
  • In step S205, the diagnostic image processing unit 122 performs diagnostic image processing on the long image 305 synthesized by the image combining unit 121 in step S204; more specifically, it applies diagnostic image processing such as gradation processing, dynamic range processing, and spatial frequency processing to the long image 305.
  • In step S206, the vertical direction determination unit 123 determines the vertical direction of the long image 305.
  • In the following, a long image in which the head of the subject points upward is called an upward long image, and a long image in which the head points downward is called a downward long image.
  • The long image 305 shown in FIG. 3 is an example of an upward long image; in such a case, an image display suitable for diagnosis is obtained by displaying the long image in this orientation.
  • Depending on how the imaging is performed, however, both the upward long image 401 and the downward long image 402 can occur to a comparable extent.
  • Therefore, the vertical direction determination unit 123 separately estimates the vertical direction of each of the first radiation image 303 and the second radiation image 304, and finally determines the vertical direction of the long image 305 based on these estimation results. A more detailed configuration and the processing content of the vertical direction determination unit 123 will be described later.
  • In step S207, the display unit 113 displays the long image 305 that underwent diagnostic image processing in step S205, in accordance with the vertical direction of the long image 305 determined in step S206. The processing in the radiation imaging apparatus 100 then ends.
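The orientation-dependent display of step S207 can be sketched as a single re-orientation step. The string-valued direction argument is an illustrative assumption about how the determination result might be passed around.

```python
import numpy as np

def orient_for_display(long_image, head_direction):
    """Rotate the long image so the subject's head points upward on screen.

    head_direction: 'up' or 'down', the result of the vertical direction
    determination (the representation is illustrative).
    """
    if head_direction == 'down':
        # A 180-degree rotation (not a vertical flip) is used so that the
        # subject's left/right sides are not mirrored.
        return np.rot90(long_image, 2)
    return long_image
```

The choice of `np.rot90(..., 2)` over `np.flipud` matters clinically: flipping only the vertical axis would mirror the patient's left and right.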
  • FIG. 5 is a flowchart showing the vertical direction determination processing by the vertical direction determination unit 123 of the present embodiment.
  • FIG. 8 is a block diagram showing a more detailed configuration and the processing operation of the vertical direction determination unit 123.
  • The vertical direction determination unit 123 includes an estimation unit 801 and a determination unit 821.
  • The estimation unit 801 estimates the vertical direction of each of the plurality of radiation images.
  • The determination unit 821 determines the vertical direction of the long image based on the estimation results of the estimation unit 801 for the plurality of radiation images.
  • The direction of the subject in the image can be used to determine the vertical direction of the image.
  • That is, the estimation unit 801 estimates, for each of the plurality of radiation images, the direction of the subject in the image.
  • The determination unit 821 determines the direction of the subject in the long image based on the estimation results of the estimation unit 801 for the plurality of radiation images.
  • As the direction of the subject in the radiation images and in the long image, for example, the direction from the foot side toward the head side of the subject, or conversely from the head side toward the foot side, can be used.
  • In the present embodiment, the estimation unit 801 estimates, for each radiation image, a first possibility that the head-side direction of the subject in the image is a first direction along the longitudinal direction of the long image 305, and a second possibility that it is a second direction opposite to the first direction. More specifically, the similarity calculation units 802a and 802b individually estimate these possibilities for the first radiation image 303 and the second radiation image 304. The determination unit 821 then determines the direction of the subject in the long image 305 (hereinafter also referred to as the vertical direction of the image) based on the first and second possibilities estimated by the estimation unit 801 for each radiation image.
  • That is, based on the estimation results of the estimation unit 801, the determination unit 821 determines which of the first direction 841 and the second direction 842 along the longitudinal direction of the long image 305 is the direction toward the head of the subject.
  • FIG. 8 shows a state in which the first direction 841 is the head-side direction.
  • When the head-side direction of the subject is the first direction 841, the image is an upward image; when it is the second direction 842, the image is a downward image.
  • The method by which the estimation unit 801 estimates the vertical direction of a radiation image is not particularly limited; for example, the following estimation methods can be applied.
  • An anatomical region is detected from a radiation image, and the direction of the subject is estimated based on the likelihood, shape, or relative positional relationship of the detected anatomical regions. For example, when the lungs, heart, liver, and so on are recognized as anatomical regions of the subject in a radiation image, the vertical direction of the radiation image is estimated based on the certainty of each recognition, the shape of each recognized region, and/or the positional relationship among the regions.
  • Alternatively, the vertical direction of each radiation image (the possibility of being the first direction and the possibility of being the second direction) may be estimated by a method based on machine learning, which has been actively applied in recent years.
  • In supervised learning, a method of machine learning, a large amount of teacher data combining image feature amounts with the correct vertical direction is given to a computer in advance for learning, so that a mechanism is constructed which outputs a vertical direction determination result when the feature amounts of an image to be determined are input.
  • In this case, the estimation unit 801 has a determiner constructed by machine learning using a plurality of teacher images in which the direction of the subject is known. When a radiation image is given as input to this determiner, a first possibility indicating the possibility of the first direction and a second possibility indicating the possibility of the second direction are obtained as output.
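The patent does not specify the learning algorithm, so as a deliberately minimal stand-in for such a determiner (not the patent's method), one can imagine a nearest-class-mean scorer over feature vectors, with a softmax turning distances into the two possibilities. All names here are illustrative.

```python
import numpy as np

def train_determiner(up_features, down_features):
    """'Train' a minimal determiner by storing the mean feature vector of
    the upward and downward teacher images. A stand-in for whatever
    supervised learner would actually be used."""
    return np.mean(up_features, axis=0), np.mean(down_features, axis=0)

def determine(model, features):
    """Return (first possibility, second possibility) for one image:
    a softmax over the negative distances to the two class means."""
    mean_up, mean_down = model
    scores = np.array([-np.linalg.norm(features - mean_up),
                       -np.linalg.norm(features - mean_down)])
    e = np.exp(scores - scores.max())  # numerically stable softmax
    p = e / e.sum()
    return p[0], p[1]
```

A real system would replace this with a properly trained classifier (e.g. a neural network), but the interface, features in and a pair of direction possibilities out, matches the text's description.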
  • In the present embodiment, the similarity calculation units 802a and 802b of the estimation unit 801 compare the feature amounts of the radiation images 303 and 304 with those of a plurality of reference images 803 in which the head-side direction of the subject is known.
  • The reference images 803 may be stored in the image storage unit 111, for example.
  • By this comparison, the similarity calculation units 802a and 802b calculate the similarities U1 and U2 with the upward reference images and the similarities D1 and D2 with the downward reference images.
  • The estimation unit 801 uses the similarities U1 and U2 as indices of the first possibility that the head side of the subject in the radiation image is the first direction along the longitudinal direction of the long image 305, and the similarities D1 and D2 as indices of the second possibility, that of the second direction opposite to the first.
  • The similarity calculation units 802a and 802b divide the radiation image into a plurality of blocks, and calculate the similarity by comparing the feature amount of each block with the feature amount of the corresponding block of the reference image 803.
  • In step S501, the similarity calculation unit 802a extracts the feature amounts of the first radiation image 303. Specifically, the radiation image 303 is divided into a plurality of blocks of a predetermined size, and, for example, statistical values of the pixel values in each block (minimum, maximum, average, variance, etc.) and the edge strength and its direction in each block are extracted as feature amounts.
  • In step S502, the similarity calculation unit 802a calculates the similarity U1 between the feature amounts of a reference image 803, an upward teacher image given in advance as a correct answer, and those of the first radiation image 303. For example, the similarity calculation unit 802a calculates the similarity between the feature amount of each block of the first radiation image 303 and that of the corresponding block of the reference image 803, then applies a weight assigned in advance to each block and takes the weighted sum of the per-block similarities as U1.
  • The similarity U1 can be used as an index of the upward likelihood (first possibility) of the first radiation image 303.
  • In step S503, similarly to step S502, the similarity D1 between the feature amounts of a reference image 803 that is a downward teacher image given as a correct answer and those of the first radiation image 303 is calculated.
  • The similarity D1 can be used as an index of the downward likelihood (second possibility) of the first radiation image 303.
  • In steps S504 to S506, processing similar to that of steps S501 to S503 is performed on the second radiation image 304, and the similarity calculation unit 802b calculates the similarities U2 and D2.
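Steps S502 to S506 can be sketched as a weighted per-block similarity. The use of cosine similarity between block feature vectors is an assumption; the text only specifies a per-block comparison combined by pre-assigned weights.

```python
import numpy as np

def weighted_similarity(feats, ref_feats, weights):
    """Similarity between one image and one reference image: cosine
    similarity of corresponding block feature vectors, combined as a
    weighted sum over blocks (weights assigned in advance per block)."""
    sims = []
    for f, r in zip(feats, ref_feats):
        denom = np.linalg.norm(f) * np.linalg.norm(r)
        sims.append((f @ r) / denom if denom > 0 else 0.0)
    return float(np.dot(weights, sims))

def estimate_possibilities(feats, up_ref_feats, down_ref_feats, weights):
    """Return (U, D): similarity to the upward and downward references,
    used as indices of the first and second possibilities."""
    U = weighted_similarity(feats, up_ref_feats, weights)
    D = weighted_similarity(feats, down_ref_feats, weights)
    return U, D
```

Running this once per radiation image yields (U1, D1) and (U2, D2) as in the text; with multiple reference images per class, the scores could be averaged or maximised over the references.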
  • In the present embodiment, the similarities (U1, D1 and U2, D2) are calculated in parallel for the two radiation images, but the similarity calculations for the two radiation images may instead be performed serially; in that case, a single similarity calculation unit may be used in place of the two units 802a and 802b.
  • The processing order of the processing for the first radiation image (steps S501 to S503) and that for the second radiation image (steps S504 to S506) is not particularly limited; either may be performed first.
  • The determination unit 821 determines the direction of the subject in the long image based on the result of integrating the first possibilities obtained for the plurality of radiation images and the result of integrating the second possibilities obtained for them.
  • In the present embodiment, the determination unit 821 determines the vertical direction of the long image 305 based on the similarities U1, D1, U2, and D2 calculated by the estimation unit 801 in steps S502, S503, S505, and S506. More specifically, in the determination unit 821, the adder 822a calculates the sum of the similarities U1 and U2, the indices of the first possibility indicating the upward likelihood of the first radiation image 303 and the second radiation image 304.
  • Similarly, the adder 822b calculates the sum of the similarities D1 and D2, the indices of the second possibility indicating the downward likelihood of the first radiation image 303 and the second radiation image 304.
  • The comparison unit 823 determines the vertical direction of the long image 305 by comparing the result (U1+U2) obtained from the adder 822a with the result (D1+D2) obtained from the adder 822b. That is, the comparison unit 823 determines that the long image 305 is upward when U1+U2 > D1+D2, and downward when U1+U2 ≤ D1+D2.
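The decision rule above reduces to a few lines; this sketch generalises it to any number of per-image (U, D) pairs, anticipating the three-or-more-detector case. The function name and the string result are illustrative.

```python
def decide_direction(per_image_scores):
    """Combine per-image (U, D) similarity pairs: the long image is
    'up' when the summed upward similarities strictly exceed the summed
    downward ones, and 'down' otherwise (ties go to 'down', matching
    the U1+U2 <= D1+D2 rule in the text)."""
    total_up = sum(u for u, _ in per_image_scores)
    total_down = sum(d for _, d in per_image_scores)
    return 'up' if total_up > total_down else 'down'
```

Note that summing before comparing lets a confident estimate from one partial image outweigh a weak, contradictory estimate from the other.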
  • Needless to say, the present invention is also applicable to the case where long-length imaging is performed using three or more radiation images.
  • When the number of reference images 803 is large and the processing load on the similarity calculation unit 802 is heavy, the reference images to be compared with the radiation images may be narrowed down.
  • For example, the estimation unit 801 may use, for the comparison (similarity calculation), only reference images whose degree of matching with each of the radiation images 303 and 304 is equal to or greater than a predetermined value.
  • Alternatively, the estimation unit 801 may classify the reference images 803 by imaging region and select the reference images to be used for the comparison based on the imaging region detected from the radiation image to be compared.
  • A well-known method can be applied to detect the imaging region.
  • Alternatively, the imaging region may be determined from imaging instruction information provided to the radiation imaging apparatus 100 via the network from an information system such as an HIS or RIS.
  • The comparison may also be performed excluding regions of the radiation image that were not irradiated with radiation.
  • A region not irradiated with radiation is mainly produced by the radiation field stop (collimator), and a known method can be applied to detect it.
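One way to realize the masked comparison described above is to detect the irradiated field and compute the similarity only over those pixels. The sketch below assumes NumPy and uses a simple intensity threshold in place of a real collimation-detection method; both function names and the threshold are illustrative assumptions, not the patent's method:

```python
import numpy as np

def irradiated_mask(image, threshold=0.05):
    """Hypothetical irradiation-field detector: pixels below a small
    fraction of the image maximum are treated as outside the field
    (i.e., blocked by the collimator). Real systems use more robust
    edge-based collimation detection."""
    return image > threshold * image.max()

def masked_similarity(image, reference, mask):
    """Normalized cross-correlation computed only over irradiated pixels."""
    a = image[mask].astype(float)
    b = reference[mask].astype(float)
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```

Comparing an image with itself over the mask yields a correlation of 1.0, while the collimated border pixels are ignored entirely.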
  • As described above, the first embodiment can provide the radiation imaging apparatus 100 capable of stably displaying a long image in an orientation suitable for diagnosis.
  • Second Embodiment: In the first embodiment, the vertical direction of the long image was determined by combining the calculated possibility of being upward and the possibility of being downward for each of the plurality of radiation images (the first radiation image and the second radiation image).
  • In the second embodiment, the vertical-direction calculation result for the long image generated from the plurality of radiation images is added to the vertical-direction calculation results for each of the plurality of radiation images to determine the direction. That is, in the second embodiment, the estimation unit 801 estimates the direction of the subject in each radiation image and also the direction of the subject in the long image 305 generated by the image combining unit 121.
  • The determination unit 821 then determines the direction of the subject in the long image 305, that is, the vertical direction of the long image, based on the estimation results obtained by the estimation unit 801 for each of the plurality of radiation images (for example, the radiation images 303 and 304) and for the long image.
  • The configuration of the radiation imaging apparatus 100 according to the second embodiment and the operation of capturing a long image are the same as in the first embodiment (FIGS. 1 and 2).
  • Hereinafter, the vertical direction determination process in step S206 will be described.
  • In step S206, the vertical direction determination unit 123 sets the long image 305, in addition to the first radiation image 303 and the second radiation image 304, as a target of vertical-direction estimation, and determines the vertical direction of the long image 305 based on the estimation results.
  • the determination process by the vertical direction determination unit 123 according to the second embodiment, that is, the process of step S206 will be described using the flowchart shown in FIG.
  • In step S610, the determination unit 821 of the vertical direction determination unit 123 determines the vertical direction of the long image 305 based on the similarities U1, D1, U2, D2, U0, and D0 calculated in steps S502, S503, S505, S506, S608, and S609.
  • For example, the vertical direction determination unit 123 compares the sum of the similarities U1, U2, and U0, which serves as an index indicating the upward likelihood of the first radiation image 303, the second radiation image 304, and the long image 305, with the sum of the similarities D1, D2, and D0, which serves as an index indicating the downward likelihood.
  • The vertical direction determination unit 123 determines that the long image is upward when the sum of the similarities U1, U2, and U0 is larger, and otherwise determines that the long image is downward.
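The second embodiment's step S610 vote, in which the long image's own scores U0 and D0 are added before the comparison, might be sketched as follows (function and argument names are illustrative):

```python
def determine_direction_second_embodiment(per_image_scores, long_image_score):
    """Sum the upward and downward likelihood indices of each partial
    radiograph AND of the stitched long image, then compare the totals.

    per_image_scores: [(U1, D1), (U2, D2), ...]
    long_image_score: (U0, D0)
    """
    scores = list(per_image_scores) + [long_image_score]
    up_total = sum(u for u, _ in scores)    # U1 + U2 + ... + U0
    down_total = sum(d for _, d in scores)  # D1 + D2 + ... + D0
    return "up" if up_total > down_total else "down"
```

Note that the long image's vote can flip the outcome: per-image scores (0.8, 0.3) and (0.4, 0.5) alone favor upward, but adding a strongly downward long-image score (0.1, 0.9) yields a downward determination.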
  • The similarity calculation unit 802a of the vertical direction determination unit 123 calculates the similarities U1 and D1 of the first radiation image 901. Further, in steps S504 to S506, the similarity calculation unit 802b of the vertical direction determination unit 123 calculates the similarities U2 and D2 of the second radiation image 902. These processes are similar to steps S501 to S506 in the first embodiment (FIG. 5). In steps S707 to S709, the similarity calculation unit 802c performs the same processing as in steps S501 to S503 on the third radiation image 903 to calculate the similarities U3 and D3.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
PCT/JP2018/041320 2017-12-27 2018-11-07 放射線撮影装置、画像処理装置及び画像判定方法 Ceased WO2019130836A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201880084102.2A CN111526795B (zh) 2017-12-27 2018-11-07 放射线摄像装置、图像处理装置及图像确定方法
EP18895945.6A EP3714791B1 (en) 2017-12-27 2018-11-07 Radiography device, image processing device, and image determination method
US16/902,686 US11210809B2 (en) 2017-12-27 2020-06-16 Image processing apparatus, image determination method and non-transitory computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-252067 2017-12-27
JP2017252067A JP7022584B2 (ja) 2017-12-27 2017-12-27 放射線撮影装置、画像処理装置及び画像判定方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/902,686 Continuation US11210809B2 (en) 2017-12-27 2020-06-16 Image processing apparatus, image determination method and non-transitory computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2019130836A1 true WO2019130836A1 (ja) 2019-07-04

Family

ID=67067004

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/041320 Ceased WO2019130836A1 (ja) 2017-12-27 2018-11-07 放射線撮影装置、画像処理装置及び画像判定方法

Country Status (5)

Country Link
US (1) US11210809B2 (en)
EP (1) EP3714791B1 (en)
JP (1) JP7022584B2 (en)
CN (1) CN111526795B (en)
WO (1) WO2019130836A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220189141A1 (en) * 2019-09-06 2022-06-16 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102322927B1 (ko) * 2019-10-22 2021-11-04 연세대학교 산학협력단 인공신경망을 이용한 진단 영상을 구분하기 위한 장치, 이를 위한 방법 및 이 방법을 수행하기 위한 프로그램이 기록된 컴퓨터 판독 가능한 기록매체
CN115835820A (zh) * 2021-04-23 2023-03-21 深圳帧观德芯科技有限公司 使用具有多个辐射检测器的图像传感器的成像方法
WO2024090050A1 (ja) * 2022-10-27 2024-05-02 富士フイルム株式会社 画像処理装置、方法およびプログラム、並びに学習装置、方法およびプログラム

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000258861A (ja) * 1999-03-08 2000-09-22 Fuji Photo Film Co Ltd 蓄積性蛍光体シート及びこれを収容するカセッテ
JP2002085392A (ja) * 2000-09-14 2002-03-26 Konica Corp 放射線画像処理方法および放射線画像処理装置
JP2005202477A (ja) * 2004-01-13 2005-07-28 Fuji Photo Film Co Ltd 顔画像の天地方向判定方法、画像記録装置および画像再生装置
JP2012040140A (ja) 2010-08-18 2012-03-01 Fujifilm Corp 可搬型放射線撮影装置セット、可搬型放射線撮影装置
JP2012045172A (ja) 2010-08-26 2012-03-08 Fujifilm Corp 放射線画像撮影システム、放射線画像撮影方法、及びプログラム
WO2015045005A1 (ja) * 2013-09-24 2015-04-02 株式会社島津製作所 X線撮影装置およびx線撮影方法
JP2016076843A (ja) * 2014-10-07 2016-05-12 キヤノン株式会社 撮像装置、撮像装置の制御方法、プログラム、記録媒体

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000157519A (ja) * 1998-11-25 2000-06-13 Konica Corp 画像処理装置
JP5361103B2 (ja) * 2000-10-24 2013-12-04 株式会社東芝 画像処理装置
JP4110074B2 (ja) 2003-11-05 2008-07-02 キヤノン株式会社 放射線画像処理装置、放射線画像処理方法、プログラム及びコンピュータ可読媒体
DE102005036852A1 (de) * 2005-08-04 2007-02-22 Siemens Ag Verfahren bzw. "Vorrichtung" zum Ermitteln einer Lage eines Patienten bei einem auf einem medizinischen Bildgebungsverfahren basierenden Erstellen eines Bildes eines Untersuchungsbereichs des Patienten
JP4865362B2 (ja) * 2006-03-01 2012-02-01 キヤノン株式会社 画像処理装置及びその制御方法、プログラム
JP4740779B2 (ja) * 2006-03-28 2011-08-03 株式会社日立メディコ 放射線撮影装置
JP4682898B2 (ja) * 2006-03-30 2011-05-11 株式会社島津製作所 X線撮影装置
JP4858963B2 (ja) * 2006-09-14 2012-01-18 株式会社日立メディコ 医用画像処理装置
DE102007024452A1 (de) * 2007-05-25 2008-11-27 Siemens Ag Verfahren, Tomographiesystem und Bildbearbeitungssystem zur Darstellung tomographischer Aufnahmen eines Patienten
JP2009045092A (ja) 2007-08-13 2009-03-05 Canon Inc Ct撮影装置及びその制御方法
JP2010094209A (ja) * 2008-10-15 2010-04-30 Fujifilm Corp 放射線画像撮影装置
JP5460103B2 (ja) * 2009-03-31 2014-04-02 キヤノン株式会社 放射線撮影装置及びその暗電流補正方法
JP2011004856A (ja) * 2009-06-24 2011-01-13 Konica Minolta Medical & Graphic Inc 放射線画像撮影システム
JP2012016394A (ja) * 2010-07-06 2012-01-26 Shimadzu Corp 放射線断層撮影装置
JP5567963B2 (ja) * 2010-09-29 2014-08-06 富士フイルム株式会社 画像処理装置、放射線画像システム、画像処理方法およびプログラム
JP5601343B2 (ja) * 2012-05-02 2014-10-08 株式会社島津製作所 放射線撮像装置
JP6289142B2 (ja) * 2014-02-07 2018-03-07 キヤノン株式会社 画像処理装置、画像処理方法、プログラムおよび記憶媒体
DE102014210938A1 (de) * 2014-06-06 2015-12-17 Siemens Aktiengesellschaft Verfahren zum Steuern eines medizinischen Gerätes sowie Steuerungssystem für ein medizinisches Gerät
JP6072102B2 (ja) * 2015-01-30 2017-02-01 キヤノン株式会社 放射線撮影システム及び放射線撮影方法
JP6582510B2 (ja) * 2015-04-15 2019-10-02 コニカミノルタ株式会社 放射線画像撮影システム
JP2016202252A (ja) * 2015-04-15 2016-12-08 キヤノン株式会社 放射線撮影システム、放射線撮影システムの制御方法およびプログラム
JP6955909B2 (ja) * 2017-06-12 2021-10-27 キヤノンメディカルシステムズ株式会社 画像処理装置


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3714791A4


Also Published As

Publication number Publication date
CN111526795A (zh) 2020-08-11
US11210809B2 (en) 2021-12-28
EP3714791B1 (en) 2023-01-11
EP3714791A4 (en) 2021-11-10
US20200311975A1 (en) 2020-10-01
EP3714791A1 (en) 2020-09-30
JP2019115558A (ja) 2019-07-18
CN111526795B (zh) 2023-05-02
JP7022584B2 (ja) 2022-02-18

Similar Documents

Publication Publication Date Title
US11410312B2 (en) Dynamic analysis system
US10825190B2 (en) Dynamic image processing apparatus for aligning frame images obtained by photographing dynamic state of chest based on movement of lung-field region
US7940892B2 (en) Energy substraction method and apparatus
WO2019130836A1 (ja) 放射線撮影装置、画像処理装置及び画像判定方法
JP6870765B1 (ja) 動態品質管理装置、動態品質管理プログラム及び動態品質管理方法
JP4404291B2 (ja) 画像処理装置及び方法及びシステム
JP2019212138A (ja) 画像処理装置、画像処理方法及びプログラム
US10123757B2 (en) Image-processing device, radiation image capture system, image-processing method, and computer-readable storage medium
US20180114321A1 (en) Dynamic analysis system
JP2020000475A (ja) 動態画像処理装置及びプログラム
JP6156849B2 (ja) 放射線画像処理装置、方法およびプログラム
JP6155177B2 (ja) 画像診断支援装置に画像処理を実行させるためのコンピュータプログラム、装置及び方法
US10687772B2 (en) Dynamic analysis apparatus
JP2019054991A (ja) 解析装置及び解析システム
JP2018175320A (ja) 放射線撮影システム
US20200286235A1 (en) Dynamic image analysis system and dynamic image processing apparatus
JP7772158B2 (ja) 動態解析装置及びプログラム
JP2025090141A (ja) 動態画像表示装置、動態画像表示方法及びプログラム
JP2007282772A (ja) 医用画像処理装置及び医用画像処理方法
JP5839709B2 (ja) 骨塩量計測装置および方法
JP2022013681A (ja) 動態品質管理装置、動態品質管理プログラム及び動態品質管理方法
JP2020141841A (ja) 動態解析装置及びプログラム
JP2023026878A (ja) 画像処理装置、表示制御方法及びプログラム
JP2006263224A (ja) 放射線画像処理装置、放射線画像処理方法、プログラム及びコンピュータ可読媒体。
JP2021132994A (ja) 動態解析装置及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18895945

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018895945

Country of ref document: EP

Effective date: 20200623