WO2023139971A1 - Image processing device, image processing method, program, and machine learning method - Google Patents
- Publication number: WO2023139971A1 (application PCT/JP2022/045734)
- Authority: WIPO (PCT)
- Prior art keywords: image, calcification, region, interest, images
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
Definitions
- the present disclosure relates to an image processing device, an image processing method, a program, and a machine learning method.
- Japanese Patent Application Laid-Open No. 2020-127669 discloses specifying a region on a radiographic image of a subject based on the calcification positions in volume data of the subject, and generating an enhanced image that emphasizes the calcification positions in the radiographic image. It further discloses emphasizing the position, shape, and characteristics of calcifications on the radiographic image by using methods such as color.
- in tomosynthesis imaging, a series of multiple projection images is acquired by irradiating the breast with radiation from multiple angles.
- from the series of projection images, a plurality of tomographic images with little overlap of mammary glands can be obtained.
- Japanese Patent Application Laid-Open No. 2020-141867 discloses generating a two-dimensional image corresponding to a synthesized two-dimensional image by inputting a projected image with a radiation irradiation angle of about 0 degrees into a trained model instead of multiple tomographic images.
- the shape of the calcification image in tomographic images, synthetic 2D images, etc. is important information.
- the calcification image may be buried in noise, which lowers its visibility.
- a composite two-dimensional image generated based on a plurality of tomographic images does not accurately represent the shape of the calcification image.
- Japanese Patent Application Laid-Open No. 2020-127669 describes highlighting based on the shape of the calcification image, but this is not a technique for improving the visibility of that shape. Japanese Patent Application Laid-Open No. 2020-141867 describes generating a synthetic two-dimensional image, but does not describe improving the visibility of the shape of a calcification image appearing in the synthetic two-dimensional image.
- the calcification image in the synthesized two-dimensional image has a degraded shape compared with the calcification image in a projection image acquired by simple radiography. It is therefore desirable to restore the shape of the calcification image in the synthesized two-dimensional image to the shape it has in the projection image obtained by simple radiography.
- An object of the technology of the present disclosure is to provide an image processing device, an image processing method, a program, and a machine learning method that can improve the visibility of the shape of a calcified image in a synthesized two-dimensional image.
- the image processing apparatus of the present disclosure includes at least one processor, and the processor performs: calcification image detection processing for detecting a calcification image based on a plurality of tomographic images obtained from a series of multiple projection images obtained by breast tomosynthesis imaging; region of interest image generation processing for generating a region of interest image by cutting out a region including the calcification image, based on the detection result of the calcification image detection processing, from the projection image obtained at the irradiation position closest to the position directly facing the detection surface of the radiation detector among the plurality of projection images, or from the plurality of tomographic images; shape restoration processing for restoring the shape of the calcification image based on the region of interest image generated by the region of interest image generation processing; and composite two-dimensional image generation processing for generating a composite two-dimensional image based on the result of the shape restoration processing and the plurality of tomographic images.
- when detecting a plurality of calcification images in the calcification image detection process, the processor preferably generates a region of interest image individually for each of the plurality of calcification images in the region of interest image generation process.
- the processor preferably executes shape restoration processing by inputting the region-of-interest image into the machine-learned model.
- the machine-learned model is preferably a neural network trained by machine learning with the region of interest image as the input image and, as the correct image, an image generated by cutting out the area containing the calcification image from a projection image obtained by simple radiography in which radiation is emitted from the position directly facing the detection surface of the radiation detector.
- the machine-learned model is preferably a neural network that has been machine-learned using input images and correct images generated by simulation or photography using a phantom.
- the image processing method of the present disclosure includes a calcification image detection step of detecting a calcification image based on a plurality of tomographic images obtained from a series of a plurality of projection images obtained by breast tomosynthesis imaging, and a region of interest image generation step of generating a region of interest image by cutting out a region including the calcification image from the projection image obtained at the irradiation position closest to the position directly facing the detection surface of the radiation detector among the plurality of projection images or from the plurality of tomographic images based on the detection result of the calcification image detection step.
- a shape restoration step of restoring the shape of the calcification image based on the region of interest image generated by the region of interest image generation step and a composite two-dimensional image generation step of generating a composite two-dimensional image based on the shape restoration result of the shape restoration step and a plurality of tomographic images.
- the program of the present disclosure includes a calcification image detection process for detecting a calcification image based on a plurality of tomographic images obtained from a series of a plurality of projection images obtained by breast tomosynthesis imaging, a region of interest image generation process for generating a region of interest image by cutting out a region including the calcification image from the projection image obtained at the irradiation position closest to the position directly facing the detection surface of the radiation detector among the plurality of projection images or from the plurality of tomographic images, based on the detection results of the calcification image detection process.
- a computer is caused to execute a shape restoration process for restoring the shape of the calcification image based on the region of interest image generated by the region of interest image generation process, and a synthetic two-dimensional image generation process for generating a synthetic two-dimensional image based on the shape restoration result of the shape restoration process and the plurality of tomographic images.
- the machine learning method of the present disclosure detects a calcification image based on a plurality of tomographic images obtained from a series of multiple projection images obtained by tomosynthesis imaging of the breast, and uses as the input image a region of interest image generated by cutting out the region containing the calcification image from the projection image obtained at the irradiation position closest to the position directly facing the detection surface of the radiation detector among the plurality of projection images, or from the plurality of tomographic images.
- a neural network is caused to perform machine learning using, as the correct image, an image generated by cutting out a region including the calcification image from a projection image obtained by simple radiography in which radiation is emitted from the position directly facing the detection surface of the radiation detector.
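As an illustration only (not part of the disclosure), the paired-patch training scheme above can be sketched as follows. All names and sizes are assumptions: `x` stands for ROI patches cut from the near-0-degree tomosynthesis projection (the input image), `y` for the matching ROI patches cut from a plain radiograph (the correct image), and a single linear layer trained by gradient descent stands in for the neural network the disclosure describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in paired training patches (flattened 8x8 ROIs):
# y = "correct" patches cut from a plain radiograph,
# x = matching noisy patches cut from the near-0-degree projection.
n, d = 256, 64
y = rng.random((n, d))
x = y + 0.3 * rng.standard_normal((n, d))

# Toy "network": one linear layer trained to map input patch -> correct patch.
W = np.zeros((d, d))
lr = 0.05
for _ in range(2000):
    pred = x @ W
    W -= lr * (x.T @ (pred - y)) / n   # gradient step on mean squared error

mse = float(np.mean((x @ W - y) ** 2))  # falls well below mean(y**2)
```

A real implementation would replace the linear layer with the convolutional neural network mentioned later in the description, but the input/correct pairing is the same.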
- according to the technology of the present disclosure, it is possible to provide an image processing device, an image processing method, a program, and a machine learning method that improve the visibility of the shape of a calcification image in a synthesized two-dimensional image.
- FIG. 1 is a block diagram showing an example of the configuration of an image processing device;
- FIG. 3 is a block diagram showing an example of functions implemented by a control unit of the image processing apparatus;
- FIG. 4 is a diagram schematically showing the flow of processing by an image processing device;
- FIG. 4 is a diagram conceptually showing an example of region-of-interest image generation processing;
- FIG. 4 is a diagram conceptually showing an example of synthetic two-dimensional image generation processing;
- FIG. 4 is a flow chart showing the flow of a series of processes by the image processing device;
- FIG. 4 is a diagram conceptually showing an example of learning processing in a learning phase;
- FIG. 11 is a block diagram showing functions implemented by a control unit of an image processing apparatus according to a first modified example;
- FIG. 11 is a diagram conceptually showing learning processing according to a first modified example;
- FIG. 10 is a diagram illustrating an example of generating teacher data by simulation;
- FIG. 1 shows an example of the overall configuration of a radiation imaging system 2 according to this embodiment.
- the radiographic imaging system 2 includes a mammography device 10 , a console 12 , PACS (Picture Archiving and Communication Systems) 14 , and an image processing device 16 .
- the console 12, PACS 14, and image processing device 16 are connected via a network 17 by wired communication or wireless communication.
- FIG. 1 shows an example of the appearance of the mammography apparatus 10 when viewed from the left side of the subject.
- the mammography apparatus 10 is a radiographic apparatus that operates under the control of the console 12 and obtains a radiation image of the breast M by irradiating the subject's breast M with radiation R (for example, X-rays) from the radiation source 29.
- the mammography apparatus 10 has a function of performing simple radiation imaging in which imaging is performed with the radiation source 29 set to an irradiation position along the normal direction of the detection surface 20A of the radiation detector 20, and a function of performing tomosynthesis imaging in which imaging is performed by moving the radiation source 29 to each of a plurality of irradiation positions.
- the mammography apparatus 10 includes an imaging table 24, a base 26, an arm portion 28, and a compression unit 32.
- a radiation detector 20 is arranged inside the imaging table 24 .
- the user positions the breast M of the subject on the imaging surface 24A of the imaging table 24 when performing imaging.
- the radiation detector 20 detects the radiation R that has passed through the breast M, which is the subject. Specifically, the radiation detector 20 detects the radiation R that passes through the breast M and the imaging table 24 and reaches the detection surface 20A, generates a radiation image based on the detected radiation R, and outputs image data representing the generated radiation image.
- hereinafter, a series of operations in which radiation R is emitted from the radiation source 29 and a radiographic image is generated by the radiation detector 20 may be referred to as "imaging".
- the radiation detector 20 may be an indirect conversion type radiation detector that converts the radiation R into light and converts the converted light into electric charges, or a direct conversion type radiation detector that directly converts the radiation R into electric charges.
- a compression plate 30 that is used to compress the breast M during imaging is attached to the compression unit 32 .
- the compression plate 30 is moved toward or away from the imaging table 24 (hereinafter referred to as “vertical direction”) by a compression plate driving section (not shown) provided in the compression unit 32 .
- the compression plate 30 compresses the breast M between itself and the imaging table 24 by moving vertically.
- the arm portion 28 is rotatable with respect to the base 26 by the shaft portion 27 .
- the shaft portion 27 is fixed to the base 26, and the shaft portion 27 and the arm portion 28 rotate together.
- the shaft part 27 and the compression unit 32 of the imaging table 24 are each provided with gears. By switching the gears between meshed and unmeshed states, the imaging table 24 can be switched between a state in which it is connected to the shaft part 27 and rotates together with it, and a state in which it is separated from the shaft part 27 so that the shaft part 27 idles. Switching between transmission and non-transmission of the power of the shaft portion 27 is not limited to the gears described above, and various mechanical elements can be used.
- the arm portion 28 and the imaging table 24 are separately rotatable relative to the base 26 with the shaft portion 27 as a rotation axis.
- when performing tomosynthesis imaging with the mammography apparatus 10, the radiation source 29 is sequentially moved to each of a plurality of irradiation positions with different irradiation angles by rotating the arm portion 28.
- the radiation source 29 has a radiation tube (not shown) that generates radiation R, and the radiation tube is moved to each of a plurality of irradiation positions according to the movement of the radiation source 29 .
- FIG. 2 illustrates an example of tomosynthesis imaging. In FIG. 2, the illustration of the compression plate 30 is omitted.
- although the number of irradiation positions Pk is seven in FIG. 2, the number of irradiation positions is not limited to this and can be changed as appropriate.
- the radiation R is emitted from the radiation source 29 toward the breast M, and the radiation detector 20 detects the radiation R transmitted through the breast M to generate a radiation image.
- in the radiographic imaging system 2, when tomosynthesis imaging is performed by moving the radiation source 29 to each irradiation position Pk and generating a radiographic image at each irradiation position Pk, seven radiographic images are obtained in the example of FIG. 2.
- hereinafter, the radiographic image captured at each irradiation position Pk will be referred to as a "projection image" when distinguished from a tomographic image, and the multiple projection images captured in one tomosynthesis imaging will be referred to as a "series of multiple projection images". When no distinction from a tomographic image is needed, the image is simply referred to as a "radiation image".
- the irradiation angle of the radiation R refers to the angle ⁇ between the normal line CL of the detection surface 20A of the radiation detector 20 and the radiation axis RC.
- the radiation axis RC is an axis that connects the focal point of the radiation source 29 at each irradiation position Pk and a preset position.
- a detection surface 20A of the radiation detector 20 is substantially parallel to the imaging surface 24A.
- the radiation R emitted from the radiation source 29 is a cone beam with the focal point as the apex and the radiation axis RC as the central axis.
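As a small illustration of this geometry (function name and coordinates are illustrative assumptions, not part of the disclosure), the irradiation angle θ is the angle between the detector-surface normal CL and the radiation axis RC connecting the source focal point to the preset position:

```python
import numpy as np

def irradiation_angle_deg(focal_point, reference_point, detector_normal=(0.0, 0.0, 1.0)):
    """Angle (degrees) between the detector-surface normal and the radiation
    axis, i.e. the line from the source focal point to a preset position."""
    axis = np.asarray(reference_point, float) - np.asarray(focal_point, float)
    n = np.asarray(detector_normal, float)
    cos_t = abs(axis @ n) / (np.linalg.norm(axis) * np.linalg.norm(n))
    return float(np.degrees(np.arccos(np.clip(cos_t, 0.0, 1.0))))

# Source directly above the preset position: irradiation angle 0 degrees.
theta0 = irradiation_angle_deg((0, 0, 650), (0, 0, 0))
# Source shifted 100 mm laterally at 650 mm height: arctan(100/650) degrees.
theta1 = irradiation_angle_deg((100, 0, 650), (0, 0, 0))
```

At the irradiation position directly facing the detection surface, the axis coincides with the normal and the angle is 0 degrees, matching the description of position P4.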
- in simple radiation imaging, the position of the radiation source 29 is fixed at the irradiation position P4, where the irradiation angle θ is 0 degrees.
- radiation R is emitted from the radiation source 29 according to an instruction from the console 12, and the radiation detector 20 detects the radiation R transmitted through the breast M, thereby generating a radiation image.
- in simple radiation imaging, a higher dose of radiation R is emitted from the radiation source 29 than in tomosynthesis imaging.
- the mammography apparatus 10 and the console 12 are connected by wired communication or wireless communication.
- a radiographic image generated by the radiation detector 20 in the mammography apparatus 10 is output to the console 12 by wired or wireless communication via a communication I/F (Interface) (not shown).
- the console 12 includes a control unit 40, a storage unit 42, a user I/F 44, and a communication I/F 46.
- the control unit 40 has a function of controlling radiation imaging by the mammography apparatus 10 as described above.
- the control unit 40 is configured by a computer system including, for example, a CPU (Central Processing Unit), ROM (Read Only Memory), and RAM (Random Access Memory).
- the storage unit 42 stores information related to radiography, radiation images acquired from the mammography apparatus 10, and the like.
- the storage unit 42 is a non-volatile storage such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
- the user I/F 44 includes input devices such as various buttons and switches that are operated by a user such as an engineer regarding radiographic imaging, and display devices such as lamps and displays for displaying information related to imaging and radiographic images obtained by imaging.
- the communication I/F 46 communicates various data such as radiation imaging information and radiation images with the mammography apparatus 10 by wired or wireless communication. Further, the communication I/F 46 communicates various data such as radiation images with the PACS 14 and the image processing apparatus 16 via the network 17 by wired communication or wireless communication.
- the PACS 14 also includes a storage unit 50 (see FIG. 1) that stores a radiographic image group 52 .
- the radiation image group 52 includes projection images obtained from the console 12 via the network 17 .
- the image processing device 16 has a function of supporting diagnosis by a doctor or the like (hereinafter simply referred to as a "doctor") by making judgments related to the diagnosis when the doctor diagnoses a lesion of the breast M using a radiographic image.
- FIG. 3 shows an example of the configuration of the image processing device 16.
- the image processing device 16 includes a control section 60 , a storage section 62 , a display section 70 , an operation section 72 and a communication I/F 74 .
- the control unit 60, the storage unit 62, the display unit 70, the operation unit 72, and the communication I/F 74 are connected via a bus 79 such as a system bus and a control bus so that various information can be exchanged with each other.
- the control unit 60 controls the overall operation of the image processing device 16 .
- the control unit 60 is configured by a computer system including a CPU 60A, a ROM 60B, and a RAM 60C.
- the ROM 60B stores in advance various programs, data, etc. for performing control by the CPU 60A.
- the RAM 60C temporarily stores various data.
- the storage unit 62 is a non-volatile storage such as an HDD or SSD.
- the storage unit 62 stores a program 63 for causing the control unit 60 to execute various processes, a machine-learned model 64 for performing shape restoration processing described later, a learning program 65 for causing the machine-learned model 64 to perform machine learning, and the like.
- the display unit 70 is a display that displays radiation images, various information, and the like.
- the operation unit 72 is used by a doctor to input instructions for diagnosing breast lesions using radiation images, various types of information, and the like.
- the operating unit 72 is, for example, various switches, a touch panel, a stylus, a mouse, and the like.
- the communication I/F 74 communicates various information between the console 12 and the PACS 14 via the network 17 by wireless communication or wired communication.
- FIG. 4 shows an example of functions realized by the control unit 60 of the image processing device 16.
- various functions are realized by the CPU 60A of the control unit 60 executing processing based on the program 63 stored in the storage unit 62.
- the control unit 60 functions as a tomographic image generation unit 80 , a calcification image detection unit 81 , a region of interest image generation unit 82 , a shape restoration unit 83 , a synthetic two-dimensional image generation unit 84 and a display control unit 85 .
- the tomographic image generating unit 80 has a function of generating multiple tomographic images 90 (see FIG. 5) from a series of multiple projection images.
- the tomographic image generator 80 acquires a series of projection images from the console 12 of the mammography apparatus 10 or from the PACS 14, based on an instruction to diagnose a lesion.
- the tomographic image generation unit 80 generates a plurality of tomographic images 90 having different heights from the imaging plane 24A from a series of acquired projection images.
- the tomographic image generation unit 80 generates a plurality of tomographic images 90 by reconstructing a series of projection images using the back projection method.
- An FBP (Filter Back Projection) method, an iterative reconstruction method, or the like can be used as the back projection method.
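As an illustration only: the sketch below is a minimal parallel-beam filtered back projection in the FBP family mentioned above, not the cone-beam, limited-angle reconstruction a tomosynthesis device actually performs; the function name and array conventions are assumptions.

```python
import numpy as np

def fbp_reconstruct(sinogram, angles_deg):
    """Minimal parallel-beam filtered back projection.
    sinogram: (n_angles, n_det) array, one row per projection angle."""
    n_angles, n_det = sinogram.shape
    # Ram-Lak (ramp) filter applied per projection in the frequency domain.
    ramp = np.abs(np.fft.fftfreq(n_det))
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    # Back-project each filtered projection over the image grid.
    center = (n_det - 1) / 2.0
    ys, xs = np.mgrid[0:n_det, 0:n_det] - center
    recon = np.zeros((n_det, n_det))
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        t = np.clip(xs * np.cos(theta) + ys * np.sin(theta) + center, 0, n_det - 1)
        t0 = np.clip(np.floor(t).astype(int), 0, n_det - 2)
        w = t - t0
        recon += (1.0 - w) * proj[t0] + w * proj[t0 + 1]   # linear interpolation
    return recon * np.pi / n_angles
```

In iterative reconstruction, by contrast, the reconstruction is refined by repeatedly comparing re-projections of the current estimate with the measured projections.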
- the tomographic image generation unit 80 outputs the plurality of generated tomographic images 90 to the calcification image detection unit 81 and the combined two-dimensional image generation unit 84 .
- FIG. 5 schematically shows the flow of processing by the image processing device 16.
- processing by the calcification image detection unit 81, the region-of-interest image generation unit 82, the shape restoration unit 83, and the synthetic two-dimensional image generation unit 84 will be described with reference to FIG. 5.
- the calcification image detection unit 81 performs calcification image detection processing for detecting images of tissue in which calcification is presumed to have occurred in the breast M (hereinafter referred to as calcification images), based on the plurality of tomographic images 90 generated by the tomographic image generation unit 80.
- the calcification image detection unit 81 performs detection processing using a known CAD (Computer-Aided Diagnosis) algorithm on each of a plurality of tomographic images 90, and obtains a union of detection information (corresponding to a mask image 91 described later) obtained from each tomographic image 90.
- in the detection processing, a probability (likelihood) that each pixel in each tomographic image 90 belongs to a calcification image is derived, and pixels whose probability is equal to or higher than a predetermined threshold value are detected as calcification image pixels.
- the calcification image detection unit 81 is not limited to detection processing using a CAD algorithm (so-called rule-based detection processing), and may perform detection processing using a machine-learned model that has undergone machine learning.
- the calcification image detection unit 81 may detect calcification images by inputting a plurality of tomographic images 90 into a machine-learned model.
- the detection result of the calcification image by the calcification image detection unit 81 is output as, for example, a mask image 91 representing the position of the calcification image.
- the mask image 91 is a binary image in which pixels included in the calcification image are represented by "1" and other pixels are represented by "0".
- the calcification image detection unit 81 outputs one mask image 91 for the multiple tomographic images 90. By performing detection processing using a plurality of tomographic images 90, calcification images can be detected with high detection accuracy. In the example shown in FIG. 5, the calcification image detection unit 81 detects three calcification images C1 to C3.
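A sketch of this per-slice detect-and-union step; `prob_model` is a stand-in assumption for the CAD algorithm (or machine-learned detector) that returns a per-pixel calcification likelihood:

```python
import numpy as np

def detect_calcifications(tomo_stack, prob_model, threshold=0.5):
    """Union of per-slice detections: a pixel is set in the binary mask
    if its calcification likelihood reaches the threshold in any slice."""
    mask = np.zeros(tomo_stack.shape[1:], dtype=np.uint8)
    for tomo in tomo_stack:
        prob = prob_model(tomo)                       # per-pixel likelihood
        mask |= (prob >= threshold).astype(np.uint8)
    return mask   # 1 = calcification-image pixel, 0 = other pixels
```

One binary mask is produced for the whole stack, matching the single mask image 91 output for the multiple tomographic images 90.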
- the region-of-interest image generation unit 82 generates a region-of-interest image (hereinafter referred to as a ROI (Region of Interest) image) based on one projection image 92 out of a series of multiple projection images used for reconstruction processing by the tomographic image generation unit 80 and the detection result of the calcification image by the calcification image detection unit 81.
- the projection image 92 used by the region-of-interest image generation unit 82 to generate the ROI image is a projection image obtained at a position directly facing the detection surface 20A of the radiation detector 20 among a series of multiple projection images.
- the position directly facing the detection surface 20A is the position where the irradiation angle θ is 0 degrees. In this embodiment, the position directly facing the detection surface 20A is the irradiation position P4 shown in FIG. 2.
- the projection image 92 obtained at the irradiation position P4, which directly faces the detection surface 20A, most accurately represents the shape of the calcification image.
- FIG. 6 conceptually shows an example of region-of-interest image generation processing by the region-of-interest image generation unit 82 .
- the region-of-interest image generation unit 82 generates an ROI image by cutting out a region including a calcification image from the projection image 92 based on the mask image 91. When a plurality of calcification images are detected in the calcification image detection process, the region-of-interest image generation unit 82 individually generates an ROI image for each of the plurality of calcification images.
- in the example shown in FIG. 6, ROI images are individually generated for each of the three calcification images C1 to C3: an ROI image R1 including the calcification image C1, an ROI image R2 including the calcification image C2, and an ROI image R3 including the calcification image C3.
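The cut-out step can be sketched as below. The fixed patch size, the centroid placement, and the tiny flood-fill labeling are illustrative assumptions; the disclosure does not specify how individual calcification regions are delimited within the mask.

```python
import numpy as np

def label_regions(mask):
    """4-connected component labeling of a binary mask (tiny flood fill)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1
        stack = [seed]
        while stack:
            y, x = stack.pop()
            if (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]
                    and mask[y, x] and not labels[y, x]):
                labels[y, x] = current
                stack += [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    return labels, current

def cut_roi_images(projection, mask, patch=64):
    """Cut one fixed-size ROI patch per detected calcification region."""
    labels, n = label_regions(mask)
    rois = []
    half = patch // 2
    for region in range(1, n + 1):
        ys, xs = np.nonzero(labels == region)
        cy, cx = int(ys.mean()), int(xs.mean())            # region centroid
        # Clamp the window so the patch stays inside the projection image.
        y0 = min(max(cy - half, 0), projection.shape[0] - patch)
        x0 = min(max(cx - half, 0), projection.shape[1] - patch)
        rois.append(projection[y0:y0 + patch, x0:x0 + patch])
    return rois
```

Each connected region of the mask yields one ROI patch, i.e. one ROI image per detected calcification image.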
- the shape restoration unit 83 performs shape restoration processing for restoring the shape of the calcification image based on the ROI image generated by the region-of-interest image generation processing.
- the shape restoration unit 83 performs shape restoration processing using a machine-learned model 64 obtained by machine-learning the relationship between the shape of the calcification image included in the ROI image and the shape of the calcification image obtained by simple radiography.
- the shape restoration unit 83 inputs the ROI image to the machine-learned model 64 and acquires the restoration result 83A output from the machine-learned model 64 .
- the shape restoration section 83 outputs the restoration result 83A to the synthetic two-dimensional image generation section 84 .
- the machine-learned model 64 is, for example, a convolutional neural network (CNN) machine-learned by deep learning.
- the shape restoration processing performed by the shape restoration unit 83 is not limited to changing the shape of the calcification image; it also includes clarifying a calcification image that is unclear due to noise or the like, by removing that noise.
- the projection image 92 contains more information on the shape of the calcification image than the tomographic image 90, but since it is a radiographic image obtained by low radiation dose tomosynthesis imaging, it contains much noise.
- the shape restoration unit 83 can clarify an unclear calcification image affected by noise.
- the shape restoration process reduces false positives in which noise or the like is erroneously detected as a calcified image.
- the shape restoration unit 83 individually inputs the three ROI images R1 to R3 generated by the region-of-interest image generation process into the machine-learned model 64.
- the machine-learned model 64 outputs ROI images R1 to R3 in which the shape of the calcified image is restored as a restoration result 83A.
- in the example shown in FIG. 5, the calcification image C1 is the result of the calcification image detection unit 81 erroneously detecting noise as a calcification image. The calcification image C1 is therefore removed together with the noise by the shape restoration processing, and no shape is restored for it.
- the calcified images included in the ROI images R2 and R3 have their shapes restored by shape restoration processing.
- the composite two-dimensional image generation unit 84 performs composite two-dimensional image generation processing for generating a composite two-dimensional image 100 based on the shape restoration result 83A of the shape restoration processing and a plurality of tomographic images 90 generated by the tomographic image generation unit 80.
- FIG. 7 conceptually shows an example of synthetic two-dimensional image generation processing by the synthetic two-dimensional image generation unit 84 .
- the composite two-dimensional image generation unit 84 combines the plurality of tomographic images by an addition method, an averaging method, a maximum intensity projection method, a minimum intensity projection method, or the like, thereby generating an image 100A corresponding to a simple two-dimensional image that would be obtained by radiography at the irradiation position P4 directly facing the detection surface 20A.
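The slice-combination methods listed above can be sketched as follows; the function name, the (num_slices, H, W) stack layout, and the toy data are illustrative assumptions, not part of the patent:

```python
import numpy as np

def combine_slices(tomo_stack: np.ndarray, method: str = "average") -> np.ndarray:
    """Collapse a (num_slices, H, W) stack of tomographic slices into one (H, W) image."""
    if method == "addition":
        return tomo_stack.sum(axis=0)    # addition method
    if method == "average":
        return tomo_stack.mean(axis=0)   # averaging method
    if method == "mip":
        return tomo_stack.max(axis=0)    # maximum intensity projection
    if method == "minip":
        return tomo_stack.min(axis=0)    # minimum intensity projection
    raise ValueError(f"unknown method: {method}")

# Toy stack: three uniform 4x4 slices with values 1, 2, 3
stack = np.stack([np.full((4, 4), v, dtype=float) for v in (1.0, 2.0, 3.0)])
flat = combine_slices(stack, "mip")
```

Maximum intensity projection is a common default for making bright calcifications survive the collapse, while averaging suppresses noise; the choice trades the two off.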
- the synthetic two-dimensional image generation unit 84 generates the synthetic two-dimensional image 100 based on the image 100A and the restoration result 83A.
- the synthetic two-dimensional image generation unit 84 generates the synthetic two-dimensional image 100 by replacing the regions corresponding to the ROI images R1 to R3 in the image 100A with the ROI images R1 to R3 included in the restoration result 83A.
- the synthesized two-dimensional image generation unit 84 may generate the synthesized two-dimensional image 100 by performing weighted addition of the ROI images R1 to R3 included in the restoration result 83A to the corresponding regions of the image 100A.
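The replacement and weighted-addition alternatives described in the last two items can both be sketched as a single blending function; the function name, signature, and toy data are illustrative assumptions, not from the patent:

```python
import numpy as np

def paste_roi(base: np.ndarray, roi: np.ndarray, top: int, left: int,
              weight: float = 1.0) -> np.ndarray:
    """Blend a restored ROI patch into the composite image 100A.

    weight=1.0 replaces the corresponding region outright;
    weight<1.0 performs a weighted addition with the underlying pixels.
    """
    h, w = roi.shape
    region = base[top:top + h, left:left + w]
    base[top:top + h, left:left + w] = weight * roi + (1.0 - weight) * region
    return base

# Toy composite image and a 2x2 restored ROI patch
img = np.zeros((8, 8))
patch = np.ones((2, 2))
out = paste_roi(img, patch, 3, 3, weight=0.5)
```

A weight below 1.0 keeps some of the surrounding image context, which can make the pasted region blend in less abruptly than outright replacement.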
- the display control unit 85 performs display processing for displaying the synthetic two-dimensional image 100 generated by the synthetic two-dimensional image generation processing on the display unit 70 .
- the display control unit 85 may perform highlighting by changing the color or the like of the calcified image. Further, the display control unit 85 may cause the display unit 70 to display one or more tomographic images 90 together with the synthesized two-dimensional image 100 .
- In step S10, the tomographic image generator 80 acquires a series of projection images from the console 12 of the mammography apparatus 10 or from the PACS 14.
- In step S11, the tomographic image generator 80 generates a plurality of tomographic images 90 based on the series of projection images acquired in step S10.
- In step S12, the calcification image detection unit 81 detects calcification images from the plurality of tomographic images 90 generated in step S11, and generates a mask image 91 as the detection result.
- In step S13, the region-of-interest image generation unit 82 uses the mask image 91 generated in step S12 to generate an ROI image by cutting out a region including a calcification image from the projection image 92 obtained at the position directly facing the detection surface 20A of the radiation detector 20.
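The ROI cut-out of step S13 can be sketched as taking the bounding box of the mask image plus a small margin; the function name, margin handling, and toy data are our assumptions, not the patent's:

```python
import numpy as np

def cut_roi(projection: np.ndarray, mask: np.ndarray, margin: int = 2) -> np.ndarray:
    """Cut the region containing a detected calcification out of the
    projection image, using the bounding box of the mask image."""
    ys, xs = np.nonzero(mask)
    top = max(int(ys.min()) - margin, 0)
    bottom = min(int(ys.max()) + margin + 1, projection.shape[0])
    left = max(int(xs.min()) - margin, 0)
    right = min(int(xs.max()) + margin + 1, projection.shape[1])
    return projection[top:bottom, left:right]

# Toy projection image and a mask marking a 2x2 calcification
proj = np.arange(100, dtype=float).reshape(10, 10)
mask = np.zeros((10, 10), dtype=bool)
mask[4:6, 4:6] = True
roi = cut_roi(proj, mask, margin=1)
```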
- In step S14, the shape restoration unit 83 restores the shape of the calcification image based on the ROI image generated in step S13. Specifically, the shape restoration unit 83 inputs the ROI image to the machine-learned model 64 and acquires the restoration result 83A output from the machine-learned model 64.
- In step S15, the composite two-dimensional image generation unit 84 generates a composite two-dimensional image 100 based on the restoration result 83A obtained in step S14 and the plurality of tomographic images 90 generated in step S11.
- In step S16, the display control unit 85 causes the display unit 70 to display the composite two-dimensional image 100 generated in step S15.
- a ROI image is generated from the projection image 92 obtained at the irradiation position P4 facing the detection surface 20A of the radiation detector 20, and the synthesized two-dimensional image 100 is generated using the result of restoring the shape of the calcification image based on the ROI image.
- the visibility of the shape of the calcification image in the synthesized two-dimensional image 100 is improved.
- the shape of the calcification image can be accurately restored by restoring it based on the ROI image generated from the projection image 92.
- the ROI image may be generated using the projection image obtained at the irradiation position closest to the position directly facing the detection surface 20A among the plurality of irradiation positions at which a series of multiple projection images are obtained.
- the calcification image detection unit 81 detects calcification images from a plurality of tomographic images 90 .
- the calcification image detection unit 81 may detect only calcification images whose signal values are equal to or less than a certain value (so-called faint calcification images). This is because the shape of a faint calcification image is not accurately represented on the tomographic image 90 displayed on the display unit 70 as the clinical image, making it difficult to determine the type of its shape.
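The faint-calcification filtering described above can be sketched as keeping only those detected regions whose peak signal is at or below a threshold; the labeled-mask representation, function name, and toy data are illustrative assumptions:

```python
import numpy as np

def keep_faint_calcifications(mask_labels: np.ndarray, image: np.ndarray,
                              threshold: float) -> np.ndarray:
    """Keep only detected calcification regions whose peak signal value is
    at or below `threshold` (so-called faint calcifications)."""
    keep = np.zeros_like(mask_labels, dtype=bool)
    for label in np.unique(mask_labels):
        if label == 0:                       # 0 = background
            continue
        region = mask_labels == label
        if image[region].max() <= threshold:  # faint: peak signal below cutoff
            keep |= region
    return keep

# Toy labeled mask: region 1 is bright, region 2 is faint
labels = np.zeros((5, 5), dtype=int)
labels[0:2, 0:2] = 1
labels[3:5, 3:5] = 2
image = np.zeros((5, 5))
image[0:2, 0:2] = 10.0
image[3:5, 3:5] = 3.0
keep = keep_faint_calcifications(labels, image, threshold=5.0)
```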
- FIG. 9 conceptually shows an example of learning processing in the learning phase.
- the learning process shown in FIG. 9 is executed by the CPU 60A of the control unit 60 executing the learning program 65 stored in the storage unit 62.
- the control unit 60 includes a tomographic image generation unit 80, a calcification image detection unit 81, a region of interest image generation unit 82, and a correct image generation unit 86, which function as teacher data acquisition units.
- the tomographic image generation unit 80, the calcification image detection unit 81, and the region-of-interest image generation unit 82 are the same as those described above for the operation phase.
- the calcification image detection unit 81 detects calcification images based on the plurality of tomographic images 90 generated by the tomographic image generation unit 80 and outputs a mask image 91 .
- the region-of-interest image generator 82 generates a ROI image based on the mask image 91 and the projection image 92 obtained at the irradiation position P4 facing the detection plane 20A.
- the ROI image generated by the region-of-interest image generator 82 is used as the input image 102 for learning of the machine learning model 64A.
- the correct image generation unit 86 performs the same processing as the region-of-interest image generation unit 82, except that the projection image 110 obtained by plain radiography is used instead of the projection image 92 obtained by tomosynthesis imaging. That is, based on the mask image 91, the correct image generation unit 86 generates the ROI image by cutting out the region including the calcification image from the projection image 110.
- Plain radiography uses a higher radiation dose than tomosynthesis imaging, so a clear calcification image with less noise can be obtained.
- the ROI image generated by the correct image generation unit 86 is used as the correct image 104 for learning of the machine learning model 64A.
- the calcification image C1 is one in which the calcification image detection unit 81 erroneously detected noise as a calcification image. Therefore, the ROI image R1 generated by the correct image generation unit 86 from the projection image 110, which is less affected by noise, does not include the calcification image C1.
- the machine-learned model 64 is a neural network generated by machine-learning the machine-learning model 64A using teacher data including the input image 102 and the correct image 104 in the learning phase.
- the machine learning model 64A is machine-learned using, for example, the error backpropagation method.
- the calculation of the error between the output obtained by inputting the input image 102 into the machine learning model 64A and the correct image 104, and the updating of the weights and biases based on that error, are performed repeatedly.
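The repeated error-computation and weight/bias update can be illustrated with a deliberately tiny stand-in: a one-weight, one-bias linear model trained by gradient descent on the mean squared error against a "correct image". The CNN of the patent is replaced by this toy model purely for illustration; all names and data here are ours:

```python
import numpy as np

rng = np.random.default_rng(0)
correct = rng.random((8, 8))        # stands in for the correct image 104
noisy = 0.5 * correct + 0.1         # stands in for the input image 102

w, b = 1.0, 0.0                     # weight and bias to be learned
lr = 0.5                            # learning rate
for _ in range(500):
    pred = w * noisy + b            # model output for the input image
    err = pred - correct            # error against the correct image
    # Gradients of mean((pred - correct)^2) w.r.t. w and b
    grad_w = 2.0 * np.mean(err * noisy)
    grad_b = 2.0 * np.mean(err)
    w -= lr * grad_w                # update weight
    b -= lr * grad_b                # update bias

mse = float(np.mean((w * noisy + b - correct) ** 2))
```

Because the toy "input image" is an exact linear distortion of the "correct image" (correct = 2 * noisy - 0.2), the loop can drive the error essentially to zero; a real CNN does the same error-then-update cycle, only with many more parameters and backpropagation through its layers.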
- a machine learning model 64A subjected to machine learning in the learning phase is stored in the storage unit 62 as a machine-learned model 64 .
- the machine learning of the machine learning model 64A may be performed within the image processing device 16, or may be performed by an external device.
- the machine-learned model 64 subjected to machine learning in this way restores the shape of the calcification image included in the ROI image generated by the region-of-interest image generation unit 82 to the shape of the calcification image included in the ROI image generated by the correct image generation unit 86.
- FIG. 10 shows functions realized by the control unit 60 of the image processing device 16 according to the first modification.
- the first modification differs from the above embodiment in the calcification image detection processing by the calcification image detection unit 81 and the region of interest image generation processing by the region of interest image generation unit 82 .
- the calcification image detection unit 81 outputs multiple mask images 91 corresponding to each of the multiple tomographic images 90 .
- the region-of-interest image generator 82 generates ROI images based on multiple tomographic images 90 generated by the tomographic image generator 80 and multiple mask images 91 . Specifically, the region-of-interest image generation unit 82 generates an ROI image by cutting out regions including calcification images from a plurality of tomographic images 90 based on a plurality of mask images 91 . In this modified example, the ROI image generated by the region-of-interest image generator 82 is voxel data expressed in units of voxels in a three-dimensional space. The ROI image generated by the region-of-interest image generation unit 82 is extracted from a plurality of tomographic images 90, so it has less information about the shape of the calcification image than the projection image, but has the advantage of less noise.
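The voxel-data ROI extraction described above can be sketched as a 3D bounding-box cut over the slice stack and its per-slice mask images; the function name, margin handling, and toy data are illustrative assumptions, not from the patent:

```python
import numpy as np

def cut_voxel_roi(tomo_stack: np.ndarray, mask_stack: np.ndarray,
                  margin: int = 1) -> np.ndarray:
    """Cut a 3D (voxel) region of interest containing a calcification out of
    a (num_slices, H, W) stack of tomographic slices, using per-slice masks."""
    zs, ys, xs = np.nonzero(mask_stack)
    z0, z1 = int(zs.min()), int(zs.max()) + 1           # slice range as-is
    y0 = max(int(ys.min()) - margin, 0)
    y1 = min(int(ys.max()) + margin + 1, tomo_stack.shape[1])
    x0 = max(int(xs.min()) - margin, 0)
    x1 = min(int(xs.max()) + margin + 1, tomo_stack.shape[2])
    return tomo_stack[z0:z1, y0:y1, x0:x1]

# Toy stack of two 6x6 slices; the calcification lies in slice 1
stack = np.arange(72, dtype=float).reshape(2, 6, 6)
masks = np.zeros((2, 6, 6), dtype=bool)
masks[1, 2:4, 2:4] = True
roi3d = cut_voxel_roi(stack, masks, margin=1)
```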
- a ROI image as voxel data is input to the shape restoration unit 83 .
- the shape restoration unit 83 inputs the ROI image input from the region-of-interest image generation unit 82 to the machine-learned model 64 and acquires a restoration result 83A output from the machine-learned model 64 .
- the restoration result 83A includes a two-dimensional ROI image containing a calcification image whose shape has been restored. In the example shown in FIG. 10, the calcification image C1 has been removed together with noise and the like by the shape restoration process, and no shape was restored for it, as in the above embodiment.
- the composite two-dimensional image generator 84 generates a composite two-dimensional image 100 based on the restoration result 83A and the multiple tomographic images 90 generated by the tomographic image generator 80, as in the above embodiment.
- FIG. 11 conceptually shows the learning process according to the first modified example.
- the learning process according to this modification differs from the learning process of the above-described embodiment only in that, in order to generate the input image 102, the region-of-interest image generator 82 generates ROI images based on multiple tomographic images 90 and multiple mask images 91.
- a ROI image as voxel data is used as the input image 102 for learning of the machine learning model 64A.
- the correct image 104 is the same as in the above embodiment.
- the machine-learned model 64 restores the shape of the calcification image included in the ROI image as voxel data generated by the region-of-interest image generation unit 82 to the shape of the calcification image included in the ROI image generated by the correct image generation unit 86.
- the input image 102 and the correct image 104 are generated based on radiographic images obtained by performing tomosynthesis imaging and simple radiography with the breast M as the subject.
- the input image 102 and the correct image 104 may be generated based on radiation images obtained by performing tomosynthesis imaging and plain radiography with a phantom (for example, a breast phantom) as a subject.
- the input image 102 and the correct image 104 may be generated by computer simulation.
- an input image 102 and a correct image 104 are generated by a radiation imaging simulator 200 based on a calcification model.
- the machine-learned model 64 may be a neural network obtained by performing machine learning on the machine learning model 64A using the input image 102 and the correct image 104 generated by simulation or by imaging using a phantom.
- the following various processors can be used as the hardware structure of the processing unit that executes various processes such as the tomographic image generation unit 80, the calcification image detection unit 81, the region of interest image generation unit 82, the shape restoration unit 83, the synthetic two-dimensional image generation unit 84, the display control unit 85, and the correct image generation unit 86.
- the various processors described above include GPUs (Graphics Processing Units) in addition to CPUs.
- processors are not limited to general-purpose processors such as CPUs that run software (programs) and function as various processing units, and include programmable logic devices (PLDs), which are processors whose circuit configurations can be changed after manufacture, such as FPGAs (Field Programmable Gate Arrays), and dedicated electric circuits, which are processors having circuit configurations specially designed for executing specific processing such as ASICs (Application Specific Integrated Circuits).
- a single processing unit may be composed of one of these various processors, or may be composed of a combination of two or more processors of the same type or different types (for example, a combination of multiple FPGAs or a combination of a CPU and an FPGA). Also, a plurality of processing units may be configured by one processor.
- As an example of configuring a plurality of processing units with a single processor, first, as typified by computers such as clients and servers, there is a form in which one processor is configured by a combination of one or more CPUs and software, and this processor functions as a plurality of processing units. Second, as typified by a system on chip (SoC), there is a form of using a processor that realizes the functions of the entire system, including the plurality of processing units, with a single IC (Integrated Circuit) chip. In this way, the various processing units are configured using one or more of the above various processors as a hardware structure.
- more specifically, as the hardware structure of these various processors, an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined can be used.
- the program 63 may be provided in a form recorded non-transitorily on a recording medium such as a CD-ROM (Compact Disc Read Only Memory), a DVD-ROM (Digital Versatile Disc Read Only Memory), or a USB (Universal Serial Bus) memory.
- the program 63 may be downloaded from an external device via a network.
Abstract
Description
FIG. 10 shows the functions realized by the control unit 60 of the image processing device 16 according to the first modification. The first modification differs from the above embodiment in the calcification image detection processing by the calcification image detection unit 81 and the region-of-interest image generation processing by the region-of-interest image generation unit 82.
In the above embodiment and modifications, in the learning phase, the input image 102 and the correct image 104 are generated based on radiographic images obtained by performing tomosynthesis imaging and plain radiography with the breast M as the subject. Instead, the input image 102 and the correct image 104 may be generated based on radiographic images obtained by performing tomosynthesis imaging and plain radiography with a phantom (for example, a breast phantom) as the subject.
Claims (8)
- An image processing device comprising at least one processor, wherein the processor executes: a calcification image detection process of detecting a calcification image based on a plurality of tomographic images obtained from a series of projection images obtained by tomosynthesis imaging of a breast; a region-of-interest image generation process of generating a region-of-interest image, based on a detection result of the calcification image detection process, by cutting out a region including the calcification image from, among the plurality of projection images, the projection image obtained at the irradiation position closest to a position directly facing a detection surface of a radiation detector, or from the plurality of tomographic images; a shape restoration process of restoring a shape of the calcification image based on the region-of-interest image generated by the region-of-interest image generation process; and a composite two-dimensional image generation process of generating a composite two-dimensional image based on a shape restoration result of the shape restoration process and the plurality of tomographic images.
- The image processing device according to claim 1, wherein, in a case where a plurality of the calcification images are detected in the calcification image detection process, the processor generates the region-of-interest image individually for each of the plurality of calcification images in the region-of-interest image generation process.
- The image processing device according to claim 1 or claim 2, wherein the processor executes the shape restoration process by inputting the region-of-interest image into a machine-learned model.
- The image processing device according to claim 3, wherein the machine-learned model is a neural network on which machine learning has been performed using the region-of-interest image as an input image and, as a correct image, an image generated by cutting out a region including the calcification image from a projection image obtained by plain radiography in which radiation is emitted from a position directly facing the detection surface of the radiation detector.
- The image processing device according to claim 3, wherein the machine-learned model is a neural network on which machine learning has been performed using an input image and a correct image generated by simulation or by imaging using a phantom.
- An image processing method comprising: a calcification image detection step of detecting a calcification image based on a plurality of tomographic images obtained from a series of projection images obtained by tomosynthesis imaging of a breast; a region-of-interest image generation step of generating a region-of-interest image, based on a detection result of the calcification image detection step, by cutting out a region including the calcification image from, among the plurality of projection images, the projection image obtained at the irradiation position closest to a position directly facing a detection surface of a radiation detector, or from the plurality of tomographic images; a shape restoration step of restoring a shape of the calcification image based on the region-of-interest image generated in the region-of-interest image generation step; and a composite two-dimensional image generation step of generating a composite two-dimensional image based on a shape restoration result of the shape restoration step and the plurality of tomographic images.
- A program causing a computer to execute: a calcification image detection process of detecting a calcification image based on a plurality of tomographic images obtained from a series of projection images obtained by tomosynthesis imaging of a breast; a region-of-interest image generation process of generating a region-of-interest image, based on a detection result of the calcification image detection process, by cutting out a region including the calcification image from, among the plurality of projection images, the projection image obtained at the irradiation position closest to a position directly facing a detection surface of a radiation detector, or from the plurality of tomographic images; a shape restoration process of restoring a shape of the calcification image based on the region-of-interest image generated by the region-of-interest image generation process; and a composite two-dimensional image generation process of generating a composite two-dimensional image based on a shape restoration result of the shape restoration process and the plurality of tomographic images.
- A machine learning method comprising causing a neural network to perform machine learning using, as an input image, a region-of-interest image generated by detecting a calcification image based on a plurality of tomographic images obtained from a series of projection images obtained by tomosynthesis imaging of a breast and cutting out a region including the calcification image from, among the plurality of projection images, the projection image obtained at the irradiation position closest to a position directly facing a detection surface of a radiation detector, or from the plurality of tomographic images, and using, as a correct image, an image generated by cutting out a region including the calcification image from a projection image obtained by plain radiography in which radiation is emitted from a position directly facing the detection surface of the radiation detector.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023575120A JPWO2023139971A1 (ja) | 2022-01-19 | 2022-12-12 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022-006672 | 2022-01-19 | ||
JP2022006672 | 2022-01-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023139971A1 true WO2023139971A1 (ja) | 2023-07-27 |
Family
ID=87348166
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/045734 WO2023139971A1 (ja) | 2022-01-19 | 2022-12-12 | 画像処理装置、画像処理方法、プログラム、及び機械学習方法 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2023139971A1 (ja) |
WO (1) | WO2023139971A1 (ja) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2000350722A (ja) * | 1999-04-22 | 2000-12-19 | Ge Medical Syst Sa | Method for locating and three-dimensionally representing elements of interest of an organ |
- JP2003079606A (ja) * | 2001-09-13 | 2003-03-18 | Fuji Photo Film Co Ltd | Abnormal shadow detection device |
- JP2005080758A (ja) * | 2003-09-05 | 2005-03-31 | Konica Minolta Medical & Graphic Inc | Image processing device |
- JP2017064185A (ja) * | 2015-09-30 | 2017-04-06 | Fujifilm Corporation | Control device, radiographic imaging device, radiographic imaging method, and radiographic imaging program |
- JP2017510323A (ja) * | 2014-02-28 | 2017-04-13 | Hologic, Inc. | Systems and methods for generating and displaying tomosynthesis image slabs |
- JP2020127669A | 2019-02-12 | 2020-08-27 | Canon Medical Systems Corporation | Medical information processing device, X-ray diagnostic device, and program |
- JP2020141867A | 2019-03-06 | 2020-09-10 | Canon Medical Systems Corporation | Medical image processing device, learning method, X-ray diagnostic device, medical image processing method, and program |
- 2022-12-12: WO application PCT/JP2022/045734 (WO2023139971A1), status: active, Application Filing
- 2022-12-12: JP application JP2023575120A (JPWO2023139971A1), status: active, Pending
Also Published As
Publication number | Publication date |
---|---|
JPWO2023139971A1 (ja) | 2023-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- JP7237809B2 (ja) | System and method for deep-learning-based image reconstruction | |
- US7957574B2 (en) | 2011-06-07 | Methods and apparatus for generating a risk metric for soft plaque in vessels | |
- US10143433B2 (en) | 2018-12-04 | Computed tomography apparatus and method of reconstructing a computed tomography image by the computed tomography apparatus | |
- US8611492B2 (en) | 2013-12-17 | Imaging method for rotating a tissue region | |
- US12056875B2 (en) | 2024-08-06 | Image processing device, learning device, radiography system, image processing method, learning method, image processing program, and learning program | |
- EP3590431B1 (en) | Image display device, image display method, and image display program | |
- JP5669799B2 (ja) | Image processing device, radiographic imaging system, image processing program, and image processing method | |
- WO2023139971A1 (ja) | Image processing device, image processing method, program, and machine learning method | |
- JP2020199214A (ja) | Interpretation support device, interpretation support method, and interpretation support program | |
- WO2023139970A1 (ja) | Image processing device, image processing method, and program | |
- JP2023051400A (ja) | Learning device, image generation device, learning method, image generation method, learning program, and image generation program | |
- JP2022153114A (ja) | Image processing device, image processing method, and image processing program | |
- EP4215118B1 (en) | Image processing apparatus, image processing method, and program | |
- WO2024048168A1 (ja) | Information processing device, information processing method, and program | |
- JP2023105689A (ja) | Image processing device, image processing method, and program | |
- WO2024042889A1 (ja) | Information processing device, information processing method, and program | |
- WO2024042891A1 (ja) | Information processing device, information processing method, and program | |
- US20220318997A1 (en) | Image processing device, learning device, radiography system, image processing method, learning method, image processing program, and learning program | |
- JP7548797B2 (ja) | Medical image processing device, mammography device, program, and method | |
- WO2024042890A1 (ja) | Information processing device, information processing method, and program | |
- WO2024048169A1 (ja) | Information processing device, information processing method, and program | |
- US20230404514A1 (en) | Medical data processing method, model generating method, and medical data processing apparatus | |
- EP4382047A1 (en) | System and method for projection enhancement for synthetic 2D image generation | |
- WO2022070570A1 (ja) | Image processing device, image processing method, and image processing program | |
- JP7376447B2 (ja) | Control device, control method, and control program | |
Legal Events

Code | Title | Details
---|---|---
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 22922122; Country: EP; Kind code: A1
ENP | Entry into the national phase | Ref document number: 2023575120; Country: JP; Kind code: A
WWE | WIPO information: entry into national phase | Ref document number: 2022922122; Country: EP
ENP | Entry into the national phase | Ref document number: 2022922122; Country: EP; Effective date: 20240718
NENP | Non-entry into the national phase | Ref country code: DE