WO2022065050A1 - Ultrasonic diagnostic apparatus and control method of ultrasonic diagnostic apparatus - Google Patents
- Publication number
- WO2022065050A1 (PCT/JP2021/033135, JP2021033135W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- ultrasonic
- interest
- region
- radiation
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/502—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for diagnosis of breast, i.e. mammography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/44—Constructional features of apparatus for radiation diagnosis
- A61B6/4417—Constructional features of apparatus for radiation diagnosis related to combined acquisition of different diagnostic modalities
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/52—Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/5211—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
- A61B6/5229—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
- A61B6/5247—Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from an ionising-radiation diagnostic technique and a non-ionising radiation diagnostic technique, e.g. X-ray and ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0825—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the breast, e.g. mammography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
- A61B8/4254—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors mounted on the probe
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
- A61B8/4416—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/46—Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
- A61B8/461—Displaying means of special interest
- A61B8/463—Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5223—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for extracting a diagnostic or physiological parameter from medical diagnostic data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5261—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/14—Transformations for image registration, e.g. adjusting or mapping for alignment of images
- G06T3/147—Transformations for image registration, e.g. adjusting or mapping for alignment of images using affine transformations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/60—Rotation of whole images or parts thereof
Definitions
- The present invention relates to an ultrasonic diagnostic apparatus for examining the breast of a subject, and to a control method of the ultrasonic diagnostic apparatus.
- an ultrasonic diagnostic device has been used to inspect lesions and the like in a subject.
- Prior to an examination using such an ultrasonic diagnostic apparatus, the inside of the subject is often examined in advance using a diagnostic imaging apparatus different from the ultrasonic diagnostic apparatus, such as a CT (Computed Tomography) apparatus.
- In such cases, a user such as a doctor often observes both an ultrasonic image taken by the ultrasonic diagnostic apparatus and a medical image taken by the other diagnostic imaging apparatus in order to diagnose a lesion or the like of the subject.
- To improve the accuracy of diagnosis using two such mutually different medical images, ultrasonic diagnostic apparatuses such as the one disclosed in Patent Document 1 have been developed.
- Patent Document 1 discloses selecting, based on three-dimensional data of a subject obtained by CT imaging, a two-dimensional CT cross-sectional image representing the cross section corresponding to an ultrasonic image, and displaying the selected CT cross-sectional image together with the ultrasonic image.
- In both of these examinations, the subject is examined while lying on an examination table or the like.
- As a result, the breast of the subject is imaged from the same orientation in both examinations, and the shape of the breast at the time of imaging can be the same in both. Therefore, when an examination using a CT apparatus is performed prior to an examination using the ultrasonic diagnostic apparatus, an ultrasonic image and a CT cross-sectional image that are easy for the user to compare can be readily obtained.
- The present invention has been made to solve such conventional problems, and an object of the present invention is to provide an ultrasonic diagnostic apparatus, and a control method thereof, that allow the user to easily compare an ultrasonic image with a radiographic image and thereby improve the diagnostic accuracy for the subject.
- To achieve this object, the ultrasonic diagnostic apparatus according to the present invention includes an ultrasonic probe, and an image generation unit that generates an ultrasonic image including a region of interest of the subject's breast, imaged in a radiographic image, by transmitting and receiving an ultrasonic beam to and from the subject using the ultrasonic probe.
- It further includes an image adjustment unit that adjusts the radiographic image and the ultrasonic image, based on the radiographic image orientation information stored in the tag of the radiographic image and the probe orientation information of the ultrasonic probe at the time the ultrasonic image was taken, so that the region of interest captured in the radiographic image and the region of interest captured in the ultrasonic image are oriented in the same direction, and a monitor that displays the radiographic image and the ultrasonic image adjusted by the image adjustment unit.
- The probe orientation information is preferably position information of the ultrasonic probe specified by the user, or position information detected by a position sensor mounted on the ultrasonic probe.
- The image adjustment unit can generate the adjusted radiographic image and ultrasonic image by performing at least one of rotation processing and inversion processing on at least one of the entire radiographic image and the entire ultrasonic image.
- The ultrasonic diagnostic apparatus may further include a region-of-interest extraction unit that extracts the region of interest from the radiographic image and from the ultrasonic image, respectively. In this case, the image adjustment unit can generate the adjusted radiographic image and ultrasonic image by performing at least one of rotation processing and inversion processing on at least one of the region of interest extracted from the radiographic image and the region of interest extracted from the ultrasonic image.
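As an illustration of the rotation and inversion processing described above, the following is a minimal Python sketch (images are modeled as simple 2-D arrays of pixel values; all function names are illustrative and not taken from the patent):

```python
def rotate90_cw(img):
    """Rotate a 2-D image (list of rows) 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def flip_lr(img):
    """Invert the image left-to-right (mirror about the vertical axis)."""
    return [row[::-1] for row in img]

def align(img, rotations_cw=0, mirror=False):
    """Apply some combination of rotation and inversion so the region of
    interest ends up facing the same way as in the companion image."""
    for _ in range(rotations_cw % 4):
        img = rotate90_cw(img)
    return flip_lr(img) if mirror else img

# A 2x3 toy "radiographic image":
img = [[1, 2, 3],
       [4, 5, 6]]
print(align(img, rotations_cw=1, mirror=True))  # → [[1, 4], [2, 5], [3, 6]]
```

The same two primitives cover whole-image adjustment and, applied to a cropped sub-array, adjustment of an extracted region of interest.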
- The image adjustment unit can display a subject orientation mark indicating the orientation of the subject on both the adjusted radiographic image and the adjusted ultrasonic image. Further, the image adjustment unit can generate the adjusted radiographic image and ultrasonic image so that the region of interest captured in the radiographic image and the region of interest captured in the ultrasonic image have the same size. In this case, it is preferable that the image adjustment unit determines the ratio of the sizes of the adjusted radiographic image and ultrasonic image based on the pixel-to-pixel distance of the radiographic image and the pixel-to-pixel distance of the ultrasonic image.
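The size ratio based on pixel-to-pixel distances can be sketched as follows (a minimal illustration; the function name and the mm-per-pixel interpretation of "pixel-to-pixel distance" are assumptions, not taken from the patent):

```python
def display_scale_ratio(radiation_spacing_mm, ultrasound_spacing_mm):
    """Factor by which to scale the radiographic image relative to the
    ultrasonic image so that one millimetre of tissue spans the same
    number of display pixels in both images (spacing = mm per pixel)."""
    return radiation_spacing_mm / ultrasound_spacing_mm

# A 0.1 mm/pixel mammogram shown next to a 0.2 mm/pixel ultrasound image
# must be displayed at half size for the two regions of interest to
# appear at the same physical scale:
print(display_scale_ratio(0.1, 0.2))  # → 0.5
```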
- When the radiographic image is acquired with the radiation source arranged in a direction inclined with respect to the vertical direction, it is preferable that the image adjustment unit adjusts the radiographic image and the ultrasonic image by performing rotation conversion of the radiographic image based on the rotation angle of the radiation source.
- When the tag of the radiographic image includes radiographic-image breast information indicating whether the breast captured in the radiographic image is the left or the right breast, the image adjustment unit can adjust the radiographic image and the ultrasonic image on the condition that the breast captured in the radiographic image and the breast captured in the ultrasonic image match each other, based on the radiographic-image breast information and input information from the user indicating whether the breast captured in the ultrasonic image is the left or the right breast.
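A minimal sketch of this left/right matching check (hypothetical names; it assumes the breast information is encoded as "L"/"R" strings, as in the DICOM Image Laterality attribute):

```python
def lateralities_match(radiograph_laterality, user_laterality):
    """True when the breast recorded in the radiographic image's tag
    (e.g. 'L' or 'R') matches the breast the user reports scanning
    with the ultrasonic probe; adjustment proceeds only if they agree."""
    return radiograph_laterality.strip().upper() == user_laterality.strip().upper()

print(lateralities_match("L", "l"))  # → True
print(lateralities_match("R", "L"))  # → False
```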
- the image adjustment unit can further adjust the already adjusted radiographic image and ultrasonic image based on the readjustment information input by the user.
- The control method of the ultrasonic diagnostic apparatus according to the present invention comprises: generating an ultrasonic image including a region of interest of the subject's breast, imaged in a radiographic image, by transmitting and receiving an ultrasonic beam to and from the subject using an ultrasonic probe; adjusting the radiographic image and the ultrasonic image, based on the radiographic image orientation information stored in the tag of the radiographic image and the probe orientation information of the ultrasonic probe at the time the ultrasonic image was taken, so that the region of interest captured in the radiographic image and the region of interest captured in the ultrasonic image are oriented in the same direction; and displaying the adjusted radiographic image and ultrasonic image on a monitor.
- According to the present invention, the ultrasonic diagnostic apparatus includes an image adjustment unit that adjusts the radiographic image and the ultrasonic image, based on the radiographic image orientation information stored in the tag of the radiographic image and the probe orientation information of the ultrasonic probe at the time the ultrasonic image was taken, so that the region of interest captured in the radiographic image and the region of interest captured in the ultrasonic image are oriented in the same direction, and a monitor that displays the radiographic image and the ultrasonic image adjusted by the image adjustment unit. Therefore, the user can easily compare the ultrasonic image with the radiographic image, and the diagnostic accuracy for the subject can be improved.
- FIG. 1 is a block diagram showing the configuration of the ultrasonic diagnostic apparatus according to Embodiment 1 of the present invention. FIG. 2 is a schematic diagram of an example of the radiographic image stored in the server in Embodiment 1. FIG. 3 is a schematic diagram of the ultrasonic probe in Embodiment 1. FIG. 4 is a block diagram showing the configuration of the transmission/reception circuit in Embodiment 1. FIG. 5 is a block diagram showing the configuration of the image generation unit in Embodiment 1. FIG. 6 is a schematic diagram of an example of the ultrasonic image in Embodiment 1. FIG. 7 is a schematic diagram of an example of the radiographic image adjusted in Embodiment 1.
- Also shown are a schematic diagram of another example of a radiographic image and an ultrasonic image displayed on the monitor in Embodiment 1 of the present invention, a schematic diagram of another example of the radiographic image adjusted in Embodiment 1, and a schematic diagram of an example of the sub-window on the radiographic image and the sub-window on the ultrasonic image displayed on the monitor in Embodiment 1.
- Further shown are a schematic diagram of an example of the region of interest displayed in the sub-window in Embodiment 1 of the present invention, a block diagram showing the configuration of the ultrasonic diagnostic apparatus according to Embodiment 2 of the present invention, and a block diagram showing the configuration of the ultrasonic diagnostic apparatus according to Embodiment 3 of the present invention.
- FIG. 1 shows the configuration of the ultrasonic diagnostic apparatus 1 according to the first embodiment of the present invention.
- the ultrasonic diagnostic apparatus 1 includes an ultrasonic probe 2 and a diagnostic apparatus main body 3.
- the ultrasonic probe 2 and the diagnostic device main body 3 are connected to each other. Further, the diagnostic device main body 3 is connected to the external server 4 via the network NW.
- the ultrasonic probe 2 includes an oscillator array 11, and a transmission / reception circuit 12 is sequentially connected to the oscillator array 11.
- the diagnostic device main body 3 includes an image generation unit 22, and the image generation unit 22 is connected to the transmission / reception circuit 12 of the ultrasonic probe 2. Further, the display control unit 23 and the monitor 24 are sequentially connected to the image generation unit 22. Further, the memory 25 is connected to the image generation unit 22. Further, the diagnostic device main body 3 includes a communication unit 21, and the communication unit 21 is connected to the server 4 via the network NW. Further, the memory 25 is connected to the communication unit 21. Further, the image adjustment unit 27 is connected to the memory 25. Further, the image adjustment unit 27 is connected to the display control unit 23.
- the main body control unit 29 is connected to the transmission / reception circuit 12, the communication unit 21, the image generation unit 22, the display control unit 23, the memory 25, and the image adjustment unit 27 of the ultrasonic probe 2.
- the input device 30 is connected to the main body control unit 29.
- the processor 31 is composed of a communication unit 21, an image generation unit 22, a display control unit 23, an image adjustment unit 27, and a main body control unit 29.
- the server 4 is installed in, for example, a hospital or the like, and is installed in a remote place with respect to the place where the diagnostic device main body 3 is arranged.
- the server 4 manages image data, and can be used, for example, in a so-called PACS (Picture Archiving and Communication System).
- the server 4 stores in advance a radiation image T1 as shown in FIG. 2, which is taken by a radiation diagnostic device (not shown).
- the radiographic image T1 stored on the server 4 includes a region of interest A1 suspected to be a lesion.
- the radiation image T1 has a tag for storing information about the radiation image T1.
- In the tag of the radiation image T1, for example in DICOM (Digital Imaging and Communications in Medicine) format, radiation image orientation information is stored, which is information regarding the orientation of the subject in the radiation image T1, such as the A direction, the P direction, the R direction, the L direction, the H (Head) direction, or the F (Foot) direction.
- In FIG. 2, a radiographic image T1 in which the breast of the subject is taken from the so-called CC (CranioCaudal) direction is shown. On the radiographic image T1, four subject orientation marks indicating the orientation of the subject are arranged: the R direction mark D1 indicating the R direction, the L direction mark D2 indicating the L direction, the A direction mark D3 indicating the A direction, and the P direction mark D4 indicating the P direction.
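How such edge marks might be derived from stored orientation information can be sketched as follows (illustrative only; the encoding of the orientation information as one anatomical direction letter for the across-the-image pixel direction and one for the down-the-image direction is an assumption):

```python
# Each anatomical direction letter and its opposite.
OPPOSITE = {"R": "L", "L": "R", "A": "P", "P": "A", "H": "F", "F": "H"}

def edge_marks(across, down):
    """Given the anatomical direction toward which pixels run
    left-to-right (across) and top-to-bottom (down), return the
    direction mark to draw on each edge of the image."""
    return {
        "left": OPPOSITE[across],
        "right": across,
        "top": OPPOSITE[down],
        "bottom": down,
    }

# A CC-view image whose pixels run toward L across and toward P down
# yields the four marks of the example above (R, L, A, P):
print(edge_marks("L", "P"))  # → {'left': 'R', 'right': 'L', 'top': 'A', 'bottom': 'P'}
```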
- The ultrasonic probe 2 has a housing J, made of resin or the like, that contains various electric circuits and the like.
- The housing J has a grip portion J1 for the user who examines the subject to grip the ultrasonic probe 2, and a tip portion J2 in which the oscillator array 11 is located. On one side of the side portion of the housing J, in the vicinity of the tip portion J2, one protrusion-shaped marker M is formed.
- the orientation in which the marker M is formed allows the user to grasp the orientation of the ultrasonic probe 2.
- the orientation of the subject in the ultrasonic image such as the A direction, the P direction, the R direction, the L direction, the H direction, or the F direction is set with reference to the orientation of the marker M.
- the oscillator array 11 of the ultrasonic probe 2 shown in FIG. 1 has a plurality of ultrasonic oscillators arranged one-dimensionally or two-dimensionally. Each of these ultrasonic transducers transmits ultrasonic waves according to a drive signal supplied from the transmission / reception circuit 12, receives ultrasonic echoes from a subject, and outputs a signal based on the ultrasonic echoes.
- Each ultrasonic transducer is composed of, for example, a piezoelectric ceramic represented by PZT (Lead Zirconate Titanate), a polymer piezoelectric element represented by PVDF (PolyVinylidene DiFluoride), or a piezoelectric single crystal represented by PMN-PT (Lead Magnesium Niobate-Lead Titanate: lead magnesium niobate-lead titanate solid solution).
- the transmission / reception circuit 12 transmits ultrasonic waves from the oscillator array 11 under the control of the probe control unit 15 and generates a sound line signal based on the received signal acquired by the oscillator array 11.
- The transmission/reception circuit 12 has a pulser 16 connected to the oscillator array 11, and an amplification unit 17, an AD (Analog-to-Digital) conversion unit 18, and a beamformer 19 connected sequentially in series from the oscillator array 11.
- The pulser 16 includes, for example, a plurality of pulse generators, and supplies respective drive signals to the plurality of ultrasonic transducers of the oscillator array 11 while adjusting the delay amounts, based on a transmission delay pattern selected according to a control signal from the probe control unit 15, so that the ultrasonic waves transmitted from the transducers form an ultrasonic beam.
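The delay adjustment that makes the transmitted waves form a focused beam can be sketched with a simplified geometric model (illustrative names; the element layout, focal point and speed of sound are assumed values, not from the patent):

```python
import math

SPEED_OF_SOUND = 1540.0  # m/s, a typical soft-tissue value

def transmit_delays(element_positions_m, focus_m):
    """Per-element firing delays (seconds) chosen so that the wavefronts
    from all elements arrive at the focal point at the same instant:
    elements closer to the focus fire later."""
    distances = [math.dist(p, focus_m) for p in element_positions_m]
    farthest = max(distances)
    return [(farthest - d) / SPEED_OF_SOUND for d in distances]

# 5-element linear array with 0.3 mm pitch, focused 20 mm in front of
# the centre element; the delay profile is symmetric and peaks there.
elements = [(i * 0.3e-3, 0.0) for i in range(5)]
delays = transmit_delays(elements, focus_m=(0.6e-3, 20e-3))
```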
- When a pulsed or continuous-wave voltage is applied to the electrodes of the ultrasonic transducers of the oscillator array 11, each piezoelectric body expands and contracts, so that pulsed or continuous ultrasonic waves are generated from the respective ultrasonic transducers, and an ultrasonic beam is formed from the combined wave of those ultrasonic waves.
- the transmitted ultrasonic beam is reflected by, for example, a target such as a site of a subject, and propagates toward the oscillator array 11 of the ultrasonic probe 2.
- the ultrasonic echo propagating toward the oscillator array 11 in this way is received by each ultrasonic oscillator constituting the oscillator array 11.
- Each ultrasonic transducer constituting the oscillator array 11 expands and contracts upon receiving the propagating ultrasonic echo, generates a received signal, which is an electric signal, and outputs these received signals to the amplification unit 17.
- The amplification unit 17 amplifies the received signal input from each ultrasonic transducer constituting the oscillator array 11, and transmits the amplified received signal to the AD conversion unit 18.
- the AD conversion unit 18 converts the reception signal transmitted from the amplification unit 17 into digital reception data.
- The beamformer 19 performs so-called reception focus processing by applying a respective delay to each piece of received data received from the AD conversion unit 18 and adding them together. By this reception focus processing, the received data converted by the AD conversion unit 18 are phase-aligned and added, and a sound line signal in which the focus of the ultrasonic echo is narrowed is acquired.
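The reception focus processing (delay-and-sum) can be sketched as follows (an integer-sample simplification with illustrative names; real beamformers interpolate fractional delays):

```python
def delay_and_sum(channel_data, delays_samples):
    """Shift each channel's samples by its per-channel delay (in whole
    samples) and sum across channels, so echoes from the focal point
    add in phase into a single sound-line sample stream."""
    n = len(channel_data[0])
    out = [0.0] * n
    for samples, d in zip(channel_data, delays_samples):
        for i in range(n):
            j = i - d
            if 0 <= j < n:
                out[i] += samples[j]
    return out

# Two channels carrying the same echo, one arriving one sample later;
# advancing the late channel by one sample makes the echoes add coherently:
ch0 = [0.0, 1.0, 0.0, 0.0]
ch1 = [0.0, 0.0, 1.0, 0.0]
print(delay_and_sum([ch0, ch1], delays_samples=[0, -1]))  # → [0.0, 2.0, 0.0, 0.0]
```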
- The communication unit 21 is composed of circuitry including an antenna for transmitting and receiving radio waves, a circuit for connecting to a LAN (Local Area Network), and the like, and performs communication with the server 4 via the network NW under the control of the main body control unit 29. The communication unit 21 can receive the radiation image T1 and the like from the server 4 via the network NW.
- the image generation unit 22 has a configuration in which a signal processing unit 32, a DSC (Digital Scan Converter) 33, and an image processing unit 34 are sequentially connected in series.
- The signal processing unit 32 corrects the sound line signal transmitted from the transmission/reception circuit 12 of the ultrasonic probe 2 for the attenuation caused by the propagation distance, according to the depth of the ultrasonic reflection position, using the sound velocity value set by the main body control unit 29, and then performs envelope detection processing to generate a B-mode image signal, which is tomographic image information about the tissue in the subject.
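The depth-dependent attenuation correction and envelope step can be sketched roughly as follows (a deliberately crude stand-in with assumed parameter values; actual systems use proper envelope detection and calibrated gain curves):

```python
def b_mode_line(sound_line, fs_hz=20e6, c_m_per_s=1540.0,
                atten_db_per_cm_mhz=0.5, f_mhz=5.0):
    """Boost each sample in proportion to the estimated round-trip
    attenuation at its depth (depth inferred from echo arrival time via
    the sound velocity), then take the magnitude as a rough envelope."""
    out = []
    for i, s in enumerate(sound_line):
        depth_cm = (i / fs_hz) * c_m_per_s / 2.0 * 100.0  # round trip: halve
        gain_db = 2.0 * atten_db_per_cm_mhz * f_mhz * depth_cm  # two-way loss
        out.append(abs(s) * 10.0 ** (gain_db / 20.0))
    return out
```

With equal-amplitude input samples, the output grows with depth, compensating the signal lost to tissue attenuation.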
- the DSC 33 converts the B-mode image signal generated by the signal processing unit 32 into an image signal according to a normal television signal scanning method (raster conversion).
- the image processing unit 34 performs various necessary image processing such as gradation processing on the B mode image signal input from the DSC 33, and then sends the B mode image signal to the display control unit 23 and the memory 25.
- the B-mode image signal processed by the image processing unit 34 is referred to as an ultrasonic image.
- the memory 25 is for storing the ultrasonic image generated by the image generation unit 22 and the radiation image T1 and the like transmitted from the server 4 to the communication unit 21 via the network NW.
- the ultrasonic image stored in the memory 25 is read out under the control of the main body control unit 29 and sent to the display control unit 23 and the image adjustment unit 27. Further, the radiation image T1 stored in the memory 25 is read out under the control of the main body control unit 29 and sent to the image adjustment unit 27.
- As the memory 25, a recording medium such as a flash memory, an HDD (Hard Disk Drive), an SSD (Solid State Drive), an FD (Flexible Disk), an MO disk (Magneto-Optical disk), an MT (Magnetic Tape), a RAM (Random Access Memory), a CD (Compact Disc), a DVD (Digital Versatile Disc), an SD card (Secure Digital card), or a USB memory (Universal Serial Bus memory) can be used.
- The image adjustment unit 27 adjusts the radiation image T1 and the ultrasonic image, based on the radiation image orientation information stored in the tag of the radiation image T1 stored in the memory 25 and the probe orientation information of the ultrasonic probe 2 at the time the ultrasonic image was taken, so that the region of interest A1 captured in the radiation image T1 and the region of interest captured in the ultrasonic image are oriented in the same direction.
- the probe orientation information is information regarding the orientation of the subject in the ultrasonic image, such as the A direction, the P direction, the R direction, the L direction, the H direction, and the F direction.
- The image adjustment unit 27 sets the probe orientation information based on information input by the user's input operation via the input device 30. Thereby, for example, as shown in FIG. 6, four subject orientation marks indicating the orientation of the subject are added to the ultrasonic image U1: the R direction mark D5 indicating the R direction, the L direction mark D6 indicating the L direction, the A direction mark D7 indicating the A direction, and the P direction mark D8 indicating the P direction.
- The ultrasonic image U1 shown in FIG. 6 is an image of the breast of the subject.
- For the radiation image T1 shown in FIG. 2 and the ultrasonic image U1 shown in FIG. 6, the image adjusting unit 27 adjusts the images so that, for example, the R direction mark D1 and the R direction mark D5, the L direction mark D2 and the L direction mark D6, the A direction mark D3 and the A direction mark D7, and the P direction mark D4 and the P direction mark D8 each face the same direction.
- For example, the image adjusting unit 27 rotates the radiation image T1 shown in FIG. 2 clockwise by 90 degrees so that the R direction mark D1 side moves toward the P direction mark D4 side, and then flips the image left and right, that is, inverts the R direction mark D1 side and the L direction mark D2 side. The radiation image T1 can thereby be adjusted to the orientation shown in FIG. 7.
- In the ultrasonic image U1 shown in FIG. 6 and the radiation image T1 shown in FIG. 7, the R direction mark D1 and the R direction mark D5, the L direction mark D2 and the L direction mark D6, the A direction mark D3 and the A direction mark D7, and the P direction mark D4 and the P direction mark D8 are oriented in the same direction as each other, and the region of interest A1 in the radiographic image T1 and the region of interest A2 in the ultrasonic image U1 are also oriented in the same direction.
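As a minimal sketch of this rotate-then-flip adjustment, assuming the images are held as NumPy arrays (the function name is illustrative, not from the patent):

```python
import numpy as np

def rotate_cw_and_flip_lr(image: np.ndarray) -> np.ndarray:
    """Rotate an image 90 degrees clockwise, then mirror it left-right,
    matching the adjustment described for the radiation image T1."""
    rotated = np.rot90(image, k=-1)  # k=-1 rotates clockwise
    return np.fliplr(rotated)

# A small 2x3 array whose values encode their original positions.
img = np.array([[1, 2, 3],
                [4, 5, 6]])
adjusted = rotate_cw_and_flip_lr(img)
print(adjusted)
```

Applying the same transform consistently before display keeps the radiation image and the ultrasonic image in matching orientations.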
- the main body control unit 29 controls each part of the diagnostic apparatus main body 3 according to a program or the like recorded in advance.
- Under the control of the main body control unit 29, the display control unit 23 performs predetermined processing on the ultrasonic image U1 generated by the image generation unit 22 and on the radiation image T1 transmitted from the server 4 to the communication unit 21 via the network NW, and displays the images on the monitor 24.
- the monitor 24 performs various displays under the control of the display control unit 23.
- the monitor 24 includes, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL display (Organic Electroluminescence Display).
- The input device 30 of the diagnostic device main body 3 is a device for the user to perform input operations.
- The input device 30 is composed of, for example, a keyboard, a mouse, a trackball, a touchpad, a touch panel, or the like.
- The processor 31, comprising the communication unit 21, the image generation unit 22, the display control unit 23, the image adjustment unit 27, and the main body control unit 29, is composed of a CPU (Central Processing Unit) and control programs that cause the CPU to perform various processing. However, the processor 31 may instead be configured using an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), a GPU (Graphics Processing Unit), or another IC (Integrated Circuit), or by combining these.
- the communication unit 21, the image generation unit 22, the display control unit 23, the image adjustment unit 27, and the main body control unit 29 can be partially or wholly integrated into one CPU or the like.
- In step S1, based on an input operation of the user via the input device 30, the radiation image T1 stored in the server 4 is transmitted to the communication unit 21 via the network NW and saved in the memory 25.
- this radiographic image T1 includes a region of interest A1 suspected to be a lesion.
- The tag of the radiographic image T1 stores the radiation image orientation information, and based on this information, the R direction mark D1, the L direction mark D2, the A direction mark D3, and the P direction mark D4 are arranged on the radiation image T1.
- In step S2, a plurality of frames of the ultrasonic image U are taken while the user holds the ultrasonic probe 2 in contact with the body surface of the subject.
- the transmission / reception circuit 12 performs reception focus processing using preset sound velocity values under the control of the probe control unit 15 to generate a sound line signal.
- the sound line signal generated by the transmission / reception circuit 12 in this way is transmitted to the image generation unit 22.
- the image generation unit 22 generates an ultrasonic image U1 as shown in FIG. 6 by using the sound line signal transmitted from the transmission / reception circuit 12.
- the generated ultrasonic image U1 is stored in the memory 25.
- In step S3, the image adjustment unit 27 sets the probe orientation information based on the user's input operation via the input device 30.
- Specifically, the user confirms the direction of the marker M formed on the ultrasonic probe 2 as shown in FIG. 3 and the orientation of the subject, and inputs information regarding the orientation of the subject in the ultrasonic image U1.
- Thereby, the R direction mark D5, the L direction mark D6, the A direction mark D7, and the P direction mark D8 are arranged on the ultrasonic image U1.
- In step S4, the image adjusting unit 27 adjusts the radiation image T1 and the ultrasonic image U1, based on the radiation image orientation information stored in the tag of the radiation image T1 saved in step S1 and the probe orientation information set in step S3, so that the region of interest A1 captured in the radiation image T1 and the region of interest A2 captured in the ultrasonic image U1 are oriented in the same direction.
- For example, the image adjusting unit 27 adjusts the radiation image T1 to the orientation shown in FIG. 7 by rotating the radiation image T1 shown in FIG. 2 clockwise and then flipping it left and right. As a result, the orientation of the region of interest A1 in the radiographic image T1 shown in FIG. 7 and the orientation of the region of interest A2 in the ultrasonic image U1 shown in FIG. 6 can be made the same, which makes it easier for the user to compare the region of interest A1 in the radiographic image T1 with the region of interest A2 in the ultrasonic image U1.
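The choice of rotation and flip can be derived directly from the direction marks of the two images. A minimal sketch, assuming each image's orientation is encoded by the anatomical labels at its top and right edges (this encoding and the function names are illustrative, not from the patent):

```python
# Opposite anatomical directions (R/L: right/left, A/P: anterior/posterior).
OPP = {"R": "L", "L": "R", "A": "P", "P": "A"}

def rot_cw(o: dict) -> dict:
    """Orientation labels after rotating the image 90 degrees clockwise:
    the old left edge becomes the new top, the old top the new right."""
    return {"top": OPP[o["right"]], "right": o["top"]}

def flip_lr(o: dict) -> dict:
    """Orientation labels after a left-right flip."""
    return {"top": o["top"], "right": OPP[o["right"]]}

def find_transform(src: dict, dst: dict):
    """Search the 8 combinations of quarter-turns plus an optional
    left-right flip for the one mapping src orientation onto dst.
    Returns (number_of_cw_quarter_turns, flip_needed) or None."""
    for flip in (False, True):
        o = dict(src)
        for k in range(4):
            cand = flip_lr(o) if flip else o
            if cand == dst:
                return (k, flip)
            o = rot_cw(o)
    return None

# E.g. an image with "A" at the top and "L" on the right, to be shown
# with "R" at the top and "P" on the right:
print(find_transform({"top": "A", "right": "L"},
                     {"top": "R", "right": "P"}))
```

The returned pair can then drive the actual pixel operations (rotation first, then the optional flip), matching the order of operations described in the text.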
- In step S5, as shown in FIG. 9, the radiation image T1 and the ultrasonic image U1 adjusted in step S4 are displayed on the monitor 24. This completes the operation of the ultrasonic diagnostic apparatus 1 according to the first embodiment, shown in the flowchart of FIG. 8.
- As described above, based on the radiation image orientation information stored in the tag of the radiation image T1 and the probe orientation information of the ultrasonic probe 2 at the time the ultrasonic image U1 was taken, the radiographic image T1 and the ultrasonic image U1 are adjusted so that the orientation of the region of interest A1 in the radiographic image T1 and the orientation of the region of interest A2 in the ultrasonic image U1 are the same. Therefore, the user can easily compare the region of interest A1 in the radiographic image T1 with the region of interest A2 in the ultrasonic image U1, which improves the diagnostic accuracy for the regions of interest A1 and A2.
- In the first embodiment, the image generation unit 22 is provided in the diagnostic apparatus main body 3, but it may instead be provided in the ultrasonic probe 2.
- Although the ultrasonic probe 2 and the diagnostic device main body 3 are described as connected by wired communication, they can also be connected to each other by wireless communication.
- Although the diagnostic device main body 3 is described as having one memory 25, it can also be provided with a plurality of memories depending on the application.
- the radiographic image T1 is transmitted from the server 4 to the diagnostic apparatus main body 3 via the network NW, the radiographic image T1 is not limited to being transmitted from the server 4.
- the radiation image T1 can also be transmitted from a radiation diagnostic device (not shown) to the diagnostic device main body 3.
- the shape of the marker M is not particularly limited as long as the direction of the ultrasonic probe 2 can be indicated.
- the marker M may have, for example, a recessed shape, or may have a planar shape and a pattern.
- In the above description, the R direction mark D1, the L direction mark D2, the A direction mark D3, the P direction mark D4, and the like are arranged on the radiation image T1, but the form of the mark indicating the direction is not limited to this. For example, by arranging a so-called schema that schematically represents the breast on the radiation image T1, and placing on the schema a mark indicating the direction of the radiation source at the time the radiation image T1 was taken, the direction in the radiation image T1 can be indicated.
- Similarly, for the ultrasonic image U1, the form of the mark indicating the direction is not limited to the R direction mark D5, the L direction mark D6, the A direction mark D7, the P direction mark D8, and the like. For example, by arranging a schema representing the breast on the ultrasonic image U1 and superimposing a probe mark on the schema, the direction in the ultrasonic image U1 can be indicated.
- The position and orientation of the probe mark superimposed on the schema on the ultrasonic image U1 can be set by the user's input operation via the input device 30.
- In this case, the image adjustment unit 27 can set the probe orientation information based on the orientation of the probe mark input by the user.
- The mark indicating the direction in the radiographic image T1 and the mark indicating the direction in the ultrasonic image U1 need not be displayed on the monitor 24. However, by displaying these marks on the monitor 24, the user can easily grasp that the orientation of the region of interest A1 in the radiographic image T1 and the orientation of the region of interest A2 in the ultrasonic image U1 are the same.
- The inversion process is not particularly limited to left-right inversion.
- For example, the image adjusting unit 27 can also perform a so-called upside-down process of inverting the R direction mark D1 side and the L direction mark D2 side of the radiation image T1 shown in FIG. 2.
- Further, the image adjusting unit 27 is not limited to inverting the radiation image T1 about an axis that divides it into two equal parts in the left-right direction or about an axis that divides it into two equal parts in the vertical direction; it can also set an arbitrary axis and invert the radiation image T1 about that axis.
- Further, the rotation processing and inversion processing may be applied to the ultrasonic image U1 instead of the radiation image T1, or to both the radiation image T1 and the ultrasonic image U1.
- In addition, the shooting directions of the radiation image T1 and the ultrasonic image U1 are not particularly limited.
- For example, for a radiographic image T2 in which the breast of the subject is photographed from the so-called MLO (Medio-Lateral Oblique) direction and an ultrasonic image U2 taken from the corresponding direction, the orientation of the region of interest A3 in the radiation image T2 and the orientation of the region of interest A4 in the ultrasonic image U2 can likewise be made the same.
- In this case, for example, the HR (Head Right) direction mark C1 and the H direction mark C5, the FL (Foot Left) direction mark C2 and the F direction mark C6, the A direction mark C3 and the A direction mark C7, and the P direction mark C4 and the P direction mark C8 each face the same direction.
- Further, when the radiation image T2 is acquired by photographing from the MLO direction with the radiation source rotated by the rotation angle Q as shown in FIG. 11, the image adjusting unit 27 can adjust the radiation image T2 and the ultrasonic image U2 after applying a rotation conversion to the radiation image T2 based on the angle Q.
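One way to realize this rotation conversion is an interpolated rotation of the image by the source angle Q before the orientation adjustment. A sketch using SciPy (the sign convention for Q, the interpolation order, and the padding value are assumptions, not from the patent):

```python
import numpy as np
from scipy.ndimage import rotate

def rotate_by_source_angle(image: np.ndarray, angle_q_deg: float) -> np.ndarray:
    """Rotate the radiation image by the radiation-source rotation angle Q
    (degrees), expanding the canvas so no pixels are clipped."""
    return rotate(image, angle=angle_q_deg, reshape=True,
                  order=1, mode="constant", cval=0.0)

# An MLO-like image placeholder, 100 rows x 60 columns.
img = np.zeros((100, 60), dtype=float)
derotated = rotate_by_source_angle(img, 45.0)
print(derotated.shape)
```

With `reshape=True` the output canvas grows to hold the rotated footprint; the subsequent orientation adjustment then proceeds as in the CC-view case.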
- The image adjusting unit 27 can adjust the radiographic image T2 and the ultrasonic image U2 not only when the subject's breast is photographed from the MLO direction, but also when the radiation image T1 is acquired with the radiation source arranged in a direction inclined with respect to the vertical direction. As a result, the user can more easily compare the region of interest A3 in the radiographic image T2 with the region of interest A4 in the ultrasonic image U2, so that the diagnostic accuracy for the subject can be improved.
- Further, the tag of the radiation image T1 can include radiation image breast information indicating which of the left and right breasts of the subject is captured in the radiation image T1, and, for example, the user can input via the input device 30 ultrasonic image breast information indicating which of the left and right breasts of the subject is captured in the ultrasonic image U1.
- In this case, based on the radiation image breast information and the ultrasonic image breast information, the image adjusting unit 27 can adjust the radiation image T1 and the ultrasonic image U1 so that the orientation of the region of interest A1 in the radiation image T1 and the orientation of the region of interest A2 in the ultrasonic image U1 are the same, only when the breast of the subject captured in the radiation image T1 and the breast of the subject captured in the ultrasonic image U1 match each other.
- This allows the user to easily compare the region of interest A1 and the region of interest A2 in the radiographic image T1 and the ultrasonic image U1 in which the same breast is imaged, so that the diagnostic accuracy for the subject can be improved.
- The image adjustment unit 27 can further adjust the radiation image T1 and the ultrasonic image U1, already adjusted and displayed on the monitor 24, based on readjustment information input by the user via the input device 30. In this way, the radiographic image T1 and the ultrasonic image U1 can be readjusted so that the user can compare them more easily, further improving the diagnostic accuracy for the subject.
- Further, the main body control unit 29 can also rotate and invert a partial region including the region of interest A1 in the radiation image T1 and a partial region including the region of interest A2 designated by the user in the ultrasonic image U1, and display them on the monitor 24.
- For example, a sub-window W1 is arranged on the radiation image T1 and a sub-window W2 is arranged on the ultrasonic image U1. The region of interest B1 in the radiation image T1, adjusted so that the orientation of the region of interest A1 and the orientation of the region of interest A2 are the same, can be displayed in the sub-window W1 on the radiation image T1, and the region of interest B2 in the ultrasonic image U1, adjusted in the same way, can be displayed in the sub-window W2 on the ultrasonic image U1. This allows the user to easily compare the region of interest B1 and the region of interest B2 facing in the same direction.
- Further, based on, for example, the inter-pixel distance of the radiation image T1 and the inter-pixel distance of the ultrasonic image U1, the main body control unit 29 can enlarge and display the region of interest B1 or the region of interest B2 so that the region of interest B1 in the sub-window W1 and the region of interest B2 in the sub-window W2 have the same size. This allows the user to compare the region of interest B1 and the region of interest B2 more easily.
- The main body control unit 29 may also enlarge both the region of interest B1 and the region of interest B2 so that the region of interest B1 in the sub-window W1 and the region of interest B2 in the sub-window W2 have the same size.
- The inter-pixel distance of the radiation image T1 and of the ultrasonic image U1 is the actual length per pixel in each image.
- Information regarding the inter-pixel distance of the ultrasonic image U1 is stored in advance in, for example, the image adjusting unit 27, and information regarding the inter-pixel distance of the radiation image T1 is stored in advance in, for example, the tag of the radiation image T1.
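Using these inter-pixel distances, the equal-size display can be sketched as resampling one region so that both cover the same physical length per pixel (the function name, mm units, and linear interpolation are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import zoom

def match_physical_scale(roi_a: np.ndarray, spacing_a_mm: float,
                         roi_b: np.ndarray, spacing_b_mm: float):
    """Resample roi_b so that one of its pixels covers the same physical
    length as one pixel of roi_a, using the per-image inter-pixel
    distances (mm per pixel). Returns (roi_a, resampled_roi_b)."""
    factor = spacing_b_mm / spacing_a_mm  # >1 means roi_b must be enlarged
    return roi_a, zoom(roi_b, factor, order=1)

a = np.ones((50, 50))  # e.g. a radiograph ROI at 0.1 mm/pixel
b = np.ones((25, 25))  # e.g. an ultrasound ROI at 0.2 mm/pixel
a2, b2 = match_physical_scale(a, 0.1, b, 0.2)
print(b2.shape)
```

The ratio of the two inter-pixel distances alone determines the zoom factor, which matches the size-ratio determination described in the text.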
- Further, the main body control unit 29 can display, in the sub-window W1, marks indicating the direction in the radiation image T1, such as the R direction mark F1, the L direction mark F2, the A direction mark F3, and the P direction mark F4.
- Similarly, the main body control unit 29 can also display marks indicating the direction in the ultrasonic image U1 in the sub-window W2 on the ultrasonic image U1.
- Embodiment 2 In the first embodiment, an example is described in which the region of interest A2 in the ultrasonic image U1 is set based on a user's input operation via the input device 30; however, the region of interest A2 can also be extracted by performing image analysis on the ultrasonic image U1.
- The ultrasonic diagnostic apparatus 1A includes a diagnostic apparatus main body 3A instead of the diagnostic apparatus main body 3 of the ultrasonic diagnostic apparatus 1 of the first embodiment shown in FIG. 1.
- In the diagnostic device main body 3A, a region of interest extraction unit 41 is added, and a main body control unit 29A is provided instead of the main body control unit 29.
- A processor 31A including the region of interest extraction unit 41 is thus configured.
- The region of interest extraction unit 41 is connected to the memory 25, and the image adjustment unit 27 and the main body control unit 29A are connected to the region of interest extraction unit 41.
- the region of interest extraction unit 41 extracts the region of interest A2 in the ultrasonic image U1 by performing image analysis on the ultrasonic image U1 stored in the memory 25.
- For example, the region of interest extraction unit 41 stores typical pattern data of the region of interest A2 in advance as a template, calculates the similarity to the pattern data while searching the ultrasonic image U1, and can regard the location where the similarity is at or above a threshold value and at its maximum as containing the region of interest A2.
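The template search described above can be sketched as a brute-force normalized cross-correlation (an unoptimized illustration; a real system would use an optimized matcher or a learned detector, and the threshold value here is an assumption):

```python
import numpy as np

def find_roi(image: np.ndarray, template: np.ndarray, threshold: float = 0.8):
    """Slide the template over the image, score each offset with
    normalized cross-correlation, and return the top-left offset of the
    best match if its score reaches the threshold (else None)."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -1.0, None
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat patch or flat template: similarity undefined
            score = float((p * t).sum() / denom)
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos if best_score >= threshold else None

# Synthetic image with a 4x4 gradient "lesion" at row 5, column 7.
img = np.zeros((20, 20))
img[5:9, 7:11] = np.arange(16, dtype=float).reshape(4, 4)
tmpl = np.arange(16, dtype=float).reshape(4, 4)
print(find_roi(img, tmpl))
```

Returning `None` when no offset reaches the threshold mirrors the "threshold value and maximum" criterion in the text.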
- the image adjusting unit 27 adjusts the radiographic image T1 and the ultrasonic image U1 so that the orientation of the region of interest A1 in the radiographic image T1 and the orientation of the region of interest A2 in the ultrasonic image U1 are the same.
- As described above, also in the second embodiment, as in the first embodiment, the radiation image T1 and the ultrasonic image U1 are adjusted so that the orientation of the region of interest A1 in the radiation image T1 and the orientation of the region of interest A2 in the ultrasonic image U1 are the same. The user can therefore easily compare the region of interest A1 in the radiation image T1 with the region of interest A2 in the ultrasonic image U1, and the diagnostic accuracy for the regions of interest A1 and A2 can be improved.
- Further, as shown in FIG., the main body control unit 29 can also rotate and invert a partial region including the region of interest A1 in the radiation image T1 and a partial region including the region of interest A2 extracted by the region of interest extraction unit 41 in the ultrasonic image U1, and display them on the monitor 24.
- Further, based on, for example, the inter-pixel distance of the radiation image T1 and the inter-pixel distance of the ultrasonic image U1, the main body control unit 29 can enlarge and display the region of interest B1 or the region of interest B2 so that the region of interest B1 in the sub-window W1 and the region of interest B2 in the sub-window W2 have the same size. This allows the user to compare the region of interest B1 and the region of interest B2 more easily.
- Further, the main body control unit 29 can display, in the sub-window W1, marks indicating the direction in the radiation image T1, such as the R direction mark F1, the L direction mark F2, the A direction mark F3, and the P direction mark F4.
- Similarly, the main body control unit 29 can also display marks indicating the direction in the ultrasonic image U1 in the sub-window W2 on the ultrasonic image U1.
- In the second embodiment, the region of interest extraction unit 41 performs image analysis on the ultrasonic image U1 to extract the region of interest A2, but it can also perform image analysis on the radiation image T1 to extract the region of interest A1 in the radiation image T1.
- Embodiment 3 In the first embodiment, the probe orientation information is set based on an input operation of the user via the input device 30; however, for example, the orientation of the ultrasonic probe 2 may be detected, and the probe orientation information may be set based on the detected orientation of the ultrasonic probe 2.
- The ultrasonic diagnostic apparatus 1B according to the third embodiment includes an ultrasonic probe 2B instead of the ultrasonic probe 2 of the ultrasonic diagnostic apparatus 1 according to the first embodiment shown in FIG. 1.
- The ultrasonic probe 2B is the ultrasonic probe 2 of the first embodiment to which a position sensor 42 is added.
- the position sensor 42 is connected to the memory 25 of the diagnostic apparatus main body 3 and the main body control unit 29.
- The position sensor 42 is a sensor for detecting position information including the orientation of the ultrasonic probe 2B, and can include, for example, a magnetic sensor, an optical position sensor, an acceleration sensor, a gyro sensor, or a GPS (Global Positioning System) sensor.
- The position information of the ultrasonic probe 2B detected by the position sensor 42 is sent to the memory 25 of the diagnostic apparatus main body 3 and, under the control of the main body control unit 29, is associated with the ultrasonic image U1 and stored in the memory 25 each time an ultrasonic image U1 is generated by the image generation unit 22.
- the image adjusting unit 27 receives the ultrasonic image U1 and the radiographic image T1 from the memory 25, sets the probe orientation information based on the position information of the ultrasonic probe 2B associated with the ultrasonic image U1, and sets the probe orientation information. Based on the information and the radiographic image orientation information of the radiographic image T1, the radiographic image T1 and the ultrasonic image U1 so that the orientation of the region of interest A1 in the radiographic image T1 and the orientation of the region of interest A2 in the ultrasonic image U1 are the same. To adjust.
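One simple way to turn a detected probe direction into probe orientation information is to quantize the sensor's direction vector to the nearest anatomical axis. A sketch under an assumed patient coordinate frame (the frame, the axis labels, and the function name are illustrative, not from the patent):

```python
import numpy as np

# Assumed patient frame: x points to the patient's left, y to anterior,
# z to the head.
AXIS_LABELS = {
    "L": np.array([1.0, 0.0, 0.0]),
    "R": np.array([-1.0, 0.0, 0.0]),
    "A": np.array([0.0, 1.0, 0.0]),
    "P": np.array([0.0, -1.0, 0.0]),
    "H": np.array([0.0, 0.0, 1.0]),
    "F": np.array([0.0, 0.0, -1.0]),
}

def probe_orientation_label(direction: np.ndarray) -> str:
    """Quantize a probe direction vector (e.g. from the position sensor
    42) to the nearest anatomical direction label by maximum dot product."""
    d = direction / np.linalg.norm(direction)
    return max(AXIS_LABELS, key=lambda k: float(AXIS_LABELS[k] @ d))

print(probe_orientation_label(np.array([0.9, 0.1, 0.0])))
```

The resulting label per frame can then stand in for the user-entered probe orientation information when adjusting the two images.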
- As described above, also in the third embodiment, as in the first embodiment, the radiation image T1 and the ultrasonic image U1 are adjusted so that the orientation of the region of interest A1 in the radiation image T1 and the orientation of the region of interest A2 in the ultrasonic image U1 are the same. The user can therefore easily compare the region of interest A1 in the radiation image T1 with the region of interest A2 in the ultrasonic image U1, and the diagnostic accuracy for the regions of interest A1 and A2 can be improved.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Radiology & Medical Imaging (AREA)
- Computer Vision & Pattern Recognition (AREA)
- High Energy & Nuclear Physics (AREA)
- Optics & Photonics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Physiology (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Abstract
Description
The image adjusting unit can generate the adjusted radiation image and ultrasonic image by applying at least one of rotation processing and inversion processing to at least one of the entire radiation image and the entire ultrasonic image.
The image adjusting unit can also generate the adjusted radiation image and ultrasonic image so that the region of interest captured in the radiation image and the region of interest captured in the ultrasonic image have the same size.
In this case, the image adjusting unit preferably determines the size ratio of the adjusted radiation image and ultrasonic image based on the inter-pixel distance of the radiation image and the inter-pixel distance of the ultrasonic image.
The image adjusting unit can also further adjust the already adjusted radiation image and ultrasonic image based on readjustment information input by the user.
The following description of the constituent elements is based on representative embodiments of the present invention, but the present invention is not limited to such embodiments.
In this specification, a numerical range expressed using "~" means a range including the numerical values written before and after "~" as its lower limit and upper limit.
In this specification, the terms "identical" and "same" include the error ranges generally accepted in the technical field.
FIG. 1 shows the configuration of the ultrasonic diagnostic apparatus 1 according to Embodiment 1 of the present invention. The ultrasonic diagnostic apparatus 1 includes the ultrasonic probe 2 and the diagnostic apparatus main body 3, which are connected to each other. The diagnostic apparatus main body 3 is also connected to an external server 4 via the network NW.
The communication unit 21, the image generation unit 22, the display control unit 23, the image adjustment unit 27, and the main body control unit 29 constitute the processor 31.
The image processing unit 34 applies various necessary image processing, such as gradation processing, to the B-mode image signal input from the DSC 33, and then sends the B-mode image signal to the display control unit 23 and the memory 25. Hereinafter, the B-mode image signal processed by the image processing unit 34 is referred to as an ultrasonic image.
Under the control of the main body control unit 29, the display control unit 23 performs predetermined processing on the ultrasonic image U1 generated by the image generation unit 22 and on the radiation image T1 transmitted from the server 4 to the communication unit 21 via the network NW, and displays them on the monitor 24.
First, in step S1, based on an input operation of the user via the input device 30, the radiation image T1 stored in the server 4 is transmitted to the communication unit 21 via the network NW and saved in the memory 25. As shown in FIG. 2, this radiation image T1 includes a region of interest A1 suspected of being a lesion. The tag of the radiation image T1 stores the radiation image orientation information, and based on this information, the R direction mark D1, the L direction mark D2, the A direction mark D3, and the P direction mark D4 are arranged on the radiation image T1.
At this time, under the control of the probe control unit 15, the transmission/reception circuit 12 performs reception focus processing using a preset sound velocity value to generate a sound line signal. The sound line signal generated by the transmission/reception circuit 12 in this way is sent to the image generation unit 22. The image generation unit 22 generates an ultrasonic image U1 as shown in FIG. 6 using the sound line signal sent from the transmission/reception circuit 12. The generated ultrasonic image U1 is saved in the memory 25.
This makes it easier for the user to compare the region of interest A1 in the radiation image T1 with the region of interest A2 in the ultrasonic image U1.
In this way, the operation of the ultrasonic diagnostic apparatus 1 according to Embodiment 1, shown in the flowchart of FIG. 8, is completed.
The image adjusting unit 27 can adjust the radiation image T2 and the ultrasonic image U2 not only when the subject's breast is photographed from the MLO direction, but also when the radiation image T1 is acquired with the radiation source arranged in a direction inclined with respect to the vertical direction.
This allows the user to compare the region of interest A3 in the radiation image T2 with the region of interest A4 in the ultrasonic image U2 more easily, so that the diagnostic accuracy for the subject can be improved.
This allows the user to easily compare the region of interest A1 and the region of interest A2 in the radiation image T1 and the ultrasonic image U1 in which the same breast is imaged, so that the diagnostic accuracy for the subject can be improved.
This allows the user to easily compare the region of interest B1 and the region of interest B2 facing in the same direction.
The main body control unit 29 may enlarge both the region of interest B1 and the region of interest B2 so that the region of interest B1 in the sub-window W1 and the region of interest B2 in the sub-window W2 have the same size.
For the same reason, the main body control unit 29 can also display marks representing the direction in the ultrasonic image U1 in the sub-window W2 on the ultrasonic image U1.
In Embodiment 1, an example is described in which the region of interest A2 in the ultrasonic image U1 is set based on an input operation of the user via the input device 30; however, the region of interest A2 can also be extracted by performing image analysis on the ultrasonic image U1.
The diagnostic apparatus main body 3A is the diagnostic apparatus main body 3 of Embodiment 1 to which the region of interest extraction unit 41 is added, and it includes the main body control unit 29A instead of the main body control unit 29. In addition, the processor 31A including the region of interest extraction unit 41 is configured instead of the processor 31.
For the same reason, the main body control unit 29 can also display marks representing the direction in the ultrasonic image U1 in the sub-window W2 on the ultrasonic image U1.
In Embodiment 1, the probe orientation information is set based on an input operation of the user via the input device 30; however, for example, the orientation of the ultrasonic probe 2 may be detected, and the probe orientation information may be set based on the detected orientation of the ultrasonic probe 2.
The ultrasonic probe 2B in Embodiment 3 is the ultrasonic probe 2 of Embodiment 1 to which the position sensor 42 is added. The position sensor 42 is connected to the memory 25 of the diagnostic apparatus main body 3 and to the main body control unit 29.
Claims (11)
- An ultrasonic probe;
an image generation unit that generates, by transmitting and receiving an ultrasonic beam to and from a subject using the ultrasonic probe, an ultrasonic image including a region of interest of the breast of the subject captured in a radiation image;
an image adjustment unit that adjusts the radiation image and the ultrasonic image, based on radiation image orientation information stored in a tag of the radiation image and probe orientation information of the ultrasonic probe at the time the ultrasonic image was taken, so that the region of interest captured in the radiation image and the region of interest captured in the ultrasonic image are oriented in the same direction; and
a monitor that displays the adjusted radiation image and the adjusted ultrasonic image adjusted by the image adjustment unit;
- an ultrasonic diagnostic apparatus comprising the above. - The ultrasonic diagnostic apparatus according to claim 1, wherein the probe orientation information is position information of the ultrasonic probe designated by a user, or position information detected by a position sensor mounted on the ultrasonic probe.
- The ultrasonic diagnostic apparatus according to claim 1 or 2, wherein the image adjustment unit generates the adjusted radiation image and ultrasonic image by applying at least one of rotation processing and inversion processing to at least one of the entire radiation image and the entire ultrasonic image.
- The ultrasonic diagnostic apparatus further comprising a region of interest extraction unit that extracts the region of interest from each of the radiation image and the ultrasonic image,
wherein the image adjustment unit generates the adjusted radiation image and ultrasonic image by applying at least one of rotation processing and inversion processing to at least one of the region of interest extracted from the radiation image and the region of interest extracted from the ultrasonic image, according to claim 1 or 2. - The ultrasonic diagnostic apparatus according to any one of claims 1 to 4, wherein the image adjustment unit superimposes, on the adjusted radiation image and ultrasonic image, subject orientation marks representing the orientation of the subject.
- The ultrasonic diagnostic apparatus according to any one of claims 1 to 5, wherein the image adjustment unit generates the adjusted radiation image and ultrasonic image so that the region of interest captured in the radiation image and the region of interest captured in the ultrasonic image have the same size.
- The ultrasonic diagnostic apparatus according to claim 6, wherein the image adjustment unit determines the size ratio of the adjusted radiation image and ultrasonic image based on the inter-pixel distance of the radiation image and the inter-pixel distance of the ultrasonic image.
- The ultrasonic diagnostic apparatus according to any one of claims 1 to 7, wherein, when the radiation image is acquired with a radiation source arranged in a direction inclined with respect to the vertical direction, the image adjustment unit adjusts the radiation image and the ultrasonic image after applying a rotation conversion to the radiation image based on the rotation angle of the radiation source.
- The ultrasonic diagnostic apparatus according to any one of claims 1 to 8, wherein the tag of the radiation image includes radiation image breast information indicating which of the left and right breasts of the subject is captured in the radiation image, and
the image adjustment unit adjusts the radiation image and the ultrasonic image, based on the radiation image breast information and on information input by the user indicating which of the left and right breasts of the subject is captured in the ultrasonic image, when the breast of the subject captured in the radiation image and the breast of the subject captured in the ultrasonic image match each other. - The ultrasonic diagnostic apparatus according to any one of claims 1 to 9, wherein the image adjustment unit further adjusts the already adjusted radiation image and ultrasonic image based on readjustment information input by the user.
- A control method of an ultrasonic diagnostic apparatus, comprising: generating, by transmitting and receiving an ultrasonic beam to and from a subject using an ultrasonic probe, an ultrasonic image including a region of interest of the breast of the subject captured in a radiation image;
adjusting the radiation image and the ultrasonic image, based on radiation image orientation information stored in a tag of the radiation image and probe orientation information of the ultrasonic probe at the time the ultrasonic image was taken, so that the region of interest captured in the radiation image and the region of interest captured in the ultrasonic image are oriented in the same direction; and
displaying the adjusted radiation image and the adjusted ultrasonic image on a monitor.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022551864A JPWO2022065050A1 (ja) | 2020-09-28 | 2021-09-09 | |
EP21872181.9A EP4218603A4 (en) | 2020-09-28 | 2021-09-09 | ULTRASOUND DIAGNOSTIC DEVICE AND METHOD FOR CONTROLLING ULTRASOUND DIAGNOSTIC DEVICE |
US18/180,674 US20230225713A1 (en) | 2020-09-28 | 2023-03-08 | Ultrasound diagnostic apparatus and control method for ultrasound diagnostic apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020162263 | 2020-09-28 | ||
JP2020-162263 | 2020-09-28 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/180,674 Continuation US20230225713A1 (en) | 2020-09-28 | 2023-03-08 | Ultrasound diagnostic apparatus and control method for ultrasound diagnostic apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022065050A1 true WO2022065050A1 (ja) | 2022-03-31 |
Family
ID=80846498
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/033135 WO2022065050A1 (ja) | 2020-09-28 | 2021-09-09 | 超音波診断装置および超音波診断装置の制御方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230225713A1 (ja) |
EP (1) | EP4218603A4 (ja) |
JP (1) | JPWO2022065050A1 (ja) |
WO (1) | WO2022065050A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008086742A (ja) * | 2006-01-19 | 2008-04-17 | Toshiba Corp | 超音波プローブの軌跡表現装置及び超音波診断装置 |
JP2015100661A (ja) * | 2013-11-28 | 2015-06-04 | コニカミノルタ株式会社 | 医用画像システム及びプログラム |
JP2015231438A (ja) * | 2014-06-09 | 2015-12-24 | 株式会社東芝 | 医用画像診断装置、超音波診断装置および医用画像処理装置 |
JP2018050761A (ja) * | 2016-09-27 | 2018-04-05 | キヤノン株式会社 | 画像処理装置および画像処理方法 |
JP2020039877A (ja) | 2018-09-13 | 2020-03-19 | キヤノンメディカルシステムズ株式会社 | 医用画像診断装置、医用画像診断方法及び超音波診断装置 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6396940B1 (en) * | 1999-05-27 | 2002-05-28 | Litton Systems, Inc. | Optical correlator based automated pathologic region of interest selector for integrated 3D ultrasound and digital mammography |
JP2004248818A (ja) * | 2003-02-19 | 2004-09-09 | Fuji Photo Film Co Ltd | 異常陰影の検出結果表示方法および装置並びにプログラム |
JP2014504918A (ja) * | 2010-12-14 | 2014-02-27 | ホロジック, インコーポレイテッド | 画像診断に使用するために複数の異なる画像処理システムからの3次元画像データを重ね合わせるシステムおよび方法 |
JP6258026B2 (ja) * | 2013-12-09 | 2018-01-10 | 東芝メディカルシステムズ株式会社 | 超音波診断装置 |
US10198822B2 (en) * | 2016-10-27 | 2019-02-05 | International Business Machines Corporation | Systems and user interfaces for determination of electro magnetically identified lesions as included in medical images of differing perspectives |
-
2021
- 2021-09-09 WO PCT/JP2021/033135 patent/WO2022065050A1/ja active Application Filing
- 2021-09-09 JP JP2022551864A patent/JPWO2022065050A1/ja active Pending
- 2021-09-09 EP EP21872181.9A patent/EP4218603A4/en active Pending
-
2023
- 2023-03-08 US US18/180,674 patent/US20230225713A1/en active Pending
Non-Patent Citations (3)
Title |
---|
CSURKA ET AL.: "Visual Categorization with Bags of Keypoints", PROC. OF ECCV WORKSHOP ON STATISTICAL LEARNING IN COMPUTER VISION, 2004, pages 59 - 74 |
KRIZHEVSKY ET AL.: "ImageNet Classification with Deep Convolutional Neural Networks", ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS, vol. 25, 2012, pages 1106 - 1114 |
See also references of EP4218603A4 |
Also Published As
Publication number | Publication date |
---|---|
EP4218603A4 (en) | 2024-03-27 |
EP4218603A1 (en) | 2023-08-02 |
JPWO2022065050A1 (ja) | 2022-03-31 |
US20230225713A1 (en) | 2023-07-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JPH11123192A (ja) | Image generation and display device for body parts | |
JP2009082402A (ja) | Medical image diagnostic system, medical imaging apparatus, medical image storage apparatus, and medical image display apparatus | |
US20120029344A1 (en) | Radiological image radiographiing method and apparatus | |
CN109788942B (zh) | Ultrasonic diagnostic apparatus and control method of ultrasonic diagnostic apparatus | |
JP2006288471A (ja) | Three-dimensional ultrasonic diagnostic apparatus and volume data display region setting method | |
US11116481B2 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
WO2022065050A1 (ja) | Ultrasonic diagnostic apparatus and control method of ultrasonic diagnostic apparatus | |
JP7453400B2 (ja) | Ultrasound system and control method of ultrasound system | |
US20230200783A1 (en) | Ultrasound system and control method of ultrasound system | |
US20230200779A1 (en) | Ultrasound system and control method of ultrasound system | |
JP7411109B2 (ja) | Ultrasonic diagnostic apparatus and control method of ultrasonic diagnostic apparatus | |
EP4309586A1 (en) | Ultrasonic diagnostic device and method for controlling ultrasonic diagnostic device | |
US11857375B2 (en) | Medical imaging system | |
EP4186437B1 (en) | Ultrasound image analysis apparatus, ultrasound diagnostic apparatus, and control method for ultrasound image analysis apparatus | |
US20230414203A1 (en) | Image display apparatus and control method of image display apparatus | |
US20230240654A1 (en) | Ultrasound diagnostic apparatus and display method of ultrasound diagnostic apparatus | |
US20230301618A1 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
WO2024127992A1 (ja) | Ultrasonic diagnostic apparatus and control method of ultrasonic diagnostic apparatus | |
US20230157670A1 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus | |
JP2023147906A (ja) | Ultrasonic diagnostic apparatus and control method of ultrasonic diagnostic apparatus | |
JP2024025865A (ja) | Control method of ultrasonic diagnostic apparatus and ultrasonic diagnostic apparatus | |
CN116158783A (zh) | Ultrasound image analysis apparatus, ultrasound diagnostic apparatus, and control method of ultrasound image analysis apparatus | |
JP2023141907A (ja) | Ultrasound diagnostic system and control method of ultrasound diagnostic system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21872181 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022551864 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2021872181 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021872181 Country of ref document: EP Effective date: 20230428 |