CN117653179A - Medical image processing device, X-ray diagnostic system, and storage medium - Google Patents


Info

Publication number
CN117653179A
Authority
CN
China
Prior art keywords
ultrasonic
detector
imaging object
catheter
image
Prior art date
Legal status
Pending
Application number
CN202311156008.2A
Other languages
Chinese (zh)
Inventor
吉田早纪
坂口卓弥
Current Assignee
Canon Medical Systems Corp
Original Assignee
Canon Medical Systems Corp
Priority date
Filing date
Publication date
Priority claimed from JP2023125592A (JP2024037685A)
Application filed by Canon Medical Systems Corp
Publication of CN117653179A


Landscapes

  • Apparatus For Radiation Diagnosis (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

A medical image processing device, an X-ray diagnostic system, and a storage medium are provided. The medical image processing device comprises a processing circuit which acquires an ultrasonic image generated from signals collected by a detector of an ultrasonic catheter inserted into a subject; acquires an X-ray image depicting an imaging object, the imaging object being at least one of the detector of the ultrasonic catheter, a medical device inserted into the body of the subject, and a region of interest within the subject; detects the position of the detector of the ultrasonic catheter depicted in the X-ray image and the position of the imaging object depicted in at least one of the ultrasonic image and the X-ray image; and, based on the position of the detector of the ultrasonic catheter and the position of the imaging object, calculates travel assistance information for causing the ultrasonic catheter to travel so that the imaging object enters the field of view of the detector, or travel assistance information for causing the ultrasonic catheter to travel so that the imaging object approaches the center of the field of view of the detector.

Description

Medical image processing device, X-ray diagnostic system, and storage medium
The present application is based on Japanese Patent Application No. 2022-142553 (filed September 7, 2022) and Japanese Patent Application No. 2023-125592 (filed August 1, 2023), and claims the benefit of priority from these applications, the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments disclosed in the present specification and drawings relate to a medical image processing apparatus, an X-ray diagnostic system, and a storage medium.
Background
Conventionally, in order to improve the accuracy and efficiency of treatment, a plurality of different types of medical image diagnostic apparatuses, such as an X-ray diagnostic apparatus, an ultrasonic diagnostic apparatus, an X-ray CT (Computed Tomography) apparatus, and a magnetic resonance imaging apparatus, are used in combination. As one example, an X-ray diagnostic apparatus and an ultrasonic diagnostic apparatus are used in combination in image-guided intervention (IVR: Interventional Radiology) using a catheter.
An X-ray diagnostic apparatus irradiates a subject with X-rays and images the X-rays transmitted through the subject. For example, in IVR using a catheter, a doctor inserts the catheter into the subject while confirming the position of the catheter in a blood vessel from a fluoroscopic image or a radiographic image of an X-ray diagnostic apparatus such as an X-ray angiography apparatus. After the catheter has been advanced to the treatment target site (or the diagnosis target site) of the subject, an ultrasound image may be used in combination to delineate tissues such as soft tissues, which are difficult to confirm in X-ray fluoroscopic and radiographic images.
In addition, it has been conventionally known that transesophageal echocardiography (TEE: Transesophageal Echocardiography) is useful for catheter treatment of structural heart disease (SHD: Structural Heart Disease), such as left atrial appendage occlusion, and for observation of the anatomical structures and physiological functions of the heart and great vessels. However, since TEE is an examination in which an ultrasonic endoscope is inserted through the mouth into the esophagus and the heart is observed from the esophagus, general anesthesia of the subject is generally required. On the other hand, intracardiac echocardiography (ICE: Intracardiac Echocardiography), which uses an ultrasound catheter incorporating a phased-array transducer for transmitting and receiving ultrasound at its tip, can be performed under local anesthesia and is therefore less invasive to the subject than TEE. Accordingly, in SHD treatment and the like, ICE used in combination with an X-ray diagnostic apparatus is attracting attention.
However, the imaging region of ICE (hereinafter referred to as the field of view), that is, the region of the ultrasound image generated from the signals collected by the ICE, is generally narrower than that of TEE. Consequently, part or all of the region of interest in SHD treatment or the like, such as the left atrial appendage, or of a medical device such as a left atrial appendage occlusion device, may fall outside the field of view of the ICE. When part or all of the region of interest is outside the field of view of the ICE, the operator needs to properly advance the ICE so that the region of interest is depicted within its field of view.
In addition, even when the treatment target site serving as the region of interest or the whole of the medical device is depicted in part of the field of view of the ICE, searching the field of view of the ICE for a two-dimensional ultrasound image that allows, for example, the indwelling state of the medical device to be confirmed, and displaying it, is a troublesome task. Furthermore, when part or all of the region of interest is not depicted in the field of view of the ICE, searching for a two-dimensional ultrasound image of the region of interest while performing the advancing operation of the ICE is a burden on the operator.
On the other hand, in recent years, catheter operation assisting robots that assist procedures using a catheter have also been developed. Such robots have been developed for the purpose of enabling catheter procedures to be performed from a remote location, or of fully or semi-automating catheter procedures. That is, the operator may also be a catheter operation assisting robot.
Disclosure of Invention
One of the technical problems to be solved by the embodiments disclosed in the present specification and drawings is to assist an operator in properly advancing an ultrasound catheter. However, the technical problems to be solved by the embodiments disclosed in the present specification and drawings are not limited to the above. Technical problems corresponding to the effects of the configurations described in the embodiments below may also be regarded as other technical problems to be solved.
A medical image processing device according to one embodiment includes a processing circuit that acquires an ultrasonic image generated from signals collected by a detector of an ultrasonic catheter inserted into the body of a subject; acquires an X-ray image depicting an imaging object, the imaging object being at least one of the detector of the ultrasonic catheter, a medical device inserted into the body of the subject, and a region of interest in the body of the subject; detects (a) the position of the detector of the ultrasonic catheter depicted in the X-ray image and (b) the position of the imaging object depicted in at least one of the ultrasonic image and the X-ray image; and, based on the detected position of the detector of the ultrasonic catheter and the detected position of the imaging object, calculates travel assistance information for causing the ultrasonic catheter to travel so that the imaging object enters the field of view of the detector, or travel assistance information for causing the ultrasonic catheter to travel so that the imaging object approaches the center of the field of view of the detector.
According to the medical image processing apparatus having the above configuration, the operator can be assisted in properly advancing the ultrasound catheter.
Drawings
Fig. 1 is a first schematic diagram showing a configuration example of an X-ray diagnostic system including a medical image processing apparatus according to the first embodiment.
Fig. 2 is a second schematic diagram showing a configuration example of an X-ray diagnostic system including the medical image processing apparatus according to the first embodiment.
Fig. 3 is a view for explaining an example of the surgical site according to the first embodiment.
Fig. 4 is a first view for explaining an example of the operation according to the first embodiment.
Fig. 5 is a second view for explaining an example of the operation of the first embodiment.
Fig. 6 is a schematic diagram for explaining a configuration example of the medical image processing apparatus according to the first embodiment.
Fig. 7 is a diagram showing an example of the operation of the medical image processing apparatus according to the first embodiment as a flowchart.
Fig. 8 is a diagram for explaining X-ray image capturing according to the first embodiment.
Fig. 9 is a diagram for explaining positional information of the ultrasonic detector and a field of view in an ultrasonic image according to the first embodiment.
Fig. 10 is a perspective view for explaining an example of (a) before and (b) after the travel assistance of the ultrasound catheter in the case where all of the medical device of the first embodiment is outside the field of view of the detector of the ultrasound catheter.
Fig. 11 is a perspective view for explaining an example of (a) before and (b) after the travel assistance of the ultrasound catheter in the case where part of the medical device of the first embodiment is outside the field of view of the detector of the ultrasound catheter.
Fig. 12 is a flowchart showing an example of the operation of the medical image processing apparatus according to modification 1 of the first embodiment.
Fig. 13 is a schematic diagram for explaining a configuration example of the medical image processing apparatus according to the second embodiment.
Fig. 14 is a diagram showing an example of the operation of the medical image processing apparatus according to the second embodiment as a flowchart.
Fig. 15 is a diagram for explaining an example of an ultrasound image and an output procedure of the ultrasound image according to the second embodiment.
Fig. 16 is a perspective view for explaining an example of (a) before the travel assistance and (b) after the travel assistance of the ultrasonic catheter 3 based on the ultrasonic image of the second embodiment.
Fig. 17 is a schematic diagram showing a configuration example of an X-ray diagnosis system in which the medical image processing apparatus according to the third embodiment and the auxiliary robot are disposed.
Fig. 18 is a schematic diagram for explaining a configuration example of the medical image processing apparatus according to the third embodiment.
Detailed Description
Embodiments of a medical image processing apparatus, an X-ray diagnostic system, and a storage medium are described in detail below with reference to the drawings.
(first embodiment)
Fig. 1 is a first schematic diagram showing a configuration example of an X-ray diagnostic system 1 including a medical image processing apparatus according to a first embodiment. For example, as shown in fig. 1, the X-ray diagnostic system 1 is configured to include a medical image processing apparatus 100 and an X-ray diagnostic apparatus 200. The medical image processing apparatus 100 and the X-ray diagnostic apparatus 200 are connected to each other via the network 2. The ultrasonic diagnostic apparatus 300 is also connected to the network 2.
Fig. 2 is a second schematic diagram showing a configuration example of the X-ray diagnostic system 1 including the medical image processing apparatus according to the first embodiment. Fig. 2 is a schematic diagram showing a configuration example of the X-ray diagnostic apparatus 200 according to the first embodiment and the ultrasonic diagnostic apparatus 300 connected to the X-ray diagnostic system 1.
As shown in fig. 2, the X-ray diagnostic apparatus 200 of the first embodiment includes a gantry apparatus 210, a couch 220, a controller 230, and an image processing apparatus 240. The gantry unit 210, the couch 220, and the controller 230 are generally provided in an operating room (examination/treatment room), while the image processing unit 240 is provided in a control room adjacent to the operating room.
The gantry unit 210 includes an X-ray high voltage generator 211, an X-ray irradiation device 212, a top plate (catheter table) 221, a C-arm 214, an X-ray detection device 215, a C-arm drive control mechanism 231, and a couch drive control mechanism 232.
The X-ray irradiation device 212 is provided at one end of the C-arm 214. The X-ray irradiation device 212 is provided to be movable back and forth under the control of the controller 230. The X-ray irradiation device 212 has an X-ray source 216 (e.g., an X-ray tube) and a movable aperture device 217. The X-ray tube receives a supply of high-voltage power from the X-ray high-voltage generator 211, and generates X-rays according to the conditions of the high-voltage power. The movable aperture device 217 movably supports aperture blades made of an X-ray shielding material at the X-ray irradiation port of the X-ray tube. The front surface of the X-ray tube may be provided with a beam quality adjustment filter for adjusting the quality of the X-rays generated by the X-ray tube.
The X-ray detection device 215 is provided at the other end of the C-arm 214 so as to face the X-ray irradiation device 212. The X-ray detection device 215 is configured to be movable back and forth under the control of the controller 230. The X-ray detection device 215 includes an FPD (Flat Panel Detector) 218 and an ADC (Analog to Digital Converter) 219.
The FPD 218 has a plurality of detection elements arranged two-dimensionally. The detection elements of the FPD 218 are arranged such that the scanning lines and the signal lines are orthogonal to each other. The front surface of the FPD 218 may be provided with a grid.
The ADC 219 converts the projection data of the time-series analog signal (video signal) output from the FPD 218 into a digital signal, and outputs the digital signal to the image processing apparatus 240.
The C-arm 214 is arranged so that the X-ray irradiation device 212 and the X-ray detection device 215 face each other across the subject. The C-arm 214 moves the X-ray irradiation device 212 and the X-ray detection device 215 integrally in the arc direction of the C-arm 214 by means of the C-arm drive control mechanism 231 under the control of the controller 230. The X-ray diagnostic apparatus 200 is described here as having the C-arm 214, with the C-arm 214 moving the X-ray irradiation device 212 and the X-ray detection device 215 integrally, but the configuration is not limited to this. For example, the X-ray diagnostic apparatus 200 may be configured to move the X-ray irradiation device 212 and the X-ray detection device 215 independently, without the C-arm 214.
Fig. 2 shows a configuration example of a single-plane X-ray diagnostic apparatus 200 having only one C-arm, but the X-ray diagnostic apparatus 200 may be a biplane X-ray diagnostic apparatus 200 having two arms and capable of fluoroscopy from two directions.
The couch 220 is supported on the floor and supports the top plate 221. Under the control of the controller 230, the couch 220 can slide the top plate 221 (in the X and Z axis directions), move it up and down (in the Y axis direction), and tilt it by means of the couch drive control mechanism 232. The gantry unit 210 has been described as an under-table tube type in which the X-ray irradiation device 212 is located below the top plate 221, but it may be an over-table tube type in which the X-ray irradiation device 212 is located above the top plate 221.
The controller 230 includes a CPU (Central Processing Unit: central processing unit) and a memory, which are not shown. The controller 230 controls the X-ray irradiation device 212, the X-ray detection device 215, and the C-arm 214 of the gantry unit 210 and the driving of the couch 220 to perform alignment according to the control of the image processing device 240. The controller 230 controls the operations of the X-ray irradiation device 212, the X-ray detection device 215, the C-arm drive control mechanism 231, and the like in accordance with the control of the image processing device 240, so as to perform the X-ray radiography or the X-ray fluoroscopy for the operation.
The image processing apparatus 240 is configured by a computer, and includes a processing circuit 241, a storage circuit 242, an input interface 243, a network interface 244, and a display 250.
The processing circuit 241 is a circuit having a dedicated or general-purpose processor, and performs operation control of the entire X-ray diagnostic apparatus 200 by executing software processing based on programs stored in the storage circuit 242. The processing circuit 241 controls the controller 230 based on input from the operator via the input interface 243, on various programs read from the storage circuit 242, and on various data. The processing circuit 241 also generates an X-ray image (a fluoroscopic image or a radiographic image) of the subject P based on the signals acquired by the gantry apparatus 210, and controls the X-ray diagnostic apparatus 200 so that the X-ray image for display stored in the storage circuit 242 is displayed on the display 250 or the like.
The storage circuit 242 stores various programs for performing control of the controller 230, image processing, display processing, and the like, diagnostic information, various data such as a diagnostic protocol, image data, and the like. The storage circuit 242 is implemented by a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, a hard disk, an optical disk, or the like.
The network interface 244 is an interface for communicating with various devices connected to the network 2 by wire or wirelessly. For example, the X-ray diagnostic apparatus 200 can exchange various data and images with the medical image processing apparatus 100, the ultrasound diagnostic apparatus 300, and the like via the network interface 244.
The input interface 243 includes an input device operable by an operator and an input circuit that inputs a signal from the input device. The input device is realized by a mouse, a keyboard, a trackpad with which an input operation is performed by touching an operation surface, a touch panel in which a display screen and a trackpad are integrated, a non-contact input circuit using an optical sensor, a voice input circuit, and the like.
The display 250 displays a GUI for accepting an instruction from an operator using the input interface 243, an X-ray image generated in the image processing apparatus 240, and the like. The display 250 displays various messages and display information in order to notify the operator of the processing status and processing result of the X-ray diagnostic apparatus 200. In addition, the display 250 may have a speaker and be capable of outputting as sound. The display 250 may display data, images, and the like received from various devices connected to the network 2, and various auxiliary images, auxiliary information, and the like generated by the medical image processing device 100 to assist the operation.
On the other hand, as shown in fig. 2, the ultrasonic diagnostic apparatus 300 is configured to include an ultrasonic catheter 3 in addition to an ultrasonic apparatus main body 310, an input interface 320, and a display 330. The ultrasonic catheter 3 is communicably connected to the ultrasonic device main body 310.
The ultrasonic device main body 310 is configured by a computer, and includes a transceiver circuit 311, a processing circuit 312, a storage circuit 313, and a network interface 314.
The transmitting/receiving circuit 311 supplies an ultrasonic drive signal to the ultrasonic catheter 3, and generates reflected wave data from the reflected wave signal received by the ultrasonic catheter 3.
The processing circuit 312 is a circuit having a dedicated or general-purpose processor, and performs overall operation control of the ultrasonic diagnostic apparatus 300 by executing software processing based on programs stored in the storage circuit 313. The processing circuit 312 controls the processing of the transceiver circuit 311 based on input from the operator via the input interface 320, on various programs read from the storage circuit 313, and on various data. The processing circuit 312 generates ultrasonic image data based on the reflected wave signals received by the ultrasonic catheter 3, and generates an ultrasonic image for display based on the generated ultrasonic image data. The processing circuit 312 also controls the ultrasonic diagnostic apparatus 300 so that the ultrasonic image for display stored in the storage circuit 313 is displayed on the display 330 or the like.
The storage circuit 313 stores various programs for performing ultrasonic transmission and reception, image processing, and display processing, diagnostic information, various data such as a diagnostic protocol, reflected wave data, image data, and the like. The storage circuit 313 is implemented by a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, a hard disk, an optical disk, or the like.
The network interface 314 is an interface for communicating with various devices connected to the network 2 by wire or wirelessly. For example, the ultrasound diagnostic apparatus 300 can exchange various data with the medical image processing apparatus 100, the X-ray diagnostic apparatus 200, and the like via the network interface 314.
The input interface 320 includes an input device operable by an operator and an input circuit that inputs a signal from the input device. The input device is realized by a mouse, a keyboard, a trackpad with which an input operation is performed by touching an operation surface, a touch panel in which a display screen and a trackpad are integrated, a non-contact input circuit using an optical sensor, a voice input circuit, and the like.
The display 330 displays a GUI for accepting an instruction from an operator using the input interface 320, an ultrasonic image generated in the ultrasonic device main body 310, and the like. The display 330 displays various messages and information for notifying the operator of the processing status and processing result of the ultrasonic device main body 310. In addition, the display 330 may have a speaker and output sound. The display 330 may display data, images, and the like received from various devices connected to the network 2, and various auxiliary images, auxiliary information, and the like generated by the medical image processing device 100 to assist the operation.
In fig. 2, an ultrasound catheter 3 for surgery is also illustrated. In the present specification, an ultrasonic probe that is inserted into a heart chamber of the subject P and is communicably connected to the ultrasonic device main body 310 is mainly referred to as an ultrasonic catheter 3. The ultrasound catheter 3 is inserted into a body cavity from a femoral vein, is advanced into the heart, and is configured to scan the left atrium, the left atrial appendage, the aorta, the mitral valve, the aortic valve, and the like to generate an ultrasound image. In addition, a device that is inserted into a tubular tissue or the like such as a blood vessel of the subject P, scans the tubular tissue or the like, and generates an ultrasonic image may be used as the ultrasonic catheter 3.
The ultrasonic catheter 3 is inserted into the body cavity of the subject P, and is operated in the body cavity. As shown in fig. 9, the ultrasonic catheter 3 has a detector 30 at its distal end portion, and a plurality of piezoelectric elements are provided on an array surface 31 of the detector 30. The plurality of piezoelectric elements generate ultrasonic waves from the ultrasonic device main body 310 based on the drive signal supplied through the cable 32, and receive reflected waves from the subject P and convert the reflected waves into electrical signals. The array surface 31 is a two-dimensional array capable of scanning a three-dimensional space. The array surface 31 may be a one-dimensional array capable of scanning a two-dimensional space.
In fig. 2, a surgical catheter 4 used in a procedure of the first embodiment is also illustrated. In the present specification, a thin medical instrument that is inserted mainly into a body cavity of the subject P or into a tubular tissue such as a blood vessel and that assists treatment, diagnosis, or a therapeutic procedure is referred to as a surgical catheter 4. The surgical catheter 4 is configured to include, for example, a thin tube called a catheter, a guide wire for guiding the catheter to a treatment target site, and a medical device 40 attached to the distal end portion of the catheter.
The device operation unit 41 is an instrument that is manually operated by an operator such as a doctor in order to insert the surgical catheter 4 into a blood vessel of the subject P and advance it to a predetermined target site. Here, the predetermined target site refers to a site targeted for treatment, diagnosis, or a therapeutic procedure, and may also be referred to as a region of interest in the body of the subject P.
The medical device 40 is used at a predetermined target site for treatment after the surgical catheter 4 is inserted into the body of the subject P. Examples of the medical device 40 include an occlusion device, a balloon, and a stent (for example, the medical device 40a of fig. 5).
The medical device 40 may be used at a predetermined target site in a therapeutic process after the surgical catheter 4 is inserted into the body of the subject P. The medical device 40 may be a puncture needle (e.g., medical device 40b of fig. 4), or the like. Further, a plurality of medical devices 40 may be provided in one surgical catheter 4.
Figs. 3 to 5 are views for explaining an example of the surgical site of the first embodiment and, as an example of the operation of the first embodiment, septal puncture and a left atrial appendage occlusion procedure. In a left atrial appendage occlusion procedure, in general, the ultrasound catheter 3 and the surgical catheter 4 are inserted into the body of the subject P from a blood vessel and then advanced to the right atrium, the left atrium, and the vicinity of the left atrial appendage, which is the treatment target site.
Specifically, as shown in fig. 4, septal puncture is performed so that the ultrasound catheter 3 and the surgical catheter 4 can be advanced to the vicinity of the left atrial appendage, which is the treatment target site. Septal puncture refers to the process of making an opening in the atrial septum using a needle during treatment. The operator observes one or both of the medical device 40b and the predetermined target site to be treated in the therapeutic procedure. Therefore, the operator manipulates both the ultrasound catheter 3 and the surgical catheter 4 to advance the ultrasound catheter 3 to a position where both the medical device 40 and the target site can be favorably depicted. For example, in septal puncture, a region of interest in the body of the subject P in the vicinity of the atrial septum is observed through the ultrasound catheter 3.
Next, as shown in fig. 5, first, in order to determine the size of the occlusion device, the size of the left atrial appendage, which is the treatment target site, is measured by the ultrasound catheter 3. Second, an occlusion device sized for the left atrial appendage is guided to the left atrium. Third, immediately before the occlusion device is released, contrast agent is injected to observe the shape of the left atrial appendage on the X-ray image. Fourth, the occlusion device at the distal end portion of the surgical catheter 4 is expanded and released so as to occlude the left atrial appendage. In the fourth stage, the operator observes the condition of the treatment target site, such as the state of contact between the left atrial appendage and the occlusion device, and performs treatment such as repositioning of the occlusion device according to the observed condition.
That is, in the fourth stage, in order to observe either or both of the medical device 40 and the predetermined target site to be treated, the operator operates both the ultrasound catheter 3 and the surgical catheter 4 to advance the ultrasound catheter 3 to a position where both the medical device 40 and the treatment target site are well depicted. For example, in the left atrial appendage occlusion procedure, a region of interest in the body of the subject P in the vicinity of the left atrial appendage is observed through the ultrasound catheter 3.
In figs. 3 to 5, the left atrial appendage occlusion procedure and the occlusion device are described as an example of the operation of the first embodiment, but the present embodiment does not exclude other procedures and therapeutic processes, such as atrial septal defect occlusion, mitral valve occlusion, and balloon aortic valvuloplasty. Medical devices 40 other than puncture needles and occlusion devices are likewise not excluded. In addition, although procedures and treatment using the medical device 40 are described as an example of the operation of the first embodiment, diagnosis using the medical device 40 and the like are not excluded in the present embodiment.
Fig. 6 is a schematic diagram for explaining a configuration example of the medical image processing apparatus according to the first embodiment. As shown in fig. 6, the medical image processing apparatus 100 according to the first embodiment can be connected to the X-ray diagnostic apparatus 200 and the ultrasonic diagnostic apparatus 300, and is configured as a computer such as a workstation or a personal computer. The medical image processing apparatus 100 provides the operator with images and information for assisting the procedure.
The medical image processing apparatus 100 includes at least a processing circuit 110, a storage circuit 120, an input interface 130, and a network interface 140.
The medical image processing apparatus 100 may further include a display 150. The display 150 provides the operator with travel assistance information of the ultrasound catheter 3 generated by the medical image processing apparatus 100. The display 150 may be a large-sized display device disposed at a position easily visible to an operator, or may have a speaker and be capable of outputting travel assistance information as sound. The display 150 may display various images such as an image for assisting a surgery generated by the medical image processing apparatus 100, data received from various apparatuses connected to the network 2, and images, in addition to data such as the travel assist information generated by the processing circuit 110.
The network interface 140 is an interface for communicating with various devices connected to the network 2 by wire or wireless. For example, the medical image processing apparatus 100 can exchange various data with the X-ray diagnostic apparatus 200, the ultrasound diagnostic apparatus 300, and the like through the network interface 140.
The input interface 130 includes an input device operable by an operator and an input circuit that inputs a signal from the input device. The input device is realized by a mouse, a keyboard, a trackpad with which an input operation is performed by touching an operation surface, a touch panel in which a display screen and a trackpad are integrated, a non-contact input circuit using an optical sensor, a voice input circuit, and the like.
The storage circuit 120 is constituted by, for example, a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, a hard disk, an optical disk, or the like. The storage circuit 120 stores various processing programs (including an OS (Operating System) and the like in addition to application programs) used in the processing circuit 110, and data necessary for execution of the programs. The storage circuit 120 may also store various data such as image data input through the input interface 130 or the network interface 140.
The processing circuit 110 has a dedicated or general-purpose processor, and performs various functions described below by executing software processing performed by a program stored in the storage circuit 120. The processing circuit 110 realizes the functions of the first acquisition function F01, the second acquisition function F02, the detection function F03, the determination function F04, and the calculation function F05.
These functions are described using the flowchart shown in fig. 7 and the explanatory diagrams shown in fig. 8 to 11. Fig. 7 is a flowchart showing an example of the operation of the medical image processing apparatus 100 according to the first embodiment, or a medical image processing program.
In step ST10, an ultrasound image generated by the ultrasonic diagnostic apparatus 300 from signals collected by the detector 30 of the ultrasound catheter 3 inserted into the body of the subject P to be diagnosed or treated is acquired. The process of acquiring the ultrasound image of the subject P in step ST10 is performed by the first acquisition function F01.
In step ST11, an X-ray image of the subject P to be diagnosed or treated, which is imaged using the X-ray diagnostic apparatus 200, is acquired. The X-ray image acquired in step ST11 is an X-ray image depicting an imaging object, which is at least one of the detector 30 of the ultrasound catheter 3, the medical device 40 inserted into the body of the subject P, and the region of interest in the body of the subject.
Fig. 8 is a diagram for explaining the X-ray image capturing of the first embodiment. As shown in fig. 8, the X-ray image in step ST11 is a photographic image (still image) or a perspective image (moving image) obtained by irradiating X-rays from at least two directions so as to include the same portion. The process of acquiring X-ray image data of the subject P in step ST11 is performed by the second acquisition function F02.
In step ST12, the position of the detector 30 of the ultrasound catheter 3 depicted in the X-ray images acquired in step ST11 is detected. The position of the detector 30 of the ultrasound catheter 3 in, for example, the X-ray coordinate system is determined from the plurality of X-ray images including the same portion acquired in step ST11, for example, on the principle of a stereo camera or the like. The process of detecting the position of the detector 30 of the ultrasound catheter 3 in step ST12 is performed by the detection function F03. Here, as shown in fig. 2, the X-ray coordinate system is a right-handed coordinate system in which, of the three body axes of the subject, the left-right axis is the X axis, the dorsoventral axis is the Y axis, and the craniocaudal axis is the Z axis.
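The stereo-camera-style determination mentioned above can be sketched as follows. This is a minimal illustration, not taken from the patent: it assumes that each of the two X-ray views has a known 3x4 projection matrix mapping X-ray-coordinate points to image pixels, and that the detector 30 has already been detected as a 2D point in each view.

```python
# Minimal sketch: linear (DLT) triangulation of the detector position from two
# calibrated X-ray projections. P_a, P_b are assumed 3x4 projection matrices;
# uv_a, uv_b are the detected 2D positions of the detector 30 in each image.
import numpy as np

def triangulate_detector(P_a, P_b, uv_a, uv_b):
    rows = []
    for P, (u, v) in ((P_a, uv_a), (P_b, uv_b)):
        rows.append(u * P[2] - P[0])       # each 2D observation gives two
        rows.append(v * P[2] - P[1])       # linear constraints on X
    A = np.stack(rows)                     # homogeneous system A @ X = 0
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                    # (x, y, z) in the X-ray coordinate system
```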
Fig. 9 is a diagram for explaining positional information of the ultrasonic detector and a field of view in an ultrasonic image according to the first embodiment. As shown in fig. 9, the positional information of the detector 30 of the ultrasound catheter 3 includes, for example, the barycentric position of the detector 30 of the ultrasound catheter 3 in the X-ray coordinate system and the normal vector of the array surface 31 of the detector 30 extending from that barycentric position.
In fig. 9, the array surface 31 of the detector 30 of the ultrasound catheter 3 is a two-dimensional array. In the present specification, each of the plurality of fan-shaped two-dimensional ultrasound images generated from the collected ultrasound data, with the position of the detector 30 of the ultrasound catheter 3 and the angle of the array surface 31 of the detector 30 fixed, is referred to as a "view". In addition, the range corresponding to the three-dimensional ultrasound image over which the plurality of "views" extend, that is, the range corresponding to the set of all "views", is referred to as the "field of view".
The "field of view" can be calculated by determining the scanning range of the detector 30 in a state where the position of the detector 30 of the ultrasonic catheter 3 and the angle of the array surface 31 of the detector 30 are fixed.
Returning to fig. 7, in step ST13, it is determined whether or not the imaging target object, which is at least one of the medical device 40 and the region of interest in the body of the subject, is within the field of view of the detector 30 of the ultrasound catheter 3, using the ultrasound image acquired in step ST 10. Specifically, when an imaging object, which is at least one of the medical device 40 and a region of interest in the body of the subject, is depicted in the ultrasound image acquired in step ST10, it is determined that the imaging object is within the field of view of the detector 30. In addition, when the imaging object, which is at least one of the medical device 40 and the region of interest in the body of the subject, is not depicted in the ultrasound image acquired in step ST10, it is determined that the imaging object is not within the field of view of the detector 30. The process of determining whether or not the object to be imaged enters the field of view of the detector 30 of the ultrasonic catheter 3 in step ST13 is performed by the determination function F04.
In addition, if the "field of view" is constituted by one "view", that is, if the field of view of the detector 30 is two-dimensional, the process of determining whether or not the imaging object enters the field of view of the detector 30 of the ultrasonic catheter 3 in step ST13 can be performed.
Step ST14 is performed when "yes" in step ST13, that is, when a part or all of the imaging target object, which is at least one of the medical device 40 and the region of interest in the body of the subject, is within the field of view of the detector 30 of the ultrasound catheter 3. In step ST14, the position of the imaging target object, which is at least one of the medical device 40 and the region of interest in the body of the subject, is detected from at least one of the ultrasound image acquired in step ST10 and the X-ray image acquired in step ST 11. The processing of detecting the position of the imaging object in step ST14 is performed by the detection function F03.
When the position detection of the medical device 40 in step ST14 is performed based on the ultrasonic image acquired in step ST10, the position of the center of gravity of the medical device 40 in the X-ray coordinate system is detected based on the position of the medical device 40 depicted in the ultrasonic image, for example, the position of the medical device 40 in the field of view.
For example, a case where the portion of the medical device 40 depicted in the ultrasound image does not include the center of gravity, such as a case where only an end portion of the medical device 40 is depicted in the ultrasound image, will be described taking the medical device 40a of fig. 5 as an example. As shown in fig. 5, the position of the center of gravity of the medical device 40a (40) in the X-ray coordinate system may be estimated based on the size of the treatment target site, which can serve as the diameter L of the medical device 40a (40), and on the shape of the medical device 40a (40) after expansion (e.g., sphere, elliptical sphere, circular ring, cylinder, etc.). The size of the treatment target site may be, for example, data measured when determining the size of the medical device 40a (40), or may be obtained from an X-ray image taken to observe the shape of the treatment target site immediately before the medical device 40a (40) is released.
When the detection in step ST14 of the position of the imaging target, which is at least one of the medical device 40 and the region of interest in the body of the subject, is performed based on the X-ray images acquired in step ST11, the barycentric position of the imaging target in the X-ray coordinate system may be detected from the plurality of X-ray images acquired in step ST11, for example, on the principle of a stereo camera or the like.
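As one concrete illustration of the centroid estimation described above, the following sketch assumes that the expanded shape of the medical device 40 is a sphere of known diameter L and that a set of 3D surface points of the visible portion (obtained, for example, by segmentation of the ultrasound image or stereo detection in the X-ray images) is already available in the X-ray coordinate system; these inputs and the function name are assumptions, not taken from the document.

```python
# Minimal sketch: algebraic least-squares sphere fit to the visible surface
# points of a partially depicted device. Each point p on the sphere satisfies
# |p|^2 = 2 p.c + (r^2 - |c|^2), which is linear in the unknowns c and d.
import numpy as np

def estimate_device_center(surface_pts, known_diameter=None):
    p = np.asarray(surface_pts, dtype=float)          # (N, 3) visible surface points
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])    # unknowns: cx, cy, cz, d
    b = np.sum(p * p, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, d = sol[:3], sol[3]
    radius = float(np.sqrt(d + center @ center))
    if known_diameter is not None and abs(2.0 * radius - known_diameter) > 0.2 * known_diameter:
        print("warning: fitted size disagrees with the known device diameter L")
    return center
```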
Step ST15 is performed when step ST13 is "no", that is, when the imaging target object, which is at least one of the medical device 40 and the region of interest in the body of the subject, does not enter the field of view of the detector 30 of the ultrasound catheter 3. In step ST15, the position of the imaging target object, which is at least one of the medical device 40 and the region of interest in the body of the subject, depicted in the X-ray image acquired in step ST11 is detected. From the plurality of X-ray images acquired in step ST11, the position of the center of gravity of the imaging object in the X-ray coordinate system, which is at least one of the medical device 40 and the region of interest in the body of the subject, is detected as the position of the imaging object by, for example, the principle of a stereo camera or the like. The processing of detecting the position of the imaging object in step ST15 is performed by the detection function F03.
In step ST16, the field of view of the detector 30 is calculated from the position of the detector 30 of the ultrasound catheter 3 detected in step ST12, that is, from the barycentric position of the array surface 31 constituting the detector 30, the normal vector calculated from the orientation of the array surface 31, and the predetermined (known) scanning range of the detector 30. As shown in fig. 9, the field of view of the detector 30 can be calculated from the scanning range of the detector 30 in a state where the position of the detector 30 of the ultrasound catheter 3 and the angle of the array surface 31 of the detector 30 are fixed. The process of calculating the field of view of the detector 30 of the ultrasound catheter 3 in step ST16 is performed by the calculation function F05.
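The field-of-view calculation can be illustrated with a simple geometric test. In the sketch below the field of view is approximated as a cone opening along the normal vector of the array surface 31, with an assumed depth and half-angle; an actual ICE probe has a fan-shaped scan range, so the shape, the parameter values, and the function name are illustrative assumptions only.

```python
# Minimal sketch: test whether a 3D point lies inside a cone-shaped approximation
# of the detector's field of view, defined by the barycentric position of the
# array surface, its normal vector, a maximum depth, and a half-angle.
import numpy as np

def point_in_fov(point, centroid, normal, max_depth_mm=60.0, half_angle_deg=45.0):
    n = np.asarray(normal, float)
    n /= np.linalg.norm(n)                          # unit viewing direction
    v = np.asarray(point, float) - np.asarray(centroid, float)
    depth = float(v @ n)                            # distance along the normal
    if depth <= 0.0 or depth > max_depth_mm:
        return False
    angle = np.degrees(np.arccos(np.clip(depth / np.linalg.norm(v), -1.0, 1.0)))
    return angle <= half_angle_deg
```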
In step ST17, travel assistance information for causing the ultrasound catheter 3 to travel so that the imaging target enters the field of view of the detector 30 of the ultrasound catheter 3 is calculated based on the position of the detector 30 of the ultrasound catheter 3 and the position of the imaging target, which is at least one of the medical device 40 and the region of interest in the body of the subject. Alternatively, travel assistance information for causing the ultrasound catheter 3 to travel so that the imaging target approaches the center of the field of view of the detector 30 of the ultrasound catheter 3 is calculated. The process of calculating the travel assistance information of the ultrasound catheter 3 in step ST17 is performed by the calculation function F05.
Figs. 10 and 11 are perspective views for explaining examples of (a) before and (b) after the travel assistance of the ultrasound catheter 3 when all (fig. 10) or part (fig. 11) of the medical device 40 of the first embodiment is outside the field of view of the detector 30 of the ultrasound catheter 3.
Next, a case where the imaging target in step ST17 is the medical device 40 will be described with reference to figs. 10 and 11. However, the imaging target is not limited to the medical device 40. The travel assistance of the ultrasound catheter 3 may also be performed when part or all of the imaging target is outside the field of view of the detector 30 of the ultrasound catheter 3 in the case where the imaging target is a region of interest in the subject, or in the case where the imaging target includes both the medical device 40 and a region of interest in the subject.
As shown in fig. 10 (a) and fig. 11 (a), in step ST17, when part or all of the medical device 40 is outside the field of view of the detector 30 of the ultrasound catheter 3, the travel amount and travel direction by which the ultrasound catheter 3 should travel, and the angle of the array surface 31 of the detector 30, are calculated so as to bring the medical device 40 into the field of view of the detector 30 of the ultrasound catheter 3, or so as to bring the medical device 40 close to the center of that field of view. In the present specification, the travel amount and travel direction may also be a retraction amount and a retraction direction; in other words, they may be a movement amount and a movement direction.
In step ST17, even when the medical device 40 is moved from a state in which all of the medical device 40 is within the field of view of the detector 30 of the ultrasound catheter 3, the travel amount and travel direction of the ultrasound catheter 3, or the angle of the array surface 31 of the detector 30, can be calculated so that part or all of the medical device 40 does not deviate from the field of view of the detector 30 of the ultrasound catheter 3.
As shown in fig. 10 (a) and fig. 11 (a), the travel amount and travel direction of the ultrasound catheter 3 are, for example, the amount by which the barycentric position of the detector 30 of the ultrasound catheter 3 in the X-ray coordinate system is moved from a position (x1, y1, z1) to a position (x2, y2, z2). The angle of the array surface 31 of the detector 30 is an amount of rotation about the axis of the detector 30.
When part or the whole of the medical device 40 is not within the field of view of the detector 30, the overlap in three-dimensional space between the field of view of the detector 30 and the whole of the medical device 40 can be determined, for example, from the barycentric position of the medical device 40 in the X-ray coordinate system obtained in step ST14 or step ST15, the size of the treatment target site that can serve as the diameter of the medical device 40, and the expanded shape of the medical device 40. From this overlap, the travel amount and travel direction of the ultrasound catheter 3 and the angle of the array surface 31 of the detector 30 required for the whole of the medical device 40 to overlap the field of view of the detector 30 in three-dimensional space can be calculated.
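One simple way to obtain concrete values of the travel amount and travel direction is to translate the detector so that the centroid of the imaging target coincides with the center of the field of view, keeping the array orientation fixed. The sketch below takes the "center of the field of view" to be the point at half the maximum imaging depth along the normal vector of the array surface 31; that choice, the parameter values, and the function name are assumptions for illustration.

```python
# Minimal sketch: translation that moves the device centroid to the assumed
# center of the field of view. The detector centroid moves from (x1, y1, z1)
# to (x2, y2, z2); the returned amount/direction are the travel assistance values.
import numpy as np

def travel_to_center(detector_centroid, array_normal, target_centroid, max_depth_mm=60.0):
    det = np.asarray(detector_centroid, float)         # (x1, y1, z1)
    n = np.asarray(array_normal, float)
    n /= np.linalg.norm(n)
    fov_center = det + 0.5 * max_depth_mm * n          # assumed FOV center
    travel_vector = np.asarray(target_centroid, float) - fov_center
    travel_amount = float(np.linalg.norm(travel_vector))
    travel_direction = travel_vector / travel_amount if travel_amount > 0 else n
    new_centroid = det + travel_vector                 # (x2, y2, z2)
    return new_centroid, travel_amount, travel_direction
```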
The travel amount and travel direction of the ultrasound catheter 3 and the angle of the array surface 31 of the detector 30 are preferably such that the medical device 40 is "maximally converged" within the field of view of the detector 30 of the ultrasound catheter 3. In the present specification, "maximally converged" means a state in which the medical device 40 is entirely within the field of view of the detector 30 of the ultrasound catheter 3 and is depicted as large as possible within that field of view.
When part of the medical device 40 is within the field of view of the detector 30 of the ultrasound catheter 3, the travel direction of the ultrasound catheter 3 may be determined according to which of the views of the plurality of ultrasound images acquired in step ST10 depict the medical device 40. For example, in the case where the field of view of the detector 30 of the ultrasound catheter 3 is formed of a range of 10 views from view V1 to view V10, if the medical device 40 is depicted only in view V1 and view V2, the medical device 40 is considered to lie outside the field of view of the detector 30 on the view V1 side, and the direction from the field of view toward the view V1 side may be taken as the travel direction of the ultrasound catheter 3. The number of views is not limited to 10.
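The view-based heuristic above can be written compactly as in the following sketch. The view indexing V1..V10 follows the example above; how a "toward the V1 side" decision is mapped to a 3D travel direction is outside this sketch and would depend on the probe geometry.

```python
# Minimal sketch: decide on which side of the view range V1..Vn the device lies,
# given the (1-based) indices of the views in which it is depicted.
def travel_side_from_views(views_with_device, n_views=10):
    if not views_with_device:
        return None                                   # device not depicted in any view
    mid = (n_views + 1) / 2.0
    mean_index = sum(views_with_device) / len(views_with_device)
    if mean_index < mid:
        return "toward_V1_side"
    if mean_index > mid:
        return f"toward_V{n_views}_side"
    return "centered"

# Example: device depicted only in views V1 and V2.
print(travel_side_from_views([1, 2]))                 # -> "toward_V1_side"
```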
The travel assistance information of the ultrasound catheter 3 also includes information for determining a travel speed of the ultrasound catheter 3 suited to the tissue or organ of the subject P located at the position of the distal end of the ultrasound catheter 3, and information on a recommended route along which the ultrasound catheter 3 travels at the determined travel speed, or detours, so as to avoid striking the tissue or organ of the subject P.
Returning to fig. 7, in step ST18, the travel assistance information of the ultrasound catheter 3 calculated in step ST17 is output. The travel assistance information of the ultrasound catheter 3 is output to the display 150 and displayed on the display 150. The travel assistance information of the ultrasound catheter 3 may also be output to and displayed on at least one of the display 250 of the X-ray diagnostic apparatus 200 and the display 330 of the ultrasonic diagnostic apparatus 300.
The travel assistance information of the ultrasound catheter 3 output to the display includes at least part of the travel assistance information calculated in step ST17.
By providing the operator with the travel assistance information of the ultrasound catheter 3, a reduction in the number of operations of the ultrasound catheter 3 can be expected, and the workflow is further improved because the operator can concentrate on operating the surgical catheter 4.
In the first embodiment, steps ST10, ST11, and ST12 need only be performed such that step ST12 comes after step ST11; they are not limited to the order of step ST10, step ST11, and step ST12, and may be performed in the order of step ST11, step ST12, and step ST10, or in the order of step ST11, step ST10, and step ST12.
(modification 1 of the first embodiment)
Fig. 12 is a flowchart showing an example of the operation of the medical image processing apparatus according to modification 1 of the first embodiment, or a medical image processing program. As shown in fig. 12, step ST13 may be omitted, in which case step ST15 follows step ST12.
(second embodiment)
Fig. 13 is a diagram showing a configuration example of the medical image processing apparatus 100 according to the second embodiment. The second embodiment (fig. 13) differs from the first embodiment (fig. 6) in that the processing circuit 110 of the medical image processing apparatus 100 shown in fig. 13 has an image output function F06. The image output function F06 is a function of outputting an ultrasound image.
Fig. 14 is a flowchart showing an example of the operation of the medical image processing apparatus 100 according to the second embodiment, or a medical image processing program. As shown in fig. 14, step ST20 is a step performed after step ST18. In step ST20, an ultrasound image in which part or all of the imaging target, which is at least one of the medical device 40 and a region of interest in the body of the subject, is depicted is extracted from among the plurality of ultrasound images acquired in step ST10, and the extracted ultrasound image is output. The process of outputting the ultrasound image in step ST20 is performed by the image output function F06.
The method of extracting the ultrasound image in step ST20 may be any method that extracts an ultrasound image in which part or all of the imaging target, which is at least one of the medical device 40 and the region of interest in the body of the subject, is depicted. For example, from the viewpoint of outputting an ultrasound image focused on the treatment target site, the ultrasound image of step ST20 may be an ultrasound image in which both the treatment target site and the medical device 40 are depicted.
Fig. 15 is a diagram for explaining an example of an ultrasonic image according to the second embodiment. In the case of the left atrial appendage occlusion operation illustrated in fig. 15, for example, if the ultrasound image (the range of the view from the view V4 to the view V8 in fig. 15) in which both the left atrial appendage and the occlusion device (the medical device 40) are drawn out of the field of view of the ultrasound catheter 3 (the set of all the views from the view V1 to the view V10 in fig. 15) is the ultrasound image of step ST20, it is easy to focus on the observation of the treatment target site such as the adhesion state of the left atrial appendage and the occlusion device.
In step ST20, the order in which the plurality of extracted ultrasound images are output is not particularly limited, but may be set so that the condition of the treatment target site can be easily grasped. Fig. 15 also serves to explain an example of the output order of the ultrasound images according to the second embodiment. The extracted ultrasound images may be output back and forth, in order from the ultrasound image at one end of the extracted view range toward the ultrasound image at the other end; for example, with reference to fig. 15, the sequence view V4, view V5, view V6, view V7, view V8, view V7, view V6, view V5, view V4, view V5, ... is repeated. The view from which output starts is arbitrary; the output may also start from an ultrasound image near the center of the extracted view range and reciprocate between the ultrasound images at both ends of the range, for example repeating the sequence view V6, view V7, view V8, view V7, view V6, view V5, view V4, view V5, view V6, view V7, view V8, view V7, ... with reference to fig. 15.
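The two output orders described above (back and forth from one end, or starting near the center of the extracted range) can be generated as in the following sketch; the view labels V4..V8 follow the fig. 15 example, and the generator-based formulation is only one possible implementation.

```python
# Minimal sketch: repeat the extracted views back and forth ("ping-pong" order),
# optionally starting from a view near the center of the extracted range.
from itertools import cycle, islice

def ping_pong(views, start=None):
    seq = views + views[-2:0:-1]            # e.g. V4 V5 V6 V7 V8 then V7 V6 V5
    if start is not None:
        i = seq.index(start)
        seq = seq[i:] + seq[:i]             # rotate so output begins at `start`
    return cycle(seq)

views = ["V4", "V5", "V6", "V7", "V8"]
print(list(islice(ping_pong(views), 10)))               # V4 V5 V6 V7 V8 V7 V6 V5 V4 V5
print(list(islice(ping_pong(views, start="V6"), 12)))   # V6 V7 V8 V7 V6 V5 V4 V5 V6 V7 V8 V7
```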
Fig. 16 is a perspective view for explaining an example of (a) before and (b) after the travel assistance of the ultrasound catheter 3 based on the ultrasound image of the second embodiment. Fig. 16 (a) is the same view as fig. 15 and depicts the plurality of extracted ultrasound images. As described above, the travel amount and travel direction of the ultrasound catheter 3 and the angle of the array surface 31 of the detector 30 are preferably such that part or all of the imaging target, which is at least one of the medical device 40 and the region of interest in the body of the subject, is "maximally converged" within the field of view of the detector 30 of the ultrasound catheter 3.
Therefore, as shown in fig. 16 (b), travel assistance information for causing the ultrasound catheter 3 to travel so that the imaging target enters the field of view of the detector 30 may be calculated. Alternatively, travel assistance information for causing the ultrasound catheter 3 to travel so that the imaging target approaches the center of the field of view of the detector 30 may be calculated. In this case, the travel assistance information is calculated based on, for example, the ultrasound images extracted from among the plurality of ultrasound images acquired in step ST10 as depicting part or all of the imaging target.
The output of the ultrasound image in step ST20 may be performed on the display 150, or may be performed on at least one of the display 250 provided in the X-ray diagnostic apparatus 200 and the display 330 provided in the ultrasound diagnostic apparatus 300.
For example, by extracting and outputting ultrasound images in which part or the whole of the medical device 40 is depicted, the operator's burden in searching for the medical device 40 and for ultrasound images of the treatment target site is reduced. In addition, for example, by extracting and outputting ultrasound images that depict the region of interest in the body of the subject, the target site can be observed easily during treatment or a procedure.
(third embodiment)
Fig. 17 is a block diagram showing a configuration example of the X-ray diagnostic system 1 including the medical image processing apparatus according to the third embodiment. The medical image processing apparatus 100 according to the third embodiment has a function of controlling an auxiliary robot 6. Here, the auxiliary robot 6 is, for example, a device capable of inserting the ultrasound catheter 3 or the surgical catheter 4 and advancing it to the treatment target site of the subject, either in response to a user operation via a console provided at a remote location or based on control data received from a remote location.
As shown in Fig. 17, the auxiliary robot 6 includes a robot main body 60 in addition to the console. The robot main body 60 is disposed near the diagnostic bed 220, inserts the ultrasound catheter 3 or the surgical catheter 4 into the subject P, and advances it to the treatment target site of the subject. The console of the auxiliary robot 6 may be disposed at a location away from the operating room, or may be disposed in the operating room. The auxiliary robot 6 may also be connected to the X-ray diagnostic system 1 via the network 2.
Fig. 18 is a diagram showing a configuration example of the medical image processing apparatus 100 according to the third embodiment. The processing circuit 110 of the medical image processing apparatus 100 according to the third embodiment differs from those of the first embodiment (Fig. 6) and the second embodiment (Fig. 13) in that it has a robot control function F07. The robot control function F07 is a function of transmitting and receiving control data among the medical image processing apparatus 100, the X-ray diagnostic apparatus 200, and the auxiliary robot 6, which is a separate component, and of controlling the operation of the auxiliary robot 6.
The robot control function F07 of the third embodiment converts the calculated travel assistance information of the ultrasound catheter 3 into control data for the auxiliary robot 6 and outputs the control data to the auxiliary robot 6. The operation of the auxiliary robot 6 is thereby controlled so that the ultrasound catheter 3 travels based on the travel assistance information.
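For illustration, one possible form of such a conversion is sketched below; the `TravelAssistance` and `RobotCommand` structures, the command names, and the safety clamp are assumptions, since the embodiment does not specify the auxiliary robot's control protocol.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class TravelAssistance:
    travel_mm: float      # recommended advance (+) or retract (-) amount
    rotation_deg: float   # recommended rotation of the array surface


@dataclass
class RobotCommand:
    axis: str
    value: float


def to_robot_commands(info: TravelAssistance,
                      max_step_mm: float = 2.0) -> List[RobotCommand]:
    """Translate travel assistance information into small, bounded robot motions."""
    step = max(-max_step_mm, min(max_step_mm, info.travel_mm))  # clamp each motion
    return [RobotCommand("advance", step), RobotCommand("rotate", info.rotation_deg)]
```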
Based on the travel assistance information related to the recommended path, the robot control function F07 may also determine a travel speed of the ultrasound catheter 3 suited to the tissue or organ of the subject P located at the distal end of the ultrasound catheter 3, and control the operation of the auxiliary robot 6 so that the ultrasound catheter 3 travels at the determined speed, or so that it travels along a detour to avoid colliding with the tissue or organ of the subject P. For example, near a moving organ such as the heart, the operation of the auxiliary robot 6 may be controlled so as to reduce the travel speed of the medical device.
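For illustration, the speed adaptation could take the form of a simple lookup such as the following; the organ labels and speed values are placeholders, and only the behaviour of slowing down near a moving organ such as the heart comes from the description above.

```python
# Placeholder advance speeds in mm/s; only the behaviour of slowing down near a
# moving organ such as the heart comes from the description above.
SPEED_MM_PER_S = {"vessel": 5.0, "atrium": 1.0, "heart_wall": 0.5}


def travel_speed(organ_at_tip: str, default_mm_per_s: float = 2.0) -> float:
    """Pick a conservative advance speed for the tissue at the catheter tip."""
    return SPEED_MM_PER_S.get(organ_at_tip, default_mm_per_s)
```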
According to at least one embodiment described above, the operator can be assisted in appropriately advancing the ultrasound catheter.
In the above embodiment, the term "processor" means, for example, a dedicated or general-purpose CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an application-specific integrated circuit (Application Specific Integrated Circuit: ASIC), or a programmable logic device (for example, a simple programmable logic device (Simple Programmable Logic Device: SPLD), a complex programmable logic device (Complex Programmable Logic Device: CPLD), or a field programmable gate array (Field Programmable Gate Array: FPGA)).
In the above embodiment, an example in which a single processor of the processing circuit realizes each function has been described, but the processing circuit may instead be configured by combining a plurality of independent processors, with each processor realizing a corresponding function. When a plurality of processors are provided, a memory circuit storing a program may be provided for each processor, or a single memory circuit may collectively store the programs corresponding to the functions of all the processors.
While several embodiments of the present invention have been described, these embodiments are presented as examples and are not intended to limit the scope of the invention. These embodiments can be implemented in various other modes, and various omissions, substitutions, and changes can be made without departing from the spirit of the invention. These embodiments and modifications are included in the scope and gist of the invention, and are also included in the invention described in the claims and their equivalents.

Claims (15)

1. A medical image processing apparatus includes a processing circuit,
the processing circuit:
acquiring an ultrasound image generated from a signal collected by a detector of an ultrasound catheter inserted into a body of a subject;
acquiring an X-ray image depicting an imaging object, the imaging object being at least one of the detector of the ultrasound catheter, a medical device inserted into the body of the subject, and a region of interest in the body of the subject;
detecting (a) a position of the detector of the ultrasound catheter depicted in the X-ray image, and (b) a position of the imaging object depicted in at least one of the ultrasound image and the X-ray image; and
calculating, based on the detected position of the detector of the ultrasound catheter and the detected position of the imaging object, travel assistance information for causing the ultrasound catheter to travel so that the imaging object enters the field of view of the detector or travel assistance information for causing the ultrasound catheter to travel so that the imaging object approaches the center of the field of view of the detector.
2. The medical image processing apparatus according to claim 1, wherein,
the processing circuit
further determines whether the imaging object is within the field of view of the detector, and
determines that the imaging object is within the field of view of the detector when the imaging object is depicted in the acquired ultrasound image, and determines that the imaging object is not within the field of view of the detector when the imaging object is not depicted in the acquired ultrasound image.
3. The medical image processing apparatus according to claim 2, wherein,
the processing circuit,
when the imaging object is within the field of view of the detector,
detects the position of the imaging object within the field of view from the ultrasound image, and detects the position of the detector of the ultrasound catheter from the X-ray image.
4. The medical image processing apparatus according to claim 2, wherein,
the processing circuit,
when the imaging object is not within the field of view of the detector,
detects the position of the detector of the ultrasound catheter and the position of the imaging object from the X-ray image.
5. The medical image processing apparatus according to claim 2, wherein,
the processing circuit,
when the imaging object is not within the field of view of the detector,
detects, from the X-ray image, a position of a center of gravity of an array surface constituting the detector of the ultrasound catheter and an orientation of the array surface, and
calculates the field of view of the detector from the position of the center of gravity of the array surface, a normal vector calculated from the orientation of the array surface, and a predetermined scanning range of the detector.
6. The medical image processing apparatus according to claim 1 or 2, wherein,
the travel assistance information calculated by the processing circuit includes at least one of a travel amount of the ultrasound catheter, a travel direction of the ultrasound catheter, an angle of an array surface of the ultrasound catheter, and recommended path information of the ultrasound catheter.
7. The medical image processing apparatus according to claim 1 or 2, further comprising
a display that provides the travel assistance information of the ultrasound catheter to an operator.
8. The medical image processing apparatus according to claim 1 or 2, wherein,
the processing circuit further controls an auxiliary robot that assists operation of the ultrasound catheter.
9. The medical image processing apparatus according to claim 7, wherein,
the processing circuit extracts, from among the plurality of acquired ultrasound images, an ultrasound image in which part or all of the imaging object is depicted, and outputs the extracted ultrasound image to the display.
10. The medical image processing apparatus according to claim 1, wherein,
the processing circuit:
extracting, from among the plurality of acquired ultrasound images, an ultrasound image in which part or all of the imaging object is depicted; and
calculating, based on the extracted ultrasound image, travel assistance information for causing the ultrasound catheter to travel so that the imaging object enters the field of view of the detector or travel assistance information for causing the ultrasound catheter to travel so that the imaging object approaches the center of the field of view of the detector.
11. The medical image processing apparatus according to claim 9, wherein,
the processing circuit outputs the plurality of extracted ultrasound images back and forth in order, from the ultrasound image at one end of the extracted range toward the ultrasound image at the other end, or outputs them back and forth between the ultrasound images at both ends of the extracted range, starting from an ultrasound image near the center of the extracted range.
12. The medical image processing apparatus according to claim 1, wherein,
the region of interest in the body of the subject is a site where treatment, diagnosis, or assistance of a treatment process is performed.
13. An X-ray diagnostic system having a medical image processing apparatus and a network interface capable of communicating with the medical image processing apparatus via a network,
the medical image processing device is provided with a processing circuit,
the processing circuit:
acquiring an ultrasound image generated from a signal collected by a detector of an ultrasound catheter inserted into a body of a subject;
acquiring an X-ray image depicting an imaging object, the imaging object being at least one of the detector of the ultrasound catheter, a medical device inserted into the body of the subject, and a region of interest in the body of the subject;
detecting (a) a position of the detector of the ultrasound catheter depicted in the X-ray image, and (b) a position of the imaging object depicted in at least one of the ultrasound image and the X-ray image; and
calculating, based on the detected position of the detector of the ultrasound catheter and the detected position of the imaging object, travel assistance information for causing the ultrasound catheter to travel so that the imaging object enters the field of view of the detector or travel assistance information for causing the ultrasound catheter to travel so that the imaging object approaches the center of the field of view of the detector.
14. The X-ray diagnostic system of claim 13, wherein,
the processing circuitry further controls an auxiliary robot that assists operation of the ultrasound catheter.
15. A computer-readable storage medium storing a medical image processing program for causing a computer to execute the steps of:
acquiring an ultrasound image generated from a signal collected by a detector of an ultrasound catheter inserted into a body of a subject;
acquiring an X-ray image depicting an imaging object, the imaging object being at least one of the detector of the ultrasound catheter, a medical device inserted into the body of the subject, and a region of interest in the body of the subject;
detecting (a) a position of the detector of the ultrasound catheter depicted in the X-ray image, and (b) a position of the imaging object depicted in at least one of the ultrasound image and the X-ray image; and
calculating, based on the detected position of the detector of the ultrasound catheter and the detected position of the imaging object, travel assistance information for causing the ultrasound catheter to travel so that the imaging object enters the field of view of the detector or travel assistance information for causing the ultrasound catheter to travel so that the imaging object approaches the center of the field of view of the detector.
CN202311156008.2A 2022-09-07 2023-09-07 Medical image processing device, X-ray diagnostic system, and storage medium Pending CN117653179A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2022-142553 2022-09-07
JP2023-125592 2023-08-01
JP2023125592A JP2024037685A (en) 2022-09-07 2023-08-01 Medical image processing device, X-ray diagnostic system, and medical image processing program

Publications (1)

Publication Number Publication Date
CN117653179A true CN117653179A (en) 2024-03-08

Family

ID=90074096

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311156008.2A Pending CN117653179A (en) 2022-09-07 2023-09-07 Medical image processing device, X-ray diagnostic system, and storage medium

Country Status (1)

Country Link
CN (1) CN117653179A (en)

Similar Documents

Publication Publication Date Title
US6923768B2 (en) Method and apparatus for acquiring and displaying a medical instrument introduced into a cavity organ of a patient to be examined or treated
US10524865B2 (en) Combination of 3D ultrasound and computed tomography for guidance in interventional medical procedures
JP5911243B2 (en) Image display device
US9547900B2 (en) Image processing apparatus, X-ray diagnosis apparatus, and registration method
JP5112021B2 (en) Intravascular image diagnostic apparatus and intravascular image diagnostic system
JP5025423B2 (en) Catheter insertion guide system and medical image diagnostic apparatus incorporating the system
JP2019013788A (en) Biopsy probe, biopsy support apparatus and biopsy support method
US20100030022A1 (en) Method and system with encapsulated imaging and therapy devices, coupled with an extracorporeal imaging device
JP7258483B2 (en) Medical information processing system, medical information processing device and ultrasonic diagnostic device
JP5498181B2 (en) Medical image acquisition device
CN110123372B (en) Medical image diagnosis apparatus and X-ray irradiation control apparatus
JP2008302219A (en) Method and system for images registration
JP2019093123A (en) Medical image diagnostic apparatus and medical image processing apparatus
CN117653179A (en) Medical image processing device, X-ray diagnostic system, and storage medium
EP4335378A1 (en) Medical image processing apparatus and medical image processing program
JP2024037685A (en) Medical image processing device, X-ray diagnostic system, and medical image processing program
JP7165600B2 (en) X-ray diagnostic equipment and medical information processing equipment
JP6462331B2 (en) Ultrasonic diagnostic apparatus, medical image processing apparatus, and diagnostic imaging system
JP7114263B2 (en) Medical image diagnosis device and X-ray irradiation control device
US20160361019A1 (en) Device and method for virtual angiography
JP5159086B2 (en) Ultrasonic diagnostic apparatus and catheter navigation system
JP7297457B2 (en) Image processing device, X-ray diagnostic device and ultrasonic diagnostic device
JP2005192856A (en) X-ray diagnostic apparatus and method of displaying x-ray image data
JP5675930B2 (en) X-ray diagnostic equipment
EP4197446A1 (en) Methods and system for positioning a c-arm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination