WO2020106664A1 - System and method for volumetric display of anatomy with periodic motion - Google Patents

System and method for volumetric display of anatomy with periodic motion

Info

Publication number
WO2020106664A1
Authority
WO
WIPO (PCT)
Prior art keywords
ultrasound
ultrasound images
anatomical organ
sweep
phase angles
Application number
PCT/US2019/062105
Other languages
French (fr)
Inventor
Sharif Razzaque
Mina S. FAHIM
Original Assignee
Covidien Lp
Application filed by Covidien Lp
Publication of WO2020106664A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/48 - Diagnostic techniques
    • A61B8/483 - Diagnostic techniques involving the acquisition of a 3D volume of data
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46 - Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461 - Displaying means of special interest
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/52 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5269 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts
    • A61B8/5276 - Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts due to motion
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/54 - Control of the diagnostic device
    • A61B8/543 - Control of the diagnostic device involving acquisition triggered by a physiological signal


Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Physiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

Systems and methods provide for volumetric display of an anatomy. The methods, performable by the system, include determining phase angles of an anatomical organ corresponding to movement of the anatomical organ in a motion cycle, receiving a sweep of ultrasound images of the anatomical organ during the motion cycle of the anatomical organ, and detecting edge points of the anatomical organ in each ultrasound image of the sweep of ultrasound images. The methods further include determining a phase angle of each ultrasound image of the sweep of ultrasound images based on the determined phase angles of the anatomical organ, generating a subset of ultrasound images based on the determined phase angle of each ultrasound image, and reconstructing a 3D data set of the anatomical organ based on the generated subset of ultrasound images.

Description

SYSTEM AND METHOD FOR VOLUMETRIC DISPLAY OF ANATOMY WITH PERIODIC MOTION
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to U.S. Provisional Patent Application No.
62/769,297, filed on November 19, 2018, the entire content of which is incorporated by reference herein.
INTRODUCTION
[0002] The present disclosure relates to systems, methods, and devices for volumetric display of an anatomy with periodic motion, for example, using ultrasound imaging.
BACKGROUND
[0003] When planning a treatment, diagnostic, or other medical procedure, clinicians often rely on ultrasound images to view the internal anatomy of a patient. The clinician utilizes the pre-operative and/or intra-operative medical images to identify targets of interest, to develop strategies for accessing the targets of interest for the medical procedure, and to visualize instruments and instrument trajectories.
[0004] Ultrasonic imaging has been used to image the insertion path of biopsy needles and other devices so that the clinician can visually observe the insertion of the needle toward and to target anatomy which is to be biopsied or treated. However, conventional ultrasound has limitations which deter its use in many medical procedures beyond biopsy.
SUMMARY
[0005] Systems and methods for volumetric display of an anatomy with periodic motion are provided.
[0006] According to an aspect of the present disclosure, a method for volumetric display of an anatomy includes determining phase angles of an anatomical organ corresponding to movement of the anatomical organ in a motion cycle, receiving a sweep of ultrasound images of the anatomical organ during the motion cycle of the anatomical organ, and detecting edge points of the anatomical organ in each ultrasound image of the sweep of ultrasound images. The method further includes determining a phase angle of each ultrasound image of the sweep of ultrasound images based on the determined phase angles of the anatomical organ, generating a subset of ultrasound images based on the determined phase angle of each ultrasound image, and reconstructing a 3D data set of the anatomical organ based on the generated subset of ultrasound images.
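For illustration only (this sketch is not part of the original disclosure), one way to determine the phase angle of each ultrasound image in a sweep is to interpolate a sampled phase signal (e.g., a ventilator or ECG phase trace) at each image's acquisition time. The function names, the linear interpolation, and the example respiratory signal below are assumptions, not taken from the patent.

```python
import numpy as np

def assign_phase_angles(image_timestamps, signal_times, signal_phases):
    """Estimate the phase angle (0-360 deg) of each ultrasound image.

    image_timestamps : acquisition time of each image in the sweep (seconds)
    signal_times     : sample times of the phase signal (e.g., ventilator/ECG)
    signal_phases    : phase angle of the organ at each signal sample (degrees)
    """
    # Unwrap so interpolation does not jump across the 360 -> 0 boundary,
    # then wrap the interpolated values back into [0, 360).
    unwrapped = np.unwrap(np.deg2rad(signal_phases))
    phases = np.interp(image_timestamps, signal_times, unwrapped)
    return np.rad2deg(phases) % 360.0

# Example (illustrative): a 4-second respiratory cycle sampled at 10 Hz,
# ultrasound images acquired at 20 Hz.
t_signal = np.arange(0.0, 8.0, 0.1)
phase_signal = (t_signal / 4.0 * 360.0) % 360.0
t_images = np.arange(0.0, 8.0, 0.05)
image_phases = assign_phase_angles(t_images, t_signal, phase_signal)
```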
[0007] In an aspect, the subset of ultrasound images includes ultrasound images determined as having the same phase angle.
[0008] In an aspect, the method further includes discretizing at least one of the determined phase angles, and generating the subset of ultrasound images is based on the determined phase angles and the discretized phase angles. The method may further include displaying the reconstructed 3D data set of the anatomical organ.
[0009] In an aspect, portions of the displayed 3D data set of the anatomical organ are displayed blurry, transparent, or less salient relative to other portions.
[0010] In an aspect, the method further includes assigning a timestamp to each ultrasound image of the sweep of ultrasound images, and the portions that are displayed blurry, transparent, or less salient correspond to ultrasound images that were captured at earlier points in time relative to ultrasound images that were captured at later points in time.
[0011] In another aspect of the present disclosure, a system for volumetric display of an anatomy includes: a device for determining phase angles of an anatomical organ corresponding to movement of the anatomical organ in a motion cycle; an ultrasound device configured to capture a sweep of ultrasound images of the anatomical organ during the motion cycle of the anatomical organ; and a computing device operably coupled to the device for determining phase angles and the ultrasound device. The computing device is configured to receive the ultrasound images from the ultrasound device, detect edge points of the anatomical organ in each ultrasound image of the sweep of ultrasound images, determine a phase angle of each ultrasound image of the sweep of ultrasound images based on the determined phase angles of the anatomical organ, generate a subset of ultrasound images based on the determined phase angle of each ultrasound image, and reconstruct a 3D data set of the anatomical organ based on the generated subset of ultrasound images.
[0012] In an aspect, the subset of ultrasound images includes ultrasound images determined as having the same phase angle. [0013] In an aspect, the computing device is configured to discretize at least one of the determined phase angles and generate the subset of ultrasound images based on the determined phase angles and the discretized phase angles. The computing device may further be configured to display the reconstructed 3D data set of the anatomical organ.
[0014] In an aspect, portions of the displayed 3D data set of the anatomical organ are displayed blurry, transparent, or less salient relative to other portions.
[0015] In an aspect, the computing device is configured to assign a timestamp to each ultrasound image of the sweep of ultrasound images, and the portions that are displayed blurry, transparent, or less salient correspond to ultrasound images that were captured at earlier points in time relative to ultrasound images that were captured at later points in time.
[0016] In an aspect, the device for determining phase angles includes at least one of an electrocardiogram device, a ventilator device, an optical camera, a respiration chest strap sensor, an electromyogram or an electromagnetic sensor.
[0017] Any of the above aspects and embodiments of the present disclosure may be combined without departing from the scope of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] Objects and features of the presently disclosed systems and methods will become apparent to those of ordinary skill in the art when descriptions of various embodiments thereof are read with reference to the accompanying drawings, of which:
[0019] Fig. 1 is a schematic diagram of a microwave ablation planning and procedure system in accordance with an illustrative embodiment of the present disclosure;
[0020] Fig. 2 is a schematic diagram of a computing device which forms part of the microwave ablation planning and procedure system of Fig. 1 in accordance with an embodiment of the present disclosure;
[0021] Fig. 3A illustrates an ultrasound device capturing ultrasound images of a static anatomical organ;
[0022] Fig. 3B illustrates detected edges of the static anatomical organ in each of the ultrasound images captured by the ultrasound device of Fig. 3A;
[0023] Fig. 3C illustrates an example reconstruction of the static anatomical organ of Fig. 3B; [0024] Fig. 4A illustrates two phase angles of an anatomical organ during periodic motion of the anatomical organ;
[0025] Fig. 4B illustrates an ultrasound device capturing ultrasound images of an anatomical organ during motion of the anatomical organ and detected edges of the anatomical organ for all phase angles in each of the ultrasound images; and
[0026] Fig. 4C shows an example reconstruction of the moving anatomical organ of Fig. 4B.
DETAILED DESCRIPTION
[0027] The present disclosure provides a system and method for volumetric display of an anatomy with periodic motion. This disclosure relates to a system and a method for acquiring (for example, via ultrasound imaging) and reconstructing the 3D shape of organs, vessels and/or other tissues that move and/or deform in an approximately periodic manner (e.g., a liver or kidney that moves with the patient’s respiration, a beating heart chamber, etc.), using a phase signal (e.g., ventilator cycle, ECG phase angle) which indicates or estimates the current real-time phase angle of the organ within its periodic motion cycle. This method may be incorporated into a navigation or mapping system (e.g., Medtronic Emprint™ SX, Stealth Station™,
SuperDimension™, Abbot Ensite Precision™, etc.) for helping a user understand the real-time position, orientation and/or trajectory of medical devices relative to the periodically moving organ, and for helping a user guide these devices to target locations within the organ (e.g., a tumor, blood clot, kidney stone).
[0028] Microwave ablation treatment, according to the present disclosure, is generally divided into two phases: (1) a planning phase; and (2) a procedure phase. The planning and treatment phases of microwave ablation treatment are more fully described in commonly owned U.S. Patent Application Publication No. US2016/0317224, entitled Microwave Ablation
Planning and Procedure Systems, filed on April 15, 2016, the entire content of which is incorporated by reference herein. Although described as being implemented with a microwave ablation system and being used for a microwave ablation procedure, the aspects of the presently disclosed systems and methods for volumetric display of an anatomy with periodic motion may be implemented in any navigational system, percutaneous and non-percutaneous, and may be implemented in systems using treatment modalities other than microwave ablation, systems for performing biopsies, and other exploratory systems that require volumetric displays of anatomy. [0029] Referring now to Figs. 1-2, the present disclosure is generally directed to a system 10, which includes a computing device 100, a display 110, a table 120, an ablation probe or instrument 130, and an ultrasound device 140 connected to an ultrasound workstation 150.
Although ultrasound workstation 150 is illustrated as separate from computing device 100, computing device 100 is configured to perform all the functions of ultrasound workstation 150, and vice versa. Computing device 100 may be, for example, a laptop computer, desktop computer, tablet computer, or other similar device. Computing device 100 may be configured to control an electrosurgical generator, a peristaltic pump, a power supply, and/or any other accessories and peripheral devices relating to, or forming part of, system 10. Display 110 is configured to output instructions, images, and messages relating to the performance of the microwave ablation procedure. Table 120 may be, for example, an operating table or other table suitable for use during a surgical procedure, which includes an electromagnetic (EM) field generator 121.
[0030] The surgical instruments and anatomy may be visualized by using ultrasound imaging. Ultrasound device 140, such as an ultrasound wand, may be used to image the patient’s body during the microwave ablation procedure to visualize the location of the surgical instruments, such as ablation probe 130, inside the patient’s body and to visualize the anatomy of the patient. Ultrasound device 140 may have an EM tracking sensor embedded within or attached to the ultrasound wand, for example, a clip-on sensor or a sticker sensor.
[0031] Computing device 100 may include memory 202, processor 204, display 206, network interface 208, input device 210, and/or output module 212. Memory 202 includes any non-transitory computer-readable storage media for storing data and/or software that is executable by processor 204 and which controls the operation of computing device 100. In an embodiment, memory 202 may include one or more solid-state storage devices such as flash memory chips, and/or one or more mass storage devices connected to the processor 204 through a mass storage controller (not shown) and a communications bus (not shown). Memory 202 may store application 216 and/or CT data 214. Application 216 may, when executed by processor 204, cause display 206 to present user interface 218 and perform other methods described herein. Processor 204 may be a general-purpose processor, a specialized graphics processing unit (GPU) configured to perform specific graphics processing tasks while freeing up the general-purpose processor to perform other tasks, and/or any number or combination of such processors. Display 206 may be touch sensitive and/or voice activated, enabling display 206 to serve as both an input and output device. Alternatively, a keyboard (not shown), mouse (not shown), or other data input devices may be employed. Network interface 208 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet. Input device 210 may be any device by means of which a user may interact with computing device 100, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface. Output module 212 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art. Application 216 may be one or more software programs stored in memory 202 and executed by processor 204 of computing device 100.
[0032] Application 216 may be installed directly on computing device 100, or may be installed on another computer, for example, a central server, and opened on computing device 100 via network interface 208. Application 216 may run natively on computing device 100, as a web-based application, or any other format known to those skilled in the art. In some embodiments, application 216 will be a single software program having all of the features and functionality described in the present disclosure. Application 216 communicates with a user interface 218 that generates a user interface for presenting visual interactive features to a clinician, for example, on display 206 and for receiving clinician input, for example, via a user input device. For example, user interface 218 may generate a graphical user interface (GUI) and output the GUI to display 206 for viewing by a clinician.
[0033] Computing device 100 is linked to display 110, thus enabling computing device 100 to control the output on display 110 along with the output on display 206. Computing device 100 may control display 110 to display output which is the same as or similar to the output displayed on display 206. For example, the output on display 206 may be mirrored on display 110. Alternatively, computing device 100 may control display 110 to display different output from that displayed on display 206. For example, display 110 may be controlled to display guidance images and information during the procedure, while display 206 is controlled to display other output, such as configuration or status information. [0034] Turning now to Figs. 3A-3B, an ultrasound device 140 capturing a sweep “S” of ultrasound images 300a-n of a static anatomical organ 302 is illustrated for generating a 3D reconstruction 310 of the anatomical organ 302. The system detects edges 304a-n of the anatomical organ 302 in each ultrasound image 300a-n and generates the 3D reconstruction 310 of the static anatomical organ 302 based on the detected edges 304a-n. Because the anatomical organ 302 is static and not moving, the detected edges 304a-n also do not move, or change, as the ultrasound device 140 is moving along the sweep “S.” That is, minimal distortion occurs when generating the 3D reconstruction 310 of the static anatomical organ 302, and confidence is afforded to all of the detected edges 304a-n in each ultrasound image 300a-n even when the ultrasound device 140 is not actually positioned to capture images corresponding to the location of the detected edges 304a-n. Thus, when the ultrasound device 140 is positioned along sweep “S” such that it captures ultrasound image 300a, the system detects edge 304a in ultrasound image 300a. Additionally, when the ultrasound device 140 is positioned along sweep “S” such that it captures ultrasound image 300n, the system detects edge 304n in ultrasound image 300n while also remaining confident that the previously detected edge 304a in ultrasound image 300a accurately represents the actual position of the edge of the anatomical organ in the patient at the point in time that the ultrasound image 300n is being captured. That is, the position of the edge 304a would be the same if the user moved the ultrasound device 140 back to that position along the sweep “S.”
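The disclosure does not specify a particular edge detector. As a hedged stand-in, the sketch below extracts candidate edge points from each B-mode image using a smoothed Sobel gradient magnitude with a relative threshold; the scipy/NumPy calls, the smoothing sigma, and the threshold are illustrative assumptions rather than the patented method.

```python
import numpy as np
from scipy import ndimage

def detect_edge_points(image, threshold=0.25):
    """Return (row, col) coordinates of candidate edge points in one image.

    A Sobel gradient magnitude with a relative threshold stands in for
    whatever edge detector the system actually uses.
    """
    img = ndimage.gaussian_filter(image.astype(float), sigma=2.0)  # suppress speckle
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    magnitude = np.hypot(gx, gy)
    if magnitude.max() == 0:
        return np.empty((0, 2), dtype=int)
    mask = magnitude > threshold * magnitude.max()
    return np.argwhere(mask)

# Edge points for every image in a sweep (images as a list of 2D arrays).
sweep = [np.random.rand(256, 256) for _ in range(10)]
edges_per_image = [detect_edge_points(im) for im in sweep]
```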
[0035] However, generating a 3D reconstruction of a non-static, moving anatomical organ presents challenges that are not present in the 3D reconstruction of static anatomical organs. In particular, because the anatomical organ is moving, the positions of detected edges of the anatomical organ that were captured in ultrasound images at earlier portions of the sweep change as the organ moves. Thus, only the detected edges that correspond to the ultrasound image that is actually being captured can be said to be accurate.
[0036] Referring to Figs. 4A-4C, an ultrasound device 140 capturing a sweep “S” of ultrasound images 400a-n of an anatomical organ undergoing a motion cycle is illustrated for generating a 3D reconstruction 410 of the anatomical organ. The system detects edges 404a-n of the anatomical organ in each ultrasound image 400a-n and generates the 3D reconstruction 410 of the anatomical organ based on the detected edges 404a-n. Fig. 4A illustrates the anatomical organ in a first phase angle 401 (e.g., 0 degrees) and the anatomical organ in a second phase angle 402 (e.g., 180 degrees). A phase signal is used to indicate or estimate the current real-time phase angle of the organ within its periodic motion cycle. Referring to Fig. 4B, if the organ is moving quickly between the first phase angle 401 (e.g., 0 degrees) and the second phase angle 402 (e.g., 180 degrees) as the user is slowly sweeping the ultrasound beam over the organ, then the 3D reconstruction 410 contains a mix of both organ states (see Fig. 4C).
[0037] In an aspect, at any instant in time, the system displays a 3D reconstruction 410 of the organ that is based on imaging data that was acquired at the same phase angle as the current phase angle. To achieve this, the system may generate a subset of ultrasound images based on the determined phase angle, in particular, including only those ultrasound images determined to have the same, or similar, phase angles. Since the organ’s motion is, to a first approximation, periodic, the user sees a 3D dataset that appears to be real-time, even though portions of this dataset may have been acquired in previous cycles of the organ’s motion (but at the same phase angle).
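A minimal sketch, assuming phase angles have already been assigned per image, of how the displayed subset could be limited to images whose phase angle matches the current phase angle within a tolerance. The circular-difference helper and the 5-degree tolerance are illustrative choices, not taken from the patent.

```python
import numpy as np

def phase_difference_deg(a, b):
    """Smallest circular difference between two phase angles in degrees."""
    d = np.abs(np.asarray(a) - np.asarray(b)) % 360.0
    return np.minimum(d, 360.0 - d)

def select_same_phase(image_phases, current_phase, tolerance_deg=5.0):
    """Indices of images whose phase matches the current phase angle."""
    diffs = phase_difference_deg(image_phases, current_phase)
    return np.nonzero(diffs <= tolerance_deg)[0]

# Images acquired at 0..355 degrees in 5-degree steps; current phase ~92 degrees.
image_phases = np.arange(0.0, 360.0, 5.0)
subset = select_same_phase(image_phases, current_phase=92.0)
```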
[0038] In some situations, the organ’s motion may not be completely periodic. The first-order approximation may be periodic, but there may be other components of motion that are not accounted for in this first-order periodic approximation. For example, a user may be inserting a needle into a tumor in the liver. The liver and tumor move mostly in sync with the patient’s periodic breath cycle, but the region immediately surrounding the needle may be displaced by the needle as it is advanced deeper into the liver. If the tumor is calcified and the liver is fatty, the needle may even displace/dislodge the tumor from its surrounding liver tissue, instead of piercing and entering the tumor. A system that, as in the aspect described above, displays the 3D reconstruction based only on image data acquired while the organ was at the same phase angle as the current phase angle may not, by itself, adequately communicate to the user that the 3D data they are viewing is not up-to-date, especially since the anatomical structures in the 3D data appear, to the user, to be moving in real-time. The user might not recognize that the needle has pushed the tumor away from the surrounding tissue, because the outdated 3D reconstruction of the tumor may appear in its original location within the surrounding liver tissue (and both the tumor and the surrounding liver tissue will appear to be moving periodically).
[0039] Thus, the portions of the 3D reconstruction which are based on older images (image data acquired further back in time) are displayed blurry, transparent and/or less saliently. The older the image data, the more blurry and/or transparent its contribution to the 3D reconstruction is displayed. The newer the image data, the more opaque, sharp and salient it is displayed. This has two useful properties. First, it indicates, to the user, the relative age of this portion of the imaging data, thus directing the user to refresh this portion by moving the ultrasound probe to re-acquire ultrasound images of this portion of the organ. Second, the older the image data, the less it obscures the user’s view of the newer portions of the data.
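One possible mapping from the age of image data to a blur width and an opacity, consistent with the behavior described above; the linear falloff, the 30-second horizon, and the 8-pixel maximum sigma are arbitrary illustrative choices and are not specified in the disclosure.

```python
import numpy as np

def age_to_display_params(age_seconds, max_age=30.0,
                          max_sigma_px=8.0, min_opacity=0.0):
    """Map the age of an ultrasound image to rendering parameters.

    Newer data is drawn sharp and opaque; older data is drawn with a wider
    Gaussian blur and lower opacity until it effectively disappears.
    """
    frac = np.clip(age_seconds / max_age, 0.0, 1.0)
    sigma = frac * max_sigma_px                              # wider blur with age
    opacity = (1.0 - frac) * (1.0 - min_opacity) + min_opacity
    return sigma, opacity

sigma, opacity = age_to_display_params(age_seconds=12.0)
```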
[0040] Using the above liver tumor example, if the portions of the 3D reconstruction that contain the tumor and liver are outdated, the user will see them less clearly (e.g., blurry, transparent, etc.). The more outdated that portion becomes, the less saliently it will be displayed. At some point, that portion of the 3D reconstruction may become completely invisible. The user will be aware that she needs to move the ultrasound transducer to re-scan that area. After this re-scan, she will see in the newly acquired image data (which is displayed opaquely and sharply) that the tumor has been dislodged from the surrounding liver tissue.
[0041] In one aspect, the system utilizes a binary space partition (BSP) methodology to sort the 2D ultrasound images by their distance from the virtual viewpoint and then draws the images in back-to-front order. Then, the phase angle (a continuous variable from 0-360 degrees) is binned into a fixed number of discrete phase angle ranges (e.g., 0-2 degrees, 2-4 degrees, etc.).
For each discrete phase angle range, a separate dual-BSP data structure is created and maintained. When a new, pose-tracked 2D ultrasound image is acquired, its associated phase angle is used to index into one dual-BSP data structure. The latest pose-tracked 2D ultrasound image is incorporated into that one dual-BSP data structure. The associated timestamp of when each 2D ultrasound image was acquired is also stored. At the time of display, the age of each 2D ultrasound image is computed. As each 2D ultrasound image texture is drawn, a shader performs a Gaussian blur on the texture. The width of the Gaussian function is computed from the age of the ultrasound image data being drawn (the greater the age, the wider the Gaussian function).
Additionally, the opacity of the texture is reduced as a function of its age.
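A simplified sketch of the per-phase-bin bookkeeping described in this aspect. A plain Python dictionary and a distance sort of image poses stand in for the dual-BSP data structures and their traversal; the 2-degree bin width, class name, and field names are assumptions made for illustration.

```python
import time
import numpy as np

BIN_WIDTH_DEG = 2.0  # e.g., 0-2, 2-4, ... degree bins

def phase_bin(phase_deg, width=BIN_WIDTH_DEG):
    return int(phase_deg % 360.0 // width)

class PhaseBinnedImageStore:
    """One image list per discrete phase-angle bin (stand-in for the dual-BSP structures)."""

    def __init__(self):
        self.bins = {}   # bin index -> list of image entries

    def add(self, pixels, pose, phase_deg, timestamp=None):
        entry = {"pixels": pixels, "pose": pose,
                 "timestamp": timestamp if timestamp is not None else time.time()}
        self.bins.setdefault(phase_bin(phase_deg), []).append(entry)

    def back_to_front(self, phase_deg, viewpoint):
        """Images in the current phase bin, farthest from the viewpoint first."""
        entries = self.bins.get(phase_bin(phase_deg), [])
        def distance(e):
            center = np.asarray(e["pose"])[:3, 3]   # translation of a 4x4 pose
            return np.linalg.norm(center - np.asarray(viewpoint))
        return sorted(entries, key=distance, reverse=True)

# At display time each returned image would be drawn in order, with its blur
# width and opacity derived from (now - entry["timestamp"]) as sketched earlier.
store = PhaseBinnedImageStore()
store.add(pixels=np.zeros((256, 256)), pose=np.eye(4), phase_deg=91.3)
ordered = store.back_to_front(phase_deg=90.5, viewpoint=(0.0, 0.0, -200.0))
```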
[0042] In another aspect, the phase angle (a continuous variable from 0-360 degrees) is binned into a fixed number of discrete phase angle ranges (e.g., 0-2 degrees, 2-4 degrees, etc.).
For each discrete phase angle range, a 3D volumetric image (e.g., a 3D texture) is created and maintained. When a new pose-tracked 2D ultrasound image is acquired, its associated phase angle is used to index into one 3D texture. The latest pose-tracked 2D ultrasound image is rasterized into the 3D texture. At regular intervals in time (e.g., after the display operation), a 3D Gaussian blur operation is performed on all of the 3D volumetric images. As the image data gets older, the blur operation will have been performed on it more and more times. The 3D textures may be displayed using ray-casting or another 3D volume rendering algorithm.
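A hedged sketch of this second aspect: one 3D volume per discrete phase angle range, a nearest-voxel splat standing in for rasterizing a pose-tracked 2D image into the volume, and a periodic 3D Gaussian blur (scipy.ndimage.gaussian_filter) applied to every volume so that older data accumulates more blur. The volume size, bin count, pixel spacing, and splatting scheme are illustrative assumptions, not specifics of the disclosure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

N_BINS = 180                       # e.g., 2-degree phase bins
VOLUME_SHAPE = (128, 128, 128)     # voxels
volumes = {}                       # bin index -> 3D float array

def rasterize_slice(volume, pixels, pose, pixel_spacing=1.0):
    """Nearest-voxel splat of a 2D image into a 3D volume (illustrative only)."""
    rows, cols = pixels.shape
    r, c = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    # Image-plane coordinates (x across, y down, z = 0), in homogeneous form.
    pts = np.stack([c.ravel() * pixel_spacing,
                    r.ravel() * pixel_spacing,
                    np.zeros(rows * cols),
                    np.ones(rows * cols)])
    world = (np.asarray(pose) @ pts)[:3]            # 3 x (rows*cols)
    idx = np.round(world).astype(int)
    valid = np.all((idx >= 0) & (idx < np.array(VOLUME_SHAPE)[:, None]), axis=0)
    volume[idx[0, valid], idx[1, valid], idx[2, valid]] = pixels.ravel()[valid]

def incorporate(pixels, pose, phase_deg):
    """Index into the volume for this phase bin and splat the new image into it."""
    b = int(phase_deg % 360.0 * N_BINS / 360.0)
    vol = volumes.setdefault(b, np.zeros(VOLUME_SHAPE, dtype=float))
    rasterize_slice(vol, pixels, pose)

def age_all_volumes(sigma=0.5):
    """Called at regular intervals: older data ends up having been blurred more times."""
    for b, vol in volumes.items():
        volumes[b] = gaussian_filter(vol, sigma=sigma)

# Usage: incorporate one image, then blur between display updates.
incorporate(np.random.rand(256, 256), np.eye(4), phase_deg=91.3)
age_all_volumes()
```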
[0043] For microwave ablation, roughly half of procedures are performed under CT guidance by interventional radiologists. This is expensive, time-consuming and doses the patient and physician with radiation. Currently, the reimbursement for the procedure is less than the cost of performing it, and only 10-20% of patients who are indicated for this therapy and would benefit from the procedure receive it.
[0044] The systems and methods of this disclosure enable improvements to traditional medical navigation systems by making it easier and faster to perform procedures associated therewith such as biopsy, cardiac valve replacement, embolization, stent placement, electroporation, RF ablation and microwave ablation, and help interventional radiologists convert to using live ultrasound instead of costly and time-consuming CT, fluoroscopy or other X-ray imaging. This in turn makes the procedure more cost effective and safer.
[0045] “Pose-tracked,” and variants of such terms used herein, is understood to mean an ultrasound transducer (commonly referred to as an ultrasound probe) whose position and/or orientation are continuously or intermittently measured. This measurement can be performed by use of an optical tracking system (e.g., NDI Polaris, Valve Lighthouse™, Vicon™ Motion Capture, Microsoft Kinect, Vuforia™), electromagnetic tracking system (e.g., NDI Aurora™, Sixsense Razer™, Medtronic AxiEM™), passive or motorized mechanical arm (e.g., FaroArm™, Kuka™ LBR iiwa), ultrasound position tracking (e.g., Hexamite HX19™), inertial tracking (e.g., Thales InertiaCube4™), RFID, impedance-based (Electropotential) tracking (e.g., Medtronic LocaLisa™) or any combination thereof.
[0046] “Medical devices” and variants of such terms used herein, may include needles, needle-like devices such as microwave ablation antenna, cryo-ablation antenna, radiofrequency antenna, biopsy needles, electroporation electrodes, pacing or defibrillation leads/electrodes, catheters and catheter-like devices, guidewires, such as stents, flexible endoscopes, microwave, cryo, radiofrequency ablation devices, cardiac and defibrillation pacing electrodes, gastric banding devices, in-vitro fertilization devices for harvesting eggs, implanting embryos, neuro stimulation, monitoring or electromyogram electrodes, and/or cardiac or vascular valves or leaflets. [0047] The ultrasound device may include those with ultrasound transducers, with a single transducing element or 1D or 2D arrays of transducing elements, those with single plane arrays or bi-plane arrays, those that produce 2D or 3D images, those with piezo, CMOS, or other transducing elements, modalities such as A-mode, B-mode, contrast-enhanced ultrasound, Doppler, elastography, Ultrasound Current Source Density Imaging (UCSDI), transducers that are used percutaneously, intraoperatively (open or laparoscopic), transesophageal (TEE), transvascular, Intracardiac (ICE), Transthoracic (TTE), and/or transducers that are hand-held, manually steered, or controlled or guided robotically.
[0048] Phase angle signals may include respiration force or stretch of a chest strap, phase signals from ventilators, respiration phase sensed from an optical camera video stream or depth-camera stream (e.g., Microsoft Kinect™, Intel Realsense™ camera), expiration or inhalation as detected by air velocity or gas concentration sensors (e.g., capnography), and/or optical or EM position and/or orientation sensors placed on the patient’s skin, clothing, drapes or implanted inside the body, ECG, pulse plethysmograph, skin color changes detected from an optical camera, fluid velocity sensors, Electromyography, Ultrasound Current Source Density Imaging (UCSDI), and/or the sensing of any periodic physiological event or anatomic gating event.
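Whatever source provides the phase signal, it must be converted to an instantaneous phase angle. As one common approach, offered here for illustration and not specified in the disclosure, the sketch below derives a 0-360 degree phase from a roughly periodic 1D trace (e.g., a chest-strap respiration signal) using the analytic signal from scipy.signal.hilbert; the sampling rate and synthetic trace are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_phase_deg(signal, detrend=True):
    """Phase angle (0-360 deg) of a roughly periodic 1D signal at each sample.

    Uses the analytic signal (Hilbert transform); assumes the signal is
    dominated by a single periodic component such as respiration.
    """
    x = np.asarray(signal, dtype=float)
    if detrend:
        x = x - x.mean()
    analytic = hilbert(x)
    return np.degrees(np.angle(analytic)) % 360.0

# Example: a noisy 0.25 Hz respiration-like trace sampled at 50 Hz.
fs, f_resp = 50.0, 0.25
t = np.arange(0.0, 40.0, 1.0 / fs)
trace = np.sin(2 * np.pi * f_resp * t) + 0.05 * np.random.randn(t.size)
phase = instantaneous_phase_deg(trace)
```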
[0049] “Anatomical organ,” “organ,” “tissue,” and variants of such terms used herein may include solid organs or portions of organs such as kidney, heart, liver, bladder, breast, uterus, spleen, pancreas, anatomical structures, tissues, such as muscles, blood vessels, ducts, gall bladder, glands, nerves, or portions thereof.
[0050] In some embodiments, 3D reconstruction of the organ with periodic motion (as described above) may be displayed monoscopically, stereoscopically, on an augmented-reality or virtual-reality head-worn display device, on a mobile tablet or smartphone display device, on a boom or pole-mounted flat-screen display device, and/or displayed from the point-of-view of the user’s eyes, point-of-view of the display device or from an arbitrary point-of-view.
[0051] In aspects where the system is combined with, or includes, a navigation system, the navigation system may, in various embodiments, display the real-time pose of one or more medical devices (as measured by a pose-tracking system) relative to the 3D reconstructions of the organs or tissues that exhibit periodic motion; the previous poses of the medical devices (the historical path) and/or their trajectory or estimated forward path; the most recent pose and the historical poses of the medical devices that correspond to the current phase angle of the organ; and 3D features, annotations, data, and/or procedure plans derived from pre-operative and/or intraoperative CT, MRI, PET, and/or fluoroscopy images.
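Selecting the historical device poses that correspond to the current phase angle of the organ could be implemented, for example, by discretizing phase angles into bins and keeping only the poses recorded in the same bin as the current phase. The Python sketch below assumes a simple list of (timestamp, phase_angle, pose) records and a fixed bin count; both are illustrative assumptions rather than details taken from the disclosure.

```python
import numpy as np

def poses_matching_phase(records, current_phase, num_bins=16):
    """Return recorded device poses whose organ phase angle falls in the same
    discretized bin as `current_phase`. Illustrative sketch only.

    `records` is an iterable of (timestamp, phase_angle, pose) tuples, with
    phase angles expressed in radians in [0, 2*pi)."""
    bin_width = 2 * np.pi / num_bins
    current_bin = int(current_phase // bin_width) % num_bins
    return [
        (timestamp, phase, pose)
        for (timestamp, phase, pose) in records
        if int(phase // bin_width) % num_bins == current_bin
    ]
```

The same discretization could also drive the grouping of ultrasound images into a phase-matched subset before 3D reconstruction, in the spirit of the method and system claimed below.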
[0052] Although the present disclosure is described in terms of specific illustrative embodiments, it will be readily apparent to those skilled in the art that various modifications, rearrangements, and substitutions may be made without departing from the spirit of the present disclosure. The scope of the present disclosure is defined by the claims appended hereto.
[0053] As used herein, the term “clinician” refers to any medical professional (e.g., a doctor, surgeon, nurse, or the like) or other user of the treatment planning system 10 involved in planning, performing, monitoring, and/or supervising a medical procedure involving the use of the embodiments described herein.
[0054] Although embodiments have been described in detail with reference to the accompanying drawings for the purpose of illustration and description, it is to be understood that the inventive processes and apparatus are not to be construed as limited thereby. It will be apparent to those of ordinary skill in the art that various modifications to the foregoing embodiments may be made without departing from the scope of the disclosure.

Claims

WHAT IS CLAIMED IS:
1. A method for volumetric display of an anatomy, the method comprising:
determining phase angles of an anatomical organ corresponding to movement of the anatomical organ in a motion cycle;
receiving a sweep of ultrasound images of the anatomical organ during the motion cycle of the anatomical organ;
detecting edge points of the anatomical organ in each ultrasound image of the sweep of ultrasound images;
determining a phase angle of each ultrasound image of the sweep of ultrasound images based on the determined phase angles of the anatomical organ;
generating a subset of ultrasound images based on the determined phase angle of each ultrasound image; and
reconstructing a 3D data set of the anatomical organ based on the generated subset of ultrasound images.
2. The method of claim 1, wherein the subset of ultrasound images includes ultrasound images determined as having the same phase angle.
3. The method of claim 1, further comprising discretizing at least one of the determined phase angles, and wherein generating the subset of ultrasound images is based on the determined phase angles and the discretized phase angles.
4. The method of claim 1, further comprising displaying the reconstructed 3D data set of the anatomical organ.
5. The method of claim 4, wherein portions of the displayed 3D data set of the anatomical organ are displayed blurry, transparent, or less salient relative to other portions.
6. The method of claim 5, further comprising assigning a timestamp to each ultrasound image of the sweep of ultrasound images, and wherein the portions that are displayed blurry, transparent, or less salient correspond to ultrasound images that were captured at earlier points in time relative to ultrasound images that were captured at later points in time.
7. A system for volumetric display of an anatomy, the system comprising:
a device for determining phase angles of an anatomical organ corresponding to movement of the anatomical organ in a motion cycle;
an ultrasound device configured to capture a sweep of ultrasound images of the anatomical organ during the motion cycle of the anatomical organ; and
a computing device operably coupled to the device for determining phase angles and the ultrasound device, the computing device configured to:
receive the ultrasound images from the ultrasound device;
detect edge points of the anatomical organ in each ultrasound image of the sweep of ultrasound images;
determine a phase angle of each ultrasound image of the sweep of ultrasound images based on the determined phase angles of the anatomical organ;
generate a subset of ultrasound images based on the determined phase angle of each ultrasound image; and
reconstruct a 3D data set of the anatomical organ based on the generated subset of ultrasound images.
8. The system of claim 7, wherein the subset of ultrasound images includes ultrasound images determined as having the same phase angle.
9. The system of claim 7, wherein the computing device is configured to discretize at least one of the determined phase angles and generate the subset of ultrasound images based on the determined phase angles and the discretized phase angles.
10. The system of claim 7, wherein the computing device is configured to display the reconstructed 3D data set of the anatomical organ.
11. The system of claim 10, wherein portions of the displayed 3D data set of the anatomical organ are displayed blurry, transparent, or less salient relative to other portions.
12. The system of claim 11, wherein the computing device is configured to assign a timestamp to each ultrasound image of the sweep of ultrasound images, and wherein the portions that are displayed blurry, transparent, or less salient correspond to ultrasound images that were captured at earlier points in time relative to ultrasound images that were captured at later points in time.
13. The system of claim 7, wherein the device for determining phase angles includes at least one of an electrocardiogram device, a ventilator device, an optical camera, a respiration chest strap sensor, or an electromagnetic sensor.
PCT/US2019/062105 2018-11-19 2019-11-19 System and method for volumetric display of anatomy with periodic motion WO2020106664A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862769297P 2018-11-19 2018-11-19
US62/769,297 2018-11-19

Publications (1)

Publication Number Publication Date
WO2020106664A1 true WO2020106664A1 (en) 2020-05-28

Family

ID=68835374

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/062105 WO2020106664A1 (en) 2018-11-19 2019-11-19 System and method for volumetric display of anatomy with periodic motion

Country Status (1)

Country Link
WO (1) WO2020106664A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090192386A1 (en) * 2008-01-25 2009-07-30 Kabushiki Kaisha Toshiba Ultrasonic diagnostic apparatus and method of controlling the same
US20130197357A1 (en) * 2012-01-30 2013-08-01 Inneroptic Technology, Inc Multiple medical device guidance
US20160317224A1 (en) 2015-04-30 2016-11-03 Covidien Lp Microwave ablation planning and procedure systems


Similar Documents

Publication Publication Date Title
US20220192611A1 (en) Medical device approaches
EP3422297B1 (en) System and method for glass state view in real-time three-dimensional (3d) cardiac imaging
JP4965042B2 (en) How to draw medical images in real time
US8195271B2 (en) Method and system for performing ablation to treat ventricular tachycardia
JP6719885B2 (en) Positioning map using intracardiac signals
US8213693B1 (en) System and method to track and navigate a tool through an imaged subject
US8527032B2 (en) Imaging system and method of delivery of an instrument to an imaged subject
Linte et al. On mixed reality environments for minimally invasive therapy guidance: systems architecture, successes and challenges in their implementation from laboratory to clinic
US20030220555A1 (en) Method and apparatus for image presentation of a medical instrument introduced into an examination region of a patent
US20080234570A1 (en) System For Guiding a Medical Instrument in a Patient Body
AU2017236893A1 (en) Virtual reality or augmented reality visualization of 3D medical images
US20130231557A1 (en) Intracardiac echocardiography image reconstruction in combination with position tracking system
US20080287805A1 (en) System and method to guide an instrument through an imaged subject
WO2015188393A1 (en) Human organ motion monitoring method, surgical navigation system, and computer-readable media
US20070244369A1 (en) Medical Imaging System for Mapping a Structure in a Patient's Body
US7940972B2 (en) System and method of extended field of view image acquisition of an imaged subject
CN111343941A (en) Robotic tool control
US20230139348A1 (en) Ultrasound image-based guidance of medical instruments or devices
JP7460355B2 (en) Medical User Interface
US10792010B2 (en) Micromanipulator-controlled local view with stationary overall view
JP2020501865A (en) Navigation platform for medical devices, especially cardiac catheters
RU2735068C1 (en) Body cavity map
WO2020106664A1 (en) System and method for volumetric display of anatomy with periodic motion
US20230263580A1 (en) Method and system for tracking and visualizing medical devices
US20200311928A1 (en) Medical image processing apparatus, medical image processing method, and medical image processing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 19817541
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 19817541
Country of ref document: EP
Kind code of ref document: A1