CN112236077A - System and method for visualizing an anatomy, positioning a medical device, or placing a medical device - Google Patents

System and method for visualizing an anatomy, positioning a medical device, or placing a medical device

Info

Publication number: CN112236077A
Application number: CN201980037260.7A
Authority: CN (China)
Prior art keywords: medical device, patient, virtual, ultrasound, alternate reality
Legal status: Pending
Other languages: Chinese (zh)
Inventor: T·L·德菲
Current Assignee: Bard Access Systems Inc
Original Assignee: Bard Access Systems Inc
Application filed by: Bard Access Systems Inc
Priority claimed from: U.S. Application No. 16/209,601 (published as US 2019/0167148 A1) and U.S. Application No. 16/370,353 (published as US 2019/0223757 A1)

Classifications

    Leaf-level classification codes assigned to this publication (the parent hierarchy entries, which the source page repeated for every code, are omitted):

    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T2210/41: Indexing scheme for image generation or computer graphics: Medical
    • G06T2219/028: Multiple view windows (top-side-front-sagittal-orthogonal)
    • A61B34/25: User interfaces for surgical systems
    • A61B5/0077: Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/02007: Evaluating blood vessel condition, e.g. elasticity, compliance
    • A61B5/05: Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields
    • A61B5/062: Determining position of a probe within the body using magnetic fields
    • A61B5/318: Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B5/6847: Detecting, measuring or recording means mounted on an invasive device
    • A61B5/743: Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • A61B8/0841: Detecting or locating foreign bodies or organic structures, for locating instruments
    • A61B8/0883: Detecting organic movements or changes for diagnosis of the heart
    • A61B8/0891: Detecting organic movements or changes for diagnosis of blood vessels
    • A61B8/4209: Probe positioning or attachment to the patient by using holders, e.g. positioning frames
    • A61B8/461, A61B8/462, A61B8/466: Displaying means of special interest, including constructional features of the display and display of 3D data
    • A61B8/488: Diagnostic techniques involving Doppler signals
    • A61B8/5246: Combining image data from the same or different imaging techniques, e.g. color Doppler and B-mode
    • A61B8/5253: Combining overlapping images, e.g. spatial compounding
    • A61B90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B2017/00207: Electrical control of surgical instruments with hand gesture control or hand gesture recognition
    • A61B2017/00216: Electrical control of surgical instruments with eye tracking or head position tracking control
    • A61B2017/00243: Type of minimally invasive operation: cardiac
    • A61B2034/2048: Tracking techniques using an accelerometer or inertia sensor
    • A61B2034/2051: Electromagnetic tracking systems
    • A61B2090/309: Devices for illuminating a surgical field using white LEDs
    • A61B2090/364: Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365: Augmented reality, i.e. correlating a live optical image with another image
    • A61B2090/372: Details of monitor hardware
    • A61B2090/378: Surgical systems with images on a monitor during operation, using ultrasound
    • A61B2090/502: Headgear, e.g. helmet, spectacles

Abstract

A medical device placement system includes a medical device tip position sensor ("TLS") configured for placement on a patient's chest, an ultrasound probe, a console, and an alternate reality headset. The ultrasound probe may be configured to transmit ultrasound signals into a patient and receive echo ultrasound signals from the patient. The console may be configured to convert the echo ultrasound signals to produce ultrasound image segments corresponding to the anatomy of the patient, and to convert the TLS signals from the TLS into positional information of the medical device within the patient. The alternate reality headset may include a display screen through which a wearer of the alternate reality headset may see the patient. The display screen may be configured to display a virtual medical device on the patient according to the positional information of the medical device within a virtual anatomical object corresponding to the ultrasound image segments.

Description

System and method for visualizing an anatomy, positioning a medical device, or placing a medical device
Priority
This application claims the benefit of priority to U.S. Patent Application No. 16/370,353, filed March 29, 2019, which is a continuation-in-part of U.S. Patent Application No. 16/209,601, filed December 4, 2018, which in turn claims the benefit of priority to U.S. Provisional Application No. 62/594,454, filed December 4, 2017. This application also claims the benefit of priority to U.S. Provisional Application No. 62/680,299, filed June 4, 2018. Each of the foregoing applications is incorporated by reference herein in its entirety.
Background
When placing a medical device in the peripheral vasculature, such as the vasculature of an arm or leg, it is difficult to determine the location of the medical device or its tip at any given point in time. For example, clinicians often use fluoroscopy to track medical devices such as guidewires or catheters, but the vasculature is not visible with such X-ray based techniques, which makes it impossible to accurately determine the location of the tip of the guidewire or catheter within the vasculature. Furthermore, fluoroscopy exposes both the patient and the clinician to ionizing radiation, placing their health at risk. Thus, the ability to visualize an anatomy, such as the peripheral vasculature, is desirable. Furthermore, the ability to visualize such an anatomy together with medical devices, such as guidewires and catheters, is needed, ultimately making it possible to accurately determine the position of such medical devices during their placement. Finally, such capabilities should not adversely affect the patient or clinician.
Disclosed herein are systems and methods for visualizing an anatomy, positioning a medical device, or placing a medical device that address one or more needs such as the foregoing.
Disclosure of Invention
Disclosed herein is a medical device placement system that, in some embodiments, includes a medical device tip position sensor ("TLS"), an ultrasound probe, a console, and an alternate reality headset. The TLS is configured for placement on the chest of a patient. The ultrasound probe is configured to transmit ultrasound signals into a patient and receive echo ultrasound signals from the patient by means of an array of piezoelectric transducers. The console has electronic circuitry including memory and a processor configured to convert the echo ultrasound signals to produce ultrasound image segments corresponding to an anatomical structure of the patient. The console is further configured to convert the TLS signals from the TLS into positional information of the medical device within the patient when the TLS is placed on the patient's chest. The alternate reality headset includes a display screen coupled to a frame having electronic circuitry including a memory and a processor. The display screen is configured such that a wearer of the alternate reality headset can see the patient through the display screen. The display screen is configured to display a virtual medical device on the patient based on the positional information of the medical device within a virtual anatomical object corresponding to the ultrasound image segments.
In some embodiments, the ultrasound probe is configured with a pulsed wave Doppler imaging mode for transmitting and receiving ultrasound signals. The console is configured to capture ultrasound imaging frames according to the pulsed wave Doppler imaging mode, stitch the ultrasound imaging frames together using a stitching algorithm, and segment the ultrasound imaging frames or stitched ultrasound imaging frames into the ultrasound image segments using an image segmentation algorithm.
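The patent does not specify these algorithms, but the capture-stitch-segment flow can be illustrated briefly. Below is a minimal Python sketch under stated assumptions: NumPy and SciPy stand in for the console firmware, frames are small grayscale arrays, the offset estimate is a simple 1D cross-correlation, and the "segmentation" is a threshold plus connected components rather than any algorithm named in the disclosure.

```python
# Minimal sketch of the capture -> stitch -> segment pipeline described above.
# Frame shapes, the correlation-based offset estimate, and the intensity
# threshold are all illustrative assumptions, not the patent's algorithms.
import numpy as np
from scipy import ndimage

def estimate_offset(prev: np.ndarray, curr: np.ndarray) -> int:
    """Estimate the horizontal shift between two overlapping frames
    via 1D cross-correlation of their column-mean brightness profiles."""
    p = prev.mean(axis=0) - prev.mean()
    c = curr.mean(axis=0) - curr.mean()
    corr = np.correlate(p, c, mode="full")
    return int(corr.argmax() - (len(c) - 1))

def stitch(frames: list) -> np.ndarray:
    """Append the non-overlapping strip of each frame (panorama-style)."""
    mosaic = frames[0]
    for prev, curr in zip(frames, frames[1:]):
        shift = max(abs(estimate_offset(prev, curr)), 1)
        mosaic = np.hstack([mosaic, curr[:, -shift:]])
    return mosaic

def segment(mosaic: np.ndarray, threshold: float = 0.6):
    """Stand-in segmentation: threshold echo intensity, then split the
    mask into connected components (one mask per candidate structure)."""
    labels, n = ndimage.label(mosaic > threshold)
    return [labels == i for i in range(1, n + 1)]

# Usage with simulated 64x128 grayscale frames in [0, 1].
frames = [np.random.rand(64, 128) for _ in range(5)]
segments = segment(stitch(frames))
print(len(segments), "candidate ultrasound image segments")
```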
In some embodiments, the console is configured to convert the ultrasound image segments into virtual anatomical objects using a virtualization algorithm. The console is configured to send both the virtual medical device and the virtual anatomical object to the alternate reality headset for display on the patient.
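As one illustration of what a virtualization algorithm might do, the sketch below converts a binary voxel segment into a triangle mesh that a headset renderer could display. This is a hedged example: the patent names no concrete method, and scikit-image's marching cubes, the voxel size, and the synthetic sphere are all assumptions.

```python
# Minimal sketch of a "virtualization algorithm": turn a binary ultrasound
# segment (a 3D voxel mask) into a renderable triangle mesh.
import numpy as np
from skimage import measure  # assumed dependency, not named in the patent

def virtualize(segment_voxels: np.ndarray, voxel_size_mm: float = 0.5):
    """Return (vertices_mm, faces) of an isosurface around the segment."""
    verts, faces, _normals, _values = measure.marching_cubes(
        segment_voxels.astype(np.float32), level=0.5)
    return verts * voxel_size_mm, faces  # scale vertices to millimeters

# Usage: a synthetic sphere standing in for a segmented vessel lumen.
z, y, x = np.mgrid[:32, :32, :32]
sphere = (x - 16) ** 2 + (y - 16) ** 2 + (z - 16) ** 2 < 10 ** 2
vertices, faces = virtualize(sphere)
print(f"mesh: {len(vertices)} vertices, {len(faces)} triangles")
```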
In some embodiments, the alternate reality headset is configured to anchor the virtual medical device and the virtual anatomical object to a patient on which the virtual medical device and the virtual anatomical object are displayed.
In some embodiments, the alternate reality headset further includes one or more eye tracking cameras coupled to the frame configured to capture eye movements of the wearer. The processor of the alternate reality headset is configured to process eye movements using an eye movement algorithm to identify a wearer's focus for selecting or augmenting a virtual anatomical object, a virtual medical device, or both that correspond to the wearer's focus.
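A minimal sketch of how a wearer's focus might drive selection follows: cast a gaze ray from the tracked eye pose and pick the nearest virtual object whose bounding sphere it intersects. The object names, the sphere representation, and the ray test are illustrative assumptions, not the patent's eye movement algorithm.

```python
# Minimal sketch: map a gaze ray to the focused virtual object.
import numpy as np

def pick_focus(origin, direction, objects):
    """objects: list of (name, center, radius) bounding spheres.
    Returns the name of the nearest sphere hit by the gaze ray."""
    direction = np.asarray(direction, dtype=float)
    direction /= np.linalg.norm(direction)
    best_name, best_t = None, np.inf
    for name, center, radius in objects:
        oc = np.asarray(center, dtype=float) - origin
        t = float(oc @ direction)        # closest approach along the ray
        miss2 = float(oc @ oc) - t * t   # squared distance ray-to-center
        if t > 0 and miss2 <= radius ** 2 and t < best_t:
            best_name, best_t = name, t
    return best_name

# Usage: gaze from the headset origin toward two hypothetical objects.
objects = [("virtual_vein", (0.0, 0.10, 0.80), 0.05),
           ("virtual_picc_tip", (0.0, 0.10, 0.90), 0.02)]
focus = pick_focus(np.zeros(3), (0.0, 0.12, 1.0), objects)
print("wearer focus:", focus)  # the focused object can then be augmented
```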
In some embodiments, the alternate reality headset further includes one or more patient-facing cameras coupled to the frame configured to capture a pose of the wearer. The processor of the alternate reality headset is configured to process the gesture with a gesture command algorithm to identify a gesture-based command issued by the wearer for execution by the alternate reality headset.
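The sketch below illustrates one plausible shape for such a gesture command algorithm: classify a captured hand pose into a gesture label, then dispatch the label to a command. The landmark ordering (MediaPipe-style), the pinch threshold, and the command names are assumptions for illustration only.

```python
# Minimal sketch: classify a captured hand pose and dispatch a command.
import numpy as np

# Hypothetical command table; the command names are illustrative only.
COMMANDS = {"pinch": "select_focused_object", "open_palm": "dismiss_window"}

def classify_gesture(landmarks: np.ndarray, pinch_mm: float = 15.0) -> str:
    """landmarks: (21, 3) hand keypoints in mm; index 4 = thumb tip and
    index 8 = index fingertip (MediaPipe-style ordering, assumed)."""
    thumb_to_index = np.linalg.norm(landmarks[4] - landmarks[8])
    return "pinch" if thumb_to_index < pinch_mm else "open_palm"

def handle_gesture(landmarks: np.ndarray) -> str:
    """Map a captured pose to the command the headset should execute."""
    return COMMANDS[classify_gesture(landmarks)]

hand = np.random.rand(21, 3) * 100.0  # stand-in for a captured hand pose
print("command:", handle_gesture(hand))
```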
In some embodiments, the alternate reality headset further includes one or more microphones coupled to the frame, which are configured to capture audio of the wearer. The processor of the alternate reality headset is configured to process the audio using an audio command algorithm to identify an audio-based command issued by the wearer for execution by the alternate reality headset.
In some embodiments, the TLS includes one or more magnetic sensors disposed in a housing. The TLS signals are magnetic sensor signals from the one or more magnetic sensors, which the console converts into the positional information of the medical device.
In some embodiments, each of the one or more magnetic sensors has a fixed spatial relationship with another of the one or more magnetic sensors.
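Although the disclosure does not specify how magnetic sensor signals become positional information, a common approach models the magnetized tip as a magnetic dipole and solves for its position by nonlinear least squares over sensors at known, fixed locations. The following sketch makes those assumptions explicit; the sensor grid geometry, the dipole model, and the SciPy solver are illustrative, not the patent's method.

```python
# Minimal sketch: fit a magnetic dipole (tip position + moment) to
# readings from sensors with known, fixed spatial relationships.
import numpy as np
from scipy.optimize import least_squares

MU0_4PI = 1e-7  # mu_0 / (4*pi), in T*m/A

def dipole_field(sensor_pos, tip_pos, moment):
    """Magnetic flux density of a point dipole at each sensor position."""
    r = sensor_pos - tip_pos
    d = np.linalg.norm(r, axis=1, keepdims=True)
    return MU0_4PI * (3.0 * r * (r @ moment)[:, None] / d**5 - moment / d**3)

def locate_tip(sensor_pos, readings, x0):
    """x = [tip_x, tip_y, tip_z, m_x, m_y, m_z]; minimize field residuals."""
    def residual(x):
        return (dipole_field(sensor_pos, x[:3], x[3:]) - readings).ravel()
    return least_squares(residual, x0).x[:3]

# Usage: a 3x3 grid of chest sensors (z = 0) and a tip 40 mm below them.
sensors = np.array([[i, j, 0.0] for i in (-0.02, 0.0, 0.02)
                                for j in (-0.02, 0.0, 0.02)])
true_tip = np.array([0.005, -0.01, -0.04])
moment = np.array([0.0, 0.0, 0.1])                  # A*m^2, along z
readings = dipole_field(sensors, true_tip, moment)  # noiseless for brevity
print("estimated tip (m):",
      locate_tip(sensors, readings, np.array([0, 0, -0.03, 0, 0, 0.05])))
```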
In some embodiments, the medical device is a magnetized medical device such as a peripherally inserted central catheter ("PICC") or the like.
Also disclosed herein is a medical device placement system that, in some embodiments, includes a medical device TLS, an ultrasound probe, a console, and an alternate reality headset. The TLS includes one or more magnetic sensors disposed in a housing, the TLS configured for placement on a patient's chest. The ultrasound probe is configured to transmit ultrasound signals into a patient and receive echo ultrasound signals from the patient by means of an array of piezoelectric transducers. The console has electronic circuitry including memory and a processor configured to convert the echo ultrasound signals to produce ultrasound image segments corresponding to anatomy of a patient. The console is further configured to convert magnetic sensor signals from the one or more magnetic sensors of the TLS into positional information of a magnetized medical device, such as a PICC, within the patient when the TLS is placed on the patient's chest. An alternate reality headset includes a display screen coupled to a frame having electronic circuitry including a memory and a processor. The display screen is configured such that a wearer of the alternate reality headset can see the patient through the display screen. The display screen is configured to display an anchored virtual medical device on the patient according to position information of the medical device within an anchored virtual anatomical object corresponding to the ultrasound image segment.
In some embodiments, the alternate reality headset further includes one or more eye tracking cameras coupled to the frame configured to capture eye movements of the wearer. The processor of the alternate reality headset is configured to process eye movements using an eye movement algorithm to identify a wearer's focus for selecting or augmenting a virtual anatomical object, a virtual medical device, or both that correspond to the wearer's focus.
In some embodiments, the alternate reality headset further includes one or more patient-facing cameras coupled to the frame configured to capture a pose of the wearer. The processor of the alternate reality headset is configured to process the gesture with a gesture command algorithm to identify a gesture-based command issued by the wearer for execution by the alternate reality headset.
Also disclosed herein is a wireless medical device placement system that, in some embodiments, includes an ultrasound probe, a medical device TLS, and an alternate reality headset configured to wirelessly communicate with the ultrasound probe and the TLS. The ultrasound probe is configured to transmit ultrasound signals into a patient and receive echo ultrasound signals from the patient by means of an array of piezoelectric transducers. The TLS is configured for placement on the chest of a patient. The alternate reality headset includes a frame and a display screen coupled to the frame through which a wearer of the alternate reality headset can see an environment including the patient. The frame has electronic circuitry including a memory and a processor configured to convert the echo ultrasound signals to produce ultrasound image segments corresponding to anatomy of the patient, and to convert TLS signals from the TLS into positional information of the medical device within the patient when the TLS is placed on the chest of the patient. The display screen is configured to display a virtual medical device according to positional information of the medical device within a virtual anatomical object corresponding to the ultrasound image segments. Alternatively or additionally, the display screen is configured to display one or more graphical control element windows including outputs corresponding to one or more processes of the medical device placement system.
In some embodiments, the alternate reality headset is configured to capture ultrasound imaging frames according to an imaging mode of the ultrasound probe, stitch the ultrasound imaging frames together using a stitching algorithm, and segment the ultrasound imaging frames or the stitched ultrasound imaging frames into ultrasound image segments using an image segmentation algorithm.
In some embodiments, the alternate reality headset is configured to display one or more windows including output corresponding to one or more processes of the medical device placement system. The one or more windows include an ultrasound window, and the output corresponding to the one or more processes of the medical device placement system includes ultrasound imaging frames corresponding to ultrasound imaging with the ultrasound probe.
In some embodiments, the alternate reality headset is configured to convert the ultrasound image segment into a virtual anatomical object using a virtualization algorithm and display both the virtual medical device and the virtual anatomical object on the environment.
In some embodiments, the alternate reality headset is configured to anchor the virtual medical device and the virtual anatomical object to a persistent location on the display screen, a persistent location in a frame of reference of the wearer, or a persistent location in the environment.
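These three anchoring behaviors map to what AR developers commonly call display-locked, body-locked, and world-locked content. A minimal sketch with 4x4 rigid transforms follows; the mode names and the decision to ignore head rotation for the wearer frame are assumptions.

```python
# Minimal sketch of the three anchoring modes named above.
import numpy as np

def render_pose(mode: str, anchor: np.ndarray, head_pose: np.ndarray) -> np.ndarray:
    """Return the 4x4 world transform at which to draw a virtual object.

    anchor    -- 4x4 transform, interpreted per anchoring mode
    head_pose -- 4x4 world-from-head transform from headset tracking
    """
    if mode == "display_locked":       # persistent location on the display
        return head_pose @ anchor      # anchor expressed in display space
    if mode == "wearer_frame":         # persistent in the wearer's frame
        follow = head_pose.copy()
        follow[:3, :3] = np.eye(3)     # track head position, ignore rotation
        return follow @ anchor
    if mode == "world_locked":         # persistent location in the environment
        return anchor                  # anchor already expressed in world space
    raise ValueError(f"unknown anchoring mode: {mode}")

head = np.eye(4); head[:3, 3] = [0.0, 1.6, 0.0]       # wearer's head pose
anchor = np.eye(4); anchor[:3, 3] = [0.0, 0.0, -0.5]  # half a meter ahead
print(render_pose("display_locked", anchor, head)[:3, 3])  # [0. 1.6 -0.5]
```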
In some embodiments, the alternate reality headset further includes one or more eye tracking cameras coupled to the frame configured to capture eye movements of the wearer. The processor of the alternate reality headset is further configured to process eye movements using the eye movement algorithm to identify a wearer's focus for selecting or augmenting a virtual anatomical object, a virtual medical device, or both corresponding to the wearer's focus.
In some embodiments, the alternate reality headset further includes one or more patient-facing cameras coupled to the frame configured to capture a pose of the wearer. The processor of the alternate reality headset is further configured to process the gesture with a gesture command algorithm to identify a gesture-based command issued by the wearer for execution by the alternate reality headset.
In some embodiments, the alternate reality headset further includes one or more microphones coupled to the frame, which are configured to capture audio of the wearer. The processor of the alternate reality headset is further configured to process the audio with an audio command algorithm to identify an audio-based command issued by the wearer for execution by the alternate reality headset.
Also disclosed herein is a medical device placement system that, in some embodiments, includes an ultrasound probe, a medical device TLS, a stylet, and a processing device configured to process echo ultrasound signals, TLS signals, and sets of electrocardiogram ("ECG") signals. The ultrasound probe is configured to transmit ultrasound signals into a patient and receive echo ultrasound signals from the patient by means of an array of piezoelectric transducers. The TLS is configured for placement on the chest of a patient. The stylet is configured for insertion into a lumen of a medical device. The stylet includes an ECG electrode in a distal portion of the stylet configured to generate a set of ECG signals in response to electrical changes associated with depolarization and repolarization of the patient's heart. The processing device includes electronic circuitry including a memory and a processor configured to convert the echo ultrasound signals to produce ultrasound image segments corresponding to the anatomy of the patient, convert the TLS signals from the TLS into positional information of the medical device within the patient when the TLS is placed on the chest of the patient, and convert the set of ECG signals into an ECG. A wearable display screen, through which its wearer can see an environment including the patient, is configured to display a virtual medical device according to positional information of the medical device within a virtual anatomical object corresponding to the ultrasound image segments. Alternatively or additionally, the display screen is configured to display one or more graphical control element windows including outputs corresponding to one or more processes of the medical device placement system.
In some embodiments, the processing device is configured to capture ultrasound imaging frames according to an imaging mode of the ultrasound probe, stitch the ultrasound imaging frames together using a stitching algorithm, and segment the ultrasound imaging frames or the stitched ultrasound imaging frames into ultrasound image segments using an image segmentation algorithm.
In some embodiments, the display screen is configured to display one or more windows including output corresponding to one or more processes of the medical device placement system. The one or more windows include an ultrasound window, and the output corresponding to the one or more processes of the medical device placement system includes ultrasound imaging frames corresponding to ultrasound imaging with the ultrasound probe.
In some embodiments, the one or more windows further comprise an ECG window, and the output corresponding to the one or more processes of the medical device placement system further comprises an ECG corresponding to electrocardiography using a stylet comprising an ECG electrode.
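One well-known use of an intravascular ECG from a stylet electrode, consistent with the ECG window described here, is watching the P wave grow as the catheter tip approaches the sinoatrial node. The sketch below assumes synthetic beats, a fixed P-R interval, and SciPy peak detection; none of these specifics come from the patent.

```python
# Minimal sketch: track P-wave amplitude, which in ECG-guided tip
# placement typically grows as the tip nears the sinoatrial node.
import numpy as np
from scipy.signal import find_peaks

def p_wave_amplitude(ecg: np.ndarray, fs: float, pr_s: float = 0.16) -> float:
    """Average the signal one assumed P-R interval before each R peak."""
    r_peaks, _ = find_peaks(ecg, height=0.6 * ecg.max(), distance=int(0.4 * fs))
    p_idx = r_peaks - int(pr_s * fs)
    return float(ecg[p_idx[p_idx >= 0]].mean())

# Usage: synthetic beats whose P waves grow as the tip advances.
fs = 500                                   # samples per second
beat = np.zeros(fs); beat[170] = 1.0       # one R spike per 1 s beat
trace = []
for p_height in (0.05, 0.10, 0.20):        # advancing toward the SA node
    b = beat.copy()
    b[170 - int(0.16 * fs)] = p_height     # P wave one P-R interval early
    trace.append(b)
ecg = np.concatenate(trace)
print(f"mean P-wave amplitude: {p_wave_amplitude(ecg, fs):.3f}")
```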
In some embodiments, the medical device placement system further includes a number of ECG electrode pads configured to generate a corresponding number of sets of ECG signals in response to electrical changes associated with depolarization and repolarization of the patient's heart. The processing means is further configured to convert the number of sets of ECG signals into a corresponding number of ECGs.
In some embodiments, the output corresponding to the one or more processes of the medical device placement system further includes the number of ECGs corresponding to electrocardiography utilizing the number of ECG electrode pads. Each of the ECGs in the ECG window is configured for placement in the ECG window by a wearer of the display screen.
In some embodiments, the processing device is configured to convert the ultrasound image segments into virtual anatomical objects using a virtualization algorithm for displaying both the virtual medical device and the virtual anatomical objects on the environment.
In some embodiments, the processing device is a console of the medical device placement system, an alternate reality headset of the medical device placement system, or a combination of a console and an alternate reality headset. The alternate reality headset includes a frame to which a display screen is coupled.
In some embodiments, the stylet is configured to connect to the TLS through a sterile drape that separates a sterile field including the stylet from a non-sterile field including the TLS. The TLS is configured to communicate wirelessly with the alternate reality headset or to communicate with the console via a first wired connection to a first port of the console. The ultrasound probe is configured to communicate wirelessly with the alternate reality headset or with the console via a second wired connection to a second port of the console.
In some embodiments, the alternate reality headset is configured to anchor the virtual medical device and the virtual anatomical object to a persistent location on the display screen, a persistent location in a frame of reference of the wearer, or a persistent location in the environment.
In some embodiments, the display screen is configured to display one or more contours around one or more corresponding components of the medical device placement system, one or more virtual components on one or more corresponding components of the medical device placement system, or a combination thereof.
In some embodiments, the display screen is configured to display a TLS outline around the TLS under the sterile drape, a virtual TLS of the TLS anywhere in the environment over the sterile drape, or a combination thereof.
In some embodiments, the medical device is a PICC, and the desired location of the PICC in the patient is the superior vena cava proximate the sinoatrial node in the right atrium of the patient's heart.
In some embodiments, the distal portion of the virtual medical device indicates proximity to a desired location in the patient by way of a visual indicator as the medical device is advanced through the patient.
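Such a visual indicator could, for example, map the tip-to-target distance onto a color ramp. A minimal sketch follows; the thresholds and the green/yellow/red scheme are assumptions, not the patent's indicator.

```python
# Minimal sketch: color the virtual tip by distance to the desired location.
import numpy as np

def proximity_color(tip_mm: np.ndarray, target_mm: np.ndarray) -> str:
    """Map tip-to-target distance to an indicator color."""
    distance = float(np.linalg.norm(tip_mm - target_mm))
    if distance < 10.0:
        return "green"   # at the desired location, e.g. the lower SVC
    if distance < 30.0:
        return "yellow"  # approaching: advance slowly
    return "red"         # still far from the target

tip = np.array([12.0, 4.0, 100.0])    # hypothetical tip position, mm
target = np.array([10.0, 0.0, 95.0])  # hypothetical desired location, mm
print(proximity_color(tip, target))   # -> green (about 6.7 mm away)
```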
Also disclosed herein is an anatomical visualization system that, in some embodiments, includes an ultrasound imaging system and an alternate reality headset. An ultrasound imaging system includes an ultrasound probe and a console. The ultrasound probe is configured to transmit ultrasound signals into a patient and receive echo ultrasound signals from the patient by means of an array of piezoelectric transducers. The console has electronic circuitry including memory and a processor configured to convert the echo ultrasound signals to produce ultrasound image segments corresponding to anatomy of a patient. An alternate reality headset includes a display screen coupled to a frame having electronic circuitry including a memory and a processor. The display screen is configured such that a wearer of the alternate reality headset can see the patient through the display screen. The display screen is configured to display a virtual anatomical object corresponding to an ultrasound image segment on a patient.
In some embodiments, the ultrasound probe is configured with a pulsed wave doppler imaging mode for transmitting and receiving ultrasound signals. The console is configured to capture ultrasound imaging frames according to a pulsed wave doppler imaging mode, stitch the ultrasound imaging frames together using a stitching algorithm, and segment the ultrasound imaging frames or the stitched ultrasound imaging frames into ultrasound image segments using an image segmentation algorithm.
In some embodiments, the console is configured to convert the ultrasound image segments into virtual anatomical objects using a virtualization algorithm. The console is configured to send the virtual anatomical object to the alternate reality headset for display on the patient.
In some embodiments, the alternate reality headset is configured to anchor a virtual anatomical object to a patient on which the virtual anatomical object is displayed.
In some embodiments, the alternate reality headset further includes one or more eye tracking cameras coupled to the frame configured to capture eye movements of the wearer. The processor of the alternate reality headset is configured to process eye movements using an eye movement algorithm to identify a wearer's focus for selecting or augmenting the virtual anatomical object corresponding to the wearer's focus.
In some embodiments, the alternate reality headset further includes one or more patient-facing cameras coupled to the frame configured to capture a pose of the wearer. The processor of the alternate reality headset is configured to process the gesture with a gesture command algorithm to identify a gesture-based command issued by the wearer for execution by the alternate reality headset.
In some embodiments, the alternate reality headset further includes one or more microphones coupled to the frame, which are configured to capture audio of the wearer. The processor of the alternate reality headset is configured to process the audio using an audio command algorithm to identify an audio-based command issued by the wearer for execution by the alternate reality headset.
Also disclosed herein is a method of a medical device placement system, which in some embodiments comprises: transmitting an ultrasound signal into a patient and receiving an echo ultrasound signal from the patient by means of a piezoelectric transducer array of an ultrasound probe; transforming the echo ultrasound signals with a console having electronic circuitry including a memory and a processor to produce ultrasound image segments corresponding to the anatomy of the patient; converting, with the console, magnetic sensor signals from one or more magnetic sensors disposed within a housing of a medical device TLS placed on the chest of the patient into positional information of a magnetized medical device within the patient; and displaying a virtual medical device on the patient, according to the positional information of the medical device within a virtual anatomical object corresponding to the ultrasound image segments, on a see-through display screen of an alternate reality headset having electronic circuitry, including a memory and a processor, in a frame coupled to the display screen.
In some embodiments, the method further comprises: capturing ultrasound imaging frames in a memory of the console according to a pulsed wave Doppler imaging mode of the ultrasound probe while transmitting and receiving the ultrasound signals; stitching the ultrasound imaging frames together using a stitching algorithm; and segmenting the ultrasound imaging frames or the stitched ultrasound imaging frames into the ultrasound image segments using an image segmentation algorithm.
In some embodiments, the method further comprises: converting the ultrasound image segments into virtual anatomical objects using a virtualization algorithm; and sending both the virtual medical device and the virtual anatomical object to the alternate reality headset for display on the patient.
In some embodiments, the method further comprises anchoring the virtual medical device and the virtual anatomical object to a patient on which the virtual medical device and the virtual anatomical object are displayed.
In some embodiments, the method further comprises: capturing eye movements of the wearer in a memory of a console using one or more eye tracking cameras coupled to a frame of the alternate reality headset; and processing the eye movements with an eye movement algorithm to identify the wearer's focus for selecting or enhancing the virtual anatomical object corresponding to the wearer's focus.
In some embodiments, the method further comprises: capturing a pose of the wearer in a memory of a console using one or more patient facing cameras coupled to a frame of the alternate reality headset; and processing the gesture with a gesture command algorithm to identify a gesture-based command issued by the wearer for execution by the alternate reality headset.
In some embodiments, the method further comprises: capturing the wearer's audio in a memory of the console using one or more microphones coupled to a frame of the alternate reality headset; and processing the audio with an audio command algorithm to identify an audio-based command issued by the wearer for execution by the alternate reality headset.
Also disclosed herein is a method of a medical device placement system, which in some embodiments comprises: transmitting an ultrasound signal into a patient and receiving an echo ultrasound signal from the patient by means of a piezoelectric transducer array of an ultrasound probe; transforming the echo ultrasound signals with electronic circuitry, including a memory and a processor, in a frame of an alternate reality headset to produce ultrasound image segments corresponding to the anatomy of the patient; converting, with the alternate reality headset, magnetic sensor signals from one or more magnetic sensors disposed within a housing of a medical device TLS placed on the chest of the patient into positional information of a magnetized medical device within the patient; and displaying for a wearer of the alternate reality headset, on a see-through display screen of the alternate reality headset and over an environment including the patient, a virtual medical device according to the positional information of the medical device within a virtual anatomical object corresponding to the ultrasound image segments, one or more graphical control element windows including output corresponding to one or more processes of the medical device placement system, or both the virtual medical device and the one or more windows.
In some embodiments, the method further comprises: capturing eye movements of the wearer in a memory of the alternate reality headset using one or more eye tracking cameras coupled to a frame of the alternate reality headset; and processing the eye movements with an eye movement algorithm to identify the wearer's focus for selecting or enhancing the virtual medical device, the virtual anatomical object, the one or more windows, or the output in the one or more windows corresponding to the wearer's focus.
In some embodiments, the method further comprises: capturing a pose of the wearer in a memory of the alternate reality headset using one or more patient-facing cameras coupled to a frame of the alternate reality headset; and processing the gesture with a gesture command algorithm to identify a gesture-based command issued by the wearer for execution by the alternate reality headset.
In some embodiments, the method further comprises: enabling the wearer to anchor the virtual medical device, any of the virtual anatomical objects, or any of the one or more windows to a persistent location on the display screen, a persistent location in a frame of reference of the wearer of the alternate reality headset, or a persistent location in the environment.
In some embodiments, the method further comprises: enabling the wearer to transform, within the environment, the virtual medical device, any of the virtual anatomical objects, or any of the one or more windows by translating, rotating, or resizing it.
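These interactions compose naturally as homogeneous transforms applied to an object's anchor pose. A brief sketch, assuming 4x4 matrices and a Z-axis rotation for simplicity:

```python
# Minimal sketch of the translate / rotate / resize interaction above.
import numpy as np

def translate(v):
    t = np.eye(4); t[:3, 3] = v
    return t

def rotate_z(theta_rad):
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    r = np.eye(4); r[:2, :2] = [[c, -s], [s, c]]
    return r

def resize(scale):
    m = np.eye(4); m[:3, :3] *= scale
    return m

# Drag a window 0.2 m to the right, turn it 30 degrees, enlarge it 1.5x.
anchor = np.eye(4)
anchor = translate([0.2, 0.0, 0.0]) @ rotate_z(np.radians(30)) @ resize(1.5) @ anchor
print(np.round(anchor, 3))
```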
These and other features of the concepts provided herein will become more readily apparent to those skilled in the art in view of the drawings and following description which disclose in greater detail specific embodiments of the concepts.
Drawings
Fig. 1 provides a block diagram of an anatomical visualization system, according to some embodiments.
Fig. 2 provides a block diagram of a medical device positioning system according to some embodiments.
Fig. 3 provides a block diagram of a medical device placement system according to some embodiments.
Fig. 4 provides a block diagram of an ultrasound probe connected to a console of an anatomical visualization system, according to some embodiments.
Fig. 5 provides a block diagram of an alternate reality headset of an anatomical visualization system, according to some embodiments.
Fig. 6A illustrates a virtual anatomical object on a patient as seen through a display screen of an alternate reality headset, in accordance with some embodiments.
Fig. 6B illustrates a cross-sectional augmentation of a virtual anatomical object on a patient as seen through a display screen of an alternate reality headset, in accordance with some embodiments.
Fig. 7 provides a block diagram of a medical device detector connected to a console of a medical device positioning system according to some embodiments.
Fig. 8A provides a first medical device detector according to some embodiments.
Fig. 8B provides a first medical device detector surrounding a limb of a patient according to some embodiments.
Fig. 9 provides a second medical device detector surrounding a limb of a patient according to some embodiments.
Figure 10 provides a block diagram of an ultrasound probe and medical device detector connected to a console of a medical device placement system according to some embodiments.
Fig. 11A illustrates a virtual anatomical object and medical device representation on a patient as seen through a display screen of an alternate reality headset, in accordance with some embodiments.
Fig. 11B illustrates a magnified augmentation of a virtual anatomical object and medical device representation on a patient as seen through a display screen of an alternate reality headset according to some embodiments.
Fig. 12 provides a block diagram of a medical device placement system according to some embodiments.
Fig. 13 provides a block diagram of an ultrasound probe and a tip position sensor connected to a console of a medical device placement system, according to some embodiments.
Fig. 14 illustrates a virtual anatomical object and medical device representation on a patient as seen through a display screen of an alternate reality headset, in accordance with some embodiments.
Fig. 15 illustrates a medical device placement system including a stylet, together with a virtual anatomical object and a medical device representation on a patient, as seen through a display screen of an alternate reality headset, in accordance with some embodiments.
Fig. 16 illustrates a block diagram of a wireless medical device placement system without a console according to some embodiments.
Fig. 17 illustrates a wireless medical device placement system including a stylet, together with a medical device representation within a virtual anatomical object on a patient, as seen through a display screen of an alternate reality headset, in accordance with some embodiments.
Fig. 18 illustrates a first representation of a first medical device within a virtual anatomical object on a patient, a second representation of a second medical device on the patient, and a window in an environment beside the patient including an output of a medical device placement system as seen through a display screen of an alternate reality headset, in accordance with some embodiments.
Fig. 19 illustrates a first representation of a first medical device within a virtual anatomical object on a patient, a third representation of a second medical device on the patient, and a window in an environment beside the patient including an output of a medical device placement system, as seen through a display screen of an alternate reality headset, in accordance with some embodiments.
Fig. 20 illustrates a first representation of a first medical device within a virtual anatomical object above a patient, a third representation of a second medical device above the patient, and a window in an environment beside the patient including an output of a medical device placement system as seen through a display screen of an alternate reality headset, in accordance with some embodiments.
Fig. 21 illustrates a first representation of a first medical device within a virtual anatomical object, a third representation of a second medical device, and a window including an output of the medical device placement system, all in an environment away from the patient, as seen through a display screen of an alternate reality headset, in accordance with some embodiments.
Fig. 22A provides a first view of a medical device placement system according to some embodiments.
Fig. 22B provides a second view of the medical device placement system of fig. 22A.
Fig. 22C provides a stylet for use with the medical device placement system of fig. 22A and 22B, according to some embodiments.
Detailed Description
Before disclosing in greater detail some specific embodiments, it should be understood that the specific embodiments disclosed herein do not limit the scope of the concepts provided herein. It should also be understood that particular embodiments disclosed herein may have features that can be readily separated from the particular embodiments and optionally combined with or substituted for the features of any of the various other embodiments disclosed herein.
With respect to the terms used herein, it is also to be understood that these terms are for the purpose of describing some particular embodiments, and that these terms are not intended to limit the scope of the concepts provided herein. Ordinals (e.g., first, second, third, etc.) are generally used to distinguish or identify different features or different steps in a set of features or a set of steps, and do not provide sequence or numerical limitations. For example, "first," "second," and "third" features or steps need not occur in that order, and particular embodiments that include such features or steps need not be limited to those three features or steps. Labels such as "left," "right," "top," "bottom," "front," and "back" are used for convenience and are not intended to imply, for example, any particular fixed position, orientation, or direction. Rather, such labels are used to reflect, for example, relative position, orientation, or direction. The singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise.
For example, a "proximal," "proximal portion," or "proximal end portion" with respect to a medical device such as a catheter includes a portion of the catheter that is intended to be near a clinician when the catheter is used on a patient. Likewise, for example, the "proximal length" of a catheter includes the length of the catheter that is expected to be near the clinician when the catheter is used on a patient. For example, the "proximal end" of a catheter includes the end of the catheter that is intended to be near the clinician when the catheter is used on a patient. The proximal portion, proximal end portion, or proximal length of the catheter may comprise the proximal end of the catheter; however, the proximal portion, or proximal length of the catheter need not include the proximal end of the catheter. That is, unless the context indicates otherwise, the proximal portion, or proximal length of the catheter is not the distal portion or end length of the catheter.
For example, reference to a "distal", "distal portion", or "distal portion" of a medical device such as a catheter disclosed herein includes a portion of the catheter that is intended to be near or in a patient when the catheter is used on the patient. Likewise, for example, the "distal length" of a catheter includes the length of the catheter that is expected to be near or in a patient when the catheter is used on the patient. For example, the "distal end" of a catheter includes an end of the catheter that is intended to be near or in a patient when the catheter is used on the patient. The distal portion, or distal length of the catheter may comprise the distal end of the catheter; however, the distal portion, or distal length of the catheter need not include the distal end of the catheter. That is, unless the context indicates otherwise, the distal portion, or distal length of the catheter is not the end portion or end length of the catheter.
With respect to "alternate reality," alternate reality includes virtual reality, augmented reality, and mixed reality, unless the context indicates otherwise. "virtual reality" includes virtual content in a virtual set, which may be a fantasy or a real-world simulation. "augmented reality" and "mixed reality" include virtual content in a real-world setting. Augmented reality includes virtual content in a real-world setting, but the virtual content need not be anchored in the real-world setting. For example, the virtual content may be information that overlays a real-world setting. The information may change as the real-world setting changes due to changes in time or environmental conditions in the real-world setting, or the information may change as the augmented reality experiencer moves through the real-world setting, but the information remains overlaid over the real-world setting. Mixed reality includes virtual content anchored in each dimension of a real-world setting. For example, the virtual content may be a virtual object anchored in a real-world setting. The virtual object may change as the real world setting changes due to temporal or environmental conditions in the real world setting, or the virtual object may change as the experiencer moves through the real world setting to adapt to the mixed reality experiencer perspective. The virtual object may also change according to any interaction with the experiencer or another real world or virtual agent. The virtual object remains anchored in the real world setting unless the virtual object is moved to another location in the real world setting by the mixed reality experiencer or some other real world or virtual agent. Mixed reality does not exclude the aforementioned information covering the real-world set described with reference to augmented reality.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art.
As mentioned above, the ability to visualize anatomy such as the peripheral vasculature is desirable. Furthermore, the ability to visualize both such anatomy and medical devices such as guidewires and catheters is needed, ultimately making it possible to accurately determine the positions of such medical devices during their placement. Finally, such capabilities should not adversely affect the patient or the clinician.
Disclosed herein are systems and methods for visualizing an anatomy, positioning a medical device, or placing a medical device that address one or more needs such as the foregoing.
Fig. 1 provides a block diagram of an anatomical visualization system 100 according to some embodiments. Fig. 2 provides a block diagram of a medical device positioning system 200 according to some embodiments. Fig. 3 provides a block diagram of a medical device placement system 300 according to some embodiments. Fig. 12 provides a block diagram of a medical device placement system 1200 according to some embodiments.
As shown, the anatomical visualization system 100 includes an ultrasound imaging system 102 and an alternate reality headset 130, wherein the ultrasound imaging system 102 includes a console 110 and an ultrasound probe 120; the medical device positioning system 200 includes a console 210, a medical device detector 240, and optionally the alternate reality headset 130; and the medical device placement system 300 includes a console 310, the ultrasound probe 120, the alternate reality headset 130, and the medical device detector 240. Thus, the medical device placement system 300 is a combination of at least some of the elements of the anatomical visualization system 100 and the medical device positioning system 200. Similar to the medical device placement system 300, the medical device placement system 1200 includes a console 1210, the ultrasound probe 120, and the alternate reality headset 130. Unlike the medical device placement system 300, however, the medical device placement system 1200 does not include the medical device positioning system 200, but rather includes a medical device positioning system having a medical device tip location sensor ("TLS") 1240 in place of the medical device detector 240 (although not shown in a separate figure, the medical device positioning system of the medical device placement system 1200 includes the console 1210, the alternate reality headset 130, and the TLS 1240 serving as the medical device detector). The TLS 1240 is similar to the TLS 50 of the catheter placement system 10 described in WO 2014/062728, the disclosure of which is incorporated by reference in its entirety into the present application. Thus, the medical device placement system 1200 is a combination of at least some elements of the anatomical visualization system 100 and the catheter placement system 10 of WO 2014/062728 (in particular the TLS 50).
Although each of the consoles 110, 210, 310, and 1210 is referred to herein by a different reference number, the consoles 110, 210, 310, and 1210 need not be different consoles. That is, consoles 110, 210, 310, and 1210 may be the same console. For example, the same console may be the console 310 of the medical device placement system 300, where the console 310 is a combination of the console 110 of the anatomical visualization system 100 and the console 210 of the medical device positioning system 200. In view of the foregoing, the components and functionality of console 110 described with reference to anatomical visualization system 100 should be understood to apply to anatomical visualization system 100, medical device placement system 300, or medical device placement system 1200. Similarly, the components and functions of console 210 described with reference to medical device positioning system 200 should be understood to apply to medical device positioning system 200, medical device placement system 300, or medical device placement system 1200.
Nonetheless, in some embodiments of the anatomical visualization system 100, the medical device positioning system 200, the medical device placement system 300, and the medical device placement system 1200, there are no respective consoles 110, 210, 310, and 1210. In such an embodiment, the alternate reality headset 130 or another system component serves as a console or performs its functions (e.g., processing). An example of such a medical device placement system is medical device placement system 1600 of fig. 16.
Anatomical visualization system
Again, fig. 1 provides a block diagram of an anatomical visualization system 100 according to some embodiments.
As shown, the anatomical visualization system 100 includes an ultrasound imaging system 102 and an alternate reality headset 130, wherein the ultrasound imaging system 102 includes a console 110 and an ultrasound probe 120.
Fig. 4 provides a block diagram of an ultrasound probe 120 connected to a console of the anatomical visualization system 100, according to some embodiments.
As shown, the console 110 has electronic circuitry including a memory 412 and one or more processors 414, the one or more processors 414 being configured to convert echo ultrasound signals from a patient using one or more algorithms 416 to produce therefrom ultrasound images and ultrasound image segments corresponding to the anatomy of the patient. The console 110 is configured to capture ultrasound imaging frames (i.e., frame-by-frame ultrasound images) in the memory 412 according to a pulsed wave doppler imaging mode of the ultrasound probe 120, stitch the ultrasound imaging frames together using a stitching algorithm of the one or more algorithms 416, and segment the ultrasound imaging frames or the stitched ultrasound imaging frames into the ultrasound image segments using an image segmentation algorithm of the one or more algorithms 416. The console 110 is configured to convert the ultrasound image segments into virtual anatomical objects using a virtualization algorithm of the one or more algorithms 416. The console 110 is configured to transmit the virtual anatomical objects, by way of a wireless communication interface 418, to the alternate reality headset 130 for display over the patient.
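By way of illustration only, the capture-stitch-segment-virtualize pipeline described above might be organized as in the following sketch. All function names, the side-by-side "stitching," and the thresholding "segmentation" are hypothetical stand-ins for the one or more algorithms 416, not the disclosed algorithms themselves.

```python
import numpy as np

def stitch_frames(frames):
    """Naive stitching sketch: frames from a linear probe sweep are laid
    side by side along the sweep axis (a real stitching algorithm would
    register overlapping frames before compositing)."""
    return np.concatenate(frames, axis=1)

def segment_vessels(image, threshold=0.5):
    """Toy segmentation sketch: threshold the normalized image to produce
    a binary mask of candidate vessel lumina."""
    norm = (image - image.min()) / (np.ptp(image) + 1e-9)
    return norm > threshold

def to_virtual_object(mask, pixel_spacing_mm=(0.2, 0.2)):
    """Virtualization sketch: convert a 2D mask into a point cloud in probe
    coordinates, the kind of geometry a headset could render and anchor."""
    rows, cols = np.nonzero(mask)
    ys = rows * pixel_spacing_mm[0]
    xs = cols * pixel_spacing_mm[1]
    return np.stack([xs, ys, np.zeros_like(xs)], axis=1)

# Pipeline: capture -> stitch -> segment -> virtualize -> transmit.
frames = [np.random.rand(64, 32) for _ in range(4)]  # stand-in captured frames
panorama = stitch_frames(frames)
mask = segment_vessels(panorama)
points = to_virtual_object(mask)  # what would be sent to the headset
```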
The console 110 and its electronic circuitry, including memory 412 and one or more processors 414, may also be configured to convert one or more sets of ECG signals using an ECG algorithm of the one or more algorithms 416 to produce one or more ECGs, respectively. When connected to the console 110, one or more ECG electrodes (such as one or more ECG electrode pads) are configured to generate one or more sets of ECG signals in response to electrical changes associated with depolarization and repolarization of the patient's heart and provide the one or more sets of ECG signals to the console 110.
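As a loose illustration of what an "ECG algorithm" of the one or more algorithms 416 might involve, the sketch below bandpass-filters a raw electrode signal and locates R peaks. The sampling rate, filter band, and peak thresholds are assumptions; the disclosure does not specify this method.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def ecg_from_signal(raw, fs=500.0):
    """Sketch of an ECG algorithm: bandpass the raw electrode signal to
    the usual QRS band and locate R peaks, from which a heart rate (used,
    e.g., to animate a virtual heart) can be derived."""
    b, a = butter(2, [5.0 / (fs / 2), 40.0 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, raw)
    peaks, _ = find_peaks(filtered, distance=int(0.3 * fs),
                          height=0.5 * np.max(filtered))
    rr = np.diff(peaks) / fs                 # R-R intervals in seconds
    bpm = 60.0 / rr.mean() if rr.size else float("nan")
    return filtered, peaks, bpm
```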
The console 110 includes a number of components of the anatomical visualization system 100, and the console 110 can take any of a variety of forms to accommodate the number of components. The one or more processors 414 and memory 412 (e.g., non-volatile memory such as electrically erasable programmable read-only memory [ "EEPROM" ]) of the console 110 are configured to control various functions of the anatomical visualization system 100, such as executing one or more algorithms 416 during operation of the anatomical visualization system 100. A digital controller or analog interface 420 is also included with the console 110, and the digital controller or analog interface 420 communicates with the one or more processors 414 and other system components to manage the interfacing between the probe 120, the alternate reality headset 130, and other system components.
The console 110 further includes one or more ports 422 for connection with additional components, such as one or more ECG electrodes, and optional components 424 including a printer, storage media, a keyboard, etc. The ports 422 may be universal serial bus ("USB") ports, although other ports or port combinations, as well as other interfaces or connections described herein, may also be used. A power connection 426 is included with the console 110 to enable an operable connection to an external power source 428. An internal power source 430 (e.g., a disposable or rechargeable battery) may be used either together with the external power source 428 or instead of it. Power management circuitry 432 is included with the digital controller or analog interface 420 of the console 110 to regulate power usage and distribution.
The display 434 may be, for example, a liquid crystal display ("LCD") integrated into the console 110 and used to display information to the clinician during a procedure. For example, the display 434 may be used to display ultrasound images of an internal target portion of the patient obtained by the probe 120, or one or more ECGs. Alternatively, the display 434 may be separate from the console 110 rather than integrated into it; such a display is different from the display of the alternate reality headset 130. The console 110 may further include a console button interface 436. In conjunction with control buttons on the probe 120, the console button interface 436 allows the clinician to immediately invoke a desired mode on the display 434 for use during the procedure.
The ultrasound probe 120 is configured to transmit ultrasound signals into a patient and receive echo ultrasound signals from the patient by way of a piezoelectric transducer array 438. The ultrasound probe 120 may be configured with continuous wave or pulsed wave imaging modes. For example, the ultrasound probe 120 may be configured with the aforementioned pulsed wave doppler imaging mode for transmitting and receiving the ultrasound signals.
The probe 120 further includes a button and memory controller 440 for managing the operation of the probe 120 and its buttons. The button and memory controller 440 may include a non-volatile memory such as an EEPROM. The button and memory controller 440 is in operable communication with a probe interface 442 of the console 110, which includes a piezoelectric input-output component 444 for interfacing with the piezoelectric transducer array 438 of the probe 120 and a button and memory input-output component 446 for interfacing with the button and memory controller 440 of the probe 120.
Fig. 5 provides a block diagram of an alternate reality earpiece 130 of the anatomical visualization system 100, according to some embodiments.
As shown, the alternate reality headset 130, which may have a goggle-type or mask-type form factor, includes a suitably configured display screen 512 and a window 514 thereover that are coupled to a frame 516, the frame 516 having electronic circuitry including a memory 518 and one or more processors 520. The display screen 512 is configured such that a wearer of the alternate reality headset 130 can see through the display screen 512 an environment (e.g., an operating room) including a patient, according to the opacity of the window 514, which is adjustable using an opacity control 548. The display screen 512 is configured to display over the environment (such as on the patient) a virtual anatomical object corresponding to an ultrasound image segment produced by the console 110 using the image segmentation algorithm (see, e.g., fig. 6A, where the virtual anatomical object corresponds to vasculature in a limb of the patient). When the virtual anatomical object is displayed over the environment, the alternate reality headset 130 may be configured to anchor the virtual anatomical object to the environment in three dimensions (such as to the patient over which the virtual anatomical object is displayed), which allows the wearer of the alternate reality headset 130 to see a realistic representation of the patient's anatomy for one or more subsequent medical procedures (e.g., accessing a vessel and placing a guidewire or a medical device such as a catheter in the vessel). Anchoring the virtual anatomical object to the environment, or to the patient over which the virtual anatomical object is displayed, is a feature of mixed reality.
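The anchoring behavior described above can be illustrated with a minimal pose computation: a virtual object fixed in world coordinates is re-expressed in headset coordinates each frame using the headset's tracked pose. The matrices, names, and the availability of a tracked headset pose are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def make_pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def render_pose_in_headset(anchor_world, headset_world):
    """Mixed-reality anchoring sketch: a virtual object's pose is fixed in
    world coordinates (anchor_world); each frame the renderer needs that
    pose expressed in headset coordinates, so it transforms by the inverse
    of the headset's tracked world pose."""
    return np.linalg.inv(headset_world) @ anchor_world

# The anchor never changes; only the headset pose does as the wearer moves.
anchor = make_pose(np.eye(3), np.array([0.0, -0.4, 1.2]))   # e.g., on the patient
headset = make_pose(np.eye(3), np.array([0.1, 0.0, 0.0]))   # tracked headset pose
object_in_view = render_pose_in_headset(anchor, headset)
```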
The alternate reality headset 130 may further include a perceptual user interface ("PUI") configured to enable a wearer of the alternate reality headset 130 to interact with the alternate reality headset 130 without a physical input device such as a keyboard or mouse. Instead of physical input devices, the PUI may have input devices including, but not limited to, one or more wearer-facing eye-tracking cameras 522, one or more patient-facing cameras 524, one or more microphones 526, or a combination thereof. At least one advantage of the PUI and its input device is that the clinician does not have to reach outside the sterile field to execute commands for the alternate reality headset 130.
With respect to the one or more eye tracking cameras 522, the one or more eye tracking cameras 522 may be coupled to the frame 516 and configured to capture eye movements of the wearer in a camera buffer 534 or the memory 518. The one or more processors 520 of the alternate reality headset 130 may be configured to process the eye movements using an eye movement algorithm of the one or more algorithms 528 to identify a focus of the wearer for selecting the virtual anatomical object or another representation (e.g., a virtual medical device, an outline of a medical device, etc.) corresponding to the focus of the wearer. For example, the focus of the wearer may be used by the PUI to select the virtual anatomical object for enhancement, such as by highlighting the virtual anatomical object or increasing the contrast between the virtual anatomical object and its environment. In another example, the focus of the wearer may be used by the PUI to select the virtual anatomical object for one or more other operations of the PUI, such as enlarging the virtual anatomical object, providing a cross-section of one or more virtual anatomical objects, and so on (see, e.g., fig. 6B, where the virtual anatomical object corresponds to vasculature in a limb of a patient, and where the virtual anatomical object is shown in cross-section).
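A minimal sketch of such gaze-based selection follows: the focus resolved from the eye movements is treated as a ray, and the candidate virtual object nearest that ray (within an angular tolerance) is selected. The function names, the tolerance, and the headset-coordinate convention are assumptions for illustration only.

```python
import numpy as np

def pick_focused_object(gaze_origin, gaze_dir, objects, max_angle_deg=5.0):
    """Gaze-selection sketch: return the object whose center lies closest
    to the gaze ray, within a small angular tolerance. `objects` maps
    names to 3D centers in headset coordinates."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best, best_angle = None, np.radians(max_angle_deg)
    for name, center in objects.items():
        to_obj = center - gaze_origin
        cosang = np.dot(to_obj, gaze_dir) / np.linalg.norm(to_obj)
        angle = np.arccos(np.clip(cosang, -1.0, 1.0))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

focus = pick_focused_object(
    np.zeros(3), np.array([0.0, 0.0, 1.0]),
    {"virtual_vessel": np.array([0.02, -0.01, 0.9]),
     "virtual_catheter": np.array([0.3, 0.2, 1.0])})  # -> "virtual_vessel"
```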
With respect to the one or more patient facing cameras 524, the one or more patient facing cameras 524 may be coupled to the frame 516 and configured to capture gestures of the wearer in the camera buffer 534 or the memory 518. The one or more processors 520 of the alternate reality headset 130 may be configured to process the gestures with a gesture command algorithm of the one or more algorithms 528 to identify a gesture-based command issued by the wearer for execution by the alternate reality headset 130.
With respect to the one or more microphones 526, the one or more microphones 526 may be coupled to the frame 516 and configured to capture audio of the wearer in the memory 518. The one or more processors 520 of the alternate reality headset 130 may be configured to process the audio using an audio command algorithm of the one or more algorithms 528 to identify an audio-based command issued by the wearer for execution by the alternate reality headset 130.
The electronic circuitry includes a processor 520, a memory controller 530 in communication with a memory 518 (e.g., dynamic random access memory [ "DRAM" ]), a camera interface 532, a camera buffer 534, a display driver 536, a display formatter 538, a timing generator 540, a display output interface 542, and a display input interface 544. These components may communicate with each other via the processor 520, a dedicated line of one or more buses, or a combination thereof.
The camera interface 532 is configured to provide an interface to the one or more eye tracking cameras 522 and the one or more patient facing cameras 524, and to store the respective images received from the cameras 522, 524 in the camera buffer 534 or the memory 518. Each camera of the one or more eye tracking cameras 522 may be an infrared ("IR") camera or a position-sensitive detector ("PSD") configured to track eye-glint positions by IR reflections or glint position data, respectively.
The display driver 536 is configured to drive the display screen 512. The display formatter 538 is configured to provide display formatting information for the virtual anatomical object to the one or more processors 414 of the console 110 for formatting the virtual anatomical object for display over the environment (e.g., on the patient) on the display screen 512. The timing generator 540 is configured to provide timing data for the alternate reality headset 130. The display output interface 542 includes a buffer for providing images from the one or more eye tracking cameras 522 or the one or more patient facing cameras 524 to the one or more processors 414 of the console 110. The display input interface 544 includes a buffer for receiving images, such as the virtual anatomical object, to be displayed on the display screen 512. The display output interface 542 and the display input interface 544 are configured to communicate with the console 110 by way of a wireless communication interface 546. The opacity control 548 is configured to change the degree of opacity of the window 514.
Additional electronic circuitry includes a voltage regulator 550, an eye tracking illumination driver 552, an audio digital-to-analog converter ("DAC") and amplifier 554, a microphone preamplifier and audio analog-to-digital converter ("ADC") 556, a temperature sensor interface 558, and a clock generator 560. The voltage regulator 550 is configured to receive power from an internal power source 562 (e.g., a battery) or an external power source 564 over a power connection 566. The voltage regulator 550 is configured to provide the received power to the electronic circuitry of the alternate reality headset 130. The eye tracking illumination driver 552 is configured to control an eye tracking illumination unit 568, by means of a drive current or voltage, to operate at approximately a predetermined wavelength or within a predetermined wavelength range. The audio DAC and amplifier 554 is configured to provide audio data to an earpiece or speaker 570. The microphone preamplifier and audio ADC 556 is configured to provide an interface for the one or more microphones 526. The temperature sensor interface 558 is configured to interface with a temperature sensor 572. Further, the alternate reality headset 130 may include orientation sensors including a three-axis magnetometer 574, a three-axis gyroscope 576, and a three-axis accelerometer 578 configured to provide orientation-sensor data for determining the orientation of the alternate reality headset 130 at any given time. Furthermore, the alternate reality headset 130 may include a global positioning system ("GPS") receiver 580 configured to receive GPS data (e.g., time and location information for one or more GPS satellites) for determining the location of the alternate reality headset 130 at any given time.
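As one illustration of how such orientation sensors might be combined, the sketch below derives roll and pitch from the three-axis accelerometer and a tilt-compensated heading from the three-axis magnetometer; an actual headset would additionally fuse the gyroscope 576 (e.g., with a complementary or Kalman filter). The axis convention and variable names are assumptions.

```python
import numpy as np

def orientation_from_imu(accel, mag):
    """Orientation sketch: roll and pitch from gravity (accelerometer),
    then a tilt-compensated heading from the magnetic field. Assumes an
    x-forward, y-right, z-down body frame."""
    ax, ay, az = accel / np.linalg.norm(accel)
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    mx, my, mz = mag / np.linalg.norm(mag)
    # De-rotate the magnetometer reading into the horizontal plane.
    mxh = (mx * np.cos(pitch)
           + my * np.sin(pitch) * np.sin(roll)
           + mz * np.sin(pitch) * np.cos(roll))
    myh = my * np.cos(roll) - mz * np.sin(roll)
    yaw = np.arctan2(-myh, mxh)
    return np.degrees([roll, pitch, yaw])

angles = orientation_from_imu(np.array([0.0, 0.0, 9.81]),
                              np.array([20e-6, 0.0, 45e-6]))
```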
Medical equipment positioning system
Again, fig. 2 provides a block diagram of a medical device positioning system 200 according to some embodiments.
As shown, the medical device positioning system 200 includes the console 210, the medical device detector 240 including a magnetic sensor array 242, and optionally the alternate reality headset 130.
Fig. 7 provides a block diagram of a medical device detector 240 connected to the console 210 of the medical device positioning system 200, according to some embodiments.
As shown, the console 210 has electronic circuitry including a memory 712 and one or more processors 714, the one or more processors 714 being configured to utilize one or more algorithms 716 (e.g., position finding algorithms including, for example, triangulation) to convert the magnetic sensor signals from the magnetic sensor array 242 into position information of a magnetized medical device (e.g., a catheter including magnetic elements) within a limb of a patient when the medical device detector 240 is placed around the limb of the patient.
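One plausible form of such a position finding algorithm is a least-squares fit of a magnetic dipole model to the array's readings, sketched below. The disclosure does not commit to this model; the fixed, known magnetic moment, the noise-free readings, and the function names are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def dipole_field(sensor_pos, dipole_pos, moment):
    """Magnetic field of a point dipole evaluated at each sensor position
    (sensor_pos is an (N, 3) array; returns an (N, 3) array)."""
    r = sensor_pos - dipole_pos
    d = np.linalg.norm(r, axis=1, keepdims=True)
    return MU0 / (4 * np.pi) * (3 * r * (r @ moment)[:, None] / d**5
                                - moment / d**3)

def locate_device(sensor_pos, readings, moment, x0=np.zeros(3)):
    """Position-finding sketch: fit the dipole position that best explains
    the magnetic sensor array's readings. A practical system would also
    fit the moment orientation and handle noise and outliers."""
    def residual(p):
        return (dipole_field(sensor_pos, p, moment) - readings).ravel()
    return least_squares(residual, x0).x
```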
The console 210 includes a number of components of the medical device positioning system 200, and the console 210 may take any of a variety of forms to accommodate the number of components. The one or more processors 714 and the memory 712 (e.g., non-volatile memory such as EEPROM) of the console 210 are configured to control various functions of the medical device positioning system 200, such as executing the one or more algorithms 716 during operation of the medical device positioning system 200. A digital controller or analog interface 720 is also included with the console 210, and the digital controller or analog interface 720 communicates with the one or more processors 714 and other system components to manage the interfacing between the medical device detector 240, the alternate reality headset 130, and other system components. The console 210 may also be configured with a wireless communication interface 718 to send to the alternate reality headset 130 the position information of the magnetized medical device within the limb of the patient, or a medical device representation (e.g., an outline of the medical device, a virtual medical device, etc.) derived from the position information, for display on the display screen 512 of the alternate reality headset 130 (see, e.g., figs. 11A and 11B, where the virtual anatomical object corresponds to vasculature in the limb of the patient, and where a representation of a medical device such as a guidewire or catheter is advanced through the vasculature).
The console 210 further includes one or more ports 722 for connection with the medical device detector 240 and additional optional components such as a magnetic field generator 740, a printer, storage media, or a keyboard. The ports 722 may be universal serial bus ("USB") ports, although other ports or port combinations, as well as other interfaces or connections described herein, may also be used. A power connection 726 is included with the console 210 to enable an operable connection with an external power source 728. An internal power source 730 (e.g., a disposable or rechargeable battery) may be used either together with the external power source 728 or instead of it. Power management circuitry 732 is included with the digital controller or analog interface 720 of the console 210 to regulate power usage and distribution.
The display 734 may be, for example, an LCD integrated into the console 210 and used to display information to the clinician during a procedure. For example, the display 734 may be used to display the position information of the magnetized medical device within the limb of the patient, or to depict a medical device representation (e.g., an outline of the medical device, a virtual medical device, etc.) derived from the position information. Alternatively, the display 734 may be separate from the console 210 rather than integrated into it; such a display, while different from the display of the alternate reality headset 130, may likewise be configured to display the position information (e.g., as a position-information overlay) or to depict the medical device representation. The console 210 may further include a console button interface 736. The clinician may immediately invoke a desired mode (e.g., a mode with the magnetic field generator 740, a mode without the magnetic field generator 740, etc.) on the display 734 using the console button interface 736 for use during the procedure.
Fig. 8A provides a first medical device detector 800 according to some embodiments. Fig. 8B provides a first medical device detector 800 surrounding a limb of a patient according to some embodiments. Fig. 9 provides a second medical device detector 900 surrounding a limb of a patient according to some embodiments.
As shown, each of the first medical device detector 800 and the second medical device detector 900 includes the magnetic sensor array 242 embedded within a housing 810, 910 configured for placement around a limb (e.g., an arm or a leg) of a patient. When the medical device detector 800, 900 is placed around the limb of the patient, the console 210 is configured to utilize the one or more algorithms 716 (e.g., the position finding algorithms) to convert the magnetic sensor signals from the magnetic sensor array 242 into the position information of the magnetized medical device within the limb of the patient, or a medical device representation (e.g., an outline of the medical device, a virtual medical device, etc.) derived from the position information.
The housing 810 of the first medical device detector 800 is a rigid frame. Each magnetic sensor in the magnetic sensor array 242 embedded within the frame has a fixed spatial relationship with every other magnetic sensor. The fixed spatial relationship is communicated to the console 210 when the first medical device detector 800 is connected to one of the ports 722 of the console 210, or when one or more modes for using the first medical device detector 800 without the magnetic field generator 740 are invoked with the console button interface 736 of the console 210. Using the fixed spatial relationship of the magnetic sensor array 242 in the first medical device detector 800, the console 210 is able to convert the magnetic sensor signals from the magnetic sensor array 242 into the position information of the magnetized medical device within the limb of the patient, or a medical device representation (e.g., an outline of the medical device, a virtual medical device, etc.) derived from the position information.
The housing 810 of the first medical device detector 800 may further include one or more light-emitting diodes ("LEDs") or lasers embedded within the frame, such as within the posts 812 of the frame. The one or more LEDs or lasers may be configured to illuminate the limb of the patient around which the first medical device detector 800 is placed, or to illuminate only a portion of the limb. The portion of the limb may be, for example, the portion beneath which the tip of the medical device lies within the limb (see, e.g., fig. 8B, where an "X" indicates that only the portion of the limb beneath which the tip of the medical device lies is illuminated). As such, the one or more LEDs or lasers may act as a real-world, light-based pointing system for identifying the location of the medical device. The light-based pointing system may be used in conjunction with the alternate reality headset 130 for confirming the location of the medical device because the illumination provided by the light-based pointing system is visible through the see-through display screen 512 of the alternate reality headset 130.
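A light-based pointing behavior of this kind reduces to choosing which LED(s) to drive given the tip position reported by the position finding algorithm. The sketch below assumes LEDs arranged along the limb axis; all names and tolerances are hypothetical.

```python
def leds_to_light(tip_x_mm, led_positions_mm, tolerance_mm=15.0):
    """Illuminate only the LED(s) nearest the tip position along the limb,
    marking the portion of the limb beneath which the tip lies."""
    return [i for i, x in enumerate(led_positions_mm)
            if abs(x - tip_x_mm) <= tolerance_mm]

# e.g., LEDs every 25 mm along a post of the frame:
on = leds_to_light(tip_x_mm=110.0,
                   led_positions_mm=[0, 25, 50, 75, 100, 125])  # -> [4, 5]
```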
The housing 910 of the second medical device detector 900 is a drape. Each magnetic sensor in the magnetic sensor array 242 embedded within the drape has a variable spatial relationship with every other magnetic sensor, depending on how the drape is placed around the limb of the patient. For this reason, the medical device positioning system 200 may further include the magnetic field generator 740 configured to generate a magnetic field around the second medical device detector 900 for determining the spatial relationship of each magnetic sensor to every other magnetic sensor in the magnetic sensor array 242. The identity of each magnetic sensor present in the magnetic sensor array 242 is communicated to the console 210 when the second medical device detector 900 is connected to one of the ports 722 of the console 210, or when one or more modes for using the second medical device detector 900 with the magnetic field generator 740 are invoked with the console button interface 736 of the console 210. With each magnetic sensor of the magnetic sensor array 242 known, the console 210 is configured to determine the spatial relationship of each magnetic sensor to every other magnetic sensor from the magnetic sensor signals generated by the magnetic sensor array 242 in the presence of the generated magnetic field. This is made possible in part because each magnetic sensor in the magnetic sensor array 242 is in a magnetic environment that is unique at least in terms of the strength and orientation of the generated magnetic field. Using the determined spatial relationship of the magnetic sensor array 242 in the second medical device detector 900, the console 210 can convert the magnetic sensor signals from the magnetic sensor array 242 into the position information of the magnetized medical device within the limb of the patient, or a medical device representation (e.g., an outline of the medical device, a virtual medical device, etc.) derived from the position information. To ensure accuracy, the determined spatial relationship of the magnetic sensor array 242 may be periodically confirmed in the presence of a newly generated magnetic field, taking into account the medical device within the limb of the patient.
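Determining the variable spatial relationship amounts to inverting a known model of the generated field at each sensor, as in the hedged sketch below. The linear gradient field, single-reading solvability, and all names are illustrative assumptions; a practical system would use a richer field model and multiple excitations.

```python
import numpy as np
from scipy.optimize import least_squares

def calibrate_sensor_position(field_model, reading, x0):
    """Calibration sketch for the drape-style detector: given a known model
    of the generated field (field_model(p) -> 3-vector at point p) and one
    sensor's reading, solve for the position at which the model reproduces
    the reading. Repeating this per sensor recovers the variable spatial
    relationship of the array. Uniqueness relies on the generated field
    differing from point to point, as noted above."""
    return least_squares(lambda p: field_model(p) - reading, x0).x

# Hypothetical example: a simple gradient field B(p) = G @ p + B0.
G = np.diag([2.0, -1.0, -1.0]) * 1e-6          # field gradient (T/m)
B0 = np.array([0.0, 0.0, 50e-6])               # offset field (T)
field = lambda p: G @ p + B0
true_pos = np.array([0.10, -0.05, 0.20])
estimate = calibrate_sensor_position(field, field(true_pos), np.zeros(3))
```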
Medical device placement system
Again, fig. 3 provides a block diagram of a medical device placement system 300 according to some embodiments.
As shown, the medical device placement system 300 may include the ultrasound probe 120 of the anatomical visualization system 100; the medical device detector 240, including the magnetic sensor array 242, of the medical device positioning system 200; the alternate reality headset 130; and a console 310 including electronic circuitry similar to that of both the console 110 and the console 210.
Fig. 10 provides a block diagram of the ultrasound probe 120 and the medical device detector 240 connected to the console 310 of the medical device placement system 300, according to some embodiments.
As shown, the console 310 has electronic circuitry including a memory 1012 and one or more processors 1014. Similar to the console 110, the console 310 is configured to convert echo ultrasound signals from a patient using one or more algorithms 1016 to produce therefrom ultrasound images and ultrasound image segments corresponding to the anatomy of the patient. The console 310 is configured to capture ultrasound imaging frames in the memory 1012 according to the pulsed wave doppler imaging mode of the ultrasound probe 120, stitch the ultrasound imaging frames together using a stitching algorithm of the one or more algorithms 1016, and segment the ultrasound imaging frames or the stitched ultrasound imaging frames into the ultrasound image segments using an image segmentation algorithm of the one or more algorithms 1016. The console 310 is configured to convert the ultrasound image segments into virtual anatomical objects using a virtualization algorithm of the one or more algorithms 1016. Also similar to the console 110, the console 310 and its electronic circuitry, including the memory 1012 and the one or more processors 1014, may be configured to convert one or more sets of ECG signals using an ECG algorithm of the one or more algorithms 1016 to produce one or more ECGs, respectively. When connected to the console 310, one or more ECG electrodes 1050 (such as one or more ECG electrode pads), a stylet 1752 (see fig. 15), or a combination thereof are configured to generate one or more sets of ECG signals in response to electrical changes associated with depolarization and repolarization of the patient's heart, and to provide the one or more sets of ECG signals to the console 310.
Similar to the console 210, the console 310 is configured to utilize the one or more algorithms 1016 (e.g., a position finding algorithm) to convert the magnetic sensor signals from the magnetic sensor array 242 into position information of a magnetized medical device within a limb of the patient when the medical device detector 240 is placed around the limb of the patient. The console 310 is configured to transmit, by way of a wireless communication interface 1018, both the virtual anatomical object and a medical device representation (e.g., an outline of the medical device, a virtual medical device, etc.) derived from the position information of the medical device within the limb of the patient to the alternate reality headset 130 for display over the environment or the patient on the display screen 512 of the alternate reality headset 130. In displaying the virtual anatomical object and the medical device representation, the alternate reality headset 130 may be configured to anchor the virtual anatomical object and the medical device representation to the environment or the patient, which is a feature of mixed reality.
The console 310 includes a number of components of the medical device placement system 300, and the console 310 can take any of a variety of forms to accommodate the number of components. The one or more processors 1014 and memory 1012 (e.g., non-volatile memory such as EEPROM) of the console 310 are configured to control various functions of the medical device placement system 300, such as executing one or more algorithms 1016 during operation of the medical device placement system 300. A digital controller or analog interface 1020 is also included with the console 310, and the digital controller or analog interface 1020 communicates with the one or more processors 1014 and other system components to manage the interfacing between the probe 120, the medical device detector 240, the alternate reality headset 130, and other system components.
The console 310 further includes one or more ports 1022 for connection with the medical device detector 240 and additional optional components, such as the magnetic field generator 740, the one or more ECG electrodes 1050 (including the ECG electrode of the stylet 1752; see fig. 15), or the optional components 424 (e.g., a printer, storage media, a keyboard, etc.). The ports 1022 may be USB ports, although other ports or port combinations, as well as other interfaces or connections described herein, may also be used. A power connection 1026 is included with the console 310 to enable an operable connection with an external power source 1028. An internal power source 1030 (e.g., a disposable or rechargeable battery) may be used either together with the external power source 1028 or instead of it. Power management circuitry 1032 is included with the digital controller or analog interface 1020 of the console 310 to regulate power usage and distribution.
As an alternative to connecting one or more ECG electrodes 1050 to one or more ports 1022 of console 310, medical device detector 240 is configured with one or more ports or supplemental connectors for connecting one or more ECG electrodes 1050 (including ECG electrodes of stylet 1752).
The display 1034 may be, for example, an LCD integrated into the console 310 and used to display information to the clinician during a procedure. For example, the display 1034 may be used to display ultrasound images of an internal target portion of the patient obtained by the probe 120, one or more ECGs, the position information of a medical device within a limb of the patient, or a medical device representation (e.g., an outline of the medical device, a virtual medical device, etc.) derived from the position information. Alternatively, the display 1034 may be separate from the console 310 rather than integrated into it; such a display, while different from the display of the alternate reality headset 130, may likewise be configured to display the virtual anatomical object and the medical device representation of the medical device within the limb of the patient.
The console 310 may further include a console button interface 1036. In conjunction with control buttons on the probe 120, the clinician may immediately invoke a desired ultrasound imaging mode (e.g., the continuous wave imaging mode or the pulsed wave imaging mode) on the display 1034 using the console button interface 1036 for use during a procedure. The clinician may likewise immediately invoke a desired medical device positioning mode (e.g., a mode with the magnetic field generator 740, a mode without the magnetic field generator 740, etc.) on the display 1034 using the console button interface 1036.
With respect to the ultrasound probe 120 and the alternate reality headset 130 of the medical device placement system 300, reference should be made to the description of the ultrasound probe 120 and the alternate reality headset 130 provided for the anatomical visualization system 100. With respect to the medical device detector 240 and the magnetic field generator 740 of the medical device placement system 300, reference should be made to the description of the medical device detector 240 and the magnetic field generator 740 provided for the medical device positioning system 200.
Again, fig. 12 provides a block diagram of a medical device placement system 1200 according to some embodiments.
As shown, the medical device placement system 1200 may include the ultrasound probe 120 and the alternate reality headset 130 of the anatomical visualization system 100, as well as a console 1210 including electronic circuitry similar to that of the console 110. Further, the medical device placement system 1200 includes the TLS 1240, which is similar to the TLS 50 of the catheter placement system 10 described in WO 2014/062728, the disclosure of which is incorporated by reference in its entirety into the present application.
Fig. 13 provides a block diagram of an ultrasound probe 120 and TLS 1240 connected to a console 1210 of a medical device placement system 1200, according to some embodiments.
As shown, the console 1210 has electronic circuitry including a memory 1312 and one or more processors 1314. Similar to the console 110, the console 1210 is configured to convert echo ultrasound signals from a patient using one or more algorithms 1316 to produce therefrom ultrasound images and ultrasound image segments corresponding to the anatomy of the patient. The console 1210 is configured to capture ultrasound imaging frames in the memory 1312 according to the pulsed wave doppler imaging mode of the ultrasound probe 120, stitch the ultrasound imaging frames together using a stitching algorithm of the one or more algorithms 1316, and segment the ultrasound imaging frames or the stitched ultrasound imaging frames into the ultrasound image segments using an image segmentation algorithm of the one or more algorithms 1316. The console 1210 is configured to convert the ultrasound image segments into virtual anatomical objects using a virtualization algorithm of the one or more algorithms 1316. Also similar to the console 110, the console 1210 and its electronic circuitry, including the memory 1312 and the one or more processors 1314, may be configured to convert one or more sets of ECG signals using an ECG algorithm of the one or more algorithms 1316 to produce one or more ECGs, respectively. When connected to the console 1210, one or more ECG electrodes 1050 (such as one or more ECG electrode pads), a stylet 1752 (see fig. 15), or a combination thereof are configured to generate one or more sets of ECG signals in response to electrical changes associated with depolarization and repolarization of the patient's heart, and to provide the one or more sets of ECG signals to the console 1210.
Similar to the console 210, the console 1210 is configured to utilize the one or more algorithms 1316 (e.g., a position finding algorithm) to convert TLS signals (e.g., magnetic sensor signals from one or more magnetic sensors 1242 disposed in a fixed spatial relationship in a housing of the TLS 1240) into position information of a magnetized medical device (e.g., a peripherally inserted central catheter ["PICC"]) within the patient when the TLS 1240 is placed on the patient's chest. The console 1210 is configured to transmit, by way of a wireless communication interface 1318, both the virtual anatomical object and a medical device representation (e.g., an outline of the medical device, a virtual medical device, etc.) derived from the position information of the medical device within the patient to the alternate reality headset 130 for display over the environment or the patient on the display screen 512 of the alternate reality headset 130.
Fig. 14 illustrates a virtual anatomical object and a medical device representation on a patient as seen through the display screen 512 of the alternate reality headset 130, in accordance with some embodiments. Fig. 15 illustrates the medical device placement system 1200 including a stylet 1752, as well as a virtual anatomical object and a medical device representation on a patient as seen through the display screen 512 of the alternate reality headset 130, according to some embodiments.
When the virtual anatomical object and the medical device representation are displayed over the environment or the patient on the display screen 512, the alternate reality headset 130 may be configured to anchor the virtual anatomical object and the medical device representation to the environment or the patient, which is a feature of mixed reality. With respect to a medical device such as a guidewire or catheter 1460 (e.g., a PICC), the virtual anatomical object displayed on the display screen 512 may be limited to the circulatory system of the patient (such as the vasculature of figs. 6A and 6B) using the doppler imaging mode of the ultrasound probe 120, and the guidewire or catheter 1460 may be displayed within the vasculature of the patient on the display screen 512. The medical device placement system 1200 may be configured to track and display the advancement of the guidewire or catheter 1460 along the superior vena cava ("SVC") of the patient up to a desired location at least proximate to the sinoatrial node in the right atrium of the patient's heart, including displaying the placement of the medical device representation (e.g., the guidewire or catheter 1460) in the virtual anatomical object (e.g., the virtual SVC), as shown in fig. 14. As the guidewire or catheter 1460 is advanced toward the desired location within the patient, the distal portion of the medical device representation may indicate the proximity of the guidewire or catheter 1460 to the desired location by way of a visual indicator (e.g., a color gradient, blinking speed, luminous intensity, etc.) of the medical device representation. Because the console 1210 is configured with the ports 1322, the one or more ECG electrodes 1050, such as the ECG electrode of the stylet 1752 (which is disposed in a distal portion of the stylet 1752), can be connected to the console 1210. Whether the stylet 1752 is inserted into a lumen of a medical device (such as the catheter 1460) and advanced through the SVC to near the sinoatrial node of the patient's heart, one or more ECG electrode pads are adhered to various locations on the patient's body, or a combination thereof, the one or more ECG electrodes 1050 can provide one or more sets of ECG signals to the console 1210 when connected thereto. When the patient's heart is displayed as a virtual anatomical object on the display screen 512 of the alternate reality headset 130, the one or more sets of ECG signals may be used to animate the heart. Animating the heart of the patient includes animating the heartbeat of the heart, as shown in fig. 14.
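The visual proximity indicator might, for instance, map the distance between the tracked tip and the desired location onto a color gradient and blink rate, as in the following sketch. The mapping, ranges, and names are illustrative assumptions rather than the disclosed behavior.

```python
import numpy as np

def proximity_indicator(tip_pos, target_pos, full_range_mm=100.0):
    """Visual-indicator sketch: map tip-to-target distance onto a
    red-to-green color gradient and a blink rate, as one way the distal
    portion of a medical device representation might signal proximity
    to the desired location."""
    dist = float(np.linalg.norm(np.asarray(tip_pos) - np.asarray(target_pos)))
    closeness = max(0.0, 1.0 - dist / full_range_mm)   # 0 far .. 1 at target
    color = (1.0 - closeness, closeness, 0.0)          # RGB: red -> green
    blink_hz = 0.5 + 4.5 * closeness                   # blink faster nearby
    return color, blink_hz

color, blink_hz = proximity_indicator([10.0, 0.0, 0.0], [0.0, 0.0, 0.0])
```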
The console 1210 includes a number of components of the medical device placement system 1200, and the console 1210 can take any of a variety of forms to accommodate the number of components. The one or more processors 1314 and memory 1312 of the console 1210 (e.g., non-volatile memory such as EEPROM) are configured to control various functions of the medical device placement system 1200, such as executing one or more algorithms 1316 during operation of the medical device placement system 1200. A digital controller or analog interface 1320 is also included with the console 1210, and the digital controller or analog interface 1320 communicates with the one or more processors 1314 and other system components to manage the interfacing between the probe 120, the TLS 1240, the alternate reality headset 130, and other system components.
The console 1210 further includes a port 1322 for connection with the TLS 1240 and additional optional components, such as one or more ECG electrodes 1050 of a stylet 1752 (see fig. 15) or optional components 424 (e.g., printer, storage medium, keyboard, etc.). The port 1322 may be a universal USB port, but other ports or port combinations and other interfaces or connections described herein may also be used. A power connection 1326 is included with the console 1210 to enable operative connection with an external power source 1328. An internal power source 1330 (e.g., a disposable or rechargeable battery) may also be used with external power source 1328 or exclusive of external power source 1328 alone. Power management circuitry 1332 is included with a digital controller or analog interface 1320 of console 1210 to regulate power usage and distribution.
As an alternative to connecting the one or more ECG electrodes 1050 to the one or more ports 1322 of the console 1210, the TLS 1240 is configured with one or more ports or supplemental connectors for connecting the one or more ECG electrodes 1050 (including the ECG electrode of the stylet 1752). The one or more supplemental connectors of the TLS 1240 are similar to the supplemental connectors of the TLS 50 of the catheter placement system 10 described in WO 2010/030820, the disclosure of which is incorporated by reference in its entirety. For example, the TLS 1240 can be configured to connect to the stylet 1752 through a sterile drape that separates a sterile field including the stylet 1752 from a non-sterile field including the TLS 1240. In certain embodiments, the TLS 1240 is configured to communicate with the console 1210 over a wired connection to one of the ports 1322 of the console 1210, or to communicate wirelessly with the alternate reality headset 130.
The display 1334 may be, for example, an LCD integrated into the console 1210 and used to display information to the clinician during a procedure. For example, the display 1334 may be used to display ultrasound images of an internal target portion of the patient obtained by the probe 120, one or more ECGs, the position information of the medical device within the patient, or a medical device representation (e.g., an outline of the medical device, a virtual medical device, etc.) derived from the position information. Alternatively, the display 1334 may be separate from the console 1210 rather than integrated into it; such a display, while different from the display of the alternate reality headset 130, may likewise be configured to display the virtual anatomical object and the medical device representation.
The console 1210 may further include a console button interface 1336. In conjunction with control buttons on the probe 120, the clinician may immediately invoke a desired ultrasound imaging mode (e.g., the continuous wave imaging mode or the pulsed wave imaging mode) on the display 1334 using the console button interface 1336 for use during a procedure. The clinician may likewise immediately invoke a desired medical device positioning mode on the display 1334 using the console button interface 1336.
With respect to the ultrasound probe 120 and the alternate reality headset 130 of the medical device placement system 1200, reference should be made to the description of the ultrasound probe 120 and the alternate reality headset 130 provided for the anatomical visualization system 100. With respect to the TLS 1240 of the medical device placement system 1200, reference should be made to the description of the TLS 50 of the catheter placement system 10 described in WO 2014/062728, the disclosure of which is incorporated herein by reference in its entirety.
As described above, the alternate reality headset 130 may be used as a console 110, 210, 310, or 1210 in the anatomical visualization system 100, medical device positioning system 200, medical device placement system 300, or medical device placement system 1200, respectively, or perform its functions (e.g., processing) at least when the console 110, 210, 310, or 1210 is not present in its respective system. Thus, it should be understood that in such an embodiment, the alternate reality headset 130 further includes the necessary electronic circuitry, algorithms, etc. set forth above to serve as the console 110, 210, 310, or 1210.
Fig. 16 illustrates a block diagram of a wireless medical device placement system 1600 without a console according to some embodiments. Fig. 17 illustrates a representation of a wireless medical device placement system 1600 including a stylet 1752 on a patient as seen through the display screen 512 of an alternate reality headset 130, and a medical device (e.g., PICC) within a virtual anatomical object (e.g., SVC), according to some embodiments. Reference numerals of components of the anatomical visualization system 100, the medical device positioning system 200, the medical device placement system 300, or the medical device placement system 1200 common to the medical device placement system 1600 are retained in the description set forth below with respect to the medical device placement system 1600 for direct reference to the description set forth above. However, it should be understood that certain components (e.g., the ultrasound probe 120, the medical device detector 240, the TLS 1240, etc.) further include a wireless communication interface and electronic circuitry to support wireless communication with the alternate reality headset 130 in addition to the description set forth above for such components.
As shown, the medical device placement system 1600 can include the ultrasound probe 120 of the anatomical visualization system 100; the medical device detector 240 of the medical device positioning system 200 or the TLS 1240 of the medical device placement system 1200; and the alternate reality headset 130, which is configured to communicate wirelessly with the ultrasound probe 120 and the medical device detector 240 or the TLS 1240. The medical device placement system 1600 optionally further includes the one or more ECG electrodes 1050, including the ECG electrode of the stylet 1752.
Similar to the consoles 110, 310, and 1210, the frame 516 of the alternate reality headset 130 has electronic circuitry including memory and a processor (e.g., the memory 518 and the one or more processors 520) configured to convert echo ultrasound signals from a patient using the one or more algorithms 528 to produce therefrom ultrasound images and ultrasound image segments corresponding to the anatomy of the patient. The alternate reality headset 130 is configured to capture ultrasound imaging frames (i.e., frame-by-frame ultrasound images) in the memory 518 according to an imaging mode of the ultrasound probe 120, stitch the ultrasound imaging frames together using a stitching algorithm of the one or more algorithms 528, and segment the ultrasound imaging frames or the stitched ultrasound imaging frames into the ultrasound image segments using an image segmentation algorithm of the one or more algorithms 528. The alternate reality headset 130 is configured to convert the ultrasound image segments into virtual anatomical objects, using a virtualization algorithm of the one or more algorithms 528, for display on the display screen 512 over the environment including the patient. Also similar to the consoles 110, 310, and 1210, the alternate reality headset 130 and its electronic circuitry, including the memory 518 and the one or more processors 520, may be configured to convert one or more sets of ECG signals using an ECG algorithm of the one or more algorithms 528 to produce one or more ECGs, respectively. When connected to the TLS 1240, the one or more ECG electrodes 1050 (such as one or more ECG electrode pads), the stylet 1752 (see fig. 15), or a combination thereof are configured to generate one or more sets of ECG signals in response to electrical changes associated with depolarization and repolarization of the patient's heart, and to provide the one or more sets of ECG signals to the alternate reality headset 130 by way of the TLS 1240.
Similar to the console 1210, the electronic circuitry of the framework 516 including the memory 518 and the one or more processors 520 is configured to utilize one or more algorithms 528 (e.g., location finding algorithms) to convert TLS signals (e.g., magnetic sensor signals from one or more magnetic sensors 1242 disposed in a fixed spatial relationship in the housing of the TLS 1240) into location information of a magnetized medical device (e.g., PICC) within the patient for displaying a medical device representation (e.g., an outline of the medical device, a virtual medical device, etc.) on the display screen 512 of the alternate reality headset 130 based on the location information. In conjunction with the virtual anatomical object corresponding to the ultrasound image segment, a medical device representation may be displayed within the virtual anatomical object on the display screen 512 according to the position information of the medical device.
Fig. 18 illustrates a first representation 1860 of a first medical device within a virtual anatomical object on a patient, a second representation 1840 of a second medical device on the patient, and a window 1870 in an environment beside the patient including an output of the medical device placement system 1600, as seen through the display screen 512 of the alternate reality headset 130, according to some embodiments. Fig. 19 illustrates a first representation 1860 of a first medical device within a virtual anatomical object on a patient, a third representation 1842 of a second medical device on the patient, and a window 1870 in an environment beside the patient including an output of the medical device placement system 1600, as seen through the display screen 512 of the alternate reality headset 130, according to some embodiments. Fig. 20 illustrates a first representation 1860 of a first medical device within a virtual anatomical object above a patient, a third representation 1842 of a second medical device above the patient, and a window 1870 in an environment beside the patient including an output of the medical device placement system 1600, as seen through the display screen 512 of the alternate reality headset 130, according to some embodiments. Fig. 21 illustrates a first representation 1860 of a first medical device within a virtual anatomical object in an environment remote from a patient, a third representation 1842 of a second medical device in the environment remote from the patient, and a window 1870 in the environment remote from the patient including an output of the medical device placement system 1600, as viewed through the display 512 of the alternate reality headset 130, according to some embodiments.
With respect to the medical device representations shown in figs. 18 and 19, the first representation 1860 of the first medical device is a virtual medical device corresponding to the catheter 1460 as seen through the display screen 512 of the alternate reality headset 130, with the virtual medical device or virtual catheter within, and anchored to, a virtual anatomical object (such as a virtual SVC) on the patient. The virtual catheter within the virtual SVC represents, in real time, the location of the catheter 1460 in the patient's SVC. The second representation 1840 of the second medical device is an outline corresponding to the TLS 1240 as seen through the display screen 512 of the alternate reality headset 130, where the outline, around and anchored to the TLS 1240 on the sterile drape 1802, indicates the TLS 1240 on the patient's chest beneath the sterile drape 1802. The third representation 1842 of the second medical device is a virtual medical device corresponding to the TLS 1240, where the virtual medical device or virtual TLS is on, and anchored to, the sterile drape 1802 for visualizing the TLS 1240 without compromising the sterile field defined by the sterile drape 1802.
With respect to the medical device representation shown in fig. 20, the first representation 1860 of the first medical device (i.e., the virtual catheter), as seen through the display screen 512 of the alternate reality headset 130, remains within the virtual anatomical object (i.e., the virtual SVC), but both the virtual catheter and the virtual SVC are anchored to the environment above the patient, or to the frame of reference of the wearer of the alternate reality headset 130, and are temporarily located above the patient. Likewise, the third representation 1842 of the second medical device (i.e., the virtual TLS), as seen through the display screen 512 of the alternate reality headset 130, is anchored to the environment above the patient, or to the frame of reference of the wearer of the alternate reality headset 130, and is temporarily located above the patient. When any of the virtual anatomical objects or the medical device representations 1840, 1842, and 1860 is anchored to the frame of reference of the wearer of the alternate reality headset 130, the virtual anatomical object or medical device representation may be temporarily located beside the patient (see figs. 18-20) or at a distance away from the patient (see fig. 21), depending on the wearer's perspective.
With respect to the medical device representation shown in fig. 21, the first representation 1860 of the first medical device, or the virtual catheter, as seen through the display screen 512 of the alternate reality headset 130, remains within the virtual anatomical object or virtual SVC, but both the virtual catheter and the virtual SVC are anchored to the environment at a distance away from the patient or to the frame of reference of the wearer of the alternate reality headset 130 and are temporarily located at a distance away from the patient. Likewise, the third representation 1842 of the second medical device, or the virtual TLS, as seen through the display screen 512 of the alternate reality headset 130 is anchored to the environment at a distance away from the patient or to the frame of reference of the wearer of the alternate reality headset 130 and is temporarily located at a distance away from the patient.
With respect to the window 1870 shown in figs. 18-21, the window 1870 is a graphical control element window that includes the output of the medical device placement system 1600 as seen through the display 512 of the alternate reality headset 130. The window 1870 includes, but is not limited to, an ultrasound window 1872, wherein the output corresponding to one or more processes of the medical device placement system 1600 in the ultrasound window 1872 includes ultrasound imaging frames corresponding to ultrasound imaging with the ultrasound probe 120. The window 1870 also includes, but is not limited to, an ECG window 1874, wherein the output corresponding to one or more processes of the medical device placement system 1600 in the ECG window 1874 includes one or more ECGs corresponding to electrocardiography with one or more ECG electrodes 150, including the stylet 1752 with its ECG electrode. Each of the one or more ECGs in the ECG window 1874 is configured for placement in the ECG window by the wearer of the alternate reality headset 130. Each of the windows 1870 in figs. 18-20 is anchored to the environment beside the patient or to the frame of reference of the wearer of the alternate reality headset 130. In fig. 21, each of the windows 1870 is anchored to the environment at a distance away from the patient or to the frame of reference of the wearer of the alternate reality headset 130. When any of the windows 1870 is anchored to the frame of reference of the wearer of the alternate reality headset 130, the window may be temporarily positioned beside the patient (see figs. 18-20) or at a distance away from the patient (see fig. 21), depending on the perspective of the wearer.
As described above, the alternate reality headset 130 may be configured to anchor the virtual anatomical object and the medical device representation anywhere in the environment in three dimensions (such as to the patient) when displayed on the display screen 512, which is a feature of mixed reality. In view of figs. 18-21, it should be understood that the alternate reality headset 130 may also be configured to anchor a graphical control element, such as the window 1870, three-dimensionally to, for example, the environment beside the patient, as shown in at least figs. 18 and 19. However, as illustrated in figs. 18-21, the configuration of the alternate reality headset 130 is not limited to anchoring virtual anatomical objects, medical device representations, and graphical control elements to the environment. Indeed, the alternate reality headset 130 may be configured to independently anchor any virtual object (e.g., any virtual anatomical object, any medical device representation, any graphical control element, etc.) to a persistent location on the display screen 512 (e.g., always at the bottom of the display screen 512, like a surgical loupe), a persistent location in the frame of reference of the wearer of the alternate reality headset 130 (e.g., always off to the wearer's side, but visible to the eye and within arm's reach on that side), or a persistent location in the environment (such as on the patient).
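By way of illustration only (not part of the disclosed embodiments), the three anchoring options can be modeled as a per-frame coordinate-space choice. The following Python sketch assumes 4x4 homogeneous poses and a hypothetical head pose supplied by the headset's tracking; all names are invented for the example.

```python
from enum import Enum, auto
import numpy as np

class AnchorSpace(Enum):
    DISPLAY = auto()  # persistent location on the display screen (like a surgical loupe)
    WEARER = auto()   # persistent location in the wearer's frame of reference
    WORLD = auto()    # persistent location in the environment (e.g., on the patient)

def resolve_render_pose(anchor_space, local_pose, head_pose):
    """Return the world-space pose at which a virtual object is rendered this frame.
    Poses are 4x4 homogeneous transforms; head_pose maps the wearer's head frame
    into the world frame."""
    if anchor_space is AnchorSpace.WORLD:
        return local_pose               # fixed in the environment; ignores head motion
    # DISPLAY and WEARER objects ride along with the head; a DISPLAY anchor would
    # additionally be pinned to a fixed screen position by the renderer.
    return head_pose @ local_pose

head = np.eye(4); head[:3, 3] = [0.0, 1.7, 0.0]   # wearer's head in the world
svc = np.eye(4); svc[:3, 3] = [0.3, 1.1, 0.8]     # virtual SVC over the patient
print(resolve_render_pose(AnchorSpace.WORLD, svc, head)[:3, 3])   # stays put
print(resolve_render_pose(AnchorSpace.WEARER, svc, head)[:3, 3])  # follows the wearer
```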
Fig. 22A and 22B provide different views of a medical device placement system 1200 with a TLS 2240 according to some embodiments. Fig. 22C provides a stylet 2246 for use with the medical device placement system 1200 of fig. 22A and 22B having a TLS 2240, according to some embodiments.
Again, the medical device placement system 1200 may include the ultrasound probe 120, the alternate reality headset 130, and the console 1210 of the anatomical visualization system 100, the console 1210 including electronic circuitry similar to that of the console 110. As an alternative to the medical device placement system 1200 having the TLS 1240, the medical device positioning system of the medical device placement system 1200 may instead include the TLS 2240. The TLS 2240 differs from the TLS 1240 in that the TLS 2240 includes a bedside sensor grid 2242 and a sensor reference point 2244 configured to be placed on the patient's chest. The sensor grid 2242 includes an array of magnetic sensors embedded within a housing. The sensor reference point 2244 includes an electromagnetic coil. The TLS 2240 is configured to detect the stylet 2246, which includes an electromagnetic coil 2248 disposed in a lumen of the stylet 2246, wherein the electromagnetic coil of the sensor reference point 2244 and the electromagnetic coil 2248 of the stylet 2246 operate at different frequencies, different amplitudes, or both.
The sensor reference point 2244, with its electromagnetic coil, is configured for placement on the patient's chest, thereby providing a sensor reference point for the TLS 2240. Again, the electromagnetic coil of the sensor reference point 2244 is configured to operate at a different frequency or amplitude than the electromagnetic coil 2248 of the stylet 2246. The different frequency or amplitude of the electromagnetic coil of the sensor reference point 2244 may be multiplexed with the frequency or amplitude of the electromagnetic coil 2248 of the stylet 2246.
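As a hedged illustration, frequency multiplexing of the two coils can be undone at each magnetic sensor by examining the spectrum of the sensed signal. The Python sketch below assumes arbitrary drive frequencies (1 kHz and 1.5 kHz) and a 20 kHz sample rate; none of these values come from the disclosure.

```python
import numpy as np

fs = 20_000                      # sample rate (Hz) -- assumed, not from the disclosure
f_ref, f_sty = 1_000, 1_500      # reference-point and stylet coil frequencies -- assumed
t = np.arange(2048) / fs

# One magnetic sensor sees the superposition of both coils plus noise.
sensed = (0.8 * np.sin(2 * np.pi * f_ref * t)
          + 0.3 * np.sin(2 * np.pi * f_sty * t)
          + 0.05 * np.random.default_rng(0).standard_normal(t.size))

window = np.hanning(t.size)
spectrum = np.fft.rfft(sensed * window)
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def amplitude_at(f_drive):
    """Approximate amplitude of the coil driven at f_drive: magnitude of the
    nearest FFT bin, rescaled for the Hann window (scalloping ignored)."""
    k = int(np.argmin(np.abs(freqs - f_drive)))
    return 2 * np.abs(spectrum[k]) / window.sum()

print(f"reference coil ~{amplitude_at(f_ref):.2f}, stylet coil ~{amplitude_at(f_sty):.2f}")
```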
The medical device placement system 1200 including the TLS 2240 is configured to operate with the sensor grid 2242 at the side of the bed on which the patient is lying, using the electromagnetic coil 2248 of the stylet 2246 as a transmitter coil. The inverse configuration is also possible, in which the magnetic sensor array of the sensor grid 2242 is an array of electromagnetic coils configured to act as transmitter coils and the electromagnetic coil 2248 of the stylet 2246 acts as a magnetic sensor.
Continuing with the configuration in which the sensor grid 2242 includes an array of magnetic sensors and the stylet 2246 includes the electromagnetic coil 2248, the position of the electromagnetic coil 2248 of the stylet 2246 relative to the sensor reference point 2244 can be tracked and displayed in three-dimensional space on the display screen 512 of the alternate reality headset 130 at different locations in the patient. Such tracking and display allows the clinician to understand the venous path and overcome medical device placement obstacles, such as obstructions (e.g., lesions), incorrectly followed paths in the patient's vasculature, heart valves, and the like.
The depth measured from the sensor reference point 2244 may provide real-time information with respect to medical device placement (such as azygos vein placement or inferior vena cava placement). Arterial placement with the medical device placement system 1200 including the TLS 2240 is also possible.
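As a minimal sketch of the tracking and depth feedback described above (assuming the sensor grid already yields Cartesian position estimates in its own frame, which the disclosure does not specify), the stylet coil can be expressed relative to the chest reference point, with the distance to that point taken as depth:

```python
import numpy as np

# Tip positions (meters) reported by the bedside sensor grid over three
# updates -- illustrative values; the grid's coordinate frame is an assumption.
stylet_tip = np.array([[0.02, 0.01, 0.10],
                       [0.02, 0.02, 0.07],
                       [0.01, 0.03, 0.04]])
reference = np.array([0.00, 0.00, 0.00])   # coil of the sensor reference point 2244

# Expressing the tip relative to the chest reference point makes the displayed
# 3-D track insensitive to patient or bed motion.
track = stylet_tip - reference

# Straight-line distance from the reference point, usable as the real-time
# depth feedback mentioned above (e.g., azygos vein or inferior vena cava work).
depth = np.linalg.norm(track, axis=1)
print(np.round(depth, 3))   # -> approximately [0.102 0.076 0.051]
```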
Some advantages of the medical device placement system 1200 including the TLS 2240 include the following capabilities: 1) placing medical devices with small electromagnetic coils (such as the stylet 2246) in neonatal or pediatric patients, as well as in patients with neck braces; 2) tracking a tip of a medical device (such as the stylet 2246) over a tortuous path using a plurality of electromagnetic coils (such as an array of electromagnetic coils in an alternative sensor grid 2242); 3) tracking and providing feedback from a plurality of electromagnetic coils in the x, y, and z directions as well as in pitch, yaw, and rotation; and 4) simultaneously tracking multiple medical devices in different anatomical locations using multiple sensor reference points, such as the sensor reference point 2244.
Methods
The methods for the medical device placement systems 300, 1200, and 1600 incorporate, among other things, methods for at least the anatomical visualization system 100 and the medical device localization system 200, which can be identified below by reference to the anatomical visualization system 100, the medical device localization system 200, or components thereof (e.g., the ultrasound probe 120, the medical device detector 240, the TLS 1240, etc.).
The method for the medical device placement system 300 or 1200 includes: transmitting an ultrasound signal into a patient (e.g., a limb of the patient) and receiving echo ultrasound signals from the patient by means of the piezoelectric sensor array 438 of the ultrasound probe 120; transforming the echo ultrasound signals to produce ultrasound image segments corresponding to the patient's anatomy using a console 310 or 1210 having electronic circuitry including memory 1012 or 1312, one or more algorithms 1016 or 1316, and one or more processors 1014 or 1314; inserting a magnetized medical device into the patient (e.g., a limb of the patient) and utilizing one or more algorithms 1016 or 1316 (e.g., a position-finding algorithm) of the console 310 or 1210 to convert magnetic sensor signals, from either the magnetic sensor array 242 embedded within the housing 810 or 910 placed around the patient or the one or more magnetic sensors 1242 disposed within the housing of the TLS 1240 placed on the patient's chest, into positional information of the medical device within the patient; and displaying, on the patient, on a see-through display screen 512 of an alternate reality headset 130 having electronic circuitry including a memory 518 and one or more processors 520 in a frame 516 coupled to the display screen 512, a representation of the medical device (such as a virtual medical device) in accordance with the positional information of the medical device within a virtual anatomical object corresponding to the ultrasound image segments. Ultrasound imaging may be performed at any time prior to insertion of the medical device into the patient to produce the virtual anatomical object, and the virtual anatomical object may be stored in the memory 1012 or 1312 of the console 310 or 1210, or on a storage medium connected to a port of the console 310 or 1210, for later use.
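The disclosure names a position-finding algorithm but does not detail it. Purely as an illustrative sketch, one plausible approach fits a simplified field-magnitude model to the magnetic sensor readings by least squares; the sensor layout, the constant K, and the SciPy dependency are all assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

# Known magnetic sensor locations inside the TLS housing (meters) -- an
# illustrative planar layout, not the disclosed geometry.
SENSORS = np.array([[-0.04, 0.0, 0.0], [0.04, 0.0, 0.0],
                    [0.0, -0.04, 0.0], [0.0, 0.04, 0.0]])
K = 1e-7   # lumped field constant -- assumed; found by calibration in practice

def model(device_pos):
    """Simplified magnitude-only falloff |B| = K / r**3 at each sensor
    (coil orientation is ignored to keep the sketch short)."""
    r = np.linalg.norm(SENSORS - device_pos, axis=1)
    return K / r ** 3

def locate(measured):
    """Least-squares estimate of the magnetized device's position.
    A planar array cannot distinguish +z from -z under this model, so the
    initial guess is placed on the patient side of the housing."""
    return least_squares(lambda p: model(p) - measured,
                         x0=np.array([0.0, 0.0, 0.05])).x

true_pos = np.array([0.01, -0.02, 0.08])
print(np.round(locate(model(true_pos)), 3))   # ~ [ 0.01 -0.02  0.08]
```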
The method may further comprise: while transmitting and receiving the ultrasound signals, capturing ultrasound imaging frames in the memory 1012 or 1312 of the console 310 or 1210 according to a pulsed-wave Doppler imaging mode of the ultrasound probe 120; stitching the ultrasound imaging frames together using a stitching algorithm of the one or more algorithms 1016 or 1316; and segmenting the ultrasound imaging frames or the stitched ultrasound imaging frames into the ultrasound image segments using an image segmentation algorithm of the one or more algorithms 1016 or 1316.
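The stitching and image segmentation algorithms are named but not detailed in the disclosure. The following toy Python sketch shows the general idea under strong assumptions: frames are pasted at probe-derived pixel offsets and averaged where they overlap, and the vessel lumen is segmented with a bare intensity threshold.

```python
import numpy as np

def stitch(frames, offsets_px, canvas_shape):
    """Naive mosaic: paste each frame at a probe-derived pixel offset and
    average where frames overlap. A real stitcher would register features."""
    canvas = np.zeros(canvas_shape)
    weight = np.zeros(canvas_shape)
    for frame, (dy, dx) in zip(frames, offsets_px):
        h, w = frame.shape
        canvas[dy:dy + h, dx:dx + w] += frame
        weight[dy:dy + h, dx:dx + w] += 1.0
    return canvas / np.maximum(weight, 1.0)

def segment_lumen(image, threshold=0.2):
    """Toy segmentation: a vessel lumen is hypoechoic (dark) in B-mode, so
    mark pixels below an intensity threshold."""
    return image < threshold

rng = np.random.default_rng(0)
frames = [np.clip(rng.normal(0.6, 0.1, (64, 64)), 0, 1) for _ in range(3)]
for f in frames:
    f[24:40, :] = 0.05            # synthetic dark vessel band across each frame
mosaic = stitch(frames, [(0, 0), (0, 48), (0, 96)], (64, 160))
print(int(segment_lumen(mosaic).sum()), "lumen pixels")   # 16 rows x 160 cols = 2560
```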
The method may further comprise: converting the ultrasound image segments into the virtual anatomical object using a virtualization algorithm of the one or more algorithms 1016 or 1316; and sending both the medical device representation and the virtual anatomical object to the alternate reality headset 130 for display on the patient.
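One plausible virtualization step (an assumption, not the disclosed algorithm) is to run marching cubes over the stacked binary image segments to obtain a renderable mesh. The sketch below uses scikit-image (assumed available) and a synthetic tube standing in for a segmented vessel.

```python
import numpy as np
from skimage import measure   # scikit-image, assumed available

# A synthetic stack of binary image segments forming a tube, standing in for
# a segmented vessel volume -- illustrative only.
yy, xx = np.mgrid[0:32, 0:32]
cross_section = ((yy - 16) ** 2 + (xx - 16) ** 2 < 6 ** 2).astype(float)
vessel = np.repeat(cross_section[np.newaxis, :, :], 32, axis=0)

# Marching cubes turns the voxel segmentation into a triangle mesh that the
# headset can render and anchor as a virtual anatomical object.
verts, faces, normals, values = measure.marching_cubes(vessel, level=0.5)
print(len(verts), "vertices,", len(faces), "triangles")
```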
The method may further comprise: anchoring the medical device representation and the virtual anatomical object to the patient, and displaying the virtual medical device and the virtual anatomical object on the patient.
The method may further comprise: capturing eye movements of the wearer in memory 1012 or 1312 of console 310 or 1210 using one or more eye tracking cameras 522 coupled to frame 516 of alternate reality headset 130; and processing the eye movement with an eye movement algorithm of the one or more algorithms 528 to identify the wearer's focus for selecting or enhancing the virtual anatomical object corresponding to the wearer's focus.
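A hedged sketch of how an eye movement algorithm might map the wearer's focus onto a selectable object: cast a gaze ray and pick the nearest virtual object whose bounding sphere it intersects. The object names and geometry are invented for the example.

```python
import numpy as np

def gaze_pick(origin, direction, objects):
    """Select the virtual object whose bounding sphere the gaze ray hits,
    preferring the closest hit. `objects` maps name -> (center, radius)."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    best, best_t = None, np.inf
    for name, (center, radius) in objects.items():
        oc = np.asarray(center, float) - origin
        t = float(oc @ d)                  # closest approach along the ray
        miss2 = float(oc @ oc) - t * t     # squared ray-to-center distance
        if t > 0 and miss2 <= radius ** 2 and t < best_t:
            best, best_t = name, t
    return best

objects = {"virtual_SVC": (np.array([0.0, 0.0, 1.0]), 0.05),
           "ultrasound_window": (np.array([0.4, 0.1, 1.2]), 0.15)}
print(gaze_pick(np.zeros(3), np.array([0.0, 0.0, 1.0]), objects))  # -> virtual_SVC
```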
The method may further comprise: capturing a gesture of the wearer in the memory 1012 or 1312 of the console 310 or 1210 using one or more patient-facing cameras 524 coupled to the frame 516 of the alternate reality headset 130; and processing the gesture with a gesture command algorithm of the one or more algorithms 528 to identify a gesture-based command issued by the wearer for execution by the alternate reality headset 130.
The method may further comprise: capturing audio of the wearer in the memory 1012 or 1312 of the console 310 or 1210 using one or more microphones 526 coupled to the frame 516 of the alternate reality headset 130; and processing the audio with an audio command algorithm of the one or more algorithms 528 to identify an audio-based command issued by the wearer for execution by the alternate reality headset 130.
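Gesture-based and audio-based commands both reduce to mapping a recognized token onto an action for the headset to execute. The command table below is hypothetical (the tokens and actions are invented); the recognition step itself is out of scope for this sketch.

```python
from typing import Callable, Dict

# Hypothetical actions -- placeholders for headset behaviors.
def anchor_to_patient() -> None:
    print("anchoring selection to the patient")

def open_ecg_window() -> None:
    print("opening ECG window")

# Recognized gesture or audio tokens map to actions; the tokens are invented.
COMMANDS: Dict[str, Callable[[], None]] = {
    "pinch_drag": anchor_to_patient,   # gesture-based command
    "show ecg": open_ecg_window,       # audio-based command
}

def dispatch(token: str) -> bool:
    """Execute the action for a recognized token; silently ignore noise."""
    action = COMMANDS.get(token.strip().lower())
    if action is not None:
        action()
    return action is not None

dispatch("show ECG")      # -> opening ECG window
dispatch("mumbling...")   # unrecognized; no action taken
```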
The method may further comprise: generating a magnetic field with the magnetic field generator 740; and determining the spatial relationship of each magnetic sensor in the magnetic sensor array 242 to the other magnetic sensors from the magnetic sensor signals produced by the magnetic sensor array 242 in the presence of the generated magnetic field. When the magnetic sensor array 242 is embedded within the housing 910 (e.g., a drape), determining the spatial relationship of each magnetic sensor in the magnetic sensor array 242 to the other magnetic sensors is important because the magnetic sensors have a variable spatial relationship to each other depending on how the housing 910 is placed around the limb of the patient.
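Assuming each sensor's position in the field generator's frame has already been estimated (the fitting itself is not specified in the disclosure), the variable sensor-to-sensor geometry of a draped array reduces to a matrix of pairwise offsets, recomputed whenever the housing 910 is re-wrapped:

```python
import numpy as np

# Estimated positions (meters) of three drape-embedded sensors in the field
# generator's frame -- assumed to come from fitting a field model to readings
# taken while the magnetic field generator 740 is active.
positions = np.array([[0.00, 0.00, 0.00],
                      [0.05, 0.01, 0.02],
                      [0.09, 0.03, 0.05]])

# offsets[i, j] is the vector from sensor j to sensor i; the distance matrix
# captures the drape's current shape and must be recomputed whenever the
# housing 910 is re-wrapped around the limb.
offsets = positions[:, None, :] - positions[None, :, :]
distances = np.linalg.norm(offsets, axis=-1)
print(np.round(distances, 3))
```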
The method for the medical device placement system 1600 includes: transmitting an ultrasound signal into a patient (e.g., a limb of the patient) and receiving echo ultrasound signals from the patient by means of the piezoelectric sensor array 438 of the ultrasound probe 120; transforming the echo ultrasound signals, with electronic circuitry in the frame 516 of the alternate reality headset 130 including a memory 518 and one or more processors 520, to produce ultrasound image segments corresponding to the anatomy of the patient; converting, with the alternate reality headset 130, magnetic sensor signals from the one or more magnetic sensors 1242 disposed within the housing of the TLS 1240 placed on the patient's chest into positional information of the magnetized medical device within the patient; and displaying, to the wearer of the alternate reality headset 130, on the see-through display screen 512, on the environment including the patient: a virtual medical device, according to the positional information of the medical device, within a virtual anatomical object corresponding to the ultrasound image segments; one or more graphical control element windows 1870 including outputs corresponding to one or more processes of the medical device placement system 1600; or both the virtual medical device within the virtual anatomical object and the one or more windows 1870.
The method further includes: capturing eye movements of the wearer in the memory 518 of the alternate reality headset 130 using one or more eye-tracking cameras 522 coupled to the frame 516 of the alternate reality headset 130; and processing the eye movements with an eye movement algorithm to identify the wearer's focus for selecting or enhancing the virtual medical device, the virtual anatomical object, the one or more windows 1870, or the output in the one or more windows 1870 corresponding to the wearer's focus.
The method further includes: capturing a gesture of the wearer in the memory 518 of the alternate reality headset 130 using one or more patient-facing cameras 524 coupled to the frame 516 of the alternate reality headset 130; and processing the gesture with a gesture command algorithm to identify a gesture-based command issued by the wearer for execution by the alternate reality headset 130.
The method further includes: enabling the wearer to anchor the virtual medical device, any of the virtual anatomical objects, or any of the one or more windows 1870 to a persistent location on the display screen 512, a persistent location in the frame of reference of the wearer of the alternate reality headset 130, or a persistent location in the environment.
The method further includes: enabling the wearer to manipulate the virtual medical device, any of the virtual anatomical objects, or any of the one or more windows 1870 on the environment by way of translating, rotating, or resizing the virtual medical device, the virtual anatomical object, or the window 1870.
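Translation, rotation, and resizing of a virtual object can all be expressed as 4x4 homogeneous transforms composed onto the object's pose. The Python sketch below is illustrative only, with an arbitrary example gesture; the transform conventions are assumptions.

```python
import numpy as np

def translate(t):
    m = np.eye(4); m[:3, 3] = t
    return m

def rotate_y(theta):
    c, s = np.cos(theta), np.sin(theta)
    m = np.eye(4)
    m[[0, 0, 2, 2], [0, 2, 0, 2]] = c, s, -s, c   # rotation about the y axis
    return m

def scale(k):
    m = np.eye(4); m[:3, :3] *= k
    return m

# Example wearer manipulation: move the virtual TLS 0.2 m to the right, turn
# it 90 degrees, and enlarge it 1.5x for inspection -- applied right-to-left.
pose = np.eye(4)
pose = translate([0.2, 0.0, 0.0]) @ rotate_y(np.pi / 2) @ scale(1.5) @ pose
print(np.round(pose, 2))
```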
Although specific embodiments have been disclosed herein, and although details of specific embodiments have been disclosed, these specific embodiments are not intended to limit the scope of the concepts provided herein. Additional adaptations and/or modifications may be apparent to those of ordinary skill in the art, and in broader aspects, such adaptations and/or modifications are also contemplated. Thus, departures may be made from the specific embodiments disclosed herein without departing from the scope of the concepts provided herein.

Claims (27)

1. A wireless medical device placement system, comprising:
an ultrasound probe configured to transmit ultrasound signals into a patient and receive echo ultrasound signals from the patient by means of a piezoelectric sensor array;
a medical device tip position sensor ("TLS") configured for placement on the patient's chest; and
an alternate reality headset configured to wirelessly communicate with the ultrasound probe and the TLS, the alternate reality headset comprising:
a frame having electronic circuitry including a memory and a processor configured to:
transform the echo ultrasound signals to produce ultrasound image segments corresponding to the patient's anatomy; and
convert a TLS signal from the TLS into positional information of a medical device within the patient when the TLS is placed on the patient's chest; and
a display screen coupled to the frame through which a wearer of the alternate reality headset can see an environment including the patient, the display screen configured to:
display a virtual medical device within a virtual anatomical object corresponding to the ultrasound image segments according to the positional information of the medical device;
display one or more graphical control element windows including outputs corresponding to one or more processes of the medical device placement system; or
display both the virtual medical device within the virtual anatomical object and the one or more windows.
2. The medical device placement system as recited in claim 1, wherein the alternate reality headset is configured to: capture ultrasound imaging frames according to an imaging mode of the ultrasound probe; stitch the ultrasound imaging frames together using a stitching algorithm; and segment the ultrasound imaging frames or the stitched ultrasound imaging frames into the ultrasound image segments using an image segmentation algorithm.
3. The medical device placement system of claim 2, wherein the alternate reality headset is configured to display the one or more windows including output corresponding to one or more processes of the medical device placement system, the one or more windows including an ultrasound window, and the output corresponding to one or more processes of the medical device placement system including the ultrasound imaging frame corresponding to ultrasound imaging with the ultrasound probe.
4. The medical device placement system as recited in any of claims 1-3, wherein the alternate reality headset is configured to: transform the ultrasound image segments into the virtual anatomical object using a virtualization algorithm; and display both the virtual medical device and the virtual anatomical object on the environment.
5. The medical device placement system as recited in any one of claims 1 to 4, wherein the alternate reality headset is configured to anchor the virtual medical device and the virtual anatomical object to a persistent location on the display screen, a persistent location in a frame of reference of the wearer, or a persistent location in the environment.
6. The medical device placement system as recited in any one of claims 1 to 5, the alternate reality headset further comprising one or more eye tracking cameras coupled to the frame, the one or more eye tracking cameras configured to capture eye movements of the wearer, the processor of the alternate reality headset further configured to process the eye movements with an eye movement algorithm to identify a focus of the wearer for selecting or augmenting the virtual anatomical object, the virtual medical device, or both corresponding to the wearer's focus.
7. The medical device placement system as recited in any one of claims 1 to 6, the alternate reality headset further comprising one or more patient-facing cameras coupled to the frame, the one or more patient-facing cameras configured to capture a pose of the wearer, the processor of the alternate reality headset further configured to process the pose with a pose command algorithm to identify a pose-based command issued by the wearer for execution of the pose-based command by the alternate reality headset.
8. The medical device placement system as recited in any one of claims 1 to 7, the alternate reality headset further comprising one or more microphones coupled to the frame, the one or more microphones configured to capture audio of the wearer, the processor of the alternate reality headset further configured to process the audio with an audio command algorithm to identify audio-based commands issued by the wearer for execution of the audio-based commands by the alternate reality headset.
9. A medical device placement system, comprising:
an ultrasound probe configured to transmit ultrasound signals into a patient and receive echo ultrasound signals from the patient by means of a piezoelectric sensor array;
a medical device tip position sensor ("TLS") configured for placement on the patient's chest;
a stylet configured for insertion into a lumen of a medical device, the stylet including an electrocardiogram ("ECG") electrode in a distal portion of the stylet, the ECG electrode configured to generate a set of ECG signals in response to electrical changes associated with depolarization and repolarization of the patient's heart;
a processing device configured to process the echo ultrasound signal, the TLS signal, and the set of ECG signals, the processing device comprising electronic circuitry comprising a memory and a processor configured to:
transform the echo ultrasound signals to produce ultrasound image segments corresponding to the patient's anatomy;
convert a TLS signal from the TLS into positional information of the medical device within the patient when the TLS is placed on the patient's chest; and
convert the set of ECG signals into an ECG; and
a wearable display screen through which a wearer of the wearable display screen can see an environment including the patient, the display screen configured to:
display a virtual medical device within a virtual anatomical object corresponding to the ultrasound image segments according to the positional information of the medical device;
display one or more graphical control element windows including outputs corresponding to one or more processes of the medical device placement system; or
display both the virtual medical device within the virtual anatomical object and the one or more windows.
10. The medical device placement system of claim 9, wherein the processing device is configured to: capture ultrasound imaging frames according to an imaging mode of the ultrasound probe; stitch the ultrasound imaging frames together using a stitching algorithm; and segment the ultrasound imaging frames or the stitched ultrasound imaging frames into the ultrasound image segments using an image segmentation algorithm.
11. The medical device placement system of claim 10, wherein the display screen is configured to display the one or more windows including output corresponding to one or more processes of the medical device placement system, the one or more windows including an ultrasound window, and the output corresponding to one or more processes of the medical device placement system including the ultrasound imaging frame corresponding to ultrasound imaging with the ultrasound probe.
12. The medical device placement system of claim 11, wherein the one or more windows further comprise an ECG window, and the output corresponding to one or more processes of the medical device placement system further comprises the ECG corresponding to electrocardiography with the stylet comprising the ECG electrode.
13. The medical device placement system as recited in claim 12, further comprising a number of ECG electrode pads configured to generate a corresponding number of sets of ECG signals in response to electrical changes associated with depolarization and repolarization of the patient's heart, wherein the processing device is further configured to convert the number of sets of ECG signals into a corresponding number of ECGs.
14. The medical device placement system as recited in claim 13, wherein the output corresponding to one or more processes of the medical device placement system further comprises the number of ECGs corresponding to electrocardiography with the number of ECG electrode pads, each of the ECGs in the ECG window configured for placement in the ECG window by a wearer of the display screen.
15. The medical device placement system as recited in any of claims 9-14, wherein the processing device is configured to: transform the ultrasound image segments into the virtual anatomical object using a virtualization algorithm for displaying both the virtual medical device and the virtual anatomical object on the environment.
16. The medical device placement system as recited in any one of claims 9 to 15, wherein the processing device is a console of the medical device placement system, an alternate reality headset of the medical device placement system, or a combination of the console and the alternate reality headset, the alternate reality headset including a frame, the display screen being coupled to the frame.
17. The medical device placement system of claim 16, wherein the stylet is configured to connect to the TLS through a sterile drape that separates a sterile zone including the stylet from a non-sterile zone including the TLS, the TLS configured to communicate wirelessly with the alternate reality headset or with the console via a first wired connection to a first port of the console, and the ultrasound probe configured to communicate wirelessly with the alternate reality headset or with the console via a second wired connection to a second port of the console.
18. The medical device placement system as recited in claim 17, wherein the alternate reality headset is configured to anchor the virtual medical device and the virtual anatomical object to a persistent location on the display screen, a persistent location in a frame of reference of the wearer, or a persistent location in the environment.
19. The medical device placement system of claim 17 or 18, wherein the display screen is configured to display one or more contours around one or more corresponding components of the medical device placement system, one or more virtual components on one or more corresponding components of the medical device placement system, or a combination thereof.
20. The medical device placement system as recited in any one of claims 17-19, wherein the display screen is configured to display a TLS outline around the TLS under the sterile drape, a virtual TLS of the TLS anywhere in the environment on the sterile drape, or a combination thereof.
21. The medical device placement system as recited in any one of claims 9 to 20, wherein the medical device is a peripherally inserted central catheter ("PICC"), and the desired location of the PICC in the patient is a superior vena cava proximate a sinus node in a right atrium of the patient's heart.
22. The medical device placement system as recited in claim 21, wherein a distal portion of the virtual medical device indicates proximity to the desired location in the patient by way of a visual indicator as the medical device is advanced through the patient's body.
23. A method of a medical device placement system, comprising:
transmitting an ultrasound signal into a patient and receiving an echo ultrasound signal from the patient by means of a piezoelectric sensor array of an ultrasound probe;
transforming the echo ultrasound signals to produce ultrasound image segments corresponding to the patient's anatomy with electronic circuitry in a frame of an alternate reality headset, the electronic circuitry including a memory and a processor;
converting, with the alternate reality headset, magnetic sensor signals from one or more magnetic sensors disposed within a housing of a medical device tip position sensor ("TLS") placed on the chest of the patient into positional information of a magnetized medical device within the patient; and
displaying, on a see-through display screen of the alternate reality headset, to a wearer of the alternate reality headset on an environment including the patient:
a virtual medical device, according to the positional information of the medical device, within a virtual anatomical object corresponding to the ultrasound image segments;
one or more graphical control element windows comprising outputs corresponding to one or more processes of the medical device placement system; or
both the virtual medical device within the virtual anatomical object and the one or more windows.
24. The method of claim 23, further comprising:
capturing eye movements of the wearer in a memory of the alternate reality headset using one or more eye-tracking cameras coupled to a frame of the alternate reality headset; and
processing the eye movements with an eye movement algorithm to identify a focus of the wearer for selecting or enhancing the virtual medical device, the virtual anatomical object, the one or more windows, or an output in the one or more windows corresponding to the focus of the wearer.
25. The method of claim 23 or 24, further comprising:
capturing a gesture of the wearer in a memory of the alternate reality headset using one or more patient-facing cameras coupled to a frame of the alternate reality headset; and
processing the gesture with a gesture command algorithm to identify a gesture-based command issued by the wearer for execution by the alternate reality headset.
26. The method of any of claims 23 to 25, further comprising enabling the wearer to anchor the virtual medical device, any of the virtual anatomical objects, or any of the one or more windows to a persistent location on the display screen, a persistent location in a frame of reference of a wearer of the alternate reality headset, or a persistent location in the environment.
27. The method of claim 23, further comprising enabling the wearer to manipulate the virtual medical device, any of the virtual anatomical objects, or any of the one or more windows on the environment by way of translating, rotating, or resizing the virtual medical device, the virtual anatomical object, or the window.
CN201980037260.7A 2018-06-04 2019-06-03 System and method for visualizing an anatomy, positioning a medical device, or placing a medical device Pending CN112236077A (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US201862680299P 2018-06-04 2018-06-04
US62/680,299 2018-06-04
US16/209,601 US20190167148A1 (en) 2017-12-04 2018-12-04 Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
US16/209,601 2018-12-04
US16/370,353 US20190223757A1 (en) 2017-12-04 2019-03-29 Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
US16/370,353 2019-03-29
PCT/US2019/035271 WO2019236505A1 (en) 2018-06-04 2019-06-03 Systems and methods for visualizing anatomy, locating medical devices, or placing medical devices

Publications (1)

Publication Number Publication Date
CN112236077A true CN112236077A (en) 2021-01-15

Family

ID=68770987

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980037260.7A Pending CN112236077A (en) 2018-06-04 2019-06-03 System and method for visualizing an anatomy, positioning a medical device, or placing a medical device

Country Status (3)

Country Link
EP (1) EP3801245A4 (en)
CN (1) CN112236077A (en)
WO (1) WO2019236505A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114237400A * 2021-12-17 2022-03-25 Qilu Hospital of Shandong University PICC reality augmentation system, reality augmentation method and mobile terminal

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4294307A1 (en) * 2021-03-26 2023-12-27 C. R. Bard, Inc. Medical device projection system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1504713A1 (en) * 2003-07-14 2005-02-09 Surgical Navigation Technologies, Inc. Navigation system for cardiac therapies
CN102438551A (en) * 2009-05-08 2012-05-02 皇家飞利浦电子股份有限公司 Ultrasonic planning and guidance of implantable medical devices
CN105796177A (en) * 2010-12-23 2016-07-27 巴德阿克塞斯系统股份有限公司 Systems and methods for guiding a medical instrument
WO2016133644A1 (en) * 2015-02-20 2016-08-25 Covidien Lp Operating room and surgical site awareness
US20180144550A1 (en) * 2016-11-23 2018-05-24 Simbionix Ltd. System and method for rendering complex data in a virtual reality or augmented reality environment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7835785B2 (en) * 2005-10-04 2010-11-16 Ascension Technology Corporation DC magnetic-based position and orientation monitoring system for tracking medical instruments
AU2010300677B2 (en) * 2009-09-29 2014-09-04 C.R. Bard, Inc. Stylets for use with apparatus for intravascular placement of a catheter
CA2953694A1 (en) * 2014-07-02 2016-01-07 Covidien Lp Alignment ct
US10013808B2 (en) * 2015-02-03 2018-07-03 Globus Medical, Inc. Surgeon head-mounted display apparatuses
EP3720349A4 (en) * 2017-12-04 2021-01-20 Bard Access Systems, Inc. Systems and methods for visualizing anatomy, locating medical devices, or placing medical devices


Also Published As

Publication number Publication date
EP3801245A4 (en) 2022-03-02
EP3801245A1 (en) 2021-04-14
WO2019236505A1 (en) 2019-12-12

Similar Documents

Publication Publication Date Title
US20200188037A1 (en) Navigating A Surgical Instrument
US20190167148A1 (en) Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
US20190307419A1 (en) Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
EP1421913B1 (en) Image guided catheter navigation system for cardiac surgery
US20190175058A1 (en) System and Method for Assisting Visualization During a Procedure
US8213693B1 (en) System and method to track and navigate a tool through an imaged subject
US20170303816A1 (en) System and method for localizing medical instruments during cardiovascular medical procedures
US10952795B2 (en) System and method for glass state view in real-time three-dimensional (3D) cardiac imaging
US20100210938A1 (en) Navigation System for Cardiac Therapies
EP2064991A2 (en) Flashlight view of an anatomical structure
EP3236854A1 (en) Tracking-based 3d model enhancement
US20200397511A1 (en) Ultrasound image-based guidance of medical instruments or devices
CN112236077A (en) System and method for visualizing an anatomy, positioning a medical device, or placing a medical device
US20190223757A1 (en) Systems And Methods For Visualizing Anatomy, Locating Medical Devices, Or Placing Medical Devices
US20170086759A1 (en) Control of the movement and image acquisition of an x-ray system for a 3D/4D co-registered rendering of a target anatomy
EP3666217B1 (en) Composite visualization of body part
US20230218272A1 (en) Controlling and visualizing rotation and deflection of a 4d ultrasound catheter having multiple shafts
CN117582288A (en) Spatially aware medical device configured to perform insertion path approximation
WO2020106664A1 (en) System and method for volumetric display of anatomy with periodic motion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination