US20210169447A1 - Identifying Anatomical Structures - Google Patents

Identifying Anatomical Structures

Info

Publication number
US20210169447A1
Authority
US
United States
Prior art keywords
anatomy
nerve
transducer
image
patient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/934,714
Inventor
Kern Singh
Sachin Gupta
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tissue Differentiation Intelligence LLC
Original Assignee
Tissue Differentiation Intelligence LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from U.S. application Ser. No. 14/329,940 (U.S. Pat. No. 10,154,826)
Priority claimed from International Application No. PCT/US2015/050404 (WO 2016/044411 A1)
Application filed by Tissue Differentiation Intelligence LLC
Priority to US 16/934,714
Publication of US 2021/0169447 A1
Legal status: Abandoned

Classifications

    • A61B 8/12: Diagnosis using ultrasonic, sonic or infrasonic waves in body cavities or body tracts, e.g. by using catheters
    • A61B 17/0206: Retractors for holding wounds open, with antagonistic arms as supports for retractor elements
    • A61B 5/0066: Optical coherence imaging
    • A61B 5/4893: Locating particular structures in or on the body; nerves
    • A61B 8/0833: Detecting or locating foreign bodies or organic structures
    • A61B 8/44: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444: Constructional features related to the probe
    • A61B 8/445: Details of catheter construction
    • A61B 8/4461: Features of the scanning mechanism, e.g. for moving the transducer within the housing of the probe
    • A61B 8/4483: Constructional features characterised by features of the ultrasound transducer
    • A61B 8/4494: Arrangement of the transducer elements
    • A61B 8/46: Special arrangements for interfacing with the operator or the patient
    • A61B 8/48: Diagnostic techniques
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis
    • A61B 8/5215: Processing of medical diagnostic data
    • A61B 8/5223: Extracting a diagnostic or physiological parameter from medical diagnostic data
    • A61B 2017/00438: Instruments with special provisions for gripping, connectable to a finger
    • A61B 2017/0262: Joint distractors for the spine with a provision for protecting nerves
    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 42/10: Surgical gloves
    • A61B 6/506: Clinical applications involving diagnosis of nerves
    • G06F 18/24: Classification techniques
    • G06F 18/2411: Classification based on the proximity to a decision surface, e.g. support vector machines
    • G06K 9/00147; G06K 9/46; G06K 9/6267; G06K 9/6269; G06K 2209/05
    • G06T 7/60: Analysis of geometric attributes
    • G06T 7/0012: Biomedical image inspection
    • G06T 2207/20081: Training; Learning
    • G06V 20/698: Microscopic objects; matching; classification
    • G06V 2201/03: Recognition of patterns in medical or anatomical images

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Gynecology & Obstetrics (AREA)
  • Neurology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Robotics (AREA)
  • Quality & Reliability (AREA)
  • Physiology (AREA)

Abstract

Aspects described herein disclose devices, systems, and methods for use in contexts such as minimally invasive surgery (MIS). A device is provided having a proximal portion and a distal portion; an ultrasound transducer may be disposed within the distal portion and configured to scan tissue and identify certain portions of a patient's anatomy during the scanning process. The results of the detection may be presented to an operator of the device aurally and/or visually, such as in a 3-D volumetric image. By scanning the tissue, identifying the anatomy, and presenting the results to an operator, unnecessary damage to elements of the patient's anatomy may be avoided or lessened. In some aspects, multiple transducers may be positioned on the device to increase the scanning range and/or scanning accuracy of the device. The device may provide an inner channel for the passage of surgical tools while scanning tissue.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. application Ser. No. 15/063,152, filed on Mar. 7, 2016 entitled IDENTIFYING ANATOMICAL STRUCTURES, which claims priority to U.S. Provisional Patent Application Ser. No. 62/129,866, filed Mar. 8, 2015 and entitled Device and Method for Identifying Anatomical Structures and to U.S. Provisional Patent Application Ser. No. 62/129,862, filed Mar. 8, 2015 and entitled Nerve Mapping System. U.S. application Ser. No. 15/063,152 is a continuation-in-part of International Application PCT/US15/50404, with an international filing date of Sep. 16, 2015 entitled IDENTIFYING ANATOMICAL STRUCTURES, which claims priority to U.S. Provisional Patent Application Ser. No. 62/051,670, filed Sep. 17, 2014 and entitled “DEVICE AND METHOD FOR IDENTIFYING ANATOMICAL STRUCTURES”. The contents of the applications listed above are hereby incorporated by reference in their entirety for all purposes. This application also incorporates by reference in their entireties for all purposes the following related applications: U.S. Provisional Application Ser. No. 61/847,517, filed Jul. 17, 2013, entitled Direct Visualization Dissector and Retractor System for Minimally Invasive Procedures, U.S. Provisional Application Ser. No. 61/867,534, filed Aug. 19, 2013, entitled Ultrasonic Visualization, Dissection, and Retraction System for Minimally Invasive Procedures, U.S. Provisional Application Ser. No. 61/868,508, filed Aug. 21, 2013, entitled OCT Visualization, Dissection, and Retraction System for Minimally Invasive Procedures, U.S. Provisional Application Ser. No. 61/899,179, filed Nov. 2, 2013, entitled Nerve Detection System, U.S. Provisional Application Ser. No. 61/921,491, filed Dec. 29, 2013, entitled System and Method for Identifying Anatomical Structures Ultrasonically, U.S. Provisional Application Ser. No. 61/929,083, filed Jan. 19, 2014, entitled System and Method for Identifying Anatomical Structures Ultrasonically, U.S. 
Provisional Application Ser. No. 61/977,594, filed Apr. 9, 2014, entitled System and Method for Identifying Anatomical Structures Ultrasonically Employing Two or More Transducers, and U.S. Non-Provisional application Ser. No. 14/329,940, filed Jul. 12, 2014, entitled Device and Method for Identifying Anatomical Structures.
  • BACKGROUND
  • Surgical techniques utilizing minimally invasive surgery (“MIS”) are being rapidly adapted to replace current traditional “open” surgical procedures. “Open” procedures typically require larger skin incisions that may cause significant collateral damage to uninvolved anatomic structures. For example, intervening soft tissue (e.g., tendons, ligaments, facet capsules, muscles, and so on) may be cut and even potentially excised to allow for direct surgical visualization of the operated-upon area or anatomical structure.
  • In contrast, minimally invasive techniques, which may also be referred to as “percutaneous” techniques, involve significantly smaller incisions and are less traumatic to the patient's anatomy. Soft tissues may be preserved with minimal collateral damage to the uninvolved anatomy. Typical benefits of MIS may include decreased blood loss, decreased postoperative pain, smaller scar formation, decreased cost, and a faster rehabilitation for the patient than in “open” or conventional surgical techniques.
  • Minimally invasive surgery techniques are currently being adapted to a variety of surgical procedures. For example, minimally invasive techniques in the form of laparoscopic procedures, such as a laparoscopic colectomy for carcinoma of the colon, have been developed. More recently, surgeons have utilized MIS in spinal surgery applications.
  • BRIEF SUMMARY
  • Present MIS techniques are unable to accurately and consistently detect and avoid key anatomical features, such as neural elements, potentially resulting in profound neurological sequelae and deleterious impacts on other systems. For example, even a minimally invasive surgical instrument that impacts or contacts nervous system elements (e.g., nerves, spinal cord) may cause loss of sensation, sensory overload, pain, or other unwanted or harmful effects. Detection and identification of anatomical features may assist in combating these and other problems that may become apparent upon reading this disclosure.
  • Accordingly, in one aspect of the present disclosure, a device may be provided for minimally invasive surgery and may include a body comprising a proximal portion, a distal portion, and a main portion formed between the proximal portion and distal portion. At least one ultrasound transducer may be arranged at the distal portion of the body and may be configured to scan a region extending away from the main portion of the body. The device may include a signal processing unit having at least one processor and memory storing instructions that cause the at least one processor to receive a signal from the at least one ultrasound transducer and identify an anatomical structure based on the signal.
  • In some embodiments, the device may include a hollow channel and an annular shaped tip to allow the passage of surgical tools in the channel for performing procedures while collecting data for use in mapping anatomical tissues. In these embodiments, ultrasonic transducers may be arranged on the distal end of the device, on the annular shaped tip.
  • In another aspect, a device may be provided for minimally invasive surgery. The device may include a proximal portion, a distal portion, a main body formed between the proximal and distal portions of the device having a longitudinal axis and at least one ultrasound transducer disposed within the distal portion of the device and configured to scan a region adjacent to a distal end of the distal portion of the device.
  • In another aspect, a method may be provided and may include receiving data from at least one ultrasound transducer arranged at a distal portion of a body and configured to scan a region extending away from a main portion of the body. The method may also include processing the data to identify an anatomical structure located within the region, and outputting an indication associated with the anatomical structure.
  • In another aspect, a method for identifying a target anatomy may be provided with a device having a distal portion and at least one ultrasound transducer at least partially disposed within a main body of the device. The method may include scanning a patient's anatomy for the target anatomy, determining a voltage trace of the patient's anatomy, comparing the voltage trace of the patient's anatomy to a predetermined voltage trace of the target anatomy, and sending a notification if the voltage trace of the patient's anatomy matches the predetermined voltage trace of the target anatomy.
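The trace-matching step of the method above can be sketched as follows. This is a hypothetical illustration, not the disclosure's implementation: the use of normalized cross-correlation as the comparison, the 0.9 match threshold, and the function name `matches_target` are all assumptions.

```python
import numpy as np

def matches_target(trace, reference, threshold=0.9):
    """Compare a measured voltage trace against a predetermined
    reference trace for the target anatomy.

    Both traces are standardized (zero mean, unit variance) and the
    peak of their normalized cross-correlation is taken over all lags;
    a peak at or above `threshold` counts as a match, which would
    trigger the notification described in the text."""
    trace = (np.asarray(trace, dtype=float) - np.mean(trace)) / (np.std(trace) + 1e-12)
    reference = (np.asarray(reference, dtype=float) - np.mean(reference)) / (np.std(reference) + 1e-12)
    # Peak of the normalized cross-correlation over all alignments.
    corr = np.correlate(trace, reference, mode="full") / len(reference)
    return float(corr.max()) >= threshold
```

Under this sketch, a trace compared against itself yields a correlation peak of 1.0, while an uncorrelated trace stays well below the threshold.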
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a side view of one embodiment of a device.
  • FIG. 2 is a side view of another embodiment of the device of FIG. 1.
  • FIG. 3 is a functional diagram of the ultrasound imaging system that may be used in one embodiment of the present disclosure.
  • FIG. 4 is a diagram of an ultrasound transducer that may be used in one embodiment of the present disclosure.
  • FIG. 5 is another functional diagram of one embodiment of the ultrasound imaging system that may be used in an embodiment of the present disclosure.
  • FIG. 6 is one embodiment of the device having more than one ultrasound transducer disposed therein.
  • FIG. 7 is another embodiment of the device having one ultrasound transducer disposed therein.
  • FIG. 8 depicts the scanning width of a transducer.
  • FIG. 9 depicts the scanning width of a transducer in one configuration.
  • FIG. 10 is another embodiment of the present disclosure where one of the transducers is positioned at an angle with respect to the other transducer.
  • FIG. 11 depicts the scanning width of the embodiment of FIG. 10.
  • FIG. 12 is yet another embodiment of the present disclosure where two transducers are angled towards the longitudinal axis of the device.
  • FIG. 13 depicts scan images of a target anatomy taken by one embodiment of the present disclosure.
  • FIG. 14 is a scan and A-line image scan of the target anatomy captured by one embodiment of the present disclosure.
  • FIG. 15 depicts the configuration of one embodiment of the present disclosure.
  • FIG. 16 depicts images of the target anatomy captured by one embodiment of the present disclosure.
  • FIG. 17 depicts additional images of the target anatomy captured by one embodiment of the present disclosure.
  • FIG. 18 is a scan of the target anatomy captured by one embodiment of the present disclosure.
  • FIG. 19 depicts one embodiment of a retractor system that can be used with embodiments of the present disclosure.
  • FIG. 20 depicts one embodiment of the dilator system that can be used with embodiments of the present disclosure.
  • FIG. 21 is one embodiment of the present disclosure that is incorporated into a glove.
  • FIG. 22 is a partial side view of the embodiment disclosed in FIG. 21.
  • FIG. 23 is a front cross-sectional view of one embodiment of the glove embodiment disclosed in FIG. 21.
  • FIG. 24 is a front cross-sectional view of another embodiment of the glove embodiment disclosed in FIG. 21.
  • FIG. 25 is a front cross-sectional view of yet another embodiment of the glove embodiment disclosed in FIG. 21.
  • FIG. 26 is diagram of another embodiment of the present disclosure that utilizes Optical Coherence Tomography.
  • FIG. 27 is one embodiment of the probe used with the embodiment disclosed in FIG. 26.
  • FIG. 28 depicts two embodiments of the probe used with the embodiment disclosed in FIG. 26.
  • FIG. 29 is a side view of one embodiment of the present disclosure having direct visualization capability.
  • FIG. 30 is a cross-sectional view of the embodiment disclosed in FIG. 29.
  • FIG. 31 is another embodiment of the present disclosure having direct visualization capability.
  • FIG. 32 is a partial cross-sectional view of the embodiment disclosed in FIG. 31.
  • FIG. 33 is a partial front cross-sectional view of the embodiment disclosed in FIG. 29.
  • FIG. 34 is another embodiment of the present disclosure disclosed in FIG. 29.
  • FIG. 35 is a side cross sectional view of another embodiment of the present disclosure disclosed in FIG. 31.
  • FIG. 36 is yet another embodiment of the conduit disclosed in FIG. 29.
  • FIG. 37 is yet another embodiment of the conduit disclosed in FIG. 29.
  • FIG. 38 depicts one embodiment of the present disclosure in use with a retractor system.
  • FIG. 39 depicts one embodiment of the retractor system of the present disclosure.
  • FIG. 40 depicts another embodiment of the retractor system of the present disclosure.
  • FIG. 41 depicts an illustrative graph of scan-line energy as a nerve discriminant.
  • FIG. 42 depicts an illustrative user interface design window containing a designed bandpass filter and specifications.
  • FIG. 43 depicts an illustrative application of a bandpass filter on the received RF signals, thereby suppressing low frequency noise near the transducer surface.
  • FIG. 44 depicts examples of SVM classification schemes for separable and non-separable data.
  • FIGS. 45A & B depict a plot matrix showing two-dimensional (2D) relationships between the feature variables.
  • FIG. 46 depicts an illustrative receiver operating characteristic (ROC) curve defining performance of the trained SVM.
  • FIG. 47 illustrates a computing device in accordance with one or more aspects described herein.
  • FIG. 48 shows the device viewed from the distal end as in an embodiment of the present disclosure.
  • FIG. 49 shows the device viewed from the side as in an embodiment of the present disclosure.
  • FIG. 50 shows the distal face of the device as in an embodiment of the present disclosure.
  • FIG. 51 depicts placement of the device on the surface of the psoas muscle as in an embodiment of the present disclosure.
  • FIG. 52 is another depiction of the placement of the device on the surface of the psoas muscle as in an embodiment of the present disclosure.
  • FIG. 53 depicts a probe inserted into the device on the surface of the psoas muscle as in an embodiment of the present disclosure.
  • FIG. 54 depicts a nerve map in a viewing window in accordance with aspects of the present disclosure.
  • FIG. 55 depicts a visual representation of Kambin's triangle.
  • FIG. 56 depicts the device in a position targeting Kambin's triangle, as described herein.
  • FIG. 57 depicts a surgical incision targeting access to a location on the spine.
  • FIG. 58 depicts a surgical approach targeting a portion of the spine.
  • DETAILED DESCRIPTION
  • To help understand the present disclosure, the following definitions are provided with reference to terms used in this application.
  • Throughout this specification and in the appended claims, when discussing the application of aspects of the present disclosure with respect to the body's tissue, spine, or other neural elements, the term “proximal” with respect to such a device is intended to refer to a location, or a portion of the device, that is closer to the operator. The term “distal” is intended to refer to a location, or a portion of the device, that is farther away from the operator.
  • The embodiments below are described with reference to the drawings in which like elements are referred to by like numerals. The relationship and functioning of the various elements are better understood by the following detailed description. The embodiments as described below are by way of example only and the present disclosure is not limited to the embodiments illustrated in the drawings.
  • According to one or more aspects of the present disclosure, a device capable of detecting target anatomical structures may be provided. The device may utilize ultrasound and/or Optical Coherence Tomography (OCT) technology as the device is being advanced through a patient's anatomy. The device may have a distal portion having a tip, where the tip can be used to dissect a patient's anatomy without puncturing or tearing the patient's anatomy and while simultaneously allowing the device to inspect the anatomy as it is being dissected by the tip. While the device discussed herein is discussed in the context of a device that can be held by an operator, it is contemplated that the device and/or parts of the device may be used during automated procedures such as those being performed by robotic and other similar systems.
  • In one embodiment, shown in FIG. 1, the device 10 has a proximal portion 12 and a distal portion 14 with a main body 16 disposed between the proximal and distal portions 12, 14. The main body 16 has a proximal end 18 and a distal end 20 and is defined by a longitudinal axis L. The proximal end 18 may have a handle (not shown) or gripping portion (not shown) attached thereto. The length of the main body 16 may vary, but can include a length of 50 to 300 mm; however, in some embodiments the length may fall outside of this range. Similarly, the outer diameter of the main body 16 may vary and can include an outer diameter of between 3 mm and 20 mm. The main body 16 can be made out of any preferable surgical grade material, including but not limited to, a medical grade polymer including PEEK (polyether ether ketone), stainless steel, carbon fiber, and titanium. The main body 16, and one or more of the components of the device 10 generally, may contain radio-opaque markers to allow an operator to detect the location of the device 10 with respect to the anatomy of a patient via radiographic imaging.
  • As shown in FIG. 1, the distal portion 14 of the device 10 includes a tip 22. The tip 22 may be hemispherical in shape, as illustrated in FIG. 1, but it is contemplated that the tip 22 may also be of a different shape. For example, and without limitation, the tip 22 may have a semi-spherical, conical, pyramidal, spear or aspherical shape. The tip 22 may be configured to dissect a patient's anatomy, such as a muscle, without tearing or disrupting the patient's anatomy as it passes through the tissue. As a result, the tip 22 may have an outer diameter ranging anywhere between 1 mm and 50 mm, and preferably between 2 mm and 9 mm. It is appreciated that the outer diameter of the tip 22 may fall outside of this range as well. It is further appreciated that the tip 22 is optional, such that particular embodiments of the device 10 may not include the tip 22.
  • As illustrated in the embodiment shown in FIG. 1, the main body 16 of the device 10 may be substantially straight. However, it is contemplated that the main body 16 may have different shapes, including having a curved shape with a non-zero radius of curvature. An example of such an embodiment is illustrated in FIG. 2, which may be used for MIS requiring access through the presacral space of a patient. The main body 16 may also take on an “L”, “C”, “U” shape or a shape there between.
  • The device 10 may include ultrasonic capability. A purpose of this device may be to serve as an instrument that features a specifically patterned array of high frequency ultrasound transducers and a monitoring system that collects spectral properties of specific tissue in the body. For example, the system may be able to detect the spectral properties of muscle, fat, nerve and bone. As the anatomy is stimulated by the ultrasound transducer(s), it will emit a specific spectral property that can be detected by the monitoring system. The system may examine scan line images and seek specific parameters of amplitude, shape, and other spectral content in order to differentiate the signals coming from the nerve and signals coming from surrounding tissues. For example, nerve tissue may be hypoechoic as compared with the surrounding tissue. However, there are internal structures that provide features in the signal that identify the nerve from single scan lines or RF properties. The system will inform the operator that the device is adjacent to or proximate to the specific type of anatomy that is detected by the system. The device can allow a surgeon to identify and avoid certain portions of a patient's anatomy (e.g. nerve) when performing a minimally invasive procedure.
  • The device 10 may be equipped with ultrasound imager 24 to detect a patient's anatomy as shown in FIGS. 3-5. The ultrasound imager 24 may include a transducer 26 that is configured to emit sound waves and may be disposed at the distal end 20 of the device 10. As shown in FIG. 4, the transducer 26 may include a single element focused transducer, and may have an operating frequency range of approximately 10-40 MHz. In some aspects, the operating range may be higher or lower than this range of frequencies. Additionally or alternatively, transducer 26 may include a micromachined array, such as a capacitive micromachined ultrasonic transducer (CMUT), having multiple channels. The desirable frequency may vary depending on the application and target anatomy. For example, in one embodiment a frequency or range of frequencies may be selected for distinguishing nerve from surrounding tissues and adjacent anatomy on b-mode (2D) ultrasound images based on image texture and echogenicity. Reliably distinguishing between nerve and muscle tissue in real time may require quantitative approaches, and may require automated or manual calibration of the system to estimate tissue-specific properties. In some aspects, a meta-analysis comparing ultrasound guidance to nerve-stimulation guidance may suggest superior outcomes for ultrasound guidance.
  • As shown in FIGS. 3 and 5, the transducer 26 may be in communication with a RF-pulser/receiver 28, which may be in communication with an analog to digital converter 30, which may be in communication with a digital signal processor 32 and an output 34 such as a monitor.
  • In one embodiment, the transducer 26 converts an electric signal or pulse generated from the RF-pulser/receiver 28 into a sound wave and then converts a reflected sound wave back into an electrical signal. The ultrasound transducer 26 launches sound pulses, which may be short, high-frequency non-damaging sound pulses, into the tissue, and then may wait to hear the reflection from the tissue. Since the speed of sound in tissues is high (~1500 m/s), this process may take a few milliseconds to image a few millimeters of tissue. As referenced above, the RF-pulser/receiver 28 may generate an electrical impulse that may be sent (e.g., via a cable) to the transducer 26 to generate a sound wave, and may also receive a signal from the transducer 26 generated by the reflected sound waves that the transducer 26 receives. The analog to digital converter 30 converts the analog radiofrequency signal received from the transducer 26 into a digital form that a computer may analyze. The digital signal processor 32 processes the digitized signal received from the analog to digital converter 30. Signal filtering and processing operations may be programmed into the hardware, firmware, and/or software of various components of the system (e.g., the digital signal processor 32) to detect the reflected signal properties of the tissue and distinguish between nerve and muscle tissues in real-time. Once a nerve tissue signature is detected, a hardware, firmware, and/or software system may communicate with the output 34. The output 34 may include a visual monitor that may be programmed to display the anatomy or signals indicative of the anatomy (e.g., via actual images or programmable color configurations (red/yellow/green)) and/or a sound-generating device which may be controlled to emit an audible indicator (e.g. alarm or a "beep"). For example, the sound-generating device, such as a speaker, may emit a "beep" when the device 10 encounters/detects the presence of target anatomy (e.g. nerve) within a predetermined range (e.g. 1 mm to 10 cm).
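The signal chain just described (pulser/receiver, analog-to-digital converter, digital signal processor, and output) can be sketched as follows. This is a hypothetical illustration: the function names, the simple energy test standing in for the tissue-signature detection, and the 0.5 threshold are assumptions; only the ~1500 m/s speed of sound and the 1 mm to 10 cm alarm range come from the text above.

```python
# Hypothetical sketch of the signal chain of FIGS. 3 and 5
# (RF-pulser/receiver -> transducer -> A/D converter -> digital signal
# processor -> output). The energy test and thresholds are illustrative
# stand-ins for the tissue-signature detection described in the text.

SPEED_OF_SOUND_M_S = 1500.0  # approximate speed of sound in tissue

def echo_depth_mm(round_trip_time_s):
    """Convert a pulse-echo round-trip time into reflector depth (mm)."""
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0 * 1000.0

def classify_scan_line(samples, threshold=0.5):
    """Toy DSP stage: flag a scan line whose mean echo energy exceeds a
    calibrated threshold (a stand-in for the nerve-signature test)."""
    energy = sum(s * s for s in samples) / len(samples)
    return "nerve" if energy > threshold else "other"

def should_beep(tissue, depth_mm, alarm_range_mm=(1.0, 100.0)):
    """Toy output stage: sound the audible indicator when target anatomy
    lies within the predetermined range (1 mm to 10 cm in the text)."""
    lo, hi = alarm_range_mm
    return tissue == "nerve" and lo <= depth_mm <= hi

# Example: an echo returning after 13 microseconds (~9.75 mm deep)
depth = echo_depth_mm(13e-6)
tissue = classify_scan_line([0.9, -0.8, 0.85, -0.9])
print(round(depth, 2), tissue, should_beep(tissue, depth))
```

A real implementation would replace the energy test with the scan-line signature comparison and SVM classification discussed later in this disclosure.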
  • It is appreciated that one or more of these components may be in wireless communication with one another, may be combined into one or more components, or may be omitted from a particular embodiment, and that other additional components may be placed in communication between any of the identified components.
  • In one embodiment, the outer diameter of the transducer 26 may be approximately 3 mm, but may range anywhere between approximately 1 mm and 10 mm. Further, the transducer 26 is configured to be disposed in a variety of locations with respect to the device 10. For example, as shown in FIG. 6, the transducer 26 may be disposed at the distal end 20 of the main body 16 or at the tip 22 portion of the distal end 20. The transducer 26 may also be removable such that it can be removably disposed within a conduit 36 formed within the device 10 and removed once a working space is identified and accessible. In some aspects, the working space may be the space created by insertion and/or manipulation of the device 10 within the patient's anatomy.
  • Multiple transducers 26 may be provided as part of the device 10. In some aspects, two or more transducers 26 may be side positioned (e.g. on either side of the main body 16) so as to provide for multi-directional scanning of the patient's anatomy to detect a nerve. The side positioned transducers may be configured to scan the anatomy in a circumferential direction around the main body 16, and may detect the nerve (or other target anatomy) that was not detected by a transducer positioned at the distal end 20 of the main body 16. The multi-directional scanning enables the system to generate a scan image of the patient's anatomy in multiple directions as the device 10 is advanced through the patient's anatomy.
  • Returning to FIG. 6, the device 10 may include at least one ultrasound transducer 26 (such as a high frequency ultrasound transducer) that is used to stimulate a patient's anatomy, such as muscle, fat, nerve, and bone. A series of transducers 26 (e.g., a transducer, two or more transducers) may be disposed along the length of the device 10 to allow for a wider pattern of ultrasonic stimulation of the surrounding anatomy. In this embodiment, there may be one transducer 26 on the distal end of the device 10 that emits an ultrasonic frequency in a direction that is substantially parallel to the longitudinal axis of the device 10. There may be another transducer 27, adjacent to the first transducer 26, that emits ultrasonic frequency along a path that is substantially perpendicular to the longitudinal axis of the device 10. It can be appreciated that the transducers 26 and 27 can be oriented in any direction that is required for the particular application.
  • FIG. 7 depicts one embodiment where a 5 MHz transducer 26 is located on the distal end 21 of the device. In such an embodiment, the diameter of the transducer 26 may be 3 mm and the transducer may be forward facing. In such an embodiment, the scanning range is approximately 14 mm. In some aspects, the area of the region scanned by a transducer 26 (hatched section in FIG. 8) may not exceed the outer diameter of the transducer. This may be because the scanning width is circumscribed by the outer diameter of the transducer 26 and the peripheral limitations of the transducer 26 such that the transducer 26 cannot identify or scan a region that lies beyond the diameter of transducer 26.
  • Accordingly, in some aspects such as those disclosed herein where the transducer 26 is housed within a device 10, the transducer 26 may be unable to scan the region that is directly in front of (distal to) the outer portions of the device that houses the transducer 26. This region 44 is depicted in FIG. 9. One result of this is that the target anatomy may go undetected if it is positioned beyond the scanning region of the transducer 26.
  • In some aspects, for example to detect target anatomy that lies just beyond the scanning region of the transducer 26 (i.e. outside of the scanning diameter), the device 10 may include two or more transducers positioned at an angle relative to one another. For example, as shown in FIGS. 10 and 11, the first transducer 40 may be positioned at an angle with respect to the outer edge of the main body 16. The angle may be measured from the face of distal end 21 of the device 10, or may be measured from the horizontal axis that intersects the longitudinal axis of the device 10. The angle α in this embodiment is 7° but it is appreciated that it can vary from 0° to 180° depending on the particular embodiment. Further, in this embodiment, the angle may be formed between the edge of the transducer 26 and the adjacent outer edge of the main body 16.
  • Transducer 40 may be angled with respect to any portion of the main body 16. For example, the first transducer 40 may be angled by pivoting the first transducer 40 about the portion positioned along, or closest to, the longitudinal axis of the main body 16 as shown in FIG. 12. Specifically, as shown in FIG. 12, the transducers 40, 42 may be angled towards the longitudinal axis of the main body 16 such that the angles of tilt, α1 and α2, are measured as the angle of each transducer from the longitudinal axis of the main body 16.
  • In the embodiment shown in FIG. 10, the first transducer 40 may be positioned at an angle (e.g., 7° from the edge of the main body 16) allowing the first transducer 40 to scan a region extending beyond the outer edge of the main body 16. Angling the first transducer 40 may allow the device 10 to scan and detect any target anatomy residing outside of the scanning region of a transducer that is not angled with respect to the main body 16 of the device. A region that may be scanned by the first angled transducer 40 in this particular embodiment is shown in FIG. 10.
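The benefit of tilting the first transducer 40 can be illustrated with simple beam geometry. The sketch below is an assumption for illustration (not taken from the patent text): it computes how far the beam axis of a transducer tilted by an angle reaches laterally at a given depth, which is how the tilt extends the scan past the outer edge of the main body 16.

```python
import math

# Illustrative geometry (an assumption, not from the text): a transducer
# tilted by angle alpha has its beam axis displaced laterally by
# depth * tan(alpha), letting the scan reach tissue beyond the outer
# edge of the main body 16.

def lateral_reach_mm(tilt_deg, depth_mm):
    """Lateral offset of the beam axis at a given depth for a tilt angle."""
    return depth_mm * math.tan(math.radians(tilt_deg))

# With the 7 degree tilt of FIG. 10, at a 10 mm depth the beam axis sits
# roughly 1.2 mm outside where an un-tilted beam axis would be.
print(round(lateral_reach_mm(7.0, 10.0), 2))
```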
  • The two transducer elements may fit on the tip of a device 10. This may allow, for example, the device 10 to detect backscattered signals. In some aspects, the device 10, and/or the two transducer elements may be configured to detect the presence of nerves or other neural elements existing distally up to 1 cm from the probe tip 22 (e.g., the nerves or neural elements may be in the pathway of the device). In some aspects, a device may be provided with transducers having different stimulating frequencies (e.g., a first transducer 40 may stimulate with a frequency of 5 MHz, and a second transducer may stimulate with a frequency of 10 MHz). Beam patterns or fields emitted from the sources may be modeled using an ultrasound simulation program assuming an element diameter of 3 mm and a focal number of 3 (f/3).
  • In some aspects, two transducer elements 40, 42 may be positioned at the tip of the probe. The tilted element 40 faces outward with an angle of tilt, such as 7°. The tilted element may have one end at the edge of the probe. The un-tilted element 42 may be centered from the edge of the distal portion of the device 10. This configuration may allow the device 10 to be rotated as it is snaked through the tissue so that the region in front of the full cross-sectional surface area of the probe may be scanned to at least 1 cm above the probe surface.
  • The diameter of the transducers 40, 42 may vary and can range from 1 mm to 20 mm. For example, in the embodiment shown in FIG. 10, the first and second transducers 40, 42 each have a diameter of approximately 3 mm. It is not necessary for the transducers to have the same diameter as one another, and they may be staggered as shown in the top view of FIG. 10.
  • The main body 16 can be rotated so as to allow the first transducer 40 to scan the entire outer region to detect whether any target anatomy is present that is just beyond the scanning area of a forward facing transducer. By rotating the main body 16 about its longitudinal axis, the first transducer 40 can scan the outer region that does not fall within the scanning region of a transducer that is not angled with respect to the main body 16 of the device.
  • The number of angled transducers may vary and they may be positioned at various angles with respect to the distal end of the main body 16. For example, as shown in FIG. 12, the first and second transducers 40, 42 are angled toward one another so that their scanning areas cross to provide a scan of the area distal to the distal portion 20 of the device 10 and a region beyond the area directly in front of the outer diameter of the transducers 40, 42 as shown in FIG. 9. Also, the transducers 40, 42 may be positioned at an angle on more than one axis with respect to the distal end of the main body 16.
  • The device 10 can be configured to determine the b-mode scans of the patient's anatomy and associated data, including, for example, the voltage trace from a scan line in the b-mode image. Certain anatomical parts (e.g., a nerve) may have a unique voltage trace that may be used to detect like anatomical parts within the patient's anatomy. One way to detect like anatomical parts may be by comparing the voltage trace from a scan line of a b-mode image to the known voltage trace from the scan line of the target anatomy. Specifically, the b-mode scans (and associated data, such as a-scan lines, voltage traces, and the like) may be captured by the device 10. The scans and/or data may be compared to the pre-determined b-mode scans (and associated data) of known anatomical features (e.g., nerve) to determine whether the region captured by the b-mode scan from the device 10 contains the target anatomy.
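One hedged way to implement the trace comparison just described is normalized cross-correlation between a captured voltage trace and a stored reference trace for the target anatomy. The reference values and the 0.9 match threshold below are illustrative assumptions, not values from the patent.

```python
import math

# Hedged sketch: compare a captured scan-line voltage trace against a
# stored reference trace using normalized cross-correlation. A score
# near 1.0 indicates a close match; the 0.9 threshold is illustrative.

def normalized_correlation(trace, reference):
    num = sum(a * b for a, b in zip(trace, reference))
    den = math.sqrt(sum(a * a for a in trace) *
                    sum(b * b for b in reference))
    return num / den if den else 0.0

def matches_target(trace, reference, threshold=0.9):
    """Report a match when the captured trace is within the predetermined
    similarity of the known target-anatomy trace."""
    return normalized_correlation(trace, reference) >= threshold

nerve_signature = [0.1, 0.8, 0.3, -0.6, -0.2]   # stored reference (toy)
captured = [0.12, 0.79, 0.28, -0.61, -0.18]     # freshly captured (toy)
print(matches_target(captured, nerve_signature))
```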
  • The device 10 may be used in conjunction with a neuromonitoring system capable of detecting certain portions of a patient's anatomy, including neural elements that include a nerve, nerve bundle, or nerve root. For the purposes of this discussion, the device 10 and neuromonitoring system will be discussed with respect to detecting a patient's spinal nerve but it is contemplated that the device 10 and neuromonitoring system can be used to detect other nerves (peripheral and central) as well as the spinal cord. One type of neuromonitoring system that can be used in conjunction with the device 10 is disclosed in U.S. Pat. No. 7,920,922, the entirety of which is incorporated by reference herein.
  • Experimentation
  • The discussion below is directed to the experiment used to determine the target ultrasonic frequency that can be used to detect nerve using the device 10 and whether the b-mode scan images captured by the device 10, which is inserted into the patient's anatomy, are comparable to results captured by traditional non-invasive ultrasound devices. Both target objectives were accomplished using the following process.
  • FIG. 13 illustrates exemplary data indicative of a scan of a sciatic nerve of a rabbit with a clinical ultrasound array system. The scan was performed before and after euthanizing the rabbit to be sure that the nerve could be seen in both cases. FIG. 13 depicts b-mode images (nerve cross-sectional view) for the alive (left, image 1300) and dead (right, image 1350) cases. The nerve may be seen in each case (area pointed to by white arrow on image 1300 and image 1350).
  • The nerve was scanned using a high frequency (40 MHz) probe with the bottom of the nerve still attached to the muscle and the nerve centered in the probe's depth of field. FIG. 14 illustrates a b-mode image 1400 of this scan. The image 1400 shows (from top to bottom): water, nerve, and muscle. The nerve separates from the muscle towards the right side of the image, and a gap is visible between the nerve and the muscle. FIG. 14 also illustrates a plot 1450 depicting the voltage trace from a scan line in the center of the b-mode image, indicated by the vertical line.
  • The hind limb sciatic nerve was scanned with a 20 MHz single-element probe through the leg muscle. The muscle was kept intact, and the skin was removed to provide a window to see into the muscle. The image 1500 shown in FIG. 15 illustrates the setup.
  • A clinical scan was performed before scanning with the 20 MHz probe. Images 1600 and 1650 from the clinical scan of the nerve are shown in FIG. 16, and images 1710 and 1720 from the scan performed by the 20 MHz probe are shown in FIG. 17. A comparison between the clinical imaging system and the 20 MHz probe system suggests that both techniques produce similar images. For example, the cross-section and length-wise (with respect to the long direction of the nerve) scan planes both show the nerve in the background muscle for both the clinical system and the 20 MHz system.
  • The 20 MHz results are important for at least two reasons: First, the results show that the contrast inside the muscle exists at 20 MHz as demonstrated in the left image 1710 in FIG. 17. Second, the depth of penetration for the 20 MHz signal was sufficient to be seen at more than 1 cm of depth. This may be the distance away from the surgical probe required for detecting the nerve. Therefore, this suggests that if the signals can be used to detect the nerve, the signal strength and penetration should not be an issue at the chosen ultrasound frequencies.
  • FIG. 18 depicts an image 1800 of a single scan line through the nerve. There are characteristic signatures from the nerves that can be used to detect the nerve from a single scan line.
  • In another experiment, a dual element transducer 40, 42, having a configuration similar to the embodiment disclosed in FIG. 10, was utilized. This embodiment was used for testing of sciatic nerves of 22 rabbit legs post-mortem. A total of 142 sets of radio frequency (RF) data were collected. Each slice of data was recorded over a 30 mm lateral distance capturing an axial region of approximately 5-20 mm from the transducer surface. Imaging performance was evaluated when conducting a sector scan of −35° to 35° at a depth of up to 1.5 cm and a center frequency of up to 15 MHz. The ultrasonic transducer used for scanning was a dual-element, 10 MHz transducer, where each element was 3 mm in diameter with an f-number of 3. Images were acquired using a Pulser/Receiver, the settings of which are shown in the table below.
  • TABLE 1
    Pulser/Receiver settings
    Pulse Repetition Frequency: 200 MHz
    Energy: 12.5 µJ
    Damping: 25 Ohms
    High Pass Filter: 1 MHz
    Low Pass Filter: 20 MHz
    Input Attenuation: 0 dB
    Output Attenuation: 0 dB
    Gain: 20 dB
  • As discussed above, the device was used to capture an ultrasonic B-mode image, where the sciatic nerve may be identified as an isolated, hyperechoic region, usually elliptical or circular in nature as seen below. As such, one of the first attempts to classify the images was based on thresholding on the energy of the received scan line. From the data displayed in FIG. 41, for example, it may be seen that the scan line energy does allow for the detection of the presence of the sciatic nerve. Using energy alone as a classification criterion, however, may not be adequate to distinguish the diffuse, weak signature of the nerve from the sharp, strong signature of an isolated strong scatterer (e.g., data point 4100 in FIG. 41).
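The limitation of energy-only thresholding can be seen in a small numeric sketch. The values below are illustrative, not experimental data: a diffuse, weak nerve echo and a single isolated strong scatterer can carry essentially the same total scan-line energy, so energy alone cannot tell them apart.

```python
# Illustrative numbers (not experimental data) showing why scan-line
# energy alone cannot separate the two cases described above: a diffuse,
# weak echo pattern and one isolated strong scatterer can carry the
# same total energy.

def scan_line_energy(samples):
    return sum(s * s for s in samples)

diffuse_nerve = [0.3] * 16              # weak echoes spread along the line
strong_scatterer = [0.0] * 15 + [1.2]   # one sharp, strong reflection

e_nerve = scan_line_energy(diffuse_nerve)         # 16 * 0.3^2 = 1.44
e_scatterer = scan_line_energy(strong_scatterer)  # 1.2^2 = 1.44
print(abs(e_nerve - e_scatterer) < 1e-9)  # essentially equal energies
```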
  • Low frequency ringing may also be present in the received RF scan lines, particularly near the surface of the transducer. Spectral analysis of the ringing may indicate a high noise component, such as a noise component between 500 kHz and 2 MHz. To combat this noise, the received RF may be passed through a finite impulse response (FIR) bandpass filter, such as software, hardware, and/or firmware configured using the design specifications described in Table 2 and illustrated in example user interface 4200 of FIG. 42. Furthermore, an illustrative output 4300 of the bandpass filter is shown in FIG. 43.
  • TABLE 2
    Bandpass filter design specifications
    Type: Optimal Equiripple
    Order: 296 (minimum order)
    Sample Frequency: 250 MHz
    Fstop1: 5 MHz
    Fpass1: 7 MHz
    Fpass2: 15 MHz
    Fstop2: 17 MHz
    Astop1: 60 dB
    Apass (ripple): 0.3 dB
    Astop2: 60 dB
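The equiripple design of Table 2 would normally be produced with a filter-design tool (e.g., the Parks-McClellan algorithm). As a simpler stand-in with the same 7-15 MHz pass band at a 250 MHz sample rate, the sketch below builds a windowed-sinc FIR band-pass filter and checks its frequency response; the tap count and Hamming window are illustrative choices, not the 296-tap optimal design described above.

```python
import math

# Windowed-sinc FIR band-pass filter: a simple stand-in for the
# equiripple design of Table 2 (same 7-15 MHz pass band at a 250 MHz
# sample rate; tap count and window are illustrative assumptions).

def bandpass_taps(f_lo_hz, f_hi_hz, fs_hz, num_taps=401):
    """Band-pass impulse response: difference of two ideal low-pass
    (sinc) responses, shaped by a Hamming window."""
    m = num_taps - 1
    taps = []
    for n in range(num_taps):
        k = n - m / 2.0
        if k == 0:
            h = 2.0 * (f_hi_hz - f_lo_hz) / fs_hz
        else:
            h = (math.sin(2 * math.pi * f_hi_hz * k / fs_hz)
                 - math.sin(2 * math.pi * f_lo_hz * k / fs_hz)) / (math.pi * k)
        w = 0.54 - 0.46 * math.cos(2 * math.pi * n / m)  # Hamming window
        taps.append(h * w)
    return taps

def gain_at(taps, f_hz, fs_hz):
    """Magnitude of the filter's frequency response at one frequency."""
    re = sum(t * math.cos(2 * math.pi * f_hz * n / fs_hz)
             for n, t in enumerate(taps))
    im = sum(t * math.sin(2 * math.pi * f_hz * n / fs_hz)
             for n, t in enumerate(taps))
    return math.hypot(re, im)

taps = bandpass_taps(7e6, 15e6, 250e6)
# An in-band 10 MHz tone passes near unit gain; the low-frequency
# (e.g., 1 MHz) ringing described above is strongly rejected.
print(round(gain_at(taps, 10e6, 250e6), 2), gain_at(taps, 1e6, 250e6) < 0.02)
```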
  • Since classification of the nerve using only the scan line energy may not yield satisfactory results, it is appreciated that a multivariate classification approach can be explored. One commonly used multivariate classification algorithm is a support vector machine (SVM). An SVM may be a supervised learning algorithm that attempts to find an optimally separating hyperplane between two labeled sets of data. Each observation within the data sets consists of a number of features, which, in some aspects, may be descriptive variables used to help classify the data.
  • For example, with reference to SVM classification schemes 4410 and 4420 in FIG. 44, consider the two-dimensional classification problem below. The ith observation in the training set is associated with a feature vector x(i) = (x1(i), x2(i)) and a data label y(i) ∈ {−1, 1} which describes the observation's class. Any hyperplane over the feature space may be defined as {x : f(x) = βᵀx + β₀ = 0}, where β is a vector. The general goal of the SVM is to solve the optimization problem
  • min(β, β₀) ‖β‖
      • subject to the constraint y(i)(βᵀx(i) + β₀) ≥ 1
  • for all i in the training set. By solving this optimization problem, the SVM may locate the hyperplane which maximizes the margin between separable data. Once the values for β and β0 are found, new events may then be classified based on which side of the hyperplane they lie, or equivalently:

  • ŷ(x) = sign(βᵀx + β₀)
  • This simple form of SVM works for separable data, like the case in FIG. 44 on the left (e.g., scheme 4410). However, in many cases the data may not be separable, such as the case on the right (e.g., scheme 4420). In this case, the SVM must incorporate slack variables, ξi, into the optimization problem. The new optimization problem is then given by
  • min(β, β₀, ξi) (1/2)‖β‖² + C Σi ξi, subject to y(i)(βᵀx(i) + β₀) ≥ 1 − ξi
  • By adding these slack variables, the constraint is now much less restrictive, since the slack variables allow for particular data points to be misclassified. The amount of misclassification may be controlled by the reweighting factor C known as the box constraint. In some aspects, the box constraint may be a parameter specified by the operator. When the box constraint is high, the optimization algorithm may force the slack variables to be small, thus resulting in a more restrictive classification algorithm.
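The soft-margin optimization above can be illustrated with a minimal linear SVM trained by stochastic subgradient descent on the equivalent hinge-loss form of the same objective. The toy data, learning rate, and epoch count below are illustrative assumptions, not a description of the classifier actually used in the experiments.

```python
import random

# Minimal soft-margin linear SVM via stochastic subgradient descent on
# the hinge-loss form of the problem above:
#   minimize (1/2)||beta||^2 + C * sum_i max(0, 1 - y(i)(beta.x(i) + beta0))
# Toy data, learning rate, and epoch count are illustrative.

def train_svm(data, C=1.0, lr=0.01, epochs=200, seed=0):
    rng = random.Random(seed)
    b1 = b2 = b0 = 0.0
    for _ in range(epochs):
        rng.shuffle(data)
        for (x1, x2), y in data:
            if y * (b1 * x1 + b2 * x2 + b0) < 1:   # inside the margin
                g1, g2, g0 = b1 - C * y * x1, b2 - C * y * x2, -C * y
            else:                                   # outside: only shrink
                g1, g2, g0 = b1, b2, 0.0
            b1, b2, b0 = b1 - lr * g1, b2 - lr * g2, b0 - lr * g0
    return b1, b2, b0

def predict(beta, x):
    b1, b2, b0 = beta
    return 1 if b1 * x[0] + b2 * x[1] + b0 >= 0 else -1

# Two toy feature clusters standing in for 'nerve' (+1) vs 'other' (-1)
data = [((2.0, 2.0), 1), ((2.5, 1.8), 1), ((3.0, 2.2), 1),
        ((-1.0, -1.5), -1), ((-2.0, -1.0), -1), ((-1.5, -2.0), -1)]
beta = train_svm(list(data))
print(all(predict(beta, x) == y for x, y in data))
```

A larger box constraint C penalizes the slack variables more heavily, reproducing the more restrictive behavior described above.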
  • Some classification problems do not automatically lend themselves to simple linear decision boundaries. In these cases, a feature set may be transformed into a different domain before attempting linear separation. For example, the instances of x may be replaced with the transformed version h(x). Typically these feature transformations are specified by their kernels. The kernel of the transformation is defined as the inner product between transformed feature vectors, or symbolically,

  • K(x(i), x(j)) = ⟨h(x(i)), h(x(j))⟩
  • A commonly used kernel may be the radial basis function or Gaussian kernel, which may have the form
  • K(x(i), x(j)) = exp(−‖x(i) − x(j)‖² / (2σ))
  • In practice, Gaussian kernels are generally known to perform well in nonlinear classification. However, Gaussian kernels also add another degree of freedom to optimize over: the width parameter σ. This is yet another parameter which may have to be tuned by the operator depending on the target anatomy.
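Written out directly, the Gaussian kernel above is a one-line function of the squared distance and the width parameter σ; the sample points below are illustrative.

```python
import math

# The Gaussian (radial basis function) kernel above, written directly
# as a function of the squared distance and the width parameter sigma.

def gaussian_kernel(xi, xj, sigma=1.0):
    sq_dist = sum((a - b) ** 2 for a, b in zip(xi, xj))
    return math.exp(-sq_dist / (2.0 * sigma))

print(gaussian_kernel([0.0, 0.0], [0.0, 0.0]))            # identical -> 1.0
print(round(gaussian_kernel([0.0, 0.0], [1.0, 1.0]), 3))  # decays: 0.368
```

A wider σ makes the similarity decay more slowly with distance, which is exactly the degree of freedom the operator must tune.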
  • For each scan line, a set of features may be generated based on the statistical information of the received RF data and envelopes. In order to help mitigate corruption due to isolated strong scatterers, statistics based on the log of the envelope may also be computed. The identity and uniqueness of the classifier for this problem lies in the combination of features used to build it. A complete list of features is given below. It is appreciated that for a particular embodiment, any combination of these features may be used for the SVM to identify the target anatomy: (1) skewness of the received RF, (2) mean of the envelope, (3) variance of the envelope, (4) skewness of the envelope, (5) kurtosis of the envelope, (6) mean of the log of the envelope, (7) variance of the log of the envelope, (8) skewness of the log of the envelope, (9) kurtosis of the log of the envelope.
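The nine-feature set listed above can be sketched as follows. Using the rectified signal as the envelope is a simplifying assumption made for illustration; a practical system would more likely use an analytic-signal (Hilbert) envelope.

```python
import math

# Sketch of the nine-feature set listed above: skewness of the RF plus
# central-moment statistics of the envelope and of its log. The
# rectified ("absolute value") envelope is a simplifying assumption.

def moments(xs):
    """Mean, variance, skewness, and kurtosis of a sample."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    sd = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in xs) / n / sd ** 3 if sd else 0.0
    kurt = sum((x - mean) ** 4 for x in xs) / n / sd ** 4 if sd else 0.0
    return mean, var, skew, kurt

def scan_line_features(rf):
    envelope = [abs(s) for s in rf]
    log_env = [math.log(e + 1e-12) for e in envelope]  # avoid log(0)
    rf_skew = moments(rf)[2]           # feature (1)
    env_moments = moments(envelope)    # features (2)-(5)
    log_moments = moments(log_env)     # features (6)-(9)
    return (rf_skew,) + env_moments + log_moments

features = scan_line_features([0.1, -0.4, 0.9, -0.2, 0.05, -0.7])
print(len(features))  # 9 features per scan line
```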
  • FIG. 45 illustrates a plot matrix 4500 demonstrating that the above-recited features do not appear to completely discriminate scan lines containing the nerve from scan lines that do not. Most of the data does not appear to follow a simple linear decision boundary, so a Gaussian kernel with σ=1 was used to help preserve nonlinear tendencies. The box constraint parameter was also set to 1 during training.
  • To evaluate the performance of the SVM, an evaluation metric may be computed by a specially-programmed computing device. In some aspects, the receiver operating characteristic (ROC) may be used as an evaluation metric. For a classification algorithm, the ROC curve plots the true positive rate vs. the false positive rate, thus displaying the different trade-offs between operating at various threshold levels. FIG. 46 illustrates the results 4600 of the ROC plot, indicating that the classification algorithm appears to have good performance based on the training data distributions.
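The ROC curve referenced above can be computed by sweeping a decision threshold over the classifier scores and recording the resulting (false positive rate, true positive rate) pairs; the scores and labels below are illustrative.

```python
# Sketch of building the ROC curve described above: sweep a decision
# threshold over classifier scores and record the (false positive rate,
# true positive rate) pair at each threshold.

def roc_points(scores, labels):
    pos = sum(1 for y in labels if y == 1)
    neg = len(labels) - pos
    points = []
    for t in sorted(set(scores), reverse=True):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == -1)
        points.append((fp / neg, tp / pos))
    return points

scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]   # toy classifier outputs
labels = [1, 1, -1, 1, -1, -1]            # toy ground truth
for fpr, tpr in roc_points(scores, labels):
    print(round(fpr, 2), round(tpr, 2))
```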
  • The testing demonstrated that a Gaussian-based SVM can be a powerful tool in determining the presence of the sciatic nerve in a single scan line. The majority of the power of any multivariate algorithm lies within the features used to describe the data. Utilizing just a single scan line, the system may be able to achieve a true positive rate of over 80% and a false negative rate of less than 10%.
  • Additional techniques contemplated include post-processing schemes, such as time gain compensation, to accentuate deeper features in the tissue, or using filters such as median filters to remove some of the strong peaks from the energy signals.
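Both post-processing schemes mentioned above can be sketched briefly; the gain-growth rate of the time gain compensation and the median-filter window size below are illustrative assumptions.

```python
# Hedged sketches of the two post-processing schemes mentioned above:
# time gain compensation (amplifying later, deeper echoes to offset
# attenuation) and a median filter (suppressing isolated strong peaks).

def time_gain_compensation(samples, db_per_sample=0.5):
    """Amplify later (deeper) samples to offset attenuation with depth."""
    return [s * 10 ** (db_per_sample * i / 20.0)
            for i, s in enumerate(samples)]

def median_filter(samples, window=3):
    """Replace each sample with the median of its neighborhood,
    suppressing isolated strong peaks in the energy signal."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        neighborhood = sorted(samples[max(0, i - half): i + half + 1])
        out.append(neighborhood[len(neighborhood) // 2])
    return out

# An isolated strong peak is removed while the background is preserved:
print(median_filter([0.1, 0.1, 5.0, 0.1, 0.1]))  # [0.1, 0.1, 0.1, 0.1, 0.1]
```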
  • The detection of the nerve (or any other anatomical feature) may be automated. Once the anatomical feature is detected, an audio or visual signal such as a "beeping" sound or a flashing light signal (or similar signal) may be given to a physician to indicate that they, or the device, are within a certain distance from the nerve.
  • Automatic detection of nerve may be based on single scan lines, and may compare the b-mode scan lines captured by the probe with the known scan lines of the target anatomy. In some aspects, the detection system may notify the operator that the captured scan lines are identical to, or are within a certain predetermined value of, the known scan lines of the target anatomy (e.g., the known scan lines of the target anatomy may represent a unique signature). The detection system may also be calibrated to determine the proximity of the tip of the probe to the target anatomy and notify the operator when the tip of the probe is within a set distance (e.g. 1 mm). Furthermore, the system may be configured to notify the operator of the spatial location of the target anatomy and/or inversely the spatial location of non-target anatomy.
  • Further Details Regarding Aspects of the Present Disclosure
  • In some aspects, the device 10 may be equipped with an image capture system 200 as shown in FIGS. 29 and 30 used to detect certain portions or aspects of the anatomy (e.g., nerve or vessels). The image capture system 200 may also be used independently of, or in conjunction with, the ultrasound imager 24, as described above in the device 10 embodiments, and/or may be used as a stand-alone or a complementary detection technique to the ultrasound imager 24. In some aspects, the tip 22 in this embodiment may be a lens 23 that has an outer surface 202 and an inner surface 204. It is appreciated that the outer surface 202 of the lens 23 need not be the same shape as the inner surface 204 and may differ depending on the desired optical performance. The lens 23 may or may not provide a magnification of an image, I, beyond the lens 23. In this embodiment, the lens 23 provides no magnification. The lens 23 in this embodiment is clear, but can also be tinted or provided with a color filter as will be discussed below and may have anti-fog, anti-condensation, and/or anti-reflection coatings or properties.
  • As further seen in FIG. 30, the distal portion 14 of the device 10 further includes the image capture system 200 having an image capture device 201. The image capture device 201 may include an image capture sensor 206 that is disposed adjacent to the distal end 20 of the main body 16. The image capture device 201 can be connected to a flexible sheath 203 that may carry a fiber optic cable 205 that connects the image capture device 201 to a housing 207 that houses the image processing components. It is also contemplated that the image capture device 201 can be wirelessly coupled to the image processing components.
  • An image capture output device may be included that is in communication with an image control system that can adjust the properties of the image quality. It can also be appreciated that the image capture device 201 may wirelessly transmit images and video to the image control system without any hard wired components. The term "image capture device" may include devices configured to record or capture either still or video images, such as video cameras, digital cameras, CCD sensors, and the like. One example of an image capture device that may be used with the device may be a 1/18″ CMOS camera with OMNIVISION sensor OV 6930. However, those skilled in the art will readily contemplate the settings and components, such as the "image capture device," illumination device, and the like, in accordance with the present disclosure as described herein.
  • In one aspect, the image sensor 206 of the image capture device 201 may be disposed within the distal portion 14 of the main body 16. The image sensor 206 or the capture device 201 may be at the most distal end 20 of the main body 16 such that it forms a distal surface 210 of the image sensor 206 (or capture device 201) or is flush with the distal end 20 of the main body 16. In some aspects, "image sensor" may be synonymous with "image capture device." In some aspects, the image sensor 206 may be set back proximally from the distal end 20 of the main body 16, depending on the application. For example, the image sensor 206 may be recessed from the distal end 20 of the main body 16. Alternatively, the image sensor 206 or the image capture device 201 may extend from the distal end 20 of the main body 16 towards the inner surface 204 of the tip 22.
  • The image capture device 201 may define an optical axis O. As shown in FIG. 30, the optical axis O is the axis collinear with the longitudinal axis L defined by the main body 16 as discussed above. However, the optical axis O may also be offset from the longitudinal axis L. The image capture device 201 will have a field of view α, varying between 5° and 180° depending on the specific application.
  • The image capture device 201 may be configured to capture an image I that exists just beyond the outer surface 202 of the tip 22. Specifically, as best shown in FIGS. 31 and 32, the tip 22 may be configured to dissect the anatomy of a patient, designated here as element 212. As seen here, the tip 22, while dissecting the anatomy, may create a working space 214 by way of its shape. This may enable the image capture device 201 to view the anatomy during the dissection process instead of having the anatomy directly abut the image capture device 201, which may distort the image quality.
  • In this embodiment, the working space 214 may be defined as the space between the distal surface 210 of the image capture device 201 and the outer surface 202 of the tip 22. The working space 214 may permit the image capture device 201 to view a portion of the anatomy that is within its viewing angle α. Without the working space 214, the anatomy might abut and obstruct the image capture device 201, thereby preventing an illumination device 216 from illuminating the anatomy and the image capture device 201 from capturing an image. The working space 214 in some aspects may be primarily filled with air, but it can be appreciated that the working space may be filled with other materials, such as a liquid or other gases. In the alternative, the working space 214 may be created by a tip that is solid such that the inner surface 204 of the tip 22 is adjacent to the distal end 20 of the main body. It can be appreciated that the working space 214 may create a distance between the outer surface 202 of the tip 22 and the image capture device 201 of 2 mm to 10 mm or greater.
  • The distal portion 14 of the main body 16 may also include the illumination device 216, as shown in FIG. 33, which is a cross-sectional view of FIG. 30. The illumination device 216 may include a set of light emitting diodes 218 (“LEDs”). It can be appreciated that the number of LEDs 218 and the location of each LED 218 with respect to the image capture device 201 may vary. For example, there may be only one LED 218 for a particular application. Conversely, there may be as many as four or more LEDs 218. In some aspects, the LEDs 218 are spaced equidistant from one another, but this is not a requirement and the LEDs 218 may not be equally spaced. Other illumination devices 216 may include a light source such as a near-infrared LED, a mid-infrared LED, other LEDs of various wavelengths ranging from UV to infrared, or any similar light source known in the art. The illumination device 216 may also take the form of a light source emitting from an annular ring around the image capture device 201 or a plurality of light fibers angularly spaced around the image capture device 201.
  • The illumination device 216 in the device 10 may be disposed within the distal end 20 of the main body 16. However, the illumination device 216 may be external to the distal end 20 or tip 22, or be embedded in the tip 22 such that the illumination device 216 does not create a reflection on the inner surface 204 of the tip 22, as a reflection may impair the field of visibility of the image capture device 201. For example, and without limitation, the illumination device 216 may be disposed on either side of the longitudinal axis L of the main body 16, as shown in FIG. 34, and distal from the distal end 20 of the main body 16.
  • Regardless of the type of illumination device 216, the intensity of the illumination device 216 may be adjusted so as to change the level of illumination of the anatomy of a patient. Moreover, if the illumination device 216 includes more than one illumination source, certain of the illumination sources may be turned off while others remain on, and the intensity of each source may be independently adjusted.
  • The illumination device 216 may also include color filters so as to highlight certain features of the patient's anatomy, such as a nerve or vessel, through a filter. In some aspects where the illumination device 216 consists of one or more LEDs 218, different color LEDs, such as red, blue, yellow, green or others, may be used to enhance or highlight certain features of a patient's anatomy. One alternative to placing the filters with the illumination device 216 may be to include the filters with the image capture device 201. The filters may include color filters, including red, orange, yellow, green, magenta, blue, violet, and the like. The filters may also include band-pass filters, such as UV, IR, or UV-IR filters, to better view the desired anatomy. Alternatively, or in conjunction with the above, the illumination
    device 216 may rely on ultraviolet light to detect certain features of the patient's anatomy, such as a nerve. In this embodiment, the illumination device 216 may include a light source irradiating the desired portion of a patient's anatomy with illumination light including excitation light of a wavelength band in the ultraviolet or visible region. It is also contemplated that the illumination device 216 may emit illumination light in the visible and ultraviolet regions simultaneously, with an image capture device having an appropriate spectral response capturing images in both regions.
  • The illumination device 216 may have an illumination axis IA as disclosed in FIG. 35. The illumination axis IA may or may not be collinear with the optical axis O or the longitudinal axis L of the device. The illumination device 216 and the location of the illumination axis IA may result in a reflection captured by the image capture device 201 that may distort the image of the patient's anatomy. The reflection may be caused by the rays generated by the illumination device 216 reflecting from the outer surface 202 or inner surface 204 of the tip 22 or the lens 23. To minimize and/or eliminate the reflection caused by the illumination device 216, the illumination axis IA may be displaced relative to the optical axis O of the image capture device 201. In the aspect shown in FIG. 35, the optical tip is shown as a solid lens with the elliptical outside surface 202. One property of elliptical surfaces is that if a light source is placed at one focus, all light rays striking the surface of the ellipse are reflected to the second focus. Each of two light sources 216, in the form of an LED or optical fiber, may be placed at a focus of the elliptical surface 202. The illumination rays 220 reflected from the elliptical surface may concentrate at the foci of this surface, while the rays 222 reflected from the tissue abutting the surface 202 may be directed toward the capture device 201. This arrangement eliminates the direct optical reflection of illumination beams into the image capture device 201 or sensor 206 while maintaining its on-axis position. The offset between the optical axis O and the illumination axis IA may vary, including within a range of 1 to 2.5 mm.
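The elliptical-reflection property relied on above can be stated compactly; the semi-axis values in the example below are assumptions chosen purely for illustration and are not dimensions disclosed for the device.

```latex
% Ellipse with semi-major axis a and semi-minor axis b (a > b):
\frac{x^2}{a^2} + \frac{y^2}{b^2} = 1, \qquad
\text{foci at } (\pm c, 0), \qquad c = \sqrt{a^2 - b^2}.
% A ray emitted at one focus reflects off the ellipse through the other
% focus, so a source placed at (-c, 0) concentrates at (+c, 0) rather
% than reflecting directly into an on-axis image capture device.
% Example (assumed values): a = 3\,\mathrm{mm},\ b = 2.5\,\mathrm{mm}
% \Rightarrow c = \sqrt{9 - 6.25}\,\mathrm{mm} \approx 1.66\,\mathrm{mm},
% which falls within the 1 to 2.5 mm offset range noted above.
```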
  • Another embodiment of the device that includes an offset between the optical axis O and illumination axis IA is shown in FIG. 31. As shown in this embodiment, the tip optical axis SO of the tip 22 is offset from the optical axis O of the image capture device 201 to reduce or minimize the amount of reflection captured by the image capture device 201. In this embodiment, the illumination may be provided by an annular ring of optical fiber around the lens of the image capture device 201. The illumination beams reflected from the inner surface 204 will focus on the center of curvature of the tip 22. Because the tip optical axis is offset in relation to the optical axis of the image capture device 201, its center of curvature is outside of the objective field of view, and therefore the reflected beams do not degrade the image quality.
  • As shown in FIGS. 36-37, the device 10 may also include a conduit 224 disposed at least partially within the main body 16 and the tip 22. In the aspect shown in FIG. 36, the conduit 224 may be off-set from the longitudinal axis L of the main body 16. However, in some aspects, the conduit 224 may be collinear with the longitudinal axis L of the main body 16. The conduit 224 may extend from the proximal end 18 of the main body 16, or may be formed along only a portion of the main body 16, and extend all the way through the distal portion 14 of the device, including the tip 22. The conduit 224 may be used in the ultrasound or visualization embodiments of this device. The conduit 224 may also be placed along the longitudinal axis of the main body 16, which may require the transducer 26 to be offset to accommodate the conduit 224.
  • Stated differently, the location of the conduit 224 may vary and may be application dependent. For example, and without limitation, the conduit may be placed closer to the longitudinal axis L than shown in the embodiment disclosed in FIG. 36. In the alternative, the conduit may be placed on the exterior of the main body 16 of the device so as to form a raised portion that extends along a direction that is substantially parallel to the longitudinal axis L of the main body 16. In this embodiment, the conduit 224 would reside in the raised portion as shown in FIG. 37, which is a cross-sectional view of the distal end 20 of the main body 16 of this embodiment.
  • In addition, it is contemplated that there may be more than one conduit 224 for a particular device 10, thereby allowing an operator to simultaneously place multiple instruments into the patient's anatomy. The diameter of the conduit is application dependent and may vary between 0.3 mm and 5.5 mm.
  • A conduit 224, as shown in FIGS. 36-37, may be configured to receive a number of instruments, including: a guide wire to guide the device 10 through the anatomy of a patient; a k-wire to anchor the device 10 to a surgical site such as the disk space between two vertebrae; an illumination device, such as an optical fiber, to provide additional illumination to a particular region of the patient's anatomy; an ultrasound probe, such as the one described above, to image or detect particular regions of a patient's anatomy; a fiber optic cord that emits visible or infrared light to image or detect particular regions of a patient's anatomy; a nerve stimulator for neuromonitoring; and the like.
  • In some embodiments, as shown in FIGS. 48 and 49, the device 10 may be provided with a channel 4810 (hollow region) extending throughout the length of the main body 16 and the tip 22. FIG. 48 shows the device 10, viewed from the distal end, to illustrate the tip 22 in these embodiments. As shown in FIG. 48, the tip 22 may form an annulus or ring. The inner space provided by the channel 4810 may provide a working channel inside the device 10 through which a surgeon may perform clinical procedures while allowing the device 10 to remain in place. This may allow for the collection and display of data for real-time feedback from the site of the surgery during the surgical procedure. In aspects of these embodiments, the main body 16 and the tip 22 may provide a channel 4810 with a diameter in a range from 24 mm to 40 mm. The wall of the main body 16 and/or tip 22 may be 10 mm in width. Various other diameters and widths are contemplated herein; the values discussed above are examples to aid in understanding of the disclosure. For example, in some embodiments, the diameter of device 10 may be selected to allow a particular surgical implement to be fitted through the channel of device 10 so that imaging may continue as the implement is advanced into tissue. In various aspects, the channel 4810 may be off-set from the longitudinal axis of the main body 16. However, in some aspects, the channel 4810 may be collinear with the longitudinal axis of the main body 16. The channel 4810 may extend from the proximal end 18 of the main body 16 all the way through the distal portion 14 of the device 10, including the tip 22.
  • In these embodiments, multiple transducers 26, such as the transducers 4820 of FIG. 48, may be arranged on the distal end 20 of the device 10, on the ring-shaped tip 22. As discussed above, the transducers 26 may emit an ultrasonic frequency in a direction that is substantially parallel to the longitudinal axis of the device 10. It is also contemplated that various transducers may emit an ultrasonic frequency in a direction that is not parallel to the longitudinal axis of the device 10. It can be appreciated that the transducers 4820 may be oriented in any direction that is required for the particular application. The ultrasonic frequency may be between 1 and 10 MHz, depending on the application. The ultrasound imager 24 may collect information, such as acoustic property results, from the transducers and, using the classification algorithm, screen for target tissue, such as nerve tissue. Nerve tissue may be highlighted on a display. In some embodiments, a 2-D mapping or a 3-D visualization (e.g. volumetric image) of nerve tissue may be generated and displayed.
  • In some embodiments, as shown in FIG. 49, the device 10 may have a length of 120 mm to 150 mm, forming the shape of a tube 4910. It is contemplated that the device 10 may have a variety of other lengths, depending on the application. Probe 4920 is an example of a surgical tool that may be passed through the channel 4930 for various surgical procedures.
  • FIG. 50 shows the distal face of the tube, in order to illustrate various embodiments. The transducers are contemplated to have various shapes, such as square, rectangular (such as the rectangular transducers 5010), and circular (such as the circular transducers shown in FIG. 50). In some embodiments, more than one row of transducers may be arranged on the distal face, as illustrated in image 5030. Various quantities of transducers may be used, depending on the application. In some embodiments, 100 to 250 or more transducers may be positioned on the distal end of the tube. Transducer frequency may be in the range of 5 MHz to 20 MHz to cover the appropriate depth required for the application (for example, 0.1-5 cm in depth). Other transducer frequencies may be used in various embodiments.
  • The above embodiments may be further understood by referring to FIGS. 51-54. FIG. 51 shows the views 5110 and 5120 of device 10 having the channel as described above. Image 5130 shows the device 10 placed on the surface of the psoas muscle. The placement of the device 10 on the surface of the psoas muscle may be a first step prior to entering the psoas muscle in order to allow the surgeon to visualize the location of nerves within the psoas muscle using the methods as described herein. The transducers may emit ultrasound signals around the distal end of the device 10 and the image processor/software may form an image with a depth of 5-7 cm, creating a 3-D “volumetric” image of the area in the psoas muscle overlying the vertebral body and disc space. Based on use of the classification algorithm, sensory and motor nerves, and their routing, may be highlighted on the display.
  • FIG. 52 shows the device 10 and a surgical probe 5210. Image 5220 shows the device 10 placed into position on the surface of the psoas muscle. Image 5230 depicts the probe 5210 before it is placed into the channel of device 10, which is in position on the psoas muscle. FIG. 53 depicts the device 10, placed against the psoas muscle 5140 and with the probe 5210 inserted (retracted) into the channel of device 10. At this point, the device 10 is in position for performing the operations as described herein to map the nerve tissues in the psoas muscle. FIG. 54 depicts the device 10, placed against the psoas muscle 5140 and with the probe 5210 inserted into the channel of device 10 and into the psoas muscle. Image 5400 shows an example anatomical image with a viewing window 5410 as may be generated by use of the classification algorithm. The viewing window 5410 may depict a map of the nerve tissues in the psoas muscle proximate to the device 10.
  • Various aspects of the present disclosure are contemplated for being used in connection with minimally invasive surgery (MIS). The device 10 may be used for a variety of MIS procedures, including but not limited to, a lateral retroperitoneal interbody fusion (LLIF) (e.g., XLIF, DLIF), Axial Lumbar Interbody Fusion (AxiaLif), Transforaminal Lumbar Interbody Fusion (TLIF), Posterior Lumbar Interbody Fusion (PLIF), Anterior Lumbar Interbody Fusion, Trans-thoracic lumbar interbody fusion, Retropleural Thoracic Fusion, Interbody Fusion utilizing Kambin's Triangle, and Cervical/Thoracic/Lumbar Laminectomies, Foraminotomies and Diskectomies. The device 10 may be used to confirm that the area is clear of other anatomical parts, such as blood vessels, abdominal/pelvic viscera, nerve roots, and spinal cord. As shown in FIG. 19, once at the surgical site 46, the device 10 may be used to illuminate the surgical site 46, to allow a surgeon to introduce instruments (e.g. K-wire) to the surgical site via a conduit formed within the main body 16 of the device 10 or allow a retractor system or dilator system to create direct visualization and a working portal of the surgical site without the device 10.
  • As described above, there can be a number of applications for which this device 10 may be used, which require similar steps to access the surgical site. The method of use described below is in connection with performing an LLIF, but it can be appreciated that the device 10 can be used in a similar fashion for performing the other MIS procedures mentioned above.
  • In operation, the ultrasound imager 24 is used to detect the patient's anatomy as described herein. A surgeon may rely on the image or audio cues generated by the ultrasound imager 24 to detect the presence (or absence) of a nerve, thereby allowing the surgeon to reposition (or continue advancing) the device 10 through the patient's anatomy towards the surgical site 46. The ultrasound imager 24 may also be used to confirm that the image captured by an image capture device (not shown) is accurate by confirming the presence or absence of a targeted anatomical feature (e.g. nerve). The image capture device may consist of a camera or the like disposed within the distal portion 14 of the device 10 so as to capture an image of the region distal to the device 10.
  • The image capture system 200 may also be used in a similar fashion to visually detect the patient's anatomy. The image capture system 200 may be used to confirm what is detected by the ultrasound imager 24, or may be used independently to detect certain portions of the patient's anatomy.
  • The classifier/algorithm is integral to accurate nerve detection. In an alternative embodiment of the present invention, the classifier algorithm may be employed as follows:
  • An ultrasound probe acquires ultrasound backscatter data via a sector scan or similar ultrasound acquisition mode.
  • A 2D B-mode image may be constructed from the acquired data using known methods and techniques. In one embodiment, a 2-dimensional image may be built up by firing a beam vertically, waiting for the return echoes, storing the returned information, and then firing a new line from a neighboring transducer, building the image as a sequence of B-mode lines. In a linear array of ultrasound crystals, the electronic phased array may shoot parallel beams in sequence, creating a field that is as wide as the probe length (footprint). A curvilinear array may have a curved surface, creating a field at depth that is wider than the footprint of the probe, making it possible to create a smaller footprint for easier access through small windows. This may result in a wider field at depth, but at the cost of reduced lateral resolution as the scan lines diverge.
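The line-by-line B-mode formation described above can be sketched in code. The fragment below is an illustrative assumption only, not part of the disclosure: it stacks synthetic RF echo lines, detects the echo envelope via the analytic signal, and log-compresses the result into a B-mode-style image.

```python
import numpy as np
from scipy.signal import hilbert

def bmode_from_rf(rf_lines, dynamic_range_db=60.0):
    """Form a simple B-mode image from per-beam RF echo data.

    rf_lines: 2D array, one row per fired scan line.
    Returns a log-compressed envelope image scaled to [0, 1].
    """
    # Envelope detection via the analytic signal of each RF line.
    envelope = np.abs(hilbert(rf_lines, axis=1))
    envelope = envelope / envelope.max()
    # Log compression to the chosen dynamic range.
    floor = 10.0 ** (-dynamic_range_db / 20.0)
    db = 20.0 * np.log10(np.maximum(envelope, floor))
    return (db + dynamic_range_db) / dynamic_range_db

# Synthetic example: 64 scan lines, each a decaying sinusoid "echo".
t = np.linspace(0, 1, 512)
rf = np.array([np.sin(2 * np.pi * 50 * t) * np.exp(-5 * t) for _ in range(64)])
img = bmode_from_rf(rf)
```

A real scanner would additionally apply time-gain compensation and scan conversion (for sector or curvilinear geometries); those steps are omitted here for brevity.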
  • The image may be thresholded such that image intensity values above the threshold may be given a value of ‘1’ and image intensity values less than the threshold may be given a value of ‘0’. This may result in a binary map of intensity values.
  • The binary image may then be filtered with a smoothing filter which may consist of a simple median filter or similar smoothing filter. It is contemplated that a variety of digital filters may be employed to process the binary image.
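The thresholding and smoothing steps above may be sketched as follows; the threshold value, filter size, and synthetic test image are assumptions for illustration, using SciPy's median filter as one possible smoothing filter.

```python
import numpy as np
from scipy.ndimage import median_filter

def binarize_and_smooth(image, threshold, size=3):
    """Threshold an image to a binary map (1 above, 0 at/below the
    threshold), then smooth it with a simple median filter to suppress
    isolated speckle pixels."""
    binary = (image > threshold).astype(np.uint8)
    return median_filter(binary, size=size)

# Synthetic image: low-level noise plus one bright patch standing in
# for a candidate structure.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
img[20:30, 20:30] += 1.0
binary = binarize_and_smooth(img, threshold=1.0)
```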
  • The numbers of pixels for contiguous regions in the binary image may be counted.
  • Contiguous regions in the binary image that have a pixel count above a minimum threshold and below a maximum threshold may be selected. This may correspond to areas of contiguous regions in the binary image. The thresholds may be selected so that a nerve in the image will be detected and selected with a high degree of accuracy.
  • If a selected contiguous region is less than a specified distance from the probe (2 cm, for example), the contiguous region may be selected as a possible nerve. It should be understood that the measured distance may be within a range of distances, and that the specific distance of 2 cm is provided by way of example.
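The region counting, size gating, and distance gating described in the preceding steps might be implemented as sketched below; all thresholds and the axial pixel spacing are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import label

def select_candidate_regions(binary, min_pixels, max_pixels):
    """Label contiguous regions in the binary map and keep those whose
    pixel count lies between the minimum and maximum thresholds."""
    labeled, n = label(binary)
    counts = np.bincount(labeled.ravel())
    keep = [i for i in range(1, n + 1) if min_pixels <= counts[i] <= max_pixels]
    return labeled, keep

binary = np.zeros((40, 40), dtype=np.uint8)
binary[2:4, 2:4] = 1      # 4 pixels: below the minimum, rejected
binary[10:16, 10:16] = 1  # 36 pixels: within the thresholds, kept
labeled, keep = select_candidate_regions(binary, min_pixels=10, max_pixels=100)

# Distance gate: with row 0 adjacent to the probe and an assumed axial
# spacing, a kept region remains a possible nerve only if close enough.
axial_mm_per_px = 1.0
possible_nerves = [i for i in keep
                   if np.nonzero(labeled == i)[0].min() * axial_mm_per_px < 20.0]
```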
  • A shape factor may next be implemented by fitting an ellipse to outline the selected contiguous regions in the 2D binary image. The area of the contiguous region may be compared to the area of the ellipse outlining the contiguous region. If the ratio of the areas is below a certain threshold, the contiguous region may be classified as not a nerve. A variety of outlining techniques, aside from the fitting of an ellipse, may be used to compare the contiguous region.
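The ellipse-based shape factor might be computed from a region's second moments, as in the sketch below; the moment-based fit and the example disk are illustrative assumptions rather than the disclosed method.

```python
import numpy as np

def ellipse_shape_factor(mask):
    """Fit an ellipse to a binary region via its second moments and
    return area(region) / area(fitted ellipse); values near 1 indicate
    an elliptical, nerve-like cross-section."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([ys, xs], axis=1).astype(float)
    cov = np.cov(pts, rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)
    # For a uniform ellipse, the variance along a principal axis equals
    # (semi-axis)^2 / 4, so each semi-axis is 2 * sqrt(eigenvalue).
    a, b = 2.0 * np.sqrt(np.maximum(eigvals, 1e-12))
    ellipse_area = np.pi * a * b
    return len(ys) / ellipse_area

# A filled disk (a special case of an ellipse) should give a ratio
# close to 1; a ragged or highly non-elliptical blob would not.
yy, xx = np.mgrid[:64, :64]
disk = ((yy - 32) ** 2 + (xx - 32) ** 2 <= 15 ** 2)
ratio = ellipse_shape_factor(disk)
```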
  • If the contiguous region is classified as a nerve, the original image data corresponding to the contiguous region may next be processed for texture features. Specifically, for the selected contiguous region, the SNR, kurtosis and skewness may be calculated from the B-mode image data (it is understood that alternative data parameters may be measured at this time as well). Threshold values for each of these parameters may be established in order to detect the nerves in the image. If the combination of the SNR, skewness and kurtosis are below a threshold, the contiguous region will be classified as not a nerve. It is contemplated that additional data parameters may be measured from the original image data for nerve detection.
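The texture features named above (SNR, skewness, and kurtosis) could be computed per region as sketched here; the Rayleigh-distributed test data, standing in for speckle intensities, is an assumption for illustration.

```python
import numpy as np
from scipy.stats import kurtosis, skew

def texture_features(bmode, mask):
    """Compute SNR (mean/std), skewness, and excess kurtosis of the
    B-mode intensities inside a candidate region."""
    vals = bmode[mask.astype(bool)]
    snr = vals.mean() / vals.std()
    return snr, skew(vals), kurtosis(vals)

rng = np.random.default_rng(1)
bmode = rng.rayleigh(scale=1.0, size=(64, 64))  # speckle-like intensities
mask = np.zeros((64, 64), dtype=np.uint8)
mask[10:30, 10:30] = 1
snr, s, k = texture_features(bmode, mask)
```

For fully developed speckle the envelope is Rayleigh distributed, so the SNR tends toward about 1.91; deviations from that value (and from the Rayleigh skewness/kurtosis) are what make these features useful for discriminating tissue types.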
  • In an alternative embodiment, robust detection using a double threshold and connected component tracking by hysteresis may be employed. Specifically, instead of using only one threshold, it may be preferred to use two threshold values: a high threshold value and a low threshold value. For each pixel in the B-mode image, if its value is larger than the high threshold value, then it may be marked as a strong nerve pixel. If the pixel value is smaller than the high threshold value and larger than the low threshold value, then it may be marked as a weak nerve pixel. If the pixel value is smaller than the low threshold value, then it may be discarded. The connected component algorithm may be applied to look at each weak nerve pixel, and if it is connected to a strong nerve pixel then the weak nerve pixel may be preserved.
  • In some embodiments, the shape of the identified connected components (blobs) of detected nerve pixels may then be analyzed. If the size and shape of a blob is significantly different from the profile of a nerve area (elongated or elliptical, with a width of about 5 mm), then it may also be discarded. The detected nerve region in a B-mode image should have, at most, a maximum dimension of 1 or 2 cm. It is understood that data may fall in ranges which may yield the detection of a nerve, and that specific numbers are only provided by way of example.
  • The distance from the detected nerve region to the ultrasound probe may be estimated and used in the display.
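The double-threshold scheme with connected-component tracking might be implemented as below; the specific threshold values and the tiny test image are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import label

def hysteresis_nerve_mask(image, low, high):
    """Double-threshold detection with connected-component tracking:
    strong pixels exceed `high`; weak pixels fall between `low` and
    `high` and are kept only if their connected component contains at
    least one strong pixel."""
    strong = image > high
    weak_or_strong = image > low
    labeled, n = label(weak_or_strong)
    keep = np.zeros(n + 1, dtype=bool)
    keep[np.unique(labeled[strong])] = True
    keep[0] = False  # background label is never kept
    return keep[labeled]

img = np.zeros((10, 10))
img[2, 2:6] = 0.5  # weak run connected to a strong pixel: preserved
img[2, 6] = 0.9    # strong pixel
img[8, 8] = 0.5    # isolated weak pixel: discarded
mask = hysteresis_nerve_mask(img, low=0.3, high=0.7)
```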
  • In an alternative embodiment of the present invention, it is contemplated that sophisticated training and detection algorithms for the nerve region, such as a support vector machine (SVM) or random forest, may be utilized for improved nerve detection. In machine learning, support vector machines (also called support vector networks) may be supervised learning models with associated learning algorithms that analyze data and recognize patterns, used for classification and regression analysis. Given a set of training examples, each marked as belonging to one of two categories, an SVM training algorithm may build a model that assigns new examples into one category or the other, making it a non-probabilistic binary linear classifier. An SVM model is a representation of the examples as points in space, mapped so that the examples of the separate categories are divided by a clear gap that is as wide as possible. New examples may then be mapped into that same space and predicted to belong to a category based on which side of the gap they fall on. In addition to performing linear classification, SVMs can efficiently perform a non-linear classification using what is called the kernel trick, implicitly mapping their inputs into high-dimensional feature spaces. Random forests are an ensemble learning method for classification, regression and other tasks that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes (classification) or the mean prediction (regression) of the individual trees. Random forests correct for decision trees' habit of overfitting to their training set. Decision trees are a popular method for various machine learning tasks.
Tree learning comes closest to meeting the requirements for serving as an off-the-shelf procedure for data mining because it is invariant under scaling and various other transformations of feature values, is robust to inclusion of irrelevant features, and produces inspectable models. However, single trees are seldom accurate. In particular, trees that are grown very deep tend to learn highly irregular patterns: they overfit their training sets, because they have low bias but very high variance. Random forests are a way of averaging multiple deep decision trees, trained on different parts of the same training set, with the goal of reducing the variance. This comes at the expense of a small increase in the bias and some loss of interpretability, but generally greatly boosts the performance of the final model.
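As one hedged sketch of the SVM and random forest classifiers discussed above, the fragment below trains both on synthetic feature vectors (SNR, skewness, kurtosis, shape factor); the feature set, class means, and scikit-learn usage are illustrative assumptions, not the disclosed training procedure or data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

# Synthetic, clearly separated feature vectors standing in for labeled
# "nerve" (1) and "not nerve" (0) regions; all values are invented.
rng = np.random.default_rng(2)
nerve = rng.normal([1.9, 0.6, 0.2, 0.9], 0.1, size=(50, 4))
other = rng.normal([1.2, 0.1, -0.3, 0.5], 0.1, size=(50, 4))
X = np.vstack([nerve, other])
y = np.array([1] * 50 + [0] * 50)

# Kernel SVM (non-linear via the RBF kernel trick) and a random forest
# averaging 100 trees, each trained on the same labeled examples.
svm = SVC(kernel="rbf").fit(X, y)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

probe_region = [[1.85, 0.55, 0.25, 0.88]]  # features of a new candidate
svm_says_nerve = svm.predict(probe_region)[0]
forest_says_nerve = forest.predict(probe_region)[0]
```

In practice the training set would come from expert-annotated ultrasound regions rather than synthetic draws, and performance would be assessed with held-out data.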
  • Once the muscles are split and the surgical site 46 is reached, the surgeon can place a k-wire through the conduit to confirm that the surgical site 46 is reached and anchor the device 10 with respect to the surgical site 46. A retractor tool 48 may be put into place to give the surgeon a direct surgical working conduit to the surgical site 46. Alternatively, a series of dilators may be sequentially placed over the main body 16 to create the working space. Once this direct access to the spine is achieved, the surgeon is able to perform a standard discectomy (removing the intervertebral disc), corpectomy (removing the vertebral bone) or fusion (uniting two bones together) with surgical tools.
  • An embodiment of the retractor system 48 may include a first blade 49 and a second blade 51, both of which may be semi-circular in shape so as to form an opening that fits snugly around the outer diameter of the main body 16. It is appreciated that the cross-sectional shape of the blades can mimic the shape of the main body 16 (e.g., triangular, oval, square, rectangular, etc.). Once at the surgical site, the retractor blades 49, 51 may be configured to separate relative to one another so as to expand the dissection and to enable the device 10 to be removed, allowing for direct visualization of the surgical site 46 as shown in FIG. 19. It is contemplated that the distal ends 53 of the first 49 and second 51 blades are adjacent to the distal portion 14 of the main body 16. Any known type of retractor system may be used with the device 10.
  • In one embodiment, a retractor system 226, like the one disclosed in FIG. 38, may be disposed over the device 10 and configured to expand to create a working space within the anatomy of the patient once a surgical site, or other location where a surgical procedure is to take place, is reached. This embodiment of the retractor system 226 includes a first blade 228 and a second blade 230, both of which are semi-circular in shape and form an opening that fits snugly around the outer diameter of the main body 16. Once at the surgical site, the retractor blades 228, 230 are configured to separate relative to one another to expand the dissection so as to enable the device 10 to be removed and allow for direct visualization of the surgical site 68 as shown in FIG. 38. The distal ends 232 of the first 228 and second 230 blades may be adjacent to the distal portion 14 of the main body 16. Any known type of retractor system may be used with the device 10. It can be appreciated that stimulation electrodes, visualization cameras and illumination devices (optical, ultrasound, infrared and ultraviolet) may also be placed along or within the retractor blades 228, 230 so as to allow for nerve detection as discussed above.
  • As shown in FIG. 39, a cross-sectional view of the retractor 226 disposed around the device 10, the main body 16 of the device 10 may have a raised channel or channels 234 disposed along its length, in a direction along its longitudinal axis, that is configured to slidingly receive a complementary groove 236 formed by the first and second blades 228, 230 of the retractor system 226. In some aspects, the raised channel 234 and/or groove 236 may have a square, rectangular, semi-spherical or similar cross-sectional shape. Further, the number and location of the channels 234 and grooves 236 may vary. For example, but without limitation, there may be only one channel/groove running along the main body 16.
  • Alternatively, there may be more than two channels/grooves that are equidistantly placed about the outer surface of the main body 16. The channel and groove may also be transposed, such that the groove 236 is on the main body 16 and the raised channel 234 is formed along the blades 228, 230, as shown in FIG. 40, such that the groove 236 prevents the blades 228, 230 from expanding when the retractor 226 is disposed over the device 10. The channel 234 and groove 236 may extend along only a portion of the main body 16 of the device 10 and the blades 228, 230.
  • The retractor system 226 may consist of multiple sets of blades 228, 230 with varying thickness and diameter. For example, and without limitation, the blades 228, 230 may vary in size so as to have an overall outer diameter ranging from 2 mm to 80 mm when in the closed (e.g., collapsed) configuration. Further, the blades 228, 230 may be configured such that they create a 2 mm to 220 mm opening within the patient's body when in the expanded configuration. The device 10 and retractor system 226 may be configured such that a first set of blades 228, 230 having a larger retracted diameter may be used to create a first opening within the patient's body, and then a second set of smaller-diameter blades, having a retracted diameter that is different than that of the first set of blades 228, 230, may be slidingly disposed over the main body 16 and within the first opening and retracted to open a second opening within the patient's body at a location distal to the first opening. The openings created by the first and second sets of blades may have different opening diameters. The retractor system 226 may allow the operator to create multiple openings having different retracted diameters at different anatomic levels within the patient.
A light source (not shown) may be disposed at the distal ends, or along the length, of the blades 228, 230 to illuminate the opening within the patient's body and illuminate the region of the patient's anatomy within and distal to the opening created by the blades 228, 230. In addition, a conduit (not shown) may also be formed within the blades so as to receive one or more of: a medical instrument, such as a k-wire to anchor the blades 228, 230 to a surgical site such as the disk space between two vertebrae; an illumination device, such as an optical fiber, to provide additional illumination to a particular region of the patient's anatomy; an electrical conduit to provide neural stimulation; an ultrasound probe to image or detect particular regions of a patient's anatomy; an image capture device; a fiber optic cord that emits visible or infrared light to image or detect particular regions of a patient's anatomy; and the like. The blades 228, 230 may be made out of any preferable surgical grade material, including but not limited to, medical grade polymer including PEEK (polyether ether ketone), and may be transparent (e.g. made out of clear plastic) or translucent.
  • In another embodiment, the retractor system 226 may be integrated with the device 10 such that it forms part of the main body 16 and can be deployed once at the surgical site. The retractor system 226 in this embodiment may expand radially away from the longitudinal axis of the main body 16 to expand the path created by the main body 16. The main body 16 may then be withdrawn from the surgical site so as to create a working portal within the retractor system 226.
  • A series of dilating cannulas (e.g. dilators 100), as shown in FIG. 20, may also be slidingly placed around the main body 16 of the device 10 so as to expand the diameter of the dissection made by the distal portion 14 of the device 10. The technique of employing a series of dilating cannulas to create a working space for direct visualization, as used in other medical procedures, may also be used in conjunction with the device 10.
  • After disc material is removed, the surgeon may be able to insert an implant/spacer through the same incision from the side. This spacer (cage) may help hold the vertebrae in the proper position to make sure that the disc height (space between adjacent vertebral bodies) is correct and to make sure the spine is properly aligned. This spacer, together with a bone graft, may be designed to set up an optimal environment to allow the spine to fuse at that particular segment. The surgeon may use fluoroscopy to make sure that the spacer is in the right position. The surgeon may then remove the retractor and suture any incisions.
  • Spinal surgeons oftentimes access the intervertebral disc space and vertebral body via a transpsoas approach. The psoas muscle, however, has a network of nerves (the lumbar plexus) within the muscle, and their exact location can be unpredictable. The surgeon must therefore get to the disc while avoiding these nerves so as not to cause nerve damage or paralysis when performing surgery. The present disclosure allows the surgeon to visualize where nerves lie before penetrating the psoas muscle. The device 10 and the classification algorithms may be used to create a quantitative image and/or map (2D or 3D) of the nerve tissue while the device 10 is placed above the psoas, providing a surgeon a path that avoids nerves and gets to the disc space. In some embodiments, a mapping of target tissue proximate to the surgical site may be provided by the ultrasound imager 24. When the tip 22 is disposed adjacent to the psoas muscle, the surgeon may slide a first set of blades of the retractor system over the device and expand the retractor system 48 to create a first working space (also referred to as a superficial dock). This working space may allow the surgeon to visually inspect the psoas muscle and the surrounding region either via naked (eye) inspection or with the optical camera/dissector (e.g., one or more components of device 10). Next, the surgeon may continue the procedure by using the device, which is now disposed within the first working space, to dissect through the psoas muscle as described herein. Once the tip 22 has reached the surgical site, which is the disc space here, a second set of retractor blades which are smaller than the first set of blades may be slid over the device 10 and expanded to create a second working space that is smaller in diameter than the first working space. The surgeon may then continue with the procedure in the manner discussed herein. 
One benefit of establishing the first working space may be that it allows the surgeon, once the procedure at the first surgical site is completed, to remove the device 10, then reposition and reinsert the distal tip 22 within the first working space formed above the psoas muscle at a second location. The surgeon may then penetrate the psoas muscle to reach a second surgical site and conduct and complete another procedure, or a multi-level procedure in which psoas dissection is currently dangerous because of the interposed neurovascular structures (e.g., the L3-4 and L4-5 disc spaces, or a lumbar corpectomy, i.e., removal of two discs and the intervening bone). It is appreciated that the tip 22 is optional and the distal end 21 of the device 10 may be the portion of the device that is advanced towards the surgical site.
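The nerve-mapping step described above, in which classification algorithms convert imaging data into a quantitative 2D map of nerve tissue, can be illustrated with a minimal sketch. The per-pixel classifier scores, grid size, and probability threshold below are hypothetical assumptions for illustration, not the algorithm claimed by this disclosure:

```python
import numpy as np

def build_nerve_map(pixel_probabilities, threshold=0.5):
    """Convert per-pixel nerve-likelihood scores (output of any 2D tissue
    classifier) into a binary map marking probable nerve tissue."""
    probs = np.asarray(pixel_probabilities, dtype=float)
    nerve_map = probs >= threshold   # True where nerve tissue is likely
    coverage = nerve_map.mean()      # fraction of the frame flagged as nerve
    return nerve_map, coverage

# Hypothetical 4x4 frame of classifier scores for one ultrasound slice
frame = [[0.1, 0.2, 0.8, 0.9],
         [0.1, 0.3, 0.7, 0.9],
         [0.0, 0.1, 0.2, 0.6],
         [0.0, 0.0, 0.1, 0.2]]
nerve_map, coverage = build_nerve_map(frame)
```

A surgeon-facing display could then overlay `nerve_map` on the B-mode image so that unflagged regions suggest a candidate path to the disc space.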
  • In some embodiments employing the device 10 having the channel 4930, when the tip 22 is disposed adjacent to the psoas muscle, the surgeon may position surgical tools/implants through the channel 4930 to access the surgical site while allowing the transducer(s) to collect data for real-time display. By allowing the surgeon to perform surgery while the device 10 is in place, the transducer can record and collect data throughout the surgery. The device 10 may also serve as a retractor of tissue, thereby pushing surrounding tissue to the side and assisting with creating the working portal for the surgeon. Once the tip 22 has reached the surgical site, the surgeon may continue the use of tools through the channel 4930 or may employ a second set of tools, such as retractor blades, over the device 10. The surgeon may then continue with the procedure in the manner discussed herein.
  • The device 10 may also be used for performing an axial lumbar interbody fusion (AxiaLIF). At surgery, the patient may be positioned prone with maintenance of lordosis and the legs spread. A catheter may be inserted into the rectum to allow air to be injected during the procedure for visualization of the rectum. After the surgeon makes a small incision (15-18 mm) lateral to the tip of the coccyx, the distal tip 22 of the device 10 may be inserted through the incision and passed into the pre-sacral space. The surgeon may use the distal portion 14 of the device 10 to sweep and scan the pre-sacral space to confirm that the space is clear of any offending anatomy (e.g. colon, rectum). The device 10 may be gently passed along the anterior cortex of the sacrum and in the midline to an entry point usually close to the S1-2 junction. Once the trajectory is chosen, a sharp beveled pin may then be driven into the L5-S1 interspace, either through the conduit 36 or after the retractor system 48 is deployed. The retractor system 48, or a series of dilators, may be used to create approximately a 10 mm opening into the sacrum through which a 10 mm channel is drilled into the L5-S1 disc. The device 10 may then be withdrawn from the pre-sacral space and the surgeon may then perform the remaining steps of the AxiaLIF procedure.
  • The device 10 may also be used to allow direct access to Kambin's triangle (ExtraForaminal Lumbar Interbody Fusion). For this procedure, patients may be placed in a prone position, typically onto a Jackson Table using a radiolucent frame that allows for restoration of lumbar lordosis. Fluoroscopic imaging may be utilized to identify the epiphyseal plate of the upper and lower vertebral body by controlling the cranial-caudal angle of the image intensifier. Additionally, the fluoroscopic image may be rotated by 20-35 degrees toward the region, so that the superior articular process may be seen at the middle of the intervertebral disc. At this location, the tip 22 of the device 10 may be inserted percutaneously targeting the area commonly referred to as Kambin's triangle. Kambin's triangle is defined as the area over the dorsolateral disc. The hypotenuse is the exiting nerve root, the base (width) is the superior border of the caudal vertebra and the height is the dura/traversing nerve root. FIG. 55 depicts a visual representation of Kambin's triangle 5500. FIG. 56 shows the device 10 in a position targeting Kambin's triangle, as described above.
  • The device 10 may also be used to ultrasonically identify various anatomical features such as the exiting root, radicular artery, thecal sac and the disc space. A k-wire can then be placed into the disc space via the conduit 36 under ultrasonic detection via the device 10, allowing for docking of the dissector/retractor system 48. Subsequent dilation can then be performed, allowing for access to the intervertebral foramen while directly visualizing neurovascular structures using the device and avoiding these structures when identified by the surgeon.
  • The device 10 may also be used in treatment of extraforaminal disc herniations, such as in procedures involving extraforaminal intervertebral fusion. A far lateral discectomy is a commonly performed procedure for the treatment of extraforaminal disc herniations. It is routinely done through a paramedian incision using the Wiltse plane. However, the exiting nerve root (i.e., the L4 nerve root at the L4-5 level, see FIG. 48) is at risk of damage with this approach, as it is normally draped over the disc. In order to decrease the risk of nerve injury, some surgeons currently use intraoperative nerve monitoring; however, intraoperative nerve monitoring relies on advanced anesthetic that may not allow for relaxation of the patient. Recently, surgeons have considered use of an interbody cage in intervertebral fusion approaches through the far lateral extraforaminal approach. While there are advantages to a muscle and bone sparing approach to intervertebral fusion, the passage of an interbody cage device into the disc space increases the risk to the exiting nerve as well as the nerves that have exited from proximal levels and are running under the intertransverse membrane. In order to safely perform an extraforaminal intervertebral fusion, we disclose herein use of the device 10 to detect the neurologic structures that run under the intertransverse membrane as well as the exiting nerve root as it is leaving its foramen. Once the nerve is detected, the device 10 can be safely docked on the far lateral portion of the disc, just anterior to the pars interarticularis. Dilators may then be advanced over the main body 16 of the device 10 and then a tubular retractor may be docked on the disc, just anterior to the pars interarticularis and in between the transverse processes of the two involved vertebrae. 
Once the retractor is safely docked, the surgeon may proceed with preparation of the disc space and endplates and safely insert the intervertebral cage for fusion.
  • FIG. 57 depicts a skin incision made approximately 5-6 cm lateral to the midline, according to the methods disclosed herein. The incision may be centered lateral to the facet-pedicle junction. FIG. 58 shows the natural cleavage plane, between the multifidus part of the sacrospinalis and the longissimus part, as may be used for spinal approach in some embodiments. This plane allows direct access to the pars, transverse processes and facet joints with minimal soft tissue dissection and retraction. This approach is less vascular than the mid-line approach, and therefore may result in less bleeding.
  • In another embodiment, shown in FIG. 21, an ultrasound imager 24 may be used in conjunction with a glove 110. In this embodiment, the operator may rely on tactile feedback provided by touch while still enabling ultrasonic imaging/scanning of a patient's anatomy. More specifically, the glove system (or device) may allow for tactile feedback that facilitates the dissection and separation of tissue, namely neurological, vascular and peritoneal structures. In general, tactile feedback allows for dissection of tissue as in normal surgical procedures, even without the unique perspective of direct visualization, which may not be available in some minimally invasive/percutaneous techniques.
  • The ultrasound imager 24 may include a transducer 26, configured to emit sound waves, disposed at the distal end of the glove 110. In one embodiment, the transducer 26 is located along a distal portion 114 of the index finger 112 of the glove 110. As better shown in FIG. 22, a
    tip 22 forms part of, or is connected to, the distal portion 114 of the index finger 112 such that the outer surface 24 of the tip 22 does not extend beyond the most distal part of the index finger 112. Of course, it is appreciated that the tip 22 may extend beyond the distal portion depending on the embodiment.
  • Connected to the transducer 26 may be a flexible conduit 116 that may carry a cable that connects the transducer 26 to a housing that contains the remaining portion of the ultrasound imager 24. The flexible conduit 116 may run along the length of the index finger 112 and a top portion 118 of the glove 110. However, it can be appreciated that the conduit 116 can run along any length or surface of the glove 110 and is application dependent. The flexible conduit 116 may also provide a channel to carry a k-wire or other instrument that can be slidingly disposed within the flexible conduit 116 (as will be further discussed below). The flexible conduit 116 may run through and may be in communication with a unit 109 such that a portion of the flexible conduit 116 provides an opening 120, as shown in FIG. 23, in the unit 109 at the distal portion 114 of the index finger 112.
  • The unit 109 may have a bottom portion 122, as shown in FIG. 23. The bottom portion 122 may have a concave curvature so as to provide a complementary fit once an operator's hand is placed within the glove 110. In addition, a proximal portion of the unit 109 may have a taper so as to cause minimal disruption to a patient's anatomy as the unit 109 is articulated during a procedure. Further, the unit 109 may have an overall semi-circular or cylindrical shape or the like so as to minimize any inadvertent disruption to the patient's anatomy during a procedure and to maintain a small overall profile, as shown in FIGS. 24 and 25. For example, the height of the unit 109 may be less than the overall width so as to achieve a low profile. Alternatively, the outer portion of the unit 109 may not extend beyond the width of the index finger 112, remaining collinear with it, to maintain a low profile. The outer diameter of the unit 109 may range from 0.5 to 20 mm, or outside of this range depending on the desired application. The length of the unit 109 can range from 0.5 to 10 mm, but also may fall outside of this range depending on the application. It is appreciated that more than one transducer 26 may be positioned along the distal portion of a finger such that they provide side facing scans to generate a multi-directional (e.g. 180°-300°) scan of the patient's anatomy.
  • Transducers 26 may be side positioned (e.g. on either side of the index finger 112) so as to provide for multi-directional scanning of the patient's anatomy to detect the nerve or target anatomy. The side positioned transducers may be configured to scan the anatomy in a circumferential direction around the index finger 112 to detect the nerve (or other target anatomy) not detected by the transducer positioned at the distal end of the main body 16. The multi-directional scanning may enable the system to generate a scan image of the patient's anatomy in multiple directions as the index finger 112 of the glove 110 is advanced through the patient's anatomy. As discussed above, the system that is in communication with the transducers may then detect a nerve even when it is not captured by the forward scanning transducer.
  • The image capture system 200, as discussed in relation to FIGS. 29 and 30, may be used with the glove embodiment where the image capture device 201, its tip 22, sensor 206, and illumination device 216 may be placed on a distal portion 114 of the index finger 112 of the glove 110. The image capture system 200 may also be used independently of, or in conjunction with, the ultrasound imager 24 as described above in the glove embodiments.
  • The glove embodiment can be used in connection with minimally invasive surgery (MIS). The glove 110 may be used for a variety of MIS procedures, including but not limited to, Lateral Retroperitoneal Interbody Fusion (LLIF (e.g., eXtreme Lateral Lumbar Interbody Fusion (XLIF), Direct Lateral Interbody Fusion (DLIF)), Axial Lumbar Interbody Fusion (AxiaLif), Transforaminal Lumbar Interbody Fusion (TLIF), Posterior Lumbar Interbody Fusion (PLIF), Anterior Lumbar Interbody Fusion, Trans-thoracic lumbar interbody fusion, Retropleural Thoracic Fusion, Interbody Fusion utilizing Kambin's Triangle, and Cervical/Thoracic/Lumbar Laminectomies, Foraminotomies and Diskectomies. The glove 110 may be used to confirm that the area is clear of other anatomical parts, such as blood vessels, abdominal/pelvic viscera, nerve roots, and spinal cord.
  • As described above, there can be a number of applications for which this glove 110 may be used, which may require similar steps to access the surgical site. The surgeon may rely on the image or audio cues generated by the ultrasound imager 24 to detect the presence (or absence) of a nerve, thereby allowing the surgeon to reposition (or continue advancing) the glove 110 through the patient's anatomy towards the surgical site 48.
  • Once the muscles are split and the surgical site 48 is reached, the surgeon can place a k-wire through the conduit to confirm that the surgical site 48 is reached and anchor the glove 110 with respect to the surgical site 48. A retractor tool is put into place to give the surgeon a direct surgical working conduit to the surgical site 48. Alternatively, a series of dilators may be sequentially placed over the k-wire to create the working space. Once this direct access to the spine is achieved, the surgeon is able to perform a standard discectomy (removing the intervertebral disc), corpectomy (removing the vertebral bone) or fusion (uniting two bones together) with surgical tools.
  • In the case of a discectomy, after the disc material is removed, the surgeon may be able to insert an implant/spacer through the same incision from the side. This spacer (cage) may help hold the vertebrae in the proper position to make sure that the disc height (space between adjacent vertebral bodies) is correct and to make sure the spine is properly aligned. This spacer, together with a bone graft, may be designed to set up an optimal environment to allow the spine to fuse at that particular segment. The surgeon may use fluoroscopy to make sure that the spacer is in the right position. The surgeon may then remove the retractor and suture the incisions.
  • The glove system may also be used for performing an axial lumbar interbody fusion (AxiaLIF). At surgery, the patient may be positioned prone with maintenance of lordosis and the legs spread. A catheter may be inserted into the rectum to allow air to be injected during the procedure for visualization of the rectum. After the surgeon makes a small incision (15-18 mm) lateral to the tip of the coccyx, the distal portion of the index finger 112 and distal tip 22 may be inserted through the incision and passed into the pre-sacral space. The surgeon may use the index finger 112 to sweep and inspect the pre-sacral space to confirm that the space is clear of any offending anatomy (e.g. colon, rectum) visually and by way of ultrasonic imaging. The index finger 112 may be advanced along the anterior cortex of the sacrum and in the midline to an entry point usually close to the S1-2 junction. Once the trajectory is chosen, a sharp beveled pin may then be driven into the L5-S1 interspace, either through a conduit or after the retractor system is deployed. The retractor system, or a series of dilators, may be used to create approximately a 10 mm opening into the sacrum through which a 10 mm channel is drilled into the L5-S1 disc. The index finger 112 may then be withdrawn from the pre-sacral space and the surgeon may then perform the remaining steps of the AxiaLIF procedure.
  • The glove system may also be used to allow direct access to Kambin's triangle (Extraforaminal Interbody Fusion). For this procedure, patients may be placed in the prone position, typically on a Jackson Table using a radiolucent frame that allows for restoration of lumbar lordosis. Fluoroscopic imaging may be utilized to identify the epiphyseal plate of the upper and lower vertebral body by controlling the cranial-caudal angle of the image intensifier. Additionally, the fluoroscopic image may be rotated by 20-35 degrees toward the region, so that the superior articular process can be seen at the middle of the intervertebral disc. At this location, the index finger 112 can be inserted percutaneously targeting the area commonly referred to as Kambin's triangle. As discussed above, Kambin's triangle is defined as the area over the dorsolateral disc. The hypotenuse is the exiting nerve root, the base (width) is the superior border of the caudal vertebra and the height is the dura/traversing nerve root.
  • The glove system may be used to identify various anatomical features such as the exiting root, radicular artery, thecal sac and the disc space. A k-wire can then be placed into the disc space via the conduit under ultrasonic visualization, allowing for docking of the dissector/retractor system. Subsequent dilation can then be performed, allowing for access to the intervertebral foramen while directly visualizing neurovascular structures using the device and avoiding these structures when identified by the surgeon.
  • The device 10 may also include infrared technology, which includes an infrared emitting light source and an infrared image capture device. The device 10 may include one or more infrared radiation detecting elements mounted at the distal portion 14 of the device 10. The infrared array may be sensitive at e.g. wavelengths from 2 to 14 micrometers. One embodiment of the infrared aspect of the present disclosure uses a two-dimensional array of microbolometer sensor elements packaged in an integrated vacuum package and co-located with readout electronics on the distal tip of the device 10. It is appreciated that the infrared aspect of this disclosure may be used in conjunction with, or separate from, the other embodiments discussed herein. One such infrared system that could be used with the present disclosure is disclosed in U.S. Pat.
    No. 6,652,452, the entirety of which is incorporated herein by reference.
  • The device 10 may also utilize Optical Coherence Tomography (hereinafter “OCT”) technology as a stand-alone detection system or in conjunction with the other embodiments disclosed herein. OCT is an optical signal acquisition and processing method that generates images using near infrared light. By way of background, OCT performs high-resolution, cross-sectional tomographic imaging of the internal microstructure in materials and biologic systems by measuring backscattered or back-reflected light. OCT images are typically two- or three-dimensional data sets that represent the optical back-scattering in a cross-sectional plane through the tissue. Image resolutions of approximately 1 to 15 micrometers may be achieved, one to two orders of magnitude higher than conventional ultrasound. Imaging can be performed in situ and in real time.
  • OCT forms depth resolved images by interferometrically detecting the light backscattered from different scatterers within the sample. In a typical OCT system 50, as shown in FIG. 26, the light from the laser 52 is split by a fiber optic coupler/beam splitter 54 into two arms, i.e., the reference arm 56 and the sample arm 58. The light coupled into the reference arm 56 is reflected back from a fixed mirror 60, while in the sample arm 58 the light is projected through an OCT probe 62, which will be discussed in greater detail below.
  • The OCT probe 62 may be focused onto the sample of interest (e.g. tissue or the anatomy of the patient) through a focusing lens (e.g. a GRIN lens). OCT is a point by point imaging technique where the sample is illuminated by focusing the light from the laser 52 onto a small point (spot size determined by the focusing lens) on the sample. Light in the sample arm 58 travels within the tissue and is backscattered by different scatterers within the tissue and combines with the light from the reference arm 56. If the optical path lengths of the reference 56 and sample 58 arms are matched, an interferogram may be formed which may be measured by a photo detector or a spectrometer. The frequency content of the interferogram may contain information about the depth and strength of the scatterers that the beam had encountered in the sample. The resulting interferogram may be processed to form one-dimensional depth information generally known as an A-scan (a single A-scan would be a single column in the image). The optical beam may then be scanned over the sample to generate two- or three-dimensional images. The beam may be scanned using galvanometers in bench-top OCT systems or using MEMS scanners in hand-held OCT devices. This data is sent to, and processed by, the computer 95 or other processor.
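The A-scan formation described above, in which the frequency content of the interferogram encodes scatterer depth, can be sketched numerically. The wavenumber range, scatterer depths, and reflectivities below are illustrative assumptions, not parameters of the disclosed system:

```python
import numpy as np

# Spectral-domain OCT sketch: the interferogram sampled in wavenumber k
# contains one cosine fringe per scatterer; its fringe frequency encodes
# the scatterer's depth z (path-length difference to the reference arm).
N = 2048
k = np.linspace(2 * np.pi / 900e-9, 2 * np.pi / 800e-9, N)  # wavenumbers (1/m)
depths = [100e-6, 250e-6]          # two hypothetical scatterers (m)
reflectivity = [1.0, 0.5]

interferogram = sum(r * np.cos(2 * k * z) for r, z in zip(reflectivity, depths))

# FFT along k yields the A-scan; with fringes cos(2kz), a spectral
# frequency f (cycles per unit k) corresponds to depth z = pi * f.
a_scan = np.abs(np.fft.rfft(interferogram))
dk = k[1] - k[0]
depth_axis = np.fft.rfftfreq(N, d=dk) * np.pi

peak_depth = depth_axis[np.argmax(a_scan)]   # strongest reflector's depth
```

Scanning the beam point by point, as the passage describes, would stack many such A-scans into a 2D B-scan image.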
  • As further disclosed in FIG. 27, the OCT probe 62 may include a GRIN lens 64, the diameter of which in this embodiment is 1 mm, but which can vary depending on the intended application. A single mode optical fiber 66 is included in this embodiment that transfers the light rays between the OCT
    probe 62 and the remaining portion of the OCT system (e.g. the fiber optic coupler 54 or a detector 68). The single mode optical fiber 66 may have a thickness of approximately 900 micrometers and a length of approximately 1.5 m. These specifications, of course, are examples only and can vary depending on the application. Attached to the distal end of the GRIN lens 64 may be a prism 70 for deflecting the light depending on the location and orientation of the target. It can be appreciated that the prism 70 may not be necessary in situations where the surface of the target is directly in front of or substantially perpendicular to the longitudinal axis of the light ray (or beam). In this embodiment, the length of the prism is approximately 700 micrometers, but it is appreciated that the length can vary and is application dependent.
  • Two different embodiments of the OCT probe 62 are illustrated in FIG. 28. The first embodiment is the forward image probe 72, which does not include a prism, such that the light ray (or beam) extends outward towards the front of the probe 72 to reach the target (e.g. tissue). In the second embodiment, image probe 74 contains a prism 70, which allows this embodiment to image targets that are disposed below or at an angle to the tip of the probe 74. The OCT technology may also be incorporated into the glove 110 in a manner as discussed above with respect to the ultrasound embodiment.
  • Some of the parameters that may be manipulated to optimize OCT imaging include (a) the A-scan rate (the number of A-scans the system can acquire in a second), (b) the axial and transverse resolution, and (c) the imaging depth. The A-line scan rate may determine how fast an OCT system can operate. For a swept source OCT system, the imaging rate may depend on the wavelength sweeping rate of the laser, while, for a spectral domain OCT system, it is generally limited by the speed of the line scan camera used in the spectrometer. The tradeoff is that at a higher A-scan rate, the exposure time has to be reduced, which can decrease the SNR of the acquired data. The axial resolution (resolution across the depth) is determined by the bandwidth and wavelength of the laser source. In general, the higher the bandwidth, the better the axial resolution. The resolution along the transverse dimensions is determined by the numerical aperture of the lens in the sample arm 58. The higher the numerical aperture, the higher the transverse resolution; however, the tradeoff is a reduced depth of field. Moreover, with an increase in the center wavelength of the source, both the axial and transverse resolutions degrade. Finally, the imaging depth is usually limited by how deeply the light can penetrate through the tissue or sample of interest. Higher wavelengths offer greater imaging depth. These and other parameters may be optimized to detect certain features of a patient's anatomy, such as a nerve root.
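The bandwidth and center-wavelength tradeoffs described above follow the standard Gaussian-source coherence-length relation. The 840 nm source and 50 nm bandwidth below are illustrative values, not parameters specified by this disclosure:

```python
import math

def oct_axial_resolution(center_wavelength_m, bandwidth_m):
    """Standard Gaussian-source axial resolution for OCT:
    dz = (2 ln 2 / pi) * lambda0^2 / delta_lambda."""
    return (2 * math.log(2) / math.pi) * center_wavelength_m ** 2 / bandwidth_m

# Illustrative 840 nm source with 50 nm bandwidth: dz is a few micrometers,
# consistent with the 1-15 micrometer resolutions cited above.
dz = oct_axial_resolution(840e-9, 50e-9)

# Doubling the bandwidth halves dz; raising the center wavelength degrades it
dz_wideband = oct_axial_resolution(840e-9, 100e-9)
```

This also shows the stated degradations directly: dz grows with the square of the center wavelength and shrinks in inverse proportion to the bandwidth.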
  • The OCT probe 62 may be positioned at the distal portion 14 of the device 10. Alternatively, the OCT probe 62 may be positioned at the distal end of a k-wire like structure and disposed through the conduit 36. In either embodiment, the OCT probe 62 may be configured to image a portion of the patient's anatomy that is adjacent to (or in front of) the distal portion 14 of the device 10. The surgeon may insert the OCT probe 62 to image the patient's anatomy as needed to reach the surgical site. The OCT system 50 may be configured to visually and/or audibly indicate detection of pre-selected portions of a patient's anatomy (e.g. nerve root). As mentioned above, it can be appreciated that the OCT system can be used independently or in combination with other detection technologies described herein.
  • It is also contemplated that the device 10 can be used in conjunction with a neuromonitoring system to detect certain portions of a patient's anatomy, including neural elements that include a nerve, nerve bundle, or nerve root. For the purposes of this discussion, the device 10 and neuromonitoring system will be discussed with respect to detecting a patient's spinal nerve but it is contemplated that the device 10 and neuromonitoring system may be used to detect other nerves (peripheral and central) as well as the spinal cord. One type of neuromonitoring system that may be used in conjunction with the device 10 is disclosed in U.S. Pat. No. 7,920,922, the entirety of which is incorporated by reference herein.
  • In one embodiment, stimulation electrodes may be placed at the distal end of the device 10, such as forming part of the tip 22, or placed at a distal end of an instrument, such as a K-wire, disposed through the conduit 36, to stimulate any nerves in the region adjacent to the distal portion 14 of the device 10. EMG (electromyography) electrodes can be placed on the skin to detect any nerve depolarization in the manner described in U.S. Pat. No. 7,920,922. One manner in which the proximity, location, direction, and physiology of the nerve is determined is also disclosed in U.S. Pat. No. 7,920,922. It is appreciated that other techniques of detecting nerves using stimulation are known in the art and any of those techniques may be used in conjunction, or integrated, with the device 10 in the manner described above.
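One common neuromonitoring strategy is to hunt for the lowest stimulation current that evokes an EMG response, since a lower threshold generally implies a nerve closer to the stimulating electrode. The bisection scheme, current range, and `evokes_emg` callback below are assumptions for illustration, not the specific method of U.S. Pat. No. 7,920,922:

```python
def find_stimulation_threshold(evokes_emg, low_ma=0.0, high_ma=20.0, tol_ma=0.1):
    """Bisect for the lowest stimulation current (mA) that evokes an EMG
    response. `evokes_emg(current)` is a stand-in for the electrode/EMG
    hardware; returns None if no response occurs even at high_ma."""
    if not evokes_emg(high_ma):
        return None
    while high_ma - low_ma > tol_ma:
        mid = (low_ma + high_ma) / 2.0
        if evokes_emg(mid):
            high_ma = mid    # response seen: threshold is at or below mid
        else:
            low_ma = mid     # no response: threshold is above mid
    return high_ma

# Simulated nerve that depolarizes at 6.5 mA
threshold = find_stimulation_threshold(lambda current: current >= 6.5)
```

In use, the threshold found at each probe position could drive the visual or audible proximity cues discussed elsewhere in this disclosure.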
  • The ultrasound imager 24 may be used in conjunction with, or independently of, an image capture device to visualize the patient's anatomy as described herein. Steps and methods described herein using the ultrasound imager 24 to detect certain features of a patient's anatomy may be supplemented through use of an image capture device. Specifically, the surgeon may rely on the image or audio cues generated by the ultrasound imager 24 to detect the presence (or absence) of a nerve, thereby allowing the surgeon to reposition (or continue advancing) the device 10 through the patient's anatomy towards the surgical site 48. The ultrasound imager 24 may also be used to confirm the image captured by the image capture device is accurate by confirming the presence or absence of a targeted anatomical feature (e.g. nerve).
  • Likewise, in operation, the OCT system 50 may be used in conjunction with, or independently of, an image capture device and/or the ultrasound imager 24 to scan and identify the patient's anatomy as described herein and to access the surgical site. Steps and methods used to access the surgical site and avoid target anatomy (e.g. nerve) employing the ultrasound imager 24 may also be performed using the OCT system 50. Furthermore, steps described herein using the ultrasound imager 24 may be supplemented through use of the OCT system 50. For example, the surgeon may rely on the image or audio cues generated by the OCT system 50 to detect the presence (or absence) of a nerve, thereby allowing the surgeon to reposition (or continue advancing) the device 10 through the patient's anatomy towards the surgical site 48. The OCT system 50 may also be used to confirm the image captured by an image capture device is accurate by confirming the presence or absence of a targeted anatomical feature (e.g. nerve).
  • The device 10 may be used in a variety of other medical areas outside of spinal surgery. These include gynecologic transvaginal imaging for cervical cancer and endometrial cancer; prostate examination and prostate cancer; and intra-abdominal surgery to delineate the depth of penetration of a tumor within the peritoneal contents (stomach, small intestine, large intestine, kidney, liver, and spleen). The device 10 may be utilized with known robotic systems, such as the da Vinci robotic system or similar systems.
  • FIG. 47 depicts an illustrative operating environment in which various aspects of the present disclosure may be implemented in accordance with one or more example embodiments. Referring to FIG. 47, computing system environment 4700 may be used according to one or more illustrative embodiments. Computing system environment 4700 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality contained in the disclosure. Computing system environment 4700 should not be interpreted as having any dependency or requirement relating to any one or combination of components shown in illustrative computing system environment 4700.
  • Computing system environment 4700 may include computing device 4701 having processor 4703 for controlling overall operation of computing device 4701 and its associated components, including random-access memory (RAM) 4705, read-only memory (ROM) 4707, communications module 4709, and memory 4715. Computing device 4701 may include a variety of computer readable media. Computer readable media may be any available media that may be accessed by computing device 4701, may be non-transitory, and may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, object code, data structures, program modules, or other data. Examples of computer readable media may include random access memory (RAM), read only memory (ROM), electronically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disk read-only memory (CD-ROM), digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by computing device 4701.
  • Although not required, various aspects described herein may be embodied as a method, a data processing system, or as a computer-readable medium storing computer-executable instructions. For example, a computer-readable medium storing instructions to cause a processor to perform steps of a method in accordance with aspects of the disclosed embodiments is contemplated. For example, aspects of the method steps disclosed herein may be executed on a processor on computing device 4701. Such a processor may execute computer-executable instructions stored on a computer-readable medium.
  • Software may be stored within memory 4715 and/or storage to provide instructions to processor 4703 for enabling computing device 4701 to perform various functions. For example, memory 4715 may store software used by computing device 4701, such as operating system 4717, application programs 4719, and associated database 4721. Also, some or all of the computer executable instructions for computing device 4701 may be embodied in hardware or firmware. Although not shown, RAM 4705 may include one or more applications representing the application data stored in RAM 4705 while computing device 4701 is on and corresponding software applications (e.g., software tasks) are running on computing device 4701.
  • Communications module 4709 may include a microphone, keypad, touch screen, and/or stylus through which a user of computing device 4701 may provide input, and may also include one or more of a speaker for providing audio output and a video display device for providing textual, audiovisual and/or graphical output. Computing system environment 4700 may also include optical scanners (not shown). Illustrative usages include scanning and converting paper documents, e.g., correspondence, receipts, and the like, to digital files.
  • Computing device 4701 may operate in a networked environment supporting connections to one or more remote computing devices, such as computing devices 4741, 4751, and 4761. Computing devices 4741, 4751, and 4761 may be personal computing devices or servers that include any or all of the elements described above relative to computing device 4701. Computing device 4761 may be a mobile device (e.g., smart phone) communicating over wireless carrier channel 4771.
  • The network connections depicted in FIG. 47 may include local area network (LAN) 4725 and wide area network (WAN) 4729, as well as other networks. When used in a LAN networking environment, computing device 4701 may be connected to LAN 4725 through a network interface or adapter in communications module 4709. When used in a WAN networking environment, computing device 4701 may include a modem in communications module 4709 or other means for establishing communications over WAN 4729, such as Internet 4731 or other type of computer network. The network connections shown are illustrative and other means of establishing a communications link between the computing devices may be used. Various well-known protocols such as transmission control protocol/Internet protocol (TCP/IP), Ethernet, file transfer protocol (FTP), hypertext transfer protocol (HTTP) and the like may be used, and the system can be operated in a client-server configuration to permit a user to retrieve web pages from a web-based server. Any of various conventional web browsers can be used to display and manipulate data on web pages.
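The client-server exchange contemplated above can be illustrated with a minimal TCP round trip. This is purely illustrative of the networked configuration; the host, port, and payload are arbitrary stand-ins, not values from the disclosure.

```python
# Illustrative sketch only: one TCP request/response of the kind the networked
# environment above contemplates (e.g., serving scan data to a remote viewer).
import socket
import threading

def serve_once(host="127.0.0.1"):
    """Start a one-shot TCP server on an OS-assigned port and return the port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))            # port 0: let the OS pick a free port
    srv.listen(1)

    def handler():
        conn, _ = srv.accept()
        conn.sendall(b"scan-data")  # stand-in payload for imaging results
        conn.close()
        srv.close()

    threading.Thread(target=handler, daemon=True).start()
    return srv.getsockname()[1]

port = serve_once()
client = socket.create_connection(("127.0.0.1", port))
payload = client.recv(1024)
client.close()
print(payload.decode())             # prints "scan-data"
```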
  • The disclosure is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with the disclosed embodiments include, but are not limited to, personal computers (PCs), server computers, hand-held or laptop devices, smart phones, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
  • While aspects of the present disclosure have been described in terms of preferred examples, it will be understood that the disclosure is not limited thereto, since modifications may be made by those skilled in the art, particularly in light of the foregoing teachings.

Claims (10)

1. A device for scanning a part of an anatomy, the device comprising:
a shaft having a distal end, a proximal end, and a longitudinal axis;
a housing disposed at the distal end; and
at least one ultrasound transducer disposed in the housing, wherein the at least one ultrasound transducer is configured to scan a region distal to the housing,
wherein the shaft comprises a channel formed within the shaft, the channel extending from the distal end to the proximal end and configured to allow passage of a surgical tool through the shaft and housing via the channel.
2. The device of claim 1, wherein the channel has a diameter in a range of 24 mm to 40 mm.
3. The device of claim 1, wherein the device is configured for continuous ultrasound imaging as the surgical tool is passed through the channel.
4. The device of claim 1, wherein the channel is offset from the longitudinal axis of the shaft.
5. The device of claim 1, wherein the channel is collinear with the longitudinal axis of the shaft.
6. The device of claim 1, wherein the at least one ultrasound transducer comprises at least 100 ultrasound transducers evenly spaced within the housing.
7. The device of claim 1, wherein the at least one ultrasound transducer is configured to emit an ultrasonic frequency in a direction that is substantially parallel to the longitudinal axis.
8. The device of claim 1, wherein the at least one ultrasound transducer is configured to emit an ultrasonic frequency in a direction that is not parallel to the longitudinal axis.
9. The device of claim 1, wherein the at least one ultrasound transducer comprises a plurality of ultrasound transducers arranged in more than one row on the distal end of the shaft.
10. The device of claim 1, wherein the at least one ultrasound transducer has a transducer frequency within a range of 5 MHz to 20 MHz.
US16/934,714 2013-07-17 2020-07-21 Identifying Anatomical Structures Abandoned US20210169447A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/934,714 US20210169447A1 (en) 2013-07-17 2020-07-21 Identifying Anatomical Structures

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
US201361847517P 2013-07-17 2013-07-17
US201361867534P 2013-08-19 2013-08-19
US201361868508P 2013-08-21 2013-08-21
US201361899179P 2013-11-02 2013-11-02
US201361921491P 2013-12-29 2013-12-29
US201461929083P 2014-01-19 2014-01-19
US201461977594P 2014-04-09 2014-04-09
US14/329,940 US10154826B2 (en) 2013-07-17 2014-07-12 Device and method for identifying anatomical structures
US201462051670P 2014-09-17 2014-09-17
US201562129866P 2015-03-08 2015-03-08
US201562129862P 2015-03-08 2015-03-08
PCT/US2015/050404 WO2016044411A1 (en) 2014-09-17 2015-09-16 Identifying anatomical structures
US15/063,152 US10716536B2 (en) 2013-07-17 2016-03-07 Identifying anatomical structures
US16/934,714 US20210169447A1 (en) 2013-07-17 2020-07-21 Identifying Anatomical Structures

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/063,152 Continuation US10716536B2 (en) 2013-07-17 2016-03-07 Identifying anatomical structures

Publications (1)

Publication Number Publication Date
US20210169447A1 true US20210169447A1 (en) 2021-06-10

Family

ID=56162889

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/063,152 Active 2038-09-28 US10716536B2 (en) 2013-07-17 2016-03-07 Identifying anatomical structures
US16/934,714 Abandoned US20210169447A1 (en) 2013-07-17 2020-07-21 Identifying Anatomical Structures

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/063,152 Active 2038-09-28 US10716536B2 (en) 2013-07-17 2016-03-07 Identifying anatomical structures

Country Status (1)

Country Link
US (2) US10716536B2 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8992558B2 (en) 2008-12-18 2015-03-31 Osteomed, Llc Lateral access system for the lumbar spine
US11166709B2 (en) 2016-08-23 2021-11-09 Stryker European Operations Holdings Llc Instrumentation and methods for the implantation of spinal implants
EP3668442A2 (en) 2017-08-17 2020-06-24 Stryker European Holdings I, LLC Lateral access alignment guide and rigid arm
EP3545857B1 (en) 2018-03-30 2024-01-03 Stryker European Operations Holdings LLC Lateral access retractor and core insertion
CN109171808A (en) * 2018-09-07 2019-01-11 东南大学 Three-dimension ultrasonic imaging system based on measuring three-dimensional profile
CN109691984A (en) * 2018-12-07 2019-04-30 深圳先进技术研究院 A kind of multi-mode imaging system of pancreatic duct
US20210015448A1 (en) * 2019-07-15 2021-01-21 GE Precision Healthcare LLC Methods and systems for imaging a needle from ultrasound imaging data
US11564674B2 (en) 2019-11-27 2023-01-31 K2M, Inc. Lateral access system and method of use
US20210219947A1 (en) * 2020-01-16 2021-07-22 Tissue Differentiation Intelligence, Llc Intraoperative Ultrasound Probe System and Related Methods

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040097801A1 (en) * 1999-11-23 2004-05-20 Sameh Mesallum Method and apparatus for performing transesphageal cardiovascular procedures
US20080195102A1 (en) * 2004-09-15 2008-08-14 Paul Andrew Glazer Hand Held Integrated Pedicle Screw Placement Device
US20120310064A1 (en) * 2011-06-01 2012-12-06 Mcgee David L Ablation probe with ultrasonic imaging capabilities
US20130331706A1 (en) * 2012-06-12 2013-12-12 Volcano Corporation Devices, Systems, and Methods for Forward Looking Imaging
US20140187925A1 (en) * 2012-12-28 2014-07-03 Volcano Corporation Synthetic Aperture Image Reconstruction System In a Patient Interface Module (pim)
US20140288427A1 (en) * 2009-04-03 2014-09-25 James K. Wall Devices and methods for tissue navigation
US20150157387A1 (en) * 2008-11-12 2015-06-11 Trice Medical, Inc. Tissue visualization and modification devices and methods

Family Cites Families (139)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0435653A (en) 1990-05-31 1992-02-06 Fujitsu Ltd Supersonic diagnosis device
JPH05337111A (en) 1992-06-10 1993-12-21 Nippon Koden Corp Ultrasonic diagnostic apparatus
US5361767A (en) 1993-01-25 1994-11-08 Igor Yukov Tissue characterization method and apparatus
US6248073B1 (en) 1995-06-29 2001-06-19 Teratech Corporation Ultrasound scan conversion with spatial dithering
US6569101B2 (en) 2001-04-19 2003-05-27 Sonosite, Inc. Medical diagnostic ultrasound instrument with ECG module, authorization mechanism and methods of use
US6416475B1 (en) 1996-06-28 2002-07-09 Sonosite, Inc. Ultrasonic signal processor for a hand held ultrasonic diagnostic instrument
US6135961A (en) 1996-06-28 2000-10-24 Sonosite, Inc. Ultrasonic signal processor for a hand held ultrasonic diagnostic instrument
US6383139B1 (en) 1996-06-28 2002-05-07 Sonosite, Inc. Ultrasonic signal processor for power doppler imaging in a hand held ultrasonic diagnostic instrument
US6962566B2 (en) 2001-04-19 2005-11-08 Sonosite, Inc. Medical diagnostic ultrasound instrument with ECG module, authorization mechanism and methods of use
US6048311A (en) 1997-05-07 2000-04-11 Washburn; Michael J. Method and apparatus for ultrasound imaging using adaptive gray mapping
US6248072B1 (en) 1997-09-19 2001-06-19 John M. Murkin Hand controlled scanning device
US5935074A (en) 1997-10-06 1999-08-10 General Electric Company Method and apparatus for automatic tracing of Doppler time-velocity waveform envelope
JPH11169375A (en) 1997-12-10 1999-06-29 Hitachi Ltd Ultrasonic probe for diacrisis of tissue
US6066096A (en) 1998-05-08 2000-05-23 Duke University Imaging probes and catheters for volumetric intraluminal ultrasound imaging and related systems
US6181810B1 (en) 1998-07-30 2001-01-30 Scimed Life Systems, Inc. Method and apparatus for spatial and temporal filtering of intravascular ultrasonic image data
US6120445A (en) 1998-10-02 2000-09-19 Scimed Life Systems, Inc. Method and apparatus for adaptive cross-sectional area computation of IVUS objects using their statistical signatures
IL126723A0 (en) 1998-10-22 1999-08-17 Medoc Ltd Vaginal probe and method
US6126601A (en) 1998-10-29 2000-10-03 Gilling; Christopher J. Method and apparatus for ultrasound imaging in multiple modes using programmable signal processor
JP2002532172A (en) 1998-12-15 2002-10-02 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Ultrasound method and apparatus for determining absolute radii of flank walls and arteries in tissue
US6544181B1 (en) 1999-03-05 2003-04-08 The General Hospital Corporation Method and apparatus for measuring volume flow and area for a dynamic orifice
US6685645B1 (en) 2001-10-20 2004-02-03 Zonare Medical Systems, Inc. Broad-beam imaging
US6896658B2 (en) 2001-10-20 2005-05-24 Zonare Medical Systems, Inc. Simultaneous multi-mode and multi-band ultrasonic imaging
US6251073B1 (en) 1999-08-20 2001-06-26 Novasonics, Inc. Miniaturized ultrasound apparatus and method
US6325759B1 (en) 1999-09-23 2001-12-04 Ultrasonix Medical Corporation Ultrasound imaging system
US6450959B1 (en) 2000-03-23 2002-09-17 Ge Medical Systems Global Technology Company Ultrasound B-mode and doppler flow imaging
US6413217B1 (en) 2000-03-30 2002-07-02 Ge Medical Systems Global Technology Company, Llc Ultrasound enlarged image display techniques
EP1345527A4 (en) 2000-11-28 2007-09-19 Allez Physionix Ltd Systems and methods for making non-invasive physiological assessments
US6491636B2 (en) 2000-12-07 2002-12-10 Koninklijke Philips Electronics N.V. Automated border detection in ultrasonic diagnostic images
EP1417000B1 (en) 2001-07-11 2018-07-11 Nuvasive, Inc. System for determining nerve proximity during surgery
US6572547B2 (en) 2001-07-31 2003-06-03 Koninklijke Philips Electronics N.V. Transesophageal and transnasal, transesophageal ultrasound imaging systems
US6537217B1 (en) 2001-08-24 2003-03-25 Ge Medical Systems Global Technology Company, Llc Method and apparatus for improved spatial and temporal resolution in ultrasound imaging
US20030045797A1 (en) 2001-08-28 2003-03-06 Donald Christopher Automatic optimization of doppler display parameters
US6475149B1 (en) 2001-09-21 2002-11-05 Acuson Corporation Border detection method and system
US6579244B2 (en) 2001-10-24 2003-06-17 Cutting Edge Surgical, Inc. Intraosteal ultrasound during surgical implantation
US6746402B2 (en) 2002-01-02 2004-06-08 E. Tuncay Ustuner Ultrasound system and method
US7141020B2 (en) 2002-02-20 2006-11-28 Koninklijke Philips Electronics N.V. Portable 3D ultrasound system
US6579239B1 (en) 2002-04-05 2003-06-17 Ge Medical Systems Global Technology Company, Llc System and method for automatic adjustment of brightness and contrast in images
US6679843B2 (en) 2002-06-25 2004-01-20 Siemens Medical Solutions Usa , Inc. Adaptive ultrasound image fusion
US20050245822A1 (en) 2002-07-22 2005-11-03 Ep Medsystems, Inc. Method and apparatus for imaging distant anatomical structures in intra-cardiac ultrasound imaging
US7074188B2 (en) 2002-08-26 2006-07-11 The Cleveland Clinic Foundation System and method of characterizing vascular tissue
AU2003205634B2 (en) 2003-01-21 2009-05-28 Thd S.P.A. Retractor for surgical operations on the arteria haemorroidalis
US7175597B2 (en) 2003-02-03 2007-02-13 Cleveland Clinic Foundation Non-invasive tissue characterization system and method
US7815572B2 (en) 2003-02-13 2010-10-19 Koninklijke Philips Electronics N.V. Flow spectrograms synthesized from ultrasonic flow color doppler information
US9251593B2 (en) 2003-03-27 2016-02-02 Koninklijke Philips N.V. Medical imaging system and a method for segmenting an object of interest
US7347821B2 (en) 2003-06-26 2008-03-25 Koninklijke Philips Electronics N.V. Adaptive processing of contrast enhanced ultrasonic diagnostic images
US7905840B2 (en) 2003-10-17 2011-03-15 Nuvasive, Inc. Surgical access system and related methods
US7481769B2 (en) 2003-09-30 2009-01-27 Fujifilm Corporation Ultrasonic diagnosing apparatus
US20080009738A1 (en) 2003-11-17 2008-01-10 Koninklijke Philips Electronics N.V. Method for Utilizing User Input for Feature Detection in Diagnostic Imaging
EP1695110A2 (en) 2003-11-21 2006-08-30 Koninklijke Philips Electronics N.V. Ultrasound imaging system and method having adaptive selection of image frame rate and/or number of echo samples averaged
US7215802B2 (en) 2004-03-04 2007-05-08 The Cleveland Clinic Foundation System and method for vascular border detection
EP1740102A4 (en) 2004-03-23 2012-02-15 Dune Medical Devices Ltd Clean margin assessment tool
CN1969183A (en) 2004-06-17 2007-05-23 皇家飞利浦电子股份有限公司 Combined ultrasonic imaging and spectroscopic molecular analysis
DE102004040869B3 (en) 2004-08-23 2005-12-08 RUHR-UNIVERSITäT BOCHUM Ultrasonic image processing device, has quantifier module that quantifies and combines spectral and texture parameters calculated within annulated piece, using local correlation coefficients
BRPI0515158A (en) 2004-09-13 2008-07-08 Koninkl Philips Electronics Nv methods for detecting and / or measuring flow behavior, for detecting a pulsating flow of fluid within an individual, and for detecting whether there is a flow of fluid within the body of an individual who has recently experienced ventricular fibrillation, and, flow behavior detection and / or measurement system
CN100581483C (en) 2004-09-28 2010-01-20 皇家飞利浦电子股份有限公司 Method and apparatus for presenting information concerning flow behavior of a body fluid externally measured by ultrasound
US7876934B2 (en) 2004-11-08 2011-01-25 Siemens Medical Solutions Usa, Inc. Method of database-guided segmentation of anatomical structures having complex appearances
US8265355B2 (en) 2004-11-19 2012-09-11 Koninklijke Philips Electronics N.V. System and method for automated detection and segmentation of tumor boundaries within medical imaging data
WO2006072050A2 (en) 2004-12-30 2006-07-06 Nuvasive, Inc. System and methods for monitoring during anterior surgery
JP4672386B2 (en) 2005-02-17 2011-04-20 株式会社東芝 Ultrasonic probe and ultrasonic diagnostic system
US7680307B2 (en) 2005-04-05 2010-03-16 Scimed Life Systems, Inc. Systems and methods for image segmentation with a multi-stage classifier
US20080188749A1 (en) 2005-04-11 2008-08-07 Koninklijke Philips Electronics N.V. Three Dimensional Imaging for Guiding Interventional Medical Devices in a Body Volume
CN1885937A (en) 2005-06-26 2006-12-27 深圳迈瑞生物医疗电子股份有限公司 Method and apparatus for realizing DVD video recording function in medical ultrasonic system
EP1922563A2 (en) 2005-08-22 2008-05-21 Koninklijke Philips Electronics N.V. Ultrasonic diagnostic imaging system with spectral and audio tissue doppler
US8568317B1 (en) 2005-09-27 2013-10-29 Nuvasive, Inc. System and methods for nerve monitoring
US20070116338A1 (en) * 2005-11-23 2007-05-24 General Electric Company Methods and systems for automatic segmentation of biological structure
US20070167802A1 (en) 2005-12-05 2007-07-19 General Electric Company Accurate time delay estimation method and system for use in ultrasound imaging
CN101351157A (en) 2006-01-03 2009-01-21 皇家飞利浦电子股份有限公司 Method and system for locating blood vessels
US7804990B2 (en) 2006-01-25 2010-09-28 Siemens Medical Solutions Usa, Inc. System and method for labeling and identifying lymph nodes in medical images
US8105239B2 (en) 2006-02-06 2012-01-31 Maui Imaging, Inc. Method and apparatus to visualize the coronary arteries using ultrasound
US8219177B2 (en) 2006-02-16 2012-07-10 Catholic Healthcare West Method and system for performing invasive medical procedures using a surgical robot
US8219178B2 (en) 2007-02-16 2012-07-10 Catholic Healthcare West Method and system for performing invasive medical procedures using a surgical robot
WO2007095637A1 (en) 2006-02-16 2007-08-23 Catholic Healthcare West (D/B/A St. Joseph's Hospital Medical Center) System utilizing radio frequency signals for tracking and improving navigation of slender instruments during insertion into the body
US7466256B2 (en) 2006-03-31 2008-12-16 Siemens Medical Solutions Usa, Inc. Universal ultrasound sigma-delta receiver path
US8162836B2 (en) 2006-06-23 2012-04-24 Volcano Corporation System and method for characterizing tissue based upon split spectrum analysis of backscattered ultrasound
US7729533B2 (en) 2006-09-12 2010-06-01 Boston Scientific Scimed, Inc. Systems and methods for producing classifiers with individuality
DE602007012450D1 (en) 2006-10-02 2011-03-24 Hansen Medical Inc SYSTEM FOR THREE-DIMENSIONAL ULTRASOUND PICTURE
US8057390B2 (en) 2007-01-26 2011-11-15 The Regents Of The University Of Michigan High-resolution mapping of bio-electric fields
US9125589B2 (en) 2007-05-09 2015-09-08 General Electric Company System and method for tissue characterization using ultrasound imaging
CN101784235A (en) 2007-08-28 2010-07-21 皇家飞利浦电子股份有限公司 Dual path color doppler imaging system and method for simultaneous invasive device visualization and vasculature imaging
KR20100080529A (en) 2007-10-05 2010-07-08 신세스 게엠바하 Dilation system and method of using the same
US8105237B2 (en) 2008-05-30 2012-01-31 Volcano Corporation System and method for characterizing tissue based upon homomorphic deconvolution of backscattered ultrasound
US20110213250A1 (en) 2008-09-16 2011-09-01 Koninklijke Philips Electronics N.V. 3-d ultrasound imaging with volume data processing
US8870773B2 (en) 2009-02-09 2014-10-28 The Cleveland Clinic Foundation Ultrasound-guided delivery of a therapy delivery device to a nerve target
US20100204567A1 (en) 2009-02-09 2010-08-12 The Cleveland Clinic Foundation Ultrasound-guided delivery of a therapy delivery device to a phrenic nerve
US8343035B2 (en) 2009-04-20 2013-01-01 Spine View, Inc. Dilator with direct visualization
US20100286519A1 (en) 2009-05-11 2010-11-11 General Electric Company Ultrasound system and method to automatically identify and treat adipose tissue
US20100286518A1 (en) 2009-05-11 2010-11-11 General Electric Company Ultrasound system and method to deliver therapy based on user defined treatment spaces
US9119951B2 (en) 2009-10-12 2015-09-01 Kona Medical, Inc. Energetic modulation of nerves
GB201009006D0 (en) * 2010-05-28 2010-07-14 Ntnu Technology Transfer As Ultrasound acquisition
US9392960B2 (en) 2010-06-24 2016-07-19 Uc-Care Ltd. Focused prostate cancer treatment system and method
CN102370499B (en) 2010-08-26 2014-05-07 深圳迈瑞生物医疗电子股份有限公司 Method and system for simultaneously displaying Doppler image, B-type image and colored blood flow image
US20120116218A1 (en) 2010-11-10 2012-05-10 Jennifer Martin Method and system for displaying ultrasound data
CN102802536B (en) 2010-11-11 2015-01-07 奥林巴斯医疗株式会社 Ultrasound diagnostic device, operation method of ultrasound diagnostic device, and operation program for ultrasound diagnostic device
WO2012071110A1 (en) 2010-11-24 2012-05-31 Boston Scientific Scimed, Inc. Systems and methods for detecting and displaying body lumen bifurcations
WO2012080905A1 (en) 2010-12-14 2012-06-21 Koninklijke Philips Electronics N.V. Ultrasound imaging system and method with peak intensity detection
CN102551791B (en) 2010-12-17 2016-04-27 深圳迈瑞生物医疗电子股份有限公司 A kind of ultrasonic imaging method and device
US8715187B2 (en) 2010-12-17 2014-05-06 General Electric Company Systems and methods for automatically identifying and segmenting different tissue types in ultrasound images
US8876721B2 (en) 2011-02-01 2014-11-04 Fujifilm Corporation Ultrasound diagnostic apparatus
US20130023767A1 (en) 2011-05-12 2013-01-24 Mammone Richard J Low-cost, high fidelity ultrasound system
ITPI20110054A1 (en) 2011-05-16 2012-11-17 Echolight S R L ULTRASOUND APPARATUS TO ASSESS THE STATE OF THE BONE STRUCTURE OF A PATIENT
US8754888B2 (en) 2011-05-16 2014-06-17 General Electric Company Systems and methods for segmenting three dimensional image volumes
CA2845044C (en) 2011-08-12 2023-03-28 Jointvue, Llc 3-d ultrasound imaging device and methods
CN103841898B (en) 2011-09-30 2016-12-21 皇家飞利浦有限公司 There is the ultrasonic system of the automatic dynamic doppler flow setting moved with sample volume
MX346426B (en) 2011-09-30 2017-03-21 Koninklijke Philips Nv Ultrasound system with automated doppler flow settings.
US20130085394A1 (en) * 2011-10-04 2013-04-04 Sonivate Medical, Inc. Glove with integrated sensor
US9241761B2 (en) 2011-12-28 2016-01-26 Koninklijke Philips N.V. Ablation probe with ultrasonic imaging capability
US8663116B2 (en) 2012-01-11 2014-03-04 Angiodynamics, Inc. Methods, assemblies, and devices for positioning a catheter tip using an ultrasonic imaging system
JP5992539B2 (en) 2012-01-18 2016-09-14 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Ultrasound guidance of needle path in biopsy
CN103505288B (en) 2012-06-29 2017-11-17 通用电气公司 Ultrasonic imaging method and supersonic imaging apparatus
US9375196B2 (en) 2012-07-12 2016-06-28 Covidien Lp System and method for detecting critical structures using ultrasound
CN102940510B (en) 2012-08-31 2014-10-08 华南理工大学 Automatic focusing method for ultrasonic elastography
US10631780B2 (en) 2012-12-05 2020-04-28 Philips Image Guided Therapy Corporation System and method for non-invasive tissue characterization
US11154199B2 (en) 2013-01-30 2021-10-26 Koninklijke Philips N.V. Imaging system with hyperspectral camera guided probe
US9592027B2 (en) * 2013-03-14 2017-03-14 Volcano Corporation System and method of adventitial tissue characterization
CA2912791C (en) 2013-05-24 2023-08-01 Sunnybrook Research Institute System and method for classifying and characterizing tissues using first-order and second-order statistics of quantitative ultrasound parametric maps
WO2014207627A1 (en) 2013-06-26 2014-12-31 Koninklijke Philips N.V. Method and system for multi-modal tissue classification
US10154826B2 (en) * 2013-07-17 2018-12-18 Tissue Differentiation Intelligence, Llc Device and method for identifying anatomical structures
WO2015039302A1 (en) 2013-09-18 2015-03-26 Shenzhen Mindray Bio-Medical Electronics Co., Ltd Method and system for guided ultrasound image acquisition
US9980738B2 (en) 2013-09-25 2018-05-29 University of Pittsburgh—of the Commonwealth System of Higher Education Surgical tool monitoring system and methods of use
US9277902B2 (en) 2013-11-22 2016-03-08 General Electric Company Method and system for lesion detection in ultrasound images
WO2015092667A1 (en) 2013-12-20 2015-06-25 Koninklijke Philips N.V. System and method for tracking a penetrating instrument
CN113576679A (en) 2013-12-20 2021-11-02 皇家飞利浦有限公司 User interface for photonic tools and electromagnetic tracking guided bronchoscopes
WO2015101913A1 (en) 2014-01-02 2015-07-09 Koninklijke Philips N.V. Ultrasound navigation/tissue characterization combination
US9324155B2 (en) 2014-03-10 2016-04-26 General Electric Company Systems and methods for determining parameters for image analysis
CN110811691B (en) 2014-03-20 2022-08-05 深圳迈瑞生物医疗电子股份有限公司 Method and device for automatically identifying measurement items and ultrasonic imaging equipment
JP6006249B2 (en) 2014-03-24 2016-10-12 富士フイルム株式会社 Acoustic wave processing device, signal processing method and program for acoustic wave processing device
JP6397578B2 (en) 2014-09-17 2018-09-26 アバス サージカル,エル・エル・シー Anatomical structure identification
US9655592B2 (en) 2014-11-21 2017-05-23 General Electric Corporation Method and apparatus for rendering an ultrasound image
JP6386093B2 (en) 2015-01-08 2018-09-05 富士フイルム株式会社 Photoacoustic measuring device and photoacoustic measuring system
US20160238568A1 (en) 2015-02-18 2016-08-18 Riverside Research Institute Typing and imaging of biological and non-biological materials using quantitative ultrasound
EP3280333A4 (en) 2015-04-09 2018-12-12 Avaz Surgical LLC Device and system for placing securing device within bone
US10430688B2 (en) 2015-05-27 2019-10-01 Siemens Medical Solutions Usa, Inc. Knowledge-based ultrasound image enhancement
EP3310260A4 (en) 2015-06-22 2019-03-06 Sunnybrook Research Institute Systems and methods for prediction of tumor response to chemotherapy using pre-treatment quantitative ultrasound parameters
CN105030279B (en) 2015-06-24 2018-06-29 华南理工大学 A kind of tissue characterization method based on ultrasonic radio frequency time series
US10588605B2 (en) 2015-10-27 2020-03-17 General Electric Company Methods and systems for segmenting a structure in medical images
US20170119356A1 (en) 2015-10-30 2017-05-04 General Electric Company Methods and systems for a velocity threshold ultrasound image
US20170238907A1 (en) 2016-02-22 2017-08-24 General Electric Company Methods and systems for generating an ultrasound image
EP3426159B1 (en) 2016-03-07 2021-05-26 Avaz Surgical LLC Identifying anatomical structures
CN106491161B (en) 2016-11-15 2019-07-05 乐普(北京)医疗器械股份有限公司 A kind of method and device of intelligent organization's identification


Also Published As

Publication number Publication date
US20160183913A1 (en) 2016-06-30
US10716536B2 (en) 2020-07-21

Similar Documents

Publication Publication Date Title
US20210169447A1 (en) Identifying Anatomical Structures
US20180168539A1 (en) Device and System for Placing Securing Device Within Bone
AU2020200104B2 (en) Identifying anatomical structures
US20230240642A1 (en) Device and Method for Identifying Anatomical Structures
US11883064B2 (en) Multi-shield spinal access system
EP3193726B1 (en) Identifying anatomical structures
EP3845173B1 (en) Adaptive surgical system control according to surgical smoke particulate characteristics
EP3845192B1 (en) Adaptive visualization by a surgical system
EP3845175B1 (en) Adaptive surgical system control according to surgical smoke cloud characteristics
EP3845174B1 (en) Surgical system control based on multiple sensed parameters
US6579244B2 (en) Intraosteal ultrasound during surgical implantation
WO2019104329A1 (en) Medical three-dimensional (3d) scanning and mapping system
US20160106392A1 (en) Ultrasonic array for bone sonography
US20210137602A1 (en) Method to precisely place vertebral pedicle anchors during spinal fusion surgery
US11701086B1 (en) Methods and systems for improved nerve detection
US20230346338A1 (en) Portable ultrasound based nerve imaging system
Liu et al. A photoacoustics-enhanced drilling probe for radiation-free pedicle screw implantation in spinal surgery
US20230346211A1 (en) Apparatus and method for 3d surgical imaging

Legal Events

Code STPP: Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
Free format text: NON FINAL ACTION MAILED
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
Free format text: FINAL REJECTION MAILED