WO2024080997A1 - Apparatus and method for tracking hand-held surgical tools (Appareil et procédé de suivi d'outils chirurgicaux portatifs)


Info

Publication number: WO2024080997A1
Authority: WO (WIPO, PCT)
Prior art keywords: surgical tool, hand-held, location, surgical
Application number: PCT/US2022/046753
Other languages: English (en)
Inventors: Christopher Schlenger, Tamas Ungi
Original Assignee: Verdure Imaging, Inc.
Application filed by Verdure Imaging, Inc.
Priority to: PCT/US2022/046753
Publication of: WO2024080997A1

Classifications

    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B17/16 Bone cutting, breaking or removal means other than saws, e.g. osteoclasts; drills or chisels for bones; trepans
    • A61B17/3211 Surgical scalpels, knives; accessories therefor
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/1127 Measuring movement of the entire body or parts thereof using markers
    • A61M5/178 Syringes
    • A61B17/122 Clamps or clips, e.g. for the umbilical cord
    • A61B2017/3413 Needle locating or guiding means guided by ultrasound
    • A61B2034/2055 Optical tracking systems (surgical navigation; tracking or guiding surgical instruments)
    • A61B2090/3937 Visible markers
    • A61B2505/05 Surgical care
    • A61B6/12 Devices for detecting or locating foreign bodies (radiation diagnosis)
    • A61B8/0833 Detecting or locating foreign bodies or organic structures (ultrasonic diagnosis)
    • A61B8/0841 Detecting or locating foreign bodies or organic structures for locating instruments (ultrasonic diagnosis)

Definitions

  • Various electronic devices and systems have been developed to provide visualization aids to the practitioner while they are using a hand-held surgical tool to interact with tissue of a patient.
  • Some systems provide visualization of the position of the working end of the handheld surgical tool by presenting images on a display that is viewable by the practitioner.
  • Sensing location and orientation of the working end of the hand-held surgical tool is typically done using ultrasound devices, computed tomography (CT) scanning devices, x-ray devices, fluoroscopy scanning devices, and/or magnetic resonance imaging (MRI) systems.
  • X-ray imaging, fluoroscopy, and CT scanning use ionizing radiation (X-rays) that may be harmful to the patient and to the practitioner who is performing a surgical procedure in the surgical theater.
  • a patient who is getting an epidural injection in the spine will have to undergo video fluoroscopy for an extended period of time.
  • the practitioner may have to wear cumbersome lead shielding.
  • the patient will also have to use a suitable lead shield.
  • the fluoroscope is usually exceptionally large and can leave only a small space available for the practitioner to work.
  • Intra-operative CT, such as cone beam CT, has similar downsides.
  • ultrasound devices project sound waves into the patient and detect returning sound wave echoes that are used to generate an image, referred to as a sonogram image.
  • Ultrasound devices used in ultrasonographic systems produce sound waves at a frequency above the audible range of human hearing, which is approximately 20 kHz. Sound waves between 2 and 18 MHz are often used for ultrasound medical diagnostic applications. At present, there are no known long-term side effects from interrogating the human body with ultrasound waves.
  • an ultrasound scan can cover only a relatively small part of the patient's body with each scan.
  • the sonogram is a relatively narrow image, covering a relatively small cross-section of only a few inches.
  • objects found in the sonogram image may often be blurry. For example, five hundred to one thousand sonogram images must be captured to acquire enough image data for analysis of a full human spine. Accordingly, legacy ultrasound scanners are inadequate for acquiring image information for the patient’s body when a large area of the human subject must be examined, such as the patient's spine, because the sonogram images are too small, and many sonogram images cannot be easily analyzed to arrive at any meaningful information about the condition of the examined patient.
  • the inventors have created an ultrasonographic system that is operable to acquire sonogram information from a series of ultrasonic scans of a patient, as disclosed in U.S. Patents 9,675,321 and 9,713,508, which are both incorporated by reference herein in their entirety.
  • a series of time-indexed ultrasound scans is taken over a portion of interest on the patient, covering the underlying bone structure or other ultrasound-discernible organ that is under examination.
  • the location and orientation of the ultrasound scanner, and the location of the scanned portion of the patient, are precisely identifiable in the time-indexed sonogram images that are acquired during the scanning process.
  • the data from the series of acquired time-indexed sonogram scans are synthesized into a single data file that is used to generate a three-dimensional (3D) image and/or 3D model of the underlying bone structure or organ of the examined patient, referred to herein as a tissue model.
  • this system alone is unsuitable for use during an actual surgical procedure because the numerous individual sonogram scans used to generate the 3D model of the tissue of interest (bone and/or organs of the patient) must be acquired before initiation of the surgical procedure.
  • the ultrasonographic system disclosed in U.S. Patents 9,675,321 and 9,713,508 was not designed to sense the location and orientation of a hand-held surgical tool during a surgical procedure.
  • Embodiments of the hand-held surgical tool tracking system provide a system and method for tracking location and orientation of a hand-held surgical tool being manipulated by a practitioner to interact with tissue of a patient.
  • Example embodiments determine the location and orientation of a hand-held surgical tool by capturing image data that includes images of at least a first detectable target on the hand-held surgical tool and a second detectable target on the patient. The location and orientation of the first detectable target in a 3D space is determined based on the image data. The current location and orientation of the hand-held surgical tool in the 3D space is determined based on the determined location and orientation of the first detectable target and based on retrieved model data representing the hand-held surgical tool.
  • The location of the second detectable target in the 3D space is determined based on the image data. Then, a location of the patient's tissue in the 3D space relative to the location and orientation of the hand-held surgical tool is determined based on the determined location of the patient's second detectable target. In some embodiments, the location of the hand-held surgical tool relative to the patient's tissue is determined. Then a composite image showing the hand-held surgical tool and an image of the tissue (based on a predefined model of the tissue) is presented on a 2D and/or 3D display, as sketched below.
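  • For illustration only, the following sketch (Python with NumPy and OpenCV assumed available; all function and variable names are hypothetical, not taken from this disclosure) outlines one way the pose of a detectable target could be estimated from captured image data when the target's geometry and the camera intrinsics are known, and how the tool target's pose could then be expressed relative to the patient's target.

```python
import numpy as np
import cv2  # OpenCV is assumed available; it supplies the PnP solver used here

def estimate_target_pose(image_points, target_points, camera_matrix, dist_coeffs):
    """Estimate a detectable target's pose in camera (3D-space) coordinates.

    image_points  -- Nx2 pixel coordinates of the target's distinct portions (N >= 4)
    target_points -- Nx3 coordinates of those portions in the target's own frame
    Returns a 4x4 homogeneous transform T_camera_from_target.
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(target_points, dtype=np.float64),
        np.asarray(image_points, dtype=np.float64),
        camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("target pose could not be estimated")
    R, _ = cv2.Rodrigues(rvec)            # rotation vector -> 3x3 rotation matrix
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = tvec.ravel()
    return T

def tool_pose_relative_to_patient(T_cam_from_tool_target, T_cam_from_patient_target):
    """Express the tool target's pose in the patient target's frame."""
    return np.linalg.inv(T_cam_from_patient_target) @ T_cam_from_tool_target
```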
  • FIGs. 1A-1E are figures of various types of hand-held surgical tools with one or more detectable targets at known locations on the surface of the hand-held surgical tool.
  • FIG. 2 is a schematic view of a hand-held surgical tool tracking system for assisting a practitioner performing a procedure on a human subject.
  • FIG. 3 is a conceptual illustration of a presented composite image showing the relative location of a spine of a patient and a hand-held surgical tool.
  • FIG. 4 is a conceptual diagram of an embodiment of the hand-held surgical tool tracking system and a robotic operation system cooperatively working together to generate a composite image that includes robotic tools and hand-held surgical tools.
  • FIGs. 1A-1E are figures of various, non-limiting example types of hand-held surgical tools with one or more detectable targets 102a, 102b at known locations on the surface of the hand-held surgical tool 104a-e.
  • the detectable targets 102a, 102b are optically detectable, though the detectable targets 102a, 102b may be detectable using non-optical systems.
  • FIG. 2 is a schematic view of a hand-held surgical tool tracking system 100 for assisting a practitioner performing a surgical procedure on a human subject 200 (interchangeably referred to herein as a patient 200).
  • embodiments of the hand-held surgical tool tracking system 100 determine the location and orientation in three-dimensional (3D) space of the hand-held surgical tool 104, and in particular, the working end (tool end) of the hand-held surgical tool 104. Accordingly, the location and orientation in the 3D space of the hand-held surgical tool 104 that is being manipulated by the practitioner during a surgical procedure is determinable in real-time or near real-time.
  • a real-time or near real-time rendering and presentation of an image showing the interaction of the working end 106 of the hand-held surgical tool 104 with a two-dimensional (2D) or three-dimensional (3D) model of the tissue of the patient 200 is presented on a display. Accordingly, the practitioner may intuitively understand the actual location of the working end 106 of the hand-held surgical tool 104 with respect to the tissue that is being operated on.
  • the presented image of the tissue and the hand-held surgical tool 104 may be presented as a 2D image and/or a 3D image. In some embodiments, the practitioner (or an assistant) wears virtual reality glasses to view the 3D image of the working end 106 of the hand-held surgical tool 104 interacting with the tissue of interest. Multiple displays may concurrently present the 2D image and/or the 3D image so that other parties may view the ongoing surgical procedure. Further, the time-sequenced 2D images and/or 3D images may be saved for later review.
  • substantially means to be more-or-less conforming to the particular dimension, range, shape, concept, or other aspect modified by the term, such that a feature or component need not conform exactly.
  • a “substantially cylindrical” object means that the object resembles a cylinder, but may have one or more deviations from a true cylinder.
  • “Comprising,” “including,” and “having” (and conjugations thereof) are used interchangeably to mean including but not necessarily limited to, and are open-ended terms not intended to exclude additional elements or method steps not expressly recited.
  • “Coupled” means connected, either permanently or releasably, whether directly or indirectly through intervening components. “Secured to” means directly connected without intervening components.
  • “Communicatively coupled” means that an electronic device exchanges information with another electronic device, either wirelessly or with a wire based connector, whether directly or indirectly through a communication network. “Controllably coupled” means that an electronic device controls operation of another electronic device.
  • “Real-time” means essentially instantaneous, whereas “near real-time” is delayed by some amount of time (whether by a few milliseconds or even less).
  • reference numeral 104 may be used to generically represent any hand-held surgical tool, and reference numeral 104a may be used to identify a specific hand-held surgical tool, such as the example scalpel 104a (FIG. 1A).
  • Optical targets 102 may be any conventional, specially developed, or later developed optical target that is discernable by an optical tracking system of the hand-held surgical tool tracking system 100.
  • the optical targets 102 may extend in three dimensions about three coordinate axes and include distinct optical target portions representing each axis. In other examples, the optical target 102 may extend in three dimensions about six axes and include distinct optical targets representing each of the six axes.
  • Optical targets 102 are discernible to the human eye and are discernible in a captured image (photographic image or video image).
  • the detectable targets 102 may be active, such as by emitting (or reflecting) electromagnetic signals or other energy signals from the detectable target 102.
  • the detectable target 102 may be passive, such as a retro-reflective marker that reflects radiation energy emitted by some interaction device.
  • Such active or passive targets 102 are generically described herein as detectable targets 102 for brevity, though such detectable targets 102 may not be optically detectable by an image capture device. Any suitable detectable target 102 that is now known or later developed is intended to be within the scope of this disclosure and to be protected by the accompanying claims.
  • each of the detectable targets 102 is uniquely identifiable. In some embodiments, a visible characteristic of an optical target (or at least one optical target) can be compared to known identification characteristics of the optical target so that a particular optical target 102 is identifiable.
  • the shapes of the optical targets 102 may differ. If the optical targets 102 are small marks, such as dots or the like, the number of marks for each of the optical targets 102 may identify that particular optical target 102.
  • a unique alpha-numeric identifier may be located proximate to or on the optical targets 102.
  • Active detectable targets 102 may emit different signals with identifier information. In embodiments that employ only a single optical target 102, that single optical target 102 is inherently identifiable.
  • the non-limiting example embodiment of the hand-held surgical tool tracking system 100 comprises a processor system 202, at least one hand-held surgical tool 104, and an optional remote rendering and display system 204.
  • the processor system 202 comprises at least one image capture device 206, a target tracker unit 208, an image registration module 210, an image processing algorithm module 212, a 3D/2D visualization module 214, a surgical tool database 216, a tissue model database 218, and a user input device 220.
  • Some embodiments include an optional clock 222.
  • Some embodiments may include an optional display 224.
  • the processor system 202 may be communicatively coupled to the remote rendering and display system 204 via a wire-based or wireless connection 226.
  • Other embodiments of the hand-held surgical tool tracking system 100 may include some, or may omit some, of the above-described components. Further, additional components not described herein may be included in alternative embodiments.
  • the discernable targets 102 are optical targets 102.
  • the optical targets are discernable to a human viewer and are detectable in a photographic image.
  • the image capture device 206 is a still or video camera device, and the acquired image data are still image photograph data or photographic video data.
  • FIGs. 1A-1E represent a non-limiting selection of example hand-held surgical tools 104a-104e.
  • Any particular hand-held surgical tool may include one or more detectable targets 102.
  • Each type of hand-held surgical tool 104 is uniquely identifiable.
  • An identifier of the hand-held surgical tool 104 may be located on each hand-held surgical tool 104.
  • the identifier may be an alpha-numeric identifier, a bar or QR code, or the like that can be identified to the hand-held surgical tool tracking system 100.
  • a captured image or a scan of the hand-held surgical tool 104 may be analyzed by the hand-held surgical tool tracking system 100 to determine the identifier of the hand-held surgical tool 104 using any suitable alphanumeric text character recognition algorithm.
  • a radio frequency identifier (RFID) tag may also be used to identify the hand-held surgical tool 104.
  • an object recognition algorithm may be used to identify the surgical tool 104 based on captured image data.
  • the practitioner or another party may manually specify the identifier of the hand-held surgical tool 104 to the hand-held surgical tool tracking system 100 (such as by entering it using a keyboard or a touch-sensitive screen, or by speaking it to an audible detection and recognition system).
  • hand-held surgical tool model data corresponding to the identified surgical tool 104 can be retrieved so that the processor system 202 can determine the precise location and orientation of the identified hand-held surgical tool 104, and in particular the working end 106 located at the distal end of the hand-held surgical tool 104, during the surgical procedure.
  • the particular hand-held surgical tool 104 is appreciated to be a scalpel 104a.
  • the scalpel 104a has a working end 106a (interchangeably referred to herein as a tool 106a or blade 106a) that is understood by one skilled in the art to be the cutting edge or blade 106a located at the distal end of the scalpel 104a.
  • Prior to use of the scalpel 104a during the surgical procedure, one or more identifiable optical targets 102 are placed at known locations (or are fabricated at the known locations) on the surface of the scalpel 104a in the various preferred embodiments. Preferably, each of the optical targets 102 will be uniquely identifiable.
  • a variety of different types of detectable targets 102 may be used in the various embodiments.
  • a non-limiting first type is the example optical target 102a that has a plurality of distinct optical target portions.
  • the location and orientation of that optical target 102a can be precisely determined based on image characteristics of the image target portions (i.e., relative location to each other).
  • the precise location and orientation of the blade 106a in the 3D space can be determined using a stored model corresponding to the scalpel 104a.
  • a model of the scalpel 104a with data corresponding to the example optical target 102 has been predetermined and stored.
  • the location of each portion of the scalpel 104a relative to the optical target 102a, and in particular the precise location and orientation of the blade 106a to the optical target 102a, is represented in the model data of the scalpel 104a.
  • embodiments of the hand-held surgical tool tracking system 100 determine the precise location and orientation of the scalpel 104a, and in particular the precise location and orientation of the blade 106a, in 3D space, as illustrated in the sketch below.
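  • The sketch below is a hedged illustration of that model-based step: it composes an estimated target pose with a stored transform giving the blade's pose in the target's frame. The model format, the numeric offset, and all names are assumptions for illustration, not data from the disclosure.

```python
import numpy as np

# Hypothetical stored model entry for the scalpel 104a: the blade tip's pose
# expressed in the coordinate frame of the optical target 102a (values invented).
SCALPEL_MODEL = {
    "tool_id": "scalpel-104a",
    "T_target_from_blade": np.array([
        [1.0, 0.0, 0.0, 0.120],   # e.g. blade tip 120 mm along the target's x axis
        [0.0, 1.0, 0.0, 0.000],
        [0.0, 0.0, 1.0, 0.000],
        [0.0, 0.0, 0.0, 1.000],
    ]),
}

def blade_pose_in_camera(T_camera_from_target, model=SCALPEL_MODEL):
    """Compose the measured target pose with the stored model to get the blade pose."""
    return T_camera_from_target @ model["T_target_from_blade"]
```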
  • a plurality of optical targets 102b may be located on the surface of the hand-held surgical tool 104.
  • Such optical targets 102b may be simple marks, such as one or more small colored dots, squares, triangles, stars, or the like, that can be identified in the image data acquired by the image capture device 206. Differing colors of and/or patterns on the mark may be used. Any suitable shape, color and/or pattern may be used in the various embodiments, particularly for mark identification purposes.
  • the precise location and identity of each of the identifiable optical targets 102b on the surface of the handheld surgical tool 104 is known using a stored model corresponding to the scalpel 104a.
  • two optical targets 102b and 102b' are illustrated (wherein a characteristic of the optical target 102b is different from the optical target 102b' for identification purposes).
  • the precise location and orientation of the scalpel 104a in 3D space can be computed (determined) by the processor system 202 based on the identified relative locations of the at least two optical targets 102b with respect to each other. Additional optical markers 102b may be used to improve accuracy of the determined location and orientation of the scalpel 104a.
  • a model of the scalpel 104a with the example optical targets 102b and 102b’ has been predetermined and stored.
  • the location of each portion of the scalpel 104a relative to the optical targets 102b and 102b’, and in particular the precise location and orientation of the blade 106a to the optical target 102b and/or 102b’, is represented in the model data of the scalpel 104a.
  • embodiments of the hand-held surgical tool tracking system 100 determine the precise location and orientation of the scalpel 104a, and in particular the precise location and orientation of the blade 106a, in 3D space.
  • optical targets 102b and 102b' are preferably uniquely identifiable so that the location and orientation of working end 106 of the hand-held surgical tool 104 is determinable. (If the optical targets 102b and 102b' are not identifiable from each other, it is possible that an incorrect orientation and location of the surgical tool 104 could be determined. Employing uniquely identifiable optical targets 102b and 102b' solves this potential problem.)
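  • When three or more uniquely identified point-style markers are visible, one common way to recover the tool pose is a least-squares rigid fit between the marker coordinates stored in the tool model and the marker positions measured in the 3D space. The sketch below uses assumed names and is offered only as an illustration, not as the disclosed algorithm.

```python
import numpy as np

def fit_rigid_transform(model_points, measured_points):
    """Least-squares rigid fit mapping model marker positions onto measured positions.

    model_points, measured_points -- Nx3 arrays of corresponding marker coordinates
    (N >= 3, non-collinear). Returns a 4x4 transform T_space_from_tool.
    """
    P = np.asarray(model_points, dtype=float)
    Q = np.asarray(measured_points, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                             # proper rotation (no reflection)
    t = cq - R @ cp
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T
```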
  • the particular hand-held surgical tool 104 is appreciated to be a generic syringe 104b.
  • the syringe 104b has a working end 106b that is understood by one skilled in the art to be the tip of a needle (located at the distal end of the syringe 104b) that is used to puncture a patient for injecting medicine or drawing tissue samples.
  • Prior to use of the syringe 104b during the surgical procedure, one or more identifiable optical targets 102a and/or 102b are placed at known locations (or are fabricated at the known locations) on the surface of the syringe 104b in the various preferred embodiments.
  • each of the optical targets 102 will be uniquely identifiable. (It is appreciated that one or more optical targets 102a, and/or three or more optical targets 102b, might be used in other embodiments.)
  • some embodiments may be configured to determine the relative distance between the optical targets 102b, 102b’ during the surgical procedure.
  • the optical target 102b may be located on a plunger 110 of the syringe 104b.
  • the second optical target 102b’ may be located on the barrel 112 of the syringe 104b.
  • the change in distance between the optical targets 102b, 102b’ during the course of the surgical procedure may then be used to precisely compute a travel distance of the barrel seal 114 that is at the distal end of the plunger 110.
  • since the volume of the barrel 112 is known and the volume information is stored as part of the syringe model data, the volume of medicine injected from the syringe 104b and/or the amount of tissue collected within the syringe 104b may be precisely determined based on the determined travel distance of the barrel seal 114 (see the sketch below). This information may be presented to the practitioner on a real-time basis during the surgical procedure.
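  • A minimal sketch of that volume computation, assuming the barrel's inner diameter is part of the stored syringe model data (the names and numbers are illustrative only):

```python
import math

def injected_volume_ml(separation_before_mm, separation_after_mm, barrel_inner_diameter_mm):
    """Volume displaced by the barrel seal 114, computed from the change in distance
    between the plunger marker 102b and the barrel marker 102b' (model data assumed)."""
    travel_mm = separation_before_mm - separation_after_mm      # plunger pushed in
    bore_area_mm2 = math.pi * (barrel_inner_diameter_mm / 2.0) ** 2
    return travel_mm * bore_area_mm2 / 1000.0                   # 1 mL = 1000 mm^3

# Example (invented numbers): 25 mm of plunger travel in a 10 mm bore is about 1.96 mL.
print(round(injected_volume_ml(60.0, 35.0, 10.0), 2))
```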
  • the particular hand-held surgical tool 104 is appreciated to be a generic surgical scissor 104c.
  • the surgical scissor 104c has a first working end 106c and a second working end 106c’ (located at the distal end of the surgical scissor 104c) that is understood by one skilled in the art to be the cutting end that is used to cut tissue during a surgical procedure.
  • a plurality of identifiable optical targets 102 are placed at known locations (or are fabricated at the known locations) of the surface of the surgical scissor 104c in the various preferred embodiments.
  • each of the optical targets 102 will be uniquely identifiable.
  • each of the two surgical tool members 108c and 108c’ of the surgical scissor 104c must have its own optical targets 102.
  • the first surgical tool member 108c includes a first optical target 102b and a second optical target 102b’.
  • the second surgical tool member 108c’ includes a third optical target 102b” and a fourth optical target 102b’”.
  • the determined location of the optical targets 102b”, 102b’” may be used to determine the precise location and orientation of the working end 106c.
  • the determined location of the optical targets 102b, 102b' may be used to determine the precise location and orientation of the working end 106c’.
  • the relative separation between the optical targets 102b and 102b”, and/or the optical targets 102b’ and 102b”' may be used in the computations.
  • a plurality of optical targets 102a may be used to track the members 108c, 108c’.
  • Three optical targets 102b may be used (see FIG. 1D, for example).
  • the surgical scissor 104c may be closed at some point during the surgical procedure wherein the working ends 106c, 106c’ are adjacent to and/or are in contact with each other. At another point during the surgical procedure, the surgical scissor 104c may be opened such that the working ends 106c, 106c’ are separated from each other by a determinable distance so that the practitioner can insert the working ends 106c, 106c’ around the tissue that is to be cut. Further, as the practitioner is cutting the tissue of interest, the working ends 106c, 106c’ are moving towards each other in a cutting motion. Accordingly, determination of the precise location and orientation of the surgical scissor 104c, and in particular the working ends 106c, 106c’, is determinable in real-time or near real-time.
  • the particular hand-held surgical tool 104 is appreciated to be a generic surgical clamp 104d.
  • the surgical clamp 104d has a first working end 106d and a second working end 106d' (located at the distal end of the surgical clamp 104d) that is understood by one skilled in the art to be clamps that are used to clamp tissue during a surgical procedure.
  • a plurality of identifiable optical targets 102 are placed at known locations (or are fabricated at the known locations) of the surface of the surgical clamp 104d in the various preferred embodiments.
  • each of the optical targets 102 will be uniquely identifiable.
  • the surgical clamp 104d may be opened such that the working ends 106d, 106d’ are separated from each other by a determinable distance so that the practitioner can insert the working ends 106d, 106d’ around the tissue that is to be clamped.
  • the surgical clamp 104d may be later closed during the surgical procedure wherein the working ends 106d, 106d’ are adjacent to and/or are in contact with each other to clamp the tissue of interest.
  • the working ends 106d, 106d’ move towards each other in a clamping motion. Accordingly, the precise location and orientation of the surgical clamp 104d, and in particular the working ends 106d, 106d’, is determinable in real-time or near real-time.
  • each of the two surgical tool members 108d and 108d’ of the surgical clamp 104d are conceptually illustrated as having their own optical target 102a.
  • the first surgical tool member 108d includes a first optical target 102a.
  • the second surgical tool member 108d’ includes a second optical target 102b’.
  • each of the two surgical tool members 108d and 108d’ of the surgical clamp 104d is conceptually illustrated as having its own optical targets 102b and 102b’, respectively.
  • a third optical target 102b" is located at the hinge of the surgical clamp 104d (where the first surgical tool member 108d is hingeably coupled to the second surgical tool member 108d’).
  • the precise location and orientation of the first surgical tool member 108d can be determined based on the determined location of the first optical target 102b and the hinge optical target 102b”.
  • the precise location and orientation of the second surgical tool member 108d’ can be determined based on the determined location of the optical target 102b’ and the hinge optical target 102b”. Once the precise location and orientation of each surgical tool member 108d, 108d’ is determined, the precise location and orientation of the surgical clamp 104d and its associated working ends 106d, 106d’ can be determined by the processor system 202 in real-time or near real-time (see the sketch below).
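  • As a hedged illustration of tracking a hinged tool from a hinge marker plus one marker per member, the sketch below derives each member's direction and the tool's opening angle; the function names are assumptions, not from the disclosure.

```python
import numpy as np

def member_direction(hinge_xyz, member_marker_xyz):
    """Unit vector pointing from the hinge marker toward a member's marker."""
    v = np.asarray(member_marker_xyz, dtype=float) - np.asarray(hinge_xyz, dtype=float)
    return v / np.linalg.norm(v)

def opening_angle_deg(hinge_xyz, marker_a_xyz, marker_b_xyz):
    """Angle between the two members of a hinged tool (clamp or scissor), in degrees."""
    a = member_direction(hinge_xyz, marker_a_xyz)
    b = member_direction(hinge_xyz, marker_b_xyz)
    return float(np.degrees(np.arccos(np.clip(a @ b, -1.0, 1.0))))
```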
  • a hand-held surgical tool 104 may have only one type of detectable target 102 or a plurality of different types of detectable targets 102.
  • the nonlimiting surgical clamp 104d uses two different types of optical targets 102a and 102b.
  • Other embodiments of the surgical clamp 104d may use only one of the types of the optical targets 102 (and/or may use other types of optical targets 102).
  • Use of different types of optical targets 102 enables the same hand-held surgical tool 104 to be used with different embodiments of the hand-held surgical tool tracking system 100 that employ different image processing algorithm modules 212 configured to identify a particular type of optical target 102. This feature of detectable targets 102 may be used on any hand-held surgical tool 104.
  • the particular hand-held surgical tool 104 is appreciated to be a generic surgical tweezer 104e.
  • the surgical tweezer 104e has a first working end 106e and a second working end 106e’ (located at the distal end of the surgical tweezer 104e) that is understood by one skilled in the art to be the grasping tool end that is used to grasp tissue during a surgical procedure.
  • a plurality of identifiable optical targets 102 are placed at known locations (or are fabricated at the known locations) on the surface of the surgical tweezer 104e in the various preferred embodiments.
  • each of the optical targets 102 will be uniquely identifiable.
  • each of the two surgical tool members 108e and 108e’ of the surgical tweezer 104e is conceptually illustrated as having its own optical target 102a, 102a’, respectively.
  • FIGs. 1A-1E are intended to represent only a sampling of possible hand-held surgical tools that may be tracked using detectable targets 102.
  • some hand-held surgical tools may have working ends located at both the proximal and distal ends of the hand-held surgical tool (such as encountered in the dentistry arts).
  • All such manually manipulated hand-held surgical tools 104 now known or later developed are intended to be within the scope of this disclosure and to be protected by the accompanying claims.
  • Some embodiments may employ a plurality of different detectable targets 102a, 102b located on other surfaces of the hand-held surgical tool 104.
  • detectable targets 102a, 102b may be located on the opposing surface of the scalpel 104a and/or on the side surfaces of the scalpel 104a.
  • multiple detectable optical targets 102a, 102b may be on the same surface of the hand-held surgical tool 104 in case the practitioner’s hand or another object blocks a view of the hand-held surgical tool 104 by the image capture device 206 during the surgical procedure. Accordingly, a sufficient number of detectable optical targets 102a, 102b will always be discernable in an image that is being captured by the image capture device 206 (FIG. 2). Therefore, the practitioner does not need to worry about how they are grasping the hand-held surgical tool 104 during the surgical procedure.
  • the precise location and orientation of the working end 106 of any hand-held surgical tool 104 is precisely determined based on a determined location of the one or more detectable targets 102.
  • One skilled in the arts understands how such location and orientation calculations are performed to identify the precise location and orientation of the hand-held surgical tool 104 in a known 3D space based on the determined locations of detectable targets 102, such as the example optical targets 102a, 102b. Accordingly, such geometry calculations are not described herein for brevity.
  • the location of each detectable target 102 on the surface of the hand-held surgical tool 104 is known.
  • the known location of each detectable target 102 may be based on design specifications.
  • each of the detectable targets 102 are fabricated at their precisely known locations and/or orientations.
  • An unexpected advantage of incorporating the detectable target(s) 102 as part of a manufactured hand-held surgical tool 104 is that hundreds, even thousands, of like hand-held surgical tools 104 may be manufactured and distributed to different sites. If the site of the surgical procedure has an embodiment of the hand-held surgical tool tracking system 100, then the precise location and orientation of the hand-held surgical tool 104 can be determined during the surgical procedure. However, if the site does not have an embodiment of the hand-held surgical tool tracking system 100, the hand-held surgical tool 104 may still be used by the practitioner to perform the surgical procedure in a conventional manner.
  • the detectable target(s) 102 may be placed onto the surface of the hand-held surgical tool 104 after its manufacture.
  • each detectable target 102 is secured to the surface of the hand-held surgical tool 104 at a precisely known location and/or orientation (using an adhesive, by painting, or the like).
  • the detectable target(s) 102 are secured to many of the like hand-held surgical tools 104 at identical locations prior to distribution to the different sites where a surgical procedure is to be performed.
  • An unexpected advantage of distributing many like hand-held surgical tools 104, each with their detectable targets 102 at the identical location and/or orientation on the hand-held surgical tool 104, is that a single model of that particular hand-held surgical tool 104 may be generated and stored.
  • the hand-held surgical tool model data may be saved locally at the surgical instrument model database 216 (FIG. 2) and/or remotely in a remote database.
  • thousands of scalpels 104a may be manufactured and distributed to many different surgical sites (e.g., hospitals, clinics, etc.).
  • the practitioner can use any of the available scalpels 104a during the surgical procedure.
  • different types of hand-held surgical tools 104 may be used by the practitioner during the surgical procedure since each different type of hand-held surgical tool 104 is readily identifiable by embodiments of the hand-held surgical tool tracking system 100.
  • another advantage of any particular type of hand-held surgical tool 104 with precisely located detectable targets 102 is that different manufacturers of that particular type of hand-held surgical tool 104 may provide their proprietary tool 104 to practitioners for various surgical procedures. If the manufacturers produce identical hand-held surgical tools 104, a single model may represent the hand-held surgical tool 104 that has been distributed by the different manufacturers. In the event that there are differences between a particular type of hand-held surgical tool 104 produced by different manufacturers, then a hand-held surgical tool model may be generated and saved for each particular manufacturer.
  • the detectable targets 102 may be secured to a handheld surgical tool 104 of interest (using a suitable adhesive or paint) by the practitioner or another party prior to use.
  • one or more images of the hand-held surgical tool 104 are captured after manual placement of the optical targets 102 during a hand-held surgical tool calibration process.
  • Embodiments of the hand-held surgical tool tracking system 100 then analyze the acquired image data to identify the precise location and/or orientation of the optical target(s) 102, and the location of the working end(s) 106 of the hand-held surgical tool 104. Then, embodiments of the hand-held surgical tool tracking system 100 may generate a calibrated model of the photographed hand-held surgical tool 104.
  • An unexpected advantage of this embodiment is that any hand-held surgical tool 104 of interest may be fitted with one or more optical targets 102, and then may be used during the surgical procedure. A sketch of such a calibration appears below.
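  • A minimal sketch of that calibration, assuming a single calibration capture in which the newly attached target's pose and a digitized working-end point are both known in camera coordinates (all names are illustrative, not from the disclosure):

```python
import numpy as np

def calibrate_working_end(T_camera_from_target, working_end_xyz_camera):
    """Express the working end's position in the attached target's frame.

    T_camera_from_target    -- target pose estimated during the calibration capture
    working_end_xyz_camera  -- working-end position located in the same capture
    Returns the 3-vector that would be stored as the (assumed) tool model.
    """
    p = np.append(np.asarray(working_end_xyz_camera, dtype=float), 1.0)  # homogeneous
    return (np.linalg.inv(T_camera_from_target) @ p)[:3]
```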
  • the patient 200 is lying on their stomach on an operating table or the like.
  • the practitioner (not shown) is intending to insert the needle 106b (the working end) of the syringe 104b into an intended location within the patient’s spine (not shown).
  • although the precise location and orientation of the syringe 104b in the 3D space is determinable, it is not possible to know the exact location of the patient’s tissue of interest (here, the patient’s spine) because of the patient’s covering skin.
  • a calibration of the patient 200 is required before initiation of the surgical procedure.
  • at least one detectable target 228 is placed on the patient 200 or at another suitable location proximate to the patient 200.
  • an image capture device 206 acquires an image of the optical target 228, and the precise location and orientation of the optical target 228 is then determined. Since the optical target 228 is located at a known location with respect to (proximate to) the patient, the location of the tissue of interest may be computed based on information that is known about the patient 200. That is, a location relationship between a tissue of interest of the patient 200 and the predefined location of the detectable target 228 is known.
  • the relative location relationship between the tissue and the hand-held surgical tool 104 can then be determined during the course of the surgical procedure, as sketched below.
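  • In transform terms, that relative relationship can be obtained by chaining the camera-to-tool pose, the camera-to-patient-target pose, and the calibrated patient-target-to-tissue relationship; the sketch below uses assumed names and is not taken from the disclosure.

```python
import numpy as np

def tool_pose_in_tissue_frame(T_cam_from_tool, T_cam_from_patient_target,
                              T_patient_target_from_tissue):
    """Pose of the hand-held surgical tool expressed in the tissue-of-interest frame.

    T_patient_target_from_tissue comes from the patient calibration step and relates
    the detectable target 228 to the coordinate frame of the stored tissue model.
    """
    T_cam_from_tissue = T_cam_from_patient_target @ T_patient_target_from_tissue
    return np.linalg.inv(T_cam_from_tissue) @ T_cam_from_tool
```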
  • This patient tissue calibration approach may be adequate if the tissue of interest is visible on the surface of the patient 200, and/or if a high degree of precision is not needed. For example, if the tissue of interest is a mole or the like on the skin of the patient 200, this patient tissue calibration technique may be adequate.
  • Location of the tissue of interest may be enhanced by embodiments of the hand-held surgical tool tracking system 100 using object recognition techniques. For example, an image of the skin of the patient may be analyzed to identify and locate the mole on the patient’s skin.
  • patient tissue calibration may not be sufficient to identify the precise location and orientation of other types of tissue of interest, particularly if the tissue of interest is internal and/or if the tissue is especially sensitive.
  • when the needle 106b is to be used to puncture the spine of the patient 200, a very precise patient calibration is required.
  • patient calibration is performed using another device or with an embodiment of the hand-held surgical tool tracking system 100.
  • the optical target 228 may be placed on the patient 200, preferably in proximity to the location of the tissue of interest.
  • an ultrasonographic system in accordance with U.S. Patents 9,675,321 and 9,713,508, or an embodiment of the hand-held surgical tool tracking system 100 modified to incorporate the features of the above-identified ultrasonographic system, may be used to acquire one or more ultrasonic scans of the tissue of interest.
  • since the precise location and orientation of the ultrasonic scanner is known (because it also has one or more discernable optical targets 102 on its surface), the precise location and orientation of the patient’s tissue of interest with respect to the patient’s detectable target 228 is determinable. Once patient tissue calibration has been completed, the surgical procedure may be initiated.
  • the detectable target 228 may have a portion that is optically detectable, and a portion that is detectable in X-ray images, CT images, fluoroscopy images, or the like. For example, one or more metal beads or the like may be at known locations on the detectable target 228. Prior to the initiation of the surgical procedure, the detectable target 228 may be secured to the patient 200. Then, one or more images may be acquired of the patient 200. Since the tissue of interest and the detectable target 228 are discernable in the acquired image(s), the precise location and orientation of the optical target 228 relative to the tissue of interest can be determined by the processor system 202.
  • an MRI and/or CT system may be used to scan the patient. Since the tissue of interest and the optical target 228 are discernable in the acquired MRI and/or CT data, then the precise location and orientation of the detectable target 228 to the tissue of interest can be determined by the processor system 202.
  • the patient tissue calibration process may occur on a real-time basis during the surgical procedure process.
  • sonogram images, X-ray images, or the like can be acquired of the patient 200 (and their detectable target 228) during the surgical procedure.
  • Embodiments of the hand-held surgical tool tracking system 100 may then calibrate in real-time. This calibration approach may be particularly desirable if the tissue of interest is moving during the surgical procedure.
  • the tissue of interest may be a beating heart of the patient 200 or the lungs of the patient 200 who is breathing during the surgical procedure.
  • Any suitable system for acquiring patient tissue calibration information during the surgical procedure is intended to be included within the scope of this disclosure and to be protected by the accompanying claims. Such patient tissue calibration can be acquired prior to, and/or during, the surgical procedure.
  • the 2D or 3D model of the tissue of interest is retrieved from the tissue model database 218.
  • the retrieved tissue model of the patient 200 undergoing the current surgical procedure has been previously generated from an examination of the patient 200 prior to the current surgical procedure, and then stored in the tissue model database 218.
  • the hand-held surgical tool tracking system 100 determines the precise location and orientation of the hand-held surgical tool 104 in 3D space.
  • the image capture device 206 captures images in real time that include the optical targets 102a, 102b on the hand-held surgical tool 104 and the optical target 228 on the patient 200.
  • the clock 222 may optionally provide time stamps to acquired images.
  • the captured image data is then communicated from the image capture device 206 to the target tracker unit 208.
  • a plurality of image capture devices 206 may be used to capture camera images from different viewpoints in a synchronized fashion.
  • the multiple image capture devices 206 provide concurrently captured camera images with the same time stamp.
  • the target tracker unit 208, for each acquired image, identifies the one or more optical targets 102 on the hand-held surgical tool 104 and the optical target 228 that has been placed on the surface of the body of the patient 200.
  • the target tracker unit 208 then computes or determines the precise location and orientation of the optical targets 102 relative to the optical target 228 in 3D space for the indexed time.
  • the image data is analyzed to identify the particular hand-held surgical tool 104 that is currently being used by the practitioner.
  • one or more of the detectable targets 102, 228 may not be optically detectable (since they may not be reflecting light in the visible spectrum).
  • another detection device 206 is used to acquire image data from the detectable targets 102, 228 (that emit energy in other, non-visible spectrums) to detect the precise location and orientation of the detectable targets 102. Then the target tracker unit 208 can determine the precise location and orientation of the detectable targets 102 relative to the detectable target 228 in the 3D space.
  • the image registration module 210 then receives the location and orientation information for the detectable targets 102 and 228.
  • the image registration module 210 also retrieves the model data from the surgical tool model database 216 for the particular handheld surgical tool 104 that is being used by the practitioner (and that has been optionally identified and/or verified in the captured images).
  • the position of the hand-held surgical tool 104, and the position of the working end 106 of the hand-held surgical tool 104, are determined based upon the identified relative locations of the detectable targets 102 and 228 in the acquired camera image and based on correlation with the retrieved model data of the corresponding hand-held surgical tool 104.
  • Information corresponding to the precise location and orientation of the hand-held surgical tool 104 and its working end 106 in the 3D space is then communicated to the image processing algorithm module 212.
  • the image processing algorithm module 212 retrieves the previously generated and saved tissue model data of the tissue of interest of the patient 200 from the tissue model database 218.
  • the tissue model is based on the patient’s tissue of interest during a previous tissue model generation process.
  • the tissue model data can be retrieved when the surgical procedure is initiated.
  • the image processing algorithm module 212 may generate a real-time composite image that includes both an image of the hand-held surgical tool 104 and an image of the tissue of interest (as represented by the tissue model).
  • Data corresponding to the generated real-time composite image is communicated to the 3D/2D visualization module 214.
  • the 3D/2D visualization module 214 then generates image data for presentation on a 3D display and/or a 2D display.
  • the composite image showing the precise location and orientation of the hand-held surgical tool 104, and in particular the working end 106 of the hand-held surgical tool 104 with respect to the tissue of interest (represented by the predetermined tissue model of the tissue of interest) is then rendered and presented on the display 224.
  • the presented composite image may be either a 3D image or a 2D image depending upon the characteristics of the particular display 224.
  • the practitioner or another user may provide input to the 3D/2D visualization module 214 to adjust presentation of the composite image via the user input device 220.
  • Any suitable type of user input device(s) 220 may be used in the various embodiments.
  • the 3D/2D visualization module 214 may modify presentation of the composite image in accordance with the user input.
  • the user input may be a request to zoom in on a particular location or region in the composite image.
  • the practitioner may wish to see a close-up image that presents a magnified image of the working end 106 of the hand-held surgical tool 104 and the tissue that is in proximity to the working end 106. Accordingly, the practitioner may be able to discern in greater detail and accuracy precisely where the working end 106 of the hand-held surgical tool 104 currently is with respect to the tissue of interest.
  • the practitioner or another user may wish to rotate the presented view of the composite image.
  • the practitioner may wish to examine a side view or a bottom of the composite image.
  • the 3D/2D visualization module 214 may rotate the composite image so that the practitioner or another user is then presented the side view or the bottom view on the display 224.
  • the composite image information can be communicated to the remote rendering and display system 204.
  • the composite information that is rendered to present an image of the precise location and orientation of the hand-held surgical tool 104 and the tissue of interest can then be presented on a display of the remote rendering and display system 204.
  • the remote rendering and display system 204 may be a virtual reality system that employs a head set display device.
  • the practitioner or other user viewing the composite image on the display of the remote rendering and display system 204 may wish to adjust the view of the composite image. In some embodiments, information corresponding to the user’s view request is communicated from the remote rendering and display system 204 to the 3D/2D visualization module 214.
  • the 3D/2D visualization module 214 adjusts the composite view and then communicates the adjusted composite view to the remote rendering and display system 204 for presentation on the display.
  • the composite image data that is communicated to the remote rendering and display system 204 may include 3D model data of both the hand-held surgical tool 104 and the tissue of interest.
  • the presented image on the display of the remote rendering and display system 204 may then be adjusted and presented by the remote rendering and display system 204.
  • FIG. 3 is a conceptual illustration of a presented composite image 302 showing the relative location of a spine 304 of a patient 200 and a hand-held surgical tool 104.
  • the spine 304 is an example of a static tissue of interest that is not moving during the surgical procedure.
  • the static tissue of interest might be other types of organs, such as a liver, a brain, a muscle or another bone.
  • the tissue of interest may also be held static by a suitable securing device such as, but not limited to, a clamp, a strap, or the like, that immobilizes the tissue of interest during the surgical procedure.
  • the practitioner is inserting the needle 106b of the syringe 104b in between two vertebrae of the patient’s spine, represented by the rendered spine model 304.
  • One skilled in the art appreciates the delicate nature of this surgical procedure, and appreciates the importance of precisely locating and orienting the tip 106b of the syringe 104b for an injection or for tissue sampling.
  • the practitioner is only viewing the skin surface of the patient 200, as illustrated in FIG. 2.
  • Embodiments of the handheld surgical tool tracking system 100 determine the precise location and orientation of the needle tip 106b relative to the patient’s spine (based on the previously generated tissue model 304 of the patient’s spine), and present a composite image 302 on a display that is viewable by the practitioner or another party. Further, movement of the needle tip 106b can be presented in the presented composite image 302 in real-time or near real-time.
  • the practitioner is able to view the needle tip 106b and the vertebrae of the spine as the needle tip 106b is puncturing the patient 200.
  • the practitioner can operate the syringe 104b to inject medicine into the spine or to retrieve a tissue sample from the spine.
  • Non-stationary tissue presents a more complex issue for presenting the composite image of the tissue of interest and the hand-held surgical tool 104 on a real-time basis.
  • the tissue of interest may be a beating heart of the patient.
  • the previously generated tissue model of the tissue of interest may also include tissue movement characteristics.
  • the generated 3D model must have sufficient data to represent the beating of the heart. That is, the moving portions of the patient’s heart must be visible in the presented composite image that represents the patient’s beating heart and the precise location and orientation of the hand-held surgical tool 104.
  • Embodiments of the hand-held surgical tool tracking system 100 are configured to synchronize presentation of movement of the dynamic tissue model with actual movement of the patient’s tissue in real-time.
  • the presented composite image will show a model of the patient’s beating heart that corresponds to the actual beating of the heart.
  • other moving tissues will be synchronized with their dynamic tissue models.
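
A minimal sketch of this synchronization is shown below. It assumes the dynamic tissue model is stored as a hypothetical sequence of per-phase frames (for example, heart meshes covering one cardiac cycle) and that a current phase in [0, 1) is supplied by a motion detector; the frame-selection scheme is illustrative only.

```python
# A minimal sketch of keeping a dynamic tissue model in step with the patient's
# actual heartbeat. It assumes a hypothetical 4D model stored as a list of
# per-phase meshes and a current cardiac phase in [0, 1) supplied by the detector 232.
from typing import Sequence, TypeVar

Mesh = TypeVar("Mesh")  # whatever mesh/volume type the visualization module uses

def select_model_frame(frames: Sequence[Mesh], cardiac_phase: float) -> Mesh:
    """Pick the stored heart-model frame closest to the measured cardiac phase."""
    phase = cardiac_phase % 1.0
    index = round(phase * len(frames)) % len(frames)
    return frames[index]

# Example: 20 precomputed frames per heartbeat; phase 0.37 selects frame 7.
frames = [f"heart_mesh_{i:02d}" for i in range(20)]
print(select_model_frame(frames, 0.37))
```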
  • a detector 232 that is configured to detect movement of tissue may be used to monitor movement of the tissue of interest during the surgical procedure.
  • the detector 232 is communicatively coupled to the processor system 202 via a suitable wireless or wire-based connector.
  • Tissue movement information is communicated from the detector 232 to the image processing algorithm module 212.
  • the image processing algorithm module 212 synchronizes movement of the dynamic tissue model with the movement of the patient’s tissue on a real-time or near real-time basis. Accordingly, the presented composite image accurately represents the movement of the tissue of interest.
  • an audio detector such as a stethoscope or the like may be used to detect the beating of the patient’s heart.
  • information corresponding to the detected heart beating is communicated to the processor system 202.
  • the movement of the dynamic model corresponding to the patient’s heart may then be synchronized to the received heart beating information.
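
The following is an illustrative sketch only, assuming a naive envelope-threshold beat detector on a stethoscope-like audio signal and a cardiac phase computed from the last two detected beats; a clinical implementation would use a far more robust detector and timing model.

```python
# A minimal sketch of deriving a cardiac phase from an audio detector such as a
# stethoscope. Beat times are picked with a naive threshold on the signal envelope;
# all names and parameters are illustrative assumptions.
import numpy as np

def beat_times(signal: np.ndarray, fs: float, threshold: float = 0.5,
               refractory_s: float = 0.3) -> list[float]:
    """Return approximate heartbeat onset times (seconds) from an audio envelope."""
    env = np.abs(signal) / (np.max(np.abs(signal)) + 1e-9)
    times, last = [], -np.inf
    for i, v in enumerate(env):
        t = i / fs
        if v >= threshold and (t - last) >= refractory_s:
            times.append(t)
            last = t
    return times

def cardiac_phase(now: float, beats: list[float]) -> float:
    """Phase in [0, 1) based on the interval between the last two detected beats."""
    if len(beats) < 2:
        return 0.0
    period = beats[-1] - beats[-2]
    return ((now - beats[-1]) / period) % 1.0

# Example: a synthetic 1 Hz "heartbeat" sampled at 100 Hz for 5 seconds.
fs = 100.0
t = np.arange(0, 5, 1 / fs)
signal = (np.sin(2 * np.pi * 1.0 * t) > 0.99).astype(float)
beats = beat_times(signal, fs)
print(beats, cardiac_phase(4.6, beats))
```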
  • FIG. 4 is a conceptual diagram of an embodiment of the hand-held surgical tool tracking system 100 and a robotic operation system 402 cooperatively working together to generate a composite image that includes robotic tools 404 and hand-held surgical tools 104.
  • the patient 200 is lying on a surgical table 406 during a surgical procedure.
  • a legacy robotic operation system 402 is manipulating one or more robotic tools 404 in accordance with instructions received from the robot controller 408 that are specified by the practitioner 410.
  • the robotic operation system 402 determines the precise location and orientation of each of the robotic tools 404 as is known in the art of robotic tools.
  • a graphical image of the tissue operating area 412 and the robotic tools 404 may be presented on a display 414 being viewed by the practitioner 410.
  • a 2D image of the operating area 412 and the robotic tools 404 may be presented on the display 414.
  • a 2D or 3D model of the tissue and the robotic tools 404 may be presented on the display 414.
  • an assistant 416 may be required to assist or intervene in the surgical procedure by using one or more hand-held surgical tools 104.
  • the robotic operation system 402, at best, can only obtain 2D image information showing the intervening hand-held surgical tools 104 as they are being used. It is not possible for the robotic operation system 402 to determine the precise location and orientation of the intervening hand-held surgical tool 104 that is being used by the assistant.
  • the precise location and orientation of the intervening hand-held surgical tool 104 can be determined by embodiments of the hand-held surgical tool tracking system 100.
  • the image capture device 206 is positioned so as to capture images of the optically detectable targets 102 on the surface of the hand-held surgical tool 104.
  • the 3D space known by the hand-held surgical tool tracking system 100 is the same as the 3D space known by the robotic operation system 402. Accordingly, image information presenting the precise location and orientation of the intervening hand-held surgical tool 104 can be generated by the hand-held surgical tool tracking system 100. This information may be communicated to the remote rendering and display system 204 for presentation on the display 414 along with concurrent presentation of graphical information generated by the robotic operation system 402.
  • the practitioner 410 may concurrently view the robotic tools 404 being controlled by the robotic operation system 402 and the intervening hand-held surgical tool 104 being used by the assistant 416. If a 2D or 3D model of the tissue of interest in the operating area 412 is being presented on the display 414 (by either the hand-held surgical tool tracking system 100 or the robotic operation system 402), then the practitioner 410 is able to concurrently view the intervening hand-held surgical tool 104 and the robotic tools 404 in relation to the presented tissue model.
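
A minimal sketch of one way the two systems' 3D spaces could be related is shown below. It assumes a hypothetical registration transform `T_robot_tracker` obtained during setup (for example, by observing common fiducials from both systems); the disclosed system may establish the shared 3D space differently.

```python
# A minimal sketch of expressing the tracked hand-held tool pose in the robotic
# operation system's coordinate frame so both can be drawn in one image. The
# registration transform and poses are illustrative 4x4 homogeneous matrices.
import numpy as np

def tool_pose_in_robot_frame(T_robot_tracker: np.ndarray,
                             T_tracker_tool: np.ndarray) -> np.ndarray:
    """Compose the registration with the tracked tool pose (both 4x4 homogeneous)."""
    return T_robot_tracker @ T_tracker_tool

# Example: the tracker origin sits 0.5 m along the robot's x axis; the tool is 0.2 m
# along the tracker's x axis, so it lands at x = 0.7 m in the robot frame.
T_robot_tracker = np.eye(4); T_robot_tracker[:3, 3] = [0.5, 0.0, 0.0]
T_tracker_tool  = np.eye(4); T_tracker_tool[:3, 3]  = [0.2, 0.0, 0.0]
print(tool_pose_in_robot_frame(T_robot_tracker, T_tracker_tool)[:3, 3])
```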
  • robotic operation systems 402 are now known or will be developed in the future. Such robotic operation systems 402 may graphically present various information on a display 414 to the practitioner 410 who is operating the robotic operation system 402. These numerous robotic operation systems 402 are not described in detail herein for brevity. Further, integration of image information from multiple image generation sources into a single image is well known in the arts, and is not described herein in detail for brevity.
  • image information generated by embodiments of the hand-held surgical tool tracking system 100 is integrated with image information generated by the robotic operation system 402 so that the practitioner 410 can appreciate the precise location and orientation of any intervening hand-held surgical tools 104 used during the surgical procedure. All such forms of robotic operation systems 402 now known or later developed, and all techniques of integrating image information now known or later developed, are considered to be within the scope of this disclosure and to be protected by the accompanying claims.
  • information corresponding to the generated and presented composite images may be saved into a local memory medium, such as the example surgical procedure history 230 (FIG. 2), and/or saved into remote memory medium (not shown).
  • the composite image information is time stamped. Accordingly, an interested party may later review and analyze the surgical procedure by viewing the saved composite images.
  • the stored composite images may be individually viewed, or may be viewed as a video.
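
As an illustrative sketch, time-stamped composite frames could be appended to a simple on-disk log and replayed in order for individual review or video export; the storage layout, file names, and record fields below are assumptions, not the disclosed format of the surgical procedure history 230.

```python
# A minimal sketch of time-stamping composite frames into a procedure history so
# they can later be reviewed one at a time or replayed in sequence. The JSON-lines
# index and per-frame PNG files are illustrative choices only.
import json, time
from pathlib import Path

def save_frame(history_path: Path, frame_id: str, composite_png: bytes) -> None:
    """Append one time-stamped record; the image itself is stored alongside the log."""
    history_path.mkdir(parents=True, exist_ok=True)
    (history_path / f"{frame_id}.png").write_bytes(composite_png)
    with open(history_path / "index.jsonl", "a") as log:
        log.write(json.dumps({"frame": frame_id, "timestamp": time.time()}) + "\n")

def replay(history_path: Path):
    """Yield (timestamp, image bytes) in recorded order for review or video export."""
    with open(history_path / "index.jsonl") as log:
        for line in log:
            rec = json.loads(line)
            yield rec["timestamp"], (history_path / f"{rec['frame']}.png").read_bytes()

# Example usage with placeholder image data.
save_frame(Path("procedure_history"), "frame_000001", b"\x89PNG...")
for ts, img in replay(Path("procedure_history")):
    print(ts, len(img))
```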
  • detectable targets 102 may be active, such as targets that emit infrared signals, or passive, such as retro-reflective markers affixed to some interaction device.
  • Such active or passive detectable targets 102 are generically described herein as detectable targets for brevity, though such detectable targets 102 may not be optically detectable by an image capture device. Rather, the active or passive detectable targets 102 are detectable using another detecting device 206 or system 206.

Abstract

Determination of the location and orientation of a hand-held surgical tool is provided. One embodiment captures image data that includes images of a first detectable target on the hand-held surgical tool and of a second detectable target on a patient. The location and orientation of the first detectable target in a 3D space are determined based on the image data. The current location and orientation of the hand-held surgical tool in the 3D space are determined based on the determined location and orientation of the first detectable target and based on retrieved model data representing the hand-held surgical tool. The location of the second detectable target in the 3D space is determined based on the image data. Then, a location of the patient's tissue in the 3D space relative to the location and orientation of the hand-held surgical tool is determined based on the determined location of the patient's second detectable target.
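
For illustration only, the final step described in the abstract, locating the patient's tissue relative to the hand-held surgical tool via the second detectable target fixed to the patient, might be sketched as follows; the transforms and the tissue offset are hypothetical placeholders rather than the claimed method.

```python
# A minimal sketch, assuming hypothetical 4x4 homogeneous poses in camera space for
# the tool tip and the patient-mounted (second) detectable target, plus a tissue
# offset expressed in the patient target's frame. All names are illustrative.
import numpy as np

def tissue_relative_to_tool(T_cam_tool_tip: np.ndarray,
                            T_cam_patient_target: np.ndarray,
                            tissue_offset_in_patient: np.ndarray) -> np.ndarray:
    """Vector from the tool tip to the tissue of interest, in the tool-tip frame."""
    tissue_cam = T_cam_patient_target @ np.append(tissue_offset_in_patient, 1.0)
    tissue_in_tool = np.linalg.inv(T_cam_tool_tip) @ tissue_cam
    return tissue_in_tool[:3]

# Example: patient target 30 cm from the camera, tissue 5 cm beyond that target,
# tool tip 25 cm from the camera along the same axis -> tissue is 10 cm ahead of the tip.
T_cam_tool_tip = np.eye(4); T_cam_tool_tip[:3, 3] = [0.0, 0.0, 0.25]
T_cam_patient  = np.eye(4); T_cam_patient[:3, 3]  = [0.0, 0.0, 0.30]
print(tissue_relative_to_tool(T_cam_tool_tip, T_cam_patient, np.array([0.0, 0.0, 0.05])))
```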
PCT/US2022/046753 2022-10-14 2022-10-14 Appareil et procédé de suivi d'outils chirurgicaux portatifs WO2024080997A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2022/046753 WO2024080997A1 (fr) 2022-10-14 2022-10-14 Appareil et procédé de suivi d'outils chirurgicaux portatifs

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2022/046753 WO2024080997A1 (fr) 2022-10-14 2022-10-14 Appareil et procédé de suivi d'outils chirurgicaux portatifs

Publications (1)

Publication Number Publication Date
WO2024080997A1 true WO2024080997A1 (fr) 2024-04-18

Family

ID=90670072

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/046753 WO2024080997A1 (fr) 2022-10-14 2022-10-14 Appareil et procédé de suivi d'outils chirurgicaux portatifs

Country Status (1)

Country Link
WO (1) WO2024080997A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160199147A1 (en) * 2015-01-12 2016-07-14 Electronics And Telecommunications Research Institute Method and apparatus for coordinating position of surgery region and surgical tool during image guided surgery
US20170055940A1 (en) * 2014-04-28 2017-03-02 Mazor Robotics Ltd. Ultrasound guided hand held robot
US20180085173A1 (en) * 2016-09-27 2018-03-29 Covidien Lp Systems and methods for performing a surgical navigation procedure
US20190269460A1 (en) * 2016-11-16 2019-09-05 The Asan Foundation Customized surgical guide and customized surgical guide generating method and generating program
US20210153975A1 (en) * 2018-07-10 2021-05-27 Intuitive Surgical Operations, Inc. Systems and methods for sensing presence of medical tools

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170055940A1 (en) * 2014-04-28 2017-03-02 Mazor Robotics Ltd. Ultrasound guided hand held robot
US20160199147A1 (en) * 2015-01-12 2016-07-14 Electronics And Telecommunications Research Institute Method and apparatus for coordinating position of surgery region and surgical tool during image guided surgery
US20180085173A1 (en) * 2016-09-27 2018-03-29 Covidien Lp Systems and methods for performing a surgical navigation procedure
US20190269460A1 (en) * 2016-11-16 2019-09-05 The Asan Foundation Customized surgical guide and customized surgical guide generating method and generating program
US20210153975A1 (en) * 2018-07-10 2021-05-27 Intuitive Surgical Operations, Inc. Systems and methods for sensing presence of medical tools

Similar Documents

Publication Publication Date Title
US11583344B2 (en) Devices, systems and methods for natural feature tracking of surgical tools and other objects
US11612443B2 (en) Systems, methods and devices to scan 3D surfaces for intra-operative localization
US9320569B2 (en) Systems and methods for implant distance measurement
EP1804705B1 (fr) Dispositif pour la fusion et la navigation relatives a des images ecographiques et volumetriques par l'utilisation combinee de marqueurs optiques actifs et passifs
US10165981B2 (en) Surgical navigation method
US8682413B2 (en) Systems and methods for automated tracker-driven image selection
US8131031B2 (en) Systems and methods for inferred patient annotation
US7885441B2 (en) Systems and methods for implant virtual review
CA2994024C (fr) Dispositif de balayage portatif pour l'enregistrement rapide dans un systeme de navigation medical
US20170065248A1 (en) Device and Method for Image-Guided Surgery
US20080119725A1 (en) Systems and Methods for Visual Verification of CT Registration and Feedback
US20080243142A1 (en) Videotactic and audiotactic assisted surgical methods and procedures
KR102105974B1 (ko) 의료 영상 시스템
US20080154120A1 (en) Systems and methods for intraoperative measurements on navigated placements of implants
EP1259160A2 (fr) Appareil et procedes permettant d'effectuer des interventions medicales
US10846883B2 (en) Method for calibrating objects in a reference coordinate system and method for tracking objects
US11806093B1 (en) Apparatus and method for tracking hand-held surgical tools
US9477686B2 (en) Systems and methods for annotation and sorting of surgical images
WO2024080997A1 (fr) Appareil et procédé de suivi d'outils chirurgicaux portatifs
EP3024408B1 (fr) Prévention d'intervention chirurgicale au mauvais niveau
US20220175460A1 (en) Self-locating, active markers for navigated, augmented reality, or robotic surgery
CN117157030A (zh) 术中跟踪软组织变化
WO2022215075A1 (fr) Suivi de changements d'un tissu mou en peropératoire
Czajkowska et al. Image Guided Core Needle Biopsy of the Breast
KR20100039315A (ko) 수술용 도구 및 수술용 내비게이션 방법